My research examines the emerging concept of ‘algorithmic governmentality’: the idea that we are experiencing a technologically driven shift in the way power operates in society, based on an increasing reliance on predictive models drawn from ‘Big Data’. I argue that this shift is underpinned by a deeply capitalist ideology, resting on the monetization of digitized personal data and a push for looser data protection legislation. Although rooted in the emergence of statistics in the eighteenth and nineteenth centuries, ‘algorithmic governmentality’ is distinct from previous statistical projects, both because of the sheer volume of data produced and because this data is often deeply personal in nature. I examine the Investigatory Powers Act 2016 (the ‘Snooper’s Charter’), arguing that this legislation allows state access to increasingly valuable data, and explore the possibility of resistance to algorithmic governmentality, including via EU measures such as the General Data Protection Regulation (GDPR).
My research uses the work of Michel Foucault, particularly his ideas on governmentality, as a means of exploring this new form of power and the forms of knowledge that sustain it.
I was born and raised in Northern Ireland, and came to the School of Law from a criminology background. My research interests include Big Data, algorithms and machine learning; how certain forms of knowledge are produced and characterized as ‘truth’; and the importance of classification and numbering as forms of power.