Max Planck Research Group - AIming Toward the Future: Policing, Governance, and Artificial Intelligence
Artificial Intelligence and related technologies are already transforming our world socially, economically, and politically, and they are quickly becoming part of people’s everyday lives, often in invisible ways. They have immediate and far-reaching consequences for a wide range of issues relevant to governance and justice: where and when law enforcement directs policing efforts and how it is held accountable; what decisions a judge is most likely to make; and who populates prisons, and for how long, to mention just a few. What these issues have in common is that decision-making takes place through algorithmic assemblages, raising questions and concerns about the fairness, accountability, and transparency of such decisions.
These systems often have an aura of objective truth and scientific legitimacy that draws attention away from their contingencies and constructive (and destructive) powers. A growing number of scholars and policy makers have emphasized the urgent need for more studies that look at the everyday and long-term effects of the use of AI, without losing sight of its original conceptualization.
AIming Toward the Future, an independent Max Planck research group headed by Maria Sapignoli, is designed to do precisely this in the context of policing and governance. The project will investigate how AI and related technologies are being conceptualized, developed, transferred, and applied amid the intensification of state and non-state policing. It will also consider the involvement of the private sector in governance and criminal justice initiatives.
The project accomplishes this through two interconnected research trajectories. The first looks at how AI and other predictive and digital technologies are employed in law enforcement and how this plays out on the ground: in other words, it considers these technologies in practice. The second trajectory, through ethnography in tech institutions and laboratories, attends to the conceptualization, creation, and transfer of these policing technologies by tech experts, namely engineers and designers.
One of the project’s key methodological approaches is ethnographic. An empirical approach will help in understanding the social changes and futures these new technologies produce: why they are used in the first place, why people trust them (or not), and how they are created, translated, and ultimately deployed. This is of particular importance at a time when governance seems to be quickly shifting from the human to the humanoid, affecting decision-making that impacts people’s basic rights.
The project is guided by concerns and open questions about the place of algorithms, and digital technologies more generally, in deploying or expressing relationships of power, and about the role of tech corporations in shaping governance, policies, and laws. It aims to shed light on the use and effects of these technologies in policing, governance, and society, emphasizing the implications their uses have for social inequality, law, and the outcomes of criminal justice procedures.