Why me and not us?: A study on algorithmic discrimination, collectivity and access to justice
Jenni Hakkarainen (LL.M.) will defend her doctoral dissertation entitled “Why me and not us? – A study on algorithmic discrimination, collectivity and access to justice” in the Faculty of Law, University of Helsinki, on 18 May 2024 at 12:00. The public examination will take place at the following address: Porthania, PII, Yliopistonkatu 3.
Assistant Professor Katja de Vries, Uppsala University, will serve as the opponent, and Riikka Koulu as the custos.
The dissertation is available in electronic form in Helda.
Abstract
People discriminate. Algorithms discriminate. Legal and technology practitioners, as well as scholars, are currently working to understand the discriminatory features of algorithmic technologies. Northpointe's COMPAS, Amazon’s recruitment algorithm and Deliveroo’s application for drivers represent only a fraction of the discriminatory effects these technologies impose on us. This dissertation is about individuals in need of redress and about the legal procedures that should provide protection against the unjust effects of algorithms, such as discrimination. In the dissertation I seek answers to the following questions:
1) How does algorithmic discrimination differ from earlier forms of discrimination?
2) Is procedural law equipped to manage algorithmic discrimination, and if not, why not?
After answering these questions, I proceed to imagine the future of procedural law in relation to algorithmic discrimination:
3) What elements already embedded in the law can be utilised in the design of legal procedures against discriminatory algorithms?
After analysing the final question, I take some time to imagine the structures of future procedural law. I draw on two mainstream theoretical frameworks, access to justice and science and technology studies (STS), and complement them with feminist studies. My primary method is principle-based doctrinal research. However, in order to understand the effects of technology and how they challenge some of the core assumptions of the law, such as individual-based legal procedures, I use STS throughout the dissertation.
The existence of discrimination is not novel; the problem of discrimination and biased decision-making is ancient. Concerns over legal procedures as arenas for contesting discrimination are less ancient, originating from the emergence of non-discrimination laws. After decades of evolution, legal procedures still struggle to manage discrimination cases. Human-to-human discrimination is a difficult concept for procedural law, and traditional legal procedures often fail to address both the individual grievance and discrimination as a social, structural phenomenon. Meanwhile, a whole new logic of discrimination has arisen from technologies and algorithmic practices and is now challenging the law.
I argue that algorithmic discrimination is a collective and dynamic phenomenon that operates at a distance, and that it therefore does not fit into individualistic, reactive legal procedures, which often require first-hand experience of being discriminated against. Individual and reactive procedural law has thus reached its end. Moreover, the fragmented nature of procedural rules for cross-border and context-dependent disputes drastically weakens opportunities for accessing and receiving justice. In the final pages of the dissertation, I argue that collective, preventive and impersonal elements already exist in the current procedural architecture, but that in their current form they do not correspond to the needs arising from the emergence of algorithmic discrimination.