Abstract
This paper reports on empirical work conducted to study perceptions of unfair treatment caused by automated computational systems. While the pervasiveness of algorithmic bias has been widely acknowledged, and perceptions of fairness are commonly studied in Human-Computer Interaction, there is a lack of research on how unfair treatment by automated computational systems is experienced by users from disadvantaged and marginalised backgrounds. More diversification is needed in terms of the investigated users, domains, and tasks, as well as the strategies users employ to reduce harm. To unpack these issues, we ran a prescreened survey of 663 participants, oversampling those with at-risk characteristics. We collected occurrences and types of conflicts involving unfair and discriminatory treatment by such systems, as well as the actions participants took to resolve these situations. Drawing on intersectional research, we combine qualitative and quantitative approaches to highlight the nuances around power and privilege in perceptions of automated computational systems. Among our participants, we identify and discuss experiences of computational essentialism, attribute-based exclusion, and expected harm. We derive suggestions to address these perceptions of unfairness as they occur.
| Original language | English |
| --- | --- |
| Article number | 445 |
| Journal | Proceedings of the ACM on Human-Computer Interaction |
| Volume | 6 |
| Issue number | CSCW2 |
| DOIs | |
| Publication status | Published - 11 Nov 2022 |
Keywords
- Algorithmic fairness
- Automated computational systems
- Conflicts
- Intersectionality
Datasets
- Unfair treatment by automated computational systems. van Nuenen, T., Such, J. & Coté, M., King's College London, 19 Aug 2022. DOI: 10.18742/20499216, https://kcl.figshare.com/articles/dataset/Unfair_treatment_by_automated_computational_systems/20499216