AI-fueled inequities

AI is fueled by the artists, writers, and “zombie trainers” (see below) who supply training and evaluation data without consent or compensation. Vulnerable populations such as refugees and gig workers are hired as data workers under exploitative conditions (see Data Workers Inquiry). These same populations are also the first to experience the harms of AI systems, whether through automated weapons or surveillance systems (see Surveillance Watch). This page collects more of our research on the inequities fueled by AI systems.

Image: Leo Lau & Digit / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/

Painting of two knowledge workers running inside a wheel embedded within a computer mouse, struggling to keep pace with a powerful hand (representing employers or capitalistic forces). The image reveals that despite AI and digital adoption, rising productivity expectations often trap workers into producing more with less. Rather than easing labour, AI technologies risk raising the baseline demands, accelerating the pace of work under the illusion of progress. This image was selected as a winner in the Digital Dialogues Art Competition, which was run in partnership with the ESRC Centre for Digital Futures at Work Research Centre (Digit) and supported by the UKRI ESRC.

The Data-Production Dispositif

This paper investigates outsourced machine learning data work in Latin America by studying three platforms in Venezuela and a business process outsourcing company in Argentina.

Book: Race After Technology: Abolitionist Tools for the New Jim Code

In this 2019 book, Ruha Benjamin shows how a range of discriminatory designs encode inequity: by explicitly amplifying racial hierarchies; by ignoring, and thereby replicating, social divisions; or by aiming to fix racial bias but ultimately doing quite the opposite. Race After Technology offers conceptual tools for decoding tech promises with sociologically informed skepticism.