[Coming May 13 in the US!] The AI Con: How to Fight Big Tech's Hype and Create the Future We Want. Emily M. Bender and Alex Hanna, Harper Books.
Book Chapter: Extracting Insights in and around Government: Experiences in South Africa. Human Development and the Data Revolution. Vukosi Marivate and Nyalleng Moorosi, Oxford University Press.
What Knowledge Do We Produce from Social Media Data and How? Proceedings of the ACM on Human-Computer Interaction.
Navigating labour’s labyrinth: Developing a typology of platform work in Sub-Saharan Africa. Platforms & Society.
The role of expertise in effectively moderating harmful social media content. ACM CHI 2025.
The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence. First Monday.
Beyond Fairness in Computer Vision: A Holistic Approach to Mitigating Harms and Fostering Community-Rooted Computer Vision Research. Foundations and Trends® in Computer Graphics and Vision. [PDF]
A critical analysis of Rwanda’s digital skills and entrepreneurship training toward solving youth unemployment. Journal of Business and Enterprise Development (JOBED).
Guilds as Worker Empowerment and Control in a Chinese Data Work Platform. Proceedings of the ACM on Human-Computer Interaction.
U.S. and Canadian Higher Education Protests and University and Police Responses, 2012-2018. Socius.
Trapped in the Matrix: Algorithmic Control and Worker Dispossession in the African Platform Economy. Weizenbaum Journal of the Digital Society.
“We try to empower them” – Exploring Future Technologies to Support Migrant Jobseekers. FAccT.
The Subjects and Stages of AI Dataset Development: A Framework for Dataset Accountability. Ohio State Tech Law Journal.
Mobilizing Social Media Data: Reflections of a Researcher Mediating between Data and Organization. CHI.
Constructing Relational and Verifiable Protest Event Data: Four Challenges and Some Solutions. Mobilization.
Labour, Automation, and Human-Machine Communication. The SAGE Handbook of Human-Machine Communication.
AI Art and its Impact on Artists. AIES. Spanish version translated by Arte es Ética.
Combating Harmful Hype in Natural Language Processing. PML4DC workshop at ICLR. (news coverage).
AI and Inequality in Hiring and Recruiting: A Field Scan. SSOAR.
AI for Whom? Shedding Critical Light on AI for Social Good. NeurIPS Computational Sustainability Workshop.
A Human Rights-Based Approach to Responsible AI. EAAMO.
Algorithmic Tools in Public Employment Services. FAccT. Best Student Paper!
Black Protests in the United States, 1994 to 2010. Sociological Science.
Documenting Data Production Processes: A Participatory Approach for Data Work. CSCW.
The Data-Production Dispositif. CSCW. Blog Post. Honorable Mention, Impact Award, and Methods Award!
When is Machine Learning Data Good?: Valuing in Public Health Datafication. CHI.
Constructing a Visual Dataset to Study the Effects of Spatial Apartheid in South Africa. NeurIPS Datasets and Benchmarks Track.
Datasheets for Datasets Video. CACM.
Data Workers’ Inquiry: recentering workers’ epistemic authority. Future of Work.
Replacing Federal Workers with Chatbots would be a Dystopian Nightmare. Scientific American.
The Human Labour of Data Work: Capturing Cultural Diversity through World Wide Dishes. arXiv.
Bridging the Gap: Integrating Ethics and Environmental Sustainability in AI Research and Practice. arXiv.
Power to the People: Can decentralisation deepen Africa's development? Africa in Fact.
Community-Driven Approaches to Research in Technology & Society: CCC Workshop Report. Computing Community Consortium. CCC Blog announcement.
“I hope this isn’t for weapons.” How Syrian data workers train AI. Untold Magazine.
Zombie Trainers and a New Era of Forced Labor. Newsweek.
Algorithmic Shackles: How AI Erodes Worker Autonomy in the Majority World. Bot Populi.
Review: Colorblind Tools: Global Technologies of Racial Power, by Marzia Milazzo. Ethnic Studies Review.
Puny Gods and Silicon Saviors: Challenging the AI Salvation Narrative. Leeds International Festival of Ideas.
Who Trains the Data for European Artificial Intelligence? HAL Open Science.
The Human Cost of Our AI-Driven Future. Noēma.
Settling the Score on Algorithmic Discrimination in Health Care. NEJM AI.
Using labels to limit AI misuse in health. Nature Computational Science.
Racism is an ethical issue for healthcare artificial intelligence. Cell Reports Medicine.
Fostering a Federated AI Commons Ecosystem. Policy brief by the T20, a G20 engagement group.
Who is tech really for? The New York Times.
Better datasets aren't enough: Why we need AI for Africa to be developed in Africa. Nature.
General Purpose AI Poses Serious Risks, Should Not Be Excluded From the EU’s AI Act. Policy Brief.
An internet women want is free from digital colonization. CNN.
AI Causes Real Harm. Let's Focus on That over the End-of-Humanity Hype. Scientific American.
“AI” Hurts Consumers and Workers -- and Isn’t Intelligent. Tech Policy Press.
“I don’t have a gender, consciousness, or emotions. I’m just a machine learning model”. UNESCO.
Data Work and its Layers of (In)visibility. Just Tech.
The Performativity of Ground-Truth Data. unthinking photography.
Evaluating the Social Impact of Generative AI Systems. arXiv.
Timnit Gebru says harmful AI systems need to be stopped. The Economist.
Turk Wars: How AI Threatens the Workers Who Fuel It. SSIR.
What Senator Cory Booker Gets Wrong about Black People and AI. VISIBLE Magazine.
Policy Is Urgently Necessary to Enable Social Media Research. Tech Policy Press.
The Exploited Labor Behind Artificial Intelligence. (Spanish Translation). Noema Magazine.
AI Ethics Are in Danger. Funding Independent Research Could Help. SSIR.
We warned Google that people might believe AI was sentient. Now it’s happening. Washington Post.
AI & Cities: Risks, Applications and Governance. UN Habitat.
Effective Altruism Is Pushing a Dangerous Brand of ‘AI Safety’. Wired.
For truly ethical AI, its research must be independent from big tech. The Guardian.
We, along with the AI Now Institute and Logic(s) Magazine, are proud to support Surveillance Watch: an interactive map revealing the intricate connections between surveillance companies, their funding sources, and affiliations.