Article Series
Browse our collection of multi-part article series on various topics
AI Fairness 101 - Real-World Incidents (4 parts)
1. When an Algorithm Broke Thousands of Families: The Netherlands Child Welfare Scandal
How a design-phase failure in the Dutch childcare fraud algorithm created one of the worst AI governance disasters in Europe — and what the Global South must learn from it.
2. Access Denied: How India's Digital 'Cure-All' Became a Real-World Fairness Crisis
How Aadhaar’s promise of digital inclusion turned into one of the largest algorithmic exclusion crises in the world.
3. The Golden Touch of Ruin: How Michigan’s MiDAS Algorithm Falsely Accused 40,000 People of Fraud
A deep dive into Michigan’s MiDAS unemployment fraud algorithm — and how design-phase failures, automation bias, and the removal of human oversight turned efficiency into injustice.
4. The COMPAS Algorithm Scandal: When AI Decides Who Goes to Jail ⚖️
As AI enters courts and welfare systems worldwide, the COMPAS debate reveals a critical lesson: fairness depends on context, and exporting models without reform risks scaling inequality.
AI Sustainability 101 (1 part)
AI-Policies (3 parts)
1. Beyond America's AI Action Plan: A Global South Response on Fairness
By removing Diversity, Equity, and Inclusion (DEI), America's AI Action Plan redefines fairness in a way that risks hard-coding inequities for the Global South, which must respond by defining its own culturally and contextually relevant AI fairness standards.
2. When Good Intentions Go Global: Why the EU AI Act Doesn’t Fit the Global South
Why the EU AI Act—designed for data-rich, institutionally mature European economies—breaks down when applied to the Global South.
3. Mind the Gap: Why the NIST AI Risk Framework Breaks Down in the Global South
The NIST AI Risk Management Framework (AI RMF) is increasingly treated as a global blueprint for “trustworthy AI.” But what happens when a framework designed for resource-rich Western institutions is applied to the Global South?