Amazon AI Recruiting Tool Systematically Penalised Women's CVs for Five Years

What happened
Amazon built an AI recruiting tool trained on CVs submitted to the company over a 10-year period. Because the tech industry is male-dominated, the model learned to penalise CVs containing words like "women's" and downgraded graduates of all-women's colleges. The tool was scrapped in 2018 after five years of development.[1]
What went wrong
The model was trained to replicate past hiring decisions, which encoded existing gender imbalances. No fairness metrics were applied during training or evaluation. The system was in use for five years before the bias was discovered, suggesting inadequate post-deployment monitoring.[1]
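The missing check is straightforward to run. A minimal sketch of one common fairness metric, demographic parity (the gap in selection rates between groups), is shown below; the function name, data, and threshold are hypothetical illustrations, not Amazon's actual pipeline or code.

```python
# Hypothetical fairness-audit sketch: demographic parity gap.
# All names and data here are invented for illustration.

def demographic_parity_gap(decisions, groups):
    """Absolute difference in positive-decision rates between groups.

    decisions: list of 0/1 model outputs (1 = advance candidate)
    groups:    parallel list of group labels, e.g. "A" / "B"
    """
    rates = {}
    for g in set(groups):
        selected = [d for d, grp in zip(decisions, groups) if grp == g]
        rates[g] = sum(selected) / len(selected)
    return max(rates.values()) - min(rates.values())

# Toy audit: group A advances at 60%, group B at 20%.
decisions = [1, 1, 1, 0, 0] + [1, 0, 0, 0, 0]
groups = ["A"] * 5 + ["B"] * 5
gap = demographic_parity_gap(decisions, groups)
print(f"selection-rate gap: {gap:.2f}")  # prints "selection-rate gap: 0.40"
```

Run periodically on production decisions, even this crude check would have surfaced a systematic gap long before five years had passed; real audits typically also compare the selection-rate *ratio* against the 0.8 threshold of the US "four-fifths" adverse-impact guideline.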
Lesson learned
AI trained on historical human decisions will learn and amplify human biases. Fairness audits must be part of the deployment checklist for any system affecting people's opportunities. Past human decisions are often the worst training data for systems meant to improve on human judgment.