Amazon AI Recruiting Tool Systematically Penalised Women's CVs for Five Years

Reuters
Amazon headquarters in Seattle; the company built the recruiting tool biased against women's CVs. Image: Amazon.com, Inc., public domain (PD-textlogo) via Wikimedia Commons.

What happened

Amazon built an AI recruiting tool trained on CVs submitted over a 10-year period. Because the tech industry is male-dominated, the model learned to penalise CVs containing words like "women's" and downgraded graduates of all-women's colleges. The tool was scrapped in 2018 after five years of development.[1]

Amazon's AI recruiting tool learned from historical hiring data and systematically downranked CVs containing the word "women's" for five years. Image: Bad.Technology archive.

What went wrong

The model was trained to replicate past hiring decisions, which encoded existing gender imbalances. No fairness metrics were applied during training or evaluation. The system was in use for five years before the bias was discovered — suggesting inadequate post-deployment monitoring.[1]
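The fairness check that was missing here does not have to be elaborate. A minimal sketch of one standard metric, demographic parity difference, on hypothetical validation scores (the scores, threshold, and group labels below are all illustrative assumptions, not Amazon's data):

```python
# Minimal fairness check: demographic parity difference on validation scores.
# All data here is hypothetical; a real pipeline would use actual model output.

def selection_rate(scores, threshold=0.5):
    """Fraction of candidates the model would advance past the threshold."""
    return sum(s >= threshold for s in scores) / len(scores)

def demographic_parity_diff(scores_a, scores_b, threshold=0.5):
    """Absolute gap in selection rates between two groups.
    Values near 0 suggest parity; large gaps warrant investigation."""
    return abs(selection_rate(scores_a, threshold)
               - selection_rate(scores_b, threshold))

# Hypothetical validation scores for two candidate groups
group_a = [0.9, 0.8, 0.7, 0.6, 0.4]
group_b = [0.6, 0.4, 0.3, 0.3, 0.2]

gap = demographic_parity_diff(group_a, group_b)
print(f"selection-rate gap: {gap:.2f}")  # 0.80 vs 0.20 -> gap of 0.60
```

A gap this large on any protected attribute, checked once per evaluation run, would have surfaced the problem long before five years of production use.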

Lesson learned

AI trained on historical human decisions will learn and amplify human biases. Fairness audits must be part of the deployment checklist for any system affecting people's opportunities. Past human decisions are often the worst training data for systems meant to improve on human judgment.
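One concrete form such an audit can take is a counterfactual perturbation test: score the same CV with and without a gendered term and flag any systematic shift. A hedged sketch, where `score_cv` is a toy stand-in mimicking the reported bias (the real model and term list are assumptions here):

```python
# Counterfactual audit sketch: does adding a single term shift the score?
# `score_cv` is a toy stand-in for the real model, which we don't have.

def score_cv(text):
    # Hypothetical scorer that mimics the reported bias: penalise "women's".
    base = 0.7
    return base - 0.3 * text.lower().count("women's")

def counterfactual_shift(cv_text, term):
    """Score delta caused by prepending `term` to an otherwise identical CV."""
    return score_cv(f"{term} {cv_text}") - score_cv(cv_text)

shift = counterfactual_shift("chess club captain", "women's")
print(f"score shift from adding 'women's': {shift:+.2f}")  # -0.30 here
```

Run over a panel of real CVs and a list of protected-attribute terms, any consistently non-zero shift is exactly the failure mode this system shipped with.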

Est. value burned: ~$100M (five years of team cost plus legal exposure for systematic gender discrimination)

Sources

  1. Reuters, "Amazon scraps secret AI recruiting tool that showed bias against women" (2018).

External links can go dark — pages move, paywalls appear, domains expire. Every source above includes a Wayback Machine snapshot link as a fallback. All citations are best-effort research; if a source contradicts our summary, the primary source takes precedence.