Robodebt: Australia's Algorithm Issued 521,000 Unlawful Debt Notices — Officials Knew It Was Illegal and Continued Anyway

What happened
From 2016 to 2019, the Australian Department of Human Services operated the Online Compliance Intervention system — universally known as Robodebt — which automatically cross-referenced welfare payment records against tax data. The algorithm divided reported annual income by 26 to derive a fortnightly figure, then issued debt notices when discrepancies appeared. The methodology was mathematically wrong for any worker with variable earnings. It generated 521,000 unlawful debts totalling $1.76 billion. Internal legal advice in 2017 flagged it as unlawful. The scheme ran for two more years. A Federal Court ruled it illegal in 2020. A $1.8 billion class action settlement followed in 2021. A Royal Commission in 2023 found it was 'a crude and cruel' scheme and made criminal referrals against senior public servants.[1]
What went wrong
The core flaw was in the algorithm's foundational assumption: annual income ÷ 26 ≠ fortnightly income for seasonal workers, casual employees, or anyone whose earnings varied across the year. A fruit picker who earned $30,000 over three months would be assessed as earning $1,154 per fortnight year-round, triggering a debt notice for welfare payments received during the nine months they earned nothing.

The scheme also reversed the burden of proof: rather than the government demonstrating a debt existed, it issued a notice and demanded the recipient disprove it — often requiring records from five or more years prior, which most did not have. A third-party debt collector (Dun & Bradstreet) was contracted to pursue the debts, adding fees and intensifying pressure on people who were already financially vulnerable.

Internal legal advice provided to the Department of Human Services in 2017 explicitly warned that the income-averaging methodology had no legal basis. A briefing note to the minister made the same point. The scheme continued regardless for two more years, collecting money the government had no legal right to demand. At least 2,000 welfare recipients targeted by the scheme have died — a figure cited repeatedly during the Royal Commission, which heard testimony linking the stress of false debt notices to suicides.[1]
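The averaging flaw is easy to demonstrate. The sketch below uses hypothetical figures for a seasonal worker (the actual departmental code is not public); it only illustrates the arithmetic, not the real system:

```python
# Hypothetical seasonal worker: earns $30,000 across the first 6 fortnights
# of the year, then nothing. A year has 26 fortnights.
actual_fortnightly = [5000.0] * 6 + [0.0] * 20

annual_income = sum(actual_fortnightly)  # 30000.0
averaged = annual_income / 26            # the Robodebt-style assumption
print(f"averaged fortnightly income: ${averaged:.2f}")  # $1153.85

# Fortnights where averaging attributes income the person never earned —
# each one can falsely suggest an overpayment of welfare benefits.
false_positives = sum(1 for x in actual_fortnightly if x < averaged)
print(f"fortnights falsely assessed as having income: {false_positives}")  # 20
```

Averaging spreads a short burst of earnings across all 26 fortnights, so the 20 zero-income fortnights each appear to carry roughly $1,154 of income — which is exactly the pattern that triggered debt notices for payments the recipient was lawfully entitled to.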
Lesson learned
Automated systems that reverse the burden of proof onto the least powerful people in society are not efficiency gains — they are legal and moral failures deferred. The defence that 'it was just an algorithm' has never held in a court of law, and Robodebt is now the definitive case study in why it shouldn't. The scheme's designers knew by 2017 that it was unlawful. They continued because the political incentive — appearing fiscally tough on welfare spending — outweighed the legal and human risk, in a calculation that proved catastrophically wrong. Any automated compliance system that generates revenue by placing the burden of proof on the subject deserves the same level of legal scrutiny as a search warrant.
Sources
- [1]
External links can go dark — pages move, paywalls appear, domains expire. Every source above includes a Wayback Machine snapshot link as a fallback. All citations are best-effort research; if a source contradicts our summary, the primary source takes precedence.