COMPAS - MACHINE BIAS
2016
COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) is a risk assessment tool developed by Northpointe (later Equivant) and used in U.S. courts to predict the likelihood that defendants would reoffend. Judges and parole boards used its scores as part of decisions about bail, sentencing, and supervision.
In 2016, a major investigation by ProPublica analyzed thousands of COMPAS risk scores. The study found that the tool was racially skewed: Black defendants were nearly twice as likely as white defendants to be misclassified as high risk, while white defendants who went on to reoffend were more often labeled low risk.
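The comparison at the heart of that finding is the false positive rate per group: among people who did not reoffend, what share were flagged high risk? A minimal sketch of that calculation, using entirely synthetic records (not the real COMPAS data) and hypothetical group labels:

```python
# Sketch of a per-group false positive rate comparison, the metric
# central to the ProPublica analysis. All records below are synthetic
# illustrations, not actual COMPAS data.

def false_positive_rate(records):
    """Share of non-reoffenders who were labeled high risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    if not non_reoffenders:
        return 0.0
    flagged = sum(1 for r in non_reoffenders if r["high_risk"])
    return flagged / len(non_reoffenders)

# Synthetic records: demographic group, tool's label, actual outcome.
records = [
    {"group": "A", "high_risk": True,  "reoffended": False},
    {"group": "A", "high_risk": True,  "reoffended": False},
    {"group": "A", "high_risk": False, "reoffended": False},
    {"group": "A", "high_risk": True,  "reoffended": True},
    {"group": "B", "high_risk": True,  "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": True},
]

for group in ("A", "B"):
    subset = [r for r in records if r["group"] == group]
    print(f"group {group}: FPR = {false_positive_rate(subset):.2f}")
```

In this toy data, group A's false positive rate is double group B's even though each group has the same reoffense rate, which is the shape of disparity ProPublica reported.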
This study sparked a widespread debate about bias, accountability, and transparency in predictive algorithms used in high-stakes decisions.
Despite the controversy, COMPAS and similar tools remain in use in many jurisdictions, though often in conjunction with human oversight. The case has become a landmark example of how algorithms can reproduce and reinforce structural inequalities under the appearance of objectivity.
Additional info:
The Atlantic - A Popular Algorithm Is No Better at Predicting Crimes Than Random People
The Guardian - Software 'no more accurate than untrained humans' at judging reoffending risk
ProPublica - Machine Bias. There’s software used across the country to predict future criminals. And it’s biased against blacks