Self-determination

Crime-prediction AI biased against black defendants

Judges across the United States are using racially biased AI to assess criminal defendants’ risk of committing future crimes. A study by ProPublica shows that the software used is biased against black defendants. According to the public interest news organisation, these risk-assessment systems are remarkably unreliable in forecasting future violent crime.

ProPublica analysed the risk scores of more than 7,000 people arrested in Broward County, Florida. “The score proved remarkably unreliable in forecasting violent crime,” ProPublica states. Algorithms that assess a defendant’s risk of re-offending currently inform sentencing and parole decisions for many people in the United States.

According to ProPublica, the AI software consistently mislabelled black defendants as high risk and white defendants as low risk. “We ran a statistical test that isolated the effect of race from criminal history and recidivism, as well as from defendants’ age and gender. Black defendants were still 77 percent more likely to be pegged as at higher risk of committing a future violent crime and 45 percent more likely to be predicted to commit a future crime of any kind.”
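
The kind of controlled comparison ProPublica describes can be sketched as a logistic regression in which the risk label is modelled while criminal history, recidivism, age and gender are held constant. The snippet below is a minimal illustration of that idea, not ProPublica’s actual code; the file name and column names are hypothetical placeholders.

```python
# Minimal sketch of a controlled comparison: model the odds of receiving a
# "high risk" label while holding criminal history, recidivism, age and
# gender constant. File name and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("defendant_scores.csv")  # hypothetical per-defendant data

# Outcome: 1 if the tool scored the defendant as high risk, 0 otherwise.
model = smf.logit(
    "high_risk ~ C(race, Treatment('White')) + priors_count"
    " + two_year_recid + age + C(sex)",
    data=df,
).fit()

# Exponentiated coefficients are odds ratios; a race coefficient of roughly
# 1.77 would correspond to "77 percent more likely" after controlling for
# the other variables.
print(np.exp(model.params).round(2))
```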

ProPublica’s analysis found that the algorithm was wrong in 80 percent of cases when predicting who would commit a violent crime: only one in five of the defendants it flagged actually went on to commit a violent act. When misdemeanours and felonies were counted together, the algorithm was only “somewhat more accurate than a coin flip” at predicting who would reoffend.
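
As a rough illustration of what “wrong in 80 percent of cases” implies, the short calculation below works through a hypothetical set of flagged defendants; the counts are placeholders, not figures from the study.

```python
# Illustrative arithmetic only: hypothetical counts, not ProPublica's data.
flagged_as_violent = 1000   # defendants the tool labelled high risk for violence
went_on_to_offend = 200     # suppose 20 percent of them actually did

precision = went_on_to_offend / flagged_as_violent
print(f"Correct 'high risk' calls: {precision:.0%}")      # 20%
print(f"Wrong 'high risk' calls:   {1 - precision:.0%}")  # 80%

# For reoffending of any kind (misdemeanours and felonies together), overall
# accuracy was described as only "somewhat more accurate than a coin flip",
# i.e. a little above the 50 percent a random guess would achieve.
```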