
Pretrial Algorithms

The Bail Project / Newsroom  / Pretrial Algorithms

A study released last week by the Center for Court Innovation offers further proof that pretrial risk-assessment tools⁠—which some states have adopted in place of cash bail⁠—assign higher risk scores to Black people than to white people, meaning Black people are more likely to remain incarcerated where these assessments are used.

While others have made similar observations, this study adds value to the discussion because it suggests that this kind of racism is intrinsic to the risk-assessment model itself, not particular to any one assessment or handful of them.

The New York Times reported that a glitch that prevented East Baton Rouge Parish’s jury database from updating properly left more than 150,000 people off the jury rolls: “Across the country, computer-reliant jury coordinators have for years confronted database problems that kept otherwise-eligible potential jurors from being called to the nation’s courthouses… Such systemic exclusions raise constitutional concerns and threaten the integrity of the jury system, legal experts said.” The Times piece offers a look at the legal, constitutional, and ethical questions that arise when algorithms or other technical systems malfunction, producing real-life consequences for people (consequences that, as always, are experienced along the lines of race, class, gender, and more).