
Can Racial Bias Ever Be Removed From Criminal Justice Algorithms? [psmag.com]


Dozens of people packed into a Philadelphia courtroom on June 6th to voice their objections to a proposed criminal justice algorithm. The algorithm, developed by the Pennsylvania Commission on Sentencing, was conceived as a way to reduce incarceration by predicting the risk that a person poses to public safety and helping to divert those at low risk toward alternatives to incarceration.

But many of the speakers worried the tool would instead increase racial disparities in a state where the incarceration rate of black Americans is nine times higher than that of white people. The outpouring of concern at public hearings, as well as from nearly 2,000 people who signed an online petition from the non-profit Color of Change, had a big effect: while the sentencing commission had planned to vote on June 14th on whether to adopt the algorithm, members decided to delay the vote for at least six months to consider the objections and to solicit further input.

Algorithms that make predictions about future behavior based on factors such as a person's age and criminal history are increasingly used, and increasingly controversial, in criminal justice decision-making. One of the big objections to the use of such algorithms is that they sometimes operate out of the public's view. For instance, several states have adopted a tool called COMPAS developed by the company Northpointe (now called Equivant), which claims the algorithm is proprietary and refuses to share crucial details of how it calculates scores.

[For more on this story by STEPHANIE WYKSTRA, go to https://psmag.com/social-justi...s-from-the-algorithm]
