If A Pre-Trial Risk Assessment Tool Does Not Satisfy These Criteria, It Needs to Stay Out of the Courtroom [EFF.org]

Algorithms should not decide who spends time in a California jail. But that’s exactly what will happen under S.B. 10, a new law slated to take effect in October 2019. The law, which Governor Jerry Brown signed in September, requires the state’s criminal justice system to replace cash bail with algorithmic pretrial risk assessment. Each county in California must use some form of pretrial risk assessment to categorize every person arrested as a “low,” “medium,” or “high” risk of failing to appear in court or of committing another crime that poses a risk to public safety. Under S.B. 10, anyone who receives a “high” risk score must be detained prior to arraignment, effectively placing crucial decisions about a person’s freedom into the hands of the companies that make these assessment tools.

Some see risk assessment tools as more impartial than judges because they make determinations using algorithms. But that assumption ignores the fact that algorithms, when not carefully calibrated, can produce the same sort of discriminatory outcomes as existing systems that rely on human judgment, and can even make new, unexpected errors. We doubt these algorithmic tools are ready for prime time, and the state of California should not have embraced their use before establishing ways to scrutinize them for bias, fairness, and accuracy.

In July, EFF joined more than a hundred advocacy groups in urging jurisdictions in California and across the country that already use these algorithmic tools to stop until they have considered the many risks and consequences of their use.

To continue reading this essay by Hayley Tsukayama and Jamie Williams, go to: https://www.eff.org/deeplinks/...iteria-it-needs-stay
