Algorithms are Biased. That Might Help Regulators End Discrimination, a New Paper Argues [psmag.com]

Do the multitude of algorithms engineered to govern our livelihoods reflect the biases and prejudices of those who create them? Absolutely—but, according to a new paper, that may not ultimately be such a bad thing.

Over the past decade, a growing body of research has found that algorithms themselves can contain biases. A 2016 ProPublica analysis, for example, found that a risk-assessment algorithm used by many states in sentencing and bail decisions overestimated the likelihood of recidivism for black defendants and underestimated it for white ones. A 2013 paper found that when someone searches a name on Google, the search engine is 25 percent more likely to serve advertisements suggesting that person has an arrest record if the name is black-identifying, like DeShawn or Jermaine.

A National Bureau of Economic Research working paper, authored by renowned legal scholar Cass Sunstein along with Jon Kleinberg, Jens Ludwig, and Sendhil Mullainathan, posits that such algorithmic biases can actually help expose the intangible and often unconscious biases of their creators in measurable, and therefore legally actionable, terms. If Adam Smith's "invisible hand" guides the progress of free markets, algorithms can help make otherwise hidden individual discrimination visible in the eyes of the state.

[For more on this story by JARED KELLER, go to https://psmag.com/social-justi...n-a-new-paper-argues]

Copyright © 2023, PACEsConnection. All rights reserved.