https://www.theregister.co.uk/2018/0..._justice_code/

DEF CON American police and the judiciary are increasingly relying on software to catch, prosecute and sentence criminal suspects, but the code is untested, unavailable to suspects' defense teams, and in some cases provably biased.

In a presentation at the DEF CON hacking conference in Las Vegas, delegates were given the example of the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) system, which is used by trial judges to decide sentencing times and parole guidelines.

"The company behind COMPAS acknowledges gender is a factor in its decision-making process and that, as men are more likely to be recidivists, so they are less likely to be recommended for probation," explained Jerome Greco, digital forensics staff attorney for the Legal Aid Society.

"Women [are] thus more likely to get probation, and there are higher sentences for men. We don’t know how the data is swaying it or how significant gender is. The company is hiding behind trade secrets legislation to stop the code being checked."

These so-called advanced systems are often trained on biased data sets, Greco said. Facial recognition software, for example, is frequently trained on data sets made up predominantly of white men, which academic research has shown makes it less effective at correctly matching people of color.
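That failure mode is easy to reproduce in miniature. The sketch below uses invented numbers throughout and is not any real face-matching system: it enrolls two demographic groups whose appearance varies identically but whose training data differs in size, then measures how often a fresh photo of each person is accepted as a genuine match. The under-represented group's noisier templates typically yield a visibly lower match rate (exact counts vary with the seed).

import random

random.seed(0)
DIM, NOISE, THRESH = 16, 0.35, 3.2

def photo(identity):
    # A simulated photo: the person's true feature vector plus capture noise.
    return [x + random.gauss(0, NOISE) for x in identity]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def enroll(identity, n_photos):
    # Build a matcher template by averaging n training photos.
    # Fewer photos means a noisier template.
    shots = [photo(identity) for _ in range(n_photos)]
    return [sum(s[d] for s in shots) / n_photos for d in range(DIM)]

# Group A is well represented in the training data (20 photos per person);
# group B is not (2 photos per person). True appearance varies identically.
for group, n_photos in (("A", 20), ("B", 2)):
    accepted, trials = 0, 500
    for _ in range(trials):
        person = [random.uniform(0, 1) for _ in range(DIM)]
        template = enroll(person, n_photos)
        # A genuine match attempt: a fresh photo of the same person.
        if dist2(photo(person), template) < THRESH:
            accepted += 1
    print(f"group {group}: {accepted}/{trials} genuine matches accepted")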

"Take predictive policing software, which is used to make decisions for law enforcement about where to patrol," Greco said. "If you use an algorithm based on data from decades of racist policing you get racist software. Police can say 'It's not my decision, the computer told me to do it,' and racism becomes a self-feeding circle."

...
Full article at link.

---

Innocent Until Proven Guilty
Guilty Until Proven Innocent
Guilty