
Tuesday, January 29, 2019

Artificial Intelligence Is Already Sending People To Jail — And Getting It Wrong

As machine-learning algorithms, big-data methods, and artificial intelligence enter the toolkit of U.S. law enforcement agencies, many worry that the existing biases of the criminal justice system are simply being automated – and deepened.

Police departments increasingly rely on predictive algorithms to decide where to deploy their forces, blanketing cities with a mesh of human and computerized surveillance technology including, but not limited to, data mining, facial recognition, and predictive policing programs.

This comes despite the known flaws in such tools. Facial recognition software has repeatedly misidentified darker-skinned individuals, including mistaking members of Congress for criminal suspects. In essence, racial profiling has been automated while allowing law enforcement agencies to claim that the computers are race-neutral tools.
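
To make that failure mode concrete, here is a minimal sketch, in Python, of how threshold-based face matching can confuse distinct people. Recognition systems map photos to embedding vectors and report a “match” whenever similarity to a gallery vector clears a confidence threshold; if the encoder represents one group of faces too similarly, strangers from that group clear it too. The toy encoder, the gallery, and the 0.80 threshold below are invented assumptions, not any vendor’s actual pipeline.

```python
# Toy demonstration of threshold-based face matching. Embeddings and the
# 0.80 threshold are invented; real encoders are neural networks, but the
# match logic (similarity vs. a confidence threshold) is the same shape.
import numpy as np

rng = np.random.default_rng(0)
unit = lambda v: v / np.linalg.norm(v)

# Hypothetical encoder that crowds one group of faces into a tight
# cluster: each embedding is one large shared component plus a smaller
# individual component.
shared = unit(rng.normal(size=128))
gallery = {f"mugshot_{i}": unit(shared + 0.45 * unit(rng.normal(size=128)))
           for i in range(200)}

probe = gallery["mugshot_7"]   # an idealized new photo of person 7

THRESHOLD = 0.80               # a lenient confidence setting
matches = [name for name, vec in gallery.items()
           if float(vec @ probe) >= THRESHOLD]
print(f"{len(matches)} of 200 distinct identities cleared the threshold")
```

Raising the threshold shrinks the false-match list but also misses true matches; the error rate is a tuning choice, not a fact about who is guilty.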

In Los Angeles County, for example, all 47 police agencies are plugged into a biometrics system maintained by NEC Corporation of America, which claims the capacity to hold 15 million subjects in its facial recognition platform. That system gives a powerful boost to a wide-ranging suite of technology including closed-circuit cameras, Stingray phone trackers, and crime-forecasting software – adapted from earthquake-prediction models – that flags alleged crime “hot spots” based on historical data.

And now, courtrooms are increasingly relying on criminal risk assessment algorithms, according to a new report from MIT Technology Review.

Under the guise of trimming prison populations while processing defendants efficiently, courts are assigning defendants recidivism scores that estimate how likely each person is to reoffend.

As author Karen Hao explains:

"A judge then factors that score into a myriad of decisions that can determine what type of rehabilitation services particular defendants should receive, whether they should be held in jail before trial, and how severe their sentences should be. A low score paves the way for a kinder fate. A high score does precisely the opposite.

The logic for using such algorithmic tools is that if you can accurately predict criminal behavior, you can allocate resources accordingly, whether for rehabilitation or for prison sentences. In theory, it also reduces any bias influencing the process, because judges are making decisions on the basis of data-driven recommendations and not their gut.

You may have already spotted the problem. Modern-day risk assessment tools are often driven by algorithms trained on historical crime data."
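
To see how thin the statistics behind such a score can be, here is a minimal sketch of a risk tool trained on historical records. The features, the synthetic data, and the logistic-regression model are all hypothetical stand-ins; note that the training label is re-arrest rather than re-offense, so it already encodes where police were looking.

```python
# A hypothetical recidivism scorer trained on historical records.
# Feature names, coefficients, and data are synthetic assumptions;
# no real tool's design is implied.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Synthetic "historical" records: prior arrests and the arrest rate of
# a defendant's neighborhood stand in for typical input features.
prior_arrests = rng.poisson(1.5, n)
neighborhood_arrest_rate = rng.uniform(0, 1, n)

# The label is *re-arrest*, not re-offense: it reflects where police
# concentrated attention, which is the core bias problem.
p_rearrest = (0.05 + 0.10 * prior_arrests
              + 0.30 * neighborhood_arrest_rate).clip(0, 0.9)
rearrested = rng.random(n) < p_rearrest

X = np.column_stack([prior_arrests, neighborhood_arrest_rate])
model = LogisticRegression().fit(X, rearrested)

# "Risk score" for a new defendant: 2 priors, heavily policed area.
score = model.predict_proba([[2, 0.8]])[0, 1]
print(f"predicted re-arrest probability: {score:.2f}")
```

A judge reading the single tidy number this prints sees none of the modeling choices attached to it.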

The problem is that by relying on historical crime data, such tools put members of communities historically targeted by law enforcement – such as low-income communities and national, ethnic, or religious minorities – at risk of receiving higher recidivism scores. Hao continues:
"So if you feed it historical crime data, it will pick out the patterns associated with crime. But those patterns are statistical correlations—nowhere near the same as causations. If an algorithm found, for example, that low income was correlated with high recidivism, it would leave you none the wiser about whether low income actually caused crime. But this is precisely what risk assessment tools do: they turn correlative insights into causal scoring mechanisms."
“As a result, the algorithm could amplify and perpetuate embedded biases and generate even more bias-tainted data to feed a vicious cycle,” Hao adds, noting that the proprietary nature of risk assessment algorithms renders any sort of accountability impossible.
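
That vicious cycle can be reproduced in a few lines. In the toy simulation below, with invented numbers, two districts have identical true crime rates, but one starts with more recorded arrests; because patrols follow recorded arrests and arrests are recorded only where patrols go, the recorded gap widens year after year even though the underlying behavior never differs.

```python
# Toy feedback loop: patrols follow recorded arrests, and arrests are
# recorded only where patrols go. All numbers are invented.
recorded = [60.0, 40.0]        # district 0 starts slightly over-policed
TRUE_CRIME_RATE = 0.5          # identical in both districts

for year in range(1, 6):
    hot = 0 if recorded[0] >= recorded[1] else 1
    patrols = [30, 30]         # baseline coverage everywhere
    patrols[hot] += 40         # extra patrols to the predicted "hot spot"
    for d in range(2):
        # New arrests scale with patrol presence, not with true crime.
        recorded[d] += patrols[d] * TRUE_CRIME_RATE
    share = recorded[0] / sum(recorded)
    print(f"year {year}: district 0 holds {share:.1%} of recorded arrests")
```

Run as written, district 0's share of recorded arrests climbs from 60% toward roughly 70% while true crime never changes: the amplify-and-perpetuate pattern Hao describes.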

In notes from the Data for Black Lives conference held earlier this month, Law for Black Lives executive director Marbre Stahly-Butts lays out the precise danger of new data-driven approaches to criminal justice:

"Data-driven risk assessment is a way to sanitize and legitimize oppressive systems … Demands of the community to change the system are being ‘met’ with an increased use of technology which actually lead to more over-surveillance of minority communities."

In an age when the data industry has become thoroughly financialized, Big Data-driven policing has become a lucrative business. And while the economic gulf widens between poor communities and the one percent – with schools being defunded and health care growing less accessible for the poor – the surveillance industry and the private prison industry are seeing a windfall of taxpayer dollars.


https://themindunleashed.com/2019/01/artificial-intelligence-sending-people-jail-wrong.html