Stop funding predictive policing tech without ‘evidence standards,’ lawmakers tell DOJ
Seven Democrats are urging Attorney General Merrick Garland to cease grant funding for predictive policing systems until DOJ ensures they are not having “a discriminatory impact.”
Democrats in both chambers of Congress are calling for the Department of Justice to halt grant funding for predictive policing systems until the agency ensures “that grant recipients will not use such systems in ways that have a discriminatory impact.”
In a Jan. 29 letter to Attorney General Merrick Garland, seven lawmakers — including Rep. Yvette Clarke, D-N.Y., and Sens. Ron Wyden, D-Ore., Jeff Merkley, D-Ore., Alex Padilla, D-Calif., Peter Welch, D-Vt., John Fetterman, D-Pa., and Edward Markey, D-Mass. — wrote that “mounting evidence indicates that predictive policing technologies do not reduce crime” and instead “worsen the unequal treatment of Americans of color by law enforcement.”
Predictive policing systems use historical data to determine where crimes are most likely to occur, often concentrating police presence in specific neighborhoods as a result. But the lawmakers said the data used to power the underlying algorithms in these technologies is “distorted by falsified crime reports and disproportionate arrests of people of color.”
“As a result, they are prone to over-predicting crime rates in Black and Latino neighborhoods while under-predicting crime in white neighborhoods,” the lawmakers wrote. “The continued use of such systems creates a dangerous feedback loop: biased predictions are used to justify disproportionate stops and arrests in minority neighborhoods, which further biases statistics on where crimes are happening.”
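The feedback loop the lawmakers describe can be illustrated with a toy simulation. This is an assumption-laden sketch, not a model of any actual predictive policing product: it imagines two neighborhoods with identical true incident rates, a patrol budget allocated in proportion to previously recorded crime, and recorded incidents that scale with patrol presence rather than with the true rate.

```python
# Toy illustration of the feedback loop: two neighborhoods with the SAME
# true incident rate, but a historical record already skewed toward
# neighborhood 0. Patrols are allocated based on recorded crime, and
# recorded crime grows where patrols are sent.
TRUE_RATE = 0.10        # identical true incident rate in both neighborhoods
PATROLS = 100           # patrol units allocated each round
recorded = [60.0, 40.0] # historical record, already skewed toward neighborhood 0

for _ in range(20):
    total = sum(recorded)
    # "Prediction": allocate patrols proportional to recorded crime
    alloc = [PATROLS * r / total for r in recorded]
    for i in range(2):
        # Observed incidents scale with patrol presence, not the true rate,
        # so the skewed record reinforces itself each round
        recorded[i] += alloc[i] * TRUE_RATE

share = recorded[0] / sum(recorded)
print(f"recorded-crime share of neighborhood 0 after 20 rounds: {share:.2f}")
# → 0.60: the 60/40 skew persists indefinitely, never converging toward
#   the 50/50 split the identical true rates would warrant
```

Even in this deliberately simple model, the initial distortion in the data is self-perpetuating: the system keeps directing patrols, and therefore recorded incidents, toward the neighborhood the historical record already flagged.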
The letter comes after Clarke, Wyden and other lawmakers pressed DOJ in an April 2021 letter to discuss its funding and oversight of predictive policing programs across the country. In response to the inquiry, DOJ told the members of Congress that it did not know how much funding — awarded through its Edward Byrne Memorial Justice Assistance Grant Program — had been directed to state, local and tribal law enforcement agencies to purchase predictive policing systems.
The lawmakers said in Monday’s letter that DOJ must “periodically review whether grant recipients are complying” with Title VI of the Civil Rights Act of 1964 to ensure that its programs do not “discriminate on the basis of race, ethnicity or national origin, even unintentionally.”
President Joe Biden’s October 2023 executive order on artificial intelligence also requested, in part, that DOJ submit a report to the White House within one year on the use of AI in the criminal justice system, including when it comes to “crime forecasting and predictive policing” and “the ingestion of historical crime data into AI systems to predict high-density ‘hot spots.’”
The lawmakers asked that the White House-mandated report “assess the accuracy and precision of predictive policing models across protected classes, their interpretability and their validity, including any limits on assessing their risks posed by a lack of transparency from the companies developing them.”
The letter also recommended that DOJ determine “whether and how law enforcement agencies can use these technologies to enhance public safety without having discriminatory impacts.”
If DOJ continues to fund state, local and tribal grants for predictive policing systems after completing its report, then the lawmakers said the agency should “establish evidence standards for assessing whether using a particular predictive policing product in a particular way would have an unacceptable risk of discriminatory impact.”