Democrats Question Justice Department on Use of Predictive Policing Algorithms
They also called for more comprehensive checks on the use of such technologies.
Eight Democratic lawmakers raised concerns about the deployment of algorithms that automate policing decisions in a letter sent to the Justice Department on Thursday.
“We ask DOJ to help ensure that any predictive policing algorithms in use are fully documented, subjected to ongoing, independent audits by experts, and made to provide a system of due process for those impacted,” the lawmakers wrote to Attorney General Merrick Garland. “If DOJ cannot ensure this, DOJ should halt any funding it is providing to develop and deploy these unproven tools.”
Reps. Yvette D. Clarke, D-N.Y., and Sheila Jackson Lee, D-Texas, and Sens. Ron Wyden, D-Ore., Elizabeth Warren, D-Mass., Edward Markey, D-Mass., Jeff Merkley, D-Ore., Alex Padilla, D-Calif., and Raphael Warnock, D-Ga., signed the letter.
Predictive policing involves law enforcement officials using mathematical and predictive analytics, along with other technology-based techniques, to forecast potential crimes. In their letter, the lawmakers said such methods are primarily used in two ways: to predict locations where crimes could occur within a particular window, or to predict which individuals might be involved in future illegal acts. The algorithms draw on historical crime data, and at times other inputs such as weather patterns or gunfire detection, to produce these forecasts.
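For illustration only, a minimal sketch of the place-based approach the letter describes might look like the following. The grid cells, incident records, and decay rate here are hypothetical, and real systems are far more complex; the sketch shows only the basic idea of ranking locations by time-weighted historical incident counts.

```python
from collections import defaultdict
from math import exp

# Hypothetical historical incidents: (grid_cell, days_ago).
# Real systems ingest years of records plus other feeds (e.g., weather,
# gunfire detection); this toy example uses location and recency only.
incidents = [
    ("cell_A", 2), ("cell_A", 5), ("cell_A", 40),
    ("cell_B", 1), ("cell_B", 3), ("cell_B", 4), ("cell_B", 7),
    ("cell_C", 90),
]

DECAY_PER_DAY = 0.05  # assumed rate: recent incidents count more than old ones


def hotspot_scores(events, decay=DECAY_PER_DAY):
    """Score each grid cell by exponentially time-weighted incident counts."""
    scores = defaultdict(float)
    for cell, days_ago in events:
        scores[cell] += exp(-decay * days_ago)
    return scores


# Rank cells to produce a "forecast" of where crime is deemed most likely next.
ranking = sorted(hotspot_scores(incidents).items(), key=lambda kv: -kv[1])
for cell, score in ranking:
    print(f"{cell}: {score:.2f}")
```

Even in a toy model like this, the concern the lawmakers raise is visible: the forecast simply amplifies whatever patterns exist in the historical data it is fed, so biased input data yields biased output.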
“But, when predictive policing systems have been exposed to scrutiny, auditors have found major problems with their effectiveness and reliability,” they wrote.
The lawmakers pointed to specific reviews that sparked concern, as well as a police department's 2020 strategic plan that mentioned implementing such technologies with Justice Department funds. They also referenced a recent study that found nine of 13 assessed law enforcement departments used what's deemed "dirty data," or information collected through illegal policing practices, to inform the algorithms behind this sort of work.
“When datasets filled with inaccuracies influenced by historical and systemic biases are used without corrections, these algorithms end up perpetuating such biases and facilitate discriminatory policing against marginalized groups, especially Black Americans,” the lawmakers said.
They requested a range of detailed information from the department, including whether officials have analyzed whether the technology's use complies with relevant civil rights laws; the name of each jurisdiction that has operated predictive policing algorithms funded by the agency, along with the software used; a detailed annual accounting of all federal funding DOJ distributed for developing and implementing predictive policing algorithms at the federal, state and local levels for fiscal years 2010 through 2020; and more.
“Does the DOJ provide guidance to agencies and departments using these tools on best practices for data sharing, legal discovery and evidentiary obligations?” they also asked.
The lawmakers requested that the department respond to their inquiries by May 28.