
Democrats press Justice Department on police use of facial recognition

A sign marks an entrance to the Robert F. Kennedy Department of Justice Building in Washington Jan. 23, 2023.

Democratic senators pressed the Department of Justice (DOJ) on police use of facial recognition tools in a letter sent Thursday.

“In recent years, facial recognition and other biometric technologies have become widely used in law enforcement,” the letter to Attorney General Merrick Garland reads. “However, these technologies can be unreliable and inaccurate, especially with respect to race and ethnicity.”

The letter’s signatories include Sens. Raphael Warnock (D-Ga.), Dick Durbin (D-Ill.), Chris Van Hollen (D-Md.), John Fetterman (D-Pa.), Ben Cardin (D-Md.), Peter Welch (D-Vt.), Jeff Merkley (D-Ore.), Tina Smith (D-Minn.), Laphonza Butler (D-Calif.), Elizabeth Warren (D-Mass.), Alex Padilla (D-Calif.), Brian Schatz (D-Hawaii), Ed Markey (D-Mass.), Cory Booker (D-N.J.), Ron Wyden (D-Ore.), Bernie Sanders (I-Vt.), Gary Peters (D-Mich.) and Mark Kelly (D-Ariz.).

The senators note cases of Black people being arrested “based on little or nothing more than an incorrect facial recognition match” and voice concern that the use of such biometric technologies could violate Title VI of the Civil Rights Act of 1964.

“The law prohibits intentional discrimination as well as discriminatory effects,” the letter continues. “Title VI thus restricts the ability of grant recipients funded by agencies like DOJ to deploy programs or technologies that may result in discrimination.”

The letter also asks the DOJ to answer, by late February, a set of questions, including ones about training on the use of facial recognition technology and compliance with civil rights laws.

“We are deeply concerned that facial recognition technology may reinforce racial bias in our criminal justice system and contribute to arrests based on faulty evidence,” the letter reads.

It continues, “Errors in facial recognition technology can upend the lives of American citizens. Should evidence demonstrate that errors systematically discriminate against communities of color, then funding these technologies could facilitate violations of federal civil rights laws.”