Facial recognition tools under fresh scrutiny amid police protests

Nationwide protests against police brutality are renewing scrutiny of facial recognition technology, prompting tech giants like Amazon and IBM to scale back their sales of the software to law enforcement at the state and local level.

The criticism of the programs is also reigniting congressional efforts to craft federal regulations for the technology.

IBM was the first major company to make a splash on the issue, announcing in a letter to Congress last week that it will end its facial recognition business entirely.

CEO Arvind Krishna said the decision was made in part due to concerns from activists and civil rights groups that law enforcement may be using the technology to identify individuals participating in the demonstrations that have erupted across the nation following the police killing of George Floyd.

Amazon followed IBM’s lead a few days later, although the company made a much more limited commitment, saying that for the next 12 months its facial recognition technology, known as Rekognition, will not be sold to police.

Critics, however, have pointed out that Amazon did not address its sale of the technology to Immigration and Customs Enforcement and has actively expanded partnerships between its video doorbell system Ring and police since Floyd’s killing.

Microsoft announced Thursday that it will maintain its ban on selling facial recognition tools to police departments until there is a federal law governing the technology. Amazon also suggested it’s hopeful that its one-year moratorium will give Congress enough time to implement “appropriate rules.”

While the protests have been an “inflection point,” Meredith Whittaker, the co-director of the AI Now Institute, told The Hill that pressure for companies to change their facial recognition policies has been building for some time.

“The moments like this don’t happen without a lot of ongoing work,” she told The Hill in an interview Tuesday, pointing to work by MIT researchers Joy Buolamwini and Deborah Raji as well as the American Civil Liberties Union (ACLU) that found persistent biases in the controversial technology against people of color and women.

Despite increasing bipartisan criticism of facial recognition technology, there is no federal law spelling out how, when or where such technology can be used.

In the House, legislation is most likely to emanate from the Oversight and Reform Committee.

The committee has held multiple hearings on facial recognition, first led by the late Rep. Elijah Cummings (D-Md.), an early critic of the software, and this year it appeared close to a bipartisan consensus on declaring a federal moratorium.

Both Chairwoman Carolyn Maloney (D-N.Y.) and ranking member Jim Jordan (R-Ohio) said in February that legislation was being drafted to freeze federal use to allow more time for research around the topic.

While that effort was derailed in recent months — by the coronavirus and some reshuffling of committee members — negotiations have restarted in recent days, Rep. Jimmy Gomez (D-Calif.) told The Hill in an interview Tuesday.

There is now “broad consensus” on the committee for some sort of moratorium, he said, and “now we need to find a consensus on the solution.”

Gomez, a member of the Oversight and Reform Committee who has been a vocal critic of facial recognition since being misidentified as a criminal in an ACLU study, said concerns about the technology being used on protesters have pushed lawmakers to draft legislation.

“It’s always been a concern of ours, on how facial recognition is being used especially when it comes to people exercising their First Amendment rights,” he said. “It’s something that was deeply concerning before these protests and it’s deeply concerning now.”

Maloney told The Hill on Tuesday that the committee plans to introduce a facial recognition bill in the “coming weeks.”

Apart from a moratorium on federal use, Gomez said preventing police misuse of facial recognition is a focus of his. While the broader Democratic proposal on police reform, the Justice in Policing Act, addresses the issue, the California lawmaker said it alone would not be enough.

“This issue is so big, one piece of legislation is not going to solve it,” he said. “We have to tackle it from multiple directions.”

Beyond the Oversight and Reform Committee, Rep. Pramila Jayapal (D-Wash.), co-chair of the influential Congressional Progressive Caucus, is also drafting legislation to put a freeze on the technology, a spokesperson for the lawmaker told The Hill on Tuesday.

On the other side of the Capitol, there are multiple proposals to regulate facial recognition technology.

Democratic Sens. Cory Booker (N.J.) and Jeff Merkley (Ore.) introduced a bill that would place a federal moratorium on the tool until Congress passes a bill establishing standards for its use.

It would also prohibit state and local governments from using federal funds for the technology and create a commission to provide recommendations on future federal government use.

Merkley praised IBM’s move last week, saying the risk facial recognition poses for communities of color is a “very real concern.”

Sens. Christopher Coons (D-Del.) and Mike Lee (R-Utah) introduced a bipartisan bill this year that would require law enforcement to obtain a court order to use facial recognition software for extended surveillance. However, civil rights groups have been critical of exceptions in the legislation for “exigent circumstances” where a court order would not be needed.

The majority of the efforts on Capitol Hill have been focused on the biases in facial recognition technology.

In addition to the research on bias referenced earlier, the National Institute of Standards and Technology, a federal agency within the Commerce Department, released an expansive study in December finding that the majority of facial recognition systems have “demographic differentials” that can worsen their accuracy based on a person’s age, gender or race.

But Michael Kleinman, the director of Amnesty International’s Silicon Valley Initiative, told The Hill that unbiased facial recognition technology could just make the problems worse.

“Where this technology is used for mass surveillance, ‘solving the accuracy problem’ and improving grades does not address the impact on the right to peaceful protest,” he said. “And so, in some ways, improving accuracy may only amount to increasing surveillance and disempowerment of the already disadvantaged.”