The American Civil Liberties Union (ACLU) said on Thursday that its testing showed Amazon’s facial recognition software incorrectly matched 28 of the 535 members of Congress with other individuals who had been arrested on criminal charges.
The ACLU also said Amazon’s “Rekognition” tool, which is being sold to law enforcement, was significantly less accurate on members of Congress who are people of color. The civil liberties group said 39 percent of the false matches in its test were of members of color, even though they make up only 20 percent of Congress.
“The results reinforced what we already know,” said ACLU of California attorney Jacob Snow in an interview with The Hill. “It’s flawed, and it’s dangerous for communities of color and protesters.”
The ACLU, which is pushing Amazon to stop selling the software to law enforcement agencies, said its test used well-lit headshots taken with real cameras. Because there are few regulations on facial recognition technology, officers can, and in some cases already have, run searches on lower-quality images taken on cellphones in the field at night, which critics say could produce even lower accuracy rates.
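The ACLU has not published its test code, but a comparable search can be run through Amazon’s public Rekognition API. The sketch below assumes the boto3 Python SDK and a pre-built face collection of arrest photos; the collection name and file name are hypothetical placeholders, not details from the ACLU’s test.

```python
# Sketch of a Rekognition face search resembling the test described above.
# Assumes AWS credentials are configured and that a face collection
# (hypothetically named "arrest-photos") was already populated with
# index_faces() calls over a set of arrest photos.
import boto3

rekognition = boto3.client("rekognition")

def search_headshot(headshot_path, threshold=80):
    """Search one headshot against the collection and return any
    faces Rekognition reports at or above the given confidence."""
    with open(headshot_path, "rb") as f:
        image_bytes = f.read()

    response = rekognition.search_faces_by_image(
        CollectionId="arrest-photos",      # hypothetical collection name
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=threshold,      # the confidence setting at issue
        MaxFaces=5,
    )
    return response["FaceMatches"]

# Hypothetical file name; ExternalImageId is whatever label was
# attached when each arrest photo was indexed.
for match in search_headshot("member_headshot.jpg"):
    print(match["Face"].get("ExternalImageId"), round(match["Similarity"], 1))
```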
Amazon has defended its tool, saying law enforcement should run Rekognition at a higher confidence threshold when matching human faces. The ACLU conducted its test at the 80 percent confidence setting, but an Amazon spokesperson said searches of that kind should use a 95 percent threshold.
“It is worth noting that in real-world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment (and not to make fully autonomous decisions), where it can help find lost children, restrict human trafficking, or prevent crimes,” an Amazon Web Services spokesperson said.
Snow noted that Rekognition’s confidence threshold is set to 80 percent by default and that Amazon’s own site markets the tool as working on human faces at that setting.
“Amazon makes no effort to ask users what they are using Rekognition for. Instead, the tool sets one default: the same 80 percent we used in running our test,” Snow said.
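The threshold at issue is a single parameter on the API call. A minimal sketch, again assuming boto3 and hypothetical image files, shows the difference between leaving the setting at its default, which Snow says is 80 percent, and the 95 percent level Amazon recommends:

```python
# Sketch of the confidence-threshold difference at issue.
# compare_faces() returns only matches at or above SimilarityThreshold;
# leaving the parameter out falls back to the default, which Snow says is 80.
import boto3

rekognition = boto3.client("rekognition")

def load(path):
    with open(path, "rb") as f:
        return f.read()

source = {"Bytes": load("headshot.jpg")}   # hypothetical file names
target = {"Bytes": load("mugshot.jpg")}

# Default behavior (the setting the ACLU says it used).
default_result = rekognition.compare_faces(SourceImage=source, TargetImage=target)

# The stricter setting Amazon says law enforcement should use.
strict_result = rekognition.compare_faces(
    SourceImage=source, TargetImage=target, SimilarityThreshold=95
)

print(len(default_result["FaceMatches"]), "matches at the default threshold")
print(len(strict_result["FaceMatches"]), "matches at 95 percent")
```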
“Law enforcement is not skeptical of these results and these companies are marketing their tools as advanced and accurate and powerful,” Snow added. “We know that people trust tech even when it’s flawed. That really does underscore the dangers of still using it.”
Snow’s blog post on the matter cites a case in which an elderly Black woman was held at gunpoint by police after license plate recognition software mistakenly flagged her car as stolen.
Compounding the issue, Snow said, is Amazon’s lack of transparency about how it tests its software for flaws and bias.
The civil liberties group isn’t alone in calling for a moratorium on the government’s use of facial recognition. Some 150,000 individuals, along with Amazon shareholders and Amazon employees themselves, have also voiced opposition to the company selling its facial recognition software to law enforcement.
The entire Congressional Black Caucus, as well as Reps. Emanuel Cleaver (D-Mo.) and Keith Ellison (D-Minn.), have also written letters to Amazon opposing the technology.
Others have raised concerns about the technology more broadly. In 2016, Georgetown Law’s Center on Privacy & Technology issued a scathing report on law enforcement’s use of facial recognition across the country.
In June, Brian Brackeen, the CEO of the facial recognition technology company Kairos, wrote that the technology is not yet accurate enough for use by law enforcement.