A rare show of bipartisan unity broke out in Washington Wednesday as Republicans and Democrats on the House Oversight Committee expressed concerns over the rapid spread of facial recognition software used by technology companies.
“I don’t want to see an authoritarian surveillance state, whether it’s run by a government or whether it’s run by five corporations,” Rep. Alexandria Ocasio-Cortez, D-N.Y., said in reference to Amazon, Microsoft, Apple, Google and Facebook.
A panel of five witnesses, made up of former law enforcement and legal officials as well as facial recognition experts, testified at the hearing, with most calling for Congress to take quick action to regulate the technology. They found a receptive audience.
“You’ve hit the sweet spot that brings progressives and conservatives together,” Rep. Mark Meadows, R-N.C., said. “When you have a diverse group on this committee, as diverse as you might see on the polar ends, I’m here to tell you we’re serious about this, and let’s get together and work on legislation.”
On the other side of the country, however, Amazon shareholders rejected a proposal that would have urged the company to ensure its technology doesn’t violate civil rights.
Amazon, Microsoft, IBM and the Chinese company Face++ have all developed facial recognition software that can sift through massive amounts of photos and video to identify those pictured. Amazon Rekognition is currently being piloted by the FBI as a law enforcement tool. The company is also marketing the software to ICE as a way to monitor immigrants, according to a proposal from Amazon shareholders.
Rep. Jim Jordan, R-Ohio, said the lack of standards for government agencies using recognition software reminded him of George Orwell’s dystopian novel “1984.”
“It seems to me it’s time for a time-out,” Jordan said. “It doesn’t matter what side of the political spectrum you’re on. This should concern us all.”
More than half of Americans are unwittingly part of a facial recognition system, as at least 18 states have agreements with the FBI to share their databases. In each of these cases, state legislatures did not vote to approve the partnerships, Neema Singh Guliani, senior legislative counsel for the American Civil Liberties Union, said at the hearing.
The images used in the FBI’s database often come in the form of driver’s license photos, but citizens aren’t informed that they are being included in any database, said Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology, who also testified at the hearing.
The FBI has used facial recognition software for years but has not disclosed the scope of its surveillance technologies, Garvie said. At least a quarter of law enforcement agencies in the United States use recognition software, she said. That is a conservative estimate because many agencies will only disclose information after receiving a Freedom of Information Act request, she added.
The Government Accountability Office sent a letter to the Department of Justice in April that said the FBI had not made progress on recommendations issued in 2016 that would improve privacy, accuracy and security.
Joy Buolamwini, a researcher at the Massachusetts Institute of Technology, testified Wednesday that Amazon’s software misidentifies dark-skinned women as men 31% of the time. White men in her sample were not misidentified, she said.
She and other witnesses called for a moratorium on the use of facial recognition technology by government agencies until regulations are passed.
“Even if you make accurate facial recognition systems, they can and will be abused without regulations,” Buolamwini said.
In 2018, the ACLU tested Amazon Rekognition and found that 28 members of Congress were misidentified as people who had committed a crime. Three of them currently serve on the Oversight Committee.
Committee Chair Rep. Elijah Cummings, D-Md., said the issue is personal for him after the ACLU found in 2016 that the Baltimore Police Department used facial recognition software to identify and arrest people protesting the death of Freddie Gray. Cummings was one of those protesters.
Facebook, Twitter and Instagram have provided user data to a developer that police departments then used to monitor protests, according to the ACLU of Northern California.
Imperfect facial recognition technology can lead to false arrests, Buolamwini said, citing the arrest of a Brown University senior who had been incorrectly identified as one of the suspects in the Easter bombings in Sri Lanka.
Witnesses at the hearing said they were unsure if the FBI and local agencies were using recognition technology with real-time video feeds. Chicago and Detroit have both purchased the technology, Garvie said. Chicago claims not to use it, while Detroit monitors video streams at public places like gas stations, churches and schools, she said.
Andrew Ferguson, a professor of law at the University of the District of Columbia, said at the hearing that waiting for the Supreme Court to rule on the constitutionality of recognition technology would take too long to address a rapidly evolving field.
“Unregulated facial recognition technology should not be allowed to continue,” Ferguson said. “It is too powerful, too chilling and too undermining to principles of privacy, liberty and security.”
Several members of Congress seemed eager to start work on legislation to regulate recognition technology. Jordan said the issue was of “paramount importance,” and Democratic lawmakers echoed his urgency.
“We have to make sure how American values and our constitutional rights and our protections get translated in the digital age,” Ocasio-Cortez said at the end of the hearing. “We gotta get something done.”