Austin banned facial recognition technology for good reason. Why did APD use it? | Grumet

There are rules, and then there are “rules.” The rules that people know they must follow, and the “rules” some people quietly ignore when they think they know better.

The Austin City Council set a clear policy in 2020, voting unanimously to prohibit police use of facial recognition technology to identify suspects and other members of the public. The council recognized that the software, which compares an image of a person against billions of photos scraped from databases and social media, was both imperfect and ethically fraught.

But it appears a few members of the Austin Police Department took that ban as more of a “rule,” an advisory they could ignore when an investigation was stuck.

According to recent reporting by The Washington Post, Austin police officers sidestepped the city’s ban on several occasions by sending images of unknown suspects to the Leander Police Department, which has access to facial recognition software, and asking someone there to run the scan.

“That’s him! Thank you very much,” an Austin police officer wrote after a Leander officer sent an array of photos from a facial recognition search, according to documents obtained by the Post.

The newspaper found at least 13 times that Austin police received face scan results from Leander police, based on information from sources and public record requests. It’s hard not to wonder if the number of surreptitious photo searches was even higher than that. (The Austin Police Department told me it cannot comment while it is investigating these incidents.)

Nor is there any sign those incidents fell under the one exception to Austin’s ban: Police can request the city manager’s permission to use this technology in cases of “imminent threat or danger,” with detailed disclosure to the City Council and the public.

We all want crimes to be solved, our community to be safe. But it is imperative that police follow the rules as they uphold the law — to recognize those rules are not inane technicalities but guardrails set by elected officials to protect everyone’s rights.

Consider the fact that facial recognition software has led to the wrongful arrests of at least seven people around the country — from Louisiana to Michigan to New Jersey — illustrating the fallibility of this technology and the nightmarish consequences for wrongly accused people trying to clear their names. Six of those seven wrongful arrests involved people who are Black, underscoring the worrisome reality that facial recognition programs are at greater risk of misidentifying people of color.

Yet we don’t have complete data or independent audits to verify the accuracy of this tool. Clearview AI, the program used by Leander police and many other departments, “has not publicly disclosed its rates of false positives or negatives” or submitted its algorithm for independent testing, according to a 2022 report by the nonprofit Brookings Institution.

For as little as we know about these software companies, they have a lot of information on us. Clearview AI has amassed photos from hundreds of government databases and scraped billions of images from Facebook, YouTube and other websites.

“To put this tracking in perspective, the FBI only has about 640 million photos in its databases, compared to Clearview AI’s approximately 10 billion,” the Brookings Institution report found.

New Austin police officers hold up their right hands to take their oath of service at their graduation ceremony last month. The Austin City Council in 2020 voted unanimously to prohibit the Police Department from using facial recognition technology.

A 2016 Georgetown Law study found half of all U.S. adults had photos in the facial recognition databases used by law enforcement, and 1 in 4 state and local police departments had access to this software. Surely both numbers have grown significantly since then.

Still, if you see promise in this tool for crime-fighting purposes, you’re not alone. A 2022 poll by Pew Research Center found about three-quarters of Americans believed widespread use of facial recognition technology would help police find more missing persons and solve crimes more quickly and efficiently. At the same time, 69% in that poll expected police would use the tool to track everyone’s location at all times, and 66% said police would monitor Black and Hispanic neighborhoods more than other neighborhoods.

All of which means facial recognition software should not be used without robust and well-publicized safeguards — and certainly not in plain defiance of a city ban.

Trust is established through transparency. Austin police, now investigating the troubling revelations from The Washington Post, need to provide a prompt and thorough report to the public about how often facial recognition software was used and what disciplinary actions officers will face.

Additionally, police should tell anyone arrested in these cases that the investigation involved the use of facial recognition software. A match from a facial scan isn’t sufficient proof to make an arrest — other agencies say they use such information as a tip for further investigation — but defendants have the right to know if the technology flagged other potential matches that might raise doubt about their guilt.

Obviously, I remain deeply skeptical of this technology. But if the Austin Police Department believes the software has improved, and that facial recognition is a vital tool that can be used safely, it needs to make that case. Austin should have that debate. Our elected leaders should decide whether the ban should change, and if so, what the guardrails should be.

But the reach and impact of this technology are far too powerful for a handful of police officers to quietly make the rules for themselves.

Grumet is the Statesman’s Metro columnist. Her column, ATX in Context, contains her opinions. Share yours via email at bgrumet@statesman.com or on X at @bgrumet. Find her previous work at statesman.com/opinion/columns.

This article originally appeared on Austin American-Statesman: Austin police ignored city's ban on facial recognition technology