The San Mateo County Sheriff’s Office is among hundreds of public agencies that have used controversial facial recognition software, and, until now, it hasn’t bothered to mention it. Until we asked, the Sheriff’s Office had declined to answer questions about who in the department had used Clearview AI software. It wouldn’t say whether the software had been used to solve crimes or merely to identify people for the fun of it. It wouldn’t even confirm using it.
When BuzzFeed News, which first reported local use of the software, asked San Mateo County Sheriff’s spokeswoman Rosemerry Blankswade whether the department was using Clearview AI, she told the digital news organization, “No one is able to speak about these matters at this time.” Now the department acknowledges testing it thousands of times and says it would like to buy it for use going forward. Notably, the Pacifica Police Department does not appear in the BuzzFeed News data set.
Why should you care?
Clearview AI, which according to the New York Times was born over meetings at the Republican National Convention in 2016, scrapes millions of photos from social media that can then be matched to a given photo using its app. If police are looking for a mad bomber and have only a surveillance still to go on, the technology can find a match on Facebook or some other photo-sharing site and ping law enforcement with a name and a strong lead. The company says its technology is well over 90 percent accurate.
That isn’t what some public agencies have found, however. There is evidence that the technology works less well on people of color and not at all in many instances. Moreover, it could be used to identify peaceful and lawful protesters simply because police agencies want to monitor them. It could be used by private companies (Bank of America and even the NBA are said to be testing it) to follow your every move. And it likely spells the end of anonymity.
The BuzzFeed News data set identifying the San Mateo County Sheriff’s Office came from an undisclosed source, and the company declined to verify its authenticity. The vast majority of public agencies — funded with taxpayer money — declined to respond to the news site’s inquiries about the software. Dozens denied using it at all. But hundreds acknowledged its use, including the U.S. Army and Immigration and Customs Enforcement. In dozens of instances, police agencies at first denied using facial recognition software, only to tell reporters that their inquiries led command staff to learn officers were in fact using it in an unauthorized fashion. We’ve filed a public records request to find out how the software has been used here thus far.
This is a bridge too far. At a time of unprecedented distrust of police, when calls for defunding police are in the air and most people want to demilitarize their local police forces, agencies no longer have carte blanche to use whatever means they deem necessary. Perhaps more to the point, as U.S. Sen. Ron Wyden of Oregon told the digital news site, “Americans shouldn’t have to rely on BuzzFeed to learn their local law enforcement agencies were using flawed facial recognition technology.”
— Clay Lambert