The maker of a powerful facial-recognition tool says hundreds of law-enforcement agencies are using its technology to blaze new trails in the fight against crime. What could possibly go wrong?
Quite a bit, apparently. The New York Times’s report on an obscure startup called Clearview AI reads like a subplot from a Batman flick.
Here’s how Clearview works
Clearview says its app stands out because it allows users to search through a database of 3B+ publicly available images, far more than the FBI or other agencies have on file.
Take a picture of a person and upload it, and the app links the image to other photos of the same person, scraped from sites like Facebook and YouTube.
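Clearview hasn't published how its matching works, but tools like this typically convert each face into a numeric "embedding" and then look for nearby embeddings in the database. Here's a minimal, purely illustrative sketch of that lookup step (toy 4-number embeddings and made-up URLs; real systems use learned embeddings with hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def find_matches(query, database, urls, threshold=0.8):
    """Return the source URLs whose face embeddings resemble the query."""
    return [url for emb, url in zip(database, urls)
            if cosine_similarity(query, emb) >= threshold]

# Toy "scraped" database: three faces, each reduced to a 4-number embedding.
database = [
    [1.0, 0.0, 0.0, 0.0],
    [0.9, 0.1, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
]
urls = ["site-a/photo1", "site-b/photo2", "site-c/photo3"]

# Embedding computed from the uploaded photo.
query = [1.0, 0.05, 0.0, 0.0]
print(find_matches(query, database, urls))
# → ['site-a/photo1', 'site-b/photo2']
```

The `threshold` is the crux: set it too low and strangers with similar features "match" (the false positives discussed below); set it too high and real matches are missed.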
The company’s founder even acknowledged creating a prototype that would allow the app to be paired with augmented-reality glasses, to potentially identify people in real time. (When the Times asked about it, he said the company had no plans to release said prototype.)
Police officials say Clearview has helped them identify suspects, sometimes in a matter of seconds. But there are reasons to believe the technology isn’t all it’s cracked up to be.
For starters, there’s no independent way to verify that it’s accurate…
Studies have shown that facial-recognition algorithms often misidentify women and people of color, leading to false positives.
Security is another worry. In China, information on thousands of schoolchildren was stored in an unprotected facial-recognition database, The Wall Street Journal reported last week.
Some lawmakers are responding by proposing bills that would regulate law enforcement’s use of facial-recognition tech, or in some cases, ban it altogether.
Those bans might not really work
Bruce Schneier, an expert on privacy, says outright prohibitions on facial-recognition tech won’t be enough.
That’s because your face isn’t the only thing that can identify you — so can your gait, your heartbeat, or the digital fingerprint of your cellphone.
Until we “decide how much we as a society want to be spied on by governments and corporations,” he writes, banning any single technology misses the point.