Facial recognition before tech (Source: Good Mythical Morning, YouTube)
When Google trains its search engine algorithms, it does so on billions of queries a day.
If a facial recognition startup wants to train its algorithm, it needs… faces. One such company is called Paravision (formerly Everalbum), which created an app called Ever to let users upload and modify photos.
Paravision used these images to create a facial recognition product it then sold to law enforcement and the military, according to OneZero.
It turns out the company never had permission to do that…
… which constitutes “illegal deception,” according to a statement from the Federal Trade Commission (FTC).
The FTC ultimately forced Paravision to delete its algorithm. Per the statement, this is a “course correction” from previous FTC decisions that allowed “data protection violators” to retain the value from ill-gotten data.
This ruling may set a precedent for Big Tech
With data undergirding basically every major AI-powered service we use — from recommendation engines to driver routing to credit applications — the surface area for abuse is huge.
OneZero notes that Google was fined $170m in 2019 for collecting children's data without parental consent. The search giant was forced to delete the data but kept the algorithms and insights built from it.
The Paravision case, along with Biden's appointment of Big Tech critic Rebecca Kelly Slaughter to lead the FTC, draws a clear (and new) red line.