Facial recognition company scraped billions of photos to help the cops
A Facebook photo from the end of college could come back to haunt you.
A New York Times deep dive into a facial recognition AI tool sold to law enforcement agencies revealed that the company behind it has amassed more than three billion images, scraped from all corners of the internet, from social media sites to companies’ “About Us” pages. That’s far more than the typical police database, or even the FBI’s.
As the Saturday feature explained, Clearview AI, a small startup with initial funding from Facebook board member Peter Thiel, is building out its facial recognition database for police departments and other customers with images pulled from Facebook, YouTube, Venmo, and more, despite those platforms’ prohibitions against exactly this kind of scraping.
The company’s founder told the NYT that “Facebook knows” about the image scraping. Sites like Twitter don’t allow crawlers to take media posted on their platforms, but the scraping continues because privacy law around AI tools is loose and inconsistent. San Francisco has banned facial recognition use by its own local government, but there’s no such restriction at the federal level.
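For context, the “crawlers” at issue are automated scrapers, and the prohibitions platforms point to are usually spelled out in their terms of service and in a robots.txt file. The sketch below is purely illustrative (it is not Clearview’s code, and the bot name and URL are assumptions) and just shows how a compliant crawler would check those robots.txt rules before downloading anything.

```python
# Illustrative sketch only: how a well-behaved crawler checks robots.txt
# before fetching media. "ExampleImageBot" and the URLs are assumptions.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://twitter.com/robots.txt")
rp.read()  # download and parse the site's crawler rules

# A compliant scraper skips any path the site disallows for its user agent.
allowed = rp.can_fetch("ExampleImageBot", "https://twitter.com/some_user/photo")
print("Fetch permitted by robots.txt:", allowed)
```

Of course, robots.txt is advisory rather than enforceable on its own, which is part of why the article points to the patchwork of privacy laws as the real gap.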
I’ve had two demos of Clearview & the results were frightening/stunning in their accuracy. Both demonstrations involved me giving blurry screenshots of a video & Clearview was able to identify both people (friends who had consented) even though they barely have a presence online https://t.co/53YULsJWj6
— Yashar Ali (@yashar) January 18, 2020
American Civil Liberties Union (ACLU) staff attorney Nathan Freed Wessler spoke to that lax regulation. He wrote in an email statement, “This is a disturbing demonstration of the dangerous reality of face recognition technology today, and the urgent need for lawmakers to immediately halt law enforcement use of it.”
Wessler went on, “Police should not be able to investigate and prosecute us by secretly running an error-prone and unregulated technology provided by an untested startup that has shadily assembled a database of billions of face scans of everyday Americans.”
Clearview claims it only uses publicly available images (including profile info that was once public, even if you’ve since changed your settings), so according to the company, it’s all good. Facebook now says it’s going to look into the service and how it uses Facebook data.
The reporter used the tool on herself, and it pulled up photos of her from the past 10 years. Even with her face mostly obstructed, the scanner found seven matching images from around the internet. Police have used faces that appeared in the background of gym selfies and other random places to identify suspects or victims. Fun times.
While law enforcement agencies are currently the main customers, Clearview has plans to work with AR glasses and wearables. With just a glance, your identity, personal information, and life history could be pulled up and exposed. Now you definitely regret that college party photo on Facebook.