Clearview AI Demonstrates the Dangers of Facial Recognition
Facial recognition technologies (FRTs) are in the news again. This time, it is Clearview AI, a small company that until recently was virtually unknown to everyone except the 600 law enforcement agencies using its technology to match people’s photos to their online presence.
Clearview AI has scraped millions of websites – including news, business, education, and employment sites, social networks, and even the Venmo digital payment service – and built a database of three billion images. According to The New York Times, a user of its facial recognition software can take a photo of a person, and the system will match it against the database and report where else that person has appeared on the web.
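The Times does not describe Clearview's internals, but systems of this kind typically work by converting each face into a numeric "embedding" vector and then searching a database for the nearest vectors. The sketch below is purely illustrative of that general pattern, not Clearview's actual method; the embed_face function is a hypothetical stand-in for a real face-embedding model.

```python
import numpy as np

def embed_face(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a real face-embedding model (normally a
    deep neural network). Here we just hash the pixels into a deterministic
    unit vector so the example runs end to end."""
    rng = np.random.default_rng(abs(hash(image.tobytes())) % (2**32))
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)

class FaceIndex:
    """Toy in-memory index mapping face embeddings to the source URLs
    where the faces were found (e.g., scraped web pages)."""
    def __init__(self):
        self.vectors: list[np.ndarray] = []
        self.urls: list[str] = []

    def add(self, image: np.ndarray, url: str) -> None:
        """Store the embedding of a scraped face along with its source URL."""
        self.vectors.append(embed_face(image))
        self.urls.append(url)

    def search(self, image: np.ndarray, threshold: float = 0.9):
        """Return (url, similarity) pairs whose stored face embedding is
        close to the query face. Unit-length vectors make the dot product
        equal to cosine similarity."""
        q = embed_face(image)
        sims = np.stack(self.vectors) @ q
        return [(self.urls[i], float(s))
                for i, s in enumerate(sims) if s >= threshold]

# Usage: index scraped photos, then query with a photo taken of a person.
index = FaceIndex()
photo = np.zeros((64, 64), dtype=np.uint8)          # placeholder image
index.add(photo, "https://example.com/profile")     # hypothetical URL
print(index.search(photo))                           # -> match at similarity 1.0
```

In a production system, the embedding would come from a network trained so that different photos of the same person land close together, and the linear scan would be replaced by an approximate nearest-neighbor index capable of handling billions of images.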
The app is being used or piloted by 600 law enforcement agencies in the US and Canada, “ranging from local cops in Florida to the FBI and the Department of Homeland Security.” Reportedly, it has helped solve “shoplifting, identity theft, credit card fraud, murder and child sexual exploitation cases.”
In one reported case, solved in just 20 minutes, the police identified the perpetrator by matching his face in a bystander’s video recording of the crime to another video on social media whose caption included the attacker’s name.
The technology seems good enough to work even with less-than-perfect pictures: it can recognize people whose facial features are obscured by hats or glasses, faces shown in profile, and photos offering only a partial view of the face.
Our Take
While it is reassuring that this technology has been helpful in solving crimes, it is alarming to think what else it could be used for.
So far, Clearview AI is used only by law enforcement agencies. But, as The New York Times asks, what is stopping the company from making it available to just anyone? And what inventive applications might people come up with? How about using GANs (generative adversarial networks) to create pornographic deepfakes and deploying them to degrade women?
Imagine your name, home address, family, friends, likes and dislikes, and anything else you have ever shared or posted on the web becoming instantly available to anyone, anywhere, through a picture taken of you without your permission, perhaps without you even being aware of it. “It would herald the end of public anonymity,” writes The New York Times.
I also like this quote, which neatly summarizes the dangers of using technologies such as FRTs in an uncontrolled manner: “It’s creepy what they’re doing, but there will be many more of these companies. There is no monopoly on math,” said Al Gidari, a privacy professor at Stanford Law School. “Absent a very strong federal privacy law, we’re all screwed.”
There is a reason why FRTs are banned in San Francisco, why the US Senate is considering a bill to limit the use of facial recognition technology by federal agencies, and why the EU is considering banning FRTs for up to five years (to give regulators time to work out how to protect us, the public).
It is not acceptable to unleash technologies without thinking through the consequences. It is up to us, the business leaders, to ensure that we do no harm to the people who use our technologies, to the communities we live in, to our societal structures, and to the environment.
Want to Know More?
To learn more about how technologies can violate basic human rights, read our note Amnesty International Calls Google and Facebook a Threat to Human Rights and the Amnesty International report “Surveillance Giants: How the Business Model of Google and Facebook Threatens Human Rights.”
We discuss the two-faced nature of AI in our note Google and IBM Are Calling for AI Regulation. (This is not specific to AI, by the way: all technologies can be used for good or ill, but we disagree with those who claim that technologies themselves are neutral.)
To learn more about the harms you can unleash on your consumers, employees, and partners and how to prevent those harms, consult Info-Tech’s blueprint Mitigate Machine Bias.