
Photos ‘Cloaked’ with AI Tool Fool Facial Recognition

Ubiquitous facial recognition is a serious threat to privacy. The idea that the photos we share are being collected by companies to train algorithms that are sold commercially is worrying. Anyone can buy these tools, snap a photo of a stranger, and find out who they are in seconds. But researchers have come up with a clever way to help combat this problem.

The solution is a tool named Fawkes, created by scientists at the University of Chicago’s SAND Lab. Named after the Guy Fawkes masks donned by revolutionaries in the V for Vendetta comic book and film, Fawkes uses artificial intelligence to subtly and almost imperceptibly alter your photos in order to trick facial recognition systems.

The way the software works is a little complex. Running your photos through Fawkes doesn’t exactly make you invisible to facial recognition. Instead, the software makes subtle changes to your photos so that any algorithm scanning those images in the future sees you as a different person altogether. Essentially, running Fawkes on your photos is like adding an invisible mask to your selfies.
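As a rough illustration of that idea, the sketch below adds a small, bounded perturbation to a toy image so that its features drift toward a decoy identity while every pixel changes by at most a tiny budget. This is not the actual Fawkes algorithm (which attacks real deep face-recognition feature extractors); the random linear "embedding" and all names here are stand-ins invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a face-recognition feature extractor (NOT the real
# Fawkes model): a fixed random linear projection from pixels to features.
W = rng.normal(size=(128, 64 * 64))

def embed(img):
    """Map a 64x64 grayscale image to a 128-dim feature vector."""
    return W @ img.ravel()

def cloak(img, decoy, eps=0.03, steps=50, lr=0.01):
    """Nudge `img` so its embedding moves toward `decoy`'s embedding,
    while clipping each pixel change to +/- eps (the 'invisible mask')."""
    target = embed(decoy)
    delta = np.zeros_like(img)
    for _ in range(steps):
        # Gradient direction of ||embed(img + delta) - target||^2 w.r.t. delta
        diff = embed(img + delta) - target
        grad = (W.T @ diff).reshape(img.shape)
        delta -= lr * grad / (np.linalg.norm(grad) + 1e-12)
        delta = np.clip(delta, -eps, eps)  # keep the change imperceptible
    return np.clip(img + delta, 0.0, 1.0)

original = rng.random((64, 64))  # your photo (toy grayscale image)
decoy = rng.random((64, 64))     # someone else's photo

cloaked = cloak(original, decoy)

# The picture itself barely changes...
max_pixel_change = np.abs(cloaked - original).max()
# ...but its feature embedding has drifted toward the decoy identity.
d_before = np.linalg.norm(embed(original) - embed(decoy))
d_after = np.linalg.norm(embed(cloaked) - embed(decoy))
print(max_pixel_change, d_before, d_after)
```

A model trained on many such cloaked photos would associate your name with features that lean toward someone else, which is the "wrong thing" the researchers describe the corrupted model learning.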

Scientists call this process “cloaking” and it’s intended to corrupt the resource facial recognition systems need to function: databases of faces scraped from social media. Facial recognition firm Clearview AI, for example, claims to have collected some three billion images of faces from sites like Facebook, YouTube, and Venmo, which it uses to identify strangers. But if the photos you share online have been run through Fawkes, say the researchers, then the face the algorithms know won’t actually be your own.


According to the team from the University of Chicago, Fawkes is 100 percent successful against state-of-the-art facial recognition services from Microsoft (Azure Face), Amazon (Rekognition), and Face++ by Chinese tech giant Megvii.

“What we are doing is using the cloaked photo in essence like a Trojan Horse, to corrupt unauthorized models to learn the wrong thing about what makes you look like you and not someone else,” Ben Zhao, a professor of computer science at the University of Chicago who helped create the Fawkes software, told The Verge. “Once the corruption happens, you are continuously protected no matter where you go or are seen.”

The group behind the work — Shawn Shan, Emily Wenger, Jiayun Zhang, Huiying Li, Haitao Zheng, and Ben Y. Zhao — published a paper on the algorithm earlier this year. But late last month they also released Fawkes as free software for Windows and macOS that anyone can download and use. To date, they say, it’s been downloaded more than 100,000 times. (The Verge)
