More Bans on Government Use of Facial Recognition

It’s simple: Boston doesn’t want to use crappy technology.

Boston Police Department (BPD) Commissioner William Gross said last month that abysmal error rates – errors that mean the technology screws up most often with Asian, dark-skinned, or female faces – make Boston’s recently enacted ban on facial recognition use by city government a no-brainer.

Thus did the city become the second-largest in the world, after San Francisco, to ban use of the infamously lousy, hard-baked racist/sexist technology. The city council voted unanimously on the bill on 24 June – here’s the full text, and here’s a video of the 3.5-hour meeting that preceded the vote – and Mayor Marty Walsh signed it into law last week.

The BPD isn’t losing anything. It doesn’t even use the technology. Why? Because it doesn’t work. Make that it doesn’t work well. The “iffy” factor matters most if you’re Native American, Black, Asian, or female, given high error rates for everyone but the mostly white males who created the algorithms it runs on.

According to a landmark federal study released by the National Institute of Standards and Technology (NIST) in December 2019, Asian and Black people are up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search. Commercial facial analysis systems vary widely in their accuracy, but overall, Native Americans had the highest false-positive rate of all ethnicities.
The faces of Black women were often falsely identified in the type of search in which police compare an image against thousands or millions of others in hopes of hitting a match for a suspect. According to an MIT study from 2018, the darker the skin, the higher the error rates. For the darkest-skinned women, two commercial facial-analysis systems had an error rate of nearly 35%, while two systems got it wrong nearly 47% of the time… Naked Security
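For readers less familiar with how these systems are used, the search described above is a one-to-many (“1:N”) identification: a probe image is reduced to a numeric embedding and compared against a gallery of enrolled faces, and anything scoring above a similarity threshold is reported as a candidate match. The sketch below is purely illustrative – it uses made-up random embeddings and a hypothetical cosine-similarity threshold rather than any real face-recognition library – but it shows the mechanism by which a too-permissive threshold produces false positives, the kind of error the NIST and MIT studies measured.

```python
# Illustrative 1:N ("one-to-many") face search with made-up embeddings.
# No real face-recognition library is used; the embeddings and thresholds
# are hypothetical, chosen only to show how false positives arise.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_gallery(probe: np.ndarray, gallery: dict[str, np.ndarray],
                   threshold: float) -> list[tuple[str, float]]:
    """Return every enrolled identity whose similarity to the probe
    meets the threshold, sorted from strongest to weakest match."""
    hits = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: h[1], reverse=True)

rng = np.random.default_rng(0)

# Hypothetical gallery: 1,000 enrolled identities with random 128-d embeddings.
gallery = {f"person_{i}": rng.normal(size=128) for i in range(1000)}

# The probe is a noisy view of person_42 -- the "correct" answer.
probe = gallery["person_42"] + rng.normal(scale=0.6, size=128)

# A strict threshold returns few (ideally only true) matches;
# a looser one sweeps in unrelated people as false positives.
for threshold in (0.5, 0.1):
    matches = search_gallery(probe, gallery, threshold)
    print(f"threshold={threshold}: {len(matches)} candidate(s), "
          f"top: {matches[0][0] if matches else 'none'}")
```

The tradeoff is the one the studies quantify: tightening the threshold cuts false positives but also misses genuine matches, and in the research cited above that balance was markedly worse for darker-skinned and female faces.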
