It’s simple: Boston doesn’t want to use crappy technology.
Boston Police Department (BPD) Commissioner William Gross said last month that abysmal error rates – errors that mean the technology misidentifies Asian, dark-skinned and female faces far more often than others – make Boston’s recently enacted ban on facial recognition use by city government a no-brainer.
Thus did the city become the second-largest city in the world, after San Francisco, to ban use of the infamously lousy, hard-baked racist/sexist technology. The city council voted unanimously on the bill on 24 June – here’s the full text, and here’s a video of the 3.5-hour meeting that preceded the vote – and Mayor Marty Walsh signed it into law last week.
The BPD isn’t losing anything: it doesn’t even use the technology. Why? Because it doesn’t work. Make that it doesn’t work well. The “iffy” factor matters most if you’re Native American, Black, Asian or female, given high error rates for everyone but the mostly white males who created the algorithms it runs on.
According to a landmark federal study released by the National Institute of Standards and Technology (NIST) in December 2019, Asian and Black people are up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search. Commercial facial analysis systems vary widely in their accuracy, but overall, Native Americans had the highest false-positive rate of all ethnicities.
The faces of Black women were often falsely identified in the type of search wherein police compare their images with thousands or millions of others in hopes of hitting a match for a suspect. According to an MIT study from 2018, the darker the skin, the higher the error rates. For the darkest-skinned women, two commercial facial-analysis systems had error rates of nearly 35%, while two systems got it wrong nearly 47% of the time… – Naked Security
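The numbers in that kind of one-to-many search compound quickly. As a rough, hypothetical sketch (the per-comparison rates and gallery sizes below are illustrative assumptions, not figures from the NIST or MIT studies), here’s how even a small chance of a false match per comparison turns into a near-certain false hit once a probe image is run against millions of stored faces:

```python
# Minimal illustrative sketch (not from the article or the NIST/MIT studies):
# why a tiny per-comparison false-positive rate still matters when police run
# a one-to-many search against "thousands or millions" of stored faces.
# All rates and gallery sizes below are hypothetical assumptions.

def prob_of_any_false_match(per_comparison_fpr: float, gallery_size: int) -> float:
    """Chance that a probe face falsely matches at least one unrelated face,
    assuming each gallery comparison is independent."""
    return 1.0 - (1.0 - per_comparison_fpr) ** gallery_size

for fpr in (1e-5, 1e-4, 1e-3):            # hypothetical per-comparison error rates
    for gallery in (10_000, 1_000_000):   # thousands vs. millions of images
        p = prob_of_any_false_match(fpr, gallery)
        print(f"per-comparison FPR {fpr:.0e}, gallery {gallery:>9,}: "
              f"{p:6.1%} chance of at least one false hit")
```

The point of the sketch is simply scale: if the per-comparison error rate is higher for some groups than others, the gap in wrongful “hits” widens dramatically as the database grows.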