Over the years, facial recognition has gotten a bit of a bad rap. When the technology first became popular, some companies and countries developed and used the technology unethically, creating master databases without people’s consent. This ultimately led to a number of lawsuits and bans, and consumer fears of ‘Big Brother.’
But now, with a more educated consumer base and most of the controversial actors having gone under or changed practices, the technology’s true potential is becoming clear. And with a rebrand in progress—the term ‘facial recognition’ is being replaced by ‘face matching’—mass adoption may be in the near future.
The most thorough public opinion research done on face matching to date, commissioned by the Security Industry Association (SIA), showed that the majority (59 percent) of US adults are favorable toward face matching technology, and 70 percent believe the technology is accurate in identifying people of all races and ethnicities.
And while those surveyed were most supportive of face matching in airports (75 percent), a majority (51 percent) were in favor of it being used in retail stores. This growing comfort, combined with the rapid increase in organized retail crime (ORC) and violence against employees, provides a compelling argument for retailers to give the tech a try.
Balancing Safety and Privacy
Critics of face matching technology may cite concerns about racial bias—an early flaw in the technology that has since been solved, according to face matching companies and an independent researcher—or a lack of privacy.
But, as Loss Prevention Research Council Director and University of Florida Research Scientist Read Hayes, PhD, points out, privacy is in the eye of the beholder. “It’s very transactional, transitional, and situational,” Hayes explained. “Every day we do value exchanges. I can see there’s cameras in the store, but I’m still going to go in. All day every day, for convenience, entertainment, and even safety, we all trade a little privacy to buy online, go through toll areas with transponders, tweet, text, Insta, email, and more.”
As long as retailers can balance safety and privacy, consumers will likely become more and more favorable toward the use of face matching in retail stores.
Jake Parker, senior director of government relations at the SIA, has been hard at work championing face matching’s usefulness when used correctly. According to Parker, “Virtually any technology can be misused, and ethical questions typically concern the purpose and methods by which a technology is implemented versus the technology itself. Most public concerns about facial recognition software have centered on use by law enforcement, where it is used to develop identity leads using large databases available to the government.” He added, “This is fundamentally different from most commercial applications of the technology for authentication screening, but similarly, concerns about privacy or potential misuse of the technology can be addressed with the right set of use policies and safeguards.”
Following are some policy and safeguard recommendations from SIA that retailers might consider:
Transparency. Transparency is the bedrock principle governing the use of face matching technology in both commercial and government settings, according to SIA. It should be clear when and for what purpose the technology is used, as well as which processes and procedures govern the collection, processing, storage, use, and transfer of related data. Every application of the technology should be subject to a policy, set by the implementing organization, that governs how it is used.
Clear and Defined Purpose. Organizations must specifically identify their purposes for using face matching technology. They must understand the capabilities and limitations of the systems they intend to use to ensure the technology is selected and configured appropriately for that purpose. Similarity thresholds and other performance settings should be highly tailored according to the intended use.
Using Accurate Technology. Organizations must strive to use the highest performing face matching technology for a given application, with accuracy validated using sound methods, such as through the National Institute of Standards and Technology Face Recognition Vendor Test program, which is the gold standard for scientific evaluations of face matching algorithm performance, according to SIA. Both developers and end users have a responsibility to minimize any negative effects that could result from variability in technology performance through proper design, selection, and configuration of technology.
Nondiscrimination. Face matching should only be used in ways and for purposes that are nondiscriminatory. There are legitimate concerns that some applications of face matching technology might negatively impact minorities. However, the highest performing face matching technologies have been found to show undetectable differences in performance across demographic groups, and the performance of most others is far more consistent than media coverage and several nonscientific tests have suggested.
Data Security. Face matching data transmission, storage, and processing should be optimized to ensure privacy and security using encryption and other cybersecurity and privacy best practices that protect biometric data. Solutions should follow a distributed data approach by limiting biometric data stored in central repositories and storing this data in the form of encrypted digital templates rather than original images.
Privacy by Design. Privacy protections are critical in the deployment of face matching technology as, in most cases, these tools create a link between a person’s facial appearance and personally identifiable information. Face matching systems should be designed to facilitate compliance with current and emerging data privacy laws and support privacy practices and ongoing system maintenance.
Training and Education. It is critical that users of face matching technology know how to configure and maintain the technology, consistent with their policies. Additionally, sellers of this technology should provide buyers, installers, and operators with training on how to achieve the most accurate and unbiased results and commit to doing so on an ongoing basis.
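To make the “similarity threshold” recommendation above concrete, here is a minimal sketch of how a threshold governs matching. This is illustrative only, not any vendor’s implementation: the embedding values, function names, and thresholds are all hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def is_match(probe, enrolled, threshold):
    """Declare a potential match only at or above the threshold.
    A higher threshold means fewer false alerts but more misses;
    a lower one means the reverse, which is why SIA recommends
    tailoring the setting to the intended use."""
    return cosine_similarity(probe, enrolled) >= threshold

# A retail alerting use case, where a false alert is costly, might
# warrant a stricter threshold than a low-stakes application.
probe = [0.12, 0.80, 0.55]     # embedding of a face seen at the door
enrolled = [0.10, 0.82, 0.50]  # embedding from the store's watchlist
print(is_match(probe, enrolled, threshold=0.95))   # True at 0.95
print(is_match(probe, enrolled, threshold=0.999))  # False at 0.999
```

The same two vectors match or not depending purely on the configured threshold, which is the tuning decision the recommendation is pointing at.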
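The “encrypted digital templates rather than original images” pattern from the Data Security recommendation can also be sketched. This is a toy illustration under stated assumptions: the entry fields, the template derivation, and all names are hypothetical, and a real deployment would encrypt the template at rest rather than storing it in the clear.

```python
from dataclasses import dataclass

@dataclass
class WatchlistEntry:
    template: bytes  # encoded embedding; in production this would be
                     # encrypted at rest, never a raw photo
    reason: str      # e.g. "repeat offender", with incident notes
    store_id: str    # kept in the retailer's own distributed list,
                     # not a shared central repository

def enroll(image_bytes, embedding, reason, store_id, watchlist):
    """Derive a compact template from the image's embedding, store
    only the template, and discard the original photo."""
    template = bytes(int(x * 255) % 256 for x in embedding)
    watchlist.append(WatchlistEntry(template, reason, store_id))
    # image_bytes is deliberately not retained anywhere
    return watchlist[-1]

store_watchlist = []
enroll(b"<jpeg bytes>", [0.1, 0.9, 0.4], "repeat offender", "store-42", store_watchlist)
print(len(store_watchlist), store_watchlist[0].store_id)  # 1 store-42
```

The design choice the recommendation describes is visible in the data structure itself: the entry holds a derived template and metadata, and there is simply no field in which an original image could live.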
“Increasingly, technology providers are helping ensure their customers are implementing the technology ethically through careful customer screening and selection, terms enforced through end user license agreements, software features like encryption and use audit processes, as well as providing guidance on crafting use policies and even offering staff training,” Parker added.
Help from Ethical Solution Provider Partners
“We partner with our clients to balance safety and privacy,” said Dara Riordan, president of face matching company FaceFirst. “People deserve privacy, and we do offer that to the general public. They also want to be safe. Employees and customers deserve to know when a person who is a known threat has entered a store. They want their families safe.”
Riordan emphasized that privacy, policy, ethics, and transparency should be at the forefront of everything ethical face matching companies do when deploying the technology. Early on, some companies operated without these values, giving facial recognition a bad name and forcing companies like FaceFirst to educate consumers about how technologies like theirs really work.
“Many people hear the words ‘facial recognition’ and think the system knows everyone who walks in, or that it’s matching them against a master database,” Riordan said. “That’s not the case, of course. Our clients’ private databases are their own. They are only populated with individuals who are known to have caused loss or disruption to the business. They are not shared. Our clients have crafted appropriate use programs and oversight policies in partnership with us, and they have been hugely successful in reducing theft and mitigating violence.”
One Retailer’s Success Story
Retailers on average experienced a nearly 27 percent increase in ORC incidents last year, according to a 2022 report from the National Retail Federation, and eight in ten businesses surveyed experienced an increase in associated violence. Because of this, retailers’ interest in face matching technology is increasing, with more than 12 percent of respondents reporting they had implemented or were planning to implement face matching.
The director of loss prevention at one of these retailers spoke anonymously to LPM for this story, detailing the immense success they’ve had with the technology.
The company started researching face matching around five years ago and met with different providers, but the technology was too expensive at the time. On top of that, the vendors didn’t seem to understand the challenges of the retail LP space, since they were mostly working with government entities and airports. So, they put their consideration of the technology on pause until early 2019, when they saw an opportunity with a face matching solution that checked their boxes. They felt the technology was impressive and accurate, and started thinking about how they could adopt it in their own stores.
“We knew [face matching] had some risk and negative perceptions and that it could be a challenge, so part of our job in LP was to present to our executives how we were going to use the tech and mitigate the risk of exposure,” the LP executive said. “The tool was sold as a theft apprehension tool—which it is—but we took a different spin on it and rolled it out as a customer service and theft prevention tool, not an apprehension tool. We knew that our managers are busy with a lot going on, and this is an amazing tool to give them insight on when a repeat offender walks into the store. There’s a lot of exposure and confrontation that could happen if we used it as an apprehension tool. So, we involved our store management team and trained them very specifically on how to use this tech to minimize physical confrontations and still got the results we needed by providing excellent customer service. That’s how we rolled it out on day one, and three years later, it hasn’t just been a home run—it’s been a grand slam for us.”
When a repeat offender enters the store, the manager receives an alert, and they’re able to approach the customer with the goal of offering excellent customer service, rather than apprehending them. This process was enough to help the retailer stop a shocking 90 percent of their repeat offenders.
“The first day we turned the tech on, we were training the store managers and the training stopped because we received alerts on three ORC boosters and stopped all three of them with great customer service—and we only had a few hundred faces in our database to start with,” the executive said. “So, we knew on day one it was going to be a game changer, and we quickly rolled it out in another fifty stores and started to see win after win.”
In the beginning, they would stop the repeat offender and provide great customer service, but then the offender would leave and drive to another of the retailer’s locations. So, they started rolling out the technology in geographic clusters, and now they’re live in 100 locations, with hopes to add another cluster of stores as soon as possible.
“The wins are so big that when we have something happen in a store that does not have the tech, we really feel the difference,” the executive said. “Even some of our store managers, if you talk to them, they say they don’t want to be transferred to a store without this technology.”
As the retailer deployed face matching in more stores, they saw more and more value. It was clearly a great tool for fighting ORC and store theft—not only for putting cases together and saving time and labor, but also for issues like employee safety. Authorized users can enroll disruptive people in their watchlist, with notes to contact law enforcement immediately and details about violent behavior the person has displayed in the past.
On two separate occasions, this retailer has had an employee hit by a car while collecting carts in the parking lot. The last time this happened, the motorist mowed an employee down, knocking him back twenty feet, causing a concussion and cracked ribs. Using the face matching system, the LP team was able to identify the person who committed the hit and run, leading to an arrest.
In another incident, an unknown male posted a threat on social media saying he was going to come into the store and start shooting and warned anyone with children to stay away. Employees immediately saw the post because it had gone viral. The LP team ran a search in the face matching software with the photo the man had attached to his social media profile and had two matches of him in two of their stores. One of the matches was from the night before he had posted the threat, and they confirmed through surveillance video that the subject had been threatening their employees, and that he was wearing a lanyard with an ID card showing where he was employed.
LP officers immediately contacted law enforcement, who recognized him as someone who had tried to burn down the store where he worked two years earlier. The retailer immediately enrolled him in their watchlist, and when he showed up eight days later and started throwing bottles of liquor onto the floor, the system matched his face instantly, notifying store employees so police were able to arrest him; a restraining order was granted soon after.
“I spoke about this case study at a recent conference, and I think the impression I got was that many retailers are now starting to discover this tech, and they’re excited about it, as we were three years ago,” the retailer said. “Talking to other LP executives who say the technology didn’t work [when they previously tried it], they give the traditional answer that when a repeat offender comes in they call the manager and tell them to watch the person, and it’s just not very practical. Once we say to use it as a customer service tool and kill them with kindness, they say that makes so much sense. It appears some of those retailers are going to try [implementing face matching technology] again. It’s exciting to see other retailers discovering this and how valuable the tech can be for them as well. It’s all about getting past the myths.”
Despite all the back and forth as to whether face matching is ethical or not, the practice itself is really nothing new.
“When I started out as a store detective in college, we would carry around Polaroids of bad actors, and that was face matching,” Hayes said. “It’s just more automated now, but nothing new. Humans used photo technology to inform their decisions then and still do now.”
Nearly every retailer has a multifaceted loss prevention program that includes efforts to identify known ORC participants. For years this has meant relying on the organization’s case files and subject photos. However, this sort of screening has significant limitations.
“Facial recognition simply augments these existing loss prevention programs by anonymously comparing images of individuals entering a property against a typically small list and providing an alert to staff when there is a potential match,” Parker pointed out. “This is the same process that occurs when staff determine there is a potential match without using the technology, except this is more accurate, efficient, and continuously in operation. Note that in such cases the technology is not used to confirm an identity or make a decision. Whether aided by technology or not, steps taken after an individual is flagged are still left to staff to determine based on the situation.”
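The screening loop Parker describes, comparing each entrant against a small local list and surfacing a potential match as an alert for staff to evaluate, can be sketched as follows. The names, toy similarity function, and threshold are hypothetical, and note that the function only notifies; it never decides.

```python
def screen_entrant(entrant_embedding, watchlist, similarity_fn, threshold, notify):
    """Compare one entrant against a small watchlist and alert staff
    on the best potential match, if any. The system never confirms
    identity or acts on its own; staff decide the next step."""
    best_id, best_score = None, threshold
    for person_id, enrolled_embedding in watchlist.items():
        score = similarity_fn(entrant_embedding, enrolled_embedding)
        if score >= best_score:
            best_id, best_score = person_id, score
    if best_id is not None:
        notify(best_id, best_score)  # an alert for staff, not a verdict
    return best_id

# Toy similarity for illustration (real systems use learned embeddings).
def sim(a, b):
    return 1 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

alerts = []
watchlist = {"subject-17": [0.2, 0.7]}
screen_entrant([0.21, 0.69], watchlist, sim, 0.9, lambda pid, s: alerts.append(pid))
print(alerts)  # ['subject-17']
```

The loop runs continuously for every entrant, which is the efficiency gain over a manager trying to recall a binder of subject photos; everything after the alert remains a human judgment call.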
Part of the reason face matching technology is so helpful is that it can be far less biased than the humans it assists.
“All humans are biased as part of their survival instinct, and people often lack focus or can become confused in visually dense environments, whereas properly trained artificial intelligence (AI) models are providing more consistent alerts to decision makers,” Hayes said. “There is no silver bullet, and there never will be, but the stakes are high with life safety. Face matching seems to be a tool that better safeguards people because it can provide more accuracy and less bias.”
Riordan said that an attorney for one of FaceFirst’s clients told them the client had faced far more legal risk before using face matching. “They relied solely on human decision-making, and they had no notice when known threats walked in,” Riordan said. “I think he’s correct—there’s more risk and more liability for retailers who do not use proven technology when it’s available to them. Relying on human recognition alone is a known risk. When you use AI with human oversight, it’s a better option. Face matching lets you warn associates not to approach a known threat, or to call 9-1-1 if the person fits that enrollment qualification. Face matching isn’t just a loss prevention tool. While it can help reduce shrink, it’s a gamechanger as a real-time life safety tool and exceptionally powerful investigative tool.”
States and cities are starting to realize this, with some reversing facial recognition bans or restrictions they had previously implemented. The city council of New Orleans, for example, approved an ordinance in July 2022 that restores the use of facial recognition tools to aid in criminal investigations by the New Orleans Police Department, though under new guardrails, and subject to a comprehensive use policy approved by the state and federal government. Earlier this year, Virginia lawmakers replaced the state’s ban on law enforcement’s use of facial recognition technology with comprehensive rules. In West Lafayette, Indiana, city council members rejected a proposed ban of the technology for city agencies.
Now, only three jurisdictions in the US limit the use of facial recognition technology in ways that could make it difficult to implement in retail settings. While allowing the use of biometrics under certain conditions, the Biometric Information Privacy Act in Illinois creates numerous hurdles and litigation risks that discourage adoption. Portland, Oregon, has banned the use of the technology in businesses and spaces open to the public. Baltimore, Maryland, imposed a temporary ban on the use of the technology by businesses through 2022—though security systems are specifically excluded—and city officials are currently debating whether to extend that ban.
Retailers should not let concerns about looming legislative restrictions keep them from fully exploring the benefits of deploying face matching technology, though, according to Parker.
“While most legislative proposals we have seen that specifically limit facial recognition technology apply to state or local government, no jurisdiction has moved to prohibit or substantially limit commercial use of the technology this year,” Parker said.
The anonymous LP executive we interviewed for this story said that the people who initially oppose face matching technology often don’t really understand how it works and need to do more research to understand how effective it can be—especially as it relates to reducing crime and protecting employees.
“If you’re not considering the technology, you need to reconsider,” the executive said. “At least test it. Maybe it’s not for everyone, but at least try it and see if it works for you. If you aren’t trying this, you’re going to be left behind. The risk is so minimal compared to the enormous benefit you will see.”