When, in 2003, novelist William Gibson was famously quoted in The Economist as saying, “The future is already here—it’s just not evenly distributed,” he could easily have had the uneven development of CCTV surveillance in his crosshairs.
Like the prophetic writer H. G. Wells before him, Gibson, who coined the term “cyberspace” as early as 1982, has a forensic eye for trends. As his statement implies, there is a gap between the application of technologies such as CCTV surveillance and feature recognition (FR) and what many see as the tortoise-like regulations meant to govern their use.
Interactive Round Table
The long reach of technology versus the perceived lag of regulation formed the basis of a 2018 interactive roundtable industry summit entitled “From CCTV to Video Analytics: Developing a Framework for Facial and Feature Recognition Technology in Retailing,” sponsored by Canadian digital IP innovation specialists Genetec and held at Goodenough College in London.
Facilitated by Colin Peacock, group strategy advisor for ECR Community, the session asked leading UK retail delegates to spell out their own aspirations for and concerns about a feature recognition technology roll-out across retail.
The delegates were asked to list their questions at the beginning of the day, with a view to the roundtable discussions providing answers during the close-of-play feedback. Questions included, “To what extent do existing regulatory codes cover emerging technological changes?” and “How can retailers look beyond profitability to ensure that they remain legal in their quest for the ultimate customer experience?”
The purpose of the day was to understand how retailers using the technologies can do so without breaching these codes or generating highly negative media coverage, as happened with RFID in the early 2000s. Quite apart from the immaturity of the emerging technology and cost issues that made a clear ROI harder to demonstrate, the rollout of RFID was smothered by a privacy backlash, particularly in the United States, and the technology has only now re-emerged as a potent solution.
Furthermore, how can regulatory bodies work more closely with retailers to develop robust new codes of practice that can support the ever-changing technological landscape?
While we accept the use of CCTV surveillance as common practice and part of the everyday street furniture of our society, oversight of its use is still largely a matter for voluntary codes of practice, now more formalized in the UK with the establishment of the Surveillance Camera Commissioner. Retailers are increasingly looking to go beyond traditional CCTV for LP and security use and into the realm of data analytics in order to truly understand their customers’ journeys. Questions of greater interest to the retailer than simply, “Are they going to steal from me?” include: How many people came into the store? What were they looking at, and for how long? What do customers look like, and what kind of mood are they in (an indicator of their propensity to spend more money)?
Colin Peacock put it succinctly: “Bricks-and-mortar retailers are trying to understand their shoppers in order to catch up with the almost perfect visibility that online retailers have to hand.”
According to Professor Adrian Beck of the Department of Criminology at the University of Leicester, a leading academic in the field of loss prevention and the origins of CCTV, and one of the keynote speakers on the day, “The key differences to consider here are consent and perceived value. When you shop online, you consent to giving your information—how else are you going to receive and pay for your goods electronically? It has to be thought about transactionally—what’s in it for the parties concerned to share or give data? It could be argued that when a customer’s data is collected automatically and without perceived benefit to themselves, in the high street, then privacy becomes an issue.”
Pushing the Boundaries
Genetec’s former retail strategist, Carl Boutet, set the scene of great technological strides, driven by companies pushing boundaries both geographical and economic, against the backdrop of the General Data Protection Regulation (GDPR), due to come into force in May 2018.
Boutet, whose focus was bridging the physical and digital divide, described his work as looking at the “analytics-fueled intersection of algorithms and outcomes.”
“China is the harbinger of the future,” Boutet told the audience of more than sixty retail delegates. “Half of its population is under surveillance, and technology such as feature recognition is being used for a myriad of purposes, from gaining access to theme parks to the dispensing of toilet paper.”
Commenting on the younger generation’s ease with the technology and their comfort with sharing private information in return for a better shopping experience, Boutet added: “It would appear that the benefits outweigh the privacy issues, so how can we use ‘scary’ technology for good?”
Boutet said that algorithm-based technology has already been embraced by many householders in the form of electronic personal assistants such as Amazon’s Echo and Google’s Home, all to make their domestic lives easier.
In China, Suning, the country’s third-largest retailer, uses feature recognition to link customers to their bank accounts as they walk into the store, allowing an almost seamless customer journey.
This led Boutet to pose the question, “Is convenience the driver for the consumer rather than the need to protect liberty and our own personal privacy?”
As with RFID in its infancy, there is also technological hype around true capability, partly because of the difficulty of getting feature recognition to work in complex and challenging environments where lighting is poor or fields of view are restricted. A video shown to delegates highlighted a facial recognition system in a crowded canteen becoming overwhelmed by the volume of footfall. This also proved an issue for the Amazon Go concept store, although the tech giant ironed out these problems ahead of the first store’s opening in Seattle in January 2018.
Boutet described this as “no technological walk in the park” and went on to describe a trial at Dubai Airport where feature recognition will ostensibly be used as a security measure: passengers pass rapidly through a media-rich advertising tunnel that encourages them to look all around so that cameras can capture every aspect of their faces.
“This ‘face trap’ concept could help solve the problem of getting a good field of vision and allows passengers through security quickly, for commercial reasons. The tech knows that your levels of stress influence your appetite for transactions at the airport, which has the potential to make a huge percentage of its income from terminal sales.
“Therefore, it is likely that people like LP professionals will become purveyors of market intelligence in an age where machine learning is making great strides,” he said.
But he cautioned, “There is also a general concern about the security of security. The availability of all this technology and data increases the need to properly protect against vulnerabilities.”
Privacy Concerns
Professor Beck provocatively asked delegates, “What do we mean by privacy in a social media age when we share what we had for breakfast?
“During the 1990s, when companies were looking to roll out RFID, a libertarian group in the US called CASPIAN created a sense of fear around privacy issues to the point that retailers steered clear of the technology. Now this is regarded as a non-event.
“The public in the UK have shown far less concern about the widespread use of a range of surveillance technologies in public spaces, something that has not been seen in other European countries. The boundaries of acceptability have changed over time. In the 1990s, there was concern about CCTV. In the early noughties, there was concern over RFID, but for both of these technologies, concerns about privacy have largely evaporated as public attitudes have changed.
“Arguably, FR sits on the new frontier of privacy acceptability, and so users need to be aware that its use could cause concern in the short and medium term, but over time, this may dissipate as opinion levels and tolerances change.”
Another speaker, Andrew Charlesworth, reader in information technology law at the University of Bristol, said that it was a legitimate question to ask, “Why am I being watched?”
He continued, “There is a danger that technology is being rolled out without public involvement or consultation, which could result in a backlash and ultimately a waste of money.
“There is a public perception that it’s all about crime, but this is no longer the case. There is also a fear of bad practice as the rapid roll-out of IP camera technology causes more issues, because digital technology is by its nature borderless and stored in the cloud, which begs the question, ‘Can it be hacked or used for denial of service (DoS)?’ If you disregard people’s views and treat them as ‘pathetic dots,’ you will get pushback.”
He said casinos highlighted the benefits of consent and of taking the public with you when they openly used facial recognition to identify problem gamblers. The technology, he added, could also be used positively with age-restricted sales, part of a process Charlesworth refers to as “innovation with care” that involves and engages the customer.
Worries over GDPR Compliance
The outcome of the event was not a message to retailers to “shut up shop” on facial recognition; rather, it highlighted the desire for a roadmap of how to align the technology with the new GDPR, if that is possible without stifling innovation. Compliance is the end game, but there is still a debate to be had about how regulation and best practice can more effectively keep pace with emerging fields of technology.
Until recently, CCTV surveillance and facial recognition technology were tools for crime prevention, but data analytics and the information that can be mined from customers have changed the landscape. LP professionals are now gatekeepers of intelligence that can enhance the bricks-and-mortar customer experience in an era of mushrooming online sales.
Existing in-store signage advising customers that CCTV is in operation fails to drill deeper and explain exactly what the footage is being used for. Customers assume it is for crime prevention, but would it be acceptable for retailers to admit that, as is the case with new feature recognition technology, they are harvesting marketing material to better sell merchandise to you?
The UK’s 1998 Data Protection Act (DPA) and the GDPR act as a form of conscience for big business chasing the big bucks by tapping into more and more of the data that we as individuals seem happy to disclose through our social media profiles, for example. The more of our information companies hold, the more exposed they are to fines and brand damage in cases of international data breaches.
More worryingly, a survey conducted in December 2017 found that while 54 percent of UK businesses expected a data breach in the next twelve months, only 48 percent of respondents believed that their companies were financially prepared to cover the fines, which under GDPR can reach €20 million or 4 percent of global annual turnover, whichever is greater.
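By way of a purely illustrative sketch of how that cap works (the turnover figure used below is invented for the example and is not drawn from the survey), the maximum exposure is simply the greater of €20 million and 4 percent of worldwide annual turnover:

```python
# Illustrative sketch only: the GDPR cap on the most serious administrative
# fines is the greater of EUR 20 million or 4% of worldwide annual turnover.

def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Return the theoretical maximum fine for a given annual turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# Hypothetical retailer with EUR 2 billion turnover (figure invented for illustration).
print(f"Maximum exposure: EUR {gdpr_max_fine(2_000_000_000):,.0f}")  # EUR 80,000,000
```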
The jury is out as to whether the questions raised at the beginning of the workshop were adequately answered; the regulators would conclude that, for many retailers, compliance with current rules is still an aspiration. So in terms of facial recognition, the maxim must be “proceed with caution,” and, as Andrew Charlesworth suggested, the technology must be introduced with care and consent so that developments are willingly accepted rather than surveillance creep having to be defended after the fact.
There is also the argument over whether regulators really understand what the public wants. Would they win a court case against the use of feature recognition if the retailer were able to demonstrate that its installation had reduced incidents of violence? In the 1970s, a Woolworths store manager was murdered, and the perpetrator escaped. The store was widely accused of a dereliction of duty because it did not have CCTV surveillance. The pendulum of public opinion swings both ways on this issue, but the clock ticks on, and technology will continue to develop well beyond the introduction of GDPR.
For more on the regulation of feature recognition technology, check out the full article, which was originally published in LP Magazine Europe in 2018. This excerpt was updated February 27, 2019.