The efforts of retail giant Bunnings to address violence and aggression in its stores using facial recognition technology (FRT) were described by the Privacy Commissioner as “well-intentioned”. But the Commissioner also found that what the hardware chain did was not justified, even if it was helpful or convenient.
Privacy experts say the recent ruling is significant and will make those deploying FRT more cautious about how they use it.
Between November 2018 and November 2021, Bunnings conducted a trial of FRT at 63 stores in Victoria and New South Wales. The company says there were strict controls, and the only intention was to keep staff and customers safe and to prevent unlawful activity.
In her ruling, Privacy Commissioner Carly Kind acknowledged the potential of FRT to protect against issues such as crime and violence. “However, any possible benefits need to be weighed against the impact on privacy rights, as well as our collective values as a society,” she said.
“In this instance, deploying facial recognition technology was the most intrusive option, disproportionately interfering with the privacy of everyone who entered its stores, not just high-risk individuals.”
Jodie Siganto is the founder and director of Privacy 108, a privacy and cybersecurity consulting and law firm. She says she was pleasantly surprised by the decision.
“I think there are some really important principles that are now going to be … considered in more detail, which will be fantastic.
“One of the biggest problems for privacy law in Australia has been the lack of jurisprudence around some of the key concepts and ideas.”
Bunnings has already signalled its intention to seek a review of the determination. Managing Director Mike Schneider says the trial demonstrated that the technology created a safer environment and reduced incidents and theft.
“We believe that customer privacy was not at risk. The electronic data was never used for marketing purposes or to track customer behaviour. Unless matched against a specific database of people known to, or banned from stores for abusive, violent behaviour or criminal conduct, the electronic data of the vast majority of people was processed and deleted in 0.00417 seconds – less than the blink of an eye,” says Schneider.
But Jodie Siganto says the ruling reflects a changing attitude towards CCTV, one that has been evident in Europe for some time.
“[T]hey have recognised that CCTV is like a surveillance-type technology and when you add the facial recognition aspect to it, which just automates that ability to identify particular individuals in large groups, it makes it a more targeted and pervasive kind of technology that can be used for all sorts of things.”
She says the decision will also make people more aware of the technology’s capabilities.
“I know lots of people say, ‘no, I understand it and it’s all good,’ but I think it’s good for us to at least have a conversation about what are some of the potential harms of having this kind of pervasive surveillance in somewhere like Bunnings,” says Siganto.
The other important aspect of the ruling, according to Siganto, is that it introduces the concepts of necessity and proportionality to the use of facial recognition technology.
“You have to look at the alternatives and consider whether or not they would have been as effective and balance that up against the impact on the individual’s rights and freedoms.”