In recent years, authorities have been rolling out new kinds of facial recognition, promising that it can spot dangerous people in real time.
But privacy activists and others have warned that the technology is a vast invasion of privacy, could be used to create watchlists of people, and might falsely accuse people because of racial and other biases and unfair practices.
There is still time for authorities to change course and avoid the vast dangers the technology could produce, the head of the data watchdog has warned.
“We’re at a crossroads right now, we in the UK and other countries around the world see the deployment of live facial recognition and I think it’s still at an early enough stage that it’s not too late to put the genie back in the bottle,” Commissioner Elizabeth Denham told the PA news agency.
“When sensitive personal data is collected on a mass scale without people’s knowledge, choice or control, the impacts could be significant.
“We should be able to take our children to a leisure complex, visit a shopping centre or tour a city to see the sights without having our biometric data collected and analysed with every step we take.”
In the future, it is possible that CCTV cameras could be overlaid with live facial recognition systems and even combined with social media data, she warned.
The Information Commissioner published an opinion report detailing the extensive concerns and legal issues that organisations should be aware of before using the technology to automatically collect biometric data in public spaces, both publicly and privately owned, such as parks and shops.
However, it does not focus on law enforcement usage.
Six investigations found uses such as generating biometric profiles to target people with personalised advertising, but none of the organisations could fully justify their use of the technology or demonstrate full compliance with data protection law, the Information Commissioner’s Office (ICO) said.
Consequently, all of the organisations in question chose to stop, or not proceed with, the use of live facial recognition.
It comes after a civil rights campaigner took South Wales Police to court last year, arguing the use of similar automatic facial recognition (AFR) had caused him “distress”.
Ed Bridges had his face scanned while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018.
Court of Appeal judges ruled the use of the technology was unlawful and allowed Mr Bridges’ appeal on three out of five grounds he raised in his case.
The ruling does not prevent the force from using the technology but means it will have to make changes to the systems and policies it uses.
“It is not my role to endorse or ban a technology but, while this technology is developing and not widely deployed, we have an opportunity to ensure it does not expand without due regard for data protection,” the Information Commissioner added.
“Companies need to ask themselves whether the use of live facial recognition is necessary; is it proportionate to actually meet the purpose; are there other ways to achieve the purpose that are not so very intrusive?”
Additional reporting by Press Association