Facial recognition has been used unlawfully and violated human rights, Court of Appeal rules in landmark case

Calls for ‘sinister’ technology to be banned, but police vow to continue using it

Lizzie Dearden
Home Affairs Correspondent
Tuesday 11 August 2020 11:15
A South Wales Police van mounted with facial recognition cameras

A British police force violated human rights by unlawfully using facial recognition technology, the Court of Appeal has ruled in a landmark case.

Judges had been considering two uses of the controversial technology in Cardiff, but the judgment could have an impact on its growing use in other parts of the UK.

Campaigners called for “sinister” facial recognition to be banned as a result of the ruling, but national police leaders said the judgment does not prevent the technology from being used.

A watchdog accused the home secretary of being “asleep on watch” after the Court of Appeal found that current policies failed to limit how powers can be exercised by the police.

Three senior judges said that South Wales Police had violated the right to privacy under the European Convention on Human Rights, as well as data protection laws and duties to address concerns about racial or sex discrimination.

An order published on Tuesday said two deployments in December 2017 and March 2018, as well as use “on an ongoing basis”, were “not in accordance with the law”.

The legal challenge had been dismissed by High Court judges in September, but the Court of Appeal allowed a challenge by campaigner Ed Bridges on three of five grounds.

Sir Terence Etherton, the master of the rolls, Dame Victoria Sharp, the president of the Queen’s Bench Division, and Lord Justice Singh agreed unanimously that police had been given “too broad a discretion” over the watch lists used to compare scanned faces against.

They said there had not been an adequate data protection assessment, which is required by the Data Protection Act 2018, and that the force had violated the Public Sector Equality Duty, which aims to guard against discrimination.

“There was no evidence before it that there is any reason to think that the particular AFR [automatic facial recognition] technology used in this case did have any bias on racial or gender grounds,” the judgment said.

“However, the whole purpose of the positive duty is to ensure that a public authority does not inadvertently overlook information which it should take into account.”

Ed Bridges brought a legal challenge against South Wales Police over its use of facial recognition technology

Judges urged all police forces using the “novel and controversial” technology in future to do “everything reasonable … to make sure that the software used does not have a racial or gender bias”.

Mr Bridges said his human rights had been violated by the “intrusive surveillance tool”, after he was scanned at a protest and while Christmas shopping in Cardiff.

Two arrests were made in the first deployment, and the second identified a person who made a bomb threat at the same event the previous year.

The High Court found that up to 500,000 people may have been scanned by South Wales Police as of May 2019.

Mr Bridges crowdfunded more than £9,000 for the legal battle, which the High Court said was the “first time that any court in the world had considered” automatic facial recognition.

He said South Wales Police had been using the “sinister” technology indiscriminately against thousands of people without their consent.

The 37-year-old said he was “delighted” with Tuesday’s ruling, adding: “Facial recognition clearly threatens our rights.

“This technology is an intrusive and discriminatory mass surveillance tool.”

Kit Malthouse says facial recognition will make the search for suspected criminals 'quicker and more effective'

Mr Bridges was supported by the Liberty human rights group, which called for the technology to be banned in Britain.

“This judgment is a major victory in the fight against discriminatory and oppressive facial recognition,” said Megan Goulding, a lawyer at Liberty.

“It is time for the government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it has no place on our streets.”

South Wales Police said it would “work with” the judgment but continue to use AFR Locate software, which it said had resulted in 61 arrests so far.

Chief constable Matt Jukes said: “Our policies have already evolved since the trials in 2017 and 2018 were considered by the courts, and we are now in discussions with the Home Office and surveillance camera commissioner about the further adjustments we should make and any other interventions that are required.”

South Wales Police was one of only two forces using automatic facial recognition in the UK.

In London, the Metropolitan Police has started using the technology in regular deployments despite concerns about the accuracy and lawfulness of a series of trials.

The national policing lead for facial recognition, South Wales Police’s deputy chief constable Jeremy Vaughan, said there was “nothing in the Court of Appeal judgment that fundamentally undermines the use of facial recognition”.

He added: “The whole aim of facial recognition technology is to keep the public safe and assist us in identifying offenders and protecting communities from individuals who pose a risk.”

Tony Porter, the surveillance camera commissioner, said the judgment was not “fatal” to the technology but that clear parameters needed to be set on its use and regulation.

He added: “My considered view is that the Home Office and the secretary of state [Priti Patel] have been asleep on watch and should reflect upon the comments of the court and now act in the public interest.

“I urge ministers and officials to listen to the independent regulatory voices which they have appointed to consider and advise on these matters, not ignore them.”

Mr Porter urged the government to ditch plans to “dilute” his role by merging it with fellow watchdog the biometrics commissioner, and commission an independent review of “the legal framework which governs overt state surveillance”.

A Home Office spokesperson said: “We note the outcome of this case and are carefully considering the details.

“The government is committed to empowering the police to use new technologies like facial recognition safely, within a strict legal framework.”
