'Death knell' for facial recognition as watchdog finds technology must 'significantly' improve policing

Exclusive: London mayor says technology must not 'cost our values as an open and free society' 

Lizzie Dearden
Home Affairs Correspondent
Monday 03 June 2019 08:50 BST
Police are trialling controversial facial recognition technology in Stratford

Facial recognition must not be used on the general public unless it can be proven to significantly improve policing, a watchdog has warned.

Campaigners said the London Policing Ethics Panel report could sound the “death knell” for the Metropolitan Police’s use of the “staggeringly inaccurate” technology amid high costs and poor results.

The panel said facial recognition should not be adopted unless it could be shown from the field trials that it would be able to significantly increase police efficiency and effectiveness in dealing with serious offences.

After identifying a series of ethical concerns, the body said: “Marginal benefit would not be sufficient to justify live facial recognition’s adoption in the face of the unease that it engenders in some, and hence the potential damage to policing by consent. Clearly there is no benefit to be gained from adopting an ineffective technology.”

A spokesperson for London mayor Sadiq Khan said Scotland Yard must only deploy the technology if the panel’s recommendations were met.

“The mayor understands that there are concerns around the use of facial recognition technology and welcomes the recommendations made by the independent ethics panel,” he told The Independent.

“Mr Khan’s number one priority is the safety of Londoners, but that must not come at the cost of our values as an open and free society.”

No arrests were made at some of the 10 trials carried out in the capital at a cost of more than £220,000.

In the first eight experiments carried out between 2016 and 2018, members of the public were misidentified as criminals in 96 per cent of potential “matches” flagged up by the software.
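To put that figure in concrete terms, here is a minimal arithmetic sketch. The absolute counts below are invented round numbers for illustration only; the 96 per cent rate is the figure from the trials:

```python
# Hypothetical arithmetic only: the 96 per cent figure is from the trials,
# but the absolute counts below are invented round numbers for illustration.
flagged_matches = 100                      # alerts raised by the software
misidentified = 96                         # members of the public wrongly flagged
correct = flagged_matches - misidentified  # genuine matches

precision = correct / flagged_matches
print(f"Share of alerts that were correct: {precision:.0%}")  # 4%
```

In other words, on these figures roughly one alert in 25 pointed at the right person.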

Campaign group Big Brother Watch has already launched legal action against the Metropolitan Police, arguing that its use of facial recognition breaks human rights laws.

The group’s director, Silkie Carlo, told The Independent that the new report could be a “death knell” for the controversial technology, but that it was unclear how its effectiveness would be judged.

“Police have used live facial recognition lawlessly, misidentifying people over 90 per cent of the time, with many of those being wrongly stopped and subject to further police intrusion,” she said.

“Facial recognition is ineffective at the moment and our investigations have shown that it is staggeringly inaccurate. It will likely improve over time, but at what cost? Police effectiveness is of course a good thing, but oppressive policing is not.”


Live facial recognition (LFR) compares faces caught in rolling footage of a target area to images on a watchlist compiled from police databases.
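In outline, that comparison typically works by reducing each detected face to a numerical “embedding” and measuring its similarity against the stored embeddings of people on the watchlist. The sketch below is a generic illustration under that assumption, not the Met’s actual system; the `check_against_watchlist` helper and the similarity threshold are hypothetical:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face-embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face_embedding, watchlist, threshold=0.6):
    """Return the names of watchlist entries whose stored embedding is
    close enough to the face seen in live footage to raise an alert.

    `watchlist` is a list of (name, embedding) pairs; the 0.6 threshold
    is an illustrative value, not one used in any real deployment.
    """
    alerts = []
    for name, stored_embedding in watchlist:
        if cosine_similarity(face_embedding, stored_embedding) >= threshold:
            alerts.append(name)
    return alerts
```

A key design point this sketch makes visible is that the threshold trades false alerts against missed matches: set it too low and passers-by are wrongly flagged, which is the failure mode the trial statistics describe.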

Police have conducted trials at the Notting Hill Carnival and in areas including Stratford, the West End and Watford.

They used facial recognition cameras to scan passing members of the public and flag potential matches, as numerous plainclothes police officers waited on standby to apprehend people.

The Metropolitan Police is conducting its own assessment of the trial results, while the London Policing Ethics Panel looked into the way they were carried out.

Its report raised concern about the extent to which the public were provided with information before being filmed, after The Independent found that people were unaware they were being scanned and had not been given leaflets that had been promised by police.

The panel said that although Scotland Yard claimed that anyone refusing to be scanned would not necessarily be viewed as suspicious, a man was fined £90 after covering his face and then swearing at officers who stopped him.

Amid continuing calls for national police budgets to be increased, the report said it should be considered whether resources required to adopt LFR could be better used elsewhere.

It highlighted concerns about the accuracy of the software, particularly when scanning black and ethnic minority (BAME) people, and said it must only be rolled out if data showed unacceptable gender and racial bias would not be imported into policing operations.

Police officers stand among shoppers in Stratford, East London. (The Independent)

The report said every deployment would have to be “necessary and proportionate” for specific policing purposes.

It contained a survey finding that young and BAME people were more likely to stay away from events where LFR was in use, while others said it would make them feel safer.

The ethics panel said watchlists should be limited to people who were wanted for serious offences or posed a serious threat, but Scotland Yard only restricted scans to violent offenders in later trials.

Johanna Morley, the force’s senior technologist, has subsequently admitted that funding was required to upgrade its back-end technology systems to avoid illegal images being put on the list.

A 2012 High Court ruling said police must only hold photos of subjects of interest, but two trials used lists containing people who had already been dealt with.

“There was a short time-frame between the creation of the watchlist and the deployment taking place,” a spokesperson for the Metropolitan Police told The Independent.

“This meant that on a small number of occasions, some individuals who were on the watchlist for a deployment had already been identified and dealt with by police.

“However, after receiving an alert, further checks and balances were always carried out before any police action was taken.”

As well as the legal action against Scotland Yard, South Wales Police is also awaiting a judgment on its use of facial recognition after a man scanned in Cardiff claimed his privacy and data protection rights were violated.

Megan Goulding, a lawyer at Liberty, said: “It is now for police and parliamentarians to face up to the facts: facial recognition represents an inherent risk to our rights, and has no place on our streets.”

Scotland Yard said it would take the recommendations made by the London Policing Ethics Panel seriously but any decision on the operational use of the technology would be made by the Met.

Detective Chief Superintendent Ivan Balhatchet, who has led the trials, said: “We fully accept that views vary amongst different community groups and some have concerns regarding their privacy.

“We want the public to have trust and confidence in the way we operate as a police service and we take the report’s findings seriously.

“The Met will carefully consider the contents of the report before coming to any decision on the future use of this technology.”
