Police accused of deploying facial recognition 'by stealth' in London

Fresh trial in Stratford results in no one being stopped and zero arrests

Lizzie Dearden
Home Affairs Correspondent
Friday 27 July 2018 19:28 BST
A public information poster displayed during a Metropolitan Police trial of automatic facial recognition in Stratford, east London, on 26 July

A new trial of controversial facial recognition technology in east London has failed to result in any arrests, as police face legal action and accusations of rolling out surveillance cameras by “stealth”.

Scotland Yard pledged that the operation in Stratford would be carried out overtly but monitors said few people were aware that they were being scanned on Thursday.

Cameras positioned on a bridge near the Westfield shopping centre recorded thousands of passersby over eight hours, looking for potential matches with a database of wanted violent criminals.

The Metropolitan Police said no one was stopped or arrested as a result of the operation, which is the second of 10 facial recognition trials to be carried out by the end of the year.

It came a day after Green Party politician Baroness Jones and the campaign group Big Brother Watch launched a legal challenge over the “Orwellian” technology at the High Court.

They claim the software, which was found to be returning “false positives” in 98 per cent of alerts earlier this year, is “dangerously inaccurate” and violates human rights.

Scotland Yard pledged that members of the public would be informed using leaflets and posters in Stratford, but monitors said people were not being properly warned before walking into the scanning zone.

The same criticism was levelled during a previous trial at the same location in June, which also resulted in zero arrests.

Police are trialling controversial facial recognition technology in Stratford

Sian Berry, a London Assembly member, said she walked past posters herself because they were not fully visible through throngs of shoppers.

“Officers were not proactively giving leaflets out, and they weren’t always handing them to people who asked them questions either,” she told The Independent.

“Lots of people didn’t seem to be noticing the posters at all… from a basic data protection point of view they’re not informing people well about what data they’re collecting or what is going on.”

The first half of the trial saw officers in uniforms positioned on the bridge to answer any questions, but in the afternoon they were replaced by plain-clothed colleagues.

Ms Berry, who sits on London’s police and crime committee, said it was unclear whether Scotland Yard was testing facial recognition as a deterrent against violent crime, or to see whether the software works on a real-life sample of people.

“I’m not at all happy,” the Green Party politician added. “This seems like a PR exercise – they are trying to get people used to the technology before rolling it out ahead of any legal basis.

“A lot of the goals of the day seemed to be reassuring the public this technology isn’t sinister but there needs to be an objective analysis.”

Activists from Big Brother Watch, who were giving out their own leaflets detailing human rights concerns over automatic facial recognition, said members of the public were “stunned” to learn they were being scanned.

Members of Big Brother Watch hand out leaflets during a Metropolitan Police trial of facial recognition in Stratford, London, on 26 July

“I don’t think anyone knew what was going on,” said Silkie Carlo, director of the campaign group.

“Technically police can say this is overt, but our experience was that people didn’t know it was going on and didn’t even know what automatic facial recognition is as a concept.

“The total absence of a public debate and information about the technology means a poster will not suffice.”

Ms Carlo accused police forces of rolling it out “by stealth” ahead of regulations being put in place, adding: “Oversight panels have been ignored, commissioners have been ignored, human rights bodies have been ignored.”

The Metropolitan Police failed to implement all recommendations made by the London Policing Ethics Panel before launching the new trial.

Last week, the body warned that there was a “lack of clarity about the legal basis for the use of the technology and its regulation” and called for Scotland Yard to publish its view on its legality before any further trials.

It said information should be found “quickly and easily” on the force’s website and told police to inform Londoners of the reason for trials, and consult them on where they will be carried out.

Commander Ivan Balhatchet said a “comprehensive legal framework” would be published on the force’s website within the next fortnight alongside other information on facial recognition.

Police said anyone refusing to be scanned would not be viewed with suspicion, and that recordings of potential matches are retained for only 30 days for “technical assessment” before being deleted.

When an alert is generated by a central computer, it is sent to both an operations room and officers on the ground, who judge whether to intercept the individual for questioning before they leave the area.

“In some instances, officers will be guided by the operations room to carry out further checks on a potential match, in other more fast-moving instances officers on the ground may engage with an individual straight away,” a Scotland Yard spokesperson said.

Asked why plain-clothed officers were used in an overt trial, the force said a “range of tactics” and technology was being tested, adding: “It is important to test a range of methodologies during the trial period.

“Officers proactively distributed leaflets throughout the deployment in Stratford and continually engaged with members of the public to inform them about the operation.

“After each trial a de-briefing is held, and all feedback regarding the deployment is taken on board.”

Opponents argue that the software currently being used by the Met and other British police forces is “staggeringly inaccurate” and has a chilling effect on society, while supporters see it as a powerful public protection tool with the ability to help track terrorists, wanted criminals and vulnerable people.

The information commissioner threatened legal action over facial recognition in May, calling it “intrusive” and demanding answers to questions over transparency, accuracy, bias, effectiveness and a lack of national coordination.

Liberty is backing a separate attempted challenge against South Wales Police by a Cardiff resident who believes his face was scanned at a peaceful anti-arms protest and while doing his Christmas shopping.

The government has announced the creation of a new oversight and advisory board for facial recognition in law enforcement, which could be expanded to ports, airports, custody suites and police mobile devices.

The Home Office’s strategy on biometrics, which also include fingerprints and DNA, said the board would make recommendations on policy changes and oversight arrangements for the technology.
