
Corporations are refusing to abandon intrusive facial recognition in public – so let’s beat the system instead

Standing up to invasions of privacy, such as plans to use the technology on a 67-acre site in King's Cross, may not be in their commercial interests, but they won't be able to ignore public pressure for long


Mike Harris
Thursday 15 August 2019 13:22 BST
Police are trialling controversial facial recognition technology in Stratford

One of the most sinister but overlooked elements of parliament’s Digital, Culture, Media and Sport select committee inquiry into fake news was the use of machine learning by Canadian firm AggregateIQ to process images of people. This software could scan photographs and potentially match them to Facebook profiles. Across the internet, it is possible right now that your face is being used to match you across different platforms in different situations to build a picture of who you are. Our faces give away a lot about us: our ethnicity, our gender, our age, our weight. More importantly, they can be cross-referenced quickly against our social media profiles to discover who we are. Your social media profile is then a gateway to your likes, your mood and your location.

This is why we should be so concerned about the use of facial recognition technology by private developer Argent in their 67-acre development in the heart of King’s Cross. It isn’t clear why Argent requires such controversial technology at all, but right now in central London, tens of thousands of unsuspecting citizens are having their faces scanned by a property developer.

Innocent shoppers and tourists exploring the beautiful new shopping plaza at Coal Drops Yard may also be subjected to intrusive facial scanning, linking their faces to databases without their consent. Facial recognition takes the principle of innocent until proven guilty and turns it on its head. The police need “reasonable suspicion” to stop and search you (which hasn’t prevented this power being abused), but facial recognition means your face will be scanned regardless of whether you have done anything to warrant this.

The new world of facial recognition looks grim. The technology is being piloted by China as part of its authoritarian project of total thought control. Researchers Xiaolin Wu and Xi Zhang, in their paper “Automated Inference on Criminality Using Face Images”, claim they can train algorithms to identify individuals with facial traits that suggest criminality. And those traits? “Lip curvature, eye inner corner distance, and the so-called nose-mouth angle”, they claim, tell the criminal from the non-criminal.

China is leading the world in authoritarian social control, using technology including a social credit system, which relies on facial recognition and CCTV to reduce the “social credit” of the subjects of communist rule for minor infringements of the law, infringements which include putting your feet on train seats, let alone criticism of China’s Orwellian system of government.

And let’s be blunt: this technology is deeply problematic for black people. Facial recognition startup Faception claims on its website: “We develop proprietary classifiers, each describing a certain personality type or trait such as an extrovert, a person with high IQ, professional poker player or a threats (sic)”. The notion that our faces reveal our IQ, or potential criminality, has very troubling echoes of past experimentation in this area, especially the deeply racist Victorian “science” of physiognomy. One of the leading proponents of physiognomy, Cesare Lombroso, was an intellectual inspiration to the Italian fascists and the Nazis, who took his view that the face (and therefore your race) could denote your potential criminality into their murderous programme of eugenics and genocide.

And in practice, the racial element to these technologies is real. Minutes obtained from a police committee show that the former head of facial recognition for UK policing knew that race is a live issue. At the Notting Hill carnival in 2017 and 2018 (and at a Remembrance Sunday event), facial recognition technology made over a hundred false matches but did not lead to a single arrest. Silkie Carlo, director of campaign group Big Brother Watch, who is leading the charge against this dangerous technology, said: "The police's failure to do basic accuracy testing for race speaks volumes. Their wilful blindness to the risk of racism, and the risk to Brits' rights as a whole, reflects the dangerously irresponsible way in which facial recognition has crept on to our streets."


Corporations are often the weakest link in protecting our privacy. Why would Argent, the owner of the land, stand up to police requests for facial data? In recent years we have seen how major companies such as EE, Vodafone and Three gave the police backdoor access to their subscriber databases. What guarantees do we have that Argent won’t hand over the faces of thousands of innocent people to the Metropolitan Police for racial profiling or insights on human behaviour? In a statement, Argent said: “These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public.” Yeah, right. Without legal guarantees in place, or transparency about how this database is used, how can we know what's going on?

As the Open Rights Group told me, there is a very real concern that police databases used for facial recognition services include peaceful protesters such as Extinction Rebellion or trade unionists, and that if private companies link their cameras to police databases, suddenly the police have real time knowledge of the movements of entirely innocent people.

We know that companies do respond to public pressure. If you care about freedom, tell the shops in Coal Drops Yard that you aren't happy your face is being surveilled. Choose a shopping complex that doesn't view your facial attributes as fair game. In an open society, as citizens and consumers, we can send a clear message to companies that think our privacy is a tradable commodity: we’ll go somewhere else.
