Sainsbury’s shopper felt like a ‘criminal’ after facial recognition error
Warren Rajah was left ‘distraught’ after staff at the supermarket mistook him for an offender
A data strategist has described feeling like a “criminal” after Sainsbury’s staff mistakenly identified him as an offender using facial recognition software, leading to his removal from a store.
Warren Rajah, 42, from Elephant and Castle, south London, was shopping in his local branch on 27 January when he was approached by employees, asked to leave, and had his purchases confiscated.
A “distraught” Mr Rajah questioned the decision, with staff reportedly pointing to a sign indicating the store’s use of facial recognition technology.
It later emerged that he had been confused with another individual, who was listed as an offender in the system and was also present in the store at the time.

Sainsbury’s has since apologised to Mr Rajah, stating there was no fault with the Facewatch technology, which is currently deployed in seven of its stores.
On being misidentified, Mr Rajah told the Press Association: “You feel horrible, you feel like a criminal and you don’t even understand why.”
He added: “To tell you to leave the store without any explanation gives you the impression that you’ve done something wrong.
“If you speak to anyone in the public, that is what they will tell you: when you’ve been forced and excluded from an environment, you automatically think you’ve done something wrong, especially with security.
“That’s just a normal human response.”
Mr Rajah said that after being removed from the store he contacted Facewatch, which, once he had sent a copy of his passport and a photograph of himself, confirmed he was not on its database.
Sainsbury’s later apologised and offered him a £75 shopping voucher.
A spokesperson for the supermarket said: “We have been in contact with Mr Rajah to sincerely apologise for his experience in our Elephant and Castle store.
“This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store.”

The UK’s second largest supermarket chain has said the technology is part of its efforts to identify shoplifters and curb a sharp increase in retail crime in recent years.
Its website says that the system has a “99.98 per cent accuracy rate and every alert is reviewed by trained colleagues before any action is taken”.
Sainsbury’s said the system issues an alert based on reports of criminal behaviour submitted by the store or by other nearby retailers using Facewatch.
Mr Rajah said he now has “no interest” in shopping at Sainsbury’s and that he wants people to be aware of the use of facial recognition technology in stores.
He said: “It’s borderline fascistic as well, how can you just have something done to you and not have an understanding? How can you be excluded from a space and not have an understanding or an explanation?”
A Facewatch spokesperson said: “We’re sorry to hear about Mr Rajah’s experience and understand why it would have been upsetting. This incident arose from a case of human error in store, where a member of staff approached the wrong customer.
“Our data protection team followed the usual process to confirm his identity and verified that he was not on our database and had not been subject to any alerts generated by Facewatch.”
They added that when someone makes a subject access request, the data they provide is not stored or used for any other purpose and is deleted once the individual has proved their identity.
Jasleen Chaggar of Big Brother Watch said: “The idea that we are all just one facial recognition mistake away from being falsely accused of a crime or ejected from a store without any explanation is deeply chilling.
“To add insult to injury, innocent people seeking remedy must jump through hoops and hand over even more personal data just to discover what they’re accused of.
“In the vast majority of cases, they are offered little more than an apology when companies are finally forced to admit the tech got it wrong.”
Ms Chaggar said the organisation “regularly hears from members of the public who are left traumatised after being wrongly caught in this net of privatised biometric surveillance”.
The Information Commissioner’s Office (ICO) said: “Facial recognition technology can help retailers detect and prevent crime and has clear benefits in the public interest. However, its use must comply with data protection law.
“Retailers should carefully consider the risks of misidentification and have robust procedures in place to ensure the accuracy and integrity of the personal information they collect and process.
“This is especially important where personal information is used in situations which can have a serious impact on a person.”