FACIAL RECOGNITION NIGHTMARE: Innocent Man SHAMED as a THIEF!

Warren Rajah’s decade-long routine of shopping at the Sainsbury’s near Elephant and Castle station was shattered on Tuesday. He was abruptly stopped by staff and a security guard, his groceries taken from him, and escorted from the store – all because a facial recognition system flagged him as a criminal.

The explanation was chillingly simple: a sign revealed the store used facial recognition technology. Rajah, 42, was bewildered, initially offering his Nectar card when staff asked for his “bar code.” The humiliation of being publicly removed from a familiar place, in front of his own community, left him reeling.

The technology, provided by Facewatch, claims 99.98% accuracy in identifying offenders. However, Rajah’s experience highlights a terrifying flaw: the potential for devastating errors. He was forced to prove his innocence by submitting a photo and a copy of his passport to be removed from the system.
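Even a headline accuracy figure like 99.98% does not mean errors are vanishingly rare in practice. The sketch below runs the base-rate arithmetic; the daily shopper volume is a hypothetical figure for illustration, not a number from the article.

```python
# Base-rate arithmetic for a facial recognition system.
# The accuracy figure comes from the article; the scan volume is assumed.
accuracy = 0.9998                 # claimed per-scan accuracy
error_rate = 1 - accuracy         # 0.02% chance of a wrong call per scan
scans_per_day = 10_000            # hypothetical footfall across busy stores

expected_false_flags = error_rate * scans_per_day
print(f"Expected false flags per day: {expected_false_flags:.1f}")  # -> 2.0
```

At retail scale, a 0.02% error rate still produces a steady stream of wrongly flagged shoppers every day, which is why the human verification step matters so much.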

30th January 2026, LONDON, UK: Warren Rajah is pictured in front of the Sainsbury’s supermarket in Elephant and Castle where he was wrongly identified by facial recognition cameras. © Justin Griffiths-Williams

Rajah described the incident as a profound invasion of privacy, questioning the scope of the data collected and its potential impact on his life. He worried about the company’s practices, wondering if crimes were recorded in real-time and if the data was shared with law enforcement.

The use of facial recognition is rapidly expanding across London, deployed by both police and retailers. While intended to proactively identify known criminals, the system relies on databases compiled by private companies – databases that don’t require proof of wrongdoing before adding a face.

This lack of verification opens the door to misidentification and public shaming, as Rajah painfully discovered. He received an apology and a £75 voucher from Sainsbury’s, but argued the issue went far beyond a simple monetary gesture, fearing the consequences for more vulnerable individuals.

Both Sainsbury’s and Facewatch attributed the error to “human intervention,” claiming staff approached the wrong person. Facewatch acknowledged the upsetting experience and confirmed Rajah was quickly removed from their database after verification.

Despite the claims of technological perfection, Rajah warns that without proper training and careful oversight, innocent people will inevitably suffer. The system, he argues, is only as reliable as the humans who operate it.

Sainsbury’s reiterated their apology, emphasizing that the incident wasn’t a fault of the technology itself, but a case of mistaken identity. The incident serves as a stark reminder of the potential pitfalls of increasingly pervasive surveillance technology.

FRIDAY 30 JANUARY: Man wrongly identified as criminal by Sainsbury's facial recognition cameras

Live Facial Recognition (LFR) works by streaming images of people to a system that compares them to a watchlist. It’s used to prevent crime, identify wanted individuals, and even establish identity when someone is unable to communicate. Often deployed at large events or in busy areas, LFR first saw use by UK police at a major sporting event in 2017.
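The matching step described above can be sketched in a few lines. This is a minimal illustration of the general technique (comparing a face embedding against a watchlist with a similarity threshold), not Facewatch's or the Met's actual implementation; the names, embeddings, and threshold are all assumptions.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_against_watchlist(live_embedding, watchlist, threshold=0.9):
    """Return the best watchlist hit above the threshold, or None."""
    best_name, best_score = None, threshold
    for name, stored_embedding in watchlist.items():
        score = cosine_similarity(live_embedding, stored_embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical watchlist with one enrolled face embedding.
watchlist = {"subject_001": [0.9, 0.1, 0.4]}

print(match_against_watchlist([0.91, 0.12, 0.38], watchlist))  # near-identical face
print(match_against_watchlist([0.10, 0.90, 0.20], watchlist))  # unrelated face
```

The key design point for a story like Rajah's: the threshold trades false positives against false negatives, and whatever the camera reports, the final call is made by a human, which is exactly where Sainsbury's and Facewatch say this error occurred.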

The Metropolitan Police currently faces legal challenges regarding its use of the technology, which relies on matching faces against criminal databases. However, private companies maintain their own databases, compiled from lists provided by retailers, without requiring any confirmation of criminal activity.