Equality watchdog calls for facial recognition to be halted

Ryan Daws is a senior editor at TechForge Media, with a seasoned background spanning over a decade in tech journalism. His expertise lies in identifying the latest technological trends, dissecting complex topics, and weaving compelling narratives around the most cutting-edge developments. His articles and interviews with leading industry figures have gained him recognition as a key influencer by organisations such as Onalytica. Publications under his stewardship have since gained recognition from leading analyst houses like Forrester for their performance. Find him on X (@gadget_ry) or Mastodon (@gadgetry@techhub.social)


The Equality and Human Rights Commission (EHRC) has called for the public use of facial recognition to be halted.

Concerns have been raised about facial recognition’s potential to automate racial discrimination and hinder freedom of expression.

The UK, the second most surveilled nation after China, has been at the forefront of testing facial recognition systems in the West. Police in London and South Wales have tested facial recognition in stadiums, arenas, and shopping centres.

Facial recognition trials in the UK have so far been strikingly unsuccessful. An initial trial, at the 2016 Notting Hill Carnival, failed to identify a single person. A follow-up trial the following year produced no legitimate matches but 35 false positives.

An independent report into the Met Police’s facial recognition trials, conducted by Professor Peter Fussey and Dr Daragh Murray last year, concluded that the technology was verifiably accurate in just 19 percent of cases.

Last month, Met Police Chief Commissioner Cressida Dick dismissed critics of law enforcement using facial recognition systems as being “highly inaccurate or highly ill-informed.”

The EHRC wants public use of facial recognition to be halted until the technology and its impact have been independently scrutinised and the laws governing its use are improved. However, last September, the high court in Cardiff ruled that the police’s use of automatic facial recognition to find people in crowds is lawful.

In a report to the UN on civil and political rights in the UK, the EHRC said: “Evidence indicates many AFR algorithms disproportionately misidentify black people and women and therefore operate in a potentially discriminatory manner … Such technologies may replicate and magnify patterns of discrimination in policing and have a chilling effect on freedom of association and expression.”

The calls by the EHRC, which echo those from organisations such as Amnesty International and the ACLU, put more pressure on law enforcement to halt their trials.

