Equality watchdog calls for facial recognition to be halted

The Equality and Human Rights Commission (EHRC) has called for the public use of facial recognition to be halted.

Concerns have been raised about facial recognition’s potential to automate racial discrimination and hinder freedom of expression.

The UK, the second most surveilled nation after China, has been at the forefront of testing facial recognition systems in the West. Police in London and South Wales have tested facial recognition in stadiums, arenas, and shopping centres.

Facial recognition trials in the UK have so far been nothing short of a complete failure. An initial trial at the 2016 Notting Hill Carnival failed to identify a single person. A follow-up trial the following year produced 35 false positives and no legitimate matches.

An independent report into the Met Police’s facial recognition trials, conducted last year by Professor Peter Fussey and Dr Daragh Murray, concluded that the technology was verifiably accurate in just 19 percent of cases.

Last month, Met Police Commissioner Cressida Dick dismissed critics of law enforcement’s use of facial recognition systems as “highly inaccurate or highly ill-informed.”

The EHRC wants public use of facial recognition to be halted until the technology and its impact have been independently scrutinised and the laws governing its use are improved. However, last September, the High Court in Cardiff ruled that the police’s use of automatic facial recognition to find people in crowds is lawful.

In a report to the UN on civil and political rights in the UK, the EHRC said: “Evidence indicates many AFR algorithms disproportionately misidentify black people and women and therefore operate in a potentially discriminatory manner … Such technologies may replicate and magnify patterns of discrimination in policing and have a chilling effect on freedom of association and expression.”

The EHRC’s call, which echoes those of organisations such as Amnesty International and the ACLU, puts further pressure on law enforcement to halt its trials.
