NEWSAR
Multi-perspective news intelligence

Source · The Guardian - World News
Language · EN
Lean · Center-Left
Words · 527
Entities · 12
Thu · 2026-03-19 · 15:26 GMT
Brief · NSR-2026-0319-26095
News Report · EN · Social Justice

Essex police pause facial recognition camera use after study finds racial bias

Essex police have suspended their use of live facial recognition (LFR) technology after a study found racial bias in its accuracy. The study, commissioned by Essex police and conducted by University of Cambridge academics, found the system was more likely to correctly identify Black individuals than members of other ethnic groups, raising fairness concerns.

Robert Booth and Mark Wilding · The Guardian - World News
Filed 2026-03-19 · 15:26 GMT · Lean · Center-Left · Read · 3 min
Reading time · 3 min
Word count · 527 words
Sources cited · 4
Entities identified · 12
Quality score · 100%
§ 01

Briefing Summary

AI-generated
NEWSAR · AI

Essex police have suspended their use of live facial recognition (LFR) technology after a study found racial bias in its accuracy. The study, commissioned by Essex police and conducted by University of Cambridge academics, found the system was more likely to correctly identify Black individuals than members of other ethnic groups, raising fairness concerns. The Information Commissioner’s Office (ICO) revealed the pause and cautioned other police forces using LFR to implement mitigation strategies. The study involved actors walking past LFR cameras in Chelmsford, and revealed the system’s increased accuracy in identifying Black individuals on a watchlist. The findings come as the Home Office plans to increase the availability of LFR vans to police forces across England and Wales.

Confidence 0.90 · Sources 4 · Claims 5 · Entities 12
§ 02

Article analysis

Model · rule-based
Framing · Social Justice, Technology
Tone · Measured (AI-assessed)
Factuality · 0.80 / 1.00 · Factual
Sources cited · 4 · Well sourced
§ 03

Key claims

5 extracted
01 · Home Office said LFR cameras led to more than 1,300 arrests from Jan 2024 to Sept 2025.
statistic · The Home Office · Confidence 1.00

02 · In January, the home secretary announced the number of LFR vans would increase five-fold.
factual · Shabana Mahmood · Confidence 1.00

03 · Essex police paused live facial recognition (LFR) use after a study found racial bias.
factual · Information Commissioner’s Office (ICO) · Confidence 1.00

04 · A University of Cambridge study showed LFR was statistically significantly more likely to correctly identify black participants.
statistic · University of Cambridge academics · Confidence 0.90

05 · “If you’re black, the chances of being identified as being on a police watchlist are greater.”
quote · Dr Matt Bland, criminologist · Confidence 0.80
§ 04

Full report

3 min read · 527 words
Essex Police have paused the use of live facial recognition (LFR) technology after a study found cameras were significantly more likely to target black people than people of other ethnicities.

The move to suspend use of the AI-enabled systems was revealed by the Information Commissioner’s Office (ICO), which regulates the use of the technology deployed so far by at least 13 police forces in London, south and north Wales, Leicestershire, Northamptonshire, Hampshire, Bedfordshire, Suffolk, Greater Manchester, West Yorkshire, Surrey and Sussex.

The ICO said Essex Police had paused LFR deployments “after identifying potential accuracy and bias risks” and warned other forces to have mitigations in place. LFR systems are either mounted at fixed locations or deployed in vans. In January, the home secretary, Shabana Mahmood, announced the number of LFR vans would increase five-fold, with 50 available to every police force in England and Wales.

Essex commissioned University of Cambridge academics to conduct a study, which involved 188 actors walking past cameras being actively deployed from marked police vans in Chelmsford. The results were published last week and showed about half of the people on a watchlist were correctly identified and incorrect identifications were extremely rare, but the system was more likely to correctly identify men than women and was “statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.

[Photograph: Live facial recognition vans are being made available more widely to police forces across England and Wales. Andrew Matthews/PA]

This “raises questions about fairness that require continued monitoring”, the report concluded.
One of its authors, Dr Matt Bland, a criminologist, told the Guardian and Liberty Investigates: “If you’re an offender passing facial recognition cameras which are set up as they have been in Essex, the chances of being identified as being on a police watchlist are greater if you’re black. To me, that warrants further investigation.”

The problem differs from the more common public concern about the technology, which is that it identifies innocent people. Last month it emerged that police had arrested a man for a burglary in a city he had never visited, 100 miles away, after retrospective face-scanning software confused him with another person of south Asian heritage.

Possible reasons for the latest issue with LFR include overtraining of the algorithm on the faces of black people. Experts believe it could be rectified by adjusting system settings. A separate study of the same technology by the government’s National Physical Laboratory found black men were the most likely to be correctly matched by the system and white men the least likely, but the effect was not statistically significant.

The Home Office has said LFR cameras deployed in London from January 2024 to September 2025 led to more than 1,300 arrests of people wanted for crimes including rape, domestic abuse, burglary and grievous bodily harm. But opponents of facial recognition technology said the latest research showed warnings about bias in LFR were being borne out.

“Police across the country must take note of this fiasco,” said Jake Hurfurt, the head of research and investigations at Big Brother Watch. “AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.”

Essex Police were contacted for comment.
§ 05

Entities

12 identified
§ 06

Keywords & salience

8 terms
facial recognition · 1.00
racial bias · 0.90
essex police · 0.80
live facial recognition · 0.70
artificial intelligence · 0.60
police watchlist · 0.50
accuracy · 0.50
data privacy · 0.40
§ 07

Topic connections

[Interactive graph: network of 8 related topics · node types: Person, Organization, Location, Event · edge numbers = shared articles]