Essex police pause facial recognition camera use after study finds racial bias
Essex police have suspended their use of live facial recognition (LFR) technology after a study revealed racial bias in its accuracy. The study, commissioned by Essex police and conducted by University of Cambridge academics, found the system was more likely to correctly identify Black individuals compared to other ethnic groups, raising fairness concerns.

The Information Commissioner's Office (ICO) revealed the pause and cautioned other police forces using LFR to put mitigation strategies in place. The study involved actors walking past LFR cameras in Chelmsford, and found the system was more accurate at identifying Black individuals on a watchlist. The findings come as the Home Office plans to increase the availability of LFR vans to police forces across England and Wales.
Key claims
The Home Office said LFR cameras led to more than 1,300 arrests between January 2024 and September 2025.
In January, the home secretary announced the number of LFR vans would increase five-fold.
Essex police paused live facial recognition (LFR) use after a study found racial bias.
A University of Cambridge study showed LFR was statistically significantly more likely to correctly identify Black participants.
If you are Black, the chances of being identified as being on a police watchlist are greater.