‘Urgent clarity’ sought over racial bias in UK police facial recognition technology

AI Summary
The UK's Information Commissioner's Office (ICO) is seeking urgent clarification from the Home Office over racial bias in police facial recognition technology. A National Physical Laboratory (NPL) report found that the technology used with the police national database incorrectly matches Black and Asian individuals more often than white individuals. Despite ongoing engagement with the Home Office and police, the ICO had not previously been informed of this bias. The ICO's next steps could include enforcement action, such as fines or a legally binding order to stop use of the technology. Police and crime commissioners have also expressed concern about the bias and urged caution over plans for a national rollout.