Police AI chief admits crime-fighting tech will have bias but vows to tackle it

A UK police chief has acknowledged that artificial intelligence (AI) used in crime fighting will likely contain bias, despite efforts to minimize it. Alex Murray, the national police lead for AI, said a new national police AI centre will address the risks of bias in AI systems such as facial recognition and predictive policing. These biases, which often stem from historical data, could lead to unfair targeting of minority communities. While Labour advocates expanded use of AI in policing, Murray emphasized that data scientists and engineers must clean data, train models appropriately, and test for bias before deployment. The Association of Police and Crime Commissioners has also called for independent oversight of AI tools, citing past system failures and biases found in retrospective facial recognition technology.