AI tools make potentially harmful errors in social work records, research says

The Guardian - World News · by Robert Booth, UK technology editor · February 11, 2026

AI Summary


A recent study by the Ada Lovelace Institute found that AI tools used for social work record-keeping are generating potentially harmful errors. The research, conducted across 17 English and Scottish councils, found instances of AI "hallucinations" in transcriptions, including false reports of suicidal ideation and nonsensical content. These inaccuracies, occurring in systems such as Magic Notes and Microsoft Copilot, raise concerns about misrepresenting service users' experiences and missing crucial behavioural patterns. While AI transcription offers time savings and can improve the relational side of care work, social workers have also reported problems transcribing regional accents and outputs described as "gibberish". Dozens of councils are using these tools to address staff shortages, but the errors highlight the need for careful oversight and validation of AI-generated records.
