NEWSAR
Multi-perspective news intelligence

Source · The Guardian - World News
Language · EN
Lean · Center-Left
Words · 761
Entities · 5
Fri · 2026-01-30 · 07:00 GMT · Brief NSR-2026-0130-11866
News Report · EN · Human Rights

Abusers using AI and digital tech to attack and control women, charity warns


Rachel Hall · The Guardian - World News · Filed 2026-01-30 · 07:00 GMT · Lean: Center-Left · Read: 4 min
FIG 01 · Credit: The Guardian - World News
Reading time · 4 min
Word count · 761 words
Sources cited · 2
Entities identified · 5
Quality score · 100%
§ 01

Briefing Summary

AI-generated · NEWSAR

A recent report from the domestic abuse charity Refuge reveals a surge in technology-facilitated abuse against women. In the last three months of 2025, Refuge saw a 62% increase in complex cases, totalling 829 women, and a 24% increase in referrals of women under 30. Abusers are exploiting AI, smartwatches, and smart home devices to track, stalk, and control victims. Refuge emphasizes that current technology design and regulations fail to adequately protect women, making it easy for abusers to weaponize these tools. The charity urges that safety be made a foundational principle of technology development and regulation to prevent further harm.

Confidence 0.90 · Sources 2 · Claims 5 · Entities 5
§ 02

Article analysis

Model · rule-based
Framing · Human Rights, Technology
Tone · Mixed (AI-assessed)
Factuality · 0.70 / 1.00
Sources cited · 2 (limited)
§ 03

Key claims

5 extracted
01

One survivor's abuser used her smartwatch to track her by using linked cloud accounts to locate her emergency accommodation.

factual · Mina
Confidence · 1.00
02

Referrals to Refuge’s specialist services increased, including a 62% rise in the most complex cases, totalling 829 women in the last three months of 2025.

statistic · Refuge
Confidence · 1.00
03

It is currently far too easy for perpetrators to access and weaponise smart accessories.

quote · Emma Pickering, Refuge
Confidence · 0.90
04

Domestic abusers are increasingly using AI, smartwatches and other technology to attack and control their victims.

factual · Refuge
Confidence · 0.90
05

Perpetrators are using wearable tech such as smartwatches, Oura rings and Fitbits to track and stalk women.

factual
Confidence · 0.80
§ 04

Full report

4 min read · 761 words
Domestic abusers are increasingly using AI, smartwatches and other technology to attack and control their victims, a domestic abuse charity says.

Record numbers of women who were abused and controlled through technology were referred to Refuge’s specialist services during the last three months of 2025, including a 62% increase in the most complex cases, totalling 829 women. There was also a 24% increase in referrals of under-30s.

Recent cases included perpetrators using wearable tech such as smartwatches, Oura rings and Fitbits to track and stalk women, disrupting their lives through smart home devices that control lights and heating, and using AI spoofing apps to impersonate people.

Emma Pickering, head of the tech-facilitated abuse team at Refuge, said: “Time and again, we see what happens when devices go to market without proper consideration of how they might be used to harm women and girls. It is currently far too easy for perpetrators to access and weaponise smart accessories, and our frontline teams are seeing the devastating consequences of this abuse.

“It is unacceptable for the safety and wellbeing of women and girls to be treated as an afterthought once a technology has been developed and distributed. Their safety must be a foundational principle shaping both the design of wearable technology and the regulatory frameworks that surround it.”

Refuge said it was far too easy to access and weaponise smart accessories and that women’s safety needed to be factored into their design.

One survivor Refuge worked with, Mina, left behind her smartwatch in a rush to flee her abuser, who then used its linked cloud accounts to locate her emergency accommodation.

“[It] was deeply shocking and frightening. I felt suddenly exposed and unsafe, knowing that my location was being tracked without my consent. It created a constant sense of paranoia; I couldn’t relax, sleep properly, or feel settled anywhere because I knew my movements weren’t private,” she said.

Despite the police returning the device to Mina, she was located at her next refuge by a private investigator hired by her abuser, who is suspected of tracking her via technology. She reported the breaches to the police but was told no crime had been committed because she had “not come to any harm”.

“I was repeatedly asked to move for my safety, rather than the technology being dealt with directly or the smart watch being confiscated from him. Each move made me feel more unstable and displaced,” she said.

“Overall, the experience left me feeling unsafe, unheard, and responsible for managing a situation that was completely out of my control. It showed me how tech abuse can quietly and powerfully extend coercive control, and how easily survivors can be left to carry the emotional and practical burden when systems don’t fully understand or respond to it.”

Abusers were also increasingly using AI tools to manipulate survivors, Pickering said. For example, they might alter a video of the survivor so that she appeared drunk, enabling them to tell social services that “she’s acting erratic again, slurring speech, she’s got a drink problem” and that she was therefore an unfit mother or a risk to herself and others.

“We’ll see more and more of that as these videos and applications advance,” Pickering said.

Pickering said she had also heard of AI tools being used to create authentic-looking fraudulent documents, for example job offers or legal summonses, which can be sent to survivors to make them believe they are in debt, or to lure them to the same location as their abuser.

Pickering feared that in the coming years medical tech would increasingly be misused, for example by controlling insulin levels through a diabetes tracker, which can be fatal.

She urged the government to act on digital technology-enabled and online crimes, including by providing more funding to develop and train digital investigations teams. “They want short-term wins, they don’t want to think about longer-term investment in this area, but if we don’t do that we’ll never get ahead,” she said.

She also wants to see the technology industry held to account for failing to ensure devices and platforms are designed and function in ways that are safe for vulnerable people.

“Ofcom and the Online Safety Act don’t go far enough,” she said.

A government spokesperson said: “Tackling violence against women and girls in all its forms, including when it takes place online or is facilitated by technology, is a top priority for this government.

“Our new VAWG strategy sets out how the full power of the state will be deployed online and offline. We are working with Ofcom to set out how online platforms tackle the disproportionate abuse women and girls face online.”
§ 05

Entities

5 identified
§ 06

Keywords & salience

10 terms
domestic abuse · 1.00
technology-facilitated abuse · 0.90
women's safety · 0.80
smartwatches · 0.80
wearable tech · 0.70
artificial intelligence · 0.70
Refuge · 0.70
digital technology · 0.60
stalking · 0.60
privacy · 0.50
§ 07

Topic connections

[Interactive graph: network visualization showing 3 related topics. Node types: Person, Organization, Location, Event · Edge numbers = shared articles]