NEWSAR
Multi-perspective news intelligence
SRC: The Guardian - World News
LANG: EN
LEAN: Center-Left
WORDS: 897
ENT: 10
FRI · 2026-04-10 · 06:00 GMT · BRIEF NSR-2026-0410-61436
NSR-2026-0410-61436 · News Report · EN · Legal & Judicial

‘Irresponsible failure’: Google, Meta, Snap and Microsoft slam EU over child sexual abuse law lapse

Katie McQue · The Guardian - World News · Filed 2026-04-10 · 06:00 GMT · Lean: Center-Left · Read: 4 min
FIG 01 · The Guardian - World News
Reading time: 4 min
Word count: 897 words
Sources cited: 4
Entities identified: 10
Quality score: 100%
§ 01

Briefing Summary

AI-generated
NEWSAR · AI

The European Parliament allowed a law permitting tech companies to scan for child sexual abuse material (CSAM) on their platforms to expire on April 3, creating a legal gap. This temporary measure, a carve-out of the EU Privacy Act established in 2021, allowed automated detection of CSAM, grooming, and sextortion. Tech giants like Google, Meta, Snap, and Microsoft criticized the decision, stating they would continue voluntary scanning despite the legal uncertainty. Child protection advocates warn that the lapse will likely decrease reports of child sexual abuse, citing a similar occurrence in 2021. The European Parliament stated it is prioritizing permanent legislation to combat online child sexual abuse, but offered no timeline for implementation.

Confidence 0.90 · Sources 4 · Claims 5 · Entities 10
§ 02

Article analysis

Model · rule-based
Framing: Legal & Judicial · Human Rights
Tone: Measured (AI-assessed; scale Calm / Neutral / Alarmist)
Factuality: 0.80 / 1.00 · Factual (scale Low to High)
Sources cited: 4 · Well sourced (scale Few to Many)
§ 03

Key claims

5 extracted

01 · Quote · Confidence 1.00
“When detection tools are disrupted, we lose visibility that directly impacts our ability to find and protect child sexual abuse victims.”
John Shehan, vice-president at NCMEC

02 · Statistic · Confidence 1.00
In 2025, NCMEC received 21.3m reports that included more than 61.8m images, videos and other files suspected of being related to child abuse.

03 · Statistic · Confidence 1.00
Reports of child sexual abuse material from EU-based accounts to NCMEC fell by 58% during a similar legal gap in 2021.

04 · Quote · Confidence 1.00
Google, Meta, Snap and Microsoft said they would continue to voluntarily scan their platforms for CSAM.
Google, Meta, Snap and Microsoft

05 · Factual · Confidence 1.00
The EU parliament blocked the extension of a law permitting big tech firms to scan for child sexual exploitation.
§ 04

Full report

4 min read · 897 words
The European Parliament has blocked the extension of a law that permits big tech firms to scan for child sexual exploitation on their platforms, creating a legal gap that child safety experts say will lead to crimes going undetected.

The law, which was a carve-out of the EU Privacy Act, was put in place in 2021 as a temporary measure allowing companies to use automated detection technologies to scan messages for harms, including child sexual abuse material (CSAM), grooming and sextortion. However, it expired on 3 April, and the EU parliament decided not to vote to extend it, amid privacy concerns from some lawmakers.

The regulatory gap has created uncertainty for big tech companies: while scanning for harms on their platforms is now illegal, they remain liable under a different law, the Digital Services Act, for removing any illegal content hosted on their platforms. Google, Meta, Snap and Microsoft said in a joint statement posted on a Google blog that they would continue to voluntarily scan their platforms for CSAM.

“We are disappointed by this irresponsible failure to reach an agreement to maintain established efforts to protect children online,” the statement said.

The European Parliament said in a statement that it was prioritizing its work on legislation to prevent and combat child sexual abuse online, and that negotiations on a permanent legal framework were ongoing, though the body offered no timeline for agreement or implementation.

Child protection advocates had warned that allowing the legislation to lapse would probably trigger a steep fall in reports of child sexual abuse.
They point to a similar legal gap that occurred in 2021, when reports of such material from EU-based accounts to the National Center for Missing and Exploited Children (NCMEC) fell by 58% over a period of 18 weeks.

“When detection tools are disrupted, we lose visibility that directly impacts our ability to find and protect child sexual abuse victims,” said John Shehan, vice-president at NCMEC, a US-based organisation that acts as a clearinghouse for child abuse reports, which it forwards to relevant law enforcement agencies around the world. “When detection goes dark, the abuse doesn’t stop.”

In 2025, NCMEC received 21.3m reports from around the world that included more than 61.8m images, videos and other files suspected of being related to child abuse. About 90% of these reports relate to countries outside the US.

A spokesperson for the EU parliament declined to comment on whether the legislative body had conducted any assessments to determine the consequences of the lapse of the law.

The EU’s decision to prohibit scanning will have ripple effects in other regions around the world, child safety experts said. Many internet crimes are cross-border, with perpetrators sending illegal images to people or targeting children in other countries.
“Sextortionists”, who pose as romantic interests to trick people into sending intimate photographs before making blackmail attempts, may also capitalize on the law change, Shehan said.

“The offender can be anywhere in the world, but they could have unfettered access to minors in Europe now that there’s legal uncertainty around those safeguards and protections to identify when a child is being groomed,” Shehan said.

Years of tense negotiations lead to lapse of vital carve-out law

For the past four years, the proposed child sexual abuse regulation has been under negotiation, with contention arising because it would obligate companies to take measures to minimise risks on their platforms, said Hannah Swirsky, head of policy and public affairs at the Internet Watch Foundation, a UK-based child safety non-profit.

Privacy advocates argue that big tech scanning messages for child abuse threatens fundamental privacy rights and data security for EU citizens, equating these measures to “chat control” that could lead to mass surveillance and false positives.

“There are claims of surveillance or infringement of privacy,” Swirsky said. “Blocking CSAM is not an invasion of privacy. Free speech does not include sexual abuse of children.”

The scanning technology uses machine learning that performs pattern detection to identify known images or videos of abuse, as well as language associated with child exploitation, and does not store any data, said Emily Slifer, director of policy at Thorn, a non-profit that builds technology to detect online child abuse, which is commonly used by companies and law enforcement.

The system works by having trained analysts review known CSAM obtained from external sources, such as reports from police, the public or investigations into websites known for hosting child abuse material. When analysts confirm that content is illegal child sexual abuse, they generate a unique digital fingerprint – known as a hash value – that identifies that exact image.
Lists of hash values are then shared with platforms, which use automated systems to scan uploads and block matching content instantly, without the need for a human to view it.

“The technology doesn’t find babies in bathtubs and things like that. If you just think of what an image of abuse would look like versus what consensual content would look like: those are two very different pieces of material, and technology can determine those patterns between them,” Slifer said.

While the EU has blocked scanning for child abuse, it has allowed tech companies to voluntarily scan messages for the detection of terrorist content under legislation adopted in 2021, she said.

“The EU is effectively risking open doors for predators,” Swirsky said. “If the EU is serious about protecting children online, then it needs to agree on a permanent legislative framework for safeguarding children and for enabling detection.”
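The hash-list workflow described above (analysts confirm content, generate fingerprints, platforms match uploads against the shared list) can be sketched in a few lines. This is a hypothetical illustration only, not any vendor's implementation: production systems use perceptual hashes such as PhotoDNA, which tolerate re-encoding and resizing, whereas the cryptographic hash below matches only byte-identical files and is used solely to keep the sketch self-contained.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Compute a digital fingerprint (hash value) for a piece of content.

    Sketch only: real deployments use perceptual hashing, not SHA-256.
    """
    return hashlib.sha256(data).hexdigest()

def build_hash_list(confirmed_items: list[bytes]) -> set[str]:
    """Hash content that trained analysts have confirmed as illegal.

    The resulting hash list is what gets shared with platforms.
    """
    return {fingerprint(item) for item in confirmed_items}

def should_block(upload: bytes, hash_list: set[str]) -> bool:
    """Match an upload against the shared hash list.

    A match is blocked automatically, without a human viewing the file.
    """
    return fingerprint(upload) in hash_list
```

Because only fingerprints (not the content itself) are distributed and compared, the matching step stores no user data, consistent with the description above.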
§ 05

Entities

10 identified
§ 06

Keywords & salience

9 terms

child sexual abuse · 1.00
csam · 0.90
online safety · 0.80
scanning for harms · 0.70
eu law · 0.70
legal gap · 0.70
privacy concerns · 0.60
ncmec · 0.60
digital services act · 0.50
§ 07

Topic connections

Interactive graph: network visualization showing 51 related topics (node types: Person, Organization, Location, Event; edge numbers = shared articles).