Meta told to pay $375m for misleading users over child safety
A New Mexico court has ordered Meta to pay $375 million for misleading users about the safety of its platforms, including Facebook and Instagram, for children. The jury found Meta liable for endangering children by exposing them to sexually explicit material and contact with sexual predators, violating the state's Unfair Practices Act.

The verdict marks the first time a state has successfully sued Meta over child safety issues. The trial included internal Meta documents and testimony from a former employee who revealed the company's awareness of child predators on its platforms and the prevalence of sexualized content served to underage users. Meta disagrees with the verdict and intends to appeal, citing its efforts to protect teens online.
Key claims
New Mexico sued Meta in 2023, claiming the company "steered" young users to sexually explicit content.
Meta disagrees with the verdict and intends to appeal.
The jury found that Meta was responsible for violating New Mexico's Unfair Practices Act.
A court in New Mexico has ordered Meta to pay $375m for misleading users over the safety of its platforms for children.
Internal Meta research found 16% of all Instagram users had reported being shown unwanted nudity or sexual activity in a single week.