Instagram to alert parents if teens repeatedly search self-harm terms

AI Summary
Instagram will now alert parents enrolled in its supervision program if their teenage children repeatedly search for terms related to self-harm or suicide. The announcement from Meta, Instagram's parent company, comes amid ongoing trials examining the platform's potential harm to children, including claims that Meta's platforms are deliberately addictive and fail to protect minors from harmful content. While Instagram already blocks such content from search results and directs users to helplines, the new alerts aim to give parents an opportunity to intervene when concerning search patterns emerge. Alerts will be sent via email, text message, WhatsApp, or Instagram notification, depending on the contact information the parent has provided. Parental supervision requires mutual consent from both the teen (ages 13-17) and the parent.