Protecting Teens on Instagram: New Tools to Monitor Self-Harm Content

Instagram's new features will alert parents if their teen searches for content related to self-harm or suicide, allowing them to provide support. But critics argue Meta is shifting responsibility.
In a move aimed at bolstering teen safety on its platform, Instagram has announced new features that will alert parents if their children search for content related to self-harm or suicide. The initiative, part of Meta's broader efforts to address online safety concerns, is designed to empower parents to provide support for their struggling adolescents.
The new tools will notify parents when their teenage children search for specific terms or hashtags associated with self-harm or suicidal ideation. The aim is to allow parents to have open conversations with their kids and connect them with mental health resources if needed.
However, the announcement has drawn a mixed reaction from online safety advocates. While some applaud the effort to address the mental health challenges faced by young users, others have accused Meta, Instagram's parent company, of shifting responsibility for teen safety onto parents.
Source: BBC News