Meta Platforms to Alert Parents Over Teens’ Self-Harm Searches on Instagram

Instagram has introduced a new policy: if teens repeatedly search for terms related to suicide or self-harm, their parents will receive automatic alerts.

Fri, 27 Feb 2026 12:59 PM (IST)

Meta, the parent company of Instagram, says that if a teen repeatedly searches for suicide or self-harm-related content, their parents will automatically receive alerts. This is the first time Meta will directly notify parents about what a teen is searching for. The Molly Rose Foundation, a suicide prevention organization, calls it a dangerous and hasty decision that could make the problem worse.

According to Meta, parents and teens in the UK, US, Australia, and Canada who use Instagram's teen accounts will begin receiving these alerts starting next week, with a rollout to other parts of the world to follow. The feature applies to families using Instagram's child supervision tools. According to the BBC, the company says that if a teen repeatedly searches for terms related to suicide or self-harm within a short period of time, the system will flag the pattern as unusual behavior and send an alert to parents via email, text message, WhatsApp, or the Instagram app. Until now, Instagram has blocked such searches and directed users to external support resources; this is the first time the company will proactively notify parents based on an analysis of search patterns.

Meta says the Molly Rose Foundation is wrong about its intentions and is misrepresenting what the company is doing. The alerts to parents will also include expert resources to help them discuss difficult subjects with their children. Meta added that in the coming months it is considering extending these alerts to cases where teens discuss suicide or self-harm with Instagram's AI chatbots, arguing that because children are increasingly turning to AI for support, monitoring is necessary there as well.


Muskan Kumawat Journalist & Writer