Instagram, owned by Meta, will start notifying parents if their children repeatedly search for terms clearly related to suicide or self-harm, the company announced.
The move comes as Meta faces two ongoing trials regarding harm to minors. In Los Angeles, the court is examining whether Meta’s platforms deliberately addict and harm children. A separate trial in New Mexico is investigating whether the company failed to protect kids from sexual exploitation on its platforms.
The alerts will only go to parents enrolled in Instagram’s parental supervision program. The platform already blocks such content from appearing in teen accounts and directs users to helplines instead.
Thousands of families, along with school districts and government entities, have sued Meta and other social media companies, claiming the platforms are deliberately addictive and fail to protect children from content that can contribute to depression, eating disorders, or suicidal behavior.
Meta CEO Mark Zuckerberg has denied that the company’s platforms cause addiction. During questioning at the Los Angeles trial, he said current scientific evidence does not prove that social media causes mental health harm.
Instagram head Adam Mosseri also pushed back on the concept of social media addiction, describing high usage among teens as “problematic use” similar to “watching TV longer than you feel good about,” rather than a clinical addiction.
The new alerts will be sent via email, text, WhatsApp, and Instagram notifications, depending on the parent’s available contact information. Setting up parental supervision requires consent from both the teen (ages 13–17) and the parent, and only one parent may supervise each teen account.
“Our goal is to empower parents to step in if their teen’s searches suggest they may need support. We also want to avoid over-notifying, which could reduce the effectiveness of these alerts,” Meta said.
The company is also developing similar notifications to alert parents if teens engage with its artificial intelligence in conversations related to suicide or self-harm, with more updates expected in the coming months.