Instagram announced Thursday it will begin notifying parents when their teens repeatedly search for terms strongly linked to suicide or self-harm — but only if the parents are enrolled in the platform’s voluntary parental supervision tools.
The alerts will arrive via email, text, WhatsApp (based on available contact info), or in-app notification, directing parents to resources and encouraging intervention. Instagram already blocks such content from teen search results and redirects users to helplines, Meta said in a blog post.
“Our goal is to empower parents to step in if their teen’s searches suggest they may need support,” the company stated. “We also want to avoid sending these notifications unnecessarily, which, if done too much, could make the notifications less useful overall.”
Meta is also developing similar alerts for teen interactions with its AI features involving suicide or self-harm topics, with more details expected in coming months.
The move arrives as Meta faces two concurrent federal trials over alleged harms to minors. In Los Angeles, plaintiffs argue Instagram and Facebook were deliberately designed to addict children. In New Mexico, the state claims Meta failed to prevent sexual exploitation of minors. Thousands of families, school districts, and governments have sued Meta and other social media companies, alleging the platforms contribute to depression, eating disorders, and suicide by prioritizing engagement over safety.
Meta executives, including CEO Mark Zuckerberg, have pushed back. During LA trial questioning, Zuckerberg reiterated that scientific evidence has not proven social media causes mental health harms.
Josh Golin, executive director of child advocacy nonprofit Fairplay, criticized the announcement as reactive: “Instagram is clearly making this move now because the company is currently on trial in two different states for addicting and harming kids. Once again, Meta is shifting the burden to parents rather than fixing the dangerous flaws in how it designs its algorithms and platforms. And all children deserve to be protected, regardless of whether their parents have enrolled in and utilize Meta’s supervision tools. If a product is not safe for teens to use without parental intervention, it shouldn’t be marketed to teens at all.”
Meta maintains its tools help families while respecting teen privacy and autonomy.