TLDR
- Instagram will alert parents when teens repeatedly search for suicide or self-harm terms in a short time period
- Alerts will roll out next week in the US, UK, Australia, and Canada, with Ireland and other regions later this year
- Parents will be notified via email, text, WhatsApp, or in-app notification
- Meta says it consulted experts to set the alert threshold and will continue refining it
- Meta [META] also plans to build similar alerts for teens’ AI conversations later this year
Instagram is rolling out a new parental alert feature for teen accounts, notifying parents when their child repeatedly searches for suicide or self-harm terms on the platform.
Starting next week, parents will get an alert if their teen repeatedly searches for certain terms related to self-harm or suicide in a short time span. https://t.co/iw8WrBBscO
— CBS News (@CBSNews) February 26, 2026
The feature is part of Instagram’s parental supervision tools. It will begin in the US, UK, Australia, and Canada next week.
Parents will receive alerts by email, text, WhatsApp, or through a notification inside the app. Tapping the alert opens a full-screen message explaining what was searched.
The alerts are triggered when a teen searches multiple times in a short period for phrases linked to suicide or self-harm. Instagram said it worked with its Suicide and Self-Harm Advisory Group to set the threshold.
Meta said it wants to avoid sending so many alerts that parents start to tune them out, which would make the feature less useful over time. The company said it will keep listening to feedback and adjust the threshold as needed.
Instagram already blocks searches for suicide and self-harm content. When a teen tries to search these terms, the platform redirects them to helplines and support resources instead.
The platform said the vast majority of teens do not search for this type of content on Instagram. It also hides related content from teen accounts, even if it comes from accounts they follow.
Meta Faces Legal Pressure on Teen Safety
The announcement comes as Meta faces two ongoing trials over child safety on its platforms. Experts have compared these cases to the tobacco industry’s legal battles, arguing social media companies misled the public about harm to young users.
Other platforms including YouTube, TikTok, and Snap face similar legal challenges. The cases focus on whether these platforms’ designs have caused harm to the mental health of young people.
AI Notifications Also Planned
Meta said it is also developing parental alerts for teens’ conversations with AI tools. No firm release date has been given, but the company expects the feature to arrive later this year.
Instagram said Thursday’s announcement is the latest addition to its Teen Accounts and parental supervision features. The feature will expand to Ireland and other countries later this year.
Meta trades as META on the Nasdaq. The company has not commented on the financial impact of the ongoing trials.