On Thursday, Meta revealed that Instagram will alert parents when a child repeatedly searches for content related to self-harm and suicide on the platform.

Notifications will be based on the timing of the searches and will be sent to parents only when repeated searches occur within a short interval. The company also stated that it is creating a comparable tool to alert parents when teens discuss self-harm in conversations with AI.

The alerts will be sent exclusively to parents enrolled in Instagram’s parental supervision tool across the United States, United Kingdom, Australia and Canada. According to Meta, the system will expand to additional countries later this year. 

The step is being taken amid increasing pressure on Meta regarding the impact of its platforms on young people. For years, parents alleging that social media use led their children to take their own lives have urged the United States Congress to impose tougher oversight on the company, gaining greater visibility as legislators push forward the Kids Online Safety Act.

The company is currently defending itself in two trials underway in New Mexico and California, facing claims that it hooks teens on social media, heightens anxiety, and provides an environment conducive to sexual predators. 

During the 18 February proceedings in California, Mark Zuckerberg faced rigorous questioning from the plaintiffs’ attorneys, who alleged that Instagram was intentionally created to foster addiction. 

‘Erring on the side of caution’ 

According to the blog post, searches that may trigger an alert include not only explicit terms for suicide or self-harm but also phrases that promote such content.

Alerts will be sent via WhatsApp, email or SMS, and when parents open them, they will receive advice on discussing self-harm and suicide with their child. 

Meta said it aims to exercise caution and will not generate alerts indiscriminately, as too many notifications might make them “less useful overall.” It explained that decisions on when to activate alerts were informed by an analysis of search activity on Instagram and recommendations from its Suicide and Self-Harm Advisory Group.