February 27, 2026
Instagram to alert parents if teens repeatedly search self-harm terms
Feature for supervised accounts rolls out as Meta platform faces US trials over alleged harms to children

TL;DR
- Instagram will notify parents if teens repeatedly search for terms linked to suicide or self-harm.
- Alerts are only sent to parents participating in Instagram's parental supervision program.
- Meta is also developing similar notifications for children's self-harm-related interactions with AI chatbots.
- The company is currently involved in two trials concerning alleged harms to minors on its platforms.
- Meta executives, including Mark Zuckerberg, dispute the scientific evidence linking social media to mental health harms.
- Critics argue that these tools place the burden on parents instead of fixing platform design issues.
- Parental supervision is available for teens aged 13-17 and requires agreement from both the teen and the parent.