
February 27, 2026

How Meta Executives Talked About Child Safety Behind the Scenes

For years, employees flagged the problem of potential child groomers, but the company prioritized growth over fixes.

TL;DR

  • Internal documents suggest Meta prioritized user growth and engagement over child safety for years.
  • Employees documented severe risks to minors on platforms like Instagram, classified internally as 'Inappropriate Interactions with Children' (IIC T1), including sextortion and sex trafficking.
  • The company delayed implementing restrictive safety features, such as making new teen accounts private by default, due to concerns about business impact.
  • Internal tests showed Meta's recommendation algorithm could funnel children to potential groomers.
  • Meta disputes that it prioritized profit over safety, stating that it investigated and responded to threats as they emerged.
  • The documents were disclosed as part of a lawsuit against Meta in New Mexico.
  • Recent audits show continued issues with adult-to-teen messaging and recommendations, even after some safety measures were implemented.
  • Meta has since rolled out additional safety features, including 'Teen Accounts' and content filtering, but some experts and legal figures remain critical.
  • The company claims that the version of Instagram described in the lawsuit is substantially different from the current product.
