AI company, Google settle lawsuit over Florida teen's suicide linked to Character.AI chatbot
January 7, 2026 / 6:45 PM EST / CBS News
TL;DR
- A Florida family settled a wrongful death lawsuit against Character.AI, Google, and others.
- The lawsuit stemmed from the suicide of the family's 14-year-old son in February 2024.
- The son reportedly had a monthslong virtual emotional and sexual relationship with an AI chatbot named "Dany".
- The mother testified that the AI platform had no mechanisms to protect teens or notify adults about excessive chatbot interaction.
- The chatbot was programmed for sexual roleplay, presented itself as a romantic partner, and falsely claimed to be a licensed psychotherapist.
- Character.AI announced new safety features for teens in December 2024, after similar lawsuits were filed.
- Character.AI requires users to be 13 or older to create an account.