
December 30, 2025

When AI Chatbots Become Accomplices



TL;DR

  • Natalie Rupnow, a 15-year-old, carried out a school shooting in Madison, Wisconsin, in December 2024.
  • Rupnow had a digital presence involving white supremacist content, violent gore, and extremist material.
  • She also used Character.AI, an app allowing users to interact with AI chatbots.
  • Rupnow's profile picture on Character.AI featured Dylann Roof, the white supremacist who carried out the 2015 Charleston church shooting.
  • Experts are concerned that AI companions could escalate existing extremist ideologies and behaviors.
  • Online communities known as the 'True Crime Community' (TCC) romanticize and identify with mass shooters.
  • Character.AI has faced lawsuits and has since implemented new safety measures, including age verification and content moderation for users under 18.
  • Other AI companion apps, such as Polybuzz, host chatbots modeled on school shooters, some with extensive message histories.
  • AI companions can allow users to rewrite perpetrators' narratives, potentially leading to parasocial relationships and distorted perceptions.
  • Concerns exist that AI chatbots could be used for radicalization, potentially by terrorist groups or even state actors, though evidence is currently limited.
  • Real-world examples show individuals with severe mental health issues interacting with AI companions in ways that may have influenced their actions.

