March 7, 2026
When to talk to AI chatbots about mental health—and when to stay far away, professionals say
Some Americans are using AI chatbots for therapy. Mental health experts share when it is, and isn't, safe to use those tools for emotional support.

TL;DR
- More people are using AI chatbots for emotional support and companionship.
- Many generative AI users rely on these tools for personal matters, including advice and emotional support.
- AI chatbots may not always recognize or appropriately respond to users in mental health crises.
- Companies are working with experts to improve how AI responds in sensitive conversations.
- Frequent AI use might erode real-life social skills and is correlated with increased loneliness.
- AI chatbots should not replace professional therapy or mental health support.
- Chatbots can be useful for learning about mental health, generating prompts, and finding research on coping strategies.
- AI chatbots should not be used for diagnosis or support during a mental health crisis, such as suicidal ideation.
- Users should cross-check information from chatbots against reputable sources and verify it with their healthcare providers.
- Personal identifying information and medical records should not be shared with chatbots due to lack of confidentiality.
- Human interaction offers non-verbal cues and emotional reciprocity that chatbots cannot provide.