“It Happened to Be the Perfect Thing”: Experiences of Generative AI Chatbots for Mental Health
- On June 23, 2025, New Jersey legislators passed bill A5603, prohibiting the advertising of AI as a licensed mental health professional in therapy.
- The bill responds to concerns that AI chatbots affirm users' existing beliefs, including misinformation, a tendency that can worsen psychosis and makes the tools inappropriate for vulnerable patients.
- Popular chatbots like Wysa and Youper provide accessible mental health support but do not claim to replace human therapists, and trials show symptom improvements alongside mixed user preferences.
- Nearly half of American adults with mental illness lack treatment; 48.7% of surveyed users had accessed LLMs for mental health support and 37.8% preferred them to traditional therapy, yet 9% reported harm.
- Experts and organizations emphasize AI's promise and risks, urging responsible use, regulation, transparency, and combining AI with human care to address mental health shortages safely.
17 Articles
AI chatbots are leading some to psychosis
As AI chatbots like OpenAI's ChatGPT become more mainstream, a troubling phenomenon has accompanied their rise: chatbot psychosis. The chatbots have been found to push inaccurate information, such as affirming conspiracy theories or, in one extreme case, convincing someone they were the next religious messiah. There have been several instances of people developing severe obsessions and mental health problems as a result of talking to chatbo…
“It happened to be the perfect thing”: experiences of generative AI chatbots for mental health
The global mental health crisis underscores the need for accessible, effective interventions. Chatbots based on generative artificial intelligence (AI), like ChatGPT, are emerging as novel solutions, but research on real-life usage is limited. We interviewed nineteen individuals about their experiences using generative AI chatbots for mental health. Participants reported high engagement and positive impacts, including better relationships and he…
The rise of artificial intelligence (AI) has generated interesting discussions about its role in various sectors, including some we did not anticipate, such as mental health. Although AI offers new applications such as therapeutic substitutes and support apps, clinical psychologists and researchers raise serious concerns about the ability of these tools to replicate… The post In times of AI, why do therapy with a human being?…
Coverage Details
Bias Distribution
- 60% of the sources are Center