New Paper Urges Therapists to Screen Patients for AI Chatbot Use
The paper says routine screening could reveal when patients use chatbots to cope with anxiety, depression or relationship stress.
- A new JAMA Psychiatry paper recommends clinicians routinely ask patients about AI chatbot use for emotional support and health information. Shaddy Saba, an assistant professor at New York University Silver School of Social Work, co-authored the recommendation.
- Many Americans with mental health conditions now turn to AI chatbots like ChatGPT for advice, finding these tools 'at their fingertips and easy' to access when stressed or anxious.
- Tom Insel cautions that chatbots are 'the opposite of therapy' by being overly affirming, while psychologist Vaile Wright of the American Psychological Association warns users may avoid difficult conversations with partners instead of addressing issues directly.
- Discussing AI use in therapy provides clinicians a 'treasure trove of information,' Saba noted, revealing whether patients turn to chatbots to avoid confrontations or discuss relationship challenges and stressors.
- The World Health Organization is establishing a global consortium to support responsible AI adoption in health, ensuring that 'well being stays at the centre' as these digital tools evolve.
Insights by Ground AI
6 Articles
Towards responsible AI for mental health and well-being: experts chart a way forward
On 29 January 2026, WHO-supported experts met at a workshop at TU Delft to address the rapid, largely untested use of generative AI for mental health support. Participants warned of risks to well-being, especially for young people, and called for stronger governance.
Geneva, Switzerland
Total News Sources: 6
Bias Distribution: 100% of rated sources lean Left (Leaning Left: 1, Leaning Right: 0, Center: 0)