Stanford Study on AI Therapy Chatbots Warns of Risks, Bias
UNITED KINGDOM, JUL 14 – A UK report finds 71% of vulnerable children use AI chatbots for emotional support and schoolwork, raising concerns over safety, misinformation, and emotional dependency.
- A recent Stanford University study, set to be presented at a major conference focused on ethical AI and transparency, highlights the potential dangers of relying on large language model chatbots as replacements for professional therapy.
- This concern arises amid growing chatbot use by children aged 9-17, with 64% having used them and 42% relying on them for schoolwork or advice on sensitive topics.
- The study found these chatbots can express stigma toward certain disorders and fail to appropriately respond in high-risk mental health scenarios, enabling dangerous behavior.
- Survey data indicates that 40% of adolescents have no reservations about following chatbot advice, and 71% of vulnerable children use chatbots, with half of these users describing the experience as similar to conversing with a friend.
- The findings suggest cautious integration of chatbots in therapy with human oversight, plus urgent calls for improved safeguards and critical evaluation of their role in supporting youth mental health.
15 Articles
AI therapy chatbots are unsafe and stigmatizing, a new Stanford study finds
AI chatbot therapists have made plenty of headlines in recent months—some positive, some not so much. A new paper from researchers at Stanford University evaluated five chatbots designed to offer accessible therapy, using criteria based on what makes a good human therapist. Nick Haber, an assistant professor at Stanford’s Graduate School of Education and a senior author of the study, told the Stanford Report that the study found “significant risk…
Loneliness and suicide mitigation for students using GPT3-enabled chatbots
Mental health is a crisis for learners globally, and digital support is increasingly seen as a critical resource. Concurrently, Intelligent Social Agents receive exponentially more engagement than other conversational systems, but their use in digital therapy provision is nascent. A survey of 1006 student users of the Intelligent Social Agent, Replika, investigated participants’ loneliness, perceived social support, use patterns, and beliefs abo…
Study: More children turning to chatbots for friendship, therapy and more
A new report found a growing number of children using AI chatbots for help with things like homework, therapy and friendship. The report from Internet Matters warns that adolescents are using chatbots not designed for them. Chatbot usage increases For their report titled “Me, myself, and AI: Understanding and safeguarding children’s use of AI chatbots,” Internet Matters surveyed 1,000 adolescents and 2,000 parents across the United Kingdom. The …
Vulnerable young people use artificial intelligence applications more than their peers.
Coverage Details
Bias Distribution
- 80% of the sources are Center