OpenAI Says Over 1 Million Users Discuss Suicide on ChatGPT Weekly
- In a Monday blog post, OpenAI said GPT-5 improved ChatGPT's handling of conversations about self-harm and suicide, reducing undesired responses by about 65%.
- OpenAI is facing a lawsuit from the parents of Adam Raine, a 16-year-old who died by suicide earlier this year amid heavy ChatGPT use and reports of AI psychosis.
- With roughly 800 million weekly active users, OpenAI estimates that 0.15% of users (about 1.2 million people) discuss suicide on ChatGPT each week, sending more than 400,000 related messages, while another 0.07% show signs of psychosis.
- OpenAI has added parental controls, expanded crisis hotlines, automatic rerouting to safer models, and trained GPT-5 with input from over 170 mental-health experts to better guide users toward professional help.
- Independent clinicians found a 52% drop in problematic answers with GPT-5 versus GPT-4o, but Wired questioned OpenAI's internal benchmarks and noted past rollbacks prioritizing user preference.
108 Articles
‘AI psychosis’ discussions ignore a bigger problem with chatbots
In 2021, I was a University of California, Berkeley Ph.D. candidate lecturing on my research about how users turn to chatbots for help coping with suicidal ideation. I wasn’t prepared for my students’ response. I argued that choosing to talk to a chatbot about thoughts of suicide isn’t “crazy” or unusual. This, I explained, didn’t necessarily mean chatbots offer safe or optimal support, but instead highlights a stark reality: We live in a world …
The California artificial intelligence company OpenAI estimates that over one million ChatGPT users each week have "conversations that include explicit indicators of potential suicide planning or intent".
OpenAI finds hundreds of thousands of users in manic or psychotic crisis each week: 'Have conversations that include explicit indicators'
Editor's note: This story discusses serious mental health crises that may include suicide. If you or someone you know is struggling with thoughts of self-harm, help is available. Call 988 to reach the Suicide and Crisis Lifeline, formerly known as the National Suicide Prevention Lifeline, and you can also reach the Crisis Text Line by texting HOME to 741741. For additional resources, such as online chat lines and help for more specific situation…
The Californian artificial intelligence (AI) company estimates that about 0.15% of ChatGPT users have "conversations that include explicit indicators of potential suicide planning or intent", a reminder that suicide remains one of the most common causes of death.
Coverage Details
Bias Distribution
- 48% of the sources are Center