Anthropic Says Claude Helps Emotionally Support Users - We're Not Convinced

Summary by ZDNet
While Anthropic found that Claude doesn't reinforce negative outcomes in affective conversations, some researchers question the findings.

Exclusive: How Claude became an emotional support bot

Washington, United States

Anthropic has revealed surprising statistics on the emotional use of its assistant Claude. Only 2.9 percent of conversations involve users turning to the artificial intelligence for emotional or personal support, according to an analysis of 4.5 million conversations. Nevertheless, this small minority conceals a gradual transformation of human-machine interaction: chatbots like ChatGPT, Gemini, and Claude are evolving towards …

Despite increased attention to the topic of "friendship" with AI and emotional attachment to chatbots, such scenarios are in reality extremely rare. This was reported by RBC-Ukraine, citing a new study by Anthropic, the developer of the AI bot Claude. A popular myth about friendship with AI has been debunked: according to the report, only 2.9% of all dialogues with Claude relate to emotional support or personal advice, while ro…


Bias Distribution

  • 67% of the sources are Center

Axios broke the news in Washington, United States on Thursday, June 26, 2025.
