Character.AI to block romantic AI chats for minors a year after teen's suicide

Character.AI will prohibit under-18s from open-ended AI chats by Nov. 25, 2025, shifting its focus to safer creative tools after safety concerns and teen suicides, the company's CEO said.

  • Character.AI will ban minors from engaging in open-ended conversations with its chatbots by November 25, responding to increasing pressure from lawmakers and lawsuits alleging harm to children from its characters.
  • The company plans to strengthen its age verification, incorporating software from identity-verification startup Persona to distinguish between adult and under-18 experiences.
  • The Federal Trade Commission is investigating several AI companies, including Character.AI, for the potential risks their chatbots pose to children.
  • Until the ban starts, Character.AI will restrict minors to two hours of chat time per day, gradually reducing it over the weeks before the prohibition takes effect.

89 Articles

Bias Distribution

  • 59% of the sources are Center

arcamax.com broke the news on Wednesday, October 29, 2025.