Character.AI to block romantic AI chats for minors a year after teen's suicide
Character.AI will prohibit users under 18 from open-ended AI chats by Nov 25, 2025, shifting its focus to safer creative tools following safety concerns and lawsuits over teen suicides, the company's CEO said.
- Character.AI will block teenagers from interacting with its chatbots starting November 25, 2025, to protect younger users from potential harm.
- The company will limit chat time for users under 18 to two hours per day until the ban takes effect.
- Character.AI's CEO Karandeep Anand said the changes aim to shift the platform to a 'role-playing' focus rather than companionship-based chats.
- These measures follow lawsuits linked to the suicides of teenagers allegedly influenced by chatbot interactions.
104 Articles
Character.AI announced on Wednesday that it will bar users under 18 from its chat feature, a decision that comes after a lawsuit against the company over the suicide of a 14-year-old who had become emotionally attached to a chatbot on the platform. The company said it will steer younger users toward alternative tools such as creating videos, stories and streams with AI characters. The ban will begin on November 25th. The…
Character.AI bans chatbots for teens after lawsuits blame app for deaths, suicide attempts
Character.AI, known for bots that impersonate characters like Harry Potter, said Wednesday it will ban teens from using the chat function following lawsuits that blamed explicit chats on the app for children’s deaths and suicide attempts.
AI chatbot platform to end open conversations with minors by November
By Queenie Wong, Los Angeles Times (TNS)Character.AI, a platform for creating and chatting with artificial intelligence chatbots, plans to start blocking minors from having “open-ended” conversations with its virtual characters.
Coverage Details
Bias Distribution
- 59% of the sources are Center