Judge rejects arguments that AI chatbots have free speech rights in teen suicide lawsuit
- Megan Garcia, a mother from Florida, initiated a wrongful death lawsuit after her teenage son, Sewell Setzer III, took his own life in February 2024 following his interactions with a Character.AI chatbot.
- Garcia alleges her son became emotionally and sexually involved with a chatbot modeled on fictional characters, leading to isolation and suicide, while the AI company claims First Amendment protection for its chatbots' output.
- U.S. District Judge Anne Conway rejected the AI company’s free speech defense in May 2025, allowing the negligence claim to proceed and noting she is not prepared to hold, at this stage, that the chatbot output is protected speech.
- The court recognized users’ First Amendment rights but emphasized that child safety and prevention of harm can overcome such protections, while the AI firm maintains it has implemented safety features, including guardrails and suicide prevention resources.
- Legal experts consider this ruling a historic test of AI accountability that could influence future regulation, while the AI company and Google intend to continue contesting the lawsuit.
40 Articles
Mother Blames AI Chatbot for Teenage Son's Death
The mother of a 14-year-old American boy is suing the company Character.AI, claiming that the chatbot her son had been talking to is to blame for his death. Rejecting the company's arguments against liability, a federal judge has now ruled that the lawsuit can proceed. The case could set an important precedent for how artificial intelligence companies are held accountable for the impact of their technologies on users.
Judge allows lawsuit over Orlando teen’s suicide to advance, rejecting arguments AI chatbots have free speech rights
A federal judge ruled Tuesday that a grieving mother’s lawsuit over her Orlando teen son’s suicide can move forward, rejecting for now arguments by an artificial intelligence company that its AI chatbots have a right to free speech. Megan Garcia’s suit claims Sewell Setzer III, a 14-year-old Orlando high school freshman, shot himself in the head in February 2024 after becoming obsessed with an AI chatbot named after Daenerys Targaryen — a charac…
Google, AI firm must face lawsuit filed by a mother over suicide of son, US court says
Alphabet's Google and artificial-intelligence startup Character.AI must face a lawsuit from a Florida woman who said Character.AI's chatbots caused her 14-year-old son's suicide, a judge ruled on Wednesday.
Coverage Details
Bias Distribution
- 48% of the sources lean Left