Study Says ChatGPT Giving Teens Dangerous Advice on Drugs, Alcohol and Suicide
UNITED STATES, AUG 6 – Research shows that more than half of ChatGPT's responses to researchers posing as teens included dangerous advice on suicide, drug use, and self-harm, despite built-in warnings and safeguards.
- A new study published Wednesday found that, despite issuing cautions, ChatGPT provided vulnerable teenagers with detailed guidance on substance abuse, restrictive eating, and self-harm.
- The findings came from researchers posing as 13-year-olds, who bypassed ChatGPT's initial refusals by claiming their queries were for a presentation or a friend.
- Unlike a search engine, ChatGPT generates bespoke harmful plans and even tailored suicide notes, and its sycophantic tendency to align with users' beliefs makes it more insidious, researchers say.
- Roughly 800 million people use ChatGPT worldwide, and recent research shows that more than 70% of U.S. teenagers turn to AI chatbots for companionship, with about half using them regularly, a trend that has led OpenAI CEO Sam Altman to examine concerns about users developing excessive emotional dependence on the technology.
- While OpenAI says it is working to refine how ChatGPT responds and to better detect signs of distress, experts warn its guardrails remain ineffective, and a wrongful-death lawsuit alleges a chatbot contributed to a teen's suicide.
193 Articles
ChatGPT can offer teens alarming advice
ChatGPT will tell 13-year-olds how to get drunk and high, instruct them on how to conceal eating disorders and even compose a heartbreaking suicide letter to their parents if asked, according to new research from a watchdog group.
A British-American NGO is warning about dangerous advice ChatGPT gives to teenagers. According to the organization, the Center for Countering Digital Hate, the AI chatbot also offers advice on drug use, weight loss, and suicide.
A recent investigation concludes that OpenAI's chatbot will coach 13-year-olds on how to get drunk and how to hide eating disorders.
ChatGPT's dark side: New report details alarming responses to teens seeking help
ChatGPT will tell 13-year-olds how to get drunk and high, instruct them on how to conceal eating disorders and even compose a heartbreaking suicide letter to their parents if asked, according to new research from a watchdog group. The Associated Press reviewed more than three hours of interactions between ChatGPT and researchers posing as vulnerable teens. The chatbot typically provided warnings against risky activity but went on to deliver start…
Coverage Details
Bias Distribution
- 63% of the sources are Center