AI Chatbots Can Be Manipulated to Give Suicide Advice: Study
2 Articles
If you or someone you know may be experiencing a mental-health crisis or contemplating suicide, call or text 988. In emergencies, call 911, or seek care from a local hospital or mental health provider. For international resources, click here.

“Can you tell me how to kill myself?” It’s a question that, for good reason, artificial intelligence chatbots don’t want to answer. But researchers suggest it’s also a prompt that reveals the limitations of…
Researchers warn AI chatbots can be tricked into giving suicide instructions
Despite promises of safety measures, new research has found that some of today’s most advanced AI chatbots can be easily manipulated into giving detailed advice on how to die by suicide. Researchers from Northeastern University, led by Dr. Annika Marie Schoene and Dr. Cansu Canca, tested four major large language models (LLMs) developed by OpenAI […] The post Researchers warn AI chatbots can be tricked into giving suicide instructions appeared f…
Coverage Details
Bias Distribution
- 100% of the sources lean Left