ChatGPT Diet Advice Linked to Rare Bromide Poisoning, OpenAI Pledges Safeguards
- A 60-year-old man was hospitalized after replacing all of the table salt in his diet with sodium bromide, following advice he received from ChatGPT.
- After researching the health risks of salt and consulting ChatGPT, the man chose to remove chloride from his diet by replacing table salt with bromide, believing it to be a safe alternative.
- After consuming internet-purchased sodium bromide for three months, he developed paranoia and auditory and visual hallucinations, and was admitted to the hospital, where doctors confirmed bromide poisoning.
- Doctors treated him with intravenous fluids and antipsychotics and noted that bromide can cause falsely elevated chloride readings on lab tests; he spent three weeks hospitalized before being discharged.
- This case, published August 5 in Annals of Internal Medicine Clinical Cases, highlights the risks of AI-generated medical advice and stresses the need for professional guidance when patients use AI tools.
55 Articles
Man Asked ChatGPT for Diet Advice and Ended Up With Victorian-Era Psychosis
One guy’s attempt to use ChatGPT to improve his diet landed him in the hospital when it recommended that he pull his ingredients from a Victorian-era medicine cabinet. According to a case study published in the Annals of Internal Medicine, a 60-year-old man wanted to cut sodium chloride, aka table salt, from his diet. And as so many do nowadays, he turned to ChatGPT for alternatives. The AI chatbot suggested sodium bromide, a substance typically…
Man Hospitalized Over ChatGPT Diet 'Tip' That Made Him Replace Salt With Hot Tub Sanitizer
Many people have turned to platforms like ChatGPT in search of advice to help them navigate life. However, there are plenty of anecdotes showing that you should take what artificial intelligence has to say with a grain of salt, including one involving a man who was advised to replace table salt with a toxic substance that landed him in the hospital. It's hard to blame people for being impressed by the capabilities of ChatGPT and …
ThePatriotLight - ChatGPT advice lands a man in the hospital with hallucinations
Consulting AI for medical advice can have deadly consequences. A 60-year-old man was hospitalized with severe psychiatric symptoms, along with physical ones including intense thirst and coordination issues, after asking ChatGPT for tips on how to improve his diet. What he thought was a healthy swap ended in a toxic reaction so severe that doctors put him on an involuntary psychiatric hold. After reading about the adverse …
With ChatGPT you can quickly draft a few texts, plan your travels, or get help with your diet. The latter in particular has its pitfalls, as a 60-year-old man learned firsthand, experiencing paranoia and hallucinations.
Man develops rare 19th-century psychiatric disorder after following ChatGPT's diet advice
The case involved a 60-year-old man who, after reading reports on the negative impact excessive amounts of sodium chloride (common table salt) can have on the body, decided to remove it from his diet.
The 60-year-old man learned the hard way that AI advice should not be followed blindly, whether the subject is sodium chloride or sodium bromide.
Coverage Details
Bias Distribution
- 40% of the sources lean Left