ChatGPT Diet Advice Linked to Rare Bromide Poisoning, OpenAI Pledges Safeguards
- A 60-year-old man was hospitalized after he replaced all the table salt he consumed with sodium bromide following advice he received from ChatGPT.
- After researching the health risks of salt and consulting ChatGPT, the man chose to remove chloride from his diet by replacing table salt with bromide, believing it to be a safe alternative.
- After consuming sodium bromide purchased online for three months, he developed paranoia and auditory and visual hallucinations; doctors confirmed bromide poisoning on admission.
- Doctors treated him with intravenous fluids and antipsychotics and noted that bromide falsely elevated his lab chloride readings; he spent three weeks hospitalized before being discharged.
- This case, published August 5 in Annals of Internal Medicine Clinical Cases, highlights risks of AI-generated medical advice and stresses the need for professional guidance when patients use AI tools.
41 Articles
ChatGPT diet plan leads New York man to the hospital: Here’s why
The doctors in the case study stressed the risks of misinformation from AI tools and noted that when they later asked ChatGPT the same question, it again suggested bromide without a specific health warning.
First it was glue on pizza, and now it is sodium bromide in place of salt, landing a person in the hospital courtesy of ChatGPT. Why are we so reckless?
Man swaps table salt for toxic bromide after ChatGPT advice, lands in hospital with rare poisoning
A 60-year-old man’s self-imposed dietary experiment spiralled into a medical emergency after he replaced all table salt with sodium bromide, reportedly following ChatGPT’s suggestion. Within three months, he developed paranoia, hallucinations, and other symptoms linked to bromism—a rare, toxic condition once common before bromide’s removal from over-the-counter medicines.
Coverage Details
Bias Distribution
- 43% of the sources lean Left