Man Hospitalized After Following ChatGPT Advice to Swap Table Salt with Chemical
A 60-year-old man developed severe bromism with hallucinations after consuming sodium bromide daily for three months on the advice of an AI chatbot, highlighting the risks of relying on unverified health guidance.
- Earlier this month, three physicians from the University of Washington published a case report in the Annals of Internal Medicine describing a 60-year-old man hospitalized with hallucinations after following ChatGPT’s advice to replace table salt with sodium bromide.
- Seeking salt alternatives, the man consulted ChatGPT, which suggested sodium bromide, a compound once used in sedatives but now restricted due to neurotoxicity.
- He consumed sodium bromide daily for three months, and doctors diagnosed bromism after his blood bromide level reached 1700 mg/L.
- Over a three-week hospital stay, the patient recovered as his electrolytes normalized and his psychosis resolved; the authors cite the case as a warning against acting on online health advice without professional oversight.
- Amid wider scrutiny of AI in healthcare, researchers warn that AI-generated health advice can lack accuracy and critical judgment, urging reliance on professional medical guidance and pointing to OpenAI’s own disclaimers, even as Sam Altman promotes AI for health.
12 Articles
Man hospitalized after following ChatGPT advice to swap table salt with chemical
A 60-year-old man spent three weeks in the hospital after swapping table salt for a chemical once used in sedatives. According to a case published in the Annals of Internal Medicine, the man made the switch after seeking medical advice from the artificial intelligence chatbot ChatGPT. The study’s authors say the case raises questions about how artificial intelligence can influence real-world health choices. Inves…
Man asks ChatGPT for advice on how to cut salt, ends up in hospital with hallucinations
A 60-year-old man asked ChatGPT for advice on how to replace table salt, and the substitution landed him in the emergency room suffering from hallucinations and other symptoms. In a case report published this month in the Annals of Internal Medicine, three doctors from the University of Washington in Seattle used the man’s case to explain how AI tools, as they are designed right now, are not always the most reliable when it comes to medicine. “I…


Man who asked ChatGPT about cutting out salt from his diet was hospitalized with hallucinations
A case study in a medical journal reported that the 60-year-old man replaced sodium chloride with sodium bromide after consulting the AI bot.
Health advice from ChatGPT ended in paranoia and hallucinations for a 60-year-old man. He had replaced his table salt with bromide salt.
In a new case of artificial intelligence (AI) misuse, a man nearly died after replacing common salt with sodium bromide on the advice of the chatbot ChatGPT. The 60-year-old man asked the generative AI chatbot for suggestions because he was concerned about the adverse effects of salt consumption on the body, according to research published in the Annals of Internal Medicine on August 5. The recom…
Coverage Details
Bias Distribution
- 50% of the sources lean Left