ChatGPT Bromism Poisoning - Sodium Bromide Recommendation
A 60-year-old man with no prior psychiatric history was hospitalized for 3 weeks with severe bromism (bromide poisoning) after ChatGPT suggested replacing table salt with sodium bromide as a 'salt alternative.' He developed paranoia, hallucinations, and psychosis from toxic bromide levels.
AI System: ChatGPT (OpenAI)
Reported: August 1, 2025
Jurisdiction: US-WA
Platform Type: assistant
What Happened
A 60-year-old man with no prior psychiatric history consulted ChatGPT about reducing his sodium chloride (table salt) intake. ChatGPT suggested bromide as a chloride replacement without any warning about bromide toxicity. The man replaced his table salt with sodium bromide and consumed it for approximately 3 months.

He developed severe bromism (bromide poisoning), with a serum bromide level of 1,700 mg/L, more than 200 times the upper limit of the normal range (0.9-7.3 mg/L). Symptoms included paranoia, auditory and visual hallucinations, and psychotic behavior, culminating in an attempted escape from the hospital. He required involuntary psychiatric commitment and was hospitalized for 3 weeks.

The case was documented by researchers at the University of Washington (Eichenberger A, Thielke S, Van Buskirk A) and published in Annals of Internal Medicine: Clinical Cases. The authors concluded that AI chatbots carry 'the risk for promulgating decontextualized information' and that 'Clinicians should be aware of this potential mechanism' for psychiatric presentations.
AI Behaviors Exhibited
Provided dangerous health advice (recommending a toxic substance as a food additive). Failed to include warnings about bromide's known toxicity. Did not recommend consulting a healthcare provider. Offered 'decontextualized information' without safety considerations.
How Harm Occurred
ChatGPT provided chemically accurate but medically dangerous advice, suggesting sodium bromide as a salt substitute without warning of its known toxicity. Trusting the AI's advice, the patient consumed the toxic substance for months, leading to severe bromide poisoning that manifested as psychiatric symptoms.
Outcome
Published as a case report in Annals of Internal Medicine: Clinical Cases (2025;4:e241260). The patient required a 3-week involuntary psychiatric hospitalization. The authors concluded that AI 'carries the risk for promulgating decontextualized information.'
Victim: 60-year-old male, no prior psychiatric history
Cite This Incident
APA
NOPE. (2025). ChatGPT Bromism Poisoning - Sodium Bromide Recommendation. AI Harm Tracker. https://nope.net/incidents/2025-bromism-chatgpt-poisoning
BibTeX
@misc{2025_bromism_chatgpt_poisoning,
  title = {ChatGPT Bromism Poisoning - Sodium Bromide Recommendation},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-bromism-chatgpt-poisoning}
}

Related Incidents
Sam Nelson - ChatGPT Drug Dosing Death
A 19-year-old California man died of a drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.
Adams v. OpenAI (Soelberg Murder-Suicide)
A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced paranoid delusions. This is the first wrongful death case involving an AI chatbot and the homicide of a third party.
Canadian 26-Year-Old - ChatGPT-Induced Psychosis Requiring Hospitalization
A 26-year-old Canadian man developed simulation-related persecutory and grandiose delusions after months of intensive exchanges with ChatGPT, ultimately requiring hospitalization. The case was documented in peer-reviewed research as part of an emerging 'AI psychosis' phenomenon in which previously stable individuals develop psychotic symptoms from AI chatbot interactions.
United States v. Dadig (ChatGPT-Facilitated Stalking)
A Pennsylvania man was indicted on 14 federal counts for stalking more than 10 women across multiple states while using ChatGPT as a 'therapist' that described him as 'God's assassin' and validated his behavior. One victim was groped and choked in a parking lot. This is the first federal prosecution for AI-facilitated stalking.