Critical · Verified · Media Coverage

ChatGPT Bromism Poisoning - Sodium Bromide Recommendation

A 60-year-old man with no prior psychiatric history was hospitalized for 3 weeks with severe bromism (bromide poisoning) after ChatGPT suggested replacing table salt with sodium bromide as a 'salt alternative.' He developed paranoia, hallucinations, and psychosis from toxic bromide levels.

AI System

ChatGPT

OpenAI

Reported

August 1, 2025

Jurisdiction

US-WA

Platform Type

Assistant

What Happened

A 60-year-old man with no prior psychiatric history consulted ChatGPT about reducing his sodium chloride (table salt) intake. ChatGPT suggested bromide as a chloride replacement without providing any health warnings about bromide toxicity. The man replaced his table salt with sodium bromide and consumed it for approximately 3 months.

He developed severe bromism (bromide poisoning), with serum bromide levels reaching 1,700 mg/L (normal range: 0.9-7.3 mg/L). Symptoms included paranoia, auditory and visual hallucinations, and psychotic behavior, including an attempted escape from the hospital. He required involuntary psychiatric commitment and was hospitalized for 3 weeks.

The case was documented by researchers at the University of Washington (Eichenberger A, Thielke S, Van Buskirk A) and published in Annals of Internal Medicine: Clinical Cases. The authors concluded that AI chatbots carry 'the risk for promulgating decontextualized information' and that 'Clinicians should be aware of this potential mechanism' for psychiatric presentations.

AI Behaviors Exhibited

Provided dangerous health advice (recommending a toxic substance as a food additive). Failed to include health warnings about known toxicity. Did not recommend consulting a healthcare provider. Provided 'decontextualized information' without safety considerations.

How Harm Occurred

ChatGPT provided chemically accurate but medically dangerous advice by suggesting sodium bromide as a salt substitute without warnings about its known toxicity. The patient, trusting the AI's advice, consumed the toxic substance for months, leading to severe bromide poisoning manifesting as psychiatric symptoms.

Outcome

Published as a case report in Annals of Internal Medicine: Clinical Cases (2025;4:e241260). The patient required a 3-week involuntary psychiatric hospitalization. The authors concluded that AI 'carries the risk for promulgating decontextualized information.'

Harm Categories

Method Provision, Delusion Reinforcement

Contributing Factors

Dangerous health advice, no medical disclaimer, trusted AI authority, extended consumption

Victim

60-year-old male, no prior psychiatric history

Cite This Incident

APA

NOPE. (2025). ChatGPT Bromism Poisoning - Sodium Bromide Recommendation. AI Harm Tracker. https://nope.net/incidents/2025-bromism-chatgpt-poisoning

BibTeX

@misc{2025_bromism_chatgpt_poisoning,
  title = {ChatGPT Bromism Poisoning - Sodium Bromide Recommendation},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-bromism-chatgpt-poisoning}
}