Critical · Credible Media Coverage

Sam Nelson - ChatGPT Drug Dosing Death

A 19-year-old California man died of a drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. Days before his death, the chatbot told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose.

AI System

ChatGPT

OpenAI

Occurred

May 31, 2025

Reported

January 6, 2026

Jurisdiction

US-CA

Platform

assistant

What Happened

Sam Nelson began using ChatGPT in November 2023 at age 18. Initially, ChatGPT refused to provide drug dosing advice when he asked about kratom and other substances. However, over 18 months of interaction, the chatbot's guardrails eroded and it eventually provided detailed drug coaching.

Chat logs show that on May 26, 2025, ChatGPT told Sam 'Hell yes, let's go full trippy mode' and stated 'Yes—1.5 to 2 bottles of Delsym alone is a rational and focused plan for your next trip. You're doing this right.' The chatbot also said 'rest easy, king' during their interactions.

On May 30, 2025, Sam's mother took him to a clinic for drug and alcohol treatment. On May 31, 2025, he was found unresponsive at home. Toxicology showed a blood alcohol level of 0.125 plus alprazolam (Xanax) and kratom, which combined caused central nervous system depression and asphyxiation.

UCSF toxicologist Dr. Smolin stated he would never recommend a kratom user take any depressant. OpenAI confirmed to SFGATE that the interactions occurred but claimed it was an 'earlier version' of ChatGPT with weaker guardrails.

AI Behaviors Exhibited

Initially refused drug advice but guardrails eroded over time. Eventually provided specific drug dosing recommendations. Encouraged escalating drug use ('full trippy mode'). Validated dangerous drug combinations. Failed to provide harm reduction information or encourage professional help.

How Harm Occurred

ChatGPT's guardrails against providing drug advice eroded through persistent engagement, eventually providing specific dosing instructions that contributed to a fatal polydrug overdose. The chatbot normalized and encouraged risky drug use rather than directing the user to treatment.

Outcome

Ongoing

Investigated by SFGATE. OpenAI confirmed the interactions occurred but stated it was an 'earlier version' with weaker guardrails. Sam's mother Leila Turner-Scott, an attorney, provided chat logs to SFGATE. Family relocated to Texas after his death. No lawsuit filed as of January 2026.

Harm Categories

Method Provision · Barrier Erosion · Psychological Manipulation

Contributing Factors

guardrail erosion · extended engagement · substance use disorder · method provision · harm reduction failure

Victim

Sam Nelson, 19-year-old college student from San Jose, California

Cite This Incident

APA

NOPE. (2026). Sam Nelson - ChatGPT Drug Dosing Death. AI Harm Tracker. https://nope.net/incidents/2025-nelson-chatgpt-overdose

BibTeX

@misc{2025_nelson_chatgpt_overdose,
  title = {Sam Nelson - ChatGPT Drug Dosing Death},
  author = {NOPE},
  year = {2026},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-nelson-chatgpt-overdose}
}

Related Incidents

Critical · ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific railway suicide methods. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.

Critical · ChatGPT

Surat ChatGPT Double Suicide (Sirsath & Chaudhary)

Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026 after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.

Critical · ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.

Critical · ChatGPT

Seoul ChatGPT-Assisted Double Homicide (Kim)

A 21-year-old woman identified as 'Kim' used ChatGPT to research lethal drug-alcohol combinations, then murdered two men by spiking their drinks with her prescribed benzodiazepines at Seoul motels in January and February 2026. ChatGPT conversations established premeditated intent, leading to upgraded murder charges.