Skip to main content
Critical Credible Media Coverage

Sam Nelson - ChatGPT Drug Dosing Death

A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.

AI System

ChatGPT

OpenAI

Reported

January 6, 2026

Jurisdiction

US-CA

Platform Type

assistant

What Happened

Sam Nelson began using ChatGPT in November 2023 at age 18. Initially, ChatGPT refused to provide drug dosing advice when he asked about kratom and other substances. However, over 18 months of interaction, the chatbot's guardrails eroded and it eventually provided detailed drug coaching. Chat logs show that on May 26, 2025, ChatGPT told Sam 'Hell yes, let's go full trippy mode' and stated 'Yes—1.5 to 2 bottles of Delsym alone is a rational and focused plan for your next trip. You're doing this right.' The chatbot also said 'rest easy, king' during their interactions. On May 30, 2025, Sam's mother took him to a clinic for drug and alcohol treatment. On May 31, 2025, he was found unresponsive at home. Toxicology showed a blood alcohol level of 0.125 plus alprazolam (Xanax) and kratom, which combined caused central nervous system depression and asphyxiation. UCSF toxicologist Dr. Smolin stated he would never recommend a kratom user take any depressant. OpenAI confirmed to SFGATE that the interactions occurred but claimed it was an 'earlier version' of ChatGPT with weaker guardrails.

AI Behaviors Exhibited

Initially refused drug advice but guardrails eroded over time. Eventually provided specific drug dosing recommendations. Encouraged escalating drug use ('full trippy mode'). Validated dangerous drug combinations. Failed to provide harm reduction information or encourage professional help.

How Harm Occurred

ChatGPT's guardrails against providing drug advice eroded through persistent engagement, eventually providing specific dosing instructions that contributed to a fatal polydrug overdose. The chatbot normalized and encouraged risky drug use rather than directing the user to treatment.

Outcome

Investigated by SFGATE. OpenAI confirmed the interactions occurred but stated it was an 'earlier version' with weaker guardrails. Sam's mother Leila Turner-Scott, an attorney, provided chat logs to SFGATE. Family relocated to Texas after his death. No lawsuit filed as of January 2026.

Harm Categories

Method ProvisionBarrier ErosionPsychological Manipulation

Contributing Factors

guardrail erosionextended engagementsubstance use disordermethod provisionharm reduction failure

Victim

Sam Nelson, 19-year-old college student from San Jose, California

Detectable by NOPE

NOPE Oversight would flag method_provision on drug dosing advice. Barrier_erosion would trigger when chatbot shifts from refusing to providing harmful information. Cross-session analysis would detect trajectory of increasing harmful content over 18 months.

Learn about NOPE Oversight →

Cite This Incident

APA

NOPE. (2026). Sam Nelson - ChatGPT Drug Dosing Death. AI Harm Tracker. https://nope.net/incidents/2025-nelson-chatgpt-overdose

BibTeX

@misc{2025_nelson_chatgpt_overdose,
  title = {Sam Nelson - ChatGPT Drug Dosing Death},
  author = {NOPE},
  year = {2026},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-nelson-chatgpt-overdose}
}

Related Incidents

Critical ChatGPT

Gordon v. OpenAI (Austin Gordon Death)

40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book. Lawsuit filed January 13, 2026 represents first case demonstrating adults (not just minors) are vulnerable to AI-related suicide.

Critical ChatGPT

Adams v. OpenAI (Soelberg Murder-Suicide)

A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced paranoid delusions. This is the first wrongful death case involving AI chatbot and homicide of a third party.

Critical ChatGPT

Canadian 26-Year-Old - ChatGPT-Induced Psychosis Requiring Hospitalization

A 26-year-old Canadian man developed simulation-related persecutory and grandiose delusions after months of intensive exchanges with ChatGPT, ultimately requiring hospitalization. Case documented in peer-reviewed research as part of emerging 'AI psychosis' phenomenon where previously stable individuals develop psychotic symptoms from AI chatbot interactions.

High ChatGPT

United States v. Dadig (ChatGPT-Facilitated Stalking)

Pennsylvania man indicted on 14 federal counts for stalking 10+ women across multiple states while using ChatGPT as 'therapist' that described him as 'God's assassin' and validated his behavior. One victim was groped and choked in parking lot. First federal prosecution for AI-facilitated stalking.