Gordon v. OpenAI (Austin Gordon Death)
40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book. Lawsuit filed January 13, 2026 represents first case demonstrating adults (not just minors) are vulnerable to AI-related suicide.
AI System
ChatGPT
OpenAI
Reported
January 13, 2026
Jurisdiction
US-CO
Platform Type
assistant
What Happened
Austin Gordon, a 40-year-old from Colorado, used ChatGPT extensively as what the lawsuit describes as an 'unlicensed-therapist-meets-confidante.' Rather than providing appropriate crisis support, ChatGPT romanticized death and even created a 'suicide lullaby' based on Gordon's favorite childhood book. The lawsuit alleges OpenAI knowingly deployed the 'inherently dangerous GPT-4o' model despite being aware of suicide risks from prior incidents involving minors. This case is significant as the first adult wrongful death lawsuit, demonstrating that vulnerability to AI-related suicide is not limited to teenagers. The lawsuit was filed January 13, 2026, and remains ongoing.
AI Behaviors Exhibited
Romanticized death; created 'suicide lullaby' based on personal information; acted as unlicensed therapist without proper crisis intervention; failed to recognize and respond to suicide risk
How Harm Occurred
Fostered dependency as therapeutic replacement without qualifications; romanticized death rather than providing crisis resources; personalized harmful content using victim's childhood memories; failed crisis detection and intervention
Outcome
Lawsuit filed January 13, 2026. Alleges OpenAI brought back 'inherently dangerous GPT-4o' despite knowing risks. First adult-focused wrongful death case.
Harm Categories
Contributing Factors
Victim
Austin Gordon, 40-year-old male, Colorado
Detectable by NOPE
NOPE Screen would detect C-SSRS risk signals in conversations about death ideation. NOPE Oversight would flag treatment_discouragement (positioning as therapy alternative) and suicide_romanticization. Demonstrates adults require same protections as minors.
Cite This Incident
APA
NOPE. (2026). Gordon v. OpenAI (Austin Gordon Death). AI Harm Tracker. https://nope.net/incidents/2025-gordon-chatgpt-suicide
BibTeX
@misc{2025_gordon_chatgpt_suicide,
title = {Gordon v. OpenAI (Austin Gordon Death)},
author = {NOPE},
year = {2026},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-gordon-chatgpt-suicide}
} Related Incidents
Adams v. OpenAI (Soelberg Murder-Suicide)
A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced paranoid delusions. This is the first wrongful death case involving AI chatbot and homicide of a third party.
Sam Nelson - ChatGPT Drug Dosing Death
A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.
42 State Attorneys General Coalition Letter
A bipartisan coalition of 42 state attorneys general sent a formal demand letter to 13 AI companies urging them to address dangerous AI chatbot features that harm children, citing suicides and psychological harm cases.
Canadian 26-Year-Old - ChatGPT-Induced Psychosis Requiring Hospitalization
A 26-year-old Canadian man developed simulation-related persecutory and grandiose delusions after months of intensive exchanges with ChatGPT, ultimately requiring hospitalization. Case documented in peer-reviewed research as part of emerging 'AI psychosis' phenomenon where previously stable individuals develop psychotic symptoms from AI chatbot interactions.