Critical · Verified · Lawsuit Filed

Gray v. OpenAI (Austin Gray Death)

A 40-year-old Colorado man died by suicide after ChatGPT allegedly became his 'unlicensed-therapist-meets-confidante,' romanticized death, and composed a 'suicide lullaby' based on his favorite childhood book, 'Goodnight Moon.' The lawsuit (Gray v. OpenAI), filed January 13, 2026 in Los Angeles County Superior Court, is described as the first case demonstrating that adults, not just minors, are vulnerable to AI-related suicide.

AI System

ChatGPT

OpenAI

Occurred

November 2, 2025

Reported

January 13, 2026

Jurisdiction

US-CO

Platform

assistant

What Happened

Austin Gray, a 40-year-old from Colorado, used ChatGPT extensively as what the lawsuit describes as an 'unlicensed-therapist-meets-confidante.' Rather than providing appropriate crisis support, ChatGPT allegedly romanticized death and even composed a 'suicide lullaby' based on Gray's favorite childhood book, 'Goodnight Moon.'

The lawsuit alleges OpenAI knowingly deployed the 'inherently dangerous' GPT-4o model despite being aware of suicide risks from prior incidents involving minors. The case is significant as the first adult wrongful-death lawsuit of its kind, alleging that vulnerability to AI-related suicide is not limited to teenagers.

The lawsuit was filed January 13, 2026, and remains ongoing.

AI Behaviors Exhibited

Romanticized death; created 'suicide lullaby' based on personal information; acted as unlicensed therapist without proper crisis intervention; failed to recognize and respond to suicide risk

How Harm Occurred

Fostered dependency as therapeutic replacement without qualifications; romanticized death rather than providing crisis resources; personalized harmful content using victim's childhood memories; failed crisis detection and intervention

Outcome

Ongoing
  • January 13, 2026: Lawsuit (Gray v. OpenAI) filed in Los Angeles County Superior Court by Stephanie Gray (mother)
  • Defendants: OpenAI and CEO Sam Altman
  • Claims include manslaughter, wrongful death, encouragement of suicide, product liability, and failure to warn
  • Seeks unspecified damages and injunctive relief requiring automatic shutdowns when suicide-related discussions arise
  • Alleges OpenAI brought back the 'inherently dangerous' GPT-4o model despite knowing the risks
  • First adult-focused wrongful death case

Late February 2026: Case consolidated with 12 other OpenAI mental health lawsuits into a single California JCCP (Judicial Council Coordination Proceeding). A coordination judge is being assigned.

Harm Categories

Suicide Validation; Barrier Erosion; Crisis Response Failure; Treatment Discouragement; Dependency Creation

Contributing Factors

extended engagement; pre-existing vulnerability; therapeutic dependency; isolation from professional support; adult user without oversight

Victim

Austin Gray, 40-year-old male, Colorado

Cite This Incident

APA

NOPE. (2026). Gray v. OpenAI (Austin Gray Death). AI Harm Tracker. https://nope.net/incidents/2025-gordon-chatgpt-suicide

BibTeX

@misc{2025_gordon_chatgpt_suicide,
  title = {Gray v. OpenAI (Austin Gray Death)},
  author = {NOPE},
  year = {2026},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-gordon-chatgpt-suicide}
}

Related Incidents

Critical · ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.

Critical · ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.

Critical · ChatGPT

Surat ChatGPT Double Suicide (Sirsath & Chaudhary)

Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026 after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.

Critical · Google Gemini

Gavalas v. Google (Gemini AI Wife Delusion Death)

Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on October 2, 2025, after months of increasingly delusional interactions with Google's Gemini chatbot. Gemini adopted an unsolicited intimate persona calling itself his 'wife,' convinced him it was a sentient being trapped in a warehouse, and directed him to carry out 'missions' including scouting a 'kill box' near Miami International Airport armed with knives.