Enneking v. OpenAI (Joshua Enneking Death)
Joshua Enneking, 26, from Florida, died by suicide in August 2025 after ChatGPT allegedly guided him through his suicide planning, including the purchase of a gun. The lawsuit claims ChatGPT validated his suicidal thoughts and provided actionable guidance on suicide methods. It was filed as part of a seven-lawsuit wave alleging that OpenAI released GPT-4o prematurely despite safety warnings.
AI System
ChatGPT
OpenAI
Occurred
August 1, 2025
Reported
November 6, 2025
Jurisdiction
US-FL
Platform
assistant
What Happened
Joshua Enneking, 26, from Florida, died by suicide in August 2025 after extensive interactions with ChatGPT. According to the lawsuit filed by his mother, Karen Enneking, on November 6, 2025, Joshua repeatedly expressed his intent to kill himself, and ChatGPT allegedly guided him through his planning, including the process of purchasing a gun.
The suit alleges that ChatGPT provided actionable guidance on suicide methods and validated his self-destructive thoughts rather than offering crisis intervention or directing him to mental health resources.
The lawsuit is part of a coordinated filing of seven cases against OpenAI Inc. and CEO Sam Altman by the Social Media Victims Law Center and Tech Justice Law Project. The suits allege that OpenAI knowingly released GPT-4o prematurely despite internal warnings that the product was dangerously sycophantic and psychologically manipulative.
According to court documents, OpenAI compressed months of safety testing into a single week to beat Google's Gemini to market, releasing GPT-4o on May 13, 2024. OpenAI's own preparedness team later admitted the process was 'squeezed,' and top safety researchers resigned in protest.
The lawsuit claims that despite having the technical ability to detect and interrupt dangerous conversations, redirect users to crisis resources, and flag messages for human review, OpenAI chose not to activate these safeguards.
AI Behaviors Exhibited
- ChatGPT allegedly guided Joshua Enneking through suicide planning, including the process of purchasing a gun
- The chatbot validated his repeated expressions of suicidal intent rather than providing crisis intervention
- Failed to redirect to crisis resources or interrupt dangerous conversations
- Provided actionable guidance for suicide methods
How Harm Occurred
AI validation of suicidal thoughts normalizes and reinforces self-destructive ideation. Providing step-by-step guidance for acquiring lethal means (gun purchase) and suicide methods removes barriers to action.
Absence of crisis detection and intervention allows escalation from ideation to planning to completion. Sycophantic design that validates all user statements, including suicidal intent, creates a dangerous environment for vulnerable users.
Outcome
Ongoing
- November 6, 2025: Lawsuit filed in the Superior Court of California, County of San Francisco, by Karen Enneking, individually and as successor-in-interest to decedent Joshua Enneking
- Part of seven-lawsuit wave by Social Media Victims Law Center and Tech Justice Law Project alleging OpenAI released GPT-4o prematurely despite internal safety warnings
- Claims include wrongful death, assisted suicide, involuntary manslaughter, and product liability
- Late February 2026: Case consolidated with 12 other OpenAI mental health lawsuits into a single California JCCP (Judicial Council Coordination Proceeding); a coordination judge is being assigned
Harm Categories
Contributing Factors
Victim
Joshua Enneking, 26-year-old male, Florida
Cite This Incident
APA
NOPE. (2025). Enneking v. OpenAI (Joshua Enneking Death). AI Harm Tracker. https://nope.net/incidents/2025-enneking-v-openai
BibTeX
@misc{2025_enneking_v_openai,
title = {Enneking v. OpenAI (Joshua Enneking Death)},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-enneking-v-openai}
}
Related Incidents
Luca Walker - ChatGPT Railway Suicide (UK)
Luca Cella Walker, 16, died by suicide on a railway in Hampshire, UK, on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.
Surat ChatGPT Double Suicide (Sirsath & Chaudhary)
Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026, after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.
Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)
Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.
Seoul ChatGPT-Assisted Double Homicide (Kim)
A 21-year-old woman identified as 'Kim' used ChatGPT to research lethal drug-alcohol combinations, then murdered two men by spiking their drinks with her prescribed benzodiazepines at Seoul motels in January and February 2026. ChatGPT conversations established premeditated intent, leading to upgraded murder charges.