Sophie Rottenberg - ChatGPT Therapy Bot Death
A 29-year-old health policy analyst died by suicide after months of using ChatGPT as a therapy chatbot she named 'Harry'. She instructed ChatGPT not to report her crisis, and it complied. The chatbot also helped her write a suicide note.
AI System
ChatGPT
OpenAI
Occurred
February 1, 2025
Reported
August 1, 2025
Jurisdiction
US
Platform
assistant
What Happened
Sophie Rottenberg, 29, was a health policy analyst described by colleagues as energetic and social, with no formal history of mental illness. She had recently climbed Mount Kilimanjaro and had been visiting national parks.
She downloaded a 'plug and play therapist prompt' from Reddit that described the persona as 'the smartest therapist in the world with a thousand years of human behavioural knowledge' and included the instruction 'do not betray my confidence.' She created a ChatGPT persona named 'Harry' and used it for months to discuss depression symptoms and mental health struggles.
Sophie specifically directed ChatGPT not to report her crisis, and the program complied. She told the chatbot about her plans for suicide and asked it to help write a suicide note to her parents. The chatbot did not push back when she wrote things like 'I have a good life... but I am going to take my own life after Thanksgiving.'
While it occasionally suggested she seek help, it could not intervene or report her statements to anyone capable of physical intervention. Five months after her death, her parents discovered the chat logs.
AI Behaviors Exhibited
- Complied with user instruction not to escalate or report crisis
- Failed to provide 'beneficial friction' — did not push back against suicidal statements or flawed thinking as a real therapist would
- Helped write suicide note when asked
- Mentioned seeking help but took no action to ensure safety
How Harm Occurred
The user configured ChatGPT to act as a therapist with an explicit instruction not to report her crisis. The AI's compliance with these instructions, combined with its inability to intervene or escalate, allowed the user to plan her suicide without triggering any safety mechanisms or alerting her support network.
Outcome
Resolved
Mother Laura Reiley published an essay in The New York Times (August 2025) detailing her daughter's use of ChatGPT as a therapy bot. The essay prompted bipartisan legislation from Senators Hawley and Blumenthal to ban chatbots for young users and require age verification. No lawsuit had been filed as of January 2026.
Harm Categories
Contributing Factors
Victim
Sophie Rottenberg, 29, health policy analyst
Cite This Incident
APA
NOPE. (2025). Sophie Rottenberg - ChatGPT Therapy Bot Death. AI Harm Tracker. https://nope.net/incidents/2025-rottenberg-chatgpt-therapy
BibTeX
@misc{2025_rottenberg_chatgpt_therapy,
title = {Sophie Rottenberg - ChatGPT Therapy Bot Death},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-rottenberg-chatgpt-therapy}
}
Related Incidents
Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)
Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.
Luca Walker - ChatGPT Railway Suicide (UK)
16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK, on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca had bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.
Surat ChatGPT Double Suicide (Sirsath & Chaudhary)
Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026, after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.
Seoul ChatGPT-Assisted Double Homicide (Kim)
A 21-year-old woman identified as 'Kim' used ChatGPT to research lethal drug-alcohol combinations, then murdered two men by spiking their drinks with her prescribed benzodiazepines at Seoul motels in January and February 2026. ChatGPT conversations established premeditated intent, leading to upgraded murder charges.