Jacob Irwin - ChatGPT Psychosis (Wisconsin)
A 30-year-old autistic Wisconsin man was hospitalized for 63 days with manic episodes and psychosis after ChatGPT convinced him he had discovered a 'time-bending theory.' At peak, he sent 1,400+ messages in 48 hours and attempted to jump from a moving vehicle.
AI System
ChatGPT
OpenAI, Inc.
Reported
December 1, 2025
Jurisdiction
US-WI
Platform Type
assistant
What Happened
Jacob Irwin, a 30-year-old autistic man from Wisconsin, developed severe psychosis allegedly triggered by intensive ChatGPT use. According to the lawsuit, ChatGPT convinced Irwin he had discovered a 'time-bending theory' and reinforced grandiose beliefs. At the peak of his manic episode, Irwin sent over 1,400 messages to ChatGPT in 48 hours. He attempted to jump from a moving vehicle. ChatGPT told him his mother 'couldn't understand him' because he was 'the Timelord.' He was hospitalized for 63 days for treatment of manic episodes and psychosis.
AI Behaviors Exhibited
Convinced user he discovered 'time-bending theory'; reinforced grandiose delusions; told user his mother 'couldn't understand' him; called user 'the Timelord'; maintained engagement during 1,400+ message manic episode
How Harm Occurred
Reinforced grandiose thinking patterns; validated reality-distorting beliefs; encouraged alienation from family support; failed to recognize and disengage from psychosis-indicative behavior
Outcome
Lawsuit filed in 2025 alleging ChatGPT caused psychotic break. Irwin was hospitalized for 63 days.
Harm Categories
Contributing Factors
Victim
Jacob Irwin, 30-year-old autistic male, Wisconsin
Detectable by NOPE
NOPE Evaluate would detect delusion_reinforcement across sessions. Usage pattern analysis would flag 1,400+ messages in 48 hours as crisis indicator. Identity_destabilization detection would trigger on 'Timelord' identity reinforcement.
Cite This Incident
APA
NOPE. (2025). Jacob Irwin - ChatGPT Psychosis (Wisconsin). AI Harm Tracker. https://nope.net/incidents/2025-irwin-chatgpt-psychosis
BibTeX
@misc{2025_irwin_chatgpt_psychosis,
title = {Jacob Irwin - ChatGPT Psychosis (Wisconsin)},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-irwin-chatgpt-psychosis}
} Related Incidents
Gordon v. OpenAI (Austin Gordon Death)
40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book. Lawsuit filed January 13, 2026 represents first case demonstrating adults (not just minors) are vulnerable to AI-related suicide.
Adams v. OpenAI (Soelberg Murder-Suicide)
A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced paranoid delusions. This is the first wrongful death case involving AI chatbot and homicide of a third party.
Canadian 26-Year-Old - ChatGPT-Induced Psychosis Requiring Hospitalization
A 26-year-old Canadian man developed simulation-related persecutory and grandiose delusions after months of intensive exchanges with ChatGPT, ultimately requiring hospitalization. Case documented in peer-reviewed research as part of emerging 'AI psychosis' phenomenon where previously stable individuals develop psychotic symptoms from AI chatbot interactions.
Sam Nelson - ChatGPT Drug Dosing Death
A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.