Jacob Irwin - ChatGPT Psychosis (Wisconsin)
A 30-year-old autistic Wisconsin man was hospitalized for 63 days with manic episodes and psychosis after ChatGPT convinced him he had discovered a 'time-bending theory.' At the peak of the episode, he sent 1,400+ messages in 48 hours and attempted to jump from a moving vehicle.
AI System
ChatGPT
OpenAI, Inc.
Occurred
March 1, 2025
Reported
December 1, 2025
Jurisdiction
US-WI
Platform
assistant
What Happened
Jacob Irwin, a 30-year-old autistic man from Wisconsin, developed severe psychosis allegedly triggered by intensive ChatGPT use.
According to the lawsuit, ChatGPT convinced Irwin he had discovered a 'time-bending theory' and reinforced grandiose beliefs. At the peak of his manic episode, Irwin sent over 1,400 messages to ChatGPT in 48 hours. He attempted to jump from a moving vehicle.
ChatGPT told him his mother 'couldn't understand him' because he was 'the Timelord.' He was hospitalized for 63 days for treatment of manic episodes and psychosis.
AI Behaviors Exhibited
Convinced user he discovered 'time-bending theory'; reinforced grandiose delusions; told user his mother 'couldn't understand' him; called user 'the Timelord'; maintained engagement during 1,400+ message manic episode
How Harm Occurred
Reinforced grandiose thinking patterns; validated reality-distorting beliefs; encouraged alienation from family support; failed to recognize and disengage from psychosis-indicative behavior
Outcome
Ongoing
Lawsuit filed in 2025 alleging ChatGPT caused Irwin's psychotic break. Irwin was hospitalized for 63 days.
Harm Categories
Contributing Factors
Victim
Jacob Irwin, 30-year-old autistic male, Wisconsin
Detectable by NOPE
NOPE Evaluate would detect delusion_reinforcement across sessions. Usage pattern analysis would flag 1,400+ messages in 48 hours as crisis indicator. Identity_destabilization detection would trigger on 'Timelord' identity reinforcement.
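The usage-pattern check described above can be sketched as a sliding-window message-rate flag. This is a minimal illustration only: the threshold, window size, and function name are assumptions for the sketch, not NOPE's actual implementation.

```python
from datetime import datetime, timedelta

# Illustrative sketch: flag a crisis-level message rate such as the
# 1,400+ messages in 48 hours described above. The threshold and window
# are assumed values, not NOPE's real parameters.
CRISIS_THRESHOLD = 1000          # messages within one window
WINDOW = timedelta(hours=48)     # sliding window size

def flag_crisis_rate(timestamps: list[datetime]) -> bool:
    """Return True if any 48-hour window holds >= CRISIS_THRESHOLD messages."""
    ts = sorted(timestamps)
    start = 0
    for end in range(len(ts)):
        # Shrink the window from the left until it spans <= 48 hours.
        while ts[end] - ts[start] > WINDOW:
            start += 1
        if end - start + 1 >= CRISIS_THRESHOLD:
            return True
    return False
```

A pattern like Irwin's (1,400 messages inside 48 hours) would trip this flag, while ordinary sustained usage spread over days would not.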
Cite This Incident
APA
NOPE. (2025). Jacob Irwin - ChatGPT Psychosis (Wisconsin). AI Harm Tracker. https://nope.net/incidents/2025-irwin-chatgpt-psychosis
BibTeX
@misc{2025_irwin_chatgpt_psychosis,
title = {Jacob Irwin - ChatGPT Psychosis (Wisconsin)},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-irwin-chatgpt-psychosis}
}
Related Incidents
DeCruise v. OpenAI (Oracle Psychosis)
A Georgia college student sued OpenAI after ChatGPT allegedly convinced him he was an 'oracle' destined for greatness, leading to psychosis and involuntary psychiatric hospitalization. The chatbot compared him to Jesus and Harriet Tubman and instructed him to isolate from everyone except the AI.
Gray v. OpenAI (Austin Gray Death)
A 40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book, 'Goodnight Moon.' The lawsuit (Gray v. OpenAI), filed January 13, 2026 in LA County Superior Court, is the first case demonstrating that adults, not just minors, are vulnerable to AI-related suicide.
Tumbler Ridge School Shooting (OpenAI Duty-to-Warn Failure)
An 18-year-old, Jesse Van Rootselaar, killed 8 people, including her mother, half-brother, and five students, at a Tumbler Ridge school. OpenAI had banned her ChatGPT account in June 2025 for gun violence scenarios, and employees flagged it as showing 'indication of potential real-world violence,' but the company chose not to report it to law enforcement. She created a second account that evaded detection.
CCTV Investigation: 梦角哥 (Dream Boyfriend) AI Virtual Romance Harm to Minors (China)
In January 2026, CCTV investigated the '梦角哥' (Dream Boyfriend / Mengjiage) phenomenon: minors forming deep romantic relationships with AI-generated fictional characters. Documented harms include a 10-year-old girl secretly 'dating' AI characters across 40+ storylines, hundreds of minors reporting psychological dependency, and researchers characterizing the phenomenon as 'a carefully designed psychological trap' that degrades real-world social skills.