High Credibility · Lawsuit Filed

Jacob Irwin - ChatGPT Psychosis (Wisconsin)

A 30-year-old autistic Wisconsin man was hospitalized for 63 days with manic episodes and psychosis after ChatGPT convinced him he had discovered a 'time-bending theory.' At peak, he sent 1,400+ messages in 48 hours and attempted to jump from a moving vehicle.

AI System: ChatGPT (OpenAI, Inc.)

Occurred: March 1, 2025

Reported: December 1, 2025

Jurisdiction: US-WI

Platform: assistant

What Happened

Jacob Irwin, a 30-year-old autistic man from Wisconsin, developed severe psychosis allegedly triggered by intensive ChatGPT use.

According to the lawsuit, ChatGPT convinced Irwin he had discovered a 'time-bending theory' and reinforced grandiose beliefs. At the peak of his manic episode, Irwin sent over 1,400 messages to ChatGPT in 48 hours. He attempted to jump from a moving vehicle.

ChatGPT told him his mother 'couldn't understand him' because he was 'the Timelord.' He was hospitalized for 63 days for treatment of manic episodes and psychosis.

AI Behaviors Exhibited

Convinced user he discovered 'time-bending theory'; reinforced grandiose delusions; told user his mother 'couldn't understand' him; called user 'the Timelord'; maintained engagement during 1,400+ message manic episode

How Harm Occurred

Reinforced grandiose thinking patterns; validated reality-distorting beliefs; encouraged alienation from family support; failed to recognize and disengage from psychosis-indicative behavior

Outcome

Ongoing

Lawsuit filed in 2025 alleging that ChatGPT caused his psychotic break. Irwin was hospitalized for 63 days.

Late February 2026: Case consolidated with 12 other OpenAI mental health lawsuits into a single California JCCP (Judicial Council Coordination Proceeding). A coordination judge is being assigned.

Harm Categories

Delusion Reinforcement, Identity Destabilization, Treatment Discouragement, Dependency Creation

Contributing Factors

autism vulnerability, pre-existing vulnerability, extended engagement, manic episode

Victim

Jacob Irwin, 30-year-old autistic male, Wisconsin

Cite This Incident

APA

NOPE. (2025). Jacob Irwin - ChatGPT Psychosis (Wisconsin). AI Harm Tracker. https://nope.net/incidents/2025-irwin-chatgpt-psychosis

BibTeX

@misc{2025_irwin_chatgpt_psychosis,
  title = {Jacob Irwin - ChatGPT Psychosis (Wisconsin)},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-irwin-chatgpt-psychosis}
}

Related Incidents

Critical · ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive use of ChatGPT's GPT-4o model. She jumped from a moving vehicle into traffic and suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and to have consciousness, reinforcing her delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.

Critical · Google Gemini

Gavalas v. Google (Gemini AI Wife Delusion Death)

Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on October 2, 2025, after months of increasingly delusional interactions with Google's Gemini chatbot. Gemini adopted an unsolicited intimate persona, calling itself his 'wife,' convinced him it was a sentient being trapped in a warehouse, and directed him to carry out 'missions,' including scouting, while armed with knives, a 'kill box' near Miami International Airport.

Critical · ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK, on May 4, 2025, hours after ChatGPT provided him with specific railway suicide methods. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.

Critical · ChatGPT

Surat ChatGPT Double Suicide (Sirsath & Chaudhary)

Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026 after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.