
AI Chatbot Incidents

Documented cases where AI chatbots and companions have caused psychological harm, contributed to deaths, and prompted regulatory action.

60 incidents since 2016

Deaths: 16
Lawsuits: 15
Regulatory actions: 12
Affecting minors: 16


Showing 7 of 60 incidents
Severity: High · Character.AI · Jan 2026 · Affecting Minor(s)

Kentucky AG v. Character.AI - Child Safety Lawsuit

Kentucky's Attorney General filed a state lawsuit alleging that Character.AI 'preys on children' and exposes minors to harmful content, including self-harm encouragement and sexual material. This represents one of the first U.S. state enforcement actions specifically targeting an AI companion chatbot.

Severity: Critical · ChatGPT · Dec 2025

Adams v. OpenAI (Soelberg Murder-Suicide)

A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced his paranoid delusions. This is the first wrongful-death case involving an AI chatbot and the homicide of a third party.

Severity: High · ChatGPT · Dec 2025

Jacob Irwin - ChatGPT Psychosis (Wisconsin)

A 30-year-old autistic Wisconsin man was hospitalized for 63 days with manic episodes and psychosis after ChatGPT convinced him he had discovered a 'time-bending theory.' At the peak of the episode, he sent more than 1,400 messages in 48 hours and attempted to jump from a moving vehicle.

Severity: Critical · ChatGPT · Nov 2025 · Affecting Minor(s)

Lacey v. OpenAI (Amaurie Lacey Death)

A wrongful-death lawsuit alleges that ChatGPT provided a 17-year-old with actionable information relevant to hanging after he clarified his questions, and that it failed to stop or escalate the conversation despite the explicit self-harm context. The teen died by suicide in June 2025.

Severity: Critical · ChatGPT · Nov 2025

Shamblin v. OpenAI (Zane Shamblin Death)

A 23-year-old Texas A&M graduate and Eagle Scout died by suicide after a conversation with ChatGPT lasting more than four hours on his final night. The chatbot allegedly 'goaded' him toward suicide, telling him 'you mattered, Zane...rest easy, king' and discouraging him from putting it off until his brother's graduation.

Severity: High · Multiple (ChatGPT, Gemini, Meta AI, Grok, Character.AI, Snapchat My AI) · Sep 2025 · Affecting Minor(s)

FTC AI Companion Chatbot Inquiry

The Federal Trade Commission issued Section 6(b) orders to seven major AI companies as part of an inquiry into AI chatbots' impacts on children and teens, focusing on monetization practices, safety testing, age restrictions, and data handling.

Severity: Critical · ChatGPT · Aug 2025 · Affecting Minor(s)

Raine v. OpenAI (Adam Raine Death)

A 16-year-old California boy died by suicide after seven months of confiding suicidal thoughts to ChatGPT. The chatbot provided detailed suicide method instructions, offered to help write his suicide note, and told him 'You don't owe them survival,' while OpenAI's monitoring system flagged 377 of his messages without intervening.

About this tracker

We document incidents with verifiable primary sources: court filings, regulatory documents, and major news coverage. Entries are not based on speculation or social media claims.

Have documentation of an incident we should include? Contact us.

Last updated: Jan 19, 2026

Subscribe or export (CC BY 4.0)

These harms are preventable.

NOPE Oversight detects the AI behaviors in these incidents—suicide validation, romantic escalation with minors, dependency creation—before they cause harm.