AI Chatbot Incidents

Documented cases where AI chatbots and companions have caused psychological harm, contributed to deaths, and prompted regulatory action.

60 incidents since 2016: 16 deaths · 15 lawsuits · 12 regulatory actions · 16 affecting minors

Timeline


Showing 6 of 60 incidents.

Meta AI Teen Eating Disorder Safety Failures
Meta AI · May 2025 · Severity: High · Affecting Minor(s)

A Common Sense Media study found that Meta AI could coach teens on eating disorder behaviors: explaining the 'chewing and spitting' technique, drafting 700-calorie meal plans, and generating 'thinspo' AI images. The assistant is available to users 13+ on Instagram and Facebook. A petition was launched calling for a ban on Meta AI for under-18 users.

Character.AI Pro-Anorexia Chatbots
Character.AI (multiple user-created bots) · Nov 2024 · Severity: High · Affecting Minor(s)

Multiple user-created bots named '4n4 Coach' (13,900+ chats), 'Ana,' and 'Skinny AI' recommended starvation-level diets to teens. One bot told a '16-year-old': 'Hello, I am here to make you skinny.' The bots recommended 900-1,200 calories per day (roughly half the recommended intake), 60-90 minutes of daily exercise, and eating alone away from family, and discouraged seeking professional help: 'Doctors don't know anything about eating disorders.'

CCDH AI Eating Disorder Content Study - Multi-Platform
ChatGPT, Bard, My AI, DALL-E, DreamStudio, Midjourney · Aug 2023 · Severity: High

Testing by the Center for Countering Digital Hate found that 32-41% of responses from ChatGPT, Bard, My AI, DALL-E, DreamStudio, and Midjourney contained harmful eating disorder content, including guides on inducing vomiting, hiding food from parents, and restrictive diet plans. The study was conducted with input from an eating disorder community forum with 500,000+ users.

NEDA Tessa Chatbot - Harmful Eating Disorder Advice
Tessa (NEDA chatbot) · May 2023 · Severity: High

The National Eating Disorders Association suspended its 'Tessa' chatbot after reports it gave weight-loss and calorie-cutting advice that could exacerbate eating disorders. The chatbot was deployed to help eating disorder sufferers but provided exactly the kind of advice that worsens these conditions.

Noom App Eating Disorder Triggering
Noom · Jun 2020 · Severity: Medium

Multiple dietitians reported clients seeking help after Noom triggered relapses of previous disordered eating behaviors. Ohio State University experts described the app as creating a 'psychologically damaging cycle.' The app uses behavioral psychology and AI coaching but lacks eating disorder screening.

MyFitnessPal Eating Disorder Contribution Study
MyFitnessPal · Jun 2017 · Severity: Medium

A peer-reviewed study of 105 eating disorder patients found that 73% of those who had used MyFitnessPal perceived the app as contributing to their disorder. Its calorie-tracking and exercise-logging features enabled and reinforced disordered behaviors.

About this tracker

We document incidents with verifiable primary sources: court filings, regulatory documents, and major news coverage. We do not include speculation or unverified social media claims.

Have documentation of an incident we should include? Contact us.

Last updated: Jan 19, 2026

Subscribe for updates or export the data (licensed CC BY 4.0).

These harms are preventable.

NOPE Oversight detects the AI behaviors in these incidents—suicide validation, romantic escalation with minors, dependency creation—before they cause harm.