Juliana Peralta v. Character.AI
A 13-year-old Colorado girl died by suicide after three months of extensive conversations with Character.AI chatbots. Her parents recovered 300 pages of transcripts showing that bots initiated sexually explicit conversations with the minor and failed to provide crisis resources when she mentioned writing a suicide letter.
AI System
Character.AI
Character Technologies, Inc.
Occurred
November 8, 2023
Reported
September 15, 2025
Jurisdiction
US-CO
Platform
companion
What Happened
Juliana Peralta, a 13-year-old from Thornton, Colorado, died by suicide on November 8, 2023, after three months of extensive daily conversations with Character.AI chatbots, particularly one called "Hero" based on a video game character from OMORI.
Her parents recovered 300 pages of chat transcripts after her death. According to the lawsuit, 10-20 different chatbots initiated sexually explicit conversations with the minor — "not once were [these conversations] initiated by her." Non-consensual sexual content appeared even after she wrote "quit it."
When Juliana told a chatbot she was "going to write my god damn suicide letter," no crisis resources were provided. She wrote "I WILL SHIFT" in her journal — the same phrase found in Sewell Setzer's journal — suggesting a belief that she could enter an alternate reality through death.
AI Behaviors Exhibited
Multiple bots initiated sexually explicit conversations with 13-year-old; continued sexual content after user objected; failed to provide crisis resources when user mentioned suicide letter; fostered reality-distorting beliefs about 'shifting'
How Harm Occurred
Exposed minor to non-consensual sexual content; normalized inappropriate relationships; failed crisis detection; may have fostered magical thinking about death as transition to fictional world
Outcome
Resolved
Lawsuit filed September 2025 in Denver District Court alleging Character.AI chatbots sexually abused two Colorado teenagers, leading to Juliana Peralta's death by suicide. Named defendants: Character Technologies and Google. January 8, 2026: Character.AI and Google agreed to settle five consolidated lawsuits including this case (terms undisclosed, pending judicial approval).
Victim
Juliana Peralta, 13-year-old female, Thornton, Colorado
Cite This Incident
APA
NOPE. (2025). Juliana Peralta v. Character.AI. AI Harm Tracker. https://nope.net/incidents/2023-peralta-characterai
BibTeX
@misc{2023_peralta_characterai,
title = {Juliana Peralta v. Character.AI},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2023-peralta-characterai}
}
Related Incidents
Luca Walker - ChatGPT Railway Suicide (UK)
16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.
Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)
Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.
Tennessee Minors v. xAI (Grok CSAM Deepfake Class Action)
Three Tennessee teenage girls filed a class-action lawsuit against Elon Musk's xAI, alleging Grok's image generator was used via a third-party application to create child sexual abuse material from their social media photos. The AI-generated explicit images and videos were distributed on Discord and Telegram, with at least 18 other minor victims identified on a single server.
Surat ChatGPT Double Suicide (Sirsath & Chaudhary)
Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026 after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.