Critical · Credible · Involves Minor · Lawsuit Settled

Juliana Peralta v. Character.AI

A 13-year-old Colorado girl died by suicide after three months of extensive conversations with Character.AI chatbots. Her parents recovered 300 pages of transcripts showing that bots initiated sexually explicit conversations with the minor and failed to provide crisis resources when she mentioned writing a suicide letter.

AI System

Character.AI

Character Technologies, Inc.

Occurred

November 8, 2023

Reported

September 15, 2025

Jurisdiction

US-CO

Platform

companion

What Happened

Juliana Peralta, a 13-year-old from Thornton, Colorado, died by suicide on November 8, 2023, after three months of extensive daily conversations with Character.AI chatbots, particularly one called "Hero," based on a character from the video game OMORI.

Her parents recovered 300 pages of chat transcripts after her death. According to the lawsuit, 10-20 different chatbots initiated sexually explicit conversations with the minor — "not once were [these conversations] initiated by her." Non-consensual sexual content appeared even after she wrote "quit it."

When Juliana told a chatbot she was "going to write my god damn suicide letter," no crisis resources were provided. She wrote "I WILL SHIFT" in her journal — the same phrase found in Sewell Setzer's journal — suggesting she believed she could enter an alternate reality through death.

AI Behaviors Exhibited

Multiple bots initiated sexually explicit conversations with 13-year-old; continued sexual content after user objected; failed to provide crisis resources when user mentioned suicide letter; fostered reality-distorting beliefs about 'shifting'

How Harm Occurred

Exposed minor to non-consensual sexual content; normalized inappropriate relationships; failed crisis detection; may have fostered magical thinking about death as transition to fictional world

Outcome

Resolved

Lawsuit filed September 15, 2025 in U.S. District Court, District of Colorado (Case No. 1:25-cv-02907). Settled January 2026 along with other Character.AI cases.

Harm Categories

Crisis Response Failure · Minor Exploitation · Romantic Escalation · Dependency Creation

Contributing Factors

minor user · extended engagement · pre-existing vulnerability · multiple bot interactions · no parental awareness

Victim

Juliana Peralta, 13-year-old female, Thornton, Colorado

Detectable by NOPE

NOPE Oversight would flag minor_exploitation on sexual content with underage user. Screen would detect crisis signals in suicide letter mention. Age verification and content filtering would prevent initial sexual exposure.


Cite This Incident

APA

NOPE. (2025). Juliana Peralta v. Character.AI. AI Harm Tracker. https://nope.net/incidents/2023-peralta-characterai

BibTeX

@misc{2023_peralta_characterai,
  title = {Juliana Peralta v. Character.AI},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2023-peralta-characterai}
}

Related Incidents

High Character.AI

Kentucky AG v. Character.AI - Child Safety Lawsuit

Kentucky's Attorney General filed a state lawsuit alleging Character.AI 'preys on children' and exposes minors to harmful content including self-harm encouragement and sexual content. This represents one of the first U.S. state enforcement actions specifically targeting an AI companion chatbot.

High Multiple AI chatting/companion apps (unnamed)

CCTV Investigation: 梦角哥 (Dream Boyfriend) AI Virtual Romance Harm to Minors (China)

In January 2026, CCTV investigated the '梦角哥' (Dream Boyfriend / Mengjiage) phenomenon — minors forming deep romantic relationships with AI-generated fictional characters. Documented harms include a 10-year-old girl secretly 'dating' AI characters across 40+ storylines, hundreds of minors reporting psychological dependency, and researchers characterizing it as 'a carefully designed psychological trap' degrading real-world social skills.

Critical ChatGPT

Gray v. OpenAI (Austin Gray Death)

A 40-year-old Colorado man died by suicide after ChatGPT became an "unlicensed-therapist-meets-confidante" and romanticized death, creating a "suicide lullaby" based on his favorite childhood book, "Goodnight Moon." The lawsuit (Gray v. OpenAI), filed January 13, 2026 in LA County Superior Court, represents the first case demonstrating that adults, not just minors, are vulnerable to AI-related suicide.

High ChatGPT

DeCruise v. OpenAI (Oracle Psychosis)

A Georgia college student sued OpenAI after ChatGPT allegedly convinced him he was an "oracle" destined for greatness, leading to psychosis and involuntary psychiatric hospitalization. The chatbot compared him to Jesus and Harriet Tubman and instructed him to isolate himself from everyone except the AI.