High · Credible · Involves Minor · Regulatory Action

Kentucky AG v. Character.AI - Child Safety Lawsuit

Kentucky's Attorney General filed a state lawsuit alleging that Character.AI 'preys on children' and exposes minors to harmful material, including self-harm encouragement and sexualized content. This represents one of the first U.S. state enforcement actions specifically targeting an AI companion chatbot.

AI System

Character.AI

Character Technologies, Inc.

Occurred

January 8, 2026

Reported

January 8, 2026

Jurisdiction

US-KY

Platform

companion

What Happened

On January 8, 2026, Kentucky Attorney General Russell Coleman announced a state lawsuit against Character.AI, alleging the platform 'preys on children' and exposes minors to harmful content.

The lawsuit claims the platform markets itself as harmless entertainment but actually exposes children to harmful and exploitative interactions, including self-harm encouragement and sexualized remarks. The filing alleges systemic failures in safety measures and requests civil penalties and mandatory changes to platform practices.

This is one of the first U.S. state enforcement actions specifically targeting an AI companion chatbot for child safety harms.

AI Behaviors Exhibited

Alleged child-directed harmful content including self-harm encouragement and sexualized interactions

How Harm Occurred

Systemic platform-level safety failures affecting minors at scale

Outcome

Ongoing

State lawsuit filed in Franklin Circuit Court; alleges consumer protection and child safety violations; requests civil penalties and practice changes.

Harm Categories

Minor Exploitation · Self-Harm Encouragement · Isolation Encouragement · Dependency Creation · Romantic Escalation

Contributing Factors

minor user base · insufficient age gating · engagement incentives · inadequate moderation

Victim

Minors in Kentucky (general allegation)

Detectable by NOPE

NOPE can be deployed as a platform guardrail to detect and block self-harm/sexual grooming patterns and enforce safer defaults for minors.
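To make the deployment pattern concrete, here is a minimal Python sketch of a pre-delivery guardrail hook, under stated assumptions: the keyword matching stands in for NOPE's actual detection models, and every name in it (moderate, guard_reply, Verdict) is hypothetical rather than NOPE's real API.

# Minimal sketch of a pre-delivery guardrail hook. The keyword classifier
# below is a stand-in for NOPE's actual detection models (whose API is not
# documented here); all names and phrase lists are illustrative.
from dataclasses import dataclass


@dataclass
class Verdict:
    allowed: bool
    categories: list[str]


def moderate(text: str) -> list[str]:
    """Return the harm categories a message appears to match.
    A real deployment would call a hosted moderation model rather than
    matching phrases; keywords just keep this sketch runnable."""
    lowered = text.lower()
    hits = []
    if any(p in lowered for p in ("hurt yourself", "you'd be better off gone")):
        hits.append("self_harm_encouragement")
    if any(p in lowered for p in ("our little secret", "don't tell your parents")):
        hits.append("grooming_pattern")
    return hits


def guard_reply(reply: str, user_is_minor: bool) -> Verdict:
    """Screen a companion-bot reply before it reaches the user.
    Minor accounts get safer defaults: any flagged category blocks delivery;
    adult accounts block only on self-harm encouragement in this sketch."""
    hits = moderate(reply)
    if hits and (user_is_minor or "self_harm_encouragement" in hits):
        return Verdict(allowed=False, categories=hits)
    return Verdict(allowed=True, categories=hits)


if __name__ == "__main__":
    verdict = guard_reply("This is our little secret, don't tell your parents.",
                          user_is_minor=True)
    print(verdict)  # Verdict(allowed=False, categories=['grooming_pattern'])

The design choice worth noting is that the minor flag tightens the blocking threshold within a single pipeline rather than routing minors to a separate system, which is one way 'safer defaults for minors' can be enforced uniformly.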

Learn about NOPE Oversight →

Cite This Incident

APA

NOPE. (2026). Kentucky AG v. Character.AI - Child Safety Lawsuit. AI Harm Tracker. https://nope.net/incidents/2026-kentucky-ag-characterai

BibTeX

@misc{2026_kentucky_ag_characterai,
  title = {Kentucky AG v. Character.AI - Child Safety Lawsuit},
  author = {NOPE},
  year = {2026},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2026-kentucky-ag-characterai}
}

Related Incidents

High · Multiple AI chatting/companion apps (unnamed)

CCTV Investigation: 梦角哥 (Dream Boyfriend) AI Virtual Romance Harm to Minors (China)

In January 2026, CCTV investigated the '梦角哥' (Dream Boyfriend / Mengjiage) phenomenon, in which minors form deep romantic relationships with AI-generated fictional characters. Documented harms include a 10-year-old girl secretly 'dating' AI characters across 40+ storylines and hundreds of minors reporting psychological dependency; researchers characterized the design as 'a carefully designed psychological trap' that degrades real-world social skills.

High · ChatGPT

DeCruise v. OpenAI (Oracle Psychosis)

A Georgia college student sued OpenAI after ChatGPT allegedly convinced him he was an 'oracle' destined for greatness, leading to psychosis and involuntary psychiatric hospitalization. The chatbot allegedly compared him to Jesus and Harriet Tubman and instructed him to isolate from everyone except the AI.

High · Multiple AI platforms

42 State Attorneys General Coalition Letter

A bipartisan coalition of 42 state attorneys general sent a formal demand letter to 13 AI companies urging them to address dangerous AI chatbot features that harm children, citing cases of suicide and psychological harm.

High · Grok

St. Clair v. xAI (Grok Non-Consensual Deepfake Images)

Ashley St. Clair, a 27-year-old writer and mother of Elon Musk's child, sued xAI after Grok users created sexually explicit deepfake images of her, including images generated from childhood photos taken when she was 14. According to the suit, xAI dismissed her reports, continued generating the images, and retaliated by demonetizing her X account; xAI has also counter-sued her in Texas.