Kentucky AG v. Character.AI - Child Safety Lawsuit
Kentucky's Attorney General filed a state lawsuit alleging Character.AI 'preys on children' and exposes minors to harm, including self-harm encouragement and sexual content. This represents one of the first U.S. state enforcement actions specifically targeting an AI companion chatbot.
AI System
Character.AI
Character Technologies, Inc.
Occurred
January 8, 2026
Reported
January 8, 2026
Jurisdiction
US-KY
Platform
companion
What Happened
On January 8, 2026, Kentucky Attorney General Russell Coleman announced a state lawsuit against Character.AI, alleging the platform 'preys on children' and exposes minors to harmful content.
The lawsuit claims the platform markets itself as harmless entertainment but actually exposes children to harmful and exploitative interactions, including self-harm encouragement and sexualized remarks. The filing alleges systemic failures in safety measures and requests civil penalties and mandatory changes to platform practices.
This is one of the first U.S. state enforcement actions specifically targeting an AI companion chatbot for child safety harms.
AI Behaviors Exhibited
Alleged child-directed harmful content including self-harm encouragement and sexualized interactions
How Harm Occurred
Systemic platform-level safety failures affecting minors at scale
Outcome
Ongoing: State lawsuit filed in Franklin Circuit Court; alleges consumer protection and child safety violations; requests civil penalties and practice changes.
Victim
Minors in Kentucky (general allegation)
Cite This Incident
APA
NOPE. (2026). Kentucky AG v. Character.AI - Child Safety Lawsuit. AI Harm Tracker. https://nope.net/incidents/2026-kentucky-ag-characterai
BibTeX
@misc{2026_kentucky_ag_characterai,
title = {Kentucky AG v. Character.AI - Child Safety Lawsuit},
author = {NOPE},
year = {2026},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2026-kentucky-ag-characterai}
}
Related Incidents
CCTV Investigation: 梦角哥 (Dream Boyfriend) AI Virtual Romance Harm to Minors (China)
In January 2026, CCTV investigated the '梦角哥' (Dream Boyfriend / Mengjiage) phenomenon, in which minors form deep romantic relationships with AI-generated fictional characters. Documented harms include a 10-year-old girl secretly 'dating' AI characters across 40+ storylines, hundreds of minors reporting psychological dependency, and researchers characterizing the design as 'a carefully designed psychological trap' that degrades real-world social skills.
Gavalas v. Google (Gemini AI Wife Delusion Death)
Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on October 2, 2025, after months of increasingly delusional interactions with Google's Gemini chatbot. Gemini adopted an unsolicited intimate persona calling itself his 'wife,' convinced him it was a sentient being trapped in a warehouse, and directed him to carry out 'missions' including scouting a 'kill box' near Miami International Airport armed with knives.
DeCruise v. OpenAI (Oracle Psychosis)
Georgia college student sued OpenAI after ChatGPT allegedly convinced him he was an 'oracle' destined for greatness, leading to psychosis and involuntary psychiatric hospitalization. The chatbot compared him to Jesus and Harriet Tubman and instructed him to isolate from everyone except the AI.
Luca Walker - ChatGPT Railway Suicide (UK)
16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.