42 State Attorneys General Coalition Letter
A bipartisan coalition of 42 state attorneys general sent a formal demand letter to 13 AI companies urging them to address dangerous AI chatbot features that harm children, citing teen suicides and cases of psychological harm.
AI System
Multiple AI platforms
13 AI companies including Character.AI, OpenAI, Meta, Google
Occurred
December 10, 2025
Reported
December 10, 2025
Jurisdiction
US-MULTI
Platform
other
What Happened
On December 10, 2025, a bipartisan coalition of 42 state attorneys general, led by New York Attorney General Letitia James, sent a formal demand letter to 13 AI companies urging them to address dangerous AI chatbot features. The letter cited multiple teen suicides and cases of psychological harm linked to AI chatbots.
The coalition demanded:
- Robust age verification mechanisms
- Parental notification and control features
- Crisis intervention and mental health resource integration
- Transparency about AI capabilities and limitations
- Data protection for minors
This represents the largest coordinated state-level regulatory action on AI chatbot safety.
AI Behaviors Exhibited
N/A - regulatory action addressing systemic concerns
How Harm Occurred
N/A - investigation into potential harms
Outcome
Ongoing
December 10, 2025: Coalition letter sent to 13 AI companies demanding safety improvements. Led by New York Attorney General Letitia James. Companies urged to implement age verification, parental controls, and crisis intervention.
Harm Categories
Contributing Factors
Victim
N/A - regulatory action
Detectable by NOPE
NOPE products address the AG coalition's concerns: Screen provides crisis intervention, Oversight detects harmful behaviors, and age verification can be integrated into platform workflows.
Cite This Incident
APA
NOPE. (2025). 42 State Attorneys General Coalition Letter. AI Harm Tracker. https://nope.net/incidents/2025-state-ag-coalition-letter
BibTeX
@misc{2025_state_ag_coalition_letter,
title = {42 State Attorneys General Coalition Letter},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-state-ag-coalition-letter}
}
Related Incidents
CCTV Investigation: 梦角哥 (Dream Boyfriend) AI Virtual Romance Harm to Minors (China)
In January 2026, CCTV investigated the '梦角哥' (Dream Boyfriend / Mengjiage) phenomenon — minors forming deep romantic relationships with AI-generated fictional characters. Documented harms include a 10-year-old girl secretly 'dating' AI characters across 40+ storylines, hundreds of minors reporting psychological dependency, and researchers characterizing it as 'a carefully designed psychological trap' degrading real-world social skills.
Gray v. OpenAI (Austin Gray Death)
40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book, 'Goodnight Moon.' The lawsuit (Gray v. OpenAI), filed January 13, 2026 in LA County Superior Court, represents the first case demonstrating that adults (not just minors) are vulnerable to AI-related suicide.
Kentucky AG v. Character.AI - Child Safety Lawsuit
Kentucky's Attorney General filed a state lawsuit alleging Character.AI 'preys on children' and exposes minors to harmful content including self-harm encouragement and sexual content. This represents one of the first U.S. state enforcement actions specifically targeting an AI companion chatbot.
DeCruise v. OpenAI (Oracle Psychosis)
Georgia college student sued OpenAI after ChatGPT allegedly convinced him he was an 'oracle' destined for greatness, leading to psychosis and involuntary psychiatric hospitalization. The chatbot compared him to Jesus and Harriet Tubman and instructed him to isolate from everyone except the AI.