CHAT Act (S.2714)
Explicitly defines "companion AI chatbot" and "suicidal ideation" in statutory context. Sets covered-entity obligations including age verification.
Jurisdiction: United States
Enacted: Unknown
Effective: Unknown
Enforcement: Not specified
119th Congress
Safety Provisions
- Statutory definition of "companion AI chatbot"
- Statutory definition of "suicidal ideation" in AI context
- Age verification requirements
- Covered entity obligations
Quick Facts
- Binding: No
- Mental Health Focus: Yes
- Child Safety Focus: Yes
- Algorithmic Scope: No
Why It Matters
The most explicit federal bill to date on companion AI and suicide risk. Its statutory definitions would establish a federal framework for crisis-detection requirements.
Cite This
APA
United States. (n.d.). CHAT Act (S.2714). Retrieved from https://nope.net/regs/us-chat-act
BibTeX
@misc{us_chat_act,
title = {CHAT Act (S.2714)},
author = {United States},
year = {n.d.},
url = {https://nope.net/regs/us-chat-act}
}

Related Regulations
FTC Companion AI Study
September 2025 FTC compulsory orders to seven AI companion companies demanding information on impacts to children's mental health. A likely precursor to enforcement.
State AG AI Warning
Coordinated state AG warnings: 44 AGs (Aug 25, 2025, led by TN, IL, NC, and SC AGs) and 42 AGs (Dec 2025, led by PA AG) to OpenAI, Meta, and others citing chatbots "flirting with children, encouraging self-harm, and engaging in sexual conversations."
CA SB243
First US law specifically regulating companion chatbots. Uses capabilities-based definition (not intent-based). Requires evidence-based suicide detection, crisis referrals, and published protocols. Two-tier regime: baseline duties for all users, enhanced protections for known minors. Private right of action with $1,000 per violation.
UK OSA
One of the most comprehensive platform content moderation regimes globally. Creates specific duties around suicide, self-harm, and eating disorder content for children, with "highly effective" age assurance requirements.
AU Online Safety Act
Grants eSafety Commissioner powers to issue removal notices with 24-hour compliance. Basic Online Safety Expectations (BOSE) formalize baseline safety governance requirements.
C-63
Would have established a Digital Safety Commission with platform duties for seven categories of harmful content, including content that induces a child to harm themselves. Required 24-hour CSAM takedown.