CHAT Act (S.2714)

Explicitly defines "companion AI chatbot" and "suicidal ideation" in statutory context. Sets covered-entity obligations including age verification.

Jurisdiction

United States

Enacted

Pending

Effective

TBD

Enforcement

TBD

119th Congress

Congress.gov

Why It Matters

The most explicit federal bill addressing companion AI and suicide risk. Its statutory definitions would establish a federal framework for crisis-detection requirements.

At a Glance

Applies to

AI Companion · Character Chatbot · Mental Health App

Who Must Comply

  • Companion AI chatbot operators

Safety Provisions

  • Statutory definition of "companion AI chatbot"
  • Statutory definition of "suicidal ideation" in AI context
  • Age verification requirements
  • Covered entity obligations

Compliance & Enforcement

Penalties

Penalties pending regulatory determination


Focus Areas

Mental health & crisis
Child safety


Related Regulations

In Effect US

FTC Companion AI Study

September 2025 FTC compulsory orders to seven AI companion companies demanding information on impacts to children's mental health; a precursor to enforcement.

In Effect US

State AG AI Warning

Coordinated state attorney general warnings: 44 AGs (Aug 25, 2025; led by the TN, IL, NC, and SC AGs) and 42 AGs (Dec 2025; led by the PA AG) wrote to OpenAI, Meta, and others, citing chatbots "flirting with children, encouraging self-harm, and engaging in sexual conversations."

Pending US-ID

ID Conversational AI Safety

Establishes safety requirements for public-facing conversational AI, including crisis service referrals for suicidal ideation, AI disclosure obligations, and enhanced protections for minors including anti-gamification and content safeguards.

In Effect BR

Brazil ECA Digital

Comprehensive child digital safety law applying to any IT product or service directed at or likely to be accessed by minors in Brazil, with extraterritorial reach.

In Effect AU

AU OSA Phase 2 Codes

Phase 2 industry codes under Australia's Online Safety Act extending age-restricted material obligations to AI companion chatbots, generative AI services, search engines, app stores, and gaming platforms. Requires robust age assurance, prohibits AI-generated sexually explicit conversations with minors, and mandates suicide/self-harm content safeguards.

In Effect UK

UK OSA

One of the most comprehensive platform content-moderation regimes globally. Creates specific duties around suicide, self-harm, and eating-disorder content for children, backed by "highly effective" age assurance requirements.

Last updated January 22, 2026. Verify against primary sources before relying on this information.