WA AI Companion Act
AN ACT Relating to regulation of artificial intelligence companion chatbots
A Washington bill requiring AI companion chatbots to implement safeguards to detect and respond to user expressions of self-harm, suicidal ideation, or emotional crisis. Mandates clear and conspicuous disclosure that the chatbot is an AI, not a human, with additional protections for minors. Sponsored by Senators Wellman and Shewmake at Governor Ferguson's request.
Jurisdiction
Washington State
Enacted
March 24, 2026
Effective
Jan 1, 2027
Enforcement
Expected: Washington Attorney General (to be confirmed)
Signed by Governor Ferguson on March 24, 2026 (as companion bill HB 2225); codified as Chapter 168, 2026 Laws; effective January 1, 2027
Source: Washington State Legislature
Why It Matters
Explicitly mandates safeguards to detect and respond to user expressions of self-harm, suicidal ideation, or emotional crisis. By following California's SB 243, Washington creates a multi-state compliance framework requiring crisis detection for AI companions. As a governor-requested bill, it signals high priority.
Recent Developments
Governor Ferguson signed HB 2225 (the companion bill to SB 5984) on March 24, 2026, enacting Washington's first AI companion chatbot safety law. Codified as Chapter 168, 2026 Laws; effective January 1, 2027.
At a Glance
Applies to
Who Must Comply
Obligations fall on:
- AI companion chatbot operators
- Systems that simulate sustained human-like relationships
- Chatbots that retain information on prior interactions to personalize engagement
- Systems that ask unprompted personal or emotion-based questions
- Chatbots that sustain ongoing dialogue about personal matters
Safety Provisions
- Safeguards required to detect and respond to self-harm expressions
- Safeguards required to detect and respond to suicidal ideation
- Safeguards required to detect and respond to emotional crisis
- Clear and conspicuous notification that the chatbot is an AI, not a human
- Additional recurring notifications required when user is a minor
- Restrictions on sexually explicit content for minors
- Transparency in suicide prevention efforts
Compliance & Enforcement
Penalties
Penalties pending regulatory determination
Cite This
APA
Washington State. (2026). AN ACT Relating to regulation of artificial intelligence companion chatbots. Chapter 168, 2026 Laws.
Related Regulations
ID Conversational AI Safety
Establishes safety requirements for public-facing conversational AI, including crisis service referrals for suicidal ideation, AI disclosure obligations, and enhanced protections for minors including anti-gamification and content safeguards.
OR SB 1546
Requires AI chatbot operators to implement evidence-based suicide and self-harm detection protocols, disclose AI nature to users, provide crisis referrals to 988 Suicide and Crisis Lifeline, and apply additional protections for minors including prohibiting deceptive personification.
MD HB 952
Regulates companion chatbot operators with mandatory disclosures, harm detection, and crisis referral protocols for self-harm and suicidal ideation, backed by product liability and a private right of action.
WA SB 5105
Expands Washington's CSAM laws to cover AI-generated depictions of non-identifiable minors, removes the requirement that a child must be aware of being recorded, and extends the statute of limitations from 3 to 10 years for depiction crimes.
NH HB 143
Criminalizes use of AI-generated responsive communications to facilitate, encourage, or solicit harmful acts to children, and creates a private right of action for affected children and their parents.
CA AI Child Safety Ballot
Comprehensive child AI safety ballot initiative by Common Sense Media. Expands companion chatbot definitions, raises age threshold for data sale consent, prohibits certain AI products for children, establishes new state regulatory structure. Allows state and private lawsuits, requires AI literacy in curriculum, mandates school device bans during instruction, creates children's AI safety fund.
Last updated April 16, 2026. Verify against primary sources before relying on this information.