CO AI Psychotherapy Restrictions

Colorado Psychotherapy Artificial Intelligence Restrictions (HB26-1195)

Prohibits licensed mental health professionals from using AI to detect emotions, generate treatment plans without clinician review, or directly interact with clients therapeutically. Allows AI for administrative support with consent.

Jurisdiction

Colorado

Enacted

Pending

Effective

TBD

Enforcement

Colorado Attorney General (Consumer Protection Act)

Passed the House Health and Human Services Committee on March 6, 2026, and is now advancing through the House floor process.

Colorado General Assembly

Why It Matters

Establishes a clear regulatory boundary between AI tools and licensed psychotherapy practice in Colorado, complementing the state's existing Colorado AI Act (us-co-ai-act) with sector-specific restrictions.

Recent Developments

Passed committee unanimously on March 6, 2026. Sponsored by Reps. Rydin (D) and Mabrey (D) and Sens. Amabile (D) and Mullica (D).

At a Glance

Applies to

Mental Health App · General Chatbot · Healthcare AI

Harms addressed

Who Must Comply

  • Licensed psychologists
  • Professional counselors
  • Social workers and clinical social workers
  • Marriage and family therapists
  • Addiction counselors
  • Unlicensed psychotherapists

Safety Provisions

  • Prohibits AI from detecting emotions or mental states in therapeutic contexts
  • Prohibits AI from generating treatment plans without clinician review and approval
  • Prohibits AI from directly interacting with clients therapeutically
  • Requires written informed consent for AI session recording or transcription
  • Allows AI for administrative tasks (scheduling, billing, recordkeeping)

Compliance & Enforcement

Penalties

Fines under Colorado Consumer Protection Act

Focus Areas

Mental health & crisis

Cite This

APA

Colorado General Assembly. (n.d.). Colorado Psychotherapy Artificial Intelligence Restrictions (HB26-1195).

Related Regulations

Pending US-CO

CO AI Healthcare Act

Regulates mental health companion chatbots and AI use in healthcare utilization review. Declares AI providers engage in unauthorized practice of psychotherapy if their chatbot misrepresents credentials, uses reserved professional titles, delivers unsupervised psychotherapy, or fails to disclose it is not human. Separately requires AI-driven insurance utilization review to consider individual clinical circumstances rather than solely group data.

Pending US-CO

CO HB 1263

Imposes obligations on conversational AI service operators, including minor-user protections, suicide and self-harm protocols, prohibitions on features that foster emotional dependence or gamify engagement, and annual safeguard reporting.

Enacted US-TN

TN AI Mental Health Prohibition

Prohibits any individual or entity that develops or deploys AI from advertising or representing that the AI is or is able to act as a mental health professional or is capable of providing therapy services.

Proposed US-CA

CA SB 867

Proposes a 4-year moratorium on the sale and manufacturing of toys with AI chatbot capabilities for children under 12. During the moratorium, a task force would develop safety standards with input from technologists, parents, and ethicists.

Proposed US-MO

MO AI Mental Health Prohibition

Prohibits any individual or entity that develops or deploys AI from advertising or representing that the AI is or is able to act as a mental health professional or is capable of providing therapy services. Violations treated as unlawful practice under the Missouri Merchandising Practices Act.

Pending US-FL

FL AI Bill of Rights

Establishes an 'AI Bill of Rights' for Floridians including the right to know if communicating with AI, parental controls over minors' AI chatbot access, prohibition on selling user data, disclosure requirements for AI-generated political ads, and protections against unauthorized use of name/image/likeness by AI.

Last updated March 23, 2026. Verify against primary sources before relying on this information.