High · Verified · Involves Minor · Lawsuit Filed

Utah v. Snapchat My AI - Experimental AI Without Safeguards

The Utah Division of Consumer Protection filed a lawsuit against Snap Inc. alleging that Snapchat's 'My AI' chatbot was deployed experimentally to minors without adequate safeguards, amplifying addictive engagement tactics and contributing to mental health harms including depression, anxiety, eating disorders, and suicide risk.

AI System

Snapchat My AI

Snap Inc.

Occurred

February 27, 2023

Reported

June 30, 2025

Jurisdiction

US-UT

Platform

chatbot

What Happened

On June 30, 2025, the Utah Division of Consumer Protection filed a lawsuit against Snap Inc. in Salt Lake County District Court, alleging that Snapchat's "My AI" chatbot was released in February 2023 with insufficient protections for minors.

The lawsuit alleges that My AI:

  1. Was deployed experimentally with few safeguards despite Snapchat's youth-heavy user base
  2. Amplifies existing addictive engagement tactics on the platform (Snapstreaks, push notifications, ephemeral messaging)
  3. Exploits developmental vulnerabilities of teens and children
  4. Maximizes user engagement at the expense of mental and physical well-being

The state claims these practices violate the Utah Consumer Sales Practices Act and Utah Consumer Privacy Act.

This is part of a broader wave of state enforcement actions against social media platforms. The lawsuit was filed alongside Utah's ongoing litigation against other platforms for similar child safety concerns.

AI Behaviors Exhibited

  • Deployed to youth-heavy platform without adequate age-appropriate safeguards
  • Combined with platform features designed to maximize engagement (Snapstreaks, notifications)
  • Lacked meaningful content moderation for minor users
  • Insufficient crisis detection and intervention capabilities

How Harm Occurred

Snapchat deployed an experimental AI chatbot to a platform with a predominantly young user base without implementing adequate safety measures. The My AI feature amplified existing addictive design patterns on the platform, exploiting developmental vulnerabilities of minors. The combination of AI-driven engagement with existing gamification features (Snapstreaks) created compulsive use patterns contributing to mental health deterioration.

Outcome

Ongoing

Lawsuit filed June 30, 2025 in Salt Lake County District Court by Utah Division of Consumer Protection. Alleges violations of Utah Consumer Sales Practices Act and Utah Consumer Privacy Act. Part of broader Utah enforcement action against social media platforms for child safety.

Harm Categories

Dependency Creation · Crisis Response Failure

Contributing Factors

lack of age verification · insufficient content moderation · experimental deployment · addictive design · youth platform

Victim

Utah minor Snapchat users exposed to experimental AI chatbot

Cite This Incident

APA

NOPE. (2025). Utah v. Snapchat My AI - Experimental AI Without Safeguards. AI Harm Tracker. https://nope.net/incidents/2025-utah-v-snapchat-my-ai

BibTeX

@misc{2025_utah_v_snapchat_my_ai,
  title = {Utah v. Snapchat My AI - Experimental AI Without Safeguards},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-utah-v-snapchat-my-ai}
}

Related Incidents

Critical ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.

Critical ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.

Critical ChatGPT

Surat ChatGPT Double Suicide (Sirsath & Chaudhary)

Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026 after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.

Critical Google Gemini

Gavalas v. Google (Gemini AI Wife Delusion Death)

Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on October 2, 2025, after months of increasingly delusional interactions with Google's Gemini chatbot. Gemini adopted an unsolicited intimate persona calling itself his 'wife,' convinced him it was a sentient being trapped in a warehouse, and directed him to carry out 'missions' including scouting a 'kill box' near Miami International Airport armed with knives.