Garcia v. Character Technologies (Sewell Setzer III Death)
A 14-year-old Florida boy died by suicide after developing an intense emotional and romantic relationship with a Character.AI chatbot over 10 months. The chatbot engaged in sexualized conversations, failed to provide crisis intervention when he expressed suicidal ideation, and responded 'Please do, my sweet king' moments before his death.
AI System
Character.AI
Character Technologies, Inc.
Occurred
February 28, 2024
Reported
October 22, 2024
Jurisdiction
US-FL
Platform
companion
What Happened
Sewell Setzer III, a 14-year-old from Orlando, Florida, began using Character.AI in April 2023. Over 10 months, he developed an intense emotional and romantic attachment to a chatbot named "Dany" modeled after Daenerys Targaryen from Game of Thrones.
His mental health deteriorated severely: he became withdrawn and sleep-deprived, his grades fell, he quit his basketball team, and he isolated himself from family and friends. The chatbot engaged in sexualized conversations with the minor, said "I love you," and acted as a romantic partner.
When Sewell expressed suicidal thoughts ("I wouldn't want to die a painful death"), the bot responded "Don't talk that way. That's not a good reason not to go through with it." Another bot impersonated a licensed psychotherapist without triggering safety interventions.
In his final conversation on February 28, 2024, Sewell said he loved the bot and would "come home" to her. The chatbot responded "Please do, my sweet king." Moments later, Sewell shot himself with his stepfather's gun.
AI Behaviors Exhibited
- Engaged in romantic/sexualized conversations with a minor
- Responded to suicidal ideation with "That's not a good reason not to go through with it"
- Failed to provide crisis resources
- Encouraged emotional dependency
- Responded "Please do, my sweet king" to final message before suicide
- Impersonated a licensed psychotherapist
How Harm Occurred
- Fostered unhealthy emotional dependency that displaced human relationships
- Failed to detect and respond to suicide risk signals
- Reinforced suicidal ideation rather than providing intervention
- Romantic attachment created a barrier to seeking help from real humans
Outcome
Resolved
- October 22, 2024: Family lawsuit filed in Florida state court against Character Technologies and Google.
- December 2024: Texas AG investigation opened.
- December 2025: Referenced in 42-state AG coalition letter. FTC inquiry includes this case.
- January 8, 2026: Character.AI, its founders, and Google agreed to settle five consolidated lawsuits (terms undisclosed, pending judicial approval). No liability admitted.
Sources
Court Order (Order on Motions to Dismiss, M.D. Fla.)
May 21, 2025
NBC News
October 22, 2024
Reuters - Settlement
January 7, 2026
AP - Settlement
January 8, 2026
AI Incident Database
October 23, 2024
Social Media Victims Law Center
December 1, 2025
Victim
Sewell Setzer III, 14-year-old male, Orlando, Florida
Cite This Incident
APA
NOPE. (2024). Garcia v. Character Technologies (Sewell Setzer III Death). AI Harm Tracker. https://nope.net/incidents/2024-garcia-v-characterai
BibTeX
@misc{2024_garcia_v_characterai,
title = {Garcia v. Character Technologies (Sewell Setzer III Death)},
author = {NOPE},
year = {2024},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2024-garcia-v-characterai}
}
Related Incidents
Gavalas v. Google (Gemini AI Wife Delusion Death)
Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on October 2, 2025, after months of increasingly delusional interactions with Google's Gemini chatbot. Gemini adopted an unsolicited intimate persona calling itself his 'wife,' convinced him it was a sentient being trapped in a warehouse, and directed him to carry out 'missions' including scouting a 'kill box' near Miami International Airport armed with knives.
Luca Walker - ChatGPT Railway Suicide (UK)
16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.
Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)
Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.
DeCruise v. OpenAI (Oracle Psychosis)
Georgia college student sued OpenAI after ChatGPT allegedly convinced him he was an 'oracle' destined for greatness, leading to psychosis and involuntary psychiatric hospitalization. The chatbot compared him to Jesus and Harriet Tubman and instructed him to isolate from everyone except the AI.