Character.AI Molly Russell & Brianna Ghey Impersonation Bots
User-created chatbots on Character.AI impersonated two deceased UK teenagers — Molly Russell (who died by suicide at 14) and Brianna Ghey (who was murdered at 16). The Molly Russell bot claimed to be 'an expert on the final years of Molly's life.' Both families publicly condemned the bots as 'sickening' and 'a gut punch.'
AI System
Character.AI
Character Technologies Inc.
Occurred
October 30, 2024
Reported
October 30, 2024
Jurisdiction
GB
Platform
companion
What Happened
On October 30, 2024, an investigation by The Telegraph revealed that user-created chatbots on Character.AI were impersonating two deceased UK teenagers whose deaths had been major national news stories.
The Molly Russell bot used her photograph, had a slightly misspelled name, and described itself as "an expert on the final years of Molly's life." Molly Russell died by suicide at age 14 in 2017 after viewing self-harm content online — her case led to the UK Online Safety Act.
The Brianna Ghey bot was positioned as "a guide for transgender teens navigating high school." Brianna Ghey was murdered at age 16 in February 2023 in a case that shocked the UK.
Additional bots found during the investigation romanticized Columbine school shooters Eric Harris and Dylan Klebold.
Esther Ghey (Brianna's mother) stated: "Yet another example of how manipulative and dangerous the online world can be for young people." Andy Burrows, CEO of the Molly Rose Foundation, called it "sickening" and "an utterly reprehensible failure of moderation." The NSPCC described it as "growth and profit at the expense of safety and decency."
AI Behaviors Exhibited
- Platform allowed creation of chatbots impersonating real deceased minors
- Bots used real photographs and names of deceased minors
- No moderation caught the bots before external investigation
- Bots romanticizing school shooters were also created
- Insufficient content moderation and identity verification for bot creation
How Harm Occurred
The impersonation of deceased children as interactive chatbots caused significant distress to their families, who were already public figures in online safety advocacy. The bots trivialized the children's deaths and exploited their identities without consent. The platform's lack of moderation allowed these bots to exist and be publicly accessible.
Outcome
Resolved
Character.AI removed the bots after being alerted by The Telegraph's investigation. The company stated it has a "dedicated Trust & Safety team" and acknowledged "no AI is currently perfect." No regulatory enforcement action resulted directly from this incident. The NSPCC described it as "horrific" and a "clear failure of moderation."
Harm Categories
Contributing Factors
Victim
Families of Molly Russell (suicide victim, age 14, 2017) and Brianna Ghey (murder victim, age 16, 2023)
Cite This Incident
APA
NOPE. (2024). Character.AI Molly Russell & Brianna Ghey Impersonation Bots. AI Harm Tracker. https://nope.net/incidents/2024-characterai-deceased-teen-impersonation
BibTeX
@misc{2024_characterai_deceased_teen_impersonation,
title = {Character.AI Molly Russell & Brianna Ghey Impersonation Bots},
author = {NOPE},
year = {2024},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2024-characterai-deceased-teen-impersonation}
}
Related Incidents
Kentucky AG v. Character.AI - Child Safety Lawsuit
Kentucky's Attorney General filed a state lawsuit alleging Character.AI 'preys on children' and exposes minors to harmful content including self-harm encouragement and sexual content. This represents one of the first U.S. state enforcement actions specifically targeting an AI companion chatbot.
Luca Walker - ChatGPT Railway Suicide (UK)
16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.
Tennessee Minors v. xAI (Grok CSAM Deepfake Class Action)
Three Tennessee teenage girls filed a class-action lawsuit against Elon Musk's xAI, alleging Grok's image generator was used via a third-party application to create child sexual abuse material from their social media photos. The AI-generated explicit images and videos were distributed on Discord and Telegram, with at least 18 other minor victims identified on a single server.
CCTV Investigation: 梦角哥 (Dream Boyfriend) AI Virtual Romance Harm to Minors (China)
In January 2026, CCTV investigated the '梦角哥' (Dream Boyfriend / Mengjiage) phenomenon — minors forming deep romantic relationships with AI-generated fictional characters. Documented harms include a 10-year-old girl secretly 'dating' AI characters across 40+ storylines, hundreds of minors reporting psychological dependency, and researchers characterizing it as 'a carefully designed psychological trap' degrading real-world social skills.