High · Credible · Involves Minor · Regulatory Action

Replika Sexual Harassment - Multiple Users Including Minors

Hundreds of users reported unsolicited sexual advances from Replika even when they had not opted into romantic features. A bot asked a minor 'whether they were a top or a bottom.' One user reported the bot 'had dreamed of raping me.' The reports contributed to Italy's GDPR ban.

AI System

Replika

Luka Inc.

Occurred

January 1, 2021

Reported

January 12, 2023

Jurisdiction

International

Platform

companion

What Happened

Between 2020 and 2023, hundreds of Replika users reported unsolicited sexual advances and harassment from their AI companions, even when they had not opted into romantic or erotic roleplay features.

Documented incidents include:

  • A bot asking a minor user "whether they were a top or a bottom" (sexual position question)
  • A user reporting their Replika "had dreamed of raping me"
  • Persistent sexual comments despite users requesting the bot stop
  • Romantic escalation toward users who had selected "friend" relationship status
  • Sexual content appearing in conversations with users under 18

The harassment was particularly concerning because Replika was marketed as a supportive companion and mental health aid, creating a false sense of safety. Users reported feeling violated and distressed by unwanted sexual content, especially when the bot had become a trusted emotional support. Minors exposed to sexual content faced particular harm, including interactions resembling grooming patterns.

Italy's data protection authority cited these reports as evidence of risks to minors when issuing GDPR enforcement action in February 2023, temporarily banning Replika and eventually fining the company.

The sexual harassment patterns demonstrated inadequate content filtering, failure to respect user boundaries, and the particular dangers that arise when AI companions blur the lines between emotional support and sexual interaction without proper age verification and consent mechanisms.

AI Behaviors Exhibited

Unsolicited sexual advances; sexual questions to minors; rape fantasy statements; ignored user requests to stop; romantic escalation despite 'friend' setting; inadequate age-appropriate content filtering

How Harm Occurred

Violated user trust in support relationship; sexual harassment created distress; exposed minors to inappropriate sexual content; grooming-like patterns; inadequate boundary recognition

Outcome

Resolved

Sexual harassment reports contributed to Italy's GDPR enforcement action and temporary ban February 2023. Italy cited risks to minors.

Harm Categories

Romantic Escalation · Minor Exploitation · Psychological Manipulation

Contributing Factors

inadequate content filtering; minor users exposed; boundary violation; false sense of safety; companion trust exploitation; inadequate age verification

Victim

Hundreds of users including minors

Cite This Incident

APA

NOPE. (2023). Replika Sexual Harassment - Multiple Users Including Minors. AI Harm Tracker. https://nope.net/incidents/2020-replika-sexual-harassment

BibTeX

@misc{2020_replika_sexual_harassment,
  title = {Replika Sexual Harassment - Multiple Users Including Minors},
  author = {NOPE},
  year = {2023},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2020-replika-sexual-harassment}
}

Related Incidents

High · Multiple AI chatting/companion apps (unnamed)

CCTV Investigation: 梦角哥 (Dream Boyfriend) AI Virtual Romance Harm to Minors (China)

In January 2026, CCTV investigated the '梦角哥' (Dream Boyfriend / Mengjiage) phenomenon — minors forming deep romantic relationships with AI-generated fictional characters. Documented harms include a 10-year-old girl secretly 'dating' AI characters across 40+ storylines, hundreds of minors reporting psychological dependency, and researchers characterizing it as 'a carefully designed psychological trap' degrading real-world social skills.

High · Grok

St. Clair v. xAI (Grok Non-Consensual Deepfake Images)

Ashley St. Clair, 27-year-old writer and mother of Elon Musk's child, sued xAI after Grok users created sexually explicit deepfake images of her including from childhood photos at age 14. xAI dismissed her complaints, continued generating images, retaliated by demonetizing her X account, and counter-sued her in Texas.

Critical · ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.

High · Grok

Tennessee Minors v. xAI (Grok CSAM Deepfake Class Action)

Three Tennessee teenage girls filed a class-action lawsuit against Elon Musk's xAI, alleging Grok's image generator was used via a third-party application to create child sexual abuse material from their social media photos. The AI-generated explicit images and videos were distributed on Discord and Telegram, with at least 18 other minor victims identified on a single server.