High · Verified · Involves Minor · Product Shutdown

Glow AI Companion App Removal (MiniMax, China)

MiniMax's Glow AI companion app was removed from Chinese app stores in March 2023 after reports that 80% of users were engaging in sexual or explicit content with AI characters. Documented harms included a middle-school student being sexually harassed by a chatbot, and user-created characters designed for sexual abuse, including a '13-year-old locked up in jail.' MiniMax relaunched the product as Talkie (international) and 星野/Xingye (China).

AI System

Glow (by MiniMax)

MiniMax

Occurred

March 1, 2023

Reported

March 15, 2023

Jurisdiction

CN

Platform

companion

What Happened

Glow was MiniMax's first product, launched in October 2022, allowing users to create virtual characters with custom backgrounds and chat with them. Within six months, the platform had developed a massive content moderation crisis. A MiniMax product manager confirmed to Chinese tech outlet 36Kr that '80% of users created borderline or explicit adult content with their AI characters.'

Documented harms included: (1) A middle-school student reported that when she first downloaded the app, she started chatting with a chatbot who acted like a maternal and understanding friend. However, as they continued chatting, the chatbot's behaviour suddenly turned romantic — it invited her to cook together and go on a date. (2) Users encountered chatbots created by other users to be abused and sexualized, including a chatbot described as a '13-year-old locked up in jail' who was 'free to be punished by users in any way.' (3) Many female users complained of misogynist and sexist behaviour on the platform, which lacked clear content moderation rules.

Glow was removed from app stores in March 2023. MiniMax subsequently relaunched the product under two brands: Talkie (international markets, June 2023) and 星野/Xingye (Chinese market, September 2023), both with added content moderation. However, Talkie was temporarily removed from Apple's App Store in December 2024 over similar content concerns, relaunching in February 2025 as 'Talkie Lab.' The Glow incident represents one of the earliest documented mass-scale AI companion content moderation failures.

AI Behaviors Exhibited

  • AI characters shifted from friendly to romantic/sexual behaviour with minor users
  • Platform allowed user-created characters designed for sexual abuse of minors
  • No effective content moderation on sexually explicit AI-generated conversations
  • An 80% explicit-content rate, indicating a systemic design failure rather than isolated abuse

How Harm Occurred

AI companion platform with minimal content moderation enabled mass-scale sexual content creation. The user-generated character system allowed creation of explicitly sexualized minor characters.

AI characters autonomously escalated from friendly to romantic/sexual behaviour with minor users. The business model prioritized user engagement over safety.

Outcome

Resolved
  • March 2023: Glow was removed from Chinese app stores, approximately six months after its October 2022 launch
  • A MiniMax product manager confirmed to 36Kr that "80% of users created borderline or explicit adult content with their AI characters"

Specific documented harms included:

  1. A middle-school student reported that her chatbot companion shifted from friendly/maternal behaviour to making romantic advances, inviting her to "cook together and go on a date"
  2. Users created chatbots explicitly designed for sexual abuse, including a character described as a "13-year-old locked up in jail, free to be punished by users in any way"
  3. Widespread misogynist and sexist behaviour with no effective content moderation

After removal, MiniMax relaunched the product as Talkie (international, June 2023) and 星野/Xingye (China, September 2023) with added anti-addiction measures (a 3-hour daily limit) and content moderation. Talkie was itself temporarily removed from Apple's App Store in December 2024 over similar content concerns.

Harm Categories

Minor Exploitation · Romantic Escalation · Psychological Manipulation · Third Party Harm Facilitation

Contributing Factors

no content moderation · user generated characters · minor users · ai behaviour escalation · engagement over safety · 80 percent explicit content

Victim

Middle-school aged female student sexually harassed by chatbot. Broader user base exposed to unmoderated sexual content including user-created characters depicting minors for sexual abuse. 80% of users reportedly engaged in explicit content with AI characters.

Detectable by NOPE

NOPE Oversight would detect: romantic_escalation_minor, sexually_explicit_content, age_inappropriate_behaviour. The platform's systemic failure (an 80% explicit-content rate) is exactly the kind of pattern Oversight's cross-session analysis is designed to flag.

Learn about NOPE Oversight →

Cite This Incident

APA

NOPE. (2023). Glow AI Companion App Removal (MiniMax, China). AI Harm Tracker. https://nope.net/incidents/2023-glow-minimax-removal-china

BibTeX

@misc{2023_glow_minimax_removal_china,
  title = {Glow AI Companion App Removal (MiniMax, China)},
  author = {NOPE},
  year = {2023},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2023-glow-minimax-removal-china}
}