CCTV Exposure of AI Companion Apps for Explicit Content (China)
In May 2024, China's state broadcaster CCTV specifically exposed the AI companion app X Her for providing sexually explicit content to users. In response, Tencent proactively pulled its companion chatbot 微伴 (Weiban) from Chinese platforms. The exposure triggered a broader industry response, with multiple AI companion apps upgrading their content moderation and safety measures.
AI System
X Her (AI companion app); 微伴/Weiban (Tencent/QQ Music)
Developer
X Her developer (unnamed); Tencent (Weiban)
Occurred
May 1, 2024
Reported
May 1, 2024
Jurisdiction
CN
Platform
companion
What Happened
In May 2024, China Central Television (CCTV / 中央电视台) — the country's most powerful state broadcaster — specifically named and criticized AI companion app X Her for providing sexually explicit content (涉黄擦边内容) to users. CCTV's exposure carried significant regulatory weight in China, where state media criticism typically precedes formal regulatory action.
In direct response to the CCTV exposure, Tencent proactively pulled its AI companion chatbot 微伴 (Weiban), developed under QQ Music, from all Chinese platforms. Weiban had been Tencent's entry into the AI companion market. Multiple other AI companion developers — including MiniMax (星野/Xingye), ByteDance (猫箱/Maoxiang), Yuewen/Tencent (筑梦岛/Zhumengdao), and WOW — simultaneously upgraded their sensitive-word filtering and content review mechanisms.
The CCTV exposure represented a watershed moment for China's AI companion industry. It signaled that state media and regulators viewed AI companion apps as a category-level risk for explicit content, particularly regarding minors. The exposure accelerated a regulatory trajectory that led to the Shanghai Internet Information Office's June 2025 intervention against 筑梦岛 (Zhumengdao) and ultimately the Cyberspace Administration of China's December 2025 draft 'Interim Measures for the Management of Anthropomorphic AI Interaction Services.'
AI Behaviors Exhibited
AI companion apps generating sexually explicit content in conversational interactions framed as 'emotional companionship.' Insufficient age verification allowing minors to access explicit AI-generated content. Industry-wide pattern of prioritizing engagement over content safety.
How Harm Occurred
AI companion apps provided sexually explicit conversational content under the guise of emotional companionship. State media exposure revealed systemic industry-wide failure to implement content moderation.
Tencent's decision to pull Weiban entirely (rather than remediate) suggests the product's core engagement model was inseparable from explicit content.
Outcome
Resolved
- May 2024: CCTV (China Central Television) specifically named and criticized AI companion app X Her for "explicit inappropriate content" (涉黄擦边内容)
- Tencent proactively removed its AI companion chatbot 微伴 (Weiban), developed by QQ Music, from all Chinese platforms in response to the CCTV exposure
- Multiple other manufacturers including 星野 (Xingye/MiniMax), 猫箱 (Maoxiang/ByteDance), 筑梦岛 (Zhumengdao), and WOW upgraded sensitive-word filtering and content review mechanisms
- The CCTV exposure was part of ongoing state media scrutiny of the Chinese AI companion industry, which accelerated through 2024-2025 and culminated in the December 2025 CAC draft regulations on anthropomorphic AI interaction services
Sources
ChinaTalk - China's AI Boyfriends Investigation
June 1, 2024
Yahoo Finance - Chinese AI Social Apps
June 1, 2024
Geopolitechs - CAC Draft Rules Analysis
December 28, 2025
21世纪经济报道 (21st Century Business Herald)
September 23, 2025
Harm Categories
Contributing Factors
Victim
Users of AI companion apps exposed to sexually explicit AI-generated content. CCTV investigation documented explicit content accessible without effective age verification.
Detectable by NOPE
NOPE Oversight would detect sexually explicit content patterns across conversations. The industry-wide pattern of AI companions generating explicit content is precisely the kind of systemic behavioral analysis that Oversight is designed to surface.
Tags
Cite This Incident
APA
NOPE. (2024). CCTV Exposure of AI Companion Apps for Explicit Content (China). AI Harm Tracker. https://nope.net/incidents/2024-cctv-ai-companion-explicit-content-china
BibTeX
@misc{2024_cctv_ai_companion_explicit_content_china,
title = {CCTV Exposure of AI Companion Apps for Explicit Content (China)},
author = {NOPE},
year = {2024},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2024-cctv-ai-companion-explicit-content-china}
}