Ofcom Children's Codes
Protection of Children Codes of Practice for user-to-user and search services under the Online Safety Act 2023
Ofcom codes requiring user-to-user services and search services to protect children from harmful content including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.
Jurisdiction
United Kingdom
Enacted
Jul 4, 2025
Effective
Jul 25, 2025
Enforcement
Ofcom (Office of Communications)
Issued July 4, 2025; in force July 25, 2025
Why It Matters
Explicitly requires detection and removal of suicide/self-harm content with specific AI chatbot guidance. Any companion chatbot accessible to UK children must implement crisis detection and content filtering. Suicide and self-harm designated as primary priority content requiring most stringent protections.
Recent Developments
AI chatbot guidance issued November 8, 2024 and December 18, 2025, clarifying that chatbots enabling user content sharing are regulated. The Codes were issued July 4, 2025 and became enforceable July 25, 2025.
Who Must Comply
- User-to-user services used by significant numbers of UK children
- Search services accessible to UK children
- AI chatbots that enable users to share AI-generated content with other users
- Services with group chat functionality where multiple users interact with a chatbot
Safety Provisions
- Suicide and self-harm content designated as primary priority harmful content
- Detection technology required for suicide/self-harm content
- Recommender systems must exclude suicide/self-harm content from children's feeds
- Content moderation systems must take swift action on identified suicide/self-harm content
- Real-time reporting of livestreams showing imminent harm
- Human moderators required when livestreaming is active
- AI chatbots enabling user content sharing are regulated as user-to-user services
Compliance & Enforcement
Key Dates
Jul 24, 2025
Risk assessment deadline
Jul 25, 2025
Protection of Children Codes enforceable
Penalties
£18 million or 10% of qualifying worldwide revenue (whichever is greater)
Cite This
APA
United Kingdom. (2025). Protection of Children Codes of Practice for user-to-user and search services under the Online Safety Act 2023.
Related Regulations
UK OSA
One of the most comprehensive platform content moderation regimes globally. Creates specific duties around suicide, self-harm, and eating disorder content for children with 'highly effective' age assurance requirements.
CA AI Child Safety Ballot
Comprehensive child AI safety ballot initiative by Common Sense Media. Expands companion chatbot definitions, raises the age threshold for data-sale consent, prohibits certain AI products for children, and establishes a new state regulatory structure. Allows state and private lawsuits, requires AI literacy in curricula, mandates school device bans during instruction, and creates a children's AI safety fund.
KOSA
Would establish a duty of care for platforms regarding minor safety. Passed the full Senate 91-3 in July 2024; passed the Senate Commerce Committee multiple times (2022, 2023). Not yet enacted.
Children's Digital Wellbeing
UK government consultation on restricting children's access to AI chatbots, banning addictive design features like infinite scrolling and auto-play, and potentially setting age restrictions for social media. Would amend the Crime and Policing Bill to bring AI chatbot providers under Online Safety Act duties.
UK AI Chatbot OSA Extension
Amends the Crime and Policing Bill to bring standalone AI chatbot providers (ChatGPT, Grok, Gemini, etc.) within scope of Online Safety Act illegal content duties, closing the loophole where AI-only chatbots were exempt from OSA.
Ireland OSMR
Establishes Coimisiún na Meán (Media Commission) with binding duties for video-sharing platforms. One of the cleaner examples of explicit self-harm/suicide/eating-disorder content duties in platform governance.
Last updated February 17, 2026. Verify against primary sources before relying on this information.