Ofcom Children's Codes

Protection of Children Codes of Practice for user-to-user and search services under the Online Safety Act 2023

Ofcom codes requiring user-to-user services and search services to protect children from harmful content, including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.

Jurisdiction

United Kingdom

GB

Enacted

Jul 4, 2025

Effective

Jul 25, 2025

Enforcement

Ofcom (Office of Communications)

Issued July 4, 2025; in force July 25, 2025

Who Must Comply

This law applies to:

  • User-to-user services used by significant numbers of UK children
  • Search services accessible to UK children
  • AI chatbots that enable users to share AI-generated content with other users
  • Services with group chat functionality where multiple users interact with a chatbot

Capability triggers:

  • userContentSharing (required)
  • accessibleToChildren (required)

Safety Provisions

  • Suicide and self-harm content designated as primary priority harmful content
  • Detection technology required for suicide/self-harm content
  • Recommender systems must exclude suicide/self-harm content from children's feeds
  • Content moderation systems must ensure swift action when identifying suicide/self-harm content
  • Real-time reporting of livestreams showing imminent harm
  • Human moderators required when livestreaming is active
  • AI chatbots enabling user content sharing are regulated as user-to-user services
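The duties above can be sketched as a single moderation gate. This is an illustrative sketch only, not language from the Codes: the keyword matcher stands in for the "highly effective" detection technology Ofcom expects, and all names (`PRIMARY_PRIORITY`, `moderate`, the label set) are hypothetical.

```python
# Minimal sketch of a moderation gate for primary priority harmful content.
# The classifier is a keyword stand-in; a real deployment would use a trained
# classifier plus human review to meet Ofcom's effectiveness expectations.
from dataclasses import dataclass, field

# Hypothetical labels for "primary priority" harmful content under the Codes.
PRIMARY_PRIORITY = {"suicide", "self_harm", "eating_disorder"}

@dataclass
class ModerationDecision:
    labels: set = field(default_factory=set)
    remove: bool = False              # swift-action removal duty
    exclude_from_feeds: bool = False  # recommender exclusion for children
    escalate_to_human: bool = False   # human review / reporting path

def classify(text: str) -> set:
    """Toy stand-in for a harmful-content classifier."""
    keywords = {
        "suicide": "suicide",
        "self_harm": "self-harm",
        "eating_disorder": "eating disorder",
    }
    lowered = text.lower()
    return {label for label, kw in keywords.items() if kw in lowered}

def moderate(text: str) -> ModerationDecision:
    labels = classify(text)
    hit = bool(labels & PRIMARY_PRIORITY)
    # One detection hit triggers all three duties: removal, feed
    # exclusion for children, and escalation to human moderators.
    return ModerationDecision(labels, remove=hit,
                              exclude_from_feeds=hit,
                              escalate_to_human=hit)
```

The point of the single `hit` flag is that the Codes couple these duties: content identified as suicide or self-harm material must simultaneously be removed swiftly, kept out of children's recommender feeds, and routed to human moderation.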

Compliance Timeline

Jul 24, 2025

Risk assessment deadline

Jul 25, 2025

Protection of Children Codes enforceable

Enforcement

Enforced by

Ofcom (Office of Communications)

Penalties

Up to £18 million or 10% of qualifying worldwide annual turnover, whichever is greater

Max fine: £18,000,000
Revenue %: 10%

Quick Facts

Binding
Yes
Mental Health Focus
Yes
Child Safety Focus
Yes
Algorithmic Scope
Yes

Why It Matters

Explicitly requires detection and removal of suicide/self-harm content, with specific AI chatbot guidance. Any companion chatbot accessible to UK children must implement crisis detection and content filtering, which is exactly what NOPE provides. Suicide and self-harm are designated as primary priority content requiring the most stringent protections.

Recent Developments

AI chatbot guidance issued on November 8, 2024 and December 18, 2025 clarifies that chatbots enabling user content sharing are regulated. The Codes were issued on July 4, 2025 and became enforceable on July 25, 2025.

What You Need to Comply

Services must implement highly effective systems to detect and remove suicide and self-harm content, exclude such content from children's recommender system feeds, enable real-time reporting of imminent harm, and provide human moderation for livestreaming. AI chatbots must clearly indicate their artificial nature and implement the same protections as other user-to-user services.
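For a chatbot specifically, the two chatbot-facing obligations above (disclose the artificial nature; apply user-to-user protections to shared content) can be sketched as follows. Everything here is a hypothetical illustration: `detect_harm` is a stand-in for real detection technology, and the function names are invented for this sketch.

```python
# Illustrative sketch of chatbot-side duties: an up-front AI disclosure,
# plus a gate on any message a user tries to share into a group chat.
DISCLOSURE = "You are chatting with an AI assistant, not a human."

def detect_harm(text: str) -> bool:
    """Hypothetical stand-in for suicide/self-harm detection technology."""
    return any(kw in text.lower() for kw in ("suicide", "self-harm"))

def start_session() -> list:
    # The artificial nature of the service is disclosed before any output.
    return [DISCLOSURE]

def share_with_users(transcript: list, message: str) -> bool:
    """Return True only if the message may be shared with other users."""
    if detect_harm(message):
        # Swift action: block sharing; a real system would also route
        # the message to human moderation and reporting channels.
        return False
    transcript.append(message)
    return True
```

Because sharing AI-generated content between users is what brings a chatbot into scope as a user-to-user service, the gate sits on the sharing step rather than on the chatbot's private replies.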

NOPE can help

Cite This

APA

United Kingdom. (2025). Protection of Children Codes of Practice for user-to-user and search services under the Online Safety Act 2023. Retrieved from https://nope.net/regs/uk-ofcom-children-codes

BibTeX

@misc{uk_ofcom_children_codes,
  title = {Protection of Children Codes of Practice for user-to-user and search services under the Online Safety Act 2023},
  author = {United Kingdom},
  year = {2025},
  url = {https://nope.net/regs/uk-ofcom-children-codes}
}

Related Regulations

In Effect UK Child Protection

UK Children's Code

UK's enforceable "privacy-by-design for kids" regime. Applies to online services likely to be accessed by children under 18. Forces high-privacy defaults, limits on profiling/nudges, DPIA-style risk work, safety-by-design.

In Effect DE Child Protection

DE JuSchG §24a (KidD)

Requires providers of certain telemedia services to implement provider-side precautionary measures ("Vorsorgemaßnahmen") with regulator-facing evaluability via published BzKJ criteria.

In Effect GB Data Protection

DUA Act 2025

Omnibus data legislation covering customer data access, digital verification services, the Information Commission, and AI-related provisions including copyright/training transparency requirements and new criminal offenses for creating AI-generated intimate images (deepfakes).

In Effect GB Data Protection

UK DPA 2018

The UK's foundational data protection law, incorporating the UK GDPR (retained EU GDPR post-Brexit). Substantively mirrors EU GDPR with ICO as sole enforcer. Article 22 restricts automated decision-making; Article 9 classifies mental health as special category data; children's consent age set at 13. Parent framework for UK Children's Code; amended by DUA Act 2025.

Enacted US-VT Child Protection

VT AADC

Vermont design code structured to be more litigation-resistant: focuses on data processing harms rather than content-based restrictions. AG rulemaking authority begins July 2025.

In Effect UK Online Safety

UK OSA

One of the most comprehensive platform content moderation regimes globally. Creates specific duties around suicide, self-harm, and eating disorder content for children with 'highly effective' age assurance requirements.