UK Children's Code
Age Appropriate Design Code (Children's Code) — UK DPA 2018 / UK GDPR
The UK's enforceable "privacy-by-design for kids" regime. It applies to online services likely to be accessed by children under 18 and requires high-privacy defaults, limits on profiling and nudge techniques, DPIA-style risk assessment, and safety-by-design.
Jurisdiction
United Kingdom
Enacted
Sep 2, 2020
Effective
Sep 2, 2021
Enforcement
Information Commissioner's Office (ICO)
Statutory code of practice under the DPA 2018; the ICO and courts must take it into account when enforcing data protection law
Why It Matters
One of the most enforceable "design code" models globally: it reaches recommender systems and engagement design, and it served as the template for US state AADCs.
Recent Developments
ICO "tech sector sweep" 2024-2025 examining compliance. Active enforcement on profiling and nudge design.
Who Must Comply
- Information society services likely to be accessed by children under 18
- Apps, sites, platforms, connected services
Safety Provisions
- Best interests of child must be primary consideration
- High privacy settings by default
- Data minimization + purpose limitation for children
- DPIAs for child-accessible processing
- Profiling restrictions: avoid unless justified; mitigate harms
- Nudge techniques: must not encourage unnecessary data sharing or weaken privacy protections
- Geolocation and high-risk features off by default
- Transparency in child-appropriate language
- Parental controls with age-appropriate information
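The default-settings standards above can be illustrated with a minimal sketch. All names here (`AccountSettings`, `apply_childrens_code_defaults`) are hypothetical, not from the Code itself; the point is only that services flip to high-privacy defaults for any user under 18, not just under 13.

```python
from dataclasses import dataclass

# Hypothetical settings model illustrating the Code's default-settings standards.
@dataclass
class AccountSettings:
    profile_visibility: str = "public"
    geolocation_enabled: bool = True
    personalised_recommendations: bool = True
    data_sharing_opt_in: bool = True

def apply_childrens_code_defaults(settings: AccountSettings) -> AccountSettings:
    """Switch an account to high-privacy defaults for users under 18."""
    settings.profile_visibility = "private"        # high privacy by default
    settings.geolocation_enabled = False           # geolocation off by default
    settings.personalised_recommendations = False  # profiling avoided unless justified
    settings.data_sharing_opt_in = False           # data minimisation
    return settings

def settings_for_user(age: int) -> AccountSettings:
    settings = AccountSettings()
    if age < 18:  # the Code covers all under-18s
        settings = apply_childrens_code_defaults(settings)
    return settings
```

Note the design choice the Code forces: privacy-protective values are the starting state for child users, and any departure from them must be justified, rather than children having to opt in to protection.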
Compliance & Enforcement
Key Dates
Sep 2, 2020
Code comes into force, transition period begins
Sep 2, 2021
Full compliance required, enforcement begins
Penalties
£17.5M or 4% of annual worldwide turnover (whichever is higher)
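The higher-of-two penalty cap works out as simple arithmetic; a quick sketch (figures in GBP, turnover value illustrative):

```python
def max_penalty_gbp(annual_worldwide_turnover_gbp: float) -> float:
    """UK GDPR higher-tier cap: the greater of £17.5M
    or 4% of annual worldwide turnover."""
    return max(17_500_000.0, 0.04 * annual_worldwide_turnover_gbp)

# A firm with £1bn turnover faces a cap of £40M;
# below £437.5M turnover, the flat £17.5M figure dominates.
```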
Compliance Help
Requires high-privacy defaults, limits on profiling and nudging, child-appropriate disclosures, and DPIA evidence. Recommender systems must be configured for child safety.
Cite This
APA
Information Commissioner's Office. (2020). Age Appropriate Design Code (Children's Code). UK DPA 2018 / UK GDPR.
Related Regulations
UK OSA
One of the most comprehensive platform content moderation regimes globally. Creates specific duties around suicide, self-harm, and eating disorder content for children with 'highly effective' age assurance requirements.
Ofcom Children's Codes
Ofcom codes requiring user-to-user services and search services to protect children from harmful content including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.
UK AI Approach
Sector-specific, principles-based approach using existing regulators. Five cross-sector principles guide regulatory application rather than horizontal AI legislation.
Children's Digital Wellbeing
UK government consultation on restricting children's access to AI chatbots, banning addictive design features like infinite scrolling and auto-play, and potentially setting age restrictions for social media. Would amend the Crime and Policing Bill to bring AI chatbot providers under Online Safety Act duties.
VT AADC
Vermont design code structured to be more litigation-resistant: focuses on data processing harms rather than content-based restrictions. AG rulemaking authority begins July 2025.
Finland AI Act
Finland's EU AI Act implementation using decentralized supervision model. Traficom serves as single point of contact and coordination authority. Ten market surveillance authorities share enforcement across sectors. New Sanctions Board handles fines over EUR 100,000.
Last updated February 17, 2026. Verify against primary sources before relying on this information.