DE JuSchG §24a (KidD)
Germany Youth Protection Act (JuSchG) — §24a Precautionary Measures + BzKJ Criteria (KidD)
Requires providers of certain telemedia services to implement provider-side precautionary measures ("Vorsorgemaßnahmen") with regulator-facing evaluability via published BzKJ criteria.
Jurisdiction
Germany
Enacted
May 14, 2024
Effective
May 14, 2024
Enforcement
Bundeszentrale für Kinder- und Jugendmedienschutz (BzKJ)
Why It Matters
The most audit-oriented youth safety regime in Europe: it requires documented, testable precautionary measures assessed against published regulator evaluation criteria, not just policy statements.
Who Must Comply
- Telemedia service providers within scope of JuSchG youth protection duties
Safety Provisions
- Provider-side precautionary measures for youth protection in digital/interactive services
- Regulator-facing evaluability: published criteria (KidD) for assessing adequacy and effectiveness of safeguards
- Designed to operationalize "safety-by-design" rather than complaint-only moderation
- Focus on interaction risks, where self-harm grooming, coercion, and crisis escalation arise
Compliance & Enforcement
Key Dates
May 14, 2024
Structural precautionary measures take effect; KidD enforcement structures established
Dec 1, 2025
Sixth amendment to the JMStV takes effect (pending ratification by all 16 federal states)
Penalties
Regulatory offenses: fines of up to €50,000. Media violations (distributing content harmful to minors): up to one year imprisonment or a fine. Additional provisions apply under the DDG (Germany's Digital Services Act implementation).
Compliance Help
Requires documented, testable precautionary measures for interaction risks (including user protection flows), plus evidence that those measures are effective under the regulator's criteria.
Cite This
APA
Germany. (2024). Germany Youth Protection Act (JuSchG) — §24a Precautionary Measures + BzKJ Criteria (KidD).
Related Regulations
Ofcom Children's Codes
Ofcom codes requiring user-to-user services and search services to protect children from harmful content including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.
EU CSAR (Proposed)
Proposed permanent framework replacing the interim derogation. The Parliament's position (Nov 2023) limits detection to known and new CSAM and excludes E2EE services; the Council has not yet agreed a General Approach.
CA SB 867
Proposes a 4-year moratorium on the sale and manufacturing of toys with AI chatbot capabilities for children under 12. During the moratorium, a task force would develop safety standards with input from technologists, parents, and ethicists.
FR SREN
France's 2024 "digital space" law strengthening national digital regulation and enforcement levers via ARCOM across platform safety and integrity issues.
Finland AI Act
Finland's EU AI Act implementation using decentralized supervision model. Traficom serves as single point of contact and coordination authority. Ten market surveillance authorities share enforcement across sectors. New Sanctions Board handles fines over EUR 100,000.
Hungary AI Act
Hungary's comprehensive AI law implementing the EU AI Act. Designates the National Media and Infocommunications Authority (NMHH) as the primary supervisory authority, with sectoral regulators for specific domains.
Last updated February 17, 2026. Verify against primary sources before relying on this information.