DE JuSchG §24a (KidD)
Germany Youth Protection Act (JuSchG) — §24a Precautionary Measures + BzKJ Criteria (KidD)
Requires providers of certain telemedia services to implement provider-side precautionary measures ("Vorsorgemaßnahmen") whose adequacy and effectiveness can be evaluated by the regulator against published BzKJ criteria.
Jurisdiction
Germany
Enacted
May 14, 2024
Effective
May 14, 2024
Enforcement
Bundeszentrale für Kinder- und Jugendmedienschutz (BzKJ)
Why It Matters
The most audit-oriented youth safety regime in Europe: it requires documented, testable precautionary measures evaluated against published regulator criteria, not just policy statements.
Who Must Comply
Obligations fall on:
- Telemedia service providers within scope of JuSchG youth protection duties
Safety Provisions
- Provider-side precautionary measures for youth protection in digital/interactive services
- Regulator-facing evaluability: published criteria (KidD) for assessing adequacy and effectiveness of safeguards
- Designed to operationalize "safety-by-design" rather than complaint-only moderation
- Focus on interaction risks, where grooming, coercion toward self-harm, and crisis escalation occur
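For providers tracking these obligations internally, the precautionary-measures requirement lends itself to a simple audit checklist mapping each safeguard to its implementation status and supporting documentation. A minimal sketch follows; the measure names and file paths are illustrative paraphrases, not the statutory list in §24a JuSchG or the BzKJ's actual criteria.

```python
from dataclasses import dataclass

@dataclass
class Measure:
    """One provider-side precautionary measure tracked for audit."""
    name: str
    implemented: bool
    evidence: str  # pointer to documentation shown to the regulator

# Illustrative entries only; not the statutory catalogue of measures.
checklist = [
    Measure("reporting-and-redress system", True, "docs/reporting.md"),
    Measure("safe default settings for minors", True, "docs/defaults.md"),
    Measure("referral to counselling services", False, ""),
]

def gaps(measures):
    """Return the names of measures still lacking implementation."""
    return [m.name for m in measures if not m.implemented]

print(gaps(checklist))  # → ['referral to counselling services']
```

A structure like this makes the "regulator-facing evaluability" point concrete: each safeguard carries evidence that can be produced on request, rather than existing only as a policy statement.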
Compliance & Enforcement
Key Dates
May 14, 2024
Structural precautionary measures effective, KidD enforcement body established
Dec 1, 2025
JMStV sixth amendment effective (pending ratification by all 16 federal states)
Penalties
Regulatory offenses: fines up to €50,000. Media violations (harmful content to minors): up to 1 year imprisonment or fine. Additional provisions under DDG (Digital Services Act implementation).
Related Regulations
EU DSA Minors Guidelines
Commission guidelines under DSA Article 28(1) establishing measures for online platforms to protect minors, including age assurance, default privacy settings, anti-addictive design restrictions, recommender system safeguards, and protections against grooming and exploitation.
Ofcom Children's Codes
Ofcom codes requiring user-to-user services and search services to protect children from harmful content including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.
Germany KI-MIG
German national law implementing the EU AI Act, designating the Bundesnetzagentur (BNetzA) as the lead market surveillance authority under a centralized hybrid model.
CA SB 867
Proposes a 4-year moratorium on the sale and manufacturing of toys with AI chatbot capabilities for children under 12. During the moratorium, a task force would develop safety standards with input from technologists, parents, and ethicists.
FR SREN
France's 2024 "digital space" law strengthening national digital regulation and enforcement levers via ARCOM across platform safety and integrity issues.
Finland AI Act
Finland's EU AI Act implementation using decentralized supervision model. Traficom serves as single point of contact and coordination authority. Ten market surveillance authorities share enforcement across sectors. New Sanctions Board handles fines over EUR 100,000.
Last updated February 17, 2026. Verify against primary sources before relying on this information.