EU CSAR (Proposed)
Proposal for a Regulation to prevent and combat child sexual abuse (CSAR)
Proposed permanent framework replacing the interim derogation. The Parliament position (Nov 2023) limits detection to known/new CSAM and excludes E2EE services. The Council reached a General Approach in November 2025; trilogues are ahead.
Jurisdiction: European Union (EU)
Enacted: Unknown
Effective: Unknown
Enforcement: Not specified
Status: Inter-institutional negotiation, long blocked on encryption/scanning; trilogues ahead
What It Requires
Who Must Comply
This proposal applies to:
- Online services and communications providers (exact scope TBD)
Safety Provisions
- Risk assessment and mitigation duties for at-risk services
- Potential detection orders (contested)
- EU Centre for coordination/reporting
- Parliament position: excludes E2EE; limits detection to known/new CSAM, not grooming (the known-vs-new distinction is sketched below)
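
To make the contested scope concrete, here is a minimal Python sketch of the known-vs-new distinction, under stated assumptions: the empty hash list, plain SHA-256, and the constant-score classifier stub are all illustrative, since the proposal itself does not prescribe a technology.

import hashlib

# Hypothetical hash list; real deployments use curated perceptual-hash
# databases rather than plain SHA-256, so near-duplicates also match.
KNOWN_HASHES: set[str] = set()

def is_known_match(content: bytes) -> bool:
    # "Known" CSAM: match against previously verified material.
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES

def new_material_score(content: bytes) -> float:
    # "New" CSAM: a trained classifier scores previously unseen material.
    # Stubbed with a constant here; real classifiers carry error rates,
    # which is why this category is more contested than hash matching.
    return 0.0

Matching against verified material is far less error-prone than classifying unseen material, which is one reason the Parliament position stops at known/new CSAM and leaves grooming detection, harder still, out of scope.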
Quick Facts
- Binding: No
- Mental Health Focus: No
- Child Safety Focus: Yes
- Algorithmic Scope: No
Why It Matters
If the EU adopts the detection-order model, it becomes the global reference point, directly colliding with encrypted messaging and private AI chat.
Recent Developments
Germany's opposition formed a blocking minority on Oct 7, 2025. A Danish compromise text (Oct 31, 2025) made client-side scanning voluntary. The Council reached a general approach on Nov 26, 2025; trilogues are ahead.
What You Need to Comply
If adopted with detection orders: scalable detection and reporting, auditability, and a strategy for encrypted contexts.
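
As one way to read "auditability", here is a hedged sketch of an append-only decision log feeding a reporting hook. The file name, event fields, and print-based reporting stub are assumptions for illustration; the proposal specifies no endpoint or schema.

import json
import time
from dataclasses import dataclass, asdict

AUDIT_LOG = "detection_audit.jsonl"  # hypothetical append-only audit trail

@dataclass
class DetectionEvent:
    item_id: str
    detector: str   # e.g. "known-hash-match" or "new-content-classifier"
    hit: bool
    ts: float

def record(event: DetectionEvent) -> None:
    # Auditability: log every decision, hit or miss, so behaviour under a
    # detection order can be reviewed after the fact.
    with open(AUDIT_LOG, "a") as log:
        log.write(json.dumps(asdict(event)) + "\n")

def report_if_hit(event: DetectionEvent) -> None:
    record(event)
    if event.hit:
        # Placeholder for the mandated reporting channel (the proposed
        # EU Centre); no real reporting API is assumed here.
        print(f"report queued for {event.item_id} via {event.detector}")

report_if_hit(DetectionEvent("item-001", "known-hash-match", False, time.time()))

Logging misses as well as hits is the design point: an auditor can then measure over- and under-detection rather than seeing only the reports that were filed.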
Cite This
APA
European Union. (n.d.). Proposal for a Regulation to prevent and combat child sexual abuse (CSAR). Retrieved from https://nope.net/regs/eu-csar
BibTeX
@misc{eu_csar,
  title = {Proposal for a Regulation to prevent and combat child sexual abuse (CSAR)},
  author = {{European Union}},
  year = {n.d.},
  url = {https://nope.net/regs/eu-csar}
}
Related Regulations
EU CSAM Interim
Temporary legal bridge allowing certain communications providers to voluntarily detect, report, and remove CSAM notwithstanding ePrivacy constraints. Extended via Regulation (EU) 2024/1307 while the permanent CSAR is negotiated.
DSA
Comprehensive platform regulation with tiered obligations. VLOPs (45M+ EU users) face systemic risk assessments, algorithmic transparency, and independent audits.
DE JuSchG §24a (KidD)
Requires providers of certain telemedia services to implement provider-side precautionary measures ("Vorsorgemaßnahmen") with regulator-facing evaluability via published BzKJ criteria.
Ofcom Children's Codes
Ofcom codes requiring user-to-user services and search services to protect children from harmful content including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.
Ireland OSMR
Establishes Coimisiún na Meán (Media Commission) with binding duties for video-sharing platforms. One of the cleaner examples of explicit self-harm/suicide/eating-disorder content duties in platform governance.
FR SREN
France's 2024 "digital space" law, strengthening national digital regulation and giving ARCOM enforcement levers across platform safety and integrity issues.