EU CSAR (Proposed)
Proposal for a Regulation to prevent and combat child sexual abuse (CSAR)
Proposed permanent framework replacing the interim derogation. Parliament's position (Nov 2023) limits detection to known/new CSAM and excludes E2EE services. The Council agreed its position in November 2025, dropping mandatory scanning of encrypted messages.
Jurisdiction
European Union
Enacted
Pending
Effective
TBD
Enforcement
TBD
Trilogues began December 9, 2025. The Council dropped mandatory chat scanning in November 2025. The interim derogation expires April 3, 2026; a final deal is expected by June 2026.
EUR-Lex
Why It Matters
If the EU adopts the detection-order model, it becomes a global reference point, directly colliding with encrypted messaging and private AI chat.
Recent Developments
Council position (November 2025) dropped mandatory encrypted message scanning in favor of voluntary framework. Political trilogues scheduled Feb 26, May 4, June 29, 2026. EU Centre on Child Sexual Abuse to be established.
Who Must Comply
Obligations fall on:
- Online services and communications providers (scope TBD)
Safety Provisions
- Risk assessment + mitigation duties for at-risk services
- Potential detection orders (contested)
- EU Centre for coordination/reporting
- Parliament: excludes E2EE, limits to known/new CSAM (not grooming)
Compliance & Enforcement
Penalties
Penalties pending regulatory determination
Compliance Help
If adopted with detection orders, providers will need scalable detection and reporting pipelines, auditability, and a strategy for encrypted contexts.
Cite This
APA
European Union. (n.d.). Proposal for a Regulation to prevent and combat child sexual abuse (CSAR).
Related Regulations
EU CSAM Interim
Temporary legal bridge allowing certain communications providers to voluntarily detect, report, and remove CSAM, notwithstanding ePrivacy constraints. Extended via Regulation 2024/1307 while the permanent CSAR is negotiated.
DSA
Comprehensive platform regulation with tiered obligations. VLOPs (45M+ EU users) face systemic risk assessments, algorithmic transparency, and independent audits.
DE JuSchG §24a (KidD)
Requires providers of certain telemedia services to implement provider-side precautionary measures ("Vorsorgemaßnahmen") with regulator-facing evaluability via published BzKJ criteria.
Ofcom Children's Codes
Ofcom codes requiring user-to-user services and search services to protect children from harmful content including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.
FR SREN
France's 2024 "digital space" law strengthening national digital regulation and enforcement levers via ARCOM across platform safety and integrity issues.
Ireland OSMR
Establishes Coimisiún na Meán (Media Commission) with binding duties for video-sharing platforms. One of the cleaner examples of explicit self-harm/suicide/eating-disorder content duties in platform governance.
Last updated February 11, 2026. Verify against primary sources before relying on this information.