Loi SREN (Loi n° 2024-449 du 21 mai 2024 visant à sécuriser et réguler l'espace numérique; Law No. 2024-449 of May 21, 2024 on securing and regulating the digital space)
France's 2024 "digital space" law, which strengthens national digital regulation and gives ARCOM additional enforcement levers across platform safety and integrity issues.
Jurisdiction
France
FR
Enacted
May 21, 2024
Effective
May 23, 2024 (most provisions)
Enforcement
ARCOM (and other competent authorities depending on provision)
What It Requires
Who Must Comply
This law applies to:
- Digital platforms and in-scope online services offered in France
Safety Provisions
- National-level platform regulation and enforcement posture (overlaying EU frameworks)
- Compliance obligations relevant to large platforms and safety controls
- Strengthened age verification requirements
- Provisions on cyberbullying and online harassment
Compliance Timeline
- May 23, 2024: Most provisions enter into force
- Oct 11, 2024: ARCOM publishes final age verification standard
- Jan 11, 2025: Age verification compliance deadline (robust age checks required)
- Apr 11, 2025: End of transition period for card-based verification
Quick Facts
- Binding: Yes
- Mental Health Focus: Yes
- Child Safety Focus: Yes
- Algorithmic Scope: No
Why It Matters
France is a major EU market with an active regulator. SREN is a significant national overlay worth tracking for compliance planning.
Cite This
APA
France. (2024). Loi SREN (Loi n° 2024-449 du 21 mai 2024 visant à sécuriser et réguler l'espace numérique). Retrieved from https://nope.net/regs/fr-sren
BibTeX
@misc{fr_sren,
title = {Loi SREN (Loi n° 2024-449 du 21 mai 2024 visant à sécuriser et réguler l'espace numérique)},
author = {France},
year = {2024},
url = {https://nope.net/regs/fr-sren}
}

Related Regulations
DSA
Comprehensive platform regulation with tiered obligations. VLOPs (45M+ EU users) face systemic risk assessments, algorithmic transparency, and independent audits.
Ireland OSMR
Establishes Coimisiún na Meán (Media Commission) with binding duties for video-sharing platforms. One of the cleaner examples of explicit self-harm/suicide/eating-disorder content duties in platform governance.
UK OSA
One of the most comprehensive platform content moderation regimes globally. Creates specific duties around suicide, self-harm, and eating disorder content for children, with "highly effective" age assurance requirements.
DE JuSchG §24a (KidD)
Requires providers of certain telemedia services to implement provider-side precautionary measures ("Vorsorgemaßnahmen") with regulator-facing evaluability via published BzKJ criteria.
Ofcom Children's Codes
Ofcom codes requiring user-to-user services and search services to protect children from harmful content including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.
EU CSAR (Proposed)
Proposed permanent framework replacing the interim derogation. The Parliament's position (Nov 2023) limits detection to known and new CSAM and excludes E2EE services. The Council has not yet agreed a General Approach.