DSA
Regulation (EU) 2022/2065 (Digital Services Act)
Comprehensive platform regulation with tiered obligations. VLOPs (45M+ EU users) face systemic risk assessments, algorithmic transparency, and independent audits.
Jurisdiction
European Union
Enacted
Nov 16, 2022
Effective
Feb 17, 2024
Enforcement
European Commission (VLOPs/VLOSEs); national Digital Services Coordinators (all other in-scope services)
Why It Matters
Systemic risk assessments must address negative effects on minors' mental and physical wellbeing, and the first major enforcement actions show the regime has teeth.
Recent Developments
Dec 5, 2025: €120M fine on X for transparency violations (deceptive blue checkmarks €45M, ad transparency €35M, researcher access €40M). Active proceedings against TikTok, Meta, AliExpress.
Who Must Comply
- Digital intermediaries (tiered by size)
- Very Large Online Platforms (VLOPs): 45M+ EU monthly users
- Very Large Online Search Engines (VLOSEs)
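The size tiers above can be sketched as a simple check. The 45M-user threshold comes from the text; the function name, return strings, and overall structure are purely illustrative, not official designation logic (in practice the Commission designates VLOPs/VLOSEs by decision):

```python
# Illustrative sketch of DSA size tiering (hypothetical helper, not official logic).
VLOP_THRESHOLD = 45_000_000  # average monthly active EU recipients (per the fact sheet)

def dsa_tier(avg_monthly_eu_users: int, is_search_engine: bool = False) -> str:
    """Classify a digital intermediary by the size tier described above."""
    if avg_monthly_eu_users >= VLOP_THRESHOLD:
        return "VLOSE" if is_search_engine else "VLOP"
    return "standard intermediary (baseline obligations)"

print(dsa_tier(50_000_000))        # VLOP
print(dsa_tier(50_000_000, True))  # VLOSE
print(dsa_tier(2_000_000))         # standard intermediary (baseline obligations)
```

A service crossing the threshold picks up the heavier tier's obligations (risk assessments, audits, researcher access) on top of the baseline duties.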
Safety Provisions
- VLOPs must conduct systemic risk assessments including mental health impacts
- Algorithmic transparency: explain the main parameters of recommender systems and offer at least one option not based on profiling
- Prohibition on targeted advertising to known minors
- Researcher data access requirements
- Independent annual audits
Compliance & Enforcement
Key Dates
Aug 25, 2023
VLOPs and VLOSEs compliance deadline
Feb 17, 2024
Full application to all in-scope platforms
Member state Digital Services Coordinators operational
Penalties
Up to 6% of global annual turnover
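The penalty cap is a straightforward percentage of worldwide turnover; a minimal sketch (the function name and example turnover figure are illustrative assumptions, not from the source):

```python
def max_dsa_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound on a DSA fine: 6% of worldwide annual turnover (the cap stated above)."""
    return 0.06 * global_annual_turnover_eur

# Hypothetical company with €10B global annual turnover:
print(max_dsa_fine(10_000_000_000))  # 600000000.0  (i.e. up to €600M)
```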
Primary Source
EUR-Lex
https://eur-lex.europa.eu/eli/reg/2022/2065/oj
Cite This
APA
European Union. (2022). Regulation (EU) 2022/2065 (Digital Services Act).
Related Regulations
EU DSA Minors Guidelines
Commission guidelines under DSA Article 28(1) establishing measures for online platforms to protect minors, including age assurance, default privacy settings, anti-addictive design restrictions, recommender system safeguards, and protections against grooming and exploitation.
TCO Regulation
Requires hosting services to remove terrorist content within one hour of receiving a removal order. One of few regulations with real-time moderation mandates.
FR SREN
France's 2024 "digital space" law strengthening national digital regulation and enforcement levers via ARCOM across platform safety and integrity issues.
Ireland OSMR
Establishes Coimisiún na Meán (Media Commission) with binding duties for video-sharing platforms. One of the cleaner examples of explicit self-harm/suicide/eating-disorder content duties in platform governance.
Italy AI Act
The first comprehensive national AI law from an EU member state, complementing the EU AI Act. 28 articles covering AI governance principles, sector-specific rules for healthcare, employment, justice, and public administration, criminal provisions, copyright protections, and a EUR 1 billion AI investment fund.
DE JuSchG §24a (KidD)
Requires providers of certain telemedia services to implement provider-side precautionary measures ("Vorsorgemaßnahmen") with regulator-facing evaluability via published BzKJ criteria.
Last updated February 17, 2026. Verify against primary sources before relying on this information.