Ireland OSMR
Online Safety and Media Regulation Act 2022 + Online Safety Code
Establishes Coimisiún na Meán (Media Commission) with binding duties for video-sharing platforms. One of the clearest examples in platform governance of explicit duties covering self-harm, suicide, and eating-disorder content.
Jurisdiction
Ireland
Enacted
Dec 10, 2022
Effective
Mar 15, 2023
Enforcement
Coimisiún na Meán (Media Commission)
Why It Matters
Binding example of platform duties explicitly covering self-harm, suicide, and eating disorders, implementing the revised EU Audiovisual Media Services Directive for video-sharing platforms. As the EU base for many major platforms, Ireland is a key jurisdiction for tech platform regulation.
At a Glance
Applies to: video-sharing platform services established in or serving Ireland
Harms addressed: self-harm, suicide, and eating disorder content
Who Must Comply
Obligations fall on:
- Video-sharing platform services established in or serving Ireland
Safety Provisions
- Online Safety Code for video-sharing platforms
- Explicit duties regarding self-harm, suicide, and eating disorder content
- Age verification and parental control requirements
- Complaints handling and user empowerment measures
- Systemic risk assessments
Compliance & Enforcement
Key Dates
Mar 15, 2023
Online Safety and Media Regulation Act commenced
Oct 21, 2024
First Online Safety Code published
Nov 18, 2024
Part A effective: VSPs must comply with core obligations
Jul 21, 2025
Part B effective: detailed rules for harmful content categories
Penalties
Up to €20 million or 10% of annual turnover, whichever is higher. Coimisiún na Meán can also seek prosecution of senior management, block access to services, and issue content limitation notices.
Cite This
APA
Ireland. (2022). Online Safety and Media Regulation Act 2022 + Online Safety Code.
Related Regulations
UK OSA
One of the most comprehensive platform content moderation regimes globally. Creates specific duties around suicide, self-harm, and eating disorder content for children with 'highly effective' age assurance requirements.
AU Online Safety Act
Grants eSafety Commissioner powers to issue removal notices with 24-hour compliance. Basic Online Safety Expectations (BOSE) formalize baseline safety governance requirements.
C-63
Canada's Online Harms Act bill. Would have established a Digital Safety Commission with platform duties for seven harmful content categories, including content inducing children to harm themselves, and required 24-hour CSAM takedown.
Ofcom Children's Codes
Ofcom codes requiring user-to-user services and search services to protect children from harmful content including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.
Brazil ECA Digital
Comprehensive child digital safety law applying to any IT product or service directed at or likely to be accessed by minors in Brazil, with extraterritorial reach.
Ireland AI Bill 2026
Ireland's national implementation of the EU AI Act under a distributed model, designating 15 existing sectoral Market Surveillance Authorities (MSAs) and establishing a coordinating AI Office.
Last updated February 17, 2026. Verify against primary sources before relying on this information.