Ireland OSMR

Online Safety and Media Regulation Act 2022 + Online Safety Code

Establishes Coimisiún na Meán (the Media Commission) and empowers it to impose binding duties on video-sharing platforms. One of the cleaner examples of explicit self-harm, suicide, and eating-disorder content duties in platform governance.

Jurisdiction

Ireland

Enacted

Dec 10, 2022

Effective

Mar 15, 2023

Enforcement

Coimisiún na Meán (Media Commission)

Irish Statute Book

Why It Matters

Binding example of platform duties explicitly covering self-harm, suicide, and eating disorders. The Act transposes the EU's revised Audiovisual Media Services Directive, and Coimisiún na Meán also serves as Ireland's Digital Services Coordinator under the DSA. Ireland is a key EU jurisdiction for platform regulation because many major platforms base their EU operations there.

At a Glance

Applies to

Social Platform, Online Platform

Who Must Comply

  • Video-sharing platform services (VSPs) established in or serving Ireland

Safety Provisions

  • Online Safety Code for video-sharing platforms
  • Explicit duties regarding self-harm, suicide, and eating disorder content
  • Age verification and parental control requirements
  • Complaints handling and user empowerment measures
  • Systemic risk assessments

Compliance & Enforcement

Key Dates

Mar 15, 2023

Online Safety and Media Regulation Act commenced

Oct 21, 2024

First Online Safety Code published

Nov 18, 2024

Part A effective: VSPs must comply with core obligations

Jul 21, 2025

Part B effective: detailed rules for harmful content categories

Penalties

Up to €20 million or 10% of annual turnover (whichever is higher). Coimisiún na Meán can also seek prosecution of senior management, block access to services, and issue content limitation notices.

Focus Areas

Mental health & crisis
Child safety
Active safeguards required

Compliance Help

Requires systems to protect users from harmful content (including self-harm, suicide, and eating-disorder material), age verification, transparent complaints handling, and risk assessments.

Cite This

APA

Ireland. (2022). Online Safety and Media Regulation Act 2022; Coimisiún na Meán. (2024). Online Safety Code.

Related Regulations

In Effect UK

UK OSA

One of the most comprehensive platform content moderation regimes globally. Creates specific duties around suicide, self-harm, and eating disorder content for children with 'highly effective' age assurance requirements.

In Effect AU

AU Online Safety Act

Grants eSafety Commissioner powers to issue removal notices with 24-hour compliance. Basic Online Safety Expectations (BOSE) formalize baseline safety governance requirements.

Failed CA

C-63

Would have established Digital Safety Commission with platform duties for seven harmful content categories including content inducing children to harm themselves. Required 24-hour CSAM takedown.

In Effect UK

Ofcom Children's Codes

Ofcom codes requiring user-to-user services and search services to protect children from harmful content including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.

Proposed US-CA

CA AI Child Safety Ballot

Comprehensive child AI safety ballot initiative by Common Sense Media. Expands companion chatbot definitions, raises age threshold for data sale consent, prohibits certain AI products for children, establishes new state regulatory structure. Allows state and private lawsuits, requires AI literacy in curriculum, mandates school device bans during instruction, creates children's AI safety fund.

Pending US

KOSA

Would establish a duty of care for platforms regarding minors' safety. Passed the full Senate 91-3 in July 2024; passed the Senate Commerce Committee multiple times (2022, 2023). Not yet enacted.

Last updated February 17, 2026. Verify against primary sources before relying on this information.