SG Online Safety Code
Code of Practice for Online Safety
Issued under the Broadcasting Act framework, the Code requires designated major social media services to implement systems and processes that reduce users' exposure to harmful content. Child safety is a key driver.
Jurisdiction
Singapore
SG
Enacted
Unknown
Effective
Unknown
Enforcement
Not specified
Harms Addressed
Who Must Comply
This law applies to:
- Major social media services designated under the Act
Safety Provisions
- Systems to reduce harmful content exposure
- Child safety focused measures
- Reporting and transparency requirements
Quick Facts
- Binding: Yes
- Mental Health Focus: No
- Child Safety Focus: Yes
- Algorithmic Scope: No
Why It Matters
Singapore is balancing its status as an innovation hub with increasing safety requirements. Mandatory MAS AI guidelines for the financial sector are forthcoming.
Recent Developments
The government is pursuing amendments to strengthen online safety obligations.
Cite This
APA
Singapore. (n.d.). Code of Practice for Online Safety. Retrieved from https://nope.net/regs/sg-online-safety
BibTeX
@misc{sg_online_safety,
  title  = {Code of Practice for Online Safety},
  author = {Singapore},
  year   = {n.d.},
  url    = {https://nope.net/regs/sg-online-safety}
}
Related Regulations
Malaysia OSA
Requires licensed platforms to implement content moderation systems, child-specific safeguards, and submit Online Safety Plans. Nine categories of harmful content regulated.
SG MAS AI Governance
First mandatory AI governance requirements in Singapore, shifting from voluntary Model AI Governance Framework to binding obligations for financial sector. Establishes three mandatory focus areas: oversight and governance, risk management systems, and development/validation/deployment protocols.
SG GenAI Gov
Singapore's GenAI-specific guidance: risks (hallucinations, harmful outputs, IP/provenance, misuse) and operational controls (evaluation, transparency, policies, incident response).
AU Online Safety Act
Grants eSafety Commissioner powers to issue removal notices with 24-hour compliance. Basic Online Safety Expectations (BOSE) formalize baseline safety governance requirements.
UK OSA
One of the most comprehensive platform content moderation regimes globally. Creates specific duties around suicide, self-harm, and eating disorder content for children with 'highly effective' age assurance requirements.
AU Social Media Age Ban
World's first social media minimum age law. Platforms must prevent under-16s from holding accounts. Implementation depends on age assurance technology.