AU Social Media Age Ban
Online Safety Amendment (Social Media Minimum Age) Act 2024
The world's first national social media minimum-age law. Platforms must take reasonable steps to prevent under-16s from holding accounts. Implementation depends on maturing age assurance technology.
Jurisdiction: Australia (AU)
Enacted: Dec 10, 2024
Effective: Dec 10, 2025
Enforcement: eSafety Commissioner
Who Must Comply
This law applies to:
- Social media platforms

Applicability thresholds:
- Users under 16 years old
- Age verification required for access

Who bears obligations:
- Platforms (not children or their parents)
Safety Provisions
- Platforms must take reasonable steps to prevent under-16s from holding accounts
- Age assurance requirements (specific technology still to be determined)
- Enforcement discretion during the rollout period
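The core obligation turns on a single date comparison: has the user reached their 16th birthday on a given day? As a minimal illustration of that cutoff only (not of any age assurance method the Act or eSafety Commissioner mandates, and assuming a self-declared date of birth), the check might be sketched as:

```python
from datetime import date

MINIMUM_AGE = 16  # per the Act: under-16s must not hold accounts


def is_under_minimum_age(date_of_birth: date, on_date: date) -> bool:
    """Return True if the person has not yet turned 16 on `on_date`."""
    try:
        sixteenth_birthday = date_of_birth.replace(
            year=date_of_birth.year + MINIMUM_AGE
        )
    except ValueError:
        # Born Feb 29 and the target year is not a leap year:
        # treat Mar 1 as the 16th birthday.
        sixteenth_birthday = date(date_of_birth.year + MINIMUM_AGE, 3, 1)
    return on_date < sixteenth_birthday


# The Act's obligations take effect on 10 Dec 2025.
effective = date(2025, 12, 10)
print(is_under_minimum_age(date(2010, 6, 1), effective))    # True: turns 16 mid-2026
print(is_under_minimum_age(date(2009, 12, 10), effective))  # False: turns 16 that day
```

Note the boundary choice: someone whose 16th birthday falls exactly on the date checked is not "under 16". Real age assurance is much harder than this arithmetic; the open question under the Act is how platforms establish the date of birth in the first place.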
Compliance Timeline
Dec 10, 2025
Law takes effect: platforms must begin age verification processes
Enforcement
Enforced by: eSafety Commissioner
Penalties: Up to A$49.5M for non-compliance
Quick Facts
- Binding: Yes
- Mental Health Focus: No
- Child Safety Focus: Yes
- Algorithmic Scope: No
Why It Matters
The most aggressive child-safety intervention globally to date. If it succeeds, it may prompt similar laws elsewhere; if successfully challenged, it may narrow regulators' options.
Recent Developments
Effective December 2025. Reddit has launched a High Court challenge. Implementation guidance and age assurance standards are still developing.
Cite This
APA
Australia. (2025). Online Safety Amendment (Social Media Minimum Age) Act 2024. Retrieved from https://nope.net/regs/au-social-media-age
BibTeX
@misc{au_social_media_age,
title = {Online Safety Amendment (Social Media Minimum Age) Act 2024},
author = {Australia},
year = {2025},
url = {https://nope.net/regs/au-social-media-age}
}

Related Regulations
AU Privacy Amendment 2024
Strengthens Privacy Act requirements for biometric data collection, raising the standard of conduct for collecting biometric information used for automated verification or identification. Cannot collect such information unless individual has consented and it is reasonably necessary.
AU Online Safety Act
Grants eSafety Commissioner powers to issue removal notices with 24-hour compliance. Basic Online Safety Expectations (BOSE) formalize baseline safety governance requirements.
VT AADC
Vermont design code structured to be more litigation-resistant: focuses on data processing harms rather than content-based restrictions. AG rulemaking authority begins July 2025.
CA AB 489
Prohibits AI systems from using terms, letters, or phrases that falsely indicate or imply possession of a healthcare professional license.
UAE Child Digital Safety Law
UAE federal law establishing comprehensive child digital safety requirements for digital platforms and internet service providers, with extraterritorial reach to foreign platforms targeting UAE users. Requires age verification, privacy-by-default, content filtering, and proactive AI-powered content detection.
SG Online Safety Code
Under the Broadcasting Act framework, requires major social media services to implement systems that reduce exposure to harmful content. Child safety is a key driver.