KOSA
Kids Online Safety Act
Would establish duty of care for platforms regarding minor safety. Passed full Senate 91-3 in July 2024; passed Senate Commerce Committee multiple times (2022, 2023). Not yet enacted.
Jurisdiction: United States
Enacted: Pending
Effective: TBD
Enforcement: TBD
Reintroduced as S.1748 in the 119th Congress (May 2025). The House version (H.R.6484) lacks Democratic support; the bipartisan coalition is fragmenting over removal of the duty-of-care provision.
Why It Matters
Would be the first US federal duty-of-care standard for platforms. The 'duty of care' framing mirrors the UK approach but faces First Amendment concerns.
Recent Developments
Failed to pass in the 118th Congress despite Senate passage (July 2024). Reintroduced May 2025. The House Republican version removed the core duty-of-care standard, causing the bipartisan coalition to fragment.
At a Glance
Who Must Comply
Obligations fall on:
- Online platforms likely to be accessed by minors
Safety Provisions
- Duty of care for minors using covered platforms
- Requirements to prevent and mitigate specific harms
- Parental tools and controls
Compliance & Enforcement
Penalties: Pending regulatory determination
Related Regulations
KIDS Act
Omnibus children's internet safety legislation incorporating the SAFE BOTs Act (AI chatbot safeguards) and AWARE Act (AI education resources). Requires AI chatbot operators to disclose AI status to minors, provide crisis hotline information, and implement break prompts.
COPPA
Baseline US children's data privacy regime. Applies to operators of websites/online services directed to children under 13, and to general-audience services with actual knowledge they collect personal info from under-13 users.
Ofcom Children's Codes
Ofcom codes requiring user-to-user services and search services to protect children from harmful content including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.
UK OSA
One of the most comprehensive platform content moderation regimes globally. Creates specific duties around suicide, self-harm, and eating disorder content for children with 'highly effective' age assurance requirements.
CA SB 1119
Comprehensive companion chatbot children's safety framework establishing mandatory design features, default settings, prohibited conduct, parental controls, independent audit requirements, and a private right of action.
Ireland OSMR
Establishes Coimisiún na Meán (Media Commission) with binding duties for video-sharing platforms. One of the cleaner examples of explicit self-harm/suicide/eating-disorder content duties in platform governance.
Last updated February 11, 2026. Verify against primary sources before relying on this information.