KOSA
Kids Online Safety Act
Would establish a duty of care for platforms regarding minor safety. Passed the full Senate 91-3 in July 2024 and cleared the Senate Commerce Committee multiple times (2022, 2023). Not yet enacted.
Jurisdiction: United States
Enacted: Pending
Effective: TBD
Enforcement: TBD
Reintroduced as S.1748 in the 119th Congress (May 2025). The House version (H.R.6484) lacks Democratic support; the bipartisan coalition is fragmenting over removal of the duty-of-care standard.
Why It Matters
Would create the first US federal duty of care for platforms. The 'duty of care' framing mirrors the UK approach but faces First Amendment concerns.
Recent Developments
Failed to pass in the 118th Congress despite Senate passage (July 2024). Reintroduced May 2025. The House Republican version removed the core duty-of-care standard, causing the bipartisan coalition to fragment.
At a Glance
Who Must Comply
Obligations fall on:
- Online platforms likely to be accessed by minors
Safety Provisions
- Duty of care for minors using covered platforms
- Requirements to prevent and mitigate specific harms
- Parental tools and controls
Compliance & Enforcement
Penalties
Penalties pending regulatory determination
Compliance Help
Would require systems to prevent and mitigate harms to minors. Likely similar to the UK OSA in requiring proactive detection rather than reactive moderation; parental controls alone would not satisfy the duty of care.
Cite This
APA
United States. (n.d.). Kids Online Safety Act.
Related Regulations
COPPA
Baseline US children's data privacy regime. Applies to operators of websites/online services directed to children under 13, and to general-audience services with actual knowledge they collect personal info from under-13 users.
COPPA 2.0
Would expand COPPA-style protections to teens (13-16) and add stronger constraints including limits on targeted advertising to minors. Often paired politically with KOSA.
Ofcom Children's Codes
Ofcom codes requiring user-to-user services and search services to protect children from harmful content including suicide, self-harm, and eating disorder content. Explicitly covers AI chatbots that enable content sharing between users. Requires detection technology, content moderation, and recommender system controls.
UK OSA
One of the most comprehensive platform content moderation regimes globally. Creates specific duties around suicide, self-harm, and eating disorder content for children with 'highly effective' age assurance requirements.
Ireland OSMR
Establishes Coimisiún na Meán (Media Commission) with binding duties for video-sharing platforms. One of the cleaner examples of explicit self-harm/suicide/eating-disorder content duties in platform governance.
FL Companion Chatbot Act
Regulates companion AI chatbots with emphasis on self-harm prevention and crisis intervention. Requires suicide/self-harm detection protocols, 988 crisis referrals, prohibition on chatbots discussing self-harm with users, and annual reporting on crisis interventions. Includes minor-specific protections including AI disclosure, break reminders, and prohibition on sexually explicit content.
Last updated February 11, 2026. Verify against primary sources before relying on this information.