KOSA
Kids Online Safety Act
Would establish a duty of care for platforms regarding minor safety. Passed the full Senate 91-3 in July 2024; passed the Senate Commerce Committee multiple times (2022, 2023). Not yet enacted.
Jurisdiction
United States
Enacted
Unknown
Effective
Unknown
Enforcement
Not specified
Reintroduced as S.1748 in 119th Congress
What It Requires
Harms Addressed
Who Must Comply
This law applies to:
- Online platforms likely to be accessed by minors
Safety Provisions
- Duty of care for minors using covered platforms
- Requirements to prevent and mitigate specific harms
- Parental tools and controls
Quick Facts
- Binding
- No
- Mental Health Focus
- Yes
- Child Safety Focus
- Yes
- Algorithmic Scope
- No
Why It Matters
Would be the first US federal duty-of-care obligation for platforms. The 'duty of care' framing mirrors the UK approach but faces First Amendment concerns.
What You Need to Comply
Would require:
- Systems to prevent and mitigate harms to minors
- Proactive detection rather than reactive moderation, likely similar to the UK OSA
- More than parental controls, which alone would not satisfy the duty of care
Cite This
APA
United States. (n.d.). Kids Online Safety Act. Retrieved from https://nope.net/regs/us-kosa
BibTeX
@misc{us_kosa,
title = {Kids Online Safety Act},
author = {United States},
year = {n.d.},
url = {https://nope.net/regs/us-kosa}
}
Related Regulations
COPPA
Baseline US children's data privacy regime. Applies to operators of websites/online services directed to children under 13, and to general-audience services with actual knowledge they collect personal info from under-13 users.
COPPA 2.0
Would expand COPPA-style protections to teens (13-16) and add stronger constraints including limits on targeted advertising to minors. Often paired politically with KOSA.
CA AI Child Safety Ballot
Comprehensive child AI safety ballot initiative by Common Sense Media. Expands companion chatbot definitions, raises age threshold for data sale consent, prohibits certain AI products for children, establishes new state regulatory structure. Allows state and private lawsuits, requires AI literacy in curriculum, mandates school device bans during instruction, creates children's AI safety fund.
UK OSA
One of the most comprehensive platform content moderation regimes globally. Creates specific duties around suicide, self-harm, and eating disorder content for children with 'highly effective' age assurance requirements.
AU Online Safety Act
Grants eSafety Commissioner powers to issue removal notices with 24-hour compliance. Basic Online Safety Expectations (BOSE) formalize baseline safety governance requirements.
CA SB243
First US law specifically regulating companion chatbots. Uses capabilities-based definition (not intent-based). Requires evidence-based suicide detection, crisis referrals, and published protocols. Two-tier regime: baseline duties for all users, enhanced protections for known minors. Private right of action with $1,000 per violation.