GUARD Act (S.3062)
Would require age verification, disclosures, and broader child protections for AI chatbots. Part of an emerging federal focus on companion AI safety for minors.
Jurisdiction: United States
Enacted: Pending
Effective: TBD
Enforcement: TBD
Legislature: 119th Congress
Source: Congress.gov

Why It Matters
A federal attempt to address companion AI and minors. If enacted, it would create a national baseline for AI chatbot child safety.
Who Must Comply
Obligations fall on:
- AI chatbot operators
Safety Provisions
- Age verification requirements for AI chatbots
- Disclosure obligations
- Child protection provisions
Related Regulations
COPPA
Baseline US children's data privacy regime. Applies to operators of websites/online services directed to children under 13, and to general-audience services with actual knowledge they collect personal info from under-13 users.
COPPA 2.0
Would expand COPPA-style protections to teens (ages 13-16) and add stronger constraints, including limits on targeted advertising to minors. Often paired politically with KOSA.
China Minor Platform Identification Measures
Establishes quantified thresholds and assessment criteria for identifying internet platforms with massive minor user bases or significant impact on minors. Specifies identification procedures and delisting rules for platforms that no longer meet criteria. Platforms meeting thresholds face enhanced obligations for minor protection.
OH K-12 AI Mandate
First-in-nation mandate requiring all Ohio K-12 public schools to adopt formal AI usage policies by July 1, 2026. Ohio Department of Education and Workforce released model policy on December 30, 2025 covering academic integrity, procurement/privacy, and anti-bullying. Districts can adopt state model or create their own aligned policy.
SG Online Safety Code
Under Singapore's Broadcasting Act framework, requires major social media services to implement systems that reduce exposure to harmful content. Child safety is a key driver.
AVMSD
EU directive setting baseline safety and minor-protection duties for audiovisual media services and video-sharing platform services, including measures to protect minors from harmful content.
Last updated January 22, 2026. Verify against primary sources before relying on this information.