SAFE for Kids Act | New York

In 2024, New York enacted the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, a statute aimed at limiting how digital platforms engage users under the age of 18. Although regulators have not yet begun enforcement, the law signals a meaningful shift in how the state approaches youth protection, platform accountability, and product design.

Unlike traditional privacy statutes, the SAFE for Kids Act focuses less on data collection and more on how platforms structure and deliver content. As a result, the law directly targets engagement-driven design features commonly used by social media and digital platforms.

Overview

The SAFE for Kids Act restricts how covered platforms may design and deploy algorithmically curated content feeds for minors. Rather than regulating personal data alone, the statute addresses the operational mechanics of digital products themselves.

Although New York has already enacted the law, enforcement is not expected to begin until 2026 at the earliest. Before then, the New York Attorney General must issue implementing regulations that clarify compliance standards.

Key Provisions

Limits on Algorithmic Feeds for Minors

Under the Act, platforms may not automatically provide users under 18 with algorithmically curated feeds that predict, rank, or amplify content based on prior activity, engagement history, or device data, unless a parent or legal guardian gives verifiable consent.

However, the statute allows limited exceptions, including:

  • Direct user searches
  • Chronological or sequential content from the same creator
  • Private or direct messaging

Through these restrictions, lawmakers aim to reduce design features that encourage prolonged or compulsive engagement by minors.
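
To make the default rule concrete, here is a minimal sketch of how a platform might gate feed selection. The account model, context labels, and function name are all our own invention; the statute prescribes outcomes, not an implementation.

```python
from dataclasses import dataclass

# Hypothetical account model; a real platform would source these fields
# from its own age-verification and consent systems.
@dataclass
class Account:
    age: int
    verified_parental_consent: bool

# Contexts the statute carves out of the algorithmic-feed restriction.
EXEMPT_CONTEXTS = {"direct_search", "single_creator_sequence", "direct_message"}

def choose_feed(account: Account, context: str) -> str:
    """Pick a feed type consistent with the Act's default rule."""
    if context in EXEMPT_CONTEXTS:
        # Search results, sequential content from one creator, and
        # private messaging fall outside the restriction.
        return "unrestricted"
    if account.age < 18 and not account.verified_parental_consent:
        # Default for minors without verifiable parental consent:
        # no personalization based on prior activity or device data.
        return "chronological"
    return "algorithmic"
```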

Overnight Notification Restrictions

In addition, platforms may not send push notifications tied to algorithmic feeds to minors between 12:00 a.m. and 6:00 a.m., unless a parent affirmatively opts in. This provision specifically targets late-night engagement practices that lawmakers view as harmful to sleep and well-being.
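
One way a platform might enforce this window is a pre-send check like the sketch below. The function name, parameters, and the choice to use the user's own timezone are all assumptions, since the statute does not specify whose clock controls.

```python
from datetime import datetime, time
from zoneinfo import ZoneInfo

QUIET_START = time(0, 0)  # 12:00 a.m.
QUIET_END = time(6, 0)    # 6:00 a.m.

def may_send_feed_notification(is_minor: bool, parent_opted_in: bool,
                               user_tz: str, now: datetime) -> bool:
    """Gate feed-related push notifications during the overnight window."""
    if not is_minor or parent_opted_in:
        # Adults, and minors whose parents affirmatively opted in,
        # are unaffected by the restriction.
        return True
    local = now.astimezone(ZoneInfo(user_tz)).time()
    # The 12:00 a.m.-6:00 a.m. window does not cross midnight,
    # so a simple range check suffices.
    return not (QUIET_START <= local < QUIET_END)
```

The timezone question (the user's device, a registered home timezone, or New York time) is exactly the kind of detail the implementing regulations will presumably settle.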

Age Verification & Parental Consent

Covered platforms must implement reasonable and technically feasible methods to verify user age and obtain verifiable parental consent when required. Importantly, companies must delete any data collected solely for age verification or consent purposes once it is no longer needed. Because of this requirement, platforms must carefully balance compliance with privacy and data-minimization obligations.
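
As a rough illustration of the deletion requirement, the sketch below (hypothetical names throughout, and a deliberately strict reading of "no longer needed") keeps only the verification outcome and immediately discards the underlying evidence:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class VerificationRecord:
    user_id: str
    evidence: Optional[bytes]  # raw artifact collected solely to verify age
    passed: Optional[bool] = None
    completed_at: Optional[datetime] = None

def complete_verification(record: VerificationRecord, passed: bool) -> None:
    """Store only the outcome and immediately drop the underlying evidence."""
    record.passed = passed
    record.completed_at = datetime.now(timezone.utc)
    # Data collected solely for verification must be deleted once it is no
    # longer needed; here that is read as "the moment an outcome is recorded".
    record.evidence = None
    # A production system would also purge secondary copies in object
    # storage, logs, caches, and backups.
```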

Privacy & Anti-Coercion Protections

The Act also prohibits coercive consent practices. Platforms may not reduce functionality, limit access, or impose higher fees if a minor or parent declines consent for algorithmic feeds. In effect, companies cannot penalize users for choosing non-algorithmic experiences.

Enforcement & Penalties

The New York Attorney General holds enforcement authority and may seek civil penalties of up to $5,000 per violation. For platforms with large user bases, these penalties could escalate quickly if compliance failures occur at scale: if, for instance, each affected minor counted as a separate violation, a failure touching 10,000 users could mean up to $50 million in exposure, though how violations will be counted is a question for the implementing regulations.

Enforcement Timeline & Regulatory Guidance

Although the statute is already law, enforcement remains delayed while regulators develop rules addressing:

  • Acceptable age-verification methods
  • Standards for parental consent
  • Technical compliance expectations

Until the Attorney General finalizes this guidance, enforcement cannot begin. Nevertheless, businesses should treat this period as a preparation window rather than a grace period.

The SAFE for Kids Act represents a significant expansion of state oversight into product design and engagement mechanics, not just privacy practices. Accordingly, digital platforms should evaluate several risk areas:

  • Product design: Algorithmic feeds for minors may require redesign or conditional access
  • Compliance infrastructure: Age verification and consent systems must function reliably and lawfully
  • Regulatory exposure: Civil penalties may accumulate rapidly for systemic violations
  • Broader impact: Although a New York statute, the law may influence national design standards as companies seek uniform solutions across jurisdictions

Final Thoughts

New York’s SAFE for Kids Act reflects growing scrutiny of how technology platforms engage younger users. While enforcement remains on the horizon, businesses have a limited window to review product architecture, assess regulatory exposure, and plan for compliance. Early evaluation and thoughtful preparation can help platforms adapt without sacrificing operational flexibility or inviting unnecessary enforcement risk.

This post is for informational purposes only and does not constitute legal advice.
