State Kids’ Privacy Laws Proliferate Despite Legal Challenges

State legislatures have continued to enact privacy laws aimed at protecting kids and teens despite significant, and often successful, legal challenges that largely focus on First Amendment flaws. Some of these laws have recently gone into effect or will become effective soon, while others are not slated to take effect until 2027. The Children’s Online Privacy Protection Act (COPPA) remains the primary federal law protecting children’s online privacy (updated implementing regulations took effect earlier this year, with a compliance deadline of April 22, 2026) and preempts inconsistent state laws. Recent federal legislative efforts to expand children’s and teens’ privacy protections, including a bill introduced this week to regulate minors’ use of AI chatbots and companions, have failed to pass. States, however, continue to pursue their own online privacy laws with the goal of enhancing protections for children and teenagers, particularly around social media use and exposure to AI.

The unabated pace of legislative action reflects rare bipartisan support for protecting kids and teens, and it adds to the growing patchwork of laws that now make up the state privacy landscape. Because these laws do not cover only websites or services “directed to children,” as defined in COPPA, but extend to websites and services that are “likely to be accessed” by children, they often effectively regulate businesses whose audiences are general or even largely adult. Many of these laws are therefore being challenged as overbroad, unconstitutional restrictions on speech.

We review recent developments affecting children’s privacy, and potentially the broader online privacy landscape, including current and likely challenges on the horizon.

California: Social Media Warning Labels, AI Chatbots, and Age Signaling
California was the first state to adopt a version of the United Kingdom’s (UK) Age Appropriate Design Code in the form of AB 2273, the California Age-Appropriate Design Code Act (CAADCA), signed on September 15, 2022. NetChoice, a trade association representing online businesses, challenged the law, and its enforcement is currently enjoined: in a 2023 ruling, a federal district court granted a preliminary injunction, largely on First Amendment grounds, but declined to rule on a COPPA preemption question. While that legal battle, now in the Ninth Circuit, continues to play out, the California legislature has not stopped its efforts to protect minors. Most recently, in October 2025, the governor signed a number of new privacy laws, including several addressing perceived harms to kids and teens associated with social media use and AI.

California’s AB 56 requires covered platforms, as defined under the law, to display a periodic warning to users aged 17 or younger, stating: “The Surgeon General has warned that while social media may have benefits for some young users, social media is associated with significant mental health harms and has not been proven safe for young users.” AB 56 goes into effect in January 2027. Also beginning in January 2027, AB 1043 will require app stores and operating system providers to send age verification signals, via an accessible interface presented at account setup, that identify the age or age range of the device user. And beginning July 2027, SB 243 will require a disclosure to users known to be minors that an AI companion chatbot is not human, plus periodic reminders to take breaks. The bill also requires the development and implementation of protocols to prevent suicide and self-harm, as well as measures to protect minors from sexually explicit content.

It would not be surprising to see a legal challenge to the compelled speech aspects of some of these bills, as both NetChoice and the Computer & Communications Industry Association (CCIA) expressed concerns before the bills were passed. For example, in a letter sent last month to California Governor Gavin Newsom, NetChoice pointed to similarities between AB 56 and a Colorado law that the trade association is already challenging.

Ohio, Arizona, and Missouri: Age Verification for Adult Content
Many other states are passing laws more narrowly focused on age verification. As a recent example, an Ohio law that went into effect September 30, 2025, requires certain organizations “presenting any material or performance that is obscene or harmful to juveniles on the internet” to verify that a person attempting to access the material or performance is 18 or older. Ohio joined at least 23 other states with similar age verification laws aimed primarily at preventing minors’ access to pornography. A similar law went into effect September 26, 2025, in Arizona, and Missouri’s pornography age verification regulation, adopted under its Merchandising Practices Act, will go into effect at the end of November 2025.

The Free Speech Coalition, a trade association for the adult-entertainment industry, has previously challenged age verification requirements on First Amendment grounds, including a Texas law. This summer, however, the Supreme Court made clear in Free Speech Coalition, Inc. v. Paxton that “[t]he First Amendment leaves undisturbed States’ traditional power to prevent minors from accessing speech that is obscene from their perspective. That power includes the power to require proof of age before an individual can access such speech. It follows that no person—adult or child—has a First Amendment right to access such speech without first submitting proof of age.”

With the Supreme Court’s ruling in that case, it appears settled that states may require age verification to restrict minors’ access to material that is “obscene only to minors.” Ohio Attorney General Dave Yost is already monitoring compliance with that state’s recently effective law. His office released a statement on October 8, 2025, indicating that Notices of Violation (NOVs) were being sent to noncompliant companies, warning of legal action if platforms fail to comply within 45 days.

Colorado and Montana: Dark Patterns
On October 1, 2025, several children’s privacy-related amendments to the Colorado Privacy Act and the Montana Consumer Data Privacy Act went into effect. Neither law requires age verification for minors, but both impose obligations on controllers to address a “heightened risk of harm” to minors. Unlike the CAADCA’s “likely to be accessed” standard, these laws apply to controllers who know they are dealing with a minor or who willfully disregard information about a user’s age. The most controversial aspect of the Colorado law is the provision barring “use of any system design feature to significantly increase, sustain, or extend a minor’s use of an online service, product, or feature.”

As part of the public comment period on the rulemaking to clarify the scope of the Colorado amendments, NetChoice submitted comments indicating that it would challenge the amendments, pointing to past successes in challenging similar laws and regulations, including the CAADCA. NetChoice takes issue with laws that purport to impose liability for system design features that are “shown to increase use of or engagement with an online service, product, or feature beyond what is reasonably expected,” arguing that this type of regulation is vague, punishes success and innovation, and “seeks to regulate expressive conduct and editorial discretion protected by the First Amendment.”

The Montana amendments establish rights similar to those under the Colorado model and may also draw future challenges.

Maryland Online Data Privacy Act
The Maryland Online Data Privacy Act also went into effect on October 1. It is a comprehensive data privacy law, but it also categorizes “personal data of a consumer that the controller knows or has reason to know is a child” as one of several types of “sensitive data.” As of this writing, we are not aware of a challenge to this Maryland law, but NetChoice is currently suing to prevent implementation of H.B. 603, the Maryland Age-Appropriate Design Code Act (MAADCA). Like other similar state laws, the MAADCA would require for-profit websites that are “reasonably likely to be accessed by children” (defined as anyone under 18) to “ensure the best interests of children when designing, developing, and providing” online services. NetChoice contends that the MAADCA requires the same type of data privacy impact assessment that the Ninth Circuit found likely unconstitutional in NetChoice’s challenge to the CAADCA. It also argues that the law is preempted by COPPA and by § 230 of the Communications Decency Act.

More of the Same?
As society grapples with how to address legitimate worries about the impact of online products and services, including social media and AI, on children and teens, legislators continue to enact imperfect laws that draw challenges on constitutional and other grounds. These laws create vague standards yet impose broad responsibilities to act in “the best interests” of minors should minors “access” a covered site. While child- and teen-directed platforms and social media companies are clear targets, a host of other businesses may not know they could be subject to the laws. For example, would e-commerce platforms built on targeted advertising be viewed as “likely to be accessed” by minors? If so, prohibitions on “profiling” and targeted advertising could wreak havoc on their business models. Regardless, for the time being, states continue to adopt “age-appropriate design code acts” and other legislation intended to protect kids and teens, despite the likelihood that some or all aspects of these laws will be found unconstitutional.