Kids and Teens Privacy: 2025 Look Back and 2026 Predictions – Part II: The State Privacy Patchwork
As we discussed in Part 1 of this series, while the Children’s Online Privacy Protection Act (COPPA) remains the primary federal law protecting children’s online privacy, there is a growing patchwork of state laws aimed at protecting both children and teens online. These laws identify a variety of potential harms, but many of them expand the universe of businesses subject to “children’s privacy” obligations and greatly complicate compliance for those businesses.
State laws tend to fall into one or more of the following categories: age-appropriate design codes; age verification; health-based warning labels for social media platforms; AI companion chatbot disclosures; and restrictions on so-called “dark patterns” that are said to increase the time children spend online or their exposure to potentially harmful content. Some laws apply to a variety of online businesses, while others more narrowly apply to social media platforms.
It is no secret to anyone who follows the state privacy law landscape that these laws create constitutional and other legal concerns. The industry trade group NetChoice, which represents major internet and social media companies, has filed a variety of challenges to myriad state kids’ privacy laws, including those that fall into the categories identified above. These challenges further complicate the compliance landscape, as they often result in judicial actions that stay, strike down, or limit at least portions of the laws.
In this second part of our two-part series, we briefly review the state landscape and provide some predictions for 2026.
Overview of State Laws and Industry Legal Challenges
Age-Appropriate Design Codes
Several states have enacted laws that essentially require a large swath of businesses to operate in a “safe for minors” mode, with California being the first. The California Age-Appropriate Design Code Act (CAADCA), styled after the UK’s Age Appropriate Design Code, was enacted in 2022, but legal challenges have delayed enforcement. The CAADCA applies to “businesses” (as defined in the California Consumer Privacy Act (CCPA)) that provide online services, products, or features “likely to be accessed” by children (defined as under age 18), rather than applying the COPPA standard covering online services directed at children under 13 or those with actual knowledge that they collected information from a child under 13. Given how the internet operates, laws applying a “likely to be accessed” standard cover a poorly defined but broad category of businesses. For example, businesses that target general audiences, or offer games, sports, and even e-commerce, may be subject to the CAADCA. Covered businesses must take steps to “ensure the best interests of children when designing, developing, and providing” online services. This includes a bar on behavioral or targeted advertising, a potentially crippling problem for websites and online services engaged primarily in e-commerce. Covered businesses are also required to perform a data protection impact assessment (DPIA).
Other states have enacted laws similar to the CAADCA, but some key differences are the age of a “child” (some laws follow COPPA’s under-13 standard, while others use CAADCA’s under-18 standard) and the standard for determining if an online service is directed to “children,” however defined. Like the CAADCA, the Maryland Age-Appropriate Design Code (Maryland Kids Code), effective October 1, 2024, defines a “child” as under 18. Similarly, the Vermont Age-Appropriate Design Code Act (Vermont AADC), which was enacted June 12, 2025, and will take effect on January 1, 2027, has an under-18 age threshold for a “minor” and imposes a requirement for covered businesses to “configure all default privacy settings provided to a covered minor through the online service, product, or feature to the highest level of privacy.” Likewise, a “minor” is a “child who is younger than 18 years of age” under the Texas Securing Children Online through Parental Empowerment (SCOPE) Act. By contrast, the Nebraska Age-Appropriate Online Design Code Act (Nebraska AADC), enacted May 30, 2025, and the just-enacted South Carolina Age-Appropriate Design Code (South Carolina AADC), signed into law on February 5, 2026, use COPPA’s under-13 standard to define “child.” Both laws adopt an actual knowledge standard, in line with COPPA.
As we wrote here, NetChoice frequently disputes the constitutionality of laws that purport to impose liability for system design features that are “shown to increase use of or engagement with an online service, product, or feature beyond what is reasonably expected.” To date, neither the Vermont AADC nor the Nebraska AADC has been challenged in court. However, NetChoice sent a letter opposing the bill to the Nebraska state Banking, Commerce, and Insurance Committee on February 3, 2025, and then a veto request to Governor Pillen on May 30, 2025, contending that the Nebraska AADC forces websites “to over-censor content in order to avoid being penalized under the law’s vague concept of what might be harmful to minors.” Most recently, NetChoice brought suit challenging the constitutionality of the South Carolina AADC just days after it was signed into law.
The Maryland Kids Code is also the subject of ongoing litigation. That law requires online companies to undertake a DPIA for any online service, product, or feature reasonably likely to be accessed by children, and prohibits businesses from processing the personal data of children “in a way inconsistent with the children’s best interests.” On February 3, 2025, NetChoice challenged the Maryland Kids Code and used that very phrase as a key point, arguing that requiring websites to act in the “best interests of children” essentially turns websites into self-censors. In the ongoing lawsuit, NetChoice contends that the Maryland Kids Code requires the same type of DPIA that the Ninth Circuit found was unconstitutional in NetChoice’s earlier challenge to the CAADCA. NetChoice also argues that (1) the law is preempted by COPPA and Section 230 of the Communications Decency Act (CDA), and (2) COPPA’s actual knowledge standard preempts the Maryland Kids Code’s vaguer provisions. On November 24, 2025, a federal judge in Maryland denied the state’s motion to dismiss NetChoice’s complaint, and the case will proceed.
Meanwhile, on October 1, 2025, several children’s privacy-related amendments to the Colorado Privacy Act and the Montana Consumer Data Privacy Act went into effect. Neither law mandates age verification for minors, but both require businesses/controllers to address a “heightened risk of harm” to minors and conduct a DPIA. Unlike the CAADCA’s “likely to be accessed” standard, these laws apply to controllers who know they are dealing with a minor or willfully disregard information on age. The most controversial aspect of the Colorado amendments is a provision barring “use of any system design feature to significantly increase, sustain, or extend a minor’s use of an online service, product, or feature.” Content creators want to offer engaging content, but broad legislative restrictions barring “any” system design feature that might “significantly” increase a minor’s use create a barrier to achieving that goal. While reflecting concerns about the adverse impact “dark patterns” may have on kids and teens, these broad restrictions seem ripe for a constitutional challenge. At present, no legal challenge has been filed.
Age Verification
The U.S. Supreme Court’s 2025 decision in Free Speech Coalition, Inc. v. Paxton, which held that the age verification mandate under Texas HB 1181 could be valid when a state was restricting minors’ access to pornographic content, set the stage for enactment of age verification laws in other states. In 2025, Ohio, Missouri, and Arizona followed the Texas approach, enacting legislation that requires age verification to prevent minors from accessing obscene materials online. Anti-pornography laws mandating age verification went into effect in Arizona and Ohio on September 26, 2025, and September 30, 2025, respectively. Arizona’s law requires businesses that “knowingly and intentionally publish or distribute material on a website” where “one-third is sexual material that is harmful to minors” to use age verification to establish that the person attempting access is over 18. Ohio’s law requires age verification from businesses “presenting any material or performance that is obscene or harmful to juveniles on the internet.” Because it goes beyond limiting access to pornography or content illegal for minors, Ohio’s requirement of age verification before any minor may access “harmful” content raises a host of questions and may result in a determination that the law is overbroad. Finally, Missouri’s anti-pornography law went into effect on November 30, 2025, requiring age verification where a website “contains a substantial portion of material pornographic for minors.” These three age verification laws have not been challenged thus far, although on June 16, 2025, NetChoice wrote to the Missouri attorney general (AG), outlining the organization’s concerns about the legislation, so it would not be surprising if the tech group eventually files suit.
App stores and apps themselves are a newer area of focus for state legislators seeking to limit access to potentially harmful or inappropriate content by kids and teens. In California, the Digital Age Assurance Act (enacted October 13, 2025) requires an “operating system provider” (meaning a person or entity that develops, licenses, or controls the operating system software on a computer, mobile device, or any other general purpose computing device) to provide an accessible interface at account setup requiring age indication (including date of birth, age, or both). The provider must then transmit a digital signal designating the user’s age bracket to apps sold in covered app stores.
Several other states also passed laws that require app stores to conduct age verification and transmit an age signal to app developers. These include Utah’s App Store Accountability Act (SB 142) (enacted March 26, 2025), Texas’ App Store Accountability Act (SB 2420) (enacted May 27, 2025), Louisiana’s Act 481 (enacted June 30, 2025), and most recently, Alabama’s App Store Accountability Act (HB 161) (enacted February 17, 2026). However, on December 23, 2025, a federal court in Texas granted a motion for a preliminary injunction enjoining enforcement of the Texas App Store Accountability Act, which was due to begin January 1, 2026. NetChoice also challenged an older Louisiana law, Act 456, on March 18, 2025. The tech group alleged that the Act’s requirements to verify the ages of all Louisiana account holders and ban targeted advertising to minors violated the First Amendment. On December 15, 2025, a federal Louisiana court granted NetChoice summary judgment and permanently enjoined Act 456, but Act 481 remains unchallenged, with an effective date of July 1, 2026.
Though it has become uncommon for NetChoice to lose state law challenges, in a rare judicial setback, on November 25, 2025, the Eleventh Circuit stayed a preliminary injunction that NetChoice and the Computer and Communications Industry Association (CCIA) had won against Florida HB 3. The case is still ongoing, as the Eleventh Circuit has yet to rule on the merits. NetChoice and CCIA argue that HB 3, which bans anyone under age 14 from creating or holding an account on certain “social media platforms,” and requires 14- and 15-year-olds to obtain a parent’s consent before opening such an account, violates the First Amendment. Oral argument is scheduled for March 10, 2026.
Social Media Warning Labels
California’s Protecting Our Kids from Social Media Addiction Act (AB 56) (enacted October 13, 2025, the same day that the California app age verification law was enacted) requires covered platforms to post the following warning: “The Surgeon General has warned that while social media may have benefits for some young users, social media is associated with significant mental health harms and has not been proven safe for young users.” The warning must be posted the first time a user accesses the social media platform each day, again after three hours of cumulative active use, and thereafter at least once per hour of cumulative active use. A similar Minnesota law requires a health warning to appear each time a user accesses the social media platform; the warning disappears only when the user logs off or agrees to proceed at their own risk. Minnesota’s commissioners of health and commerce are charged with developing guidelines for warning labels by March 1, 2026.
AI Companion Chatbots
California SB 243 (enacted October 13, 2025) requires operators of a companion chatbot platform to issue a clear and conspicuous notice stating that the companion chatbot is artificially generated and not human. Operators must also maintain a protocol for detecting, removing, and responding to instances of suicidal ideation by users. New York’s AI Companion Models Law (effective November 5, 2025) imposes similar obligations. While not directed at companion chatbots, Utah HB 452 (enacted March 25, 2025) requires mental health chatbots to reveal they are not human therapists and prohibits suppliers of mental health chatbots from selling or sharing with third parties any individually identifiable health information or user input of Utah users.
Dark Patterns and Targeted Advertising to Children
Connecticut, Georgia, and Louisiana laws that, among other things, restrict targeted advertising to “children,” all went into effect July 1, 2025. Connecticut’s amendments (Substitute SB 1295 Public Act No. 25-113) to the Connecticut Data Privacy Act (CTDPA) ban the sale of minors’ personal data and prohibit targeted advertising to anyone under 18. The amendments also require DPIAs for businesses “profiling” minors. Other changes include an expanded definition of “heightened risk to minors” to include harassment, violence, and exploitation. The Protecting Georgia’s Children on Social Media Act of 2024 (Georgia SB 351) requires express parental consent for a minor to open a social media account and prohibits advertising to minors “based on such minor account holder's personal information, except age and location.” Under Louisiana HB 577, social media platforms with more than 1 million users are prohibited from collecting personal data to use for targeted advertising to minors (under 18). Oregon’s HB 2008, which went into effect on January 1, 2026, imposes a ban on targeted advertising where the controller has actual knowledge or willfully disregards that a consumer is 13-15 years old.
Arkansas Act 901 (SB 612), enacted April 22, 2025, bans companies from showing minors (under 16) targeted ads and attempts to protect minors from designs, algorithms, or features that are addictive or otherwise harmful. NetChoice filed a complaint on June 27, 2025, alleging that Act 901 was unconstitutional and preempted by Section 230 of the CDA. A federal court granted NetChoice’s request for a preliminary injunction of Act 901 on December 15, 2025.
State Enforcement and Litigation
AG Actions and Enforcement
State AGs have often pursued an aggressive agenda in enforcing children’s privacy, but actions by California AG Rob Bonta and Texas AG Ken Paxton stood out for the breadth of their enforcement initiatives in 2025.
California
On October 30, 2025, AG Bonta announced a $530,000 settlement with Sling TV LLC and Dish Media Sales LLC (Sling TV), a streaming service, resolving allegations that the company violated the CCPA by failing to provide an easy-to-use method for consumers to stop the sale of their personal information and failing to provide sufficient privacy protections for children. This is just one of several cases brought by Bonta last year over alleged violations of the CCPA, but it does not allege violations of a specific children’s privacy law. Instead, the case stemmed from an investigative sweep announced in January 2024 by the California Department of Justice (DOJ), which focused on the compliance of streaming services and connected TVs with the CCPA’s right to opt-out. Under the settlement, in addition to paying the civil penalty and implementing changes to ensure compliance with the CCPA, Sling TV agreed to provide parents with clear disclosures and tools to minimize collection and use of their children’s data.
On November 6, 2025, AG Bonta, along with Connecticut AG William Tong and New York AG Letitia James, announced a $5.1 million settlement with educational technology company Illuminate Education, Inc. (Illuminate) for alleged failure to protect students’ data. In 2021, the company experienced a data breach that exposed students’ sensitive personal and medical information. In addition to paying the civil penalty, with $3.25 million going to California alone, the company agreed to: implement appropriate access control and account management; implement appropriate real-time monitoring and alerts for suspicious access and activity; put in place appropriate safeguards to protect backup databases; inform California DOJ of breaches involving student data; and provide reminders to school districts that they should perform a review of the student data stored by Illuminate. In this case as well, no violation of a specific children’s privacy law was alleged.
On November 21, 2025, AG Bonta announced a $1.4 million settlement with game app developer Jam City, Inc. According to the AG’s complaint, Jam City allegedly sold and shared the personal information of users between 13 and 16 without their permission and failed to provide them with a means of opting out of the sale or sharing of their personal information, in violation of the CCPA. In addition to paying the $1.4 million penalty, Jam City must offer in-app methods for users to opt out of the sale or sharing of their data and must not sell or share the personal information of consumers who are at least 13 and less than 16 years old without their affirmative opt-in consent. Here, again, no violation of a specific children’s privacy law was alleged.
Separate from formal enforcement actions, AG Bonta has been active in sending informal warning letters, specifically to AI companies. In August 2025, AG Bonta joined 44 AGs and sent a letter to 12 of the top AI companies after reports of sexually inappropriate interactions between AI chatbots and children. The letter informs recipients that they are being monitored by state AGs to ensure that the companies are protecting children. The following month, AG Bonta and Delaware AG Kathy Jennings sent a separate letter to OpenAI expressing concerns over increased reports of how OpenAI’s products may be harming children. OpenAI, which also received the August 2025 letter, met with the two AGs in September 2025.
Continuing the focus on AI companies this year, on January 14, 2026, AG Bonta announced an investigation into xAI and its AI model, Grok, over explicit images of women and children produced by Grok. The investigation is aimed at determining whether and how xAI violated the law in its creation and dissemination of this material. Two days after announcing the investigation, AG Bonta sent a cease and desist letter to the company, demanding that it take immediate action to stop the creation and distribution of “deepfake nonconsensual intimate images (NCII) or child sexual abuse material (CSAM).” The letter provided the company five days to comply. More recently, at the federal level, Democratic members of the House Energy and Commerce Committee sent their own letter to xAI, demanding answers about the company’s role in generating “abusive, exploitative, and sexually harassing content,” including CSAM.
Texas
On January 9, 2025, AG Paxton announced a lawsuit against social media giant TikTok for allegedly deceptively promoting its app as safe for minors, despite regularly showing inappropriate and explicit material to children. This lawsuit alleged violations of the state’s Deceptive Trade Practices Act (DTPA). A few months later, on October 3, 2025, AG Paxton announced another lawsuit against TikTok, this time for alleged violations of the Texas SCOPE Act, which prohibits digital service providers from sharing, disclosing, or selling a minor’s (under 18) personal identifying information without parental consent. The complaint alleges that TikTok failed to provide parents with tools to control their children’s accounts as required under the Act.
On September 3, 2025, AG Paxton announced he had filed a lawsuit against PowerSchool, a California-based provider of cloud-based services for K-12 schools, after a data breach allegedly exposed sensitive personal identifying information and protected health information of more than 880,000 Texas school-aged children and teachers. The lawsuit alleges that the data breach and PowerSchool’s failure to protect this information violated both the DTPA and the state’s Identity Theft Enforcement and Protection Act (ITEPA). Although children’s data is involved, no violation of a specific children’s privacy law is alleged.
On November 6, 2025, AG Paxton filed a lawsuit against Roblox for allegedly violating the DTPA and creating a common nuisance by “engaging in deceptive trade practices, namely, promising parents that its interactive gaming platform was safe for children while Roblox knowingly facilitated the sexual exploitation of teen and preteen children and the distribution of child sexual abuse material.” Here, again, although children’s data is involved, no violation of a specific children’s privacy law is alleged. AGs in Florida, Louisiana, Kentucky, Tennessee, and Iowa have brought similar challenges against Roblox, and AGs in Georgia and South Carolina launched formal investigations into Roblox over child-safety and consumer-protection concerns. The company is the target of many other lawsuits, including several cases that were consolidated into a multidistrict litigation (MDL).
AG Paxton has also opened investigations into online sites using generative AI chatbots. On August 18, 2025, he launched probes into Meta AI Studio and Character.AI over their alleged marketing of themselves as mental health tools. This follows an existing investigation into Character.AI and several other companies for potential violations of the SCOPE Act and the Texas Data Privacy and Security Act. That investigation in turn followed a prior lawsuit, initiated by AG Paxton in 2024, against TikTok for alleged violations of the SCOPE Act and the DTPA.
Most recently, on February 11, 2026, AG Paxton filed a lawsuit against Snap, Inc. for alleged violations of the DTPA and SCOPE Act. The complaint claims that the creators of Snapchat knowingly misrepresented the app’s safety to parents and consumers by promoting it as safe for children and using “12+” age ratings in app stores when the app allegedly exposed users to content that includes drugs, nudity, alcohol, and profanity. The lawsuit also claims that Snap designed its app to be highly addictive and thereby causes harm to young minds. As discussed below, similar private litigation against Snap and others also focuses on the issue of social media addiction.
Private Litigation
While a complete review of private litigation is outside the scope of this summary, it has been an active area.
Roblox, in addition to battling various AG challenges and the MDL, is also defending itself against class actions, including one in which it recently won the right to compel arbitration. But Roblox is hardly the only big tech company under fire from multiple plaintiffs for violating children’s privacy rights. On August 18, 2025, Google and its subsidiary, YouTube, agreed to pay $30 million to settle a class action lawsuit. The parents of 34 children filed the class action lawsuit against Google, YouTube, and several YouTube channel owners, claiming that the use of persistent identifiers to track and collect children’s personal information without parental consent violated multiple state laws and COPPA and amounted to an “unlawful invasion of the right to privacy and reasonable expectation of privacy of millions of children under the age of 13 from July 1, 2013 through September 4, 2019.” The case is currently proceeding against the other parties, including Cartoon Network and DreamWorks.
MDL lawsuits are increasingly common and likely to set important precedents. As noted above, Roblox is currently facing an MDL in the Northern District of California. Meanwhile, an MDL against several social media companies for alleged social media addiction is also underway in the Northern District of California, with similar Judicial Council Coordination Proceedings (JCCP) in the Los Angeles County Superior Court. Days before jury selection was to start in a bellwether JCCP case, TikTok and Snap reached settlements with the plaintiff, leaving Meta and YouTube as the only defendants in that case. However, the settlements do not resolve the roughly 1,000 other consolidated cases in the JCCP or the cases in the MDL. These cases are part of a growing trend of new lawsuits against social media companies based on tort claims such as negligence and product liability rather than on privacy or advertising violations. We will continue to track developments in this area, so stay tuned.
Our Predictions for State Laws in 2026
Laws aimed at protecting children and teens online are clearly popular with both federal and state legislators. While strong preemptive federal legislation would bring uniformity and consistency, whether that goal can be achieved in a manner that respects constitutional norms and avoids a flood of litigation remains unclear. In the interim, we expect states to continue to pass laws that seek to protect kids and teens by imposing “design-code” requirements, age verification obligations, and measures to restrict minors’ access to certain online services, including social media and AI companions, despite significant losses in court. We also expect more restrictions on what content minors can see and to whom businesses advertise, with inevitable lawsuits to follow, as well as more lawsuits grounded in tort theories in response to concerns about the potential for social media to harm children and teens.
While the continuing proliferation of these laws and challenges to them creates ongoing confusion for the regulated community, some questions are likely to be decided by the U.S. Supreme Court in due course. Tracking the state landscape to identify which laws are currently in effect, which have been preliminarily enjoined or stayed, and which have been permanently enjoined is a distinct challenge, and litigation is likely to continue. In the meantime, businesses continue to face confusing and sometimes inconsistent obligations, making it difficult to operationalize business processes.