Youth Online Safety in the Crosshairs with NTIA Comment Request and Joint AG Meta Complaint

Children’s and teens’ online privacy and safety – particularly their mental health – continues to be an area of intense scrutiny for lawmakers, regulators, and enforcers. In May 2023, the Biden administration announced the creation of a new task force focused on the safety, privacy, and wellbeing of children online, linked to an Advisory on Social Media and Youth Mental Health issued by the U.S. Surgeon General the same day. The task force is slated to produce, by Spring 2024, voluntary guidance, policy recommendations, and a toolkit on safety, health, and privacy-by-design for industry developing digital products and services. As part of this initiative, the National Telecommunications and Information Administration (NTIA) of the Department of Commerce (DOC) published a Request for Comments (RFC) in the Federal Register on October 10, 2023. The RFC seeks public feedback on the best ways to protect the mental health, safety, and privacy of minors online, now characterized by the Surgeon General as an urgent public health issue.

But there’s more. Proving that red states and blue states can agree on some issues, a bipartisan group of state attorneys general (AGs) filed a federal lawsuit against social media giant Meta Platforms, Inc. (Meta) and other Meta entities on October 24, 2023, while nine other AGs filed complaints in their own states. The complaints allege violations of the Children’s Online Privacy Protection Act (COPPA) and other laws, based on design features and practices by the Meta entities that allegedly contribute to body dysmorphia, sadness, suicidal thoughts, and other mental health harms.

NTIA RFC on Children’s Online Safety

The goal of the NTIA RFC is “to identify existing and emerging risks to minors, suggest further research, and recommend best practices and standards to evaluate, prevent, and reduce potential online harm to young people,” and to help parents and caregivers protect the online health and safety of children and teens. The RFC cites the May 23, 2023, Advisory by the U.S. Surgeon General characterizing use of online platforms by minors as an “urgent public health issue” that requires action by tech companies and online service providers. As the Advisory itself says, “[a]dvisories are reserved for significant public health challenges that require the nation’s immediate awareness and action.” The Advisory describes a “growing consensus about the need to fund research to more fully understand the complexity of the overall impact of social media, and technology use more generally on youth mental health and socio-emotional and cognitive development.”

Expanding on the Surgeon General’s Advisory and the Administration’s announcement, NTIA seeks input on a wide range of questions related to:

  • The health, safety, and privacy risks and potential benefits of social media and online platforms for minors.
  • Current practices and technologies employed by social media and online platforms that have a significant effect on minors’ health, safety, and privacy.
  • Guidance or best practices that might help caregivers and companies better protect the health, safety, and privacy of minors online.

The task force will hold several roundtable discussions in the coming months. The deadline for written comments in response to the NTIA RFC is November 16, 2023.

State AG Lawsuit Against Meta

Shortly after NTIA announced its RFC, a bipartisan coalition of AGs filed a federal lawsuit against Meta that also focuses on the mental health and wellbeing of minors online. On October 24, 2023, 33 states filed suit against Meta in a California federal district court for violating COPPA and myriad state laws governing false advertising. The suit charges Meta with routinely harvesting the personal information of children without the verifiable parental consent required by COPPA, and with deliberately misleading the public about the harms to minors caused by the company’s business practices. The complaint alleges that those business practices target minors and encourage harmful and addictive behavior, drawing on testimony offered last year by a Meta whistleblower who identified infinite scrolling, autoplay, “likes,” and algorithmic recommendations, among other design features, as intended to keep children and teens engaged and on the platform. The complaint requests restitution and injunctive relief.

Nine attorneys general also filed suits in their own states on similar grounds. An earlier class action suit filed in the Northern District of California took aim more broadly at Meta, Snapchat, TikTok, and YouTube, alleging that these social media giants endanger the mental health of minors through “addictive and dangerous social media products” and should be held liable on strict liability and negligence grounds for physical and mental harms to minors. In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, MDL No. 3047, Case No. 4:22-md-03047-YGR (N.D. Cal., filed Oct. 6, 2022).

COPPA Claims

A key allegation of the joint federal complaint is that Meta does not properly verify the age of its users and fails to obtain parental consent before collecting children’s personal information, even though the company has constructive knowledge that many of its users are children. COPPA requires covered operators to notify parents and obtain their consent before collecting personal information online from a child under 13. The statute applies where a website or online service is directed to children (defined as those under 13), and where a website or service that is not child-directed has actual knowledge that it is collecting personal information online from a child under 13.

COPPA, however, does not require “age verification” of children or teens. In fact, online services that are primarily directed to children are barred from age-gating visitors, since the audience is assumed to be composed primarily of those under 13. In contrast, operators of an online service that does not target children under 13 as its primary audience are permitted to screen for age to ensure that underage users are offered an age-appropriate experience and that verifiable parental consent (VPC) is obtained. “General audience” sites or online services are under no general obligation to either age-screen or verify age. (Tobacco, alcohol, gambling, and similar sites available only to adults of legal age do age-screen visitors.) The COPPA Rule outlines several methods of VPC, which vary depending on the type of data collected and whether personal information will be shared with third parties.

Advertising Claims

The complaint also alleges that Meta violated federal and state advertising laws that prohibit deceptive or misleading practices. The AGs assert that Meta deliberately hid “the ways in which these platforms exploit and manipulate its most vulnerable consumers: teenagers and children.” They further claim that Meta’s business model deliberately keeps minors engaged in social media for long periods; that the company employs product features designed to be addictive while publishing misleading reports showing a low incidence of user harms; and that it refuses to change its practices in the face of substantial evidence that those practices are harming minors.

The complaint raises important questions for children’s privacy and online advertising.

  • What are a platform’s obligations and rights from a First Amendment standpoint?
  • Is a false advertising charge the appropriate legal basis for a complaint largely focused on exposure to content?
  • What is meant by “keeping kids safe”? Are platforms responsible for preventing kids and teens from seeing negative comments from other users, or from accessing or viewing possibly disturbing content more broadly? Who decides?
  • Should everyone under 18 be treated like children under 13 until they demonstrate they are older? What are the rights and responsibilities of platforms, children, teens, and parents? What are the privacy burdens of age verification? And what is the future of free ad-supported online content?

The Surgeon General’s Advisory attempts to answer some of these questions, outlining possible areas of responsibility for policymakers, technology companies, parents, and children and teens. How those competing rights, responsibilities, and constitutional considerations might be appropriately balanced within the existing legal framework may become clearer as comments on the NTIA RFC are submitted and the Meta litigation proceeds, but the topic is sure to garner continued attention in the coming months.