In brief
In recent years, both state and federal legislatures have stepped up efforts to enact laws designed to protect minors in the digital world. However, several court decisions have found that some of these legislative actions likely exceed constitutional limits. This article highlights key legislative initiatives taken by the federal government, California, and Texas to protect children and teens online, as well as lawsuits challenging the legality of the California and Texas measures as of early September 2024.
Federal Kids Online Safety and Privacy Act (KOSPA)
KOSPA is a legislative package that combines two bills – the Kids Online Safety Act, first introduced in 2022, and the Children and Teens’ Online Privacy Protection Act, first introduced in 2019. On July 30, 2024, the U.S. Senate passed KOSPA by a vote of 91 to 3.
KOSPA is intended to protect “minors,” defined as individuals under the age of 17. It would also establish certain protections for “children,” defined as individuals under the age of 13, and for “teens,” defined as individuals between the ages of 13 and 16.
KOSPA will impose obligations on various entities, including:
- “Covered platforms,” meaning online platforms, online video games, messaging applications, or video streaming services that connect to the Internet and are used, or are reasonably likely to be used, by minors, subject to various exceptions.
- An “online platform,” in turn, means any public-facing website, online service, online application, or mobile application that primarily provides a community forum for user-generated content.
- Operators of online services that are directed to children or teens, or that actually know, or reasonably should know, that they collect personal information from children or teens.
The following are some examples of the obligations that KOSPA would impose on companies if it were passed in its current form:
- Duty of Care: Regulated platforms must exercise reasonable care in the creation and implementation of design features to prevent and mitigate a variety of specified harms to minors. These harms include certain mental health disorders, addictive behaviors, cyberbullying, sexual exploitation, the promotion and marketing of drugs, tobacco products, gambling, or alcohol, and financial harm.
- Safeguard measures: Regulated platforms must implement certain safeguards to protect users they know are minors. These safeguards include limiting the ability of others to communicate with minors and view minors’ personal data, and limiting the use of design features that lead to compulsive use of their platforms by minors.
- Parental Notifications, Tools, and Consent: Regulated platforms must provide various notices and readily accessible, easy-to-use parental tools and settings for users the platform knows are minors. If a regulated platform knows that an individual is a child, the platform must obtain verifiable parental consent before the child uses the platform for the first time.
- Transparency Reports: Regulated platforms with more than 10 million monthly active users in the United States that primarily provide online forums for user-generated content must publish a public report at least once a year. The report must describe the reasonably foreseeable risks of harm to minors and assess the prevention and mitigation measures taken to address those risks, based on an independent, third-party audit involving a reasonable inspection of the platform.
- Privacy obligations: KOSPA would make a number of significant revisions to the existing Children’s Online Privacy Protection Act (COPPA). Among other things, the revisions would expand the group of “operators” subject to the law, add teens (individuals aged 13 to 16) as a category of individuals protected by the law, and broaden the circumstances in which operators are deemed to know that they process personal information of children or teens. KOSPA would also impose new rules and restrictions, including prohibitions on profiling and targeted advertising directed at children and teens, subject to certain limited exceptions. Operators would be subject to these new requirements to the extent they are not already subject to corresponding requirements under COPPA.
California Age-Appropriate Design Code Act (CAADCA)
In September 2022, California enacted CAADCA, which expressly requires companies to consider the best interests of minors under the age of 18 when designing, developing, and providing online services. In September 2023, the U.S. District Court for the Northern District of California granted a preliminary injunction prohibiting enforcement of CAADCA on the grounds that it likely violates the First Amendment. However, in August 2024, the U.S. Court of Appeals for the Ninth Circuit partially vacated the district court’s preliminary injunction, determining that only some (but not necessarily all) of CAADCA may be constitutionally invalid, and remanded the case to the district court for further proceedings.
Subject to ongoing constitutional challenges, CAADCA imposes a wide range of obligations and restrictions on any “business” that provides online services likely to be accessed by minors. A “business” is any for-profit organization that determines how and for what purposes it processes the personal information of California residents and meets one of the following three criteria: (1) has annual gross revenues in excess of $25 million, adjusted for inflation; (2) buys, sells, or shares the personal information of 100,000 or more California residents or households annually; or (3) derives at least 50% of its annual revenues from selling or sharing the personal information of California residents.
Both the district court and the appellate court agreed that the potentially unconstitutional portions of CAADCA are its data protection impact assessment provisions, which require businesses to conduct such assessments and take certain steps in connection with them (i.e., Cal. Civ. Code sections 1798.99.31(a)(1)–(4), 1798.99.31(c), 1798.99.33, and 1798.99.35(c)).
The district court will ultimately assess the constitutionality of the remaining provisions of CAADCA, which include requirements that covered businesses:
- Estimate the age of minor users with a reasonable level of certainty, or otherwise treat all users who reside in the State of California as minors.
- Configure all default privacy settings for minors to settings that provide a high level of privacy, unless an exception applies.
- Provide privacy information, terms of service, policies, and community standards concisely, prominently, and in clear language appropriate to the age of minors who may access the online services.
- Provide prominent, accessible, and responsive tools to help minors or their parents or guardians exercise their privacy rights and report concerns.
- Not use a minor’s personal information in a way that the business knows, or has reason to know, is materially detrimental to the minor’s physical health, mental health, or well-being.
Texas passes SCOPE Act to protect children online
The Securing Children Online through Parental Empowerment (SCOPE) Act was enacted in 2023 and took effect on September 1, 2024. It regulates “digital service providers” (DSPs) and is intended to protect minors under the age of 18. On August 30, 2024, the Austin Division of the U.S. District Court for the Western District of Texas granted a preliminary injunction prohibiting Texas from enforcing the “monitoring and filtering requirements” of the SCOPE Act (i.e., Texas Business and Commerce Code § 509.053) on First Amendment grounds, while declining to enjoin the other provisions of the SCOPE Act.
The SCOPE Act defines a DSP as an owner or operator of a website or online software that determines the manner and purposes for which users’ personal information is collected and processed, connects users in a way that allows them to interact socially with other users, allows users to create public or semi-public profiles for the purpose of signing into and using the digital service, and allows users to create or post content that can be viewed by other users of the digital service. The SCOPE Act also sets out various exceptions to the definition of a DSP.
The “monitoring and filtering requirements” that the District Court enjoined would require DSPs to monitor certain categories of content and filter that content from being displayed to known minors. Specifically, the SCOPE Act would require DSPs to develop and implement a strategy to prevent known minors from being exposed to content that promotes, glorifies, or facilitates a variety of topics, including suicide, substance abuse, bullying, and grooming.
The District Court declined to enjoin enforcement of other provisions of the SCOPE Act. Therefore, DSPs must carefully evaluate their obligations under the SCOPE Act, including the following requirements:
- Have users register their age before creating an account.
- Use commercially reasonable means to verify the identity of the individual claiming to be the parent of the minor and their relationship to the minor.
- Allow parents to dispute their child’s registered age.
- Limit the collection of personal information from known minors to what is reasonably necessary to provide the digital service.
- Not allow known minors to make purchases or engage in other financial transactions through the digital service.
- Disclose how their algorithms work if they use algorithms to automatically recommend, promote, or rank content to known minors on the digital service.
Just a snapshot
The federal proposal discussed above and the laws in California and Texas are just three examples of legal developments in the area of online protection for minors. Many other bills, laws, constitutional challenges, and enforcement actions are advancing rapidly across the United States, covering areas such as children’s privacy, age-appropriate design, restrictions on addictive feeds and design features, and parental consent and management tools. Stay tuned for more updates from the Baker McKenzie team.