December 21, 2024

New Developments in California’s Age-Appropriate Design Code Litigation

The ongoing legal challenge to California’s Age-Appropriate Design Code (AADC) has entered a new phase, with the tech industry group NetChoice seeking a preliminary injunction on First Amendment grounds. This update examines recent developments in the case, including our clinic’s amicus brief and the emerging legal arguments around platform regulation. A preliminary injunction hearing is scheduled for January 23, 2025. We have previously discussed this case here.

Evidence on Dark Patterns’ Impact

Our amicus brief synthesizes multiple streams of evidence demonstrating how dark patterns affect vulnerable users. A report from the National Academies of Sciences, Engineering, and Medicine identifies how certain design features, engineered specifically to increase platform usage, can detrimentally affect adolescent mental health. For instance, continuous scrolling eliminates natural stopping points, extending engagement time, while engagement-driven recommendation systems create feedback loops that reinforce existing beliefs and amplify sensationalized content. This connects with recent research by CITP’s Molly Crockett on outrage-fueled feedback loops in social media.
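To make the feedback-loop mechanism concrete, here is a minimal, self-contained Python sketch. It is purely illustrative and not any platform’s actual ranking code: the topic names, engagement rates, and learning rate are invented. The structure, though, shows why a ranker that boosts whatever users react to drifts toward sensationalized content with no natural stopping point.

```python
# Toy model of an engagement-optimized feed (illustrative only; all
# numbers and topic names are invented). Items that provoke stronger
# reactions get engaged with more, which raises their future ranking,
# which in turn increases their share of the feed: a feedback loop with
# no built-in stopping point.
import random

random.seed(0)

# Every topic starts with the same ranking weight.
weights = {"neutral_news": 1.0, "hobby_posts": 1.0, "outrage_bait": 1.0}

# Assumed per-impression engagement probabilities (hypothetical).
engagement_rate = {"neutral_news": 0.10, "hobby_posts": 0.15, "outrage_bait": 0.30}

LEARNING_RATE = 0.1

def serve_one_item():
    """Sample an item in proportion to its current ranking weight."""
    topics = list(weights)
    item = random.choices(topics, [weights[t] for t in topics])[0]
    if random.random() < engagement_rate[item]:
        # Engagement feeds back into the ranker: whatever drew a
        # reaction is boosted on every future scroll.
        weights[item] += LEARNING_RATE

for _ in range(10_000):
    serve_one_item()

total = sum(weights.values())
for topic, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{topic:>13}: {w / total:.0%} of feed")
# Typical result: outrage_bait ends up with the largest share of the
# feed, despite starting with the same weight as everything else.
```

Real recommendation systems are vastly more complex, but the reinforcement structure is the same: reactions train the ranker, and the ranker then solicits more of whatever drew reactions.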

Research from the University of Michigan’s Institute for Social Research reveals concerning patterns in youth engagement with social media platforms. Their data show that 8th and 10th graders average 3.5 hours per day on social media, that a quarter of students exceed 5 hours, and that approximately 14 percent spend more than 7 hours per day on these platforms.

The Surgeon General’s 2023 Advisory on Social Media and Youth Mental Health reinforces these concerns, finding that nearly one-third of adolescents use screen media until midnight or later on weekdays, primarily on social media applications. A subsequent interagency report in July 2024 concluded that digital service providers should prioritize young people’s safety and well-being over profit in product design.

Key Arguments from Amici

Industry Position

The International Center for Law & Economics, supporting NetChoice, argues that the AADC would create an untenable situation for platforms, forcing them to either water down content moderation policies to avoid liability or implement overly aggressive enforcement of existing standards. They contend that data collection and content curation are “inextricably intertwined,” arguing that restricting data collection inevitably impacts content delivery. Their position suggests that without data-driven content tailoring, platforms would need to switch to subscription models or exclude minors entirely, fundamentally altering their current business models and reducing the speech available online.

Age Verification Implementation

Common Sense Media addresses age assurance challenges, emphasizing that the AADC’s requirements are deliberately flexible and privacy-focused. Their brief outlines several implementation approaches: platforms can employ technical methods that don’t require collecting private information, implement universal privacy protections without age estimation, or make enhanced privacy the default setting with adult opt-out options. The verification methods are designed to scale with each company’s existing data practices, allowing companies that collect minimal data to use correspondingly minimal age estimation methods. The organization emphasizes that claims about unconstitutionality are premature without specific evidence of how verification methods might burden users or chill speech.
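To make the third approach tangible, the sketch below shows what “enhanced privacy by default with adult opt-out” might look like in code. This is a hypothetical illustration: the account fields, settings, and adult-attestation hook are our own invention, not language from the AADC or the brief.

```python
# Hypothetical sketch of "enhanced privacy by default with adult
# opt-out." All field names and settings are illustrative; they are
# not taken from the AADC's text or Common Sense Media's brief.
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # Every new account starts at the most protective configuration,
    # so no age needs to be collected or estimated at sign-up.
    profile_public: bool = False
    precise_geolocation: bool = False
    personalized_ads: bool = False
    autoplay_next: bool = False

@dataclass
class Account:
    username: str
    settings: PrivacySettings = field(default_factory=PrivacySettings)
    verified_adult: bool = False

    def opt_out_of_default_protections(self, adult_attestation_passed: bool) -> None:
        """Relax the defaults only after an affirmative adult opt-out.

        How the attestation works (self-declaration, payment-card
        check, third-party verifier) is left to the platform; minors
        simply keep the protective defaults.
        """
        if not adult_attestation_passed:
            raise PermissionError("Protective defaults stay in place.")
        self.verified_adult = True
        self.settings.profile_public = True
        self.settings.personalized_ads = True

acct = Account("new_user")
print(acct.settings)  # maximally protective out of the box
acct.opt_out_of_default_protections(adult_attestation_passed=True)
print(acct.settings)  # relaxed only after an explicit adult opt-out
```

The design point is that the burden falls on adults who want fewer protections, not on minors, and no age estimation is needed at sign-up.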

Section 230 Debate

The Chamber of Progress coalition argues that Section 230 preempts the AADC’s policy enforcement provision. They contend this requirement would force platforms to abandon detailed content moderation policies in favor of vague standards, leading to over-enforcement through automated tools. Their analysis suggests the law would disproportionately impact marginalized communities, especially LGBTQ+ youth who rely on social media platforms for support and connection. Furthermore, they argue that the provision’s vague definition of “enforcement” would discourage innovation in content moderation solutions.

The Electronic Privacy Information Center’s response, joined by whistleblower Frances Haugen and several former public officials, provides a detailed counter-analysis of the industry’s Section 230 claims. They argue that Section 230 doesn’t preempt privacy laws like the AADC, because it shields platforms from liability only for third-party content. The AADC, they contend, regulates platform behavior rather than user content, placing it squarely within the scope of standard privacy regulations. They emphasize that data protection requirements, including Data Protection Impact Assessments (DPIAs), are standard practice in regulatory frameworks both domestically and internationally. EPIC’s brief also notes that interpreting Section 230 to block privacy laws would expand its scope far beyond its intended purpose.

Looking Ahead

If NetChoice prevails, we anticipate that the State will appeal to the Ninth Circuit. We remain engaged in this case and others, providing technical insights that help courts protect user rights as they navigate the complex challenges of online safety regulation.
