‘Arbitrary Examples’ Cited

Calif. AADC Fails to Show How It Protects Minors' Mental Health: NetChoice

California asserts the Age-Appropriate Design Code (AADC) is necessary to protect the “health and well-being of minors,” but it “fails to show how the law serves that boundless objective, much less how it is tailored to do so,” said NetChoice Friday in a supplemental brief (docket 5:22-cv-08861) in support of its motion for preliminary injunction against the social media design law, due to take effect in June.

NetChoice sued to block AB-2273 in December, arguing the law violates the First Amendment by telling sites how to manage constitutionally protected speech (see 2212140063). NetChoice also argued the law illegally tries to regulate across state lines and is preempted by federal law. AB-2273 doesn’t regulate speech, California Attorney General Rob Bonta (D) countered in an August brief (see 2308160043).

California cites “arbitrary examples of speech and features it considers harmful,” such as Snapchat’s discontinued “speed filter,” but the AADC “applies to a ‘virtually … infinite’ universe of online speech,” the NetChoice brief said. Even for the examples it cites, the state doesn’t explain why the law is necessary, it said. “These defects permeate the law, which is also overbroad, unconstitutionally vague, barred by the Commerce Clause, and preempted,” it said. “No portion of the law is severable, and it should be enjoined as a whole.”

The AADC regulates speech because it “restricts how, under what conditions, and to whom content may be published,” and it “indisputably regulates speech based on content,” the brief said. California acknowledged the law requires services to evaluate whether they publish content “likely to be accessed by minors … rendering the entire law content-based,” it said.

Certain provisions of the law regulate speech based on content, the brief said. They require services to assess and report “risks” that editorial decisions “could” expose minors to “potentially harmful” content, and to create a “timed plan” to “mitigate or eliminate” those risks, it noted. Services must estimate user age with greater or lesser certainty based on “the risks” presented by content, “and not publish content based on users’ preferences or through ‘dark patterns’ (like alerts or pop-ups), unless the content is in the user’s ‘best interests’ or would not be ‘materially detrimental’ to the user,” the brief said.

A speech regulation isn’t valid merely because it operates by limiting data use, the brief said, because the Constitution protects the use of information to publish and target speech. In deciding how to protect privacy interests, “the State cannot engage in content-based discrimination,” it said, citing Sorrell v. IMS Health. That’s what the AADC does by using terms like “harmful,” “well-being” and “best interests” to restrict content, contacts and advertising; what minors “witness” online; and algorithms that curate speech, the brief said.

SCOTUS has repeatedly said promoting the well-being of minors isn’t a permissible basis for restricting speech unless the speech falls into a traditionally unprotected category like obscenity, the brief said, citing Brown v. Entertainment Merchants Association. A speech restriction survives only if the state proves the law will serve a substantial interest unrelated to the suppression of free expression in “a direct and material way” that’s “not merely conjectural,” and is “at least narrowly tailored to suppress no more speech ‘than is essential to the furtherance of that interest,’” it said, citing Turner Broadcasting System v. FCC.

The AADC is “underinclusive when judged against its asserted justification” of protecting children’s well-being, said the brief, referencing Brown. It leaves “unregulated” potential harms that occur offline or on TV, in video games and print media, plus “junk food, late bedtimes and not-for-profit websites,” it said.

Californians are already protected by constitutional and common-law privacy rights and a “comprehensive statutory data privacy regime that requires services to inform consumers what data they collect and why” and prohibits sharing data about users known to be younger than 16 without authorization, the brief said. The state doesn’t explain why those protections are “ineffective,” it said. California says the Children’s Online Privacy Protection Act (COPPA) is insufficient, “but its experts say only that COPPA is not sufficiently enforced,” it said. COPPA and state law “address the State’s concerns while undisputedly restricting less speech,” it said.