Export Compliance Daily is a Warren News publication.
Preliminary Injunction Motion

AB-2273 Would Censor the Internet Under ‘Guise of Privacy’: NetChoice

California’s age-appropriate social media design law (AB-2273) “is the most extensive attempt by any state to censor speech since the birth of the internet,” said NetChoice’s motion Friday (docket 5:22-cv-08861) for a preliminary injunction in U.S. District Court for Northern California in San Jose to invalidate the statute. California’s response to the motion, which had been expected (see 2301310034), is due April 21. NetChoice’s reply brief is due May 19.


AB-2273 “enacts a system of prior restraint over protected speech using undefined, vague terms,” said NetChoice’s motion. It creates “a regime of proxy censorship, forcing online services to restrict speech” in ways that California “could never do directly,” it said. NetChoice and its members will suffer irreparable harm in the loss of First Amendment freedoms if AB-2273 is “not enjoined,” said its motion.

The law violates the First Amendment and the Commerce Clause, and is preempted by the Children’s Online Privacy Protection Act (COPPA) and Section 230 of the Communications Decency Act, said NetChoice’s motion. Because AB-2273 “forces online providers to act now to redesign services,” irrespective of its formal July 1, 2024, effective date, “it will cause imminent irreparable harm,” it said.

AB-2273 attempts to censor the internet “under the guise of privacy,” said NetChoice’s motion. Rather than empower parents to supervise children’s privacy online, AB-2273 requires services to redesign their offerings to be appropriate for children, as California defines it, it said. Under AB-2273, even with a parent’s consent, a provider can’t use collected information or algorithms to serve content to a minor unless such use is deemed “necessary” to what the minor is already engaged in or is found to be “in the best interests of children,” it said.

The new law “will subject a global medium to state supervision and hobble a free and open resource that has become indispensable,” said NetChoice’s motion. AB-2273 “is subject to the highest level of scrutiny and is presumptively invalid under the First Amendment,” it said. It creates a system of unlawful prior restraint, “suppresses and chills swaths of internet speech,” applies vague, subjective and undefined terms and regulates speech “on a content and speaker basis,” it said.

AB-2273’s reporting mechanisms “are similarly unconstitutional,” said NetChoice’s motion. By subjecting online services to “drastic penalties” for failing to sufficiently review, identify and mitigate or eliminate “harmful” speech, the new law “coerces these services to err on the side of censorship,” it said. Self-censorship is AB-2273’s “self-professed aim,” it said.

COPPA preempts AB-2273, and the two laws’ approaches to children’s privacy online “could hardly be more inconsistent,” said NetChoice’s motion. The FTC’s July 2020 guidance said COPPA’s primary goal is to place parents in control over what information is collected from their young children online, it said: “AB-2273 takes that parental control away and places a host of new and different state-imposed obligations on services.”

Section 230 “expressly preempts” AB-2273’s requirement that providers “enforce their own published terms, as well as its restrictions on the use of personal information,” said NetChoice’s motion. AB-2273’s enforcement requirement also is preempted by Section 230, which protects providers’ ability to take “good faith” discretionary actions to restrict access to objectionable content, it said.

AB-2273, by contrast, “requires publishers to either over-moderate -- censoring protected speech -- or forgo discretionary moderation altogether, lest they be accused of failing to apply their content rules adequately,” said NetChoice’s motion: “This is the very moral hazard Section 230 exists to prevent.”