Tech Faces Liability Under Advancing Calif. Social Media Bill
The California Assembly’s Judiciary Committee unanimously passed legislation Tuesday to make social media platforms liable for addiction- and design-related harm to children. AB 2408 would impose penalties on major platforms for negligent design.
Social media business models depend on addicting children, said Assemblymember Jordan Cunningham, lead Republican sponsor. The platforms understand the addictive nature of products and their impact on child self-esteem, said lead Democratic sponsor Buffy Wicks.
The bill was crafted partly in response to testimony from Facebook whistleblower Frances Haugen (see 2202180068), who helped spur bipartisan federal legislation from Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn. Despite bipartisan consensus, Congress isn’t going to do anything, so California should lead the way in protecting children online, said Cunningham.
Common Sense Media, the Consumer Federation of California, academics and victims spoke in support of the bill. TechNet, Chamber of Progress and the Entertainment Software Association opposed it. Facebook didn’t comment.
Companies would face civil penalties of up to $25,000 per violation, with a two-year statute of limitations. The bill would impose a “duty not to addict” users under the age of 17. That duty covers how platforms use or sell a young user’s data, their use of notifications and certain design features. The bill creates safe harbors, including one allowing platforms to avoid penalties for past practices by removing addictive design features after passage. Companies would also have a right to cure, giving them 30 days to disable new features determined to be addictive.
Persuasive design features enable platforms to exploit young users’ vulnerability and fear of social rejection, testified child psychologist Richard Freed. Half the Story Executive Director Larissa May testified about her past social media addiction, saying she spent 10-12 hours a day on platforms as a young user. This led to body image issues and suicidal thoughts, she said.
TechNet is “strongly opposed” to this “deeply flawed bill,” which attempts to address an “incredibly important” but complex issue, said Executive Director-California Dylan Hoffman. The bill raises constitutional concerns by infringing on platforms’ editorial discretion and protected speech, he testified. The bar to prove harm is too low, and the liability is excessive, he said, claiming a class-action lawsuit could result in damages of $500 million or more.
The bill may drive young users away from mainstream platforms that invest heavily in child safety toward smaller, unvetted platforms like Signal, said Tyler Smith, Chamber of Progress director-state and local government relations, Central Region. He described his organization as a center-left tech industry association. Safety is a top priority for members, he said. The Entertainment Software Association “strongly believes in protecting kids online” but opposes the bill, said Vice President-State Government Affairs Tara Ryan, calling into the hearing.
The only “trailblazing they’re doing” with this legislation is the “burning” of the First Amendment, said NetChoice Vice President Carl Szabo: Attempting to subjectively define a medical term like addiction as a legal term will be problematic and carries significant speech risks. “This is not just trying to use a cannon to shoot a fly,” he said. “This is like burning the house down to roast a pig.”
Courts will find the bill constitutional, said Cunningham. He cited Lemmon v. Snap, a 9th U.S. Circuit Court of Appeals decision holding that Snap could be sued over design features that encouraged “dangerous behavior.” Snap offered a “speedometer” feature, which young users activated while driving at high speeds; the feature was linked to several fatal accidents. The court rejected Snap’s argument that Communications Decency Act Section 230 shielded it from liability.
On speech claims, there’s a good chance the bill would hold up constitutionally because it mirrors case law on social media liability, said Assemblymember Eloise Gomez Reyes (D). Bills are often introduced with flaws, she noted, and she looks forward to improving the measure. Policymakers have grappled with similar questions about addictive video games, said Assemblymember Richard Bloom (D). He might vote differently on the floor, he said, because he wants concerns addressed about how the bill would function in the real world.
Lawmakers shouldn’t be worried about the bill's constitutionality, said Committee Chairman Mark Stone (D). A bill like this challenges the status quo, and these policy questions fall in an area that hasn’t been thoroughly litigated, he said: “These companies know what they’re doing with algorithm generation.”