Meta Urges Federal Bill

Social Media MDL Mental Health Claims ‘Simply Not True,’ Says Google

A day after U.S. District Judge Yvonne Gonzalez Rogers for the Northern District of California denied social media defendants’ motion to dismiss negligence claims against them over their alleged role in fueling a youth mental health crisis in the U.S., Facebook and Instagram parent Meta blogged in favor of federal legislation “to create simple, efficient ways for parents to oversee their teens’ online experiences.”



Rogers said in her Tuesday order (docket 3047) that certain claims in hundreds of lawsuits bundled in the In Re: Social Media Adolescent Addiction/Personal Injury Products Liability multidistrict litigation against Meta, Google, TikTok and Snap aren’t barred by Communications Decency Act Section 230 or First Amendment defenses.

Meta, ByteDance and Snap didn’t comment on Rogers’ denial of their motion to dismiss. A Google spokesperson emailed Wednesday that “protecting kids across our platforms has always been core to our work.” Google worked with child development specialists to build “age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls,” he said. The allegations in the social media MDL complaints “are simply not true.”

In a Wednesday blog post, Meta Global Head of Safety Antigone Davis pushed for legislation “so all apps teens use can be held to the same standard.” Meta supports federal legislation requiring app stores to get parents’ approval whenever their under-16 children download apps. Federal laws would avoid the “patchwork of different laws” states are adopting, “many of which require teens (of varying ages) to get their parent’s approval to use certain apps, and for everyone to verify their age to access them,” she said. Teens move among websites and apps, and “social media laws that hold different platforms to different standards in different states will mean teens are inconsistently protected,” she said.

The MDL is in the same Oakland court where 33 attorneys general last month filed a similar suit in California v. Meta Platforms, et al. (docket 4:23-cv-05448), alleging the Facebook and Instagram parent “has harnessed powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens,” and did so in the name of profit.

Rogers found that plaintiffs’ claim of negligence per se for alleged violations of the Children’s Online Privacy Protection Act is not barred by Section 230 or the First Amendment. The claim alleges defendants failed to provide required notice and obtain parental consent before collecting certain information from children, conduct that “in no way impacts their role as publishers of third-party content,” Rogers said.

On the negligent design and negligence claims, plaintiffs make “myriad allegations that do not implicate publishing or monitoring of third-party content and thus are not barred by Section 230,” Rogers said. The defects are not equivalent to speaking or publishing “and can be fixed by defendants without altering the publishing of third-party content,” said the judge.

Those defects relate to not providing parental controls, including notification to parents that children are using the platforms; not providing options for self-restricting time spent on a platform; making it challenging for users to close or delete their accounts or to report predator accounts and content; not using robust age verification; offering appearance-altering filters; not labeling filtered content; timing and clustering notifications of defendants’ content to drive addictive use; and not implementing reporting protocols that allow users to report child sexual abuse materials (CSAM) and adult predator accounts without needing to create an account or log in before reporting, said the order.

The alleged defect of timing and clustering notifications of third-party content in a way that promotes addiction is, however, barred, Rogers ruled. Other alleged design defects barred by Section 230 directly target defendants’ roles as publishers of third-party content, said the order. Those include failing to put default protective limits on the length and frequency of sessions; failing to institute blocks to use during certain times of day; not providing a beginning and end to a user’s feed; publishing geolocation information for minors; recommending minor accounts to adult strangers; and using algorithms to promote addictive engagement, the order said.