'Head-Scratching'

Social Media Lawsuits Testing Section 230's Relevance, Say Legal Experts

The Children’s Online Privacy Protection Act (COPPA) claims in California et al. vs. Meta, brought in U.S. District Court for Northern California in Oakland last month by 33 attorneys general (see 2310250066), are “self-defeating,” said attorney Cathy Gellis on a Chamber of Progress webinar Thursday examining social media addiction lawsuits pending before state and federal courts.

The lawsuit alleges Meta’s practices harm the physical and mental health of children and teens and have fueled what the U.S. surgeon general has called a “youth mental health crisis.” It alleges Facebook and Instagram collect the personal information of children on their platforms without first obtaining, or attempting to obtain, “verifiable parental consent,” as required by COPPA for users under 13. Similar suits blaming social media platforms for young people's addiction to social media were filed in California Superior Court in Los Angeles County and in multidistrict litigation in U.S. District Court for Northern California in Oakland.

The complaint is “backwards,” Gellis said of the AGs' case. “If the defendants had done more of what is being complained about, they would be in bigger trouble,” she said. “You don’t get to elicit personal information from people and then do something with it,” Gellis said. “It’s a very bizarre state of affairs to say your COPPA liability is that you did not elicit enough personal information from your user, and therefore you have liability under the statute that gets mad at you when you take too much personal information from your user.”

An automated mechanism for determining a user’s age has gone untested through COPPA’s first 20-plus years, said Santa Clara University law professor Eric Goldman. “The answer has always been you don’t have to do a damn thing in order to figure out the age of your users,” he said. “COPPA doesn’t require it,” and it would be unconstitutional, he said.

The state AGs’ position “is basically trying to turn COPPA into an affirmative obligation that seems like it would directly conflict with the constitutional provisions that say you can’t force online publishers to determine the age of their users,” Goldman said. “So the whole thing actually should fail either because COPPA says something different, or because if it says what they claim, then they’ve created a constitutional problem with COPPA.”

A social media case in Los Angeles County Superior Court, meanwhile, focuses on the “defective design” of the platforms rather than on the underlying content, said Jess Miers, Chamber of Progress legal advocacy counsel. Section 230 and the First Amendment, therefore, don’t apply in this case, she said. Los Angeles Superior Court Judge Carolyn Kuhl found that social media platforms can’t use the Section 230 shield -- which protects them from liability for content provided by a third party -- to escape some of the claims in the case, because the design features of the platforms, not specific content viewed by users, allegedly caused their injuries.

Calling the ruling “quite a read” and “head-scratching,” Goldman said that “in the court’s mind, the claims are not based on the content of third-party users; the claims are based on the conduct of the social media services in engaging with their own users.” The problem is that “it’s an impossible distinction to make,” he said.

The ruling made several references to auto-scrolling, Goldman noted, saying the court identified it as an example of “conduct only.” Goldman said: “What are people scrolling? They’re scrolling content.” The reason it’s a problem is that “they’re consuming content.” To say that auto-scrolling can be separated from the content being scrolled “creates this logic puzzle that the court can never fully solve.”

The Los Angeles court said that Section 230 applies only to particular items of content, and “that’s not in the statute,” Goldman said. That isn’t how courts have interpreted Section 230 to date, he said, but here, “the court just made that up.” By “making up that new rule,” the court made it easier to talk about “this abstract thing called content.” Section 230 doesn’t apply to abstract concepts of content, “only to specific items,” he said. The distinction between particular and general content “is nowhere in the statute, and it really gives the court a way to disregard what I think are pretty obvious Section 230 defenses.”

Section 230 “should have made this case go away” because the tort liability being described “is necessarily a creature of state law, and Section 230 has an exemption provision to prevent states from being able to impose their law on the provision of internet services,” Gellis said. “If they could all do it, then they would all do it differently and you’d have a big problem and not be able to provide any sort of internet interstate commerce type of internet service.” There are “important reasons why this law needs to apply, and the court gives short shrift to it,” she said.

Responding to Miers’ question about whether Section 230 “is dead” in light of recent rulings and cases, Gellis said, “It’s wounded”; it’s “not doing what it needs to do.” She called Section 230 “pretty straightforward,” but said, “People don’t like the answer that it gives and so therefore, they’re ignoring it.” The statute is “just sort of ill at the moment,” Gellis said, suggesting she hopes its setbacks are temporary.

Alleging negligent design in complaints against social media companies “gets them around Section 230,” said Goldman. “We’ll see if that wins or not on the merits, but the bottom line is Section 230 is no longer doing the work.” The “story is still being written” on whether there is a broad-based negligent design exception to Section 230, he said, “but that’s certainly what plaintiffs want to hear, and that’s what they’re arguing” in current social media cases.

One possible response social media companies could have to the lawsuits is to deploy age verification to reduce their risk of liability “and just get rid of all teens, all underage users,” Goldman suggested. At that point, without that class of users, “it doesn’t really matter whether Section 230 would protect the services from liability because literally the internet has shrunk in a material way,” he said. If the argument that social media platforms can addict users works, “then the damages and potential service configurations that flow from that conclusion will radically reshape the internet again,” he said. In that scenario, “Section 230 won’t necessarily be dead,” but the internet will “look entirely different.” That would “almost certainly guarantee that the internet is less functional than it is now.”