'Layers of Chaos'

Experts See Conflicts, Consequences in Next Week's Section 230 Arguments

Decisions in the two interlinked cases involving the Communications Decency Act's Section 230 being argued before the Supreme Court next week could lead to many conflicts with state and international laws, force Congress to act, spawn waves of litigation or cause the cases to “dissolve like Alka-Seltzer,” said legal and tech experts on a Brookings Institution virtual panel Tuesday on Gonzalez v. Google and Twitter v. Taamneh. “I think it is correct this will be the most important Supreme Court decision about the internet possibly ever,” said Alan Rozenshtein, associate professor of law at the University of Minnesota Law School.


Both cases stem from terrorist attacks by ISIS and concern whether social media platforms can be held liable for attacks by groups that use them to distribute content. Argument in Gonzalez v. Google -- which involves the scope of Section 230's liability protections for platforms -- is Tuesday. Argument in Twitter v. Taamneh -- which concerns whether the Antiterrorism Act's provisions against aiding terrorists can apply to entities that don't have a specific link to a terrorism incident -- is Wednesday.

Oral argument in the cases will provide insight into the “vibe” of the court on internet policy, said Rozenshtein. “You have a new generation of justices; they are somewhat younger, they are just as internet-obsessed as the rest of us,” he said. The court might focus on the narrow question of whether social media platforms can be held liable for the content they host, or “it might go up in any number of directions that various amici have suggested,” said Daphne Keller, director of the Program on Platform Regulation at Stanford University's Cyber Policy Center (see 2301200059). Keller filed amicus briefs in Gonzalez in support of the platforms. Justice Clarence Thomas has expressed concern that platforms aren't moderating content in a neutral manner and that Section 230 has been interpreted more broadly than originally intended, said Rozenshtein.

SCOTUS’s opinion on Section 230 and the arguments in the Gonzalez case could be washed away by the court’s ruling in Twitter v. Taamneh, said Benjamin Wittes, Brookings Senior Fellow in Governance Studies. Since that case concerns whether a platform can be liable for acts of terrorism that it didn’t specifically aid or facilitate, a ruling in favor of Twitter would decide both cases for the platforms. “The entire Gonzalez adjudication depends on the Supreme Court getting Taamneh wrong,” said Wittes. SCOTUS could “make two big problems go away” by ruling for the platforms, Wittes said: “They should be able to get nine votes for that. I'm not saying they will, but they should.”

Holding platforms liable for their content or for the algorithms they use to rank that content would have widespread consequences for vast swaths of the internet, said Keller. Algorithms can drive users to more extreme content, but they can also be used to mitigate that tendency, she said. University of California-Berkeley Professor of Electrical Engineering and Computer Science Hany Farid said he'd like platforms to take some responsibility for designing algorithms that favor engagement over anything else. “It is really easy to drive people into very deep rabbit holes based on these really radical systems that are vacuuming up every morsel of personal data,” he said.

Wittes said he sees merit in arguments about liability for algorithmic ranking, but the court should wait for a case where it's more obvious that the algorithms caused the harm. “The case is poorly presented,” he said. Rozenshtein said the court should rule decisively against the platforms because that's the outcome most likely to force Congress to better define platform liability. But Keller said Congress is more likely to remain in a stalemate on the issue, leaving a gap states could fill with “really wild laws.”

If SCOTUS does rule against the platforms, its decision could conflict with future rulings in NetChoice v. Paxton and NetChoice v. Moody, the pending challenges to the Texas and Florida social media laws, and with European Union internet policies, said Keller. Both state laws in question have provisions that prevent platforms from editing or removing certain content, while a decision in Gonzalez could make platforms liable for the content they host, she said.

If platforms have to navigate a patchwork of state laws about what content they can host, they're likely to go with the lowest common denominator, Keller said. A SCOTUS ruling in Gonzalez is likely to come out around June, roughly a month before platforms must comply with new EU internet rules, and at a time when many platforms are laying off experienced employees, she said. “The layers of chaos go very deep,” she added.

A ruling against the platforms in Taamneh would also have widespread consequences, said Wittes. “If the court were to rule that the plaintiffs have a cause of action in Taamneh, we have a world of hurt,” he said. Any institution that has engaged in any capacity with terrorist groups could face liability, he said. “If I give some assistance in general to the University of Minnesota, and Alan Rozenshtein goes and kills somebody, I did not aid and abet the murder,” Wittes quipped.