NetChoice Cases Seen as Opportunity for Social Media Clarity
The U.S. Supreme Court has opened the door for lower courts to clarify when the government can regulate the tech industry’s content moderation practices, legal experts said Friday.
As it ended its term last month, the high court remanded two tech industry lawsuits against social media laws in Florida and Texas while punting on key First Amendment questions (see 2407010053 and 2402260051). All nine justices agreed on remanding, but conservative justices disagreed with First Amendment-related aspects of Justice Elena Kagan’s majority opinion. Co-plaintiffs NetChoice and the Computer & Communications Industry Association expect to return to court in August, NetChoice CEO Steve DelBianco said during a Congressional Internet Caucus Academy livestream Friday.
It’s time for courts to differentiate between expressive third-party speech that platforms host and platform behavior surrounding content moderation, said Cybersecurity for Democracy senior fellow Yael Eisenstat, a former Facebook elections integrity official. “That is something that really warrants updating.”
Platforms want it both ways, she said: They enjoy First Amendment protection because they’re considered speakers, yet they’re also shielded from liability because Section 230 of the Communications Decency Act protects their conduct. Companies shouldn’t be held responsible for all third-party speech, she agreed, but that doesn’t mean Section 230 should be interpreted so broadly that platforms can do anything they want when moderating and monetizing content.
Justice Kagan wrote in her opinion that platforms create “expressive products” through content moderation and therefore are entitled to First Amendment protection. Justices Amy Coney Barrett and Ketanji Brown Jackson issued separate concurrences questioning what behavior should be considered expressive and how far First Amendment protections go. Justice Samuel Alito said there needs to be more transparency about how algorithms work before courts can make informed decisions.
The ACLU agrees certain types of platform behavior aren’t and shouldn’t be First Amendment-protected, staff attorney Vera Eidelman said. For example, her organization has won legal challenges against allegedly discriminatory Facebook practices related to housing services.
The Supreme Court is “frustrated” with “sweeping” First Amendment arguments that shield the tech industry from accountability, Eisenstat said: The industry is operating “with impunity.”
Platforms are stuck in an “impossible squeeze play,” DelBianco said: Half of Americans believe platforms moderate content too heavily, and the other half believe platforms don’t moderate enough. He called the Supreme Court decision a win for anyone who values free expression and opposes government overreach. Lower courts can now develop the factual record on the scope of the Texas and Florida laws: how broadly they apply and to which services.
Fordham University law professor Olivier Sylvain argued that based on the language of the Supreme Court opinion, the Texas and Florida statutes are probably unconstitutional as applied to YouTube and Facebook content feeds. Courts will have to go back and examine how each law might apply to different online functions, including online reviews and email services, he said. For example, there are open questions about a platform’s role when inappropriate content reaches minors on child-directed services.