Communications Decency Act Section 230 isn’t a “license to do whatever one wants online,” the 4th Circuit U.S. Court of Appeals ruled last week, reversing a district court decision and finding a website liable for selling misleading and incomplete information used in background checks.
Section 230
Communications Litigation Today is providing readers with the top stories from last week in case you missed them. Each can be found by searching on its title or by clicking on the hyperlinked reference number.
Twitter knowingly disseminated videos of child sex abuse material (CSAM) and profited from sex-trafficking activity, so it can’t claim immunity under Communications Decency Act Section 230, trafficking victims argued Friday in 22-15103 before the 9th Circuit U.S. Court of Appeals. The 9th Circuit recently sided with Reddit in a similar case about hosting child porn (see 2210260073). The victims, John Doe #1 and John Doe #2, notified Twitter that the CSAM was being shared on the platform, the plaintiffs argued in a reply brief on cross-appeal: Once Twitter was aware of the material, it could have removed it and reported it to the National Center for Missing & Exploited Children, or it could “deliberately join in the exploitation” and continue to profit from hosting the material. “Twitter elected to profit and directly engage in the ongoing trafficking of the minor children,” the plaintiffs argued. No federal appellate court has ever held that Section 230 “provides civil immunity for knowing violations of federal or state laws prohibiting CSAM, and only a handful of district courts have done so,” they said.
Victims suing Reddit for allegedly profiting from child porn failed to plead that the website “knowingly benefited” from facilitating sex trafficking, the 9th Circuit U.S. Court of Appeals ruled this week, citing Reddit’s immunity under Communications Decency Act Section 230.
TikTok is immune from liability for the death of a 10-year-old girl who strangled herself after watching “Blackout Challenge” videos on the social media app, a federal judge ruled Tuesday, citing Communications Decency Act Section 230. Tawainna Anderson sued the platform, claiming it was responsible for the death of her daughter, Nylah. The circumstances are “tragic,” but because Anderson sought to hold TikTok liable as a publisher of third-party content, the platform is immune under Section 230, Judge Paul Diamond wrote in his memorandum granting TikTok’s motion to dismiss on “immunity grounds.” TikTok didn’t create the challenge but only made it “readily available” on its app, Diamond wrote: TikTok’s algorithm was a “way to bring the challenge to the attention of those likely to be most interested in it.” Section 230 protects the platform for publishing others’ works, he said: “The wisdom of conferring such immunity is something properly taken up with Congress, not the courts.” FCC Commissioner Brendan Carr drew attention to the case, tweeting that TikTok used Nylah’s personal information to serve her the blackout challenge video encouraging users to strangle themselves: “She did that with a purse strap & dies. Court accepts all this as true & rules that § 230 shields TikTok from liability.”
When the Supreme Court takes up two related Communications Decency Act Section 230 cases this term, “the questions will be difficult and the stakes enormous,” said Miller Nash partner Robert Cumbow in an analysis Monday. Many over the past quarter century have credited Section 230 “with enabling the internet to grow and flourish,” said Cumbow. But others say that “reconsideration of the reach of Section 230 is long overdue,” he said. Legal experts told us earlier this month that SCOTUS will almost undoubtedly recast or cut back the broad immunity that interactive online platforms enjoy via the Section 230 liability shield (see 2210110030). Cumbow said that waiting in the wings is the 3rd U.S. Circuit Court of Appeals case of Hepp v. Facebook, in which a misappropriated photograph of Philadelphia news anchor Karen Hepp found its way into numerous ads on Facebook and other online platforms, promoting such products as dating services and sexual performance enhancement. Plaintiff Hepp claims Facebook “is liable for violating her publicity rights because Section 230 expressly excludes intellectual property claims,” he said. Many states, including Hepp’s home state of Pennsylvania, “regard publicity rights as intellectual property, leading the Third Circuit to hold that Facebook is not shielded from Hepp’s claims” via Section 230, he said. The plaintiffs in Hepp and the two related social media cases all maintain that under the current interpretation of Section 230 they “will have no redress for wrongs committed against them and perpetuated by the companies that control web platforms,” said Cumbow.
The Supreme Court should either consider whether provisions of Florida’s disputed social media law are preempted by Section 230 or vacate the decision of the 11th U.S. Circuit Court of Appeals with a directive to consider whether the law is preempted, said an amicus brief from Reynaldo Gonzalez and Mehier Taamneh, posted Monday in docket 22-277 on the petition for a writ of certiorari in Moody v. NetChoice. The amici are plaintiffs in cases against Google and Twitter over the murder of their relatives by ISIS, which they say was caused in part by content hosted on tech platforms. Florida’s social media law would limit the ability of social media companies “to remove, or refuse to recommend, posted material likely to incite terrorism or violence,” the brief said. A separate brief from public interest law firm Freedom X on behalf of Florida argues that removing speech isn’t protected by the First Amendment. SCOTUS should grant cert and “maintain the longstanding distinction between adding speech and subtracting it,” said Freedom X.
Outside counsel to Google consented to the filing of amicus briefs at the Supreme Court in Reynaldo Gonzalez v. Google, said the lawyer's letter Tuesday in docket 21-1333. The case is one of two related appeals of appellate court decisions on social media companies' legal protections in which SCOTUS granted certiorari Oct. 3 (see 2210110030). The petitioner is the estate of Nohemi Gonzalez, a U.S. citizen who was killed in ISIS attacks in 2015. The petitioner asked SCOTUS to revisit the 9th Circuit's holding that the Communications Decency Act's Section 230 protects YouTube's algorithm for recommending videos.
Tech companies “are largely free to create and operate online platforms without legal consequences for the negative outcomes of their products” because of Communications Decency Act Section 230, said an investigative report Tuesday from the Office of the New York Attorney General on the role of online platforms in the May 14 mass shooting in Buffalo that killed 10 and wounded three. Section 230 allows “too much legal immunity” for platforms, even “when a platform allows users to post and share unlawful content,” it said.