‘Root of the Problem’

Durbin Eyes Section 230 Hearing After Meta Whistleblower Testimony

The Senate Judiciary Committee is scheduling a hearing on Section 230 and prospects for repealing the tech industry’s liability shield, ranking member Lindsey Graham, R-S.C., told us Tuesday.

His comments came after the Senate Privacy Subcommittee heard testimony from a former Meta employee who said CEO Mark Zuckerberg and other executives ignored recommendations for reducing mental health harms for young users. Chairman Dick Durbin, D-Ill., and Senate Privacy Subcommittee Chairman Richard Blumenthal, D-Conn., both expressed interest in repealing Section 230.

Prospects for repealing Communications Decency Act Section 230 have never been better, Graham told us: “It’s the root of the problem.” The “sooner” Judiciary holds a hearing on the topic, “the better,” he said. “Let [Big Tech] come in and account for what’s going on.” Durbin’s office didn’t comment Tuesday. Meta didn’t comment.

Sen. John Cornyn, R-Texas, cast doubt on the prospects of moving any meaningful tech-related legislation. “How many years have we been talking about [repealing Section 230]?” he asked. “As long as I can remember.” He noted the Judiciary Committee’s unanimous passage of six child safety-related bills. Senate Majority Leader Chuck Schumer, D-N.Y., “hasn’t brought them to the floor,” said Cornyn. “It’s a little hard to believe the sincerity of people’s speeches about social media and kids if they won’t bring the bills we pass to the floor.”

Senate Privacy Subcommittee ranking member Josh Hawley, R-Mo., said during Tuesday’s hearing that he plans to force a floor vote on legislation before the end of the calendar year. “Any senator can go to the floor and call up a piece of legislation and ask for a vote on it,” he said. “And I’m going to do it.” After the hearing, he told us a good place to start is the Strengthening Transparency and Obligation to Protect Children Suffering from Abuse and Mistreatment (Stop CSAM) Act (S-1199) (see 2310260044). The bill, which Durbin introduced, gained wide bipartisan support in May (see 2305110048). Hawley said the bill’s most important component is its private right of action, which would expose tech companies to lawsuits from individuals and their families. The threat of litigation is the only way to get platforms to alter their behavior, he said.

If Section 230 is repealed, Big Tech will flood Congress with legislative ideas, Graham said during the hearing. Blumenthal said he looks forward to discussing the idea further with Graham and Durbin. Sen. Sheldon Whitehouse, D-R.I., has already announced plans to partner with Graham on repeal legislation.

Arturo Bejar, a former security engineer and consultant at Meta, testified that Zuckerberg, former Meta Chief Operating Officer Sheryl Sandberg and Instagram Head Adam Mosseri repeatedly ignored his suggestions for improving social media interaction for teen users. Bejar worked at Facebook from 2009 to 2015 and returned as a consultant for Instagram from 2019 to 2021. His second stint with the company followed alleged sexual harassment that his then-14-year-old daughter experienced on Instagram. After two years away from the company, Bejar said, he concluded Meta can take concrete steps to address teen harms but that leadership has repeatedly ignored low-cost options for fixing the problems. Bejar said he often shared statistics with company leadership about the harm its platforms inflict on young users. For example, he told Mosseri that, on a weekly basis, around 7% of Facebook users encounter content promoting suicide and self-harm.

Hawley shared statistics from Bejar’s communications with executives: one in four children experiences sexual solicitation on Meta’s platforms on a regular basis, and one in eight experiences unwanted sexual advances on a weekly basis. Meta’s internal research showed its platforms make body image issues worse for one in three girls, said Hawley.

Meta has very narrow definitions of what constitutes harm, Bejar said. When users follow and interact with accounts controlled by abusers, the platform recommends similar content to them. When a user tries to report such accounts, as his daughter did, the company doesn’t respond. Because of those narrow definitions of harm, the number of accounts Meta acts on is a fraction of a percent, he said.

Meta employees set number-driven goals for increasing platform engagement, Bejar said, yet the company sets no comparable goals for reducing the number of users who experience unwanted sexual advances on its platforms. Durbin said the company is clearly profit-driven, and he agreed with Graham that the only way to incentivize change is to expose the industry to liability in court.