Export Compliance Daily is a Warren News publication.
Riot Question Dodged

Big Tech CEOs Back CDA S. 230 Transparency

Facebook, Google and Twitter support Communications Decency Act Section 230 proposals to increase content moderation transparency, their respective CEOs, Mark Zuckerberg, Sundar Pichai and Jack Dorsey, told House Commerce Committee members Thursday during a virtual hearing. Noting Zuckerberg’s support for “thoughtful changes” to 230 (see 2103240076), Communications Subcommittee ranking member Bob Latta, R-Ohio, asked the Facebook chief for specific proposals. Zuckerberg supported two specific changes, saying Congress should be careful about removing protections for smaller companies.

Platforms should have to issue transparency reports that state the prevalence of harmful content in specific categories, said Zuckerberg, noting Facebook’s quarterly ones. They must have “effective systems” for removing “clearly illegal content” like sex trafficking, child exploitation and terrorism, he said. It would be reasonable to condition immunity for larger platforms on having a generally effective moderation system for illegal content, he said.

There are good plans for accountability and transparency in legislative proposals, said Pichai. Zuckerberg’s ideas on transparency are good, but it would be difficult to distinguish between large and small platforms, said Dorsey.

Zuckerberg and Pichai declined to give a yes-or-no answer when Communications Subcommittee Chairman Mike Doyle, D-Pa., asked whether the platforms bore any responsibility for the Jan. 6 Capitol riot. Zuckerberg said platforms are responsible for building “effective” moderation systems, and Pichai said it’s a complex question that can’t be answered with a yes or no. Dorsey said platforms bore some responsibility, but the broader ecosystem needs to be taken into account. Doyle thanked Dorsey, saying he agreed.

Primary responsibility lies with the people who took the action at the Capitol and spread the content, including the president with his rhetoric about a rigged election, said Zuckerberg. Doyle responded that platform algorithms supercharged actions and opinions on Jan. 6.

If tech companies choose not to remove harmful content, Congress will legislate to stop it, said Doyle in opening remarks. Latta focused his opening on Big Tech’s “ever-increasing censorship” of conservative voices. Consumer Protection Subcommittee Chair Jan Schakowsky, D-Ill., focused on disinformation and hate groups, hoping for bipartisan support for her Online Consumer Protection Act. Platforms have responsibilities to be stewards of their platforms and enforce their policies evenly, said Consumer Protection ranking member Gus Bilirakis, R-Fla.

Commerce Chairman Frank Pallone, D-N.J., promised lawmakers will move forward with legislation, saying platforms failed to change after fomenting insurrection at the Capitol. Ranking member Cathy McMorris Rodgers, R-Wash., focused on Big Tech’s failure to promote free speech while censoring political viewpoints. She asked why Big Tech still deserves liability protections while discussing the industry's impact on young users and citing related youth suicides.

Schakowsky asked the CEOs whether they think Section 230 should be included in international trade agreements, a practice that drew bipartisan opposition from the committee last session. The section plays a foundational role in the internet, but it should be updated to reflect modern reality, said Zuckerberg. There’s value in 230, but its evolution should be reflected in trade agreements, said Pichai. Dorsey said he didn’t understand the “full ramifications” of what she was asking. “Ratifying the liability shield in international agreements is a bad idea,” said Schakowsky.

Pallone claimed Zuckerberg's and Pichai's opening remarks gave the impression they’re not promoting extremism, when in fact they’re profiting from it. The chairman asked if YouTube’s recommendation algorithms are designed to keep users on the site. “That’s not the sole goal,” said Pichai. Pallone took that as a yes, saying YouTube amplifies extreme content.

YouTube isn’t doing enough about its recommendations, said Rep. Anna Eshoo, D-Calif. She noted her bill with Rep. Tom Malinowski, D-N.J., the Protecting Americans From Dangerous Algorithms Act. She said the bill would narrowly amend 230 so courts can examine the role of algorithmic amplification that leads to violence. She asked Pichai if Google would overhaul YouTube’s core recommendations. The platform overhauled them “substantially,” said Pichai.

Rep. Jerry McNerney, D-Calif., asked if the CEOs would oppose legislation that prohibits platforms from placing advertisements next to false or misleading information. That’s a nuanced issue, said Zuckerberg: Determining misinformation is a process that would need to be spelled out well in a law like that. Dorsey said he would oppose it until he saw requirements and ramifications. Pichai said the principle makes sense, but advertisers don’t want to be near content like that, so there’s already incentive.