TikTok COO Details Beijing Operation, Denies Chinese Data Access
TikTok has employees in Beijing as do many other global tech companies, TikTok Chief Operating Officer Vanessa Pappas told the Senate Homeland Security Committee during a hearing Wednesday.
Ranking member Rob Portman, R-Ohio, repeatedly pressed Pappas over the company’s ties to Beijing and China-based employees’ access to user data (see 2207150064). TikTok doesn’t operate in China, but it does have an office and employees in Beijing, said Pappas. TikTok isn’t subject to Chinese surveillance laws because the company is incorporated in the U.S. and complies with local law, she said. She denied the Chinese Communist Party has access to U.S. user data, saying “under no circumstances” would the data be given to the Chinese government. Portman said her remarks don’t “square” with what Congress knows about Chinese surveillance law. He asked Pappas to commit TikTok to cutting off all data access for China and China-based employees. Pappas said TikTok is committed to its final agreement with the U.S. government and the Committee on Foreign Investment in the U.S. on data security, which “will satisfy all national security concerns.”
The Chinese government can demand TikTok officials turn over data on any user in the world, and those officials face jail time if they refuse, testified Geoffrey Cain, Lincoln Network senior fellow-critical emerging technologies. He called it a “documented legal situation” TikTok can’t avoid simply by pointing to the company’s office locations: “This is a red herring to distract from the issue at hand.”
Portman and Senate Homeland Security Committee Chairman Gary Peters, D-Mich., focused on the need for social media platforms to be more transparent about how their algorithms shape public discourse, amplify extremist content and radicalize users. Peters noted the former tech executives on the hearing’s first panel testified that profit incentives outweigh incentives to prioritize safety and security. Portman advocated legislation to create more transparency about social media algorithms so researchers can evaluate how platforms affect people, plugging the Platform Transparency and Accountability Act, his bill with Sen. Chris Coons, D-Del. (see 2112090068). Coons hosted a Senate Privacy Subcommittee hearing Wednesday on protecting American data from “hostile foreign powers.”
Former Twitter Senior Vice President-Engineering Alex Roetter recommended Congress create an independent group of researchers and data scientists who are allowed to request data from platforms. The “attention game” social media platforms play is a “race to the bottom,” said Roetter: Companies are incentivized to build more addictive products or risk losing user engagement to rivals. There’s no way to optimize for anything other than engagement and revenue, he said.
Meta works hard to take down extremist content, publish results and work with law enforcement, said Chief Product Officer Chris Cox. Cain said Meta is probably the most transparent company in the group but not by much, giving the company a D grade. Like Cox, YouTube Chief Product Officer Neal Mohan cited his company’s efforts to remove harmful content. YouTube elevates authoritative sources, including mainstream news outlets, and works hard to “make sure” extremist content “has no home” on the platform, he said. YouTube’s business model relies on content creators and advertisers who don’t want to be associated with hateful and extremist content, he said.