Onlinecensorship.org Finds Growing Frustration With Social Media Content Moderation Policies
Reports of social media censorship from Facebook, Google, Instagram and Twitter indicate that users are becoming increasingly frustrated with those platforms' content moderation policies, Onlinecensorship.org reported Wednesday. The Electronic Frontier Foundation/Visualizing Impact project based its report on an analysis of user-generated content takedown reports between April and November.
About 76 percent of 230 reports of social media content takedowns between March and October concerned Facebook, while 17 percent involved Twitter and about 4 percent concerned Instagram, the report said. Facebook this year faced criticism from congressional lawmakers and others over claims the website censored conservative news, as well as over the publication of several fake news stories (see 1605100032, 1605240059, 1605110048 and 1610310038). Onlinecensorship.org researchers also analyzed content takedowns reported by Google+ and YouTube users. The report said 36 percent of the reports related to account shutdowns; about 26 percent involved takedown of a post, 19 percent a photo and 6 percent a video, Onlinecensorship.org said. Most users don't have a clear understanding of why their content was removed, with only 60 reports providing a reason, the project said. Election-related censorship complaints in particular showed that users wanted to speak their minds about the presidential contest between Republican President-elect Donald Trump and Democratic presidential nominee Hillary Clinton. “These companies have enormous impact on the public sphere, yet they are still private entities with the ability to curate the information we see and the information we don’t see at their sole discretion,” Onlinecensorship.org co-founder Jillian York said in a news release. “The user base is what powers these social media tools, yet users are feeling like they don’t have any control or understanding of the system.” Onlinecensorship.org recommended that social media platforms create best practices for content moderation, including a commitment to transparency in how such policies are enforced. Researchers also recommended improving the systems for appealing content takedown decisions made in error. Facebook, Google, Instagram and Twitter didn't immediately comment.