Export Compliance Daily is a Warren News publication.
‘Should-Have-Known’

Advertisers Target ‘Overly Broad’ FTC Rules for Online Reviews

The FTC’s proposed rules for moderating fake online reviews are overly broad and carry liability risks that will result in platforms censoring legitimate reviews on sites like Google, Facebook and Yelp, the Interactive Advertising Bureau said Tuesday.


The agency held an informal hearing on an NPRM for combating fake reviews (see 2306300029 and 2402080050). An administrative judge heard testimony from IAB, a watchdog for fake online reviews, and a group of academics.

The proposal is inconsistent with Communications Decency Act Section 230 and the First Amendment, and it will chill legitimate reviews, said IAB Executive Vice President Lartease Tiffith. The commission asserted without evidence that unintended censorship is “very unlikely” under the new rules, he said. Several provisions can be read to impose civil penalties even when a company doesn’t know a review or testimonial violates the rules, he said, noting the U.S. Chamber of Commerce and the Computer & Communications Industry Association have raised similar concerns. He spoke against consumer advocates’ recommendation that the FTC enforce a “should-have-known” standard, an overly broad term that will “sweep in” companies hosting legitimate speech.

Platforms like Google, Facebook and Yelp have repeatedly failed to remove phony reviews that have real-world consequences for users’ livelihoods, said Fake Review Watch founder Kathryn Dean, a former federal criminal investigator. Her organization investigates platform content moderation of bogus reviews. She testified that a Florida construction company was the subject of fake, negative reviews on Google and Facebook. Google ignored the business’s appeals until Fake Review Watch posted a video that led to a news report. On Facebook, 35 of 37 reviews identified as fake remained on the platform in January, Dean said.

Fake Yelp reviews are often solicited on Facebook and Instagram, she said: People are paid $20-$50 to post those reviews. Dean highlighted examples of companies changing review dates so the reviews appear more current to internet users.

Platforms are in the best position to act in these circumstances, though the response from major platforms has been “wholly inadequate,” she said. If Section 230 precludes holding third-party review sites accountable for false or deceptive content, the FTC needs to enforce rules requiring greater transparency, she said.

Dean recommended that platforms continue hosting fake and deceptive reviews with full disclosures, so consumers are aware of them. IAB believes fake reviews have “no place” on these platforms, said Tiffith: IAB members should be free to remove any content they determine to be in violation of their community standards.

Tiffith argued against recommendations that reviewers always be identified by name, location and other personal information. Requiring such identification raises privacy concerns and deters privacy-minded individuals from leaving legitimate reviews, he said. Ben Beck, Brigham Young University assistant professor of marketing, told the panel that research shows anonymous reviewing spurs more negative reviews on platforms. At the same time, he said, identifying reviewers with personal information carries risks of platform bias and privacy breaches. Ultimately, he said, legislation and new rules are needed to increase trust in online review platforms.