Technology

AI-generated and fake online reviews are banned — here’s what we know about the new FTC rule


The Federal Trade Commission has announced a new rule that should theoretically make online user reviews of products more useful.

In a press release, the FTC announced a finalized set of guidelines aimed at taking down fraudulent or misleading product reviews on the internet. Among the practices that will be prohibited once the rule goes into effect are paid reviews and reviews generated by artificial intelligence. The maximum penalty is $51,744 per violation.

“Fake reviews not only waste people’s time and money, but also pollute the marketplace and divert business away from honest competitors,” FTC chair Lina Khan said. “By strengthening the FTC’s toolkit to fight deceptive advertising, the final rule will protect Americans from getting cheated, put businesses that unlawfully game the system on notice, and promote markets that are fair, honest, and competitive.”

What did the FTC ban?

The FTC’s press release included a helpful list of everything that is prohibited by this new rule:

  • Fake reviews and testimonials, whether written by humans or AI, that misrepresent either the author of the review or the author’s experience with the product

  • Paid reviews (either negative or positive)

  • Reviews and testimonials from company insiders with a conflict of interest

  • Company-controlled websites that falsely claim to offer independent reviews of that company’s own products

  • Using groundless legal threats or intimidation to have negative reviews removed

  • Buying or selling fake social media followers or views for a commercial purpose

That’s a fairly comprehensive list of the practices that plague online product reviews. Amazon alone removed 200 million fake reviews from its site in 2020, per TechCrunch. The new rule aims to make genuine reviews easier to identify and to reduce the prevalence of fake ones.
