by Jabio
1 September 2023
7 min read

Google’s New Policy on AI-Generated Reviews

Let’s cut to the chase: If you’re thinking about boosting your online reviews by using artificial intelligence or other automation to generate them, forget it.

Google has updated its Merchant Center policy for product ratings, which opens with “We don’t allow spam” and goes on to classify AI-generated or automated reviews as spam. Here’s the new policy:

We don’t allow reviews that are primarily generated by an automated program or artificial intelligence application. If you have identified such content, it should be marked as spam in your feed using the <is_spam> attribute.
Google Merchant Center

Google wants to make sure its users get authentic reviews from real customers, so posting automated reviews is a violation of its policy. Here’s where it gets tricky: If you are taking reviews that have been left for you and posting them to your Merchant Center feed, you can be held responsible if they have been generated by AI or other automation. If they are, you need to add that <is_spam> attribute, or you will fall outside of Google’s guidelines.
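If you publish reviews to a Merchant Center product ratings feed, compliance comes down to setting the <is_spam> attribute on any review you believe was primarily generated by AI or automation. Here is a minimal Python sketch of that idea. Aside from <is_spam>, which Google’s policy names explicitly, the element names (review, review_id, content) and the screening flag are illustrative assumptions; consult Google’s Product Reviews Feed documentation for the exact schema your feed must follow.

    # Minimal sketch: flagging a suspected AI-generated review as spam
    # before it is written into a Merchant Center product reviews feed.
    # Only <is_spam> comes from Google's quoted policy; the surrounding
    # element names are illustrative placeholders.
    import xml.etree.ElementTree as ET


    def build_review_element(review_id: str, content: str, suspected_ai: bool) -> ET.Element:
        """Build one review entry, marking it as spam if it appears AI-generated."""
        review = ET.Element("review")
        ET.SubElement(review, "review_id").text = review_id
        ET.SubElement(review, "content").text = content

        if suspected_ai:
            # Per Google's policy, reviews primarily generated by AI or
            # other automation must be marked as spam in the feed.
            ET.SubElement(review, "is_spam").text = "true"

        return review


    if __name__ == "__main__":
        entry = build_review_element(
            review_id="12345",
            content="Spectacular service that yielded fantastic results.",
            suspected_ai=True,  # e.g., flagged by your own screening process
        )
        print(ET.tostring(entry, encoding="unicode"))

However you decide a review is automated, the point is that the flag travels with the review in your feed rather than leaving it to Google’s enforcement systems to catch later.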

You should stay away from using AI tools to create images that accompany reviews, too. Google says that if an image is flagged for a policy violation, the review will be blocked.

Google Enforcement Actions for AI-Generated Reviews

Google says it will use a combination of computer algorithms and human reviewers to make sure reviews and content follow its guidelines. When the algorithms suspect AI-generated content, that content undergoes further analysis. Cases may also be evaluated by specialists who are trained to detect fraudulent or automated content.

Google is poised to take action on reviews that go against its policies. Penalties can include:

  • Disapproving reviews
  • Issuing warnings
  • Suspending accounts

Repeat violators may be subject to more serious penalties.

Google on AI

Google has generally asserted that AI-generated content outside of reviews is OK, as long as it is valuable and helpful. “It’s important to recognize that not all use of automation, including AI generation, is spam,” Google posted on its Search Central blog, citing examples of how automation has been used to provide helpful content like sports scores or weather forecasts.

So, if the content that’s been generated by AI provides solid and accurate information, Google is OK with it.

To be clear, here’s how Google phrased it:

“Appropriate use of AI or automation is not against our guidelines.”
– Google Search Central blog

The key here is appropriate use. If automation is used to create content primarily intended to manipulate search rankings, it’s not appropriate. The latest policy clarification classifies automated and AI-generated reviews as spam.

The Challenge of Fake and AI-Generated Reviews

Reviews are important to buyers: 93% say online reviews have an impact on their purchasing decisions. But that only works if the reviews are real. Sitejabber research reveals that 80% of consumers say they’ve seen fake reviews, and about half say they’ve seen multiple examples. And consumers care deeply about whether reviews are authentic.

In the past, most fake reviews were easy to spot. Poor grammar, over-the-top praise, a lack of detail, and obscure reviewer identities were warning signs. In fact, Google said that in 2022 it removed or blocked more than 115 million reviews and 20 million Google Business Profiles that were fake.

But AI is changing that. Large language models trained on millions of data points, including product descriptions and product reviews, have become strikingly good at generating real-sounding content, especially with a little guidance and direction.

With just a little prompting, ChatGPT produced these reviews…

Restaurant

The vibe was nice, but the food was just okay, and the service was slow and messed up my order at first. Kind of a letdown, to be honest.

SaaS provider

The pricing seemed reasonable, but the software usability was a bit of a struggle with a steep learning curve. It’s got potential, but the user experience needs improvement.

Marketing company

They truly delivered. Not only did they work within my budget, but their spectacular service also yielded fantastic results. Highly satisfied with their performance.

While you might spot the “tells” by taking a close look yourself, the AI detection tools on the market so far are spotty at best at determining whether something was written by a human or generated by a computer. They are unreliable enough that OpenAI, the company behind ChatGPT, pulled its own AI detector from the market because it felt the tool couldn’t accurately identify AI-generated content.

The Federal Trade Commission Proposal for Banning Fake Reviews

It’s not just Google that’s speaking out about the problems of fake reviews and automated content. The Federal Trade Commission (FTC) has proposed a rule aimed at stopping the flow of fake reviews, the suppression of honest negative reviews, and payment for positive reviews. The FTC says each of these practices deceives potential customers who are looking for real feedback to make decisions.

If these rules are put in place, companies could face civil penalties for knowingly violating them.

The proposed rule would prohibit posting reviews or testimonials attributed to someone who does not exist or who has no actual experience with the product or service. It would also prohibit posting reviews that purposely misrepresent the reviewer’s experience.

In most cases, though, it appears the burden falls on the company to be the arbiter of what’s fake, automated, or genuine. Under the FTC’s proposal, businesses that know, or should know, that reviews are inauthentic would have a proactive obligation to prevent them from being displayed.

Companies would also be prohibited from:

  • Obtaining fake consumer reviews or testimonials
  • Repurposing reviews written for one product or service so they appear to be for a different product or service (so-called review hijacking)
  • Buying reviews from third parties or providing compensation or incentives that are based on generating reviews
  • Having company officers or managers write reviews without disclosing their company relationships
  • Using legal threats, false accusations, or intimidation to prevent negative reviews or get consumers to remove negative reviews

The FTC is taking comments on its proposed rulemaking through September 29, 2023.

How Sitejabber Can Help

All of this emphasizes the importance of getting authentic reviews for your products or services to avoid a potentially sticky (and expensive) situation. 

When you work with a reputable company like Sitejabber, you can grow your business using reviews from real buyers. Easily collect, monitor, and showcase legitimate reviews on your website and on the third-party sites that are most important and influential to your customers. 

Our platform, Jabio, can help you gather more reviews by reaching out to your customers at multiple touchpoints — from online checkout surveys to customized batch emails and text messaging (SMS). This improves the volume of reviews while helping ensure they come from real customers. An all-in-one dashboard helps you monitor your reviews and respond quickly, thanking users for leaving a positive review or addressing negative ones in a positive manner. You also get review insights to help uncover buyer sentiment, trending topics, and potential problems that need mitigation.

If you’re interested in learning more about how Jabio can elevate your company’s online presence and supercharge your revenues using legitimate online reviews, contact us today.
