Is Google Broken Or Are Googlers Right That It’s Working Fine? via @sejournal, @martinibuster

Recent statements by Googlers indicate that the algorithm is working the way it’s supposed to and that site owners should just focus more on their users and less on trying to give the algorithm what it’s looking for. But the same Googlers also say that the search team is working on a way to show more good content.

That can seem confusing because if the algorithm isn’t broken then why are they also working on it as if it’s broken in some way? The answer to the question is a bit surprising.

Google’s Point Of View

It’s important to try to understand how search looks from Google’s point of view. Google makes that easier with its Search Off The Record (SOTR) podcast, which is often just Googlers talking about search from their side of the search box.

And in a recent SOTR podcast, Googlers Gary Illyes and John Mueller talked about how something inside Google might break, but from their side of the search box it’s a minor thing, not worth making an announcement. Then people outside of Google notice that something’s broken.

It’s in that context that Gary Illyes made the following statement about deciding whether to “externalize” (communicate) that something is broken.

He shared:

“There’s also the flip side where we are like, “Well, we don’t actually know if this is going to be noticed,” and then two minutes later there’s a blog that puts up something about “Google is not indexing new articles anymore. What up?” And I say, “Okay, let’s externalize it.””

John Mueller then asks:

“Okay, so if there’s more pressure on us externally, we would externalize it?”

And Gary answered:

“Yeah. For sure. Yeah.”

John follows up with:

“So the louder people are externally, the more likely Google will say something?”

Gary then answered yes and no because sometimes nothing is broken and there’s nothing to announce, even though people are complaining that something is broken.

He explained:

“I mean, in certain cases, yes, but it doesn’t work all the time, because some of the things that people perceive externally as a failure on our end is actually working as intended.”

So okay, sometimes things are working as they should and what’s broken is on the site owner’s side, and maybe they can’t see it for whatever reason. You can tell because people sometimes tweet about getting caught in an update that didn’t happen. For example, some people thought their sites were mistakenly caught in the site reputation abuse crackdown because their sites lost rankings at the same time that the manual actions went out.

The Non-Existent Algorithms

Then there are the people who continue to insist that their sites are suffering from the HCU (the helpful content update) even though there is no HCU system anymore.

SearchLiaison recently tweeted about the topic of people who say they were caught in the HCU.

“I know people keep referring to the helpful content system (or update), and I understand that — but we don’t have a separate system like that now. It’s all part of our core ranking systems: https://developers.google.com/search/help/helpful-content-faq”

The signals of the HCU are now part of the core algorithm, which consists of many parts; there is no longer that one thing that used to be the HCU. So the algorithm is still looking for helpfulness, but there are other signals as well, because a core update changes many things at once.

So it may be that people should focus less on helpfulness-related signals and be more open to a wider range of issues, instead of fixating on one thing (helpfulness) that might not even be the reason a site lost rankings.

Mixed Signals

But then there are the mixed signals, where Googlers say that things are working the way they should but that the search team is working on showing more sites, which seems to imply the algorithm isn’t working the way it should.

On June 3rd, SearchLiaison discussed how some people who claim they have algorithmic actions against them actually don’t. The context was a June 3rd tweet by someone who said they were hit by an algorithm update on May 6th and didn’t know what to fix because they didn’t receive a manual action. Please note that the tweet has a typo where they wrote June 6th when they meant May 6th.

The original June 3rd tweet refers to the site reputation abuse manual actions:

“I know @searchliaison says that there was no algorithmic change on June 6, but the hits we’ve taken since then have been swift and brutal.

Something changed, and we didn’t get the luxury of manual actions to tell us what we did wrong, nor did anyone else in games media.”

Before we get into what SearchLiaison said, the above tweet could be seen as an example of focusing on the wrong “signal” or thing; it might be more productive to be open to a wider range of possible reasons why the site lost rankings.

SearchLiaison responded:

“I totally understand that thinking, and I won’t go back over what I covered in my long post above other than to reiterate that 1) some people think they have an algorithmic spam action but they don’t and 2) you really don’t want a manual action.”

In the same response, SearchLiaison left the door open to the possibility that search could do better and said that they’re researching how to do that.

He said:

“And I’ll also reiterate what both John and I have said. We’ve heard the concerns such as you’ve expressed; the search team that we’re both part of has heard that. We are looking at ways to improve.”

And it’s not just SearchLiaison leaving the door open to the possibility of something changing at Google so that more sites are shown; John Mueller said something similar last month.

John tweeted:

“I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.”

SearchLiaison said that they’re looking at ways to improve, and Mueller said they’re evaluating how sites “can/will improve in Search for the next update.” So, how does one reconcile that something is working the way it’s supposed to and yet there’s room for improvement?

Well, one way to look at it is that the algorithm is functional and satisfactory, but not perfect. And because nothing is perfect, there is room for refinement and opportunities to improve, which is the case with everything, right?

Takeaways:

1. It may be helpful to consider that anything can be refined and made better, and that something with room for improvement is not necessarily broken, because nothing is perfect.

2. It may also be productive to consider that helpfulness is just one signal out of many, and that what looks like an HCU issue might not be that at all, in which case a wider range of possibilities should be considered.

Featured Image by Shutterstock/ViDI Studio

Google Warns Of Quirk In Some Hreflang Implementations via @sejournal, @martinibuster

Google updated their hreflang documentation to note a quirk in how some websites are using it, which (presumably) can lead to unintended consequences in how Google processes it.

hreflang Link Tag Attributes

The link element is an HTML tag that can be used to communicate data to the browser and search engines about linked resources relevant to the webpage. There are multiple kinds of data that can be linked to, such as CSS, JS, favicons, and hreflang data.

In the case of the hreflang attribute (an attribute of the link element), the purpose is to specify the language versions of a page. All of these link elements belong in the <head> section of the document.
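For example, a page available in English and German versions might declare both alternates in its <head> section like this (the URLs are placeholders):

<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />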

Quirk In hreflang

Google noticed that an unintended behavior happens when publishers combine multiple alternate annotations in one link element, so they updated the hreflang documentation to make this quirk more broadly known.

The changelog explains:

“Clarifying link tag attributes
What: Clarified in our hreflang documentation that link tags for denoting alternate versions of a page must not be combined in a single link tag.

Why: While debugging a report from a site owner we noticed we don’t have this quirk documented.”

What Changed In The Documentation

There was one change to the documentation that warns publishers and SEOs to watch out for this issue. Those who audit websites should take notice of this.

This is the old version of the documentation:

“Put your <link> tags near the top of the <head> element. At minimum, the <link> tags must be inside a well-formed <head> section, or before any items that might cause the <head> to be closed prematurely, such as <body> or a tracking pixel. If in doubt, paste code from your rendered page into an HTML validator to ensure that the links are inside the <head> element.”

This is the newly updated version:

“The <link> tags must be inside a well-formed <head> section of the HTML. If in doubt, paste code from your rendered page into an HTML validator to ensure that the links are inside the <head> element. Additionally, don’t combine link tags for alternate representations of the document; for example don’t combine hreflang annotations with other attributes such as media in a single <link> tag.”
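To illustrate the quirk with a hypothetical example (not taken from Google’s documentation): the problematic pattern crams two alternate annotations into one tag, while the safe pattern gives each annotation its own link tag.

Don’t combine annotations in a single tag:

<link rel="alternate" hreflang="de" media="only screen and (max-width: 640px)" href="https://m.example.com/de/" />

Use a separate link tag for each annotation:

<link rel="alternate" hreflang="de" href="https://example.com/de/" />
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/" />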

Google’s documentation didn’t say what the consequence of the quirk is, but if Google was debugging it, then it must have caused some kind of issue. It’s a seemingly minor thing that could have an outsized impact.

Read the newly updated documentation here:

Tell Google about localized versions of your page

Featured Image by Shutterstock/Mix and Match Studio

Want More Clicks? Use Simple Headlines, Study Advises via @sejournal, @MattGSouthern

A new study shows that readers prefer simple, straightforward headlines over complex ones.

The researchers, Hillary C. Shulman, David M. Markowitz, and Todd Rogers, conducted over 30,000 experiments with The Washington Post and Upworthy.

They found that readers are likelier to click on and read headlines with common, easy-to-understand words.

The study, published in Science Advances, suggests that people are naturally drawn to simpler writing.

In the crowded online world, plain headline language can help grab more readers’ attention.

Field Experiments and Findings

Between March 2021 and December 2022, researchers conducted experiments analyzing nearly 9,000 tests involving over 24,000 headlines.

Data from The Washington Post showed that simpler headlines had higher click-through rates.

The study found that using more common words, a simpler writing style, and more readable text led to more clicks.

In the screenshot below, you can see examples of headline tests conducted at The Washington Post.

Screenshot from: science.org, June 2024.

A follow-up experiment looked more closely at how people process news headlines.

This experiment used a signal detection task (SDT) to find that readers more closely read simpler headlines when presented with a set of headlines of varied complexity.

The finding that readers engage less deeply with complex writing suggests that simple writing can help publishers increase audience engagement even for complicated stories.

Professional Writers vs. General Readers

The study revealed a difference between professional writers and general readers.

A separate survey showed that journalists didn’t prefer simpler headlines.

This finding is important because it suggests that journalists may misjudge how their audiences will react to and engage with the headlines they write.

Implications For Publishers

As publishers compete for readers’ attention, simpler headline language could create an advantage.

Simplified writing makes content more accessible and engaging, even for complex articles.

To show how important this is, look at The Washington Post’s audience data from March 2021 to December 2022. They averaged around 70 million unique digital visitors per month.

If each visitor reads three articles, a 0.1 percentage-point increase in click-through rate (from 2.0% to 2.1%) means roughly 200,000 more readers engaging with stories due to the simpler language.
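The back-of-the-envelope arithmetic behind that figure, using the audience numbers above, works out like this:

70,000,000 visitors × 3 articles = 210,000,000 potential story impressions per month
210,000,000 × 0.1% = 210,000 additional clicks, or roughly 200,000 more readers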

See also: Title Tag Optimization: A Complete How-to Guide

Why SEJ Cares

Google’s recurring message to websites is to create the best content for your readers. This study helps demonstrate what readers want from websites.

While writers and journalists may prefer more complex language, readers are more drawn to simpler, more straightforward headlines.

How This Can Help You

Using simpler headlines can increase the number of people who click on and read your stories.

The study shows that even a tiny increase in click-through rates means more readers.

Writing simple headlines also makes your content accessible to more people, including those who may not understand complex terminology or jargon.

To implement this, test different headline styles and analyze the data on what works best for your audience.


Featured Image: marekuliasz/Shutterstock

David Vs. Goliath [Part 2]: Algorithm Updates Have Become The Biggest Risk In SEO via @sejournal, @Kevin_Indig


Taking a break from analyzing leaked Google ranking factors and AI Overviews, let’s come back to the question, “Do big sites get an unfair advantage in Google Search?”

In part 1 of David vs. Goliath, I found that bigger sites indeed grow faster than smaller sites, but likely not because they’re big but because they’ve found growth levers they can pull over a long time period.

  • The analysis of 1,000 winner and 1,000 loser sites shows that communities have gained disproportional SEO visibility over the last 12 months, while ecommerce retailers and publishers have lost the most.
  • Backlinks seem to have lost weight in Google’s ranking systems over the last 12 months, even though overperformers still have stronger backlink profiles than underperformers.
  • However, newcomer sites still have good chances to grow big, but not in established verticals.

The correlation between SEO visibility and the number of linking domains is strong but was higher in May 2023 (0.81) than in May 2024 (0.62). Sites that lost organic traffic showed lower correlations (0.39 in May 2023 and 0.41 in May 2024). Even though sites that gained organic visibility have more backlinks, the signal seems to have weakened significantly over the last 12 months.

In the second part, I share more insights from the data about how and when sites grow or decline in SEO traffic. My goal is to colorize the modern, working approach to SEO and contrast the old, dying approach.

Insights:

  • Most sites lose during core algorithm updates but win outside of them.
  • Most sites grow linearly, not “exponentially.”
  • Tool + programmatic SEO works well.
  • High ad load and confusing design work poorly.
Image Credit: Lyna ™

Hard(Core) Algorithm Updates

During almost half of the year, you can expect at least one Google update to be rolling out.

According to the official page for ranking “incidents,” the years 2021-2023 had an average of 170 days per year with Google updates rolling out.

Days of the year with Google updates (Image Credit: Kevin Indig)

Keep in mind that roll-out days reflect official updates and the windows in which new updates roll out. According to Danny Sullivan, the impact of updated ranking systems can take effect well after roll-out, when new data is fed into the system.

So for the folks saying “it’s a never-ending update” or “the update isn’t over”: search is always being updated.

Image Credit: Kevin Indig

As I wrote in Hard(Core) Algorithm Updates, algo updates become a growing challenge for Google as a user acquisition channel:

No platform has as many changes of requirements. Over the last 3 years, Google launched 8 Core, 19 major and 75-150 minor updates. The company mentions thousands of improvements every year.

Every platform improves its algorithms, but not as often as Google. Instagram launched 6 major algorithm changes over the last 3 years. LinkedIn launched 4.

The top 1,000 domains with the biggest traffic losses reflect the risk: When a domain loses organic traffic, it’s most likely due to a Google core algorithm update. A few examples:

  • In SaaS, applicant tracking software company Betterteam was caught by the September 2023 helpful content update and the October 2023 core update, likely because of too much programmatic “low-quality” content.
  • Hints from the Google ranking factor leak indicate a connection between brand searches, backlinks, and content. Whether that’s true or not and if we can influence it remains to be seen, but for Betterteam, brand searches have stagnated since March 2022 while the number of pages has been growing.
Image Credit: Kevin Indig
  • In ecommerce, big US retailers across all verticals (fashion, home, mega-retailers) have been on the decline since the August 2023 core update. More about that in a moment.
Image Credit: Kevin Indig
  • In publishing, sites like Movieweb have also started declining since August 2023. In this case, it’s interesting how Screenrant picks up market share but also dips during the March 2024 core update.
Image Credit: Kevin Indig

Overlapping algorithm updates make it near-impossible to understand what happened, which is a reverse engineering problem for SEO pros and also a guideline issue for anyone responsible for organic traffic. To understand what guideline you violated, you need to be able to understand what happened.

S-Curves Are Rare

It’s rare for a domain to grow exponentially (in practice, sigmoidally, along an S-curve), and the average of the top 1,000 domains by organic traffic growth shows linear growth as well. The upside is that linear growth is more predictable.

Image Credit: Kevin Indig

A great example of the modern approach to SEO in SaaS is the AI tool Quillbot. With a simple but effective design, the tool makes it easy for users to solve issues instead of reading about how to solve them.

Quillbot’s owner, Learneo, which also owns Course Hero, saw consistent growth outside of Google algorithm updates. Like German startup DeepL, Quillbot has programmatic pages for translation queries like “translate Arabic to Hindi” or “translate German to English.” The combination of programmatic SEO and a tool works like a charm.

Public relations management tool Muck Rack has programmatic pages for every media outlet (50,000+) in its /media-outlet/ folder, like muckrack.com/media-outlet/fintechzoom. Each page ranks for the outlet’s name and has a description, a few details about the company, and the latest press releases for fresh content. Despite not being a tool, the programmatic play works, and Google deems it valuable.

In ecommerce, brands saw the strongest growth.

A few examples:

  • Kay Jewelry (outlet).
  • Lenovo.
  • Steve Madden.
  • Sigma (photo).
  • Billabong.
  • Coleman.
  • Hanes.
  • Etc.

Obviously, there are exceptions on both fronts: brands that lost organic traffic and retailers that gained. We need more data, but it seems that Google has favored brands in the search results over retailers since August 2023.

In publishing, garagegymreviews.com is one of the few affiliate sites that has seen strong growth. It’s important to point out that the main channel of the business is YouTube.

Another example is fodors.com, a travel site that grew predominantly because of its community.

Line graph showing the growth of fodors.com and its subdomains from July 2022 to May 2024; fodors.com shows the highest growth, followed by the community, world, and news subdomains. Data by Ahrefs. Image Credit: Kevin Indig

Algo Thrashing

We want a better Google, and Google seems to have taken notice. The response is stronger algorithms that can thrash sites into oblivion and have become the biggest risk in SEO.

At the same time, I haven’t noticed many sites growing due to algorithm updates, meaning the positive effect is indirect: Competitors might be losing traffic.

The big question to finish with is how to mitigate the risk of being hit by an algorithm update. While there is absolutely no guarantee, we can agree on what sites that are unaffected by updates have in common:

  • Allowing Google to index only high-quality pages.
  • Investing in content quality with expert writers and high effort (research, design).
  • Offering good design that makes content easy to read and answers quick to find.
  • Reducing ads as much as possible.

Google Search Status Dashboard


Google Quietly Ends Covid-Era Rich Results via @sejournal, @martinibuster

Google has removed the Covid-era structured data associated with the Home Activities rich results, which since August 2020 had allowed online events to be surfaced in search, publishing a mention of the removal in the search documentation changelog.

Home Activities Rich Results

The structured data for the Home Activities rich results allowed providers of online livestreams, pre-recorded events and online events to be findable in Google Search.

The original documentation has been completely removed from the Google Search Central webpages and now redirects to a changelog notation explaining that the Home Activities rich result is no longer available for display.

The original purpose was to allow people to discover things to do from home while in quarantine, particularly online classes and events. Google’s rich results surfaced details of how to watch, descriptions of the activities, and registration information.

Providers of online events were required to use Event or Video structured data. Publishers and businesses who have this kind of structured data should be aware that this rich result is no longer surfaced. However, it’s not necessary to remove the structured data if doing so is a burden; it doesn’t hurt anything to publish structured data that isn’t used for rich results.
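For context, here is a minimal sketch of the kind of Event markup an online event provider might have used; the name, date, and URLs are placeholders, not an example from Google’s documentation:

{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Beginner Yoga Livestream",
  "startDate": "2020-09-01T19:00:00-05:00",
  "eventAttendanceMode": "https://schema.org/OnlineEventAttendanceMode",
  "eventStatus": "https://schema.org/EventScheduled",
  "location": {
    "@type": "VirtualLocation",
    "url": "https://example.com/livestream"
  }
}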

The changelog for Google’s official documentation explains:

“Removing home activity documentation
What: Removed documentation on home activity structured data.

Why: The home activity feature no longer appears in Google Search results.”

Read more about Google’s Home Activities rich results:

Google Announces Home Activities Rich Results

Read the Wayback Machine’s archive of Google’s original announcement from 2020:

Home activities

Featured Image by Shutterstock/Olga Strel

Google’s Structured Data Update May Boost Merchant Sales via @sejournal, @martinibuster

Google updated their structured data guidelines to reflect support for a sitewide return policy within the Organization structured data. This eliminates the need to add redundant return policy information for every product listing structured data and can result in more traffic and sales to online merchants.

This doesn’t mean that merchants are required to change their current structured data; the old method remains valid. This simply adds a more streamlined alternative that reduces the size of product structured data.

Improvement To Brand Knowledge Panel

Google’s change to the organization structured data will be reflected in the brand panel that Google shows when someone searches on a brand name. The updated brand panel will feature a new entry that reflects the company’s return policy.

Screenshot Of Brand Knowledge Panel Example

Benefits Of Organization-Level Return Policy

As part of this change, Google is adding search features in Knowledge Panels and Brand Panels that can show a merchant’s return policy. This means a merchant’s listing will be eligible to show a return policy, which in turn can encourage a higher clickthrough rate from the search engine results pages (SERPs) and a higher conversion rate.

Research conducted by the International Council of Shopping Centers (ICSC) in 2024 shows that online shoppers are strongly influenced by a merchant’s returns policy.

They discovered:

“82% of respondents said that when shopping online, return policies influence whether they decide to purchase from a retailer.

… If retailers charged a fee to ship back purchases made online, nearly three-fourths (71%) of respondents said they’d likely stop shopping online from that company altogether, while 6 in 10 said they’d likely stop shopping online with retailers that shortened the free return window.”

Clearly a return policy can be a way to generate more online sales and Google’s new support for a sitewide returns policy structured data helps to communicate that information to online shoppers directly from search.

Google’s announcement explained:

“A return policy is a major factor considered by shoppers when buying products online, and so last year we enabled the extraction of structured data return policies for individual products. Today we’re adding support for return policies at the organization level as well, which means you’ll be able to specify a general return policy for your business instead of having to define one for each individual product you sell.

Adding a return policy to your organization structured data is especially important if you don’t have a Merchant Center account and want the ability to provide a return policy for your business. Merchant Center already lets you provide a return policy for your business, so if you have a Merchant Center account we recommend defining your return policy there instead.

…If your site is an online or local business, we recommend using one of the OnlineStore, or LocalBusiness subtypes of Organization.

We hope this addition makes it easier for you to add return policies for your business, and enable them to be shown across Google shopping experiences.”

Google Updates Organization Structured Data Documentation

Google added a new section to their Organization structured data documentation to reflect support for this new way to show return policies in the search results.

The new documentation states:

“MerchantReturnPolicy
Use the following properties to describe general return policies for your entire Organization, if applicable to your business. If you have specific policies for individual products, use merchant listing markup instead.”
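Based on that documentation, a sitewide return policy might be expressed roughly like this; the store name, URL, and policy values are placeholders, so treat it as a sketch rather than copy-paste markup:

{
  "@context": "https://schema.org",
  "@type": "OnlineStore",
  "name": "Example Store",
  "url": "https://example.com",
  "hasMerchantReturnPolicy": {
    "@type": "MerchantReturnPolicy",
    "applicableCountry": "US",
    "returnPolicyCategory": "https://schema.org/MerchantReturnFiniteReturnWindow",
    "merchantReturnDays": 30,
    "returnMethod": "https://schema.org/ReturnByMail",
    "returnFees": "https://schema.org/FreeReturn"
  }
}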

Read Google’s announcement:

Adding markup support for organization-level return policies

Read the new MerchantReturnPolicy documentation on Google’s official Organization structured data page:

Organization (Organization) structured data – MerchantReturnPolicy

6 Local SEO Full-Guides That Help You Rank For Your Business Type

The elusive five-star review used to be something you could only flaunt in a rotating reviews section on your website.

But today, Google has pulled these stars out of the shadows and features them front and center across branded SERPs and beyond.

Star ratings can help businesses earn trust from potential customers, improve local search rankings, and boost conversions.

This is your guide to how they work.

Stars And SERPs: What Is The Google Star Rating?

A Google star rating is a consumer-powered grading system that lets other consumers know how good a business is based on a score of one to five stars.

These star ratings can appear across maps and different Google search results properties like standard blue link search listings, ads, rich results like recipe cards, local pack results, third-party review sites, and on-app store results.

How Does The Google Star Rating Work?

When a person searches Google, they will see star ratings in the results. Google uses an algorithm and an average to determine how many stars are displayed on different review properties.

Google explains that the star score system operates based on an average of all review ratings for that business that have been published on Google.

It’s important to note that this average is not calculated in real-time and can take up to two weeks to update after a new review is created.

When users leave a review, they are asked to rate a business based on specific aspects of their customer experience, as well as the type of business being reviewed and the services they’ve included.

For example, plumbers may get “Install faucet” or “Repair toilet” as services to add, and Google also allows businesses to add custom services that aren’t listed.

When customers are prompted to give feedback, they can give positive or critical feedback, or they can choose not to select a specific aspect to review, in which case this feedback aspect is considered unavailable.

This combination of feedback is what Google uses to determine a business’s average score by “dividing the number of positive ratings by the total number of ratings (except the ones where the aspect was not rated).”
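As an illustrative calculation (the numbers are hypothetical): if 20 customers rated the “Install faucet” aspect and 16 of those ratings were positive, that aspect would score 16 ÷ 20 = 80%.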

Google star ratings do have some exceptions in how they function.

For example, the UK and EU have certain restrictions that don’t apply to other regions, following recent scrutiny of fake reviews by the EU Consumer Protection Cooperation network and the UK Competition and Markets Authority.

Additionally, the type of search property a rating appears on determines the specifics of how it operates and how reviews are gathered and managed there.

Keep reading to get an in-depth explanation of each type of Google star rating available on the search engine results pages (SERPs).

How To Get Google Star Ratings On Different Search Properties

As mentioned above, there are different types of Google star ratings available across search results, including the standard blue-link listings, ads, local pack results, rich snippets, third-party reviews, and app store results.

Here’s what the different types of star-rating results look like in Google and how they work on each listing type.

Standard “Blue Link” Listings And Google Stars

In 2021, Google started testing star ratings in organic search and has since kept this SERP feature intact.

Websites can stand out from their competitors by getting stars to show up around their organic search results listing pages.

Text result showing Google star ratings in the SERPs. Screenshot from SERPs, Google, February 2024.

How To Get Google Stars On Organic SERPs

If you want stars to show up on your organic search results, add schema markup to your website.

Learn how to do that in the video below:

As the video points out, you need actual reviews to get your structured data markup to show.

Then, you can work with your development team to input the code on your site that indicates your average rating, highest, lowest, and total rating count.

Structured markup example for Google star ratings and reviews. Screenshot of JSON-LD script on Google Developers, August 2021.
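As a rough sketch of what that markup can look like (the product name and values below are placeholders, not the code from the video):

{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "bestRating": "5",
    "worstRating": "1",
    "ratingCount": "89"
  }
}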

Once you add the markup to your site, there is no clear timeline for when stars will start appearing in the SERPs – that’s up to Google.

In fact, Google specifically mentions that reviews in properties like search can take longer to appear, and often, this delay is caused by business profiles being merged.

When you’re done, you can check your work with Google’s Structured Data Testing Tool.

Adding schema is strongly encouraged. But even without it, if you own a retail store with ratings, Google may still show your star ratings in the search engine results.

They do this to ensure searchers are getting access to a variety of results. Google says:

“content on your website that’s been crawled and is related to retail may also be shown in product listings and annotations for free across Google.”

If you want star ratings to show up on Shopping Ads, you’ll have to pay for that.

Paid Ads And Google Stars

When Google Stars appear in paid search ads, they’re known as seller ratings, “an automated extension type that showcases advertisers with high ratings.”

These can appear in text ads, shopping ads, and free listings. Both the star rating and the total number of votes or reviews are displayed.

In addition to Google star ratings, shopping ads may include additional product information such as shipping details, color, material, and more, as shown below.

Google shopping ads showing star ratings. Screenshot from SERPs ads, Google, February 2024.

Paid text ads were previously labeled as “ads” and recently have been upgraded to a “sponsored” label, as shown below.

Paid ad showing Google star ratings. Screenshot from SERPs ads, Google, February 2024.

How To Get Google Stars On Paid Ads

To participate in free listings, sellers have to do three things:

  • Follow all the required policies around personally identifiable information, spam, malware, legal requirements, return policies, and more.
  • Submit a feed through the Google Merchant Center or have structured data markup on their website (as described in the previous section).
  • Add their shipping settings.

Again, some ecommerce sellers who do not have schema markup may still have their content show up in the SERPs.

For text ads and shopping ads to show star ratings, sellers are typically required to have at least 100 reviews in the last 12 months.

Paid advertisers must also meet a minimum number of stars for seller ratings to appear on their text ads. This helps higher-quality advertisers stand out from the competition.

For example, text ads have to have a minimum rating of 3.5 for the Google star ratings to show.

Google treats reviews on a per-country basis, so the minimum review threshold of 100 also applies to only one region at a time.

For star ratings to appear on a Canadian ecommerce company’s ads, for example, they would have to have obtained a minimum of 100 reviews from within Canada in the last year.

Google considers reviews from its own Google Customer Reviews and also from approved third-party partner review sites from its list of 29 supported review partners, which makes it easier for sellers to meet the minimum review threshold each year.

Google also requests:

  • The domain that has ratings must be the same as the one that’s visible in the ad.
  • Google or its partners must conduct a research evaluation of your site.
  • The reviews included must be about the product or service being sold.

Local Pack Results And Google Stars

Local businesses have a handful of options for their business to appear on Google via Places, local map results, and a Google Business Profile page – all of which can show star ratings.

Consumers even have the option to sort local pack results by their rating, as shown in the image example below.

Google star ratings on search results. Screenshot from SERPs local pack, Google, February 2024.

How To Get Google Stars On Local Search Results

To appear in local search results, a Google Business Profile is required.

Customers may leave reviews directly on local business properties without being asked, but Google also encourages business owners to solicit reviews from their customers and shares best practices, including:

  • Asking your customers to leave you a review and make it easy for them to do so by providing a link to your review pages.
  • Making review prompts desktop and mobile-friendly.
  • Replying to customer reviews (ensure you’re a verified provider on Google first).
  • Ensuring you do not offer incentives for reviews.

Customers can also leave star ratings on other local review sites, as Google can pull from both to display on local business search properties. It can take up to two weeks to get new local reviews to show in your overall score.

Once customers are actively leaving reviews, Google Business Profile owners have a number of options to help them manage these:

Options to manage reviews on a Google Business Profile. Screenshot from Google Business Profile Help, Google, February 2024.

Rich Results, Like Recipes, And Google Stars

Everybody’s gotta eat, and we celebrate food in many ways — one of which is recipe blogs.

While restaurants rely more on local reviews, organic search results, and even paid ads, food bloggers seek to have their recipes rated.

Similar to other types of reviews, recipe cards in search results show the average review rating and the total number of reviews.

Recipe search results on desktop. Screenshot from search for [best vegan winter recipes], Google, February 2024.

The outcome has become a point of contention in the food blogging community, since only three recipes per search can be seen in Google desktop results (as shown in the image above), and four on a mobile browser.

These coveted spots will attract clicks, leaving anyone who hasn’t mastered online customer reviews in the dust. That means that the quality of the recipe isn’t necessarily driving these results.

Google gives users the option to click “Show more” to see two additional rows of results:

Expanded desktop recipe search results. Screenshot from SERPs, Google, February 2024.

Searchers can continue to click the “Show more” button to see additional recipe results.

Anyone using Google Home can search for a recipe and get results through their phone:

Google Assistant recipes. Screenshot from Elfsight, February 2024.

Similarly, recipe search results can be sent from the device to the Google Home assistant. Both methods will enable easy and interactive step-by-step recipe instructions using commands like “start recipe,” “next step,” or even “how much olive oil?”

How To Get Google Stars On Recipe Results

Similar to the steps to have stars appear on organic blue-link listings, food bloggers and recipe websites need to add schema to their websites in order for star ratings to show.

However, it’s not as straightforward as listing the average and the total number of ratings. Developers should follow Google’s instructions for recipe markup.

There is both required and recommended markup:

Required Markup For Recipes

  • Name of the recipe.
  • Image of the recipe in a BMP, GIF, JPEG, PNG, WebP, or SVG format.

Recommended Markup For Recipes

  • Aggregate rating.
  • Author.
  • Cook time, preparation time, and total duration.
  • Date published.
  • Description.
  • Keywords.
  • Nutrition information.
  • Prep time.
  • Recipe category by meal type, like “dinner.”
  • Region associated with the recipe.
  • Ingredients.
  • Instructions.
  • Yield or total serving.
  • Total time.
  • Video (and other related markup, if there is a video in the recipe).

To have recipes included in Google Assistant Guided Recipes, the following markup must be included:

  • recipeIngredient
  • recipeInstructions
  • To have the video property, add the contentUrl.

For example, here’s what the structured markup would look like for the recipeIngredient property:

Example of structured markup for recipe steps in Google Assistant. Screenshot from Google Developers, February 2024.
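Pulling the pieces together, a minimal sketch of a Recipe object carrying both Guided Recipes properties might look like this (the recipe name, ingredients, and steps are placeholders):

{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Simple Tomato Soup",
  "image": "https://example.com/tomato-soup.jpg",
  "recipeIngredient": [
    "4 ripe tomatoes",
    "1 tablespoon olive oil"
  ],
  "recipeInstructions": [
    { "@type": "HowToStep", "text": "Chop the tomatoes." },
    { "@type": "HowToStep", "text": "Simmer with the olive oil for 20 minutes." }
  ]
}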

Third-Party Review Sites And Google Stars

Many software companies rely on third-party review sites to help inform their customer’s purchasing decisions.

Third-party review sites include any website a brand doesn’t own where a customer can submit a review, such as Yelp, G2, and many more.

Many of these sites, like Featured Customers shown below, can display star ratings within Google search results.

Example of star ratings showing in SERPs from third-party review sites. Screenshot from SERPs listing of a review site, Google, February 2024.

Rich snippets from third-party reviews, such as stars, summary info, or ratings, can also appear on a Google Business Profile or map view from approved sites.

For local businesses, Google star ratings appear in different locations than the third-party reviews on a desktop:

Third-party reviews and Google stars on desktop results. Screenshot from SERPs listing of a review site, Google, February 2024.

On mobile, ratings are displayed on a company’s Google Business Profile. Users need to click on Reviews or scroll down to see the third-party reviews:

Third-party reviews in local mobile results. Screenshot from SERPs listing of a review site, Google, February 2024.

On a map, the results from third parties may be more prominent, like the Tripadvisor review that shows up for a map search of The Hilton in Vancouver (although it does not display a star rating even though Tripadvisor does provide star ratings):

Third-party reviews in map results. Screenshot from SERPs listing of a review site, Google, February 2024.

How To Get Google Stars On Third-Party Review Sites

The best way to get a review on a third-party review site depends on which site is best for the brand or the business.

For example, if you have active customers on Yelp or Tripadvisor, you may choose to engage with customers there.

Third-party reviews in search results. Screenshot from SERPs listing of a review site, Google, February 2024.

Similarly, if a software review site like Trustpilot shows up for your branded search, you could do an email campaign with your customer list asking them to leave you a review there.

Here are a few of the third-party review websites that Google recognizes:

  • Trustpilot.
  • Reevoo.
  • Bizrate – through Shopzilla.

When it comes to third-party reviews, Google reminds businesses that there is no way to opt out of third-party reviews, and they need to take up any issues with third-party site owners.

App Store Results And Google Stars

When businesses have an application as their core product, they typically rely on App Store and Google Play Store downloads.

Right from the SERPs, searchers can see an app’s star ratings, as well as the total votes and other important information, like whether the app is free or not.

App store reviews in search results. Screenshot from SERP Play Store results, Google, February 2024.

How To Get Google Stars On App Store Results

Businesses can list their apps in Apple’s App Store or on the Google Play Store, prompt customers to leave reviews there, and also respond to them.

Does The Google Star Rating Influence SEO Rankings?

John Mueller confirmed that Google does not factor star ratings or customer reviews into web search rankings. However, Google is clear that star ratings influence local search results and rankings:

“Google review count and review score factor into local search ranking. More reviews and positive ratings can improve your business’ local ranking.”

Even though they are not a ranking factor for non-local organic search, star ratings can serve as an important conversion element, helping you display social proof, build credibility, and increase your click-through rate from search engines (which may indirectly impact your search rankings).

For local businesses, both Google stars and third-party ratings appear in desktop and mobile searches, as seen above.

These ratings not only help local businesses rank above their competitors for key phrases, but they also help convince more customers to click, which is the goal of every company’s search strategy.

How Do I Improve My Star Rating?

Businesses that want to improve their Google star rating should start by claiming their Google Business Profile and making sure all the information is complete and up to date.

If a company has already taken these steps and wants to offset a poor rating, they are going to need more reviews to offset the average.

Companies can get more Google reviews by making it easy for customers to leave one. The first step for a company is to get the link to leave a review inside their Google Business Profile:

“Ask customers for reviews” link. Screenshot from WordStream, February 2024.

From there, companies can send this link out to customers directly (there are four options displayed right from the link as seen above), include it on social media, and even dedicate sections of their website to gathering more reviews and/or displaying reviews from other users.

It isn’t clear whether or not responding to reviews will help improve a local business’s ranking; however, it’s still a good idea for companies to respond to reviews on their Google Business Profile in order to improve their ratings overall.

That’s because responding to reviews can entice other customers to leave a review since they know they will get a response and because the owner is actually seeing the feedback.

For service businesses, Google provides the option for customers to rate aspects of the experience.

This is helpful since giving reviewers this option allows anyone who had a negative experience to rate just one aspect negatively rather than giving a one-star review overall.

Does Having A Star Rating On Google Matter? Yes! So Shoot For The Stars

Stars indicate quality to consumers, so they almost always improve click-through rates wherever they are present.

Consumers tend to trust and buy from brands with higher star ratings in local listings, paid ads, or even app downloads.

Many, many, many studies have demonstrated this phenomenon time and again. So, don’t hold back when it comes to reviews.

Do an audit of where your brand shows up in SERPs and get stars next to as many placements as possible.

The most important part of star ratings across Google, however, will always be the service and experiences companies provide that fuel good reviews from happy customers.



Feature Image: BestForBest/Shutterstock
All screenshots taken by author

Google’s Gary Illyes: Lastmod Signal Is Binary via @sejournal, @MattGSouthern

In a recent LinkedIn discussion, Gary Illyes, Analyst at Google, revealed that the search engine takes a binary approach when assessing a website’s lastmod signal from sitemaps.

The revelation came as Illyes encouraged website owners to upgrade to WordPress 6.5, which now natively supports the lastmod element in sitemaps.

When Mark Williams-Cook asked if Google has a “reputation system” to gauge how much to trust a site’s reported lastmod dates, Illyes stated, “It’s binary: we either trust it or we don’t.”

No Shades Of Gray For Lastmod

The lastmod tag indicates the date of the most recent significant update to a webpage, helping search engines prioritize crawling and indexing.

Illyes’ response suggests Google doesn’t factor in a website’s history or gradually build trust in the lastmod values being reported.

Google either accepts the lastmod dates provided in a site’s sitemap as accurate, or it disregards them.

This binary approach reinforces the need to implement the lastmod tag correctly and only specify dates when making meaningful changes.
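For reference, lastmod is declared per URL in an XML sitemap; a minimal example (the URL and date are placeholders) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/post-title</loc>
    <lastmod>2024-04-15</lastmod>
  </url>
</urlset>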

Illyes commended the WordPress developer community for their work on version 6.5, which automatically populates the lastmod field without extra configuration.

Accurate Lastmod Essential For Crawl Prioritization

While convenient for WordPress users, the native lastmod support is only beneficial if Google trusts you’re using it correctly.

Inaccurate lastmod tags could lead to Google ignoring the signal when scheduling crawls.

With Illyes confirming Google’s stance, it shows there’s no room for error when using this tag.

Why SEJ Cares

Understanding how Google acts on lastmod can help ensure Google displays new publish dates in search results when you update your content.

It’s an all-or-nothing situation – if the dates are deemed untrustworthy, the signal could be disregarded sitewide.

With the information revealed by Illyes, you can ensure your implementation follows best practices to the letter.


Featured Image: Danishch/Shutterstock

Google Reminds Websites To Use Robots.txt To Block Action URLs via @sejournal, @MattGSouthern

In a LinkedIn post, Gary Illyes, an Analyst at Google, reiterated long-standing guidance for website owners: Use the robots.txt file to prevent web crawlers from accessing URLs that trigger actions like adding items to carts or wishlists.

Illyes highlighted the common complaint of unnecessary crawler traffic overloading servers, often stemming from search engine bots crawling URLs intended for user actions.

He wrote:

“Looking at what we’re crawling from the sites in the complaints, way too often it’s action URLs such as ‘add to cart’ and ‘add to wishlist.’ These are useless for crawlers, and you likely don’t want them crawled.”

To avoid this wasted server load, Illyes advised blocking access in the robots.txt file for URLs with parameters like “?add_to_cart” or “?add_to_wishlist.”

As an example, he suggests:

“If you have URLs like:
https://example.com/product/scented-candle-v1?add_to_cart
and
https://example.com/product/scented-candle-v1?add_to_wishlist

You should probably add a disallow rule for them in your robots.txt file.”
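A minimal sketch of what such rules could look like, assuming the parameter always follows the ? directly as in Illyes’ example URLs (adjust the patterns to your own URL structure):

User-agent: *
Disallow: /*?add_to_cart
Disallow: /*?add_to_wishlist

The * wildcard is supported by Googlebot and most major crawlers, so these two rules match any product URL carrying those parameters.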

While using the HTTP POST method can also prevent the crawling of such URLs, Illyes noted crawlers can still make POST requests, so robots.txt remains advisable.

Reinforcing Decades-Old Best Practices

Alan Perkins, who engaged in the thread, pointed out that this guidance echoes web standards introduced in the 1990s for the same reasons.

Quoting from a 1993 document titled “A Standard for Robot Exclusion”:

“In 1993 and 1994 there have been occasions where robots have visited WWW servers where they weren’t welcome for various reasons…robots traversed parts of WWW servers that weren’t suitable, e.g. very deep virtual trees, duplicated information, temporary information, or cgi-scripts with side-effects (such as voting).”

The robots.txt standard, proposing rules to restrict well-behaved crawler access, emerged as a “consensus” solution among web stakeholders back in 1994.

Obedience & Exceptions

Illyes affirmed that Google’s crawlers fully obey robots.txt rules, with rare exceptions thoroughly documented for scenarios involving “user-triggered or contractual fetches.”

This adherence to the robots.txt protocol has been a pillar of Google’s web crawling policies.

Why SEJ Cares

While the advice may seem rudimentary, the re-emergence of this decades-old best practice underscores its relevance.

By leveraging the robots.txt standard, sites can help tame overzealous crawlers from hogging bandwidth with unproductive requests.

How This Can Help You

Whether you run a small blog or a major e-commerce platform, following Google’s advice to leverage robots.txt for blocking crawler access to action URLs can help in several ways:

  • Reduced Server Load: You can reduce needless server requests and bandwidth usage by preventing crawlers from hitting URLs that invoke actions like adding items to carts or wishlists.
  • Improved Crawler Efficiency: Giving more explicit rules in your robots.txt file about which URLs crawlers should avoid can lead to more efficient crawling of the pages/content you want to be indexed and ranked.
  • Better User Experience: With server resources focused on actual user actions rather than wasted crawler hits, end-users will likely experience faster load times and smoother functionality.
  • Stay Aligned with Standards: Implementing the guidance puts your site in compliance with the widely adopted robots.txt protocol standards, which have been industry best practices for decades.

Revisiting robots.txt directives could be a simple but impactful step for websites looking to exert more control over crawler activity.

Illyes’ messaging indicates that the ancient robots.txt rules remain relevant in our modern web environment.


Featured Image: BestForBest/Shutterstock

Something Weird Is Going On In Google’s SERPs via @sejournal, @martinibuster

People are always complaining that there’s something wrong with Google’s search results, but what’s going on with the search results for queries containing the acronym “SEO” is in a class by itself and has to be seen to be believed.

Anomalies In Search Results

An anomaly is something that deviates from the norm or what’s expected. A lot of the time, when there’s something wrong with the search engine results pages (SERPs), the anomaly is explainable. For example, queries that combine a geographical element with a relatively longtail phrase tend to generate weird results. Another driver of strange search results is when there simply isn’t enough data about a specific combination of words, which sometimes leads to offensive search results.

What’s happening with a particular group of keyword phrases related to the word “SEO” is not any of those kinds of anomalies. It’s a true anomaly.

Here are the keywords that Google is (arguably) getting wrong:

  • SEO program
  • What is an SEO program?
  • SEO New York (City)
  • SEO NYC
  • SEO Conference
  • SEO Events
  • SEO Education
  • SEO Awards
  • SEO-USA.Org

The site that’s ranking for all those SEO search queries (and probably more) is called SEO-USA.org. The acronym SEO in that website stands for Sponsors for Educational Opportunity. It’s not a spam site; it’s a legit non-profit website that’s been around since 1963. The purpose of the non-profit is to provide mentorship to underserved young people to help them get into colleges and universities. That program evolved into SEO Scholars, an eight-year academic program that helps talented young people through high school and college.

“SEO Scholars creates a more equitable society by closing the academic opportunity gap for motivated young people, setting the standard for academics, mentorship, community, peer-to-peer support, and a powerful, lifelong network.”

SEO-USA.org Is Not Relevant For SEO

The acronym SEO is heavily relevant for the context of online marketing. A search for “SEO” in Google spawns suggestions that are all relevant for SEO in the sense of search marketing.

Google Trends shows that the phrases “SEO Scholars” and “SEO Scholars Application” are not widely searched in the United States; most of the searches occur in New York. Yet SEO-USA.org is top ranked for the group of keywords listed above in areas outside of New York as well.

Screenshot Of SERPs For Keyword Phrase “SEO Awards”

It’s kind of obvious that SEO-USA.org is not relevant for the most commonly understood meaning of the acronym SEO.

Could Backlinks Be The Reason?

It’s possible that the reason SEO-USA.org is ranking for all of those phrases is backlinks. A search for the domain name restricted to .edu sites shows almost seventy .edu websites that link to the SEO-USA.org domain name.

This is the advanced search that shows scores of .edu sites that link to or mention SEO-USA.org:

"seo-usa.org" site:.edu"

Screenshot Of Site:.EDU Search

There is also a large number of high-quality sites with .org domains that link to SEO-USA.org, which is observable using the following advanced search:

"seo-usa.org" site:.org -site:seo-usa.org"

On the surface it looks clear that backlinks are the reason why SEO-USA.org ranks for irrelevant keywords.

But of course, the most obvious answer isn’t always the right answer. There’s more to the picture.

Why Links Probably Don’t Explain The Rankings

If links were the reason for SEO-USA.org’s rankings, then it would follow that virtually every keyword phrase related to SEO would be littered with .edu and .org websites, but that’s not the case.

I’ve been doing SEO for about 25 years now and I remember the days when sites that had the maximum level of PageRank used to rank for virtually anything. Also, dot edu links were regarded as powerful because SEOs were able to rank quite well with them.

Google’s algorithms improved, and the effect of .edu links started to wane because the context of a link started counting for more. The words in the title element and in the text surrounding a link began to influence how much the link counted. I know this too from my experience.

Another important change in Google’s link ranking algorithms was to dampen the effect of quantity of links. It used to be that an avalanche of links was enough to help a site rank over more authoritative sites. I know this from my experience too.

But the effect of a huge number of links also changed in many ways. For example, hundreds of links from one domain stopped counting as hundreds of links and began counting as just one link. The position of a link within a page also came to matter more. There were lots of changes that whittled down the power of links so that fewer and fewer links counted for the wrong reasons.

I’m kind of skeptical that links are the reason why SEO-USA.org ranks.

What’s The Answer?

For some reason, a relevance factor is not kicking in, which allows the (arguably) irrelevant SEO-USA.org site to rank for keywords it probably shouldn’t rank for.

I think that’s a clue to why that site is ranking where it shouldn’t. It’s slipping through because something is missing that would ordinarily be there to keep it out.

It may very well be that there’s a factor related to trustworthiness that is allowing that site to slip through. That’s just speculation. Do you have any ideas?

Featured Image by Shutterstock/SS 360