Google Ranking Systems & Signals: How To Adapt Your SEO Strategy In 2024 & Beyond via @sejournal

Have you noticed a dip in your search rankings lately?

Are you feeling frustrated and anxious about your website’s performance?

Given the state of SEO this past year, we’d be surprised if you didn’t.

As the search landscape continues to evolve, we’re seeing a surge in volatility, with high-quality content often outranked by spam pages.

And with Google’s algorithms becoming more and more complex, traditional best practices no longer seem to cut it.

So, what does this mean for you and your strategy?

How can you navigate these complexities and boost your search rankings?

Our new ebook, Google Ranking Systems & Signals 2024, is the ultimate resource for understanding the recent ranking trends and unlocking sustainable SEO success.

You’ll get expert insights and analysis from seasoned SEO professionals, digital marketing strategists, industry thought leaders, and more.

Our featured experts include:

  • Adam Riemer, President, Adam Riemer Marketing.
  • Aleh Barysevich, Founder, SEO PowerSuite.
  • Andrea Volpini, Co-Founder and CEO, WordLift.
  • Dan Taylor, Partner & Head of Technical SEO, SALT.agency.
  • Erika Varangouli, Head of Branded Content at Riverside.fm.
  • Helen Pollitt, Head of SEO, Car & Classic.
  • Kevin Indig, Writer of the Growth Memo.
  • Kevin Rowe, Founder & Head of Digital PR Strategy, PureLinq.
  • Ludwig Makhyan, Global Head of Technical SEO, EssilorLuxottica.
  • Mordy Oberstein, Head of SEO Brand at Wix.
  • Scott Stouffer, CTO and Co-Founder, Market Brew.

Download the ebook to learn about the latest developments in Google Search, and how to meet the challenges of today’s competitive search environment.

From the rise of spam content on SERPs to the most reliable ranking factors, this comprehensive guide covers it all.

We also address where different types of content belong and offer advice on whether you should diversify your acquisition channels or pivot to gated content models.

Explore the following topics inside:

  • Why Is Search Full Of Spam?
  • What Are The Top Ranking Factors That SEO Pros Can Rely On Right Now?
    • The Top 3 Ranking Factors
    • Freshness & Content Maintenance
    • “Ranking” In Search Generative Experience
  • Staying Indexed Is The New SEO Challenge
  • Where Does Your Best Content Belong?
  • Proactively Embracing SEO Disruption By Focusing On User Needs
  • Making Sense Of Ranking In 2024

Whether you’re a seasoned professional or just starting out, this ebook is full of practical tips and actionable strategies to help you improve your website’s visibility and drive organic traffic.

Grab your copy of Google Ranking Systems & Signals 2024 today, and start optimizing your website for success in 2024 and beyond!

Featured Image: Paulo Bobita/Search Engine Journal

Apple’s AI Push: ChatGPT For Everyone, But At What Cost? via @sejournal, @MattGSouthern

At its annual developer conference, Apple announced a wave of AI-powered features for users of the latest devices.

While these developments showcase new ways to use generative AI, they raise questions about the potential impact on content creators, publishers, and search and discovery.

Here’s an overview of what Apple announced and the broader considerations.

Integration Of ChatGPT

Screenshot from: Apple.com, June 2024.

At the heart of Apple’s AI push is the integration of ChatGPT, the chatbot developed by OpenAI.

By weaving ChatGPT’s capabilities into Siri, systemwide writing tools, and image generation features, Apple is making generative AI more accessible to users.

However, this integration has implications that extend beyond user convenience.

Potential Impact On Content Visibility & Discoverability

One of the primary concerns is how Apple’s AI will affect the visibility and discoverability of publisher content within apps and on the web.

As Siri and other AI-powered features become more adept at understanding context and providing targeted suggestions, there is a risk that certain types of content or sources may be prioritized over others.

This could impact traffic and revenue for publishers outside of Apple’s preferred partner ecosystem.

Concerns About Control & Transparency

An increased reliance on AI-driven recommendations and content curation raises questions about the level of control and transparency.

If Apple’s algorithms begin to favor specific content types, formats, or sources, it could create an uneven playing field for publishers and limit the diversity of information available to users.

This is concerning, given Apple’s scale and influence in the tech industry.

Potential For A Closed Ecosystem

Another potential consequence of Apple’s AI advancements is the creation of a more closed and curated ecosystem.

By providing users with highly personalized and context-aware experiences, Apple may inadvertently discourage them from venturing outside its walled garden.

This could limit opportunities for publishers and marketers outside Apple’s inner circle, as they may struggle to gain visibility and engage with users within the Apple ecosystem.

Availability

Apple’s new AI capabilities will be free for users. ChatGPT subscribers can connect their accounts to access premium features.

A beta version in English will launch this fall for iPhones, iPads, and Macs in the U.S., and more languages and capabilities will roll out over the next year.

To access the features, you’ll need an iPhone 15 Pro or a Mac with an M1 chip (or newer).

Looking Ahead

As Apple rolls out these AI features to millions of users worldwide, publishers, content creators, and the tech community should closely monitor their impact.

While Apple’s AI advancements undoubtedly offer exciting possibilities for enhancing the user experience, they must be approached critically.

How Apple’s AI shapes user behavior and content discovery could have far-reaching consequences.

By carefully examining the potential risks, the industry can work toward a future that balances innovation with equal opportunities.

Google Issues Statement About CTR And HCU via @sejournal, @martinibuster

In a series of tweets, Google’s SearchLiaison responded to a question that connected click-through rates (CTR) and HCU (Helpful Content Update) with how Google ranks websites, remarking that if the associated ideas were true it would be impossible for any new website to rank.

Users Are Voting With Their Feet?

SearchLiaison was responding to a tweet that quoted an interview answer from Google CEO Sundar Pichai: “Users vote with their feet.”

Here is the tweet:

“If the HCU (Navboost, whatever you want to call it) is clicks/user reaction based – how could sites hit by the HCU ever hope to recover if we’re no longer being served to Google readers?

@sundarpichai “Users vote with their feet”,

Okay I’ve changed my whole site – let them vote!”

The above tweet appears to connect Pichai’s statement to Navboost, user clicks and rankings. But as you’ll see below, Sundar’s statement about users voting “with their feet” has nothing to do with clicks or ranking algorithms.

Background Information

Sundar Pichai’s answer about users voting “with their feet” has nothing to do with clicks.

The problem with connecting that quote to rankings is that both the interview question and Pichai’s answer were made in the context of “AI-powered search and the future of the web.”

The interviewer at The Verge used a site called HouseFresh as an example of a site that’s losing traffic because of Google’s platform shift to the new AI Overviews.

But the HouseFresh site’s complaints predate AI Overviews. Their complaints are about Google ranking low quality “big media” product reviews over independent sites like HouseFresh.

HouseFresh wrote:

“Big media publishers are inundating the web with subpar product recommendations you can’t trust…

Savvy SEOs at big media publishers (or third-party vendors hired by them) realized that they could create pages for ‘best of’ product recommendations without the need to invest any time or effort in actually testing and reviewing the products first.”

Sundar Pichai’s answer has nothing to do with why HouseFresh is losing traffic. His answer is about AI Overviews. HouseFresh’s issues are about low quality big brands outranking them. Two different things.

  • The Verge-affiliated interviewer was mistaken to cite HouseFresh in connection with Google’s platform shift to AI Overviews.
  • Furthermore, Pichai’s statement has nothing to do with clicks and rankings.

Here is the interview question published on The Verge:

“There’s an air purifier blog that we covered called HouseFresh. There’s a gaming site called Retro Dodo. Both of these sites have said, “Look, our Google traffic went to zero. Our businesses are doomed.”

…Is that the right outcome here in all of this — that the people who care so much about video games or air purifiers that they started websites and made the content for the web are the ones getting hurt the most in the platform shift?”

Sundar Pichai answered:

“It’s always difficult to talk about individual cases, and at the end of the day, we are trying to satisfy user expectations. Users are voting with their feet, and people are trying to figure out what’s valuable to them. We are doing it at scale, and I can’t answer on the particular site—”

Pichai’s answer has nothing to do with ranking websites and absolutely zero context with the HCU. What Pichai’s answer means is that users are determining whether or not AI Overviews are helpful to them.

SearchLiaison’s Answer

Let’s reset the context of SearchLiaison’s answer, here is the tweet (again) that started the discussion:

“If the HCU (Navboost, whatever you want to call it) is clicks/user reaction based – how could sites hit by the HCU ever hope to recover if we’re no longer being served to Google readers?

@sundarpichai “Users vote with their feet”,

Okay I’ve changed my whole site – let them vote!”

Here is SearchLiaison’s response:

“If you think further about this type of belief, no one would ever rank in the first place if that were supposedly all that matters — because how would a new site (including your site, which would have been new at one point) ever been seen?

The reality is we use a variety of different ranking signals including, but not solely, “aggregated and anonymized interaction data” as covered here:”

The person who started the discussion responded with:

“Can you please tell me if I’m doing right by focusing on my site and content – writing new articles to be found through search – or if I should be focusing on some off-site effort related to building a readership? It’s frustrating to see traffic go down the more effort I put in.”

When a client says something like “writing new articles to be found through search,” I always follow up with questions to understand what they mean. I’m not commenting on the person who posted the tweet, just making an observation from past conversations with clients: a statement like that sometimes means they’re researching Google keywords and competitor sites and using that keyword data verbatim in their content, instead of relying on their own expertise and understanding of what readers want and need.

Here’s SearchLiaison’s answer:

“As I’ve said before, I think everyone should focus on doing whatever they think is best for their readers. I know it can be confusing when people get lots of advice from different places, and then they also hear about all these things Google is supposedly doing, or not doing, and really they just want to focus on content. If you’re lost, again, focus on that. That is your touchstone.”

Site Promotion To People

SearchLiaison next addressed the excellent question about off-site promotion, where he strongly urged focusing on readers. A lot of SEOs focus on promoting sites to Google, which is what link building is all about.

Promoting sites to people is super important. It’s one of the things that I see high ranking sites do and, although I won’t mention specifics, I believe it feeds into higher rankings in an indirect way.

SearchLiaison continued:

“As to the off-site effort question, I think from what I know from before I worked at Google Search, as well as my time being part of the search ranking team, is that one of the ways to be successful with Google Search is to think beyond it.

Great sites with content that people like receive traffic in many ways. People go to them directly. They come via email referrals. They arrive via links from other sites. They get social media mentions.

This doesn’t mean you should get a bunch of social mentions, or a bunch of email mentions because these will somehow magically rank you better in Google (they don’t, from how I know things). It just means you’re likely building a normal site in the sense that it’s not just intended for Google but instead for people. And that’s what our ranking systems are trying to reward, good content made for people.”

What About False Positives?

The phrase false positive is used in many contexts and one of them is to describe the situation of a high quality site that loses rankings because an algorithm erroneously identified it as low quality. SearchLiaison offered hope to high quality sites that may have seen a decrease in traffic, saying that it’s possible that the next update may offer a positive change.

He tweeted:

“As to the inevitable “but I’ve done all these things when will I recover!” questions, I’d go back to what we’ve said before. It might be the next core update will help, as covered here:

It might also be that, as I said here, it’s us in some of these cases, not the sites, and that part of us releasing future updates is doing a better job in some of these cases:

SearchLiaison linked to a tweet by John Mueller from a month ago where he said that the search team is looking for ways to surface more helpful content.

“I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.”

Is Your Site High Quality?

Everyone likes to think that their site is high quality, and most of the time it is. But there are also cases where a site publisher does “everything right” in terms of following SEO practices, unaware that those “good SEO practices” are backfiring on them.

One example, in my opinion, is the widely practiced strategy of copying what competitors are doing but “doing it better.” I’ve been hands-on involved in SEO for well over 20 years and that’s an example of building a site for Google and not for users. It’s a strategy that explicitly begins and ends with the question of “what is Google ranking and how can I create that?”

That kind of strategy can create patterns that overtly signal that a site is not created for users.  It’s also a recipe for creating a site that offers nothing new from what Google is already ranking. So before assuming that everything is fine with the site, be certain that everything is indeed fine with the site.

Featured Image by Shutterstock/Michael Vi

Google Gives Merchants New Insights Into Shopping Search Performance via @sejournal, @MattGSouthern

Google has introduced a feature in Search Console that allows merchants to track their product listings in the Google Search Image tab.

This expanded functionality can help businesses better understand their visibility across Google’s shopping experiences.

Where To Find ‘Merchant Listings Performance’ In Search Console

The new data is accessible through the “Performance” report under the “Google Search Image” tab.

From there, you can monitor the performance of your listings across various Google surfaces.

This includes information on impressions, clicks, and other key metrics related to your product showcases.

By integrating merchant listing performance into Search Console, businesses get a more comprehensive view of their product visibility and can optimize their strategies accordingly.

Eligibility & Shopping Section In Search Console

To qualify for merchant listing reports, a website must be identified by Google as an online merchant primarily selling physical goods or services directly to consumers.

Affiliate sites or those that redirect users to other platforms for purchase completion are not considered eligible.

Once a site is recognized as an online merchant, Search Console will display a “Shopping” section in its navigation bar.

This dedicated area houses tools and reports tailored to shopping experiences, including:

  1. Product Snippet Rich Report: Providing insights into product snippet structured data on the site, enabling enhanced search result displays with visual elements like ratings and prices.
  2. Merchant Listing Rich Report: Offering analytics on merchant listing structured data, which enables more comprehensive search results, often appearing in carousels or knowledge panels.
  3. Shopping Tab Listings: Information and guidance on enabling products to appear in the dedicated Shopping tab within Google Search results.

Google’s automated systems determine a site’s eligibility as an online merchant based on the presence of structured data and other factors.
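Since eligibility hinges partly on structured data, here is a minimal, hypothetical sketch of the Product markup that feeds merchant listing experiences. Every name, URL, and price below is a placeholder, and Google’s merchant listings documentation lists the full set of required and recommended properties:

```python
import json

# Minimal sketch of Product structured data with an Offer,
# the kind of markup merchant listing experiences rely on.
# All values are hypothetical placeholders.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Anvil",
    "image": "https://www.example.com/images/anvil.jpg",
    "description": "A 50 lb forged-steel anvil.",
    "offers": {
        "@type": "Offer",
        "price": "119.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Print the <script> block that would be embedded in the product page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(product_markup, indent=2))
print("</script>")
```

Pages carrying markup like this are what the product snippet and merchant listing rich result reports described above evaluate.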

In Summary

This new feature in Google’s Search Console provides valuable information about the visibility of your product listings in search results.

You can use these insights to make changes and improve your products’ visibility so that more potential customers can find them.


Featured Image: T. Schneider/Shutterstock

Google Responds: Is Desktop SEO Still Necessary? via @sejournal, @martinibuster

Google’s John Mueller responded to a question about whether it’s okay to stop optimizing a desktop version of a website now that Google is switching over to exclusively indexing mobile versions of websites.

The question asked is related to an announcement they made a week ago:

“…the small set of sites we’ve still been crawling with desktop Googlebot will be crawled with mobile Googlebot after July 5, 2024. … After July 5, 2024, we’ll crawl and index these sites with only Googlebot Smartphone. If your site’s content is not accessible at all with a mobile device, it will no longer be indexable.”

Stop Optimizing Desktop Version Of A Site?

The person asking the question wanted to know if it’s okay to abandon optimizing a purely desktop version of a site and just focus on the mobile-friendly version. They’re asking because they’re new to a company whose developers are far into the process of implementing a separate mobile site via dynamic serving.

This is the question:

“I am currently in a discussion at my new company, because they are implementing a different mobile site via dynamic serving instead of just going responsive. Next to requirements like http vary header my reasoning is that by having two code bases we need to crawl, analyze and optimize two websites instead of one. However, this got shut down because “due to mobile first indexing we no longer need to optimize the desktop website for SEO”. I read up on all the google docs etc. but I couldn’t find any reasons as to why I would need to keep improving the desktop website for SEO, meaning crawlability, indexability, using correct HTML etc. etc. What reasons are there, can you help me?”
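For context, dynamic serving means returning different HTML for the same URL depending on the requesting user agent, and signaling that with the HTTP Vary header the questioner mentions. Below is a minimal sketch of the idea, assuming a Python/Flask setup; the user-agent check is a deliberately crude placeholder, and real implementations typically use a maintained device-detection library:

```python
# Minimal dynamic-serving sketch: same URL, different HTML per user agent,
# with the Vary header so caches and crawlers know the response differs.
# Assumes Flask is installed; device detection is intentionally simplified.
from flask import Flask, request

app = Flask(__name__)

MOBILE_HINTS = ("Mobile", "Android", "iPhone")


@app.route("/")
def home():
    user_agent = request.headers.get("User-Agent", "")
    is_mobile = any(hint in user_agent for hint in MOBILE_HINTS)

    # Serve a different HTML variant from the same URL.
    body = "<p>Mobile home page</p>" if is_mobile else "<p>Desktop home page</p>"
    response = app.make_response(body)

    # Signal that the response varies by user agent.
    response.headers["Vary"] = "User-Agent"
    return response


if __name__ == "__main__":
    app.run()
```

The point of the sketch is the Vary header: it tells intermediaries (and crawlers using different user agents) that mobile and desktop visitors get different HTML from the same URL, which is also why dynamic serving means two code paths to crawl, analyze, and optimize.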

Mobile-Only Versus Responsive Website

Google’s John Mueller pointed to the benefits of a single, responsive website: it eliminates the need to maintain two code bases, and it stays desktop-friendly for visitors who browse the site on a desktop.

He answered:

“First off, not making a responsive site in this day & age seems foreign to me. I realize sometimes things just haven’t been updated in a long time and you might need to maintain it for a while, but if you’re making a new site”

Maintaining A Desktop-Friendly Site Is A Good Idea

Mueller next offered reasons why it’s a good idea to maintain a functional desktop version of a website: other search engines, other crawlers, and site visitors who actually are on desktop devices. Most SEOs understand that conversions, and generating income with a website, depend on being accessible to all site visitors; that’s the big picture. Optimizing a site for Google is only part of that picture, not the whole thing.

Mueller explained:

“With mobile indexing, it’s true that Google focuses on the mobile version for web search indexing. However, there are other search engines & crawlers / requestors, and there are other requests that use a desktop user-agent (I mentioned some in the recent blog post, there are also the non-search user-agents on the user-agent documentation page).”

He then said that websites exist for more than just getting crawled and ranked by Google.

“All in all, I don’t think it’s the case that you can completely disregard what’s served on desktop in terms of SEO & related. If you had to pick one and the only reason you’re running the site is for Google SEO, I’d probably pick mobile now, but it’s an artificial decision, sites don’t live in isolation like that, businesses do more than just Google SEO (and TBH I hope you do: a healthy mix of traffic sources is good for peace of mind). And also, if you don’t want to have to make this decision: go responsive.”

After the person asking the question explained that the decision had already been made to focus on mobile, Mueller responded that this is a case of choosing your battles.

“If this is an ongoing project, then shifting to dynamic serving is already a pretty big step forwards. Pick your battles :). Depending on the existing site, sometimes launching with a sub-optimal better version earlier is better than waiting for the ideal version to be completed. I’d just keep the fact that it’s dynamic-serving in mind when you work on it, with any tools that you use for diagnosing, monitoring, and tracking. It’s more work, but it’s not impossible. Just make sure the desktop version isn’t ignored completely :). Maybe there’s also room to grow what the team (developers + leads) is comfortable with – perhaps some smaller part of the site that folks could work on making responsive. Good luck!”

Choose Your Battles Or Stand Your Ground?

John Mueller’s right that there are times when it’s better to choose your battles and compromise rather than dig in. But just make sure that your recommendations are on record and that those pushing back are on record, too. That way, if things go wrong, the blame will find its way back to the ones who are responsible.

Featured Image by Shutterstock/Luis Molinero

Google: Can 10 Pages Impact Sitewide Rankings? via @sejournal, @martinibuster

Google’s John Mueller answered a question about sitewide impacts on a site with ten pages that lost rankings in the March/April 2024 Core Update then subsequently experienced a sitewide collapse in May.

Can 10 Pages Trigger A Sitewide Penalty?

The person asking the question on Reddit explained that they had ten pages (out of 20,000) that were hit by the Helpful Content Update (HCU) in September 2023. They subsequently updated the pages, which eventually recovered their rankings and traffic. Things were fine until the same ten pages got slammed by the March/April core update. The precise date of the second ranking drop was April 20th.

Up to that point the rest of the site was fine. Only the same ten pages were affected. That changed on May 7th when the site experienced a sitewide drop in rankings across all 20,000 pages of the website.

Their question was if the ten problematic pages triggered a sitewide impact or whether the May 7th collapse was due to the Site Reputation Abuse penalties that were announced on May 6th.

A Note About Diagnosing Ranking Drops

I’m not commenting specifically about the person who asked the question but… the question has the appearance of correlating ranking drops with specific parts of announced algorithm updates.

Here is the exact wording:

“Our website has about 20K pages, and we found that around 10 pages were hit by HCU in September. We updated those articles and saw a recovery in traffic, but after the March core update around April 20, the same pages were hit again, likely due to HCU. On May 7th, we saw a sharp drop in rankings across the board, and suspect that a sitewide classifier may have been applied.

Question: Can an HCU hit on 10 pages cause a sitewide classifier for 20K pages? Or on May 7th reputation abuse update may had an impact?”

In general it’s reasonable to assume that a ranking drop is connected to a recently announced Google update when the dates of both events match. However, it bears pointing out that a core algorithm update can affect multiple things (for example query-content relevance) and it should be understood that the HCU is no longer a single system.

The person asking the question is following a pattern that I often see which is that they’re assuming that ranking drops are due to something wrong with their site but that’s not always the case, it could be changes in how Google interprets a search query (among many other potential reasons).

The other potential mistake is assuming that the problem is related to a specific algorithm. The person asking the question assumes they were hit by the HCU system, which is something that no longer exists. All the elements of the HCU were subsumed into the core ranking algorithm as signals.

Here is what Google’s documentation says about what happened to the HCU:

“Is there a single “helpful content system” that Google Search uses for ranking?
Our work to improve the helpfulness of content in search results began with what we called our “helpful content system” that was launched in 2022. Our processes have evolved since. There is no one system used for identifying helpful content. Instead, our core ranking systems use a variety of signals and systems.”

While Google is still looking for helpfulness in content there is no longer a helpful content system that’s de-ranking pages on specific dates.

The other potential evidence of faulty correlation is when the Redditor asked if their May 7th sitewide collapse was due to the site reputation abuse penalties. The site reputation abuse penalties weren’t actually in effect by May 7th. On May 6th it was announced that site reputation abuse manual actions would begin at some point in the near future.

Those are two examples of how it can be misleading to correlate site ranking anomalies with announced updates. There is more to diagnosing updates than correlating traffic patterns to announced updates. Site owners and SEOs who diagnose problems in this manner risk approaching the solution like someone who’s focusing on the map instead of looking at the road.

Properly diagnosing issues requires understanding the full range of technical issues that can impact a site, as well as the algorithmic changes that can happen on Google’s side (especially unannounced changes). I have over 20 years of experience and know enough to identify anomalies in the SERPs that indicate changes in how Google is approaching relevance.

Complicating the diagnosis is that sometimes it’s not something that needs “fixing”; rather, the competition is doing something more right than the sites that lost rankings. “More right” can be a wide range of things.

Ten Pages Caused Sitewide “Penalty”?

John Mueller responded by first addressing the specific issue of sitewide ranking collapse, remarking that he doesn’t think it’s likely that ten pages would cause 20,000 other pages to lose rankings.

John wrote:

“The issues more folks post about with regards to core updates tend to be site-wide, and not limited to a tiny subset of a site. The last core update was March/April, so any changes you’d be seeing from May would be unrelated. I’m not sure how that helps you now though :-), but I wouldn’t see those 10 pages as being indicative of something you need to change across 20k other pages.”

Sometimes It’s More Than Announced Updates

John Mueller didn’t offer a diagnosis of what is wrong with the site, that’s impossible to say without actually seeing the site. SEOs on YouTube, Reddit and Facebook routinely correlate ranking drops with recently announced updates but as I wrote earlier in this article, that could be a mistake.

When diagnosing a drop in rankings it’s important to look at the site, the competition and the SERPs.

Do:

  • Inspect the website
  • Review a range of keywords and respective changes in the SERPs
  • Inspect the top ranked sites

Don’t:

  • Assume that a ranking drop is associated with a recent update and stop your investigation right there.

Google’s John Mueller alludes to the complexity of diagnosing ranking drops by mentioning that sometimes it’s not even about SEO, which is 100% correct.

John explained:

“Based on the information you posted, it’s also impossible to say whether you need to improve / fix something on those 20k pages, or if the world has just moved on (in terms of their interests, their expectations & your site’s relevance).

It sounds like you did find things to make more “helpful” on those 10 pages, maybe there’s a pattern? That’s something for you to work out – you know your site, its content, its users best. This isn’t an easy part of SEO, sometimes it’s not even about SEO.”

Look At The Road Ahead

It’s been a trend now that site owners focus on recent announcements by Google as clues to what is going on with their sites. It’s a reasonable thing to do and people should 100% keep doing that. But don’t make that the limit of your gaze because there is always the possibility that there is something else going on.

Featured Image by Shutterstock/vovan

Google Quietly Fixed Site Names In Search Results via @sejournal, @martinibuster

Google resolved a site name issue, ongoing since September 2023, that prevented a website’s site name from properly appearing when an inner page was ranked in the search results.

Site Names In The Search Results

A site name is exactly what it sounds like: the name of a website as displayed in the search engine results pages (SERPs). The feature helps users identify which site a given result comes from.

If your site name is Acme Anvil Company, and that’s how the company is known, then Google wants to display Acme Anvil Company in the search results. If Acme Anvil Company is better known as the AAC and that’s what the company wants to show in the SERPs, then that’s what Google wants to show.

Google allows site owners to use the “WebSite” structured data on the home page to specify the correct site name that Google should use.
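For illustration, and reusing the hypothetical Acme example above, the home page markup Google reads for site names looks roughly like the following sketch (the URL and names are placeholders). The Python here simply generates the JSON-LD script tag that would be embedded in the home page’s HTML:

```python
import json

# Minimal sketch of WebSite structured data for site names.
# "Acme Anvil Company" / "AAC" echo the hypothetical example above;
# replace the name, alternateName, and url with your own values.
site_name_markup = {
    "@context": "https://schema.org",
    "@type": "WebSite",
    "name": "Acme Anvil Company",
    "alternateName": "AAC",
    "url": "https://www.example.com/",
}

# Print the <script> block that would go in the home page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(site_name_markup, indent=2))
print("</script>")
```

Google reads this markup from the home page only; as the warning below describes, propagating the chosen name to inner pages was the part that broke.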

Problem Propagating Site Names

Back on September 7, 2023, Google published a warning in its site name documentation acknowledging problems propagating the site name to the inner pages of a site when those inner pages were shown in the SERPs.

This is the warning that was published:

“Known issue: site name isn’t appearing for internal pages
In some cases, a site name shown for a home page may not have propagated to appear for other pages on that site. For example, example.com might be showing a site name that’s different from example.com/internal-page.html.

We’re actively working to address this. We will update this help page when this issue is resolved. In the meantime, if your home page is showing the site name you prefer, understand that it should also appear for your internal pages eventually.”

Google Fixes Site Name Problem

The documentation for the site name problem was recently removed. A changelog for Google documentation noted this:

“Resolving the issue with site names and internal pages
What: Removed the warning about the issue that was preventing new site names from propagating to internal pages.

Why: The issue has been resolved. Keep in mind that it takes time for Google to recrawl and process the new information, including recrawling your internal pages.”

There’s no word on what caused the site name propagation problem, but it is interesting that it was finally fixed after all this time. One has to wonder whether it took so long because it was a low priority, or because something changed on the backend of Google’s systems that finally allowed a fix.

Read Google’s updated site names documentation:

Provide a site name to Google Search

Featured Image by Shutterstock/Cast Of Thousands

Google On How It Manages Disclosure Of Search Incidents via @sejournal, @martinibuster

Google’s latest Search Off The Record podcast discussed examples of disruptive incidents that can affect crawling and indexing, and the criteria for deciding whether or not to disclose the details of what happened.

Complicating the decision to make a statement is that there are times when SEOs and publishers report that Search is broken when, from Google’s point of view, things are working the way they’re supposed to.

Google Search Has A High Uptime

The interesting part of the podcast began with the observation that Google Search (the home page with the search box) itself has an “extremely” high uptime and rarely ever goes down or becomes unreachable. Most of the reported issues were due to network routing problems on the internet itself rather than failures within Google’s infrastructure.

Gary Illyes commented:

“Yeah. The service that hosts the homepage is the same thing that hosts the status dashboard, the Google Search Status Dashboard, and it has like an insane uptime number. …the number is like 99.999 whatever.”

John Mueller jokingly responded with the word “nein” (pronounced like the number nine), which means “no” in German:

“Nein. It’s never down. Nein.”

The Googlers admit that the rest of Google Search on the backend does experience outages and they explain how that’s dealt with.

Crawling & Indexing Incidents At Google

Google’s ability to crawl and index web pages is critical for SEO and earnings. Disruption can lead to catastrophic consequences particularly for time-sensitive content like announcements, news and sales events (to name a few).

Gary Illyes explained that there’s a team within Google called Site Reliability Engineering (SRE) that’s responsible for making sure that public-facing systems run smoothly. There’s an entire Google subdomain devoted to site reliability, where they explain that they approach the task of keeping systems operational the same way they approach software problems. They watch over services like Google Search, Ads, Gmail, and YouTube.

The SRE page describes the scope of their mission as ranging from very granular work (fixing individual issues) to larger-scale problems that affect “continental-level service capacity” for billions of users.

Gary Illyes explains (at the 3:18 mark):

“Site Reliability Engineering org publishes their playbook on how they manage incidents. And a lot of the incidents are caught by incidents being issues with whatever systems. They catch them with automated processes, meaning that there are probers, for example, or there are certain rules that are set on monitoring software that looks at numbers.

And then, if the number exceeds whatever value, then it triggers an alert that is then captured by a software like an incident management software.”

February 2024 Indexing Problem

Gary next explains how the February 2024 indexing problem is an example of how Google monitors and responds to incidents that could impact users in search. Part of the response is figuring out if it’s an actual problem or a false positive.

He explains:

“That’s what happened on February 1st as well. Basically some number went haywire, and then that opened an incident automatically internally. Then we have to decide whether that’s a false positive or it’s something that we need to actually look into, as in like we, the SRE folk.

And, in this case, they decided that, yeah, this is a valid thing. And then they raised the priority of the incident to one step higher from whatever it was.

I think it was a minor incident initially and then they raised it to medium. And then, when it becomes medium, then it ends up in our inbox. So we have a threshold for medium or higher. Yeah.”

Minor Incidents Aren’t Publicly Announced

Gary Illyes next explained that they don’t publicly communicate every little incident, because most of the time users won’t even notice. The most important consideration is whether the incident affects users; problems that do are automatically boosted to a higher priority level.

Gary said he didn’t work in SRE, so he was unable to comment on exactly how many users need to be affected before Google decides to make a public announcement.

Gary explained:

“SRE would investigate everything. If they get a prober alert, for example, or an alert based on whatever numbers, they will look into it and will try to explain that to themselves.

And, if it’s something that is affecting users, then it almost automatically means that they need to raise the priority because users are actually affected.”

Incident With Images Disappearing

Gary shared another example of an incident, this time involving images that weren’t showing up for users. It was decided that although the user experience was affected, it wasn’t degraded to the point of keeping users from finding what they were searching for, or of making Google unusable. Thus, it’s not just whether users are affected by an incident that causes an escalation in priority, but also how badly the user experience is affected.

The case of the images not displaying was a situation in which they decided not to make a public statement because users could still find the information they needed. Although Gary didn’t mention it, it sounds like an issue recipe bloggers have encountered in the past, where images stopped showing.

He explained:

“Like, for example, recently there was an incident where some images were missing. If I remember correctly, then I stepped in and I said like, “This is stupid, and we should not externalize it because the user impact is actually not bad,” right? Users will literally just not get the images. It’s not like something is broken. They will just not see certain images on the search result pages.

And, to me, that’s just, well, back to 1990 or back to 2008 or something. It’s like it’s still usable and still everything is dandy except some images.”

Are Publishers & SEOs Considered?

Google’s John Mueller asked Gary whether the threshold for making a public announcement is only a degraded user experience, or whether the experience of publishers and SEOs is also considered.

Gary answered (at about the 8 minute mark):

“So it’s Search Relations, not Site Owners Relations, from Search perspective.

But by extension, like the site owners, they would also care about their users. So, if we care about their users, it’s the same group of people, right? Or is that too positive?”

Gary apparently sees his role primarily as Search Relations in the general sense of Google’s users. That may come as a surprise to many in the SEO community, because Google’s own documentation for the Search Off The Record podcast describes the role of the Search Relations team differently:

“As the Search Relations team at Google, we’re here to help site owners be successful with their websites in Google Search.”

Listening to the entire podcast, it’s clear that Googlers John Mueller and Lizzi Sassman are strongly focused on engaging with the search community. So maybe there’s a language issue causing his remark to be interpreted differently than he intended?

What Does Search Relations Mean?

Google explained that they have a process for deciding what to disclose about disruptions in search and it is a 100% sensible approach. But something to consider is that the definition of “relations” is that it’s about a connection between two or more people.

Search is a relation(ship). It is an ecosystem in which two partners participate: the creators (SEOs and site owners) make content, and Google makes it available to its users.

Featured Image by Shutterstock/Khosro

Google Case Study Shows Importance Of Structured Data via @sejournal, @martinibuster

Google published a case study that shows how using structured data and following best practices improved discoverability and brought more search traffic. The case study was about the use of Video structured data but the insights shared are applicable across a range of content types.

The new case study is about an Indonesian publisher called Vidio.

How CDNs Can Cause Indexing Problems

One of the interesting points in the case study is about an issue related to how CDNs can link to image and video files with expiring URLs. The new documentation specifically mentions that it’s important that the CDN uses stable URLs and links to another Google documentation page that goes into more detail.

Google explains that some CDNs use quickly expiring URLs for video and thumbnail files and encourages publishers and SEOs to use just one stable URL for each video. Something interesting to note is that not only does this help Google index the files, it also helps Google collect user interest signals.

This is what the documentation advises:

“Some CDNs use quickly expiring URLs for video and thumbnail files. These URLs may prevent Google from successfully indexing your videos or fetching the video files. This also makes it harder for Google to understand users’ interest in your videos over time.

Use a single unique and stable URL for each video. This allows Google to discover and process the videos consistently, confirm they are still available and collect correct signals on the videos.”

Implementing The Correct Structured Data

Google highlighted the importance of using the correct structured data and validating it with Google’s structured data testing tool.
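To make that concrete, here is a minimal sketch of VideoObject markup in the spirit of what the case study describes. Every value is a hypothetical placeholder; note the single, stable contentUrl and thumbnailUrl, in line with the CDN advice above:

```python
import json

# Minimal sketch of VideoObject structured data for a video page.
# All values are hypothetical; contentUrl and thumbnailUrl should be
# single, stable URLs rather than quickly expiring CDN links.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Episode 1: Example Series",
    "description": "Full episode of an example video-on-demand series.",
    "thumbnailUrl": "https://www.example.com/thumbnails/episode-1.jpg",
    "contentUrl": "https://www.example.com/videos/episode-1.mp4",
    "uploadDate": "2024-06-01",
    "duration": "PT42M",  # ISO 8601 duration: 42 minutes
}

# Print the <script> block that would be embedded in the video page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(video_markup, indent=2))
print("</script>")
```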

These are the results Vidio saw from that work:

“Within a year of implementing VideoObject markup, Vidio saw improvements in impressions and clicks on their video pages. While the number of videos that Vidio published from Q1 2022 to Q1 2023 increased by ~30%, adding VideoObject markup made their videos eligible for display in various places on Google.

This led to an increase of ~3x video impressions and close to 2x video clicks on Google Search. Vidio also used the Search Console video indexing report and performance report, which helped them to identify and fix issues for their entire platform.”

Indexing + Structured Data = More Visibility

One key to better search performance was ensuring that Google is able to crawl the URLs, something that can easily be overlooked in the rush to correlate a drop in rankings with a recent algorithm update. Never rule anything out during a site audit.

The case study’s other important recommendation is to make sure the proper structured data is being used. Using the appropriate structured data can help a webpage qualify for improved search visibility through one of Google’s enhanced search features, like featured snippets.

Read Google’s case study:

How Vidio brought more locally relevant video-on-demand (VOD) content to Indonesian users through Google Search

Featured Image by Shutterstock/Anton Vierietin

Rand Fishkin At MozCon: Rethinking Strategies Amid Google API “Leak” via @sejournal, @MattGSouthern

At the MozCon industry conference this week, Rand Fishkin, the outspoken former CEO of Moz and founder of SparkToro, shared his opinion on how SEOs and marketers should potentially adjust strategies based on his interpretation of the recent Google API leaks.

In a packed session with Dr. Pete Meyers, Fishkin laid out specific ways he believes the leaked information, which has not been verified, could impact best practices.

Fishkin firmly believes the leaks contradict Google’s public statements about its systems.

“Google has been unkind and unfair. They have been abusive about this,” Fishkin stated, though these are his opinions based on reviewing the leaks.

On Google’s lack of transparency, Fishkin states:

“Google has told us off and on that they don’t use clicks for ranking. And I always heard it, maybe this is charitable on my part, as we don’t use capital ‘C’ clicks for capital ‘R’ ranking. And the truth is, I think even that was charitable on my case.

And we’ve seen in not just these documents, but anyone who’s familiar with Andrew Navick’s testimony last year, it’s really confirming a lot of what we saw, a lot of what we saw with Navboost.”

He adds:

“They have lied through either omission or misinformation.”

Fishkin’s Recommendations

Fishkin admitted he was speculating, but provided concrete examples of how SEO strategies could change if his interpretations of the leaks are accurate.

However, these are his opinions, not directives. Among his potential recommendations:

1. Invest In Author/Entity Authority

Surprised by the continued emphasis on authorship and entity signals in the leaked code, Fishkin said brands should prioritize hiring writers with established reputational authority that Google already associates with quality content.

Fishkin said this is what he’s going to do differently:

“We’re going to hire a content marketer, basically a part-time content person, to make sure that the SparkToro blog has a couple of new posts on it every week.

And all that authorship and entity stuff made me think we should find someone who already has a profile.”

2. Supplement Link-building With Public Relations

According to Fishkin, the leaks uncovered potential evidence that Google devalues links to sites without sufficient brand awareness and search volume.

As a result, he recommends accompanying traditional link acquisition with broader brand-building efforts like PR and advertising to increase branded search demand.

Fishkin stated:

“If you get a whole bunch of links in one day and nothing else, guess what? You manipulated the link graph.

If you’re really a big brand, people should be talking about you.”

3. Embrace Geographic Nuance

With abundant references to geographic and country-specific signals throughout the code, Fishkin cautioned against one-size-fits-all global strategies.

What works for major markets like the US may prove ineffective for smaller regions where Google needs more data.

Fishkin advised attendees:

“I would encourage you to think about SEO as being more geographically specific than you think it is even for web search results.”

4. Rediscover Experimentation

More than anything, Fishkin hopes the leaks will catalyze a renewed sense of curiosity and skepticism within SEO.

On the value of experimentation, Fishkin says:

“We’ve seen it over and over. One thing we’ve lost, I feel like, is that spirit of experimentation. And with these things coming out where I don’t think we can take what Google says for granted, how do you see, how do we get that back?”

He challenged practitioners to move beyond regurgitating Google’s public statements and instead embrace testing to uncover what drives results.

Referring to an unexplained metric surfaced in the leaks, Fishkin states:

“My dream would be that if I were to come back to MozCon next year, somebody would be on this stage, and they’d be like, ‘Guys, I figured out what Keto score is. Publish that. I’ll amplify it.”

A Wakeup Call?

In many ways, Fishkin framed the leaks as a pivotal moment for an industry he believes has grown insular, conflict-averse, and too accepting of Google’s carefully crafted narratives.

His call to action left some energized and others put off by its unrestrained bluntness.

But whether one admires Fishkin’s brash delivery or not, the leaks have undeniably cracked open Google’s black box.

For those willing to dig into the technical details and chart their path through testing, Fishkin argues lucrative opportunities await those who stop taking Google’s word as gospel.

A Word Of Caution Regarding The Google API Leak

Doubts have emerged about the true nature and significance of this “leak.”

Evidence suggests the data may be connected to Google’s public Document AI Warehouse API rather than exposing the ranking system’s inner workings. The information also appears to be at least five years old.

While Fishkin’s plans to adjust his SEO tactics are interesting, they should be taken with a grain of salt, given the ongoing debate over what the data really signifies.

It illustrates the importance of vetting sources when evaluating any supposed “insider information” about how search engines operate.

As the discussion around the Google “leak” continues, be careful not to fall victim to confirmation bias—seeing the data through the lens of pre-existing theories rather than objectively assessing it.


Featured Image: Taken by author at MozCon, June 2024.