Google Responds: Is Desktop SEO Still Necessary? via @sejournal, @martinibuster

Google’s John Mueller responded to a question about whether it’s okay to stop optimizing a desktop version of a website now that Google is switching over to exclusively indexing mobile versions of websites.

The question relates to an announcement Google made a week earlier:

“…the small set of sites we’ve still been crawling with desktop Googlebot will be crawled with mobile Googlebot after July 5, 2024. … After July 5, 2024, we’ll crawl and index these sites with only Googlebot Smartphone. If your site’s content is not accessible at all with a mobile device, it will no longer be indexable.”

Stop Optimizing Desktop Version Of A Site?

The person asking the question wanted to know if it’s okay to abandon optimizing a purely desktop version of a site and just focus on the mobile-friendly version. They’re asking because they’re new to a company where the developers are far into the process of developing a mobile-only version of the site.

This is the question:

“I am currently in a discussion at my new company, because they are implementing a different mobile site via dynamic serving instead of just going responsive. Next to requirements like http vary header my reasoning is that by having two code bases we need to crawl, analyze and optimize two websites instead of one. However, this got shut down because “due to mobile first indexing we no longer need to optimize the desktop website for SEO”. I read up on all the google docs etc. but I couldn’t find any reasons as to why I would need to keep improving the desktop website for SEO, meaning crawlability, indexability, using correct HTML etc. etc. What reasons are there, can you help me?”
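For background on the “http vary header” mentioned in the question: with dynamic serving, the same URL returns different HTML to mobile and desktop user agents, and Google’s dynamic serving documentation recommends sending the Vary: User-Agent response header so crawlers and caches know the response differs by user agent. Here is a minimal sketch, assuming a Flask app (the templates and user-agent heuristic are invented for illustration):

```python
# Minimal sketch of dynamic serving (illustrative; the templates and the
# user-agent heuristic are invented). The key point is the
# "Vary: User-Agent" header, which Google's dynamic serving
# documentation recommends when the same URL serves different HTML.
from flask import Flask, render_template, request

app = Flask(__name__)

MOBILE_HINTS = ("Mobi", "Android", "iPhone")  # crude heuristic

@app.route("/")
def home():
    ua = request.headers.get("User-Agent", "")
    is_mobile = any(hint in ua for hint in MOBILE_HINTS)
    template = "home_mobile.html" if is_mobile else "home_desktop.html"
    response = app.make_response(render_template(template))
    # Tell crawlers and caches that this URL's HTML varies by user agent.
    response.headers["Vary"] = "User-Agent"
    return response
```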

Mobile-Only Versus Responsive Website

Google’s John Mueller described the benefits of a single responsive website: it eliminates the need to maintain two codebases, and it remains desktop-friendly for visitors who browse with a desktop browser.

He answered:

“First off, not making a responsive site in this day & age seems foreign to me. I realize sometimes things just haven’t been updated in a long time and you might need to maintain it for a while, but if you’re making a new site”

Maintaining A Desktop-Friendly Site Is A Good Idea

Mueller next offered reasons why it’s a good idea to maintain a functional desktop version of a website: other search engines, other crawlers, and site visitors who actually are on desktop devices. Most SEOs understand that conversions, generating income with a website, depend on being accessible to all site visitors. That’s the big picture; optimizing a site for Google is only one part of it.

Mueller explained:

“With mobile indexing, it’s true that Google focuses on the mobile version for web search indexing. However, there are other search engines & crawlers / requestors, and there are other requests that use a desktop user-agent (I mentioned some in the recent blog post, there are also the non-search user-agents on the user-agent documentation page).”

He then said that websites exist for more than just getting crawled and ranked by Google.

“All in all, I don’t think it’s the case that you can completely disregard what’s served on desktop in terms of SEO & related. If you had to pick one and the only reason you’re running the site is for Google SEO, I’d probably pick mobile now, but it’s an artificial decision, sites don’t live in isolation like that, businesses do more than just Google SEO (and TBH I hope you do: a healthy mix of traffic sources is good for peace of mind). And also, if you don’t want to have to make this decision: go responsive.”

After the person asking the question explained that the decision had already been made to focus on mobile, Mueller responded that this is a case of choosing your battles.

“If this is an ongoing project, then shifting to dynamic serving is already a pretty big step forwards. Pick your battles :). Depending on the existing site, sometimes launching with a sub-optimal better version earlier is better than waiting for the ideal version to be completed. I’d just keep the fact that it’s dynamic-serving in mind when you work on it, with any tools that you use for diagnosing, monitoring, and tracking. It’s more work, but it’s not impossible. Just make sure the desktop version isn’t ignored completely :). Maybe there’s also room to grow what the team (developers + leads) is comfortable with – perhaps some smaller part of the site that folks could work on making responsive. Good luck!”

Choose Your Battles Or Stand Your Ground?

John Mueller is right that there are times when it’s better to choose your battles than to dig in and refuse to compromise. But make sure that your recommendations are on record and that those pushing back are on record, too. That way, if things go wrong, the blame will find its way back to the ones who are responsible.

Featured Image by Shutterstock/Luis Molinero

Google: Can 10 Pages Impact Sitewide Rankings? via @sejournal, @martinibuster

Google’s John Mueller answered a question about sitewide impacts on a site where ten pages lost rankings in the March/April 2024 Core Update and the site subsequently experienced a sitewide collapse in May.

Can 10 Pages Trigger A Sitewide Penalty?

The person asking the question on Reddit explained that they had ten pages (out of 20,000) that were hit by the Helpful Content Update (HCU) in September 2023. They subsequently updated the pages, which eventually recovered their rankings and traffic. Things were fine until the same ten pages got slammed by the March/April core update. The precise date of the second ranking drop was April 20th.

Up to that point the rest of the site was fine. Only the same ten pages were affected. That changed on May 7th when the site experienced a sitewide drop in rankings across all 20,000 pages of the website.

Their question was whether the ten problematic pages triggered a sitewide impact or whether the May 7th collapse was due to the site reputation abuse penalties announced on May 6th.

A Note About Diagnosing Ranking Drops

I’m not commenting specifically about the person who asked the question but… the question has the appearance of correlating ranking drops with specific parts of announced algorithm updates.

Here is the exact wording:

“Our website has about 20K pages, and we found that around 10 pages were hit by HCU in September. We updated those articles and saw a recovery in traffic, but after the March core update around April 20, the same pages were hit again, likely due to HCU. On May 7th, we saw a sharp drop in rankings across the board, and suspect that a sitewide classifier may have been applied.

Question: Can an HCU hit on 10 pages cause a sitewide classifier for 20K pages? Or on May 7th reputation abuse update may had an impact?”

In general it’s reasonable to assume that a ranking drop is connected to a recently announced Google update when the dates of both events match. However, it bears pointing out that a core algorithm update can affect multiple things (for example query-content relevance) and it should be understood that the HCU is no longer a single system.

The person asking the question is following a pattern I often see: assuming that a ranking drop is due to something wrong with the site. That’s not always the case; the cause could be a change in how Google interprets a search query, among many other potential reasons.

The other potential mistake is assuming that the problem is related to a specific algorithm. The person asking the question assumes they were hit by the HCU system, which is something that no longer exists. All the elements of the HCU were subsumed into the core ranking algorithm as signals.

Here is what Google’s documentation says about what happened to the HCU:

“Is there a single “helpful content system” that Google Search uses for ranking?
Our work to improve the helpfulness of content in search results began with what we called our “helpful content system” that was launched in 2022. Our processes have evolved since. There is no one system used for identifying helpful content. Instead, our core ranking systems use a variety of signals and systems.”

While Google is still looking for helpfulness in content, there is no longer a helpful content system that’s de-ranking pages on specific dates.

The other potential evidence of faulty correlation is the Redditor asking whether their May 7th sitewide collapse was due to the site reputation abuse penalties. Those penalties weren’t actually in effect by May 7th; on May 6th, Google announced that site reputation abuse manual actions would begin at some point in the near future.

Those are two examples of how it can be misleading to correlate site ranking anomalies with announced updates. There is more to diagnosing updates than correlating traffic patterns to announced updates. Site owners and SEOs who diagnose problems in this manner risk approaching the solution like someone who’s focusing on the map instead of looking at the road.

Properly diagnosing issues requires understanding the full range of technical issues that can impact a site and the algorithmic changes that can happen on Google’s side (especially unannounced changes). I have over 20 years of experience and know enough to identify anomalies in the SERPs that indicate changes to how Google is approaching relevance.

Complicating the diagnosis is that sometimes it’s not something that needs “fixing” but rather that the competition is doing something more right than the sites that lost rankings. “More right” can mean a wide range of things.

Ten Pages Caused Sitewide “Penalty”?

John Mueller responded by first addressing the specific issue of sitewide ranking collapse, remarking that he doesn’t think it’s likely that ten pages would cause 20,000 other pages to lose rankings.

John wrote:

“The issues more folks post about with regards to core updates tend to be site-wide, and not limited to a tiny subset of a site. The last core update was March/April, so any changes you’d be seeing from May would be unrelated. I’m not sure how that helps you now though :-), but I wouldn’t see those 10 pages as being indicative of something you need to change across 20k other pages.”

Sometimes It’s More Than Announced Updates

John Mueller didn’t offer a diagnosis of what is wrong with the site; that’s impossible to say without actually seeing it. SEOs on YouTube, Reddit, and Facebook routinely correlate ranking drops with recently announced updates, but as I wrote earlier in this article, that could be a mistake.

When diagnosing a drop in rankings it’s important to look at the site, the competition and the SERPs.

Do:

  • Inspect the website
  • Review a range of keywords and respective changes in the SERPs
  • Inspect the top ranked sites

Don’t:

  • Assume that a ranking drop is associated with a recent update and stop your investigation right there.

Google’s John Mueller alludes to the complexity of diagnosing ranking drops by mentioning that sometimes it’s not even about SEO, which is 100% correct.

John explained:

“Based on the information you posted, it’s also impossible to say whether you need to improve / fix something on those 20k pages, or if the world has just moved on (in terms of their interests, their expectations & your site’s relevance).

It sounds like you did find things to make more “helpful” on those 10 pages, maybe there’s a pattern? That’s something for you to work out – you know your site, its content, its users best. This isn’t an easy part of SEO, sometimes it’s not even about SEO.”

Look At The Road Ahead

It has become a trend for site owners to treat Google’s recent announcements as clues to what is going on with their sites. That’s reasonable, and people should keep doing it. But don’t make it the limit of your gaze, because there is always the possibility that something else is going on.

Featured Image by Shutterstock/vovan

Google Quietly Fixed Site Names In Search Results via @sejournal, @martinibuster

Google resolved a site names issue, ongoing since September 2023, that prevented a website’s site name from properly appearing when an inner page was ranked in the search results.

Site Names In The Search Results

A site name is exactly what it sounds like: the name of a website as displayed in the search engine results pages (SERPs). The feature allows users to identify which site a given result comes from.

If your site name is Acme Anvil Company, and that’s how the company is known, then Google wants to display Acme Anvil Company in the search results. If Acme Anvil Company is better known as the AAC and that’s what the company wants to show in the SERPs, then that’s what Google wants to show.

Google allows site owners to use the “WebSite” structured data on the home page to specify the correct site name that Google should use.
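For reference, a minimal sketch of that markup, reusing the article’s hypothetical Acme Anvil Company example (the URL is a placeholder):

```python
# Minimal sketch of the WebSite structured data Google documents for
# site names, generated as JSON-LD. The names and URL reuse the article's
# hypothetical example; only the home page needs this markup.
import json

site_name = {
    "@context": "https://schema.org",
    "@type": "WebSite",
    "name": "Acme Anvil Company",
    "alternateName": "AAC",  # shorter name Google may show instead
    "url": "https://www.example.com/",
}

print('<script type="application/ld+json">')
print(json.dumps(site_name, indent=2))
print("</script>")
```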

Problem Propagating Site Names

Back on September 7, 2023, Google published a warning in its site name documentation acknowledging problems propagating the site name to a site’s inner pages when those pages were shown in the SERPs.

This is the warning that was published:

“Known issue: site name isn’t appearing for internal pages
In some cases, a site name shown for a home page may not have propagated to appear for other pages on that site. For example, example.com might be showing a site name that’s different from example.com/internal-page.html.

We’re actively working to address this. We will update this help page when this issue is resolved. In the meantime, if your home page is showing the site name you prefer, understand that it should also appear for your internal pages eventually.”

Google Fixes Site Name Problem

The warning about the site name problem was recently removed from the documentation. Google’s documentation changelog noted:

“Resolving the issue with site names and internal pages
What: Removed the warning about the issue that was preventing new site names from propagating to internal pages.

Why: The issue has been resolved. Keep in mind that it takes time for Google to recrawl and process the new information, including recrawling your internal pages.”

There’s no word on what caused the site name propagation problem, but it’s interesting that it was finally fixed after all this time. One has to wonder whether it took so long because it was a low priority or because something changed on the backend of Google’s systems that finally allowed a fix.

Read Google’s updated site names documentation:

Provide a site name to Google Search

Featured Image by Shutterstock/Cast Of Thousands

Google On How It Manages Disclosure Of Search Incidents via @sejournal, @martinibuster

Google’s latest Search Off The Record podcast discussed examples of disruptive incidents that can affect crawling and indexing, and the criteria for deciding whether or not to disclose the details of what happened.

Complicating the decision to make a statement is that there are times when SEOs and publishers report that Search is broken when, from Google’s point of view, it’s working the way it’s supposed to.

Google Search Has A High Uptime

The interesting part of the podcast began with the observation that Google Search itself (the home page with the search box) has an “extremely” high uptime and rarely ever goes down and becomes unreachable. Most reported issues were due to network routing problems on the Internet itself rather than failures within Google’s infrastructure.

Gary Illyes commented:

“Yeah. The service that hosts the homepage is the same thing that hosts the status dashboard, the Google Search Status Dashboard, and it has like an insane uptime number. …the number is like 99.999 whatever.”
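(For context, 99.999% uptime works out to roughly five minutes of total downtime per year.)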

John Mueller jokingly responded with the word “nein” (pronounced like the number nine), which means “no” in German:

“Nein. It’s never down. Nein.”

The Googlers admit that the rest of Google Search on the backend does experience outages and they explain how that’s dealt with.

Crawling & Indexing Incidents At Google

Google’s ability to crawl and index web pages is critical for SEO and earnings. Disruption can lead to catastrophic consequences, particularly for time-sensitive content like announcements, news, and sales events (to name a few).

Gary Illyes explained that there’s a team within Google called Site Reliability Engineering (SRE) that’s responsible for making sure the public-facing systems run smoothly. There’s an entire Google subdomain devoted to site reliability engineering, where the team explains that it approaches the task of keeping systems operational as a software problem. They watch over services like Google Search, Ads, Gmail, and YouTube.

The SRE page describes the scope of their mission as ranging from the very granular (fixing individual things) to larger-scale problems that affect “continental-level service capacity” for users numbering in the billions.

Gary Illyes explains (at the 3:18 mark):

“Site Reliability Engineering org publishes their playbook on how they manage incidents. And a lot of the incidents are caught by incidents being issues with whatever systems. They catch them with automated processes, meaning that there are probers, for example, or there are certain rules that are set on monitoring software that looks at numbers.

And then, if the number exceeds whatever value, then it triggers an alert that is then captured by a software like an incident management software.”
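As an illustration of the pattern Gary describes, not Google’s actual tooling, a threshold rule that opens an incident might look something like this (the metric name, threshold, and priority labels are invented):

```python
# Illustrative sketch of threshold-based incident alerting: monitoring
# software watches a number and opens an incident when it exceeds a set
# value. Not Google's tooling; metric, threshold, and priorities invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Incident:
    metric: str
    observed: float
    priority: str  # e.g. "minor" or "medium"

def check(metric: str, observed: float, threshold: float) -> Optional[Incident]:
    """Open an incident if the observed value exceeds the threshold."""
    if observed <= threshold:
        return None  # within the normal range, nothing to do
    # Incidents start small; further rules or humans may escalate them,
    # e.g. from "minor" to "medium" once user impact is confirmed.
    return Incident(metric=metric, observed=observed, priority="minor")

incident = check("indexing_error_rate", observed=0.07, threshold=0.05)
if incident:
    print(f"Incident opened: {incident}")
```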

February 2024 Indexing Problem

Gary next explains how the February 2024 indexing problem is an example of how Google monitors and responds to incidents that could impact users in search. Part of the response is figuring out if it’s an actual problem or a false positive.

He explains:

“That’s what happened on February 1st as well. Basically some number went haywire, and then that opened an incident automatically internally. Then we have to decide whether that’s a false positive or it’s something that we need to actually look into, as in like we, the SRE folk.

And, in this case, they decided that, yeah, this is a valid thing. And then they raised the priority of the incident to one step higher from whatever it was.

I think it was a minor incident initially and then they raised it to medium. And then, when it becomes medium, then it ends up in our inbox. So we have a threshold for medium or higher. Yeah.”

Minor Incidents Aren’t Publicly Announced

Gary Illyes next explained that they don’t communicate every little incident because most of the time users won’t even notice. The most important consideration is whether an incident affects users; problems that do are automatically boosted to a higher priority level. Gary said he doesn’t work in SRE, so he was unable to comment on the exact number of users that need to be affected before Google decides to make a public announcement.

Gary explained:

“SRE would investigate everything. If they get a prober alert, for example, or an alert based on whatever numbers, they will look into it and will try to explain that to themselves.

And, if it’s something that is affecting users, then it almost automatically means that they need to raise the priority because users are actually affected.”

Incident With Images Disappearing

Gary shared another example of an incident, this time about images that weren’t showing up for users. Google decided that although the user experience was degraded, it wasn’t degraded to the point of keeping users from finding what they were searching for or making Google unusable. So it’s not just whether users are affected by an incident that causes an escalation in priority, but also how badly the user experience is affected.

In the case of the images not displaying, they decided not to make a public statement because users could still find the information they needed. Although Gary didn’t mention it, it sounds like an issue recipe bloggers have encountered in the past, where images stopped showing.

He explained:

“Like, for example, recently there was an incident where some images were missing. If I remember correctly, then I stepped in and I said like, “This is stupid, and we should not externalize it because the user impact is actually not bad,” right? Users will literally just not get the images. It’s not like something is broken. They will just not see certain images on the search result pages.

And, to me, that’s just, well, back to 1990 or back to 2008 or something. It’s like it’s still usable and still everything is dandy except some images.”

Are Publishers & SEOs Considered?

Google’s John Mueller asked Gary whether the threshold for making a public announcement was only a degraded user experience, or whether the experience of publishers and SEOs was also considered.

Gary answered (at about the 8 minute mark):

“So it’s Search Relations, not Site Owners Relations, from Search perspective.

But by extension, like the site owners, they would also care about their users. So, if we care about their users, it’s the same group of people, right? Or is that too positive?”

Gary apparently sees his role primarily as Search Relations in the general sense of Google’s users. That may come as a surprise to many in the SEO community because Google’s own documentation for the Search Off The Record podcast explains the role of the Search Relations team differently:

“As the Search Relations team at Google, we’re here to help site owners be successful with their websites in Google Search.”

Listening to the entire podcast, it’s clear that Googlers John Mueller and Lizzi Sassman are strongly focused on engaging with the search community. So maybe a language issue is causing Gary’s remark to come across differently than he intended?

What Does Search Relations Mean?

Google explained that they have a process for deciding what to disclose about disruptions in Search, and it is a 100% sensible approach. But something to consider is that “relations,” by definition, is about a connection between two or more people.

Search is a relation(ship). It is an ecosystem with two partners: the creators (SEOs and site owners) who make the content, and Google, which makes it available to its users.

Featured Image by Shutterstock/Khosro

Bluehost Launches AI WordPress Website Creator via @sejournal, @martinibuster

Bluehost launched an AI Website Creator that enables users to quickly create professional websites. It’s an evolution of the click-and-build website builder, making it easy for anyone to create a WordPress website and benefit from the power and freedom of the open source community.

The importance of this for businesses and agencies cannot be overstated: it allows agencies to scale WordPress site creation and puts professional WordPress sites within reach of virtually everyone.

Point And Click Website Creation

Bluehost offers an easy website building experience that combines the ease of point-and-click site creation with the freedom of the WordPress open source content management system. The heart of this system is called WonderSuite.

WonderSuite comprises multiple components, such as a user interface that walks a user through site creation with a series of questions. There is also a library of patterns and templates, and an easy-to-configure shopping cart: essentially all the building blocks for creating a site and doing business online quickly and easily.

The new AI Website Creator functionality is the newest addition to the WonderSuite site builder.

AI Website Builder

An AI website builder is the natural evolution of the point-and-click site creation process. Rather than moving a cursor around a screen, the new way to build a website is with an AI that acts as a designer, responding to a user’s website needs.

The AI asks questions and starts building the website using open source WordPress components and plugins. Fonts, professional color schemes, and plugins are all installed as needed, completely automatically. Users can also save custom generated options for future use which should be helpful for agencies that need to scale client website creation.

Ed Jay, President of Newfold Digital, the parent company of Bluehost, commented:

“Efficiency and ease are what WordPress entrepreneurs and professionals need and our team at Bluehost is dedicated to deliver these essentials to all WordPress users across the globe. With AI Website Creator, any user can rely on the Bluehost AI engine to create their personalized website in just minutes. After answering a few simple questions, our AI algorithm leverages our industry leading WordPress experience, features and technology, including all aspects of WonderSuite, to anticipate the website’s needs and ensure high quality outcomes.

The AI Website Creator presents users with multiple fully functional, tailored and customizable website options that provide a powerful but flexible path forward. It even generates images and content aligned with the user’s brief input, expediting the website off the ground and ready for launch.”

Future Of Website Creation

Bluehost’s innovative AI site creator represents the future of how businesses get online and how entrepreneurs who service clients can streamline site creation and scale their business with WordPress.

Read more about Bluehost’s new AI Website Creator:

WordPress made wonderful with AI

Featured Image by Shutterstock/Simple Line

Automattic For Agencies: A New Way To Monetize WordPress via @sejournal, @martinibuster

Automattic, the company behind WordPress.com, Jetpack, WooCommerce, and more, has announced a new program to woo agencies into its ecosystem of products with more ways to earn revenue.

The new program could be seen as putting Automattic into direct competition with closed source systems like Wix and Duda, but there are clear differences among the three products and services.

Automattic For Agencies

Automattic for Agencies brings together multiple Automattic products into a single service with a dashboard for managing multiple client sites and billing. The program offers a unified location for managing client sites, as well as discounted pricing and revenue sharing opportunities. Aside from the benefits of streamlining, the program also offers technical support across all of the Automattic products that are part of it. Lastly, the program offers agencies managed security and performance improvements.

According to the announcement:

“We worry about site performance and security so you don’t have to. When you connect your sites to the Automattic for Agencies dashboard, you’ll receive instant notifications about updates and alerts, so your sites stay problem-free and your clients stay happy.”

Revenue Share And Discounts

Agencies can now earn a revenue share on the Automattic products used by clients. For example, agencies can earn a 50% revenue share on Jetpack product referrals, including renewals. As part of the program, Jetpack also offers discounts on licenses, starting at 10% off for five licenses and rising to as much as 50% off for 100 licenses.

As part of the new program, there are similar benefits for agencies that build or manage WooCommerce sites, with discounted agency pricing and a referral program.

WordPress.com, the managed WordPress hosting subsidiary of Automattic, is offering a 20% revenue share on new subscriptions and a 50% share on migrations from other hosts.

A tweet from WordPress.com described the new program:

“Agencies, we’ve got some news for you!

Our new referral program is live, and as a referrer of http://WordPress.com’s services, your agency will receive a 20% revenue share on new subscriptions and 50% on new migrations to http://WordPress.com from other hosting providers.”

New Directory For Agencies

A forthcoming benefit of the Automattic For Agencies program is a business directory that lists participating agencies. The benefit of the directory is presumably that it may lead to business referrals for those agencies.

The Jetpack announcement describes the new directory:

“Gain heightened visibility through multiple directory listings across Automattic’s business units. This increased exposure creates more opportunities for potential clients to find and engage with your services, helping you grow your agency’s reach and reputation.”

The WooCommerce announcement describes the directory like this:

“Expand your reach
Increase your visibility with partner directory listings across multiple Automattic brands.”

Automattic Affiliate Program

The Automattic for Agencies announcement follows the rollout of a separate affiliate program, which offers up to a 100% referral bonus for affiliates who refer new hosting clients (with a limit of a $300 payout per item) and up to a 50% referral bonus for Jetpack plugin subscriptions. The program has a 30-day cookie window, giving affiliates the opportunity to earn referral bonuses on any additional sales made within that period.

Read more about the new program:

Live the Suite Life With Automattic For Agencies

Featured Image by Shutterstock/Volodymyr TVERDOKHLIB

Google Case Study Shows Importance Of Structured Data via @sejournal, @martinibuster

Google published a case study that shows how using structured data and following best practices improved discoverability and brought more search traffic. The case study is about Video structured data, but the insights shared are applicable across a range of content types.

The new case study is about an Indonesian publisher called Vidio.

How CDNs Can Cause Indexing Problems

One of the interesting points in the case study concerns how CDNs can link to image and video files with expiring URLs. The documentation specifically mentions that it’s important for the CDN to use stable URLs, and it links to another Google documentation page that goes into more detail.

Google explains that some CDNs use quickly expiring URLs for video and thumbnail files and encourages publishers and SEOs to use just one stable URL for each video. It’s worth noting that not only does this help Google index the files, it also helps Google collect user interest signals.

This is what the documentation advises:

“Some CDNs use quickly expiring URLs for video and thumbnail files. These URLs may prevent Google from successfully indexing your videos or fetching the video files. This also makes it harder for Google to understand users’ interest in your videos over time.

Use a single unique and stable URL for each video. This allows Google to discover and process the videos consistently, confirm they are still available and collect correct signals on the videos.”

Implementing The Correct Structured Data

Google highlighted the importance of using the correct structured data and validating it with Google’s structured data testing tool.
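For illustration, here is a minimal VideoObject sketch with a single, stable contentUrl and thumbnailUrl, in line with the guidance above (all values are placeholders, not from the Vidio case study):

```python
# Minimal VideoObject JSON-LD sketch. All values are placeholders, not
# from the Vidio case study. Note the single stable contentUrl and
# thumbnailUrl, per Google's guidance against quickly expiring CDN URLs.
import json

video = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Example episode title",
    "description": "A short description of the video.",
    "thumbnailUrl": ["https://www.example.com/thumbs/episode-1.jpg"],
    "uploadDate": "2024-01-15",
    "contentUrl": "https://www.example.com/videos/episode-1.mp4",
}

print('<script type="application/ld+json">')
print(json.dumps(video, indent=2))
print("</script>")
```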

These are the results of the above work:

“Within a year of implementing VideoObject markup, Vidio saw improvements in impressions and clicks on their video pages. While the number of videos that Vidio published from Q1 2022 to Q1 2023 increased by ~30%, adding VideoObject markup made their videos eligible for display in various places on Google.

This led to an increase of ~3x video impressions and close to 2x video clicks on Google Search. Vidio also used the Search Console video indexing report and performance report, which helped them to identify and fix issues for their entire platform.”

Indexing + Structured Data = More Visibility

A key to better search performance was ensuring that Google is able to crawl the URLs, something that can easily be overlooked in the rush to correlate a drop in rankings with a recent algorithm update. Never rule anything out during a site audit.

Another important recommendation from the case study is to ensure that the proper structured data is being used. The appropriate structured data can help make a webpage eligible for improved search visibility through one of Google’s enhanced search features, like featured snippets.

Read Google’s case study:

How Vidio brought more locally relevant video-on-demand (VOD) content to Indonesian users through Google Search

Featured Image by Shutterstock/Anton Vierietin

Rand Fishkin At MozCon: Rethinking Strategies Amid Google API “Leak” via @sejournal, @MattGSouthern

At the MozCon industry conference this week, Rand Fishkin, the outspoken former CEO of Moz and founder of SparkToro, shared his opinion on how SEOs and marketers should potentially adjust strategies based on his interpretation of the recent Google API leaks.

In a packed session with Dr. Pete Meyers, Fishkin laid out specific ways he believes the leaked information, which has not been verified, could impact best practices.

Fishkin firmly believes the leaks contradict Google’s public statements about its systems.

“Google has been unkind and unfair. They have been abusive about this,” Fishkin stated, though these are his opinions based on reviewing the leaks.

On Google’s lack of transparency, Fishkin states:

“Google has told us off and on that they don’t use clicks for ranking. And I always heard it, maybe this is charitable on my part, as we don’t use capital ‘C’ clicks for capital ‘R’ ranking. And the truth is, I think even that was charitable on my case.

And we’ve seen in not just these documents, but anyone who’s familiar with Andrew Navick’s testimony last year, it’s really confirming a lot of what we saw, a lot of what we saw with Navboost.”

He adds:

“They have lied through either omission or misinformation.”

Fishkin’s Recommendations

Fishkin admitted he was speculating as he provided concrete examples of how SEO strategies could change if his interpretations of the leaks are accurate.

However, these are his opinions, not directives. Among his potential recommendations:

1. Invest In Author/Entity Authority

Surprised by the continued emphasis on authorship and entity signals in the leaked code, Fishkin said brands should prioritize hiring writers with established reputational authority that Google already associates with quality content.

Fishkin said this is what he’s going to do differently:

“We’re going to hire a content marketer, basically a part-time content person, to make sure that the SparkToro blog has a couple of new posts on it every week.

And all that authorship and entity stuff made me think we should find someone who already has a profile.”

2. Supplement Link-building With Public Relations

According to Fishkin, the leaks uncovered potential evidence that Google devalues links to sites without sufficient brand awareness and search volume.

As a result, he recommends accompanying traditional link acquisition with broader brand-building efforts like PR and advertising to increase branded search demand.

Fishkin stated:

“If you get a whole bunch of links in one day and nothing else, guess what? You manipulated the link graph.

If you’re really a big brand, people should be talking about you.”

3. Embrace Geographic Nuance

With abundant references to geographic and country-specific signals throughout the code, Fishkin cautioned against one-size-fits-all global strategies.

What works for major markets like the US may prove ineffective for smaller regions where Google needs more data.

Fishkin advised attendees:

“I would encourage you to think about SEO as being more geographically specific than you think it is even for web search results.”

4. Rediscover Experimentation

More than anything, Fishkin hopes the leaks will catalyze a renewed sense of curiosity and skepticism within SEO.

On the value of experimentation, Fishkin says:

“We’ve seen it over and over. One thing we’ve lost, I feel like, is that spirit of experimentation. And with these things coming out where I don’t think we can take what Google says for granted, how do you see, how do we get that back?”

He challenged practitioners to move beyond regurgitating Google’s public statements and instead embrace testing to uncover what drives results.

Referring to an unexplained metric surfaced in the leaks, Fishkin states:

“My dream would be that if I were to come back to MozCon next year, somebody would be on this stage, and they’d be like, ‘Guys, I figured out what Keto score is.’ Publish that. I’ll amplify it.”

A Wakeup Call?

In many ways, Fishkin framed the leaks as a pivotal moment for an industry he believes has grown insular, conflict-averse, and too accepting of Google’s carefully crafted narratives.

His call to action left some energized and others put off by its unrestrained bluntness.

But whether one admires Fishkin’s brash delivery or not, the leaks have undeniably cracked open Google’s black box.

For those willing to dig into the technical details and chart their path through testing, Fishkin argues lucrative opportunities await those who stop taking Google’s word as gospel.

A Word Of Caution Regarding The Google API Leak

Doubts have emerged about the true nature and significance of this “leak.”

Evidence suggests the data may be connected to Google’s public Document AI Warehouse API rather than exposing the ranking system’s inner workings. The information also appears to be at least five years old.

While Fishkin’s plans to adjust his SEO tactics are interesting, they should be taken with a grain of salt, given the ongoing debate over what the data really signifies.

It illustrates the importance of vetting sources when evaluating any supposed “insider information” about how search engines operate.

As the discussion around the Google “leak” continues, be careful not to fall victim to confirmation bias—seeing the data through the lens of pre-existing theories rather than objectively assessing it.


Featured Image: Taken by author at MozCon, June 2024. 

SEO In Crisis? Moz Search Scientist Warns Of Challenges Ahead via @sejournal, @MattGSouthern

Are the days of organic SEO numbered? That’s the idea raised by a search scientist’s assessment of Google’s AI-powered disruptions.

At MozCon’s 20th annual conference, Tom Capper, Moz’s Senior Search Scientist, provided a data-driven reality check.

Capper warned attendees:

“At the end of this talk, I’m going to tell you that full-funnel organic marketing is borderline impossible in 2024 for most businesses.”

He examined how Google’s AI overview results, aggressive monetization, and evolving search intents pose challenges for companies relying on SEO.

Additionally, in an exclusive interview with Search Engine Journal, Capper highlighted potential paths forward for those willing to pivot.

Photo taken by author at MozCon, June 2024.

The Zero-Click Threat

Capper opened by chronicling the rise of search “intents” like informational, navigational, commercial, and transactional queries.

Google’s new AI Overview feature, which generates direct answers at the top of the page, has proven particularly disruptive for informational searches.

“Organic is a really tough game for informational intent,” said Capper, displaying data that informational searches have the lowest share of voice for traditional organic results due to AI Overviews and other SERP features.

Photo taken by author at MozCon, June 2024.

He also noted 21% of informational searches now surface a Featured Snippet result, which can satisfy users without a click.

“You basically can’t play at the top of the funnel,” he stated bluntly.

AI Overviews A “Mistake”

In the exclusive interview, Capper cautioned that Google’s rush to implement AI overviews could negatively impact the company’s brand image:

“I think Google has gone too soon and rushed this, and yeah, I do think it’s a mistake. That is a little bit dangerous for SEO in that if Google suffers, then that’s disruptive for our industry as well.”

The Commercial Battleground

While the data is dire for informational content, Capper says commercial searches represent a “sweet spot.”

However, these valuable mid-funnel queries have become a “turbulent” and “incredibly contested” battleground.

Weighing in on the Google product reviews update and other recent changes, Capper said:

“Commercial is where a lot of this [Google’s search quality issues] plays out…it’s become an incredibly volatile section.”

Major sites like Amazon, Reddit, and YouTube dominate commercial results alongside a glut of price listings and review rich results. This raises the bar for smaller sites trying to rank.

“There are arts, hobbies, real estate – much more realistic to try and compete in here,” Capper advised.

He warned publishers who rely solely on easily answered questions:

“If that’s what you’ve been doing, you’ve probably been suffering for a long time…If you’re not willing to pivot to any other kind of content, then yeah, sure, go. Find a different channel.”

The Paid & Local Future?

At the bottom funnel, Capper described transactional searches as “pay-to-play unless you’re a brick-and-mortar business.”

Google’s monetization of product listings and its experimental map embeds for transactional queries continue to squeeze out organic visibility.

However, Capper highlighted local SEO as a promising path forward, stating:

“If you can do well in local search, I think even in a worst-case scenario AI Overview rollout, you would still be doing well here.”

Adapting To The Changing Landscape

Despite the challenges posed by AI-powered search features, Capper believes there are still opportunities for organic marketing success.

He offers the following recommendations:

  • Target informational queries that don’t have a featured snippet, allowing for better organic visibility.
  • Focus on less competitive commercial queries in verticals like arts, hobbies, and real estate.
  • Leverage local search optimization for transactional queries, even for businesses without a brick-and-mortar presence.
  • Use keyword modifiers like “best,” “compare,” “top,” and “reviews” to identify commercial intent queries (a small sketch of this follows below).
Photo taken by author at MozCon, June 2024.
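Here is a small sketch of that last modifier tactic (the modifiers are Capper’s examples; the seed keywords are invented):

```python
# Illustrative sketch: expand seed keywords with the commercial-intent
# modifiers Capper mentions. Seed terms are invented for this example.
MODIFIERS = ["best", "compare", "top", "reviews"]

def commercial_queries(seeds: list[str]) -> list[str]:
    """Combine each seed with each modifier, as both a prefix
    ("best running shoes") and a suffix ("running shoes reviews")."""
    queries = []
    for seed in seeds:
        for mod in MODIFIERS:
            queries.append(f"{mod} {seed}")
            queries.append(f"{seed} {mod}")
    return queries

for query in commercial_queries(["running shoes", "standing desk"]):
    print(query)
```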

Looking To The Future

When asked about his advice for SEO professionals who may be disheartened by the AI search revolution, Capper suggests adapting and focusing on creating high-quality, authoritative content.

Capper stated in the exclusive interview:

“If you’ve got any willingness at all to write something more interesting, then I think you can still play an organic.”

Ultimately, Capper remains optimistic about the future of organic search.

In the interview, he points out that Google’s business model depends on sending organic traffic to other sites:

“I don’t think Google will ever reach the point where Google doesn’t send traffic at all because, ultimately, that’s its business model.

People expect when they search Google that they will end up going to other websites; if people don’t have that expectation, they won’t click on ads; if people aren’t clicking on ads, Google doesn’t make any money.”

In Summary

While informational and transactional searches have become challenging to rank for organically, Capper’s research suggests there are opportunities in commercial and local spaces.

To adapt, he recommends focusing on less competitive commercial topics, leveraging local SEO for transactional queries, and creating content beyond simply answering basic questions.


Featured Image: KieferPix/Shutterstock

Google AI Overviews: New Research Offers Insights via @sejournal, @martinibuster

New research by BrightEdge offers a snapshot of the kinds of queries that tend to show Google AI Overviews (AIO) and provides insights into the kinds of queries and verticals where AIO are more prevalent.

The findings show dramatic differences in the number of AI Overviews shown across different verticals, reflecting the kinds of queries that are common in each. The effect works in reverse as well: some verticals see fewer AIO search features.

Is This A Paradigm Shift?

While BrightEdge calls it the greatest paradigm shift in decades, I think that understates the shifts Google Search has already undergone, not just the ones in 2024. Something that’s not widely understood is that Google Search has been an AI search engine since at least 2015, with the introduction of RankBrain and other subsequent changes on the backend of Search.

The big change in Search this year is that AI is more obvious on the front-end as a feature in Search, largely replacing the role that Featured Snippets once played. Perhaps more importantly, there may have been an infrastructure change at the beginning of 2024.

BrightEdge Generative Parser

BrightEdge has a technology, called the Generative Parser, which tracks and analyzes patterns in Google’s AI search features. BrightEdge used their Generative Parser to produce research findings about Google’s new AI Overviews (AIO) search feature.

Albert Gouyet, VP of Operations at BrightEdge, said this about the BrightEdge Generative Parser:

“It’s fascinating to see the BrightEdge Generative Parser™ giving marketers a front-row seat into how AI in search is developing and giving the community a glimpse into the future. For marketers who rely on organic traffic, early indications suggest that AI will help reach new customers and present new opportunities to create content that serves multiple needs and elevates brand performance.”

What Triggers AIO

BrightEdge’s report indicates that Featured Snippets and questions were likely to trigger the AIO feature. Featured Snippets are answers to questions that are created with direct quotes from websites. BrightEdge found that AI Overviews were more likely to appear when there was also a Featured Snippet.

What Doesn’t Trigger AI Overviews

The research showed that local search queries were the least likely to trigger an AI Overview search result. That makes sense because a user is looking for structured results (business names, addresses, phone numbers), information that can’t be usefully summarized.

Similarly, search queries that generate sitelinks were also less likely to trigger AIO. Sitelinks are search results related to branded searches which feature multiple links to inner pages of a website. For example, searching for the name of a clothing store can generate a search result that features inner pages for women’s clothes, men’s clothes, etc. This also makes sense because it’s the kind of search query that is best answered with direct data and not a summary.

Verticals Most Likely To Contain AIO

The rate at which search results featured AI Overviews differed wildly across verticals (specific industries or topics). This likely doesn’t mean that Google targets specific verticals for more AIO. Search features are always tied to their helpfulness, which is tested with the Search Quality Raters, workers who test new kinds of search results and rate them for helpfulness and other criteria.

Search queries related to healthcare tended to generate AI Overviews 63% of the time. That makes sense for information-seeking queries.

B2B technology queries tended to generate AIO results 32% of the time while Ecommerce search queries triggered AI Overviews 23% of the time.

Interestingly, restaurants and travel related queries did not tend to trigger AIO results.

AIO Shown Less Often Than SGE

Another interesting data point is that AIO is triggered 20% less often than Search Generative Experience (SGE) answers were.

BrightEdge offered three insights related to why AIO is shown less than the experimental SGE was.

  1. “This indicates that AI is getting more precise when generating helpful experiences.
  2. This is likely because AI now caters better to people’s needs, such as looking for summaries, recommendations, or conversational experiences.
  3. Ultimately, Google is getting better at selecting answers.”

BrightEdge research pointed out that Google is improving its ability to anticipate follow-up questions by providing AI search summaries that more completely answer a question.

They write:

“Since Google I/O, the overlap between citations in AI and traditional results has diminished. Google is ensuring users do not get the same results in the two types of different results. It is also now delivering on its promise to do the second, third, and fourth search for you. AI is beginning to anticipate the following question and give options before a user even asks. This often happens with ‘what,’ ‘where,’ and ‘how’ intent-based queries.”

Early Days Of AIO

Google has received overwhelmingly negative reviews from users and the news media about the quality of its AI Overviews, which in turn can lead to trust issues. BrightEdge’s report can be considered a snapshot of Google AIO today, and I’m certain BrightEdge will be back with new data when Google’s (AI) SERPs eventually change again.

Featured image by Shutterstock/Marco Lazzarini