Google Issues Statement About CTR And HCU via @sejournal, @martinibuster

In a series of tweets, Google’s SearchLiaison responded to a question that connected click-through rates (CTR) and HCU (Helpful Content Update) with how Google ranks websites, remarking that if the associated ideas were true it would be impossible for any new website to rank.

Users Are Voting With Their Feet?

SearchLiaison was answering a tweet that quoted an interview answer from Google CEO Sundar Pichai: “Users vote with their feet.”

Here is the tweet:

“If the HCU (Navboost, whatever you want to call it) is clicks/user reaction based – how could sites hit by the HCU ever hope to recover if we’re no longer being served to Google readers?

@sundarpichai “Users vote with their feet”,

Okay I’ve changed my whole site – let them vote!”

The above tweet appears to connect Pichai’s statement to Navboost, user clicks and rankings. But as you’ll see below, Sundar’s statement about users voting “with their feet” has nothing to do with clicks or ranking algorithms.

Background Information

Sundar Pichai’s answer about users voting “with their feet” has nothing to do with clicks.

The problem with the interview question (and with Pichai’s answer) is that both are in the context of “AI-powered search and the future of the web.”

The interviewer at The Verge used a site called HouseFresh as an example of a site that’s losing traffic because of Google’s platform shift to the new AI Overviews.

But the HouseFresh site’s complaints predate AI Overviews. Their complaints are about Google ranking low quality “big media” product reviews over independent sites like HouseFresh.

HouseFresh wrote:

“Big media publishers are inundating the web with subpar product recommendations you can’t trust…

Savvy SEOs at big media publishers (or third-party vendors hired by them) realized that they could create pages for ‘best of’ product recommendations without the need to invest any time or effort in actually testing and reviewing the products first.”

Sundar Pichai’s answer has nothing to do with why HouseFresh is losing traffic. His answer is about AI Overviews. HouseFresh’s issues are about low quality big brands outranking them. Two different things.

  • The Verge-affiliated interviewer was mistaken to cite HouseFresh in connection with Google’s platform shift to AI Overviews.
  • Furthermore, Pichai’s statement has nothing to do with clicks and rankings.

Here is the interview question published on The Verge:

“There’s an air purifier blog that we covered called HouseFresh. There’s a gaming site called Retro Dodo. Both of these sites have said, “Look, our Google traffic went to zero. Our businesses are doomed.”

…Is that the right outcome here in all of this — that the people who care so much about video games or air purifiers that they started websites and made the content for the web are the ones getting hurt the most in the platform shift?”

Sundar Pichai answered:

“It’s always difficult to talk about individual cases, and at the end of the day, we are trying to satisfy user expectations. Users are voting with their feet, and people are trying to figure out what’s valuable to them. We are doing it at scale, and I can’t answer on the particular site—”

Pichai’s answer has nothing to do with ranking websites and has no connection to the HCU whatsoever. What Pichai’s answer means is that users are determining whether or not AI Overviews are helpful to them.

SearchLiaison’s Answer

Let’s reset the context of SearchLiaison’s answer. Here is the tweet (again) that started the discussion:

“If the HCU (Navboost, whatever you want to call it) is clicks/user reaction based – how could sites hit by the HCU ever hope to recover if we’re no longer being served to Google readers?

@sundarpichai “Users vote with their feet”,

Okay I’ve changed my whole site – let them vote!”

Here is SearchLiaison’s response:

“If you think further about this type of belief, no one would ever rank in the first place if that were supposedly all that matters — because how would a new site (including your site, which would have been new at one point) ever been seen?

The reality is we use a variety of different ranking signals including, but not solely, “aggregated and anonymized interaction data” as covered here:”

The person who started the discussion responded with:

“Can you please tell me if I’m doing right by focusing on my site and content – writing new articles to be found through search – or if I should be focusing on some off-site effort related to building a readership? It’s frustrating to see traffic go down the more effort I put in.”

When a client says something like “writing new articles to be found through search,” I always follow up with questions to understand what they mean. I’m not commenting on the person who made the tweet, just making an observation from past conversations with clients. What they sometimes mean is that they’re researching Google keywords and competitor sites and using that keyword data verbatim within their content, instead of relying on their own expertise and understanding of what readers want and need.

Here’s SearchLiaison’s answer:

“As I’ve said before, I think everyone should focus on doing whatever they think is best for their readers. I know it can be confusing when people get lots of advice from different places, and then they also hear about all these things Google is supposedly doing, or not doing, and really they just want to focus on content. If you’re lost, again, focus on that. That is your touchstone.”

Site Promotion To People

SearchLiaison next addressed the excellent question about off-site promotion, strongly recommending a focus on readers. A lot of SEOs focus instead on promoting sites to Google, which is what link building is all about.

Promoting sites to people is super important. It’s one of the things that I see high ranking sites do and, although I won’t mention specifics, I believe it feeds into higher rankings in an indirect way.

SearchLiaison continued:

“As to the off-site effort question, I think from what I know from before I worked at Google Search, as well as my time being part of the search ranking team, is that one of the ways to be successful with Google Search is to think beyond it.

Great sites with content that people like receive traffic in many ways. People go to them directly. They come via email referrals. They arrive via links from other sites. They get social media mentions.

This doesn’t mean you should get a bunch of social mentions, or a bunch of email mentions because these will somehow magically rank you better in Google (they don’t, from how I know things). It just means you’re likely building a normal site in the sense that it’s not just intended for Google but instead for people. And that’s what our ranking systems are trying to reward, good content made for people.”

What About False Positives?

The phrase false positive is used in many contexts and one of them is to describe the situation of a high quality site that loses rankings because an algorithm erroneously identified it as low quality. SearchLiaison offered hope to high quality sites that may have seen a decrease in traffic, saying that it’s possible that the next update may offer a positive change.

He tweeted:

“As to the inevitable “but I’ve done all these things when will I recover!” questions, I’d go back to what we’ve said before. It might be the next core update will help, as covered here:

It might also be that, as I said here, it’s us in some of these cases, not the sites, and that part of us releasing future updates is doing a better job in some of these cases:”

SearchLiaison linked to a tweet by John Mueller from a month ago where he said that the search team is looking for ways to surface more helpful content.

“I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.”

Is Your Site High Quality?

Everyone likes to think that their site is high quality, and most of the time it is. But there are also cases where a site publisher does “everything right” in terms of following SEO practices, unaware that those “good SEO practices” are backfiring on them.

One example, in my opinion, is the widely practiced strategy of copying what competitors are doing but “doing it better.” I’ve been hands-on involved in SEO for well over 20 years and that’s an example of building a site for Google and not for users. It’s a strategy that explicitly begins and ends with the question of “what is Google ranking and how can I create that?”

That kind of strategy can create patterns that overtly signal a site is not created for users. It’s also a recipe for creating a site that offers nothing beyond what Google is already ranking. So before assuming that everything is fine with the site, be certain that everything is indeed fine with the site.

Featured Image by Shutterstock/Michael Vi

Google Analytics Update To Improve Paid Search Attribution via @sejournal, @MattGSouthern

Google has announced an update to the attribution models in Google Analytics 4 (GA4) to improve the accuracy of conversion attribution for paid search campaigns.

Google plans to roll out adjustments over the next two weeks to address a longstanding issue where conversions originating from paid search were mistakenly attributed to organic search traffic.

According to the company’s statement, this misattribution occurs with single-page applications when the “gclid” parameter — a unique identifier for paid search clicks — fails to persist across multiple page views.

As a result, conversions that should have been credited to paid search campaigns were incorrectly assigned to organic search channels.
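
To make the mechanism concrete, here is a minimal sketch (not Google’s fix, and not code from the announcement) of how a single-page application typically reports virtual page views to GA4 with gtag.js. It assumes the standard gtag.js snippet is already installed on the page; the route-change and conversion handlers are hypothetical names used only for illustration.

```typescript
// Sketch of the mechanism (not Google's fix): an SPA reporting virtual
// page views to GA4 via gtag.js, which is assumed to already be loaded.

declare function gtag(
  command: string,
  eventName: string,
  params?: Record<string, unknown>
): void;

// The initial landing URL from a paid click carries the identifier, e.g.
//   https://www.example.com/landing?gclid=abc123
// gtag.js reads it from the page URL on the first page_view.

// Later client-side navigations change the URL without the gclid, so the
// manually sent page_view no longer carries it — the persistence gap
// behind the misattribution Google describes.
export function onRouteChange(newPath: string): void {
  gtag("event", "page_view", {
    page_location: window.location.origin + newPath, // note: no ?gclid=... here
    page_title: document.title,
  });
}

// A conversion fired deep in the session could then look organic.
export function onSignupComplete(): void {
  gtag("event", "sign_up", { method: "email" });
}
```

Because the gclid only appears on the landing URL, conversions recorded after a client-side navigation can end up credited to organic search, which is the gap Google says this update addresses.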

Improved Conversion Attribution Methodology

To address this problem, Google is modifying how it attributes conversions to ensure campaign information is captured from the initial event on each page.

Under the new methodology, the attribution will be updated to reflect the appropriate traffic source if a user exits the site and returns through a different channel.

This change is expected to increase the number of conversions attributed to paid search campaigns, potentially impacting advertising expenditures for marketers leveraging Google Ads.

Preparation & Review Recommended

In light of the impending update, Google strongly advises advertisers to review their budget caps and make necessary adjustments before the changes take effect.

As more conversions may be assigned to paid search efforts, campaign spending levels could be affected.

Proactive budget management should be used to align with evolving performance data.

Why SEJ Cares

Improved attribution accuracy gives you a clearer picture of how well your paid search advertising works.

This will allow you to make smarter decisions about where to spend your marketing budget and how to improve your paid search campaigns based on precise data.

How This Can Help You

With more accurate conversion data, you can:

  • Gain a clearer picture of your paid search campaigns’ actual impact and return on investment (ROI).
  • Optimize campaigns based on reliable performance metrics, allowing for more effective budget allocation and targeting strategies.
  • Identify areas for improvement or expansion within your paid search efforts, informed by precise attribution data.
  • Make data-driven decisions regarding budget adjustments, bid strategies, and overall campaign management.

To get the most out of these changes, review your budget caps and make necessary adjustments to anticipate the potential increase in conversions attributed to paid search campaigns.

Staying ahead will make it easier to adapt to the new attribution method and leverage the improved data.


Featured Image: Piotr Swat/Shutterstock

Google Gives Merchants New Insights Into Shopping Search Performance via @sejournal, @MattGSouthern

Google has introduced a feature in Search Console that allows merchants to track their product listings in the Google Search Image tab.

This expanded functionality can help businesses better understand their visibility across Google’s shopping experiences.

Where To Find ‘Merchant Listings Performance’ In Search Console

The new data is accessible through the “Performance” report under the “Google Search Image” tab.

From there, you can monitor the performance of your listings across various Google surfaces.

This includes information on impressions, clicks, and other key metrics related to your product showcases.

By integrating merchant listing performance into Search Console, businesses get a more comprehensive view of their product visibility to optimize their strategies accordingly.

Eligibility & Shopping Section In Search Console

To qualify for merchant listing reports, a website must be identified by Google as an online merchant primarily selling physical goods or services directly to consumers.

Affiliate sites or those that redirect users to other platforms for purchase completion are not considered eligible.

Once a site is recognized as an online merchant, Search Console will display a “Shopping” section in its navigation bar.

This dedicated area houses tools and reports tailored to shopping experiences, including:

  1. Product Snippet Rich Report: Provides insights into product snippet structured data on the site, enabling enhanced search result displays with visual elements like ratings and prices (see the markup sketch after this list).
  2. Merchant Listing Rich Report: Offers analytics on merchant listing structured data, which enables richer search results that often appear in carousels or knowledge panels.
  3. Shopping Tab Listings: Information and guidance on enabling products to appear in the dedicated Shopping tab within Google Search results.
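
For context, the product snippet and merchant listing reports evaluate ordinary schema.org Product structured data on product pages. The following is a minimal, hypothetical sketch, expressed here as a TypeScript object serialized to JSON-LD; the product name, price, and ratings are placeholder values, not a Google-supplied example.

```typescript
// Hypothetical sketch: minimal schema.org Product markup of the kind the
// Product Snippet and Merchant Listing reports evaluate. Every value here
// is a placeholder.
const productSchema = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Acme Anvil, 50 lb",
  image: ["https://www.example.com/images/anvil-50lb.jpg"],
  description: "Drop-forged 50 lb anvil for general shop use.",
  sku: "ANV-50",
  aggregateRating: {
    "@type": "AggregateRating",
    ratingValue: 4.6,
    reviewCount: 128,
  },
  offers: {
    "@type": "Offer",
    url: "https://www.example.com/products/anvil-50lb",
    priceCurrency: "USD",
    price: 119.99,
    availability: "https://schema.org/InStock",
  },
};

// Usually rendered server-side into the product page HTML as a
// <script type="application/ld+json"> block.
console.log(JSON.stringify(productSchema, null, 2));
```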

Google’s automated systems determine a site’s eligibility as an online merchant based on the presence of structured data and other factors.

In Summary

This new feature in Google’s Search Console provides valuable information about the visibility of your product listings in search results.

You can use these insights to make changes and improve your products’ visibility so that more potential customers can find them.


Featured Image: T. Schneider/Shutterstock

Google Responds: Is Desktop SEO Still Necessary? via @sejournal, @martinibuster

Google’s John Mueller responded to a question about whether it’s okay to stop optimizing a desktop version of a website now that Google is switching over to exclusively indexing mobile versions of websites.

The question relates to an announcement Google made a week ago:

“…the small set of sites we’ve still been crawling with desktop Googlebot will be crawled with mobile Googlebot after July 5, 2024. … After July 5, 2024, we’ll crawl and index these sites with only Googlebot Smartphone. If your site’s content is not accessible at all with a mobile device, it will no longer be indexable.”

Stop Optimizing Desktop Version Of A Site?

The person asking the question wanted to know if it’s okay to abandon optimizing a purely desktop version of a site and just focus on the mobile friendly version. The person is asking because they’re new to a company and the developers are far into the process of developing a mobile-only version of a site.

This is the question:

“I am currently in a discussion at my new company, because they are implementing a different mobile site via dynamic serving instead of just going responsive. Next to requirements like http vary header my reasoning is that by having two code bases we need to crawl, analyze and optimize two websites instead of one. However, this got shut down because “due to mobile first indexing we no longer need to optimize the desktop website for SEO”. I read up on all the google docs etc. but I couldn’t find any reasons as to why I would need to keep improving the desktop website for SEO, meaning crawlability, indexability, using correct HTML etc. etc. What reasons are there, can you help me?”
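
The “http vary header” mentioned in the question refers to the Vary HTTP response header that Google’s dynamic serving guidance calls for when the same URL returns different HTML depending on device. Below is a rough, hypothetical sketch of what that looks like in an Express-style Node server; the user-agent test and page renderers are simplified placeholders, not a recommended implementation.

```typescript
// Rough sketch: dynamic serving with a Vary header so crawlers and caches
// know the HTML differs by user agent. The device check and renderers are
// simplified placeholders.
import express, { Request, Response } from "express";

function renderMobilePage(path: string): string {
  return `<!doctype html><meta name="viewport" content="width=device-width, initial-scale=1">Mobile view of ${path}`;
}

function renderDesktopPage(path: string): string {
  return `<!doctype html>Desktop view of ${path}`;
}

const app = express();

app.use((req: Request, res: Response) => {
  const userAgent = req.headers["user-agent"] ?? "";
  const isMobile = /Mobi|Android|iPhone/i.test(userAgent); // crude heuristic

  // Signals that the response varies by user agent (the "http vary header"
  // from the question above), so mobile and desktop HTML are cached and
  // crawled separately.
  res.setHeader("Vary", "User-Agent");
  res.send(isMobile ? renderMobilePage(req.path) : renderDesktopPage(req.path));
});

app.listen(3000);
```

As Mueller notes below, a responsive site avoids this overhead entirely because the same HTML is served to every device.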

Mobile-Only Versus Responsive Website

Google’s John Mueller described the benefits of a single, responsive version of a website. A responsive site eliminates the need to maintain two websites and remains desktop-friendly for visitors who arrive with a desktop browser.

He answered:

“First off, not making a responsive site in this day & age seems foreign to me. I realize sometimes things just haven’t been updated in a long time and you might need to maintain it for a while, but if you’re making a new site”

Maintaining A Desktop-Friendly Site Is A Good Idea

Mueller next offered reasons why it’s a good idea to maintain a functional desktop version of a website: other search engines, other crawlers, and site visitors who actually are on desktop devices. Most SEOs understand that conversions, the income a website generates, depend on being accessible to all site visitors. That’s the big picture; optimizing a site for Google is only one part of it.

Mueller explained:

“With mobile indexing, it’s true that Google focuses on the mobile version for web search indexing. However, there are other search engines & crawlers / requestors, and there are other requests that use a desktop user-agent (I mentioned some in the recent blog post, there are also the non-search user-agents on the user-agent documentation page).”

He then said that websites exist for more than just getting crawled and ranked by Google.

“All in all, I don’t think it’s the case that you can completely disregard what’s served on desktop in terms of SEO & related. If you had to pick one and the only reason you’re running the site is for Google SEO, I’d probably pick mobile now, but it’s an artificial decision, sites don’t live in isolation like that, businesses do more than just Google SEO (and TBH I hope you do: a healthy mix of traffic sources is good for peace of mind). And also, if you don’t want to have to make this decision: go responsive.”

After the person asking the question explained that the decision had already been made to focus on mobile, Mueller responded that this is a case of choosing your battles.

“If this is an ongoing project, then shifting to dynamic serving is already a pretty big step forwards. Pick your battles :). Depending on the existing site, sometimes launching with a sub-optimal better version earlier is better than waiting for the ideal version to be completed. I’d just keep the fact that it’s dynamic-serving in mind when you work on it, with any tools that you use for diagnosing, monitoring, and tracking. It’s more work, but it’s not impossible. Just make sure the desktop version isn’t ignored completely :). Maybe there’s also room to grow what the team (developers + leads) is comfortable with – perhaps some smaller part of the site that folks could work on making responsive. Good luck!”

Choose Your Battles Or Stand Your Ground?

John Mueller’s right that there are times when it’s better to choose your battles and compromise rather than dig in. But make sure that your recommendations are on record and that those pushing back are on record. That way, if things go wrong, the blame will find its way back to the ones who are responsible.

Featured Image by Shutterstock/Luis Molinero

Google: Can 10 Pages Impact Sitewide Rankings? via @sejournal, @martinibuster

Google’s John Mueller answered a question about sitewide impacts on a site with ten pages that lost rankings in the March/April 2024 Core Update then subsequently experienced a sitewide collapse in May.

Can 10 Pages Trigger A Sitewide Penalty?

The person asking the question on Reddit explained that they had ten pages (out of 20,000) that were hit by the Helpful Content Update (HCU) in September 2023. They subsequently updated those pages, which eventually recovered their rankings and traffic. Things were fine until the same ten pages got slammed by the March/April core update. The precise date of the second ranking drop was April 20th.

Up to that point the rest of the site was fine. Only the same ten pages were affected. That changed on May 7th when the site experienced a sitewide drop in rankings across all 20,000 pages of the website.

Their question was whether the ten problematic pages triggered a sitewide impact or whether the May 7th collapse was due to the Site Reputation Abuse penalties that were announced on May 6th.

A Note About Diagnosing Ranking Drops

I’m not commenting specifically about the person who asked the question but… the question has the appearance of correlating ranking drops with specific parts of announced algorithm updates.

Here is the exact wording:

“Our website has about 20K pages, and we found that around 10 pages were hit by HCU in September. We updated those articles and saw a recovery in traffic, but after the March core update around April 20, the same pages were hit again, likely due to HCU. On May 7th, we saw a sharp drop in rankings across the board, and suspect that a sitewide classifier may have been applied.

Question: Can an HCU hit on 10 pages cause a sitewide classifier for 20K pages? Or on May 7th reputation abuse update may had an impact?”

In general it’s reasonable to assume that a ranking drop is connected to a recently announced Google update when the dates of both events match. However, it bears pointing out that a core algorithm update can affect multiple things (for example query-content relevance) and it should be understood that the HCU is no longer a single system.

The person asking the question is following a pattern I often see: assuming that a ranking drop is due to something wrong with the site. That’s not always the case; it could be a change in how Google interprets a search query, among many other potential reasons.

The other potential mistake is assuming that the problem is related to a specific algorithm. The person asking the question assumes they were hit by the HCU system, which is something that no longer exists. All the elements of the HCU were subsumed into the core ranking algorithm as signals.

Here is what Google’s documentation says about what happened to the HCU:

“Is there a single “helpful content system” that Google Search uses for ranking?
Our work to improve the helpfulness of content in search results began with what we called our “helpful content system” that was launched in 2022. Our processes have evolved since. There is no one system used for identifying helpful content. Instead, our core ranking systems use a variety of signals and systems.”

While Google is still looking for helpfulness in content there is no longer a helpful content system that’s de-ranking pages on specific dates.

The other potential evidence of faulty correlation is when the Redditor asked if their May 7th sitewide collapse was due to the site reputation abuse penalties. The site reputation abuse penalties weren’t actually in effect by May 7th. On May 6th it was announced that site reputation abuse manual actions would begin at some point in the near future.

Those are two examples of how it can be misleading to correlate site ranking anomalies with announced updates. There is more to diagnosing updates than correlating traffic patterns to announced updates. Site owners and SEOs who diagnose problems in this manner risk approaching the solution like someone who’s focusing on the map instead of looking at the road.

Properly diagnosing issues requires understanding the full range of technical issues that can impact a site, as well as algorithmic changes that can happen on Google’s side (especially unannounced changes). I have over 20 years of experience and know enough to identify anomalies in the SERPs that indicate changes in how Google is approaching relevance.

Complicating the diagnosis is that sometimes it’s not something that needs “fixing” but rather that the competition is doing something more right than the sites that lost rankings. “More right” can be a wide range of things.

Ten Pages Caused Sitewide “Penalty”?

John Mueller responded by first addressing the specific issue of sitewide ranking collapse, remarking that he doesn’t think it’s likely that ten pages would cause 20,000 other pages to lose rankings.

John wrote:

“The issues more folks post about with regards to core updates tend to be site-wide, and not limited to a tiny subset of a site. The last core update was March/April, so any changes you’d be seeing from May would be unrelated. I’m not sure how that helps you now though :-), but I wouldn’t see those 10 pages as being indicative of something you need to change across 20k other pages.”

Sometimes It’s More Than Announced Updates

John Mueller didn’t offer a diagnosis of what is wrong with the site, that’s impossible to say without actually seeing the site. SEOs on YouTube, Reddit and Facebook routinely correlate ranking drops with recently announced updates but as I wrote earlier in this article, that could be a mistake.

When diagnosing a drop in rankings it’s important to look at the site, the competition and the SERPs.

Do:

  • Inspect the website
  • Review a range of keywords and respective changes in the SERPs
  • Inspect the top ranked sites

Don’t:

  • Assume that a ranking drop is associated with a recent update and stop your investigation right there.

Google’s John Mueller alludes to the complexity of diagnosing ranking drops by mentioning that sometimes it’s not even about SEO, which is 100% correct.

John explained:

“Based on the information you posted, it’s also impossible to say whether you need to improve / fix something on those 20k pages, or if the world has just moved on (in terms of their interests, their expectations & your site’s relevance).

It sounds like you did find things to make more “helpful” on those 10 pages, maybe there’s a pattern? That’s something for you to work out – you know your site, its content, its users best. This isn’t an easy part of SEO, sometimes it’s not even about SEO.”

Look At The Road Ahead

It’s been a trend now that site owners focus on recent announcements by Google as clues to what is going on with their sites. It’s a reasonable thing to do and people should 100% keep doing that. But don’t make that the limit of your gaze because there is always the possibility that there is something else going on.

Featured Image by Shutterstock/vovan

Google Quietly Fixed Site Names In Search Results via @sejournal, @martinibuster

Google resolved a site name issue, ongoing since September 2023, that prevented a website’s site name from properly appearing when an inner page was ranked in the search results.

Site Names In The Search Results

A site name is exactly what it sounds like: the name of a website as displayed in the search engine results pages (SERPs). The feature helps users identify which site a listing comes from.

If your site name is Acme Anvil Company, and that’s how the company is known, then Google wants to display Acme Anvil Company in the search results. If Acme Anvil Company is better known as the AAC and that’s what the company wants to show in the SERPs, then that’s what Google wants to show.

Google allows site owners to use the “WebSite” structured data on the home page to specify the correct site name that Google should use.
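
For reference, that markup is the schema.org WebSite type with a name (and optional alternateName) on the home page. Here is a minimal sketch reusing the Acme Anvil Company example above, written as a TypeScript object; the URL is a placeholder.

```typescript
// Sketch: WebSite structured data for the home page, reusing the example
// names from above. The URL is a placeholder.
const siteNameSchema = {
  "@context": "https://schema.org",
  "@type": "WebSite",
  name: "Acme Anvil Company",
  alternateName: "AAC",
  url: "https://www.example.com/",
};

// Typically emitted in the home page HTML inside a
// <script type="application/ld+json"> tag.
console.log(JSON.stringify(siteNameSchema, null, 2));
```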

Problem Propagating Site Names

Back on September 7, 2023, Google published a warning in its site name documentation acknowledging problems propagating the site name to the inner pages of a site when those inner pages were shown in the SERPs.

This is the warning that was published:

“Known issue: site name isn’t appearing for internal pages
In some cases, a site name shown for a home page may not have propagated to appear for other pages on that site. For example, example.com might be showing a site name that’s different from example.com/internal-page.html.

We’re actively working to address this. We will update this help page when this issue is resolved. In the meantime, if your home page is showing the site name you prefer, understand that it should also appear for your internal pages eventually.”

Google Fixes Site Name Problem

The warning about the site name problem was recently removed from the documentation. Google’s documentation changelog noted:

“Resolving the issue with site names and internal pages
What: Removed the warning about the issue that was preventing new site names from propagating to internal pages.

Why: The issue has been resolved. Keep in mind that it takes time for Google to recrawl and process the new information, including recrawling your internal pages.”

There’s no word on what caused the site name propagation problem, but it is interesting that it was finally fixed after all this time. One has to wonder whether it took so long because it was a low priority or because something changed on the backend of Google’s systems that finally allowed them to fix the issue.

Read Google’s updated site names documentation:

Provide a site name to Google Search

Featured Image by Shutterstock/Cast Of Thousands

Google On How It Manages Disclosure Of Search Incidents via @sejournal, @martinibuster

Google’s latest Search Off The Record podcast discussed examples of disruptive incidents that can affect crawling and indexing, and the criteria for deciding whether or not to disclose the details of what happened.

Complicating the decision to make a statement is that there are times when SEOs and publishers report that Search is broken when, from Google’s point of view, it’s working the way it’s supposed to.

Google Search Has A High Uptime

The interesting part of the podcast began with the observation that Google Search itself (the home page with the search box) has an “extremely” high uptime and rarely ever goes down or becomes unreachable. Most of the reported issues were due to network routing problems on the Internet itself rather than a failure within Google’s infrastructure.

Gary Illyes commented:

“Yeah. The service that hosts the homepage is the same thing that hosts the status dashboard, the Google Search Status Dashboard, and it has like an insane uptime number. …the number is like 99.999 whatever.”

John Mueller jokingly responded with the word “nein” (pronounced like the number nine), which means “no” in German:

“Nein. It’s never down. Nein.”

The Googlers admit that the rest of Google Search on the backend does experience outages and they explain how that’s dealt with.

Crawling & Indexing Incidents At Google

Google’s ability to crawl and index web pages is critical for SEO and earnings. Disruption can lead to catastrophic consequences particularly for time-sensitive content like announcements, news and sales events (to name a few).

Gary Illyes explained that there’s a team within Google called Site Reliability Engineering (SRE) that’s responsible for making sure the public-facing systems run smoothly. There’s an entire Google subdomain devoted to site reliability, where the team explains that it approaches the task of keeping systems operational the way it approaches software problems. The team watches over services like Google Search, Ads, Gmail, and YouTube.

The SRE page describes a mission that ranges from the very granular (fixing individual things) to larger-scale problems affecting “continental-level service capacity” for users measured in the billions.

Gary Illyes explains (at the 3:18 mark):

“Site Reliability Engineering org publishes their playbook on how they manage incidents. And a lot of the incidents are caught by incidents being issues with whatever systems. They catch them with automated processes, meaning that there are probers, for example, or there are certain rules that are set on monitoring software that looks at numbers.

And then, if the number exceeds whatever value, then it triggers an alert that is then captured by a software like an incident management software.”

February 2024 Indexing Problem

Gary next explains how the February 2024 indexing problem is an example of how Google monitors and responds to incidents that could impact users in search. Part of the response is figuring out if it’s an actual problem or a false positive.

He explains:

“That’s what happened on February 1st as well. Basically some number went haywire, and then that opened an incident automatically internally. Then we have to decide whether that’s a false positive or it’s something that we need to actually look into, as in like we, the SRE folk.

And, in this case, they decided that, yeah, this is a valid thing. And then they raised the priority of the incident to one step higher from whatever it was.

I think it was a minor incident initially and then they raised it to medium. And then, when it becomes medium, then it ends up in our inbox. So we have a threshold for medium or higher. Yeah.”

Minor Incidents Aren’t Publicly Announced

Gary Illyes next explained that they don’t communicate every little incident that happens because most of the time users won’t even notice. The most important consideration is whether an incident affects users; problems that do are automatically boosted to a higher priority level. Gary said he didn’t work in SRE, so he was unable to comment on the exact number of users that need to be affected before Google decides to make a public announcement.

Gary explained:

“SRE would investigate everything. If they get a prober alert, for example, or an alert based on whatever numbers, they will look into it and will try to explain that to themselves.

And, if it’s something that is affecting users, then it almost automatically means that they need to raise the priority because users are actually affected.”
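
Pieced together, the flow Gary describes (an automated prober trips a threshold, an incident opens at a low priority, and the priority is raised when users are affected) looks roughly like the toy model below. Every name, metric, and threshold in it is an illustrative assumption, not Google’s internal tooling.

```typescript
// Illustrative toy model only: prober metric -> threshold alert ->
// incident -> priority bump when users are affected. All names and
// values are made-up assumptions.

type Priority = "minor" | "medium" | "high";

interface Incident {
  metric: string;
  value: number;
  priority: Priority;
  affectsUsers: boolean;
}

const THRESHOLDS: Record<string, number> = {
  indexing_error_rate: 0.02, // hypothetical: alert if more than 2% of requests fail
};

function checkProber(metric: string, value: number, affectsUsers: boolean): Incident | null {
  const limit = THRESHOLDS[metric];
  if (limit === undefined || value <= limit) return null; // no alert

  // An alert opens an incident automatically at a low priority.
  const incident: Incident = { metric, value, priority: "minor", affectsUsers };

  // As described in the podcast: if users are actually affected,
  // the priority is raised (here, minor -> medium).
  if (affectsUsers) incident.priority = "medium";

  return incident;
}

// Example: a haywire indexing metric that affects users opens a medium incident.
const incident = checkProber("indexing_error_rate", 0.08, true);
console.log(incident?.priority); // "medium"
```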

Incident With Images Disappearing

Gary shared another example of an incident, this time about images that weren’t showing up for users. It was decided that although the user experience was affected, it wasn’t degraded to the point of keeping users from finding what they were searching for or making Google unusable. Thus, it’s not just whether users are affected by an incident that triggers an escalation in priority, but also how badly the user experience is affected.

The case of the missing images was a situation in which they decided not to make a public statement because users could still find the information they needed. Although Gary didn’t mention it, it sounds like an issue recipe bloggers have encountered in the past where images stopped showing.

He explained:

“Like, for example, recently there was an incident where some images were missing. If I remember correctly, then I stepped in and I said like, “This is stupid, and we should not externalize it because the user impact is actually not bad,” right? Users will literally just not get the images. It’s not like something is broken. They will just not see certain images on the search result pages.

And, to me, that’s just, well, back to 1990 or back to 2008 or something. It’s like it’s still usable and still everything is dandy except some images.”

Are Publishers & SEOs Considered?

Google’s John Mueller asked Gary whether the threshold for making a public announcement was a degraded user experience, or whether the experience of publishers and SEOs was also considered.

Gary answered (at about the 8 minute mark):

“So it’s Search Relations, not Site Owners Relations, from Search perspective.

But by extension, like the site owners, they would also care about their users. So, if we care about their users, it’s the same group of people, right? Or is that too positive?”

Gary apparently sees his role primarily as Search Relations in the sense of serving Google’s users generally. That may come as a surprise to many in the SEO community because Google’s own documentation for the Search Off The Record podcast describes the role of the Search Relations team differently:

“As the Search Relations team at Google, we’re here to help site owners be successful with their websites in Google Search.”

Listening to the entire podcast, it’s clear that Googlers John Mueller and Lizzi Sassman are strongly focused on engaging with the search community. So maybe a language issue caused his remark to come across differently than he intended?

What Does Search Relations Mean?

Google explained that they have a process for deciding what to disclose about disruptions in search and it is a 100% sensible approach. But something to consider is that the definition of “relations” is that it’s about a connection between two or more people.

Search is a relation(ship). It is an ecosystem where two partners work together: creators (SEOs and site owners) make the content, and Google makes it available to its users.

Featured Image by Shutterstock/Khosro

Bluehost Launches AI WordPress Website Creator via @sejournal, @martinibuster

Bluehost launched an AI Website Creator that enables users to quickly create professional websites. It’s an evolution of the click-and-build website builder that makes it easy for anyone to create a WordPress website and benefit from the power and freedom of the open source community.

The importance of what this means for businesses and agencies cannot be overstated because it allows agencies to scale WordPress site creation and puts the ability to create professional WordPress sites within reach of virtually everyone.

Point And Click Website Creation

Bluehost offers a website building experience that combines the ease of point-and-click site creation with the freedom of the WordPress open source content management system. The heart of this system is called WonderSuite.

WonderSuite comprises multiple components, such as a user interface that walks a user through site creation with a series of questions. There is also a library of patterns and templates and an easy-to-configure shopping cart, essentially all the building blocks for creating a site and doing business online quickly and easily.

The new AI Website Creator functionality is the newest addition to the WonderSuite site builder.

AI Website Builder

An AI website builder is the natural evolution of the point-and-click site creation process. Rather than moving a cursor around on a screen, the new way to build a website is with an AI that acts as a designer and responds to the user’s website needs.

The AI asks questions and starts building the website using open source WordPress components and plugins. Fonts, professional color schemes, and plugins are all installed as needed, completely automatically. Users can also save custom generated options for future use which should be helpful for agencies that need to scale client website creation.

Ed Jay, President of Newfold Digital, the parent company of Bluehost, commented:

“Efficiency and ease are what WordPress entrepreneurs and professionals need and our team at Bluehost is dedicated to deliver these essentials to all WordPress users across the globe. With AI Website Creator, any user can rely on the Bluehost AI engine to create their personalized website in just minutes. After answering a few simple questions, our AI algorithm leverages our industry leading WordPress experience, features and technology, including all aspects of WonderSuite, to anticipate the website’s needs and ensure high quality outcomes.

The AI Website Creator presents users with multiple fully functional, tailored and customizable website options that provide a powerful but flexible path forward. It even generates images and content aligned with the user’s brief input, expediting the website off the ground and ready for launch.”

Future Of Website Creation

Bluehost’s innovative AI site creator represents the future of how businesses get online and how entrepreneurs who service clients can streamline site creation and scale their business with WordPress.

Read more about Bluehost’s new AI Website Creator:

WordPress made wonderful with AI

Featured Image by Shutterstock/Simple Line

Automattic For Agencies: A New Way To Monetize WordPress via @sejournal, @martinibuster

Automattic, the company behind WordPress.com, Jetpack, WooCommerce and more, has announced a new program to woo agencies into its ecosystem of products with more ways to earn revenue.

This new program could be seen as putting Automattic into direct competition with closed source systems like Wix and Duda but there are clear differences between all three products and services.

Automattic For Agencies

Automattic for Agencies brings together multiple Automattic products into a single service with a dashboard for managing multiple client sites and billing. The program offers a unified location for managing client sites as well as discounted pricing and revenue sharing opportunities. Aside from the benefits of streamlining, the program also offers technical support across all of the Automattic products that are part of it. Lastly, the program offers agencies managed security and performance improvements.

According to the announcement:

“We worry about site performance and security so you don’t have to. When you connect your sites to the Automattic for Agencies dashboard, you’ll receive instant notifications about updates and alerts, so your sites stay problem-free and your clients stay happy.”

Revenue Share And Discounts

Agencies can now earn a revenue share on the Automattic products used by clients. For example, agencies can earn a 50% revenue share on Jetpack product referrals, including renewals. As part of the program, Jetpack also offers discounts on licenses, starting at 10% off for five licenses and going as high as 50% off for 100 licenses.

As part of the new program, there are similar benefits for agencies that build or manage WooCommerce sites, with discounted agency pricing and a referral program.

WordPress.com, the managed WordPress hosting subsidiary of Automattic, is offering a 20% revenue share on new subscriptions and a 50% share on migrations from other hosts.

A tweet from WordPress.com described the new program:

“Agencies, we’ve got some news for you!

Our new referral program is live, and as a referrer of http://WordPress.com’s services, your agency will receive a 20% revenue share on new subscriptions and 50% on new migrations to http://WordPress.com from other hosting providers.”

New Directory For Agencies

A forthcoming benefit of the Automattic for Agencies program is a business directory that lists agencies that are part of the program. The benefit of the directory is presumably that it may lead to business referrals to the agencies.

The Jetpack announcement describes the new directory:

“Gain heightened visibility through multiple directory listings across Automattic’s business units. This increased exposure creates more opportunities for potential clients to find and engage with your services, helping you grow your agency’s reach and reputation.”

The WooCommerce announcement describes the directory like this:

“Expand your reach
Increase your visibility with partner directory listings across multiple Automattic brands.”

Automattic Affiliate Program

The Automattic for Agencies announcement follows the rollout of a separate affiliate program, which offers up to a 100% referral bonus for affiliates who refer new hosting clients (with a limit of a $300 payout per item) and up to a 50% referral bonus for Jetpack plugin subscriptions. The program has a 30-day cookie window, which gives affiliates the opportunity to earn referral bonuses on any additional sales made within that period.

Read more about the new program:

Live the Suite Life With Automattic For Agencies

Featured Image by Shutterstock/Volodymyr TVERDOKHLIB

Google Case Study Shows Importance Of Structured Data via @sejournal, @martinibuster

Google published a case study that shows how using structured data and following best practices improved discoverability and brought more search traffic. The case study was about the use of Video structured data but the insights shared are applicable across a range of content types.

The new case study is about an Indonesian publisher called Vidio.

How CDNs Can Cause Indexing Problems

One of the interesting points in the case study is an issue related to how CDNs can link to image and video files with expiring URLs. The documentation specifically mentions that it’s important for the CDN to use stable URLs, and it links to another Google documentation page that goes into more detail.

Google explains that some CDNs use quickly expiring URLs for video and thumbnail files and encourages publishers and SEOs to use just one stable URL for each video. Something interesting to note is that not only does this help Google index the files, it also helps Google collect user interest signals.

This is what the documentation advises:

“Some CDNs use quickly expiring URLs for video and thumbnail files. These URLs may prevent Google from successfully indexing your videos or fetching the video files. This also makes it harder for Google to understand users’ interest in your videos over time.

Use a single unique and stable URL for each video. This allows Google to discover and process the videos consistently, confirm they are still available and collect correct signals on the videos.”

Implementing The Correct Structured Data

Google highlighted the importance of using the correct structured data and validating it with Google’s structured data testing tool.
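
As a point of reference, VideoObject markup of the kind the case study describes looks roughly like the sketch below, written here as a TypeScript object. The titles, URLs, and timestamps are placeholders, not Vidio’s actual data.

```typescript
// Sketch: minimal VideoObject structured data for a video-on-demand page.
// All values are placeholders; note the stable (non-expiring) media URLs,
// per the CDN advice quoted above.
const videoSchema = {
  "@context": "https://schema.org",
  "@type": "VideoObject",
  name: "Example episode title",
  description: "A short description of what happens in this episode.",
  thumbnailUrl: ["https://www.example.com/thumbs/episode-1.jpg"],
  uploadDate: "2023-01-15T08:00:00+07:00",
  duration: "PT42M",
  contentUrl: "https://www.example.com/videos/episode-1.mp4",
  embedUrl: "https://www.example.com/embed/episode-1",
};

// Typically emitted in the watch page HTML as a
// <script type="application/ld+json"> block, then checked with the
// testing tool mentioned above before shipping.
console.log(JSON.stringify(videoSchema, null, 2));
```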

These are the results of the above work:

“Within a year of implementing VideoObject markup, Vidio saw improvements in impressions and clicks on their video pages. While the number of videos that Vidio published from Q1 2022 to Q1 2023 increased by ~30%, adding VideoObject markup made their videos eligible for display in various places on Google.

This led to an increase of ~3x video impressions and close to 2x video clicks on Google Search. Vidio also used the Search Console video indexing report and performance report, which helped them to identify and fix issues for their entire platform.”

Indexing + Structured Data = More Visibility

One key to better search performance was ensuring that Google is able to crawl the URLs, something that can easily be overlooked in the rush to correlate a drop in rankings with a recent algorithm update. Never rule anything out during a site audit.

Another important recommendation from the case study is to ensure that the proper structured data is being used. Using the appropriate structured data can help a webpage qualify for improved search visibility through Google’s enhanced search features, such as rich results.

Read Google’s case study:

How Vidio brought more locally relevant video-on-demand (VOD) content to Indonesian users through Google Search

Featured Image by Shutterstock/Anton Vierietin