Google Search Console Adds Hourly Data Export To 24-Hour View via @sejournal, @MattGSouthern

Google has expanded its newly launched 24-hour data view in Search Console by adding the ability to export hourly performance data.

This enhancement comes in response to user feedback since the feature’s initial rollout.

The announcement, shared on LinkedIn, stated:

“Last month we announced the 24-hour view in the Search Console Performance reports, and we got lots of positive feedback and feature requests. Today, we’re making the export button available for that view (one of the most requested features): the ability to export data on an hourly basis for the last 24 hours. Enjoy the new data!”

This update allows users to access detailed website performance data within the last 24 hours, broken down by the hour, and export it for further analysis.


Key Details About the 24-Hour Data View

The 24-hour data view, introduced in December, is an upgrade to Google Search Console’s performance reports.

It aims to provide near real-time insights into how website content performs across Google Search, Discover, and News.

Fresher, Faster Data

With the 24-hour view, website owners no longer need to wait up to 48 hours for performance metrics to update.

Data for clicks, impressions, average click-through rate (CTR), and average position is now available with minimal delay.

This feature is handy for monitoring the early performance of newly published content or tracking the immediate impact of updates to existing pages.

It provides timely insights into which search queries drive traffic and how pages rank in search results.

Hourly Breakdown

The 24-hour view includes an hourly breakdown of performance metrics displayed in an over-time graph.

While the data updates as soon as it becomes available, Google notes that incomplete data is marked with a dotted line in the interface to indicate ongoing collection.

You can filter the data by query, page, or country and analyze performance in your local time zone, as determined by your browser settings.


How the Export Feature Enhances the 24-Hour View

Adding an export button to the 24-hour view addresses one of the most requested features from users.

With this functionality, you can now download hourly data for the past day, enabling deeper analysis through external tools like Excel or Google Sheets.
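
If you want to go beyond spreadsheets, the exported file can also be analyzed programmatically. Below is a minimal sketch in Python using pandas; the file name and column names (Hour, Clicks) are assumptions and should be adjusted to match what Search Console actually exports for your property:

    import pandas as pd

    # Load the hourly export (hypothetical file and column names).
    df = pd.read_csv("search_console_24h_export.csv")

    # Parse the hour column and sort chronologically.
    df["Hour"] = pd.to_datetime(df["Hour"])
    df = df.sort_values("Hour")

    # Compute hour-over-hour click changes and find the peak hour.
    df["Click change"] = df["Clicks"].diff()
    peak = df.loc[df["Clicks"].idxmax()]

    print(f"Peak hour: {peak['Hour']} with {peak['Clicks']} clicks")
    print(df[["Hour", "Clicks", "Click change"]].to_string(index=False))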

This enhancement is expected to help businesses better respond to performance trends and optimize their content strategies.


Why It Matters

By making website metrics more accessible and timely, Google is helping marketers and SEO professionals stay competitive.

To access the 24-hour view, you can go to the performance reports section of Search Console and select the “24 hours” tab. The export button is now available to all users with access to this feature.

As the rollout continues over the coming months, this update highlights Google’s broader commitment to providing real-time tools that enhance website performance tracking and optimization.


Featured Image: Vladimka production/Shutterstock

Google Criticizes Bing For Mimicking Google’s Homepage via @sejournal, @MattGSouthern

Parisa Tabriz, the security leader for Google Chrome, has criticized Microsoft for a new strategy involving Bing’s search interface.

In a post on X (formerly Twitter), Tabriz denounced Microsoft’s decision to imitate the design of Google’s homepage, labeling it “another tactic in its long history of tricks to confuse users and limit choice.”

She concluded her statement with sharp words: “New year; new low, Microsoft.”

This criticism comes after Bing introduced a controversial feature that mimics Google’s user interface when users search for “Google” or “Google.com.”

Microsoft’s Sneaky New Bing Interface

When users not signed into a Microsoft account search for Google on Bing, they see a page that looks a lot like Google’s homepage.

Screenshot from: Bing.com, January 2025.

The page has a search bar in the center, a banner with animated figures similar to Google Doodles, and a message saying, “Every search brings you closer to a free donation. Choose from over 2 million nonprofits!”

This message links to the Microsoft Rewards catalog, where users can donate their reward points to nonprofit organizations.

The design also obscures Bing’s branding: the page loads scrolled slightly down, hiding the Bing logo.

Users may only realize they’re still using Bing when they scroll or interact with the page further.

Attempt To Retain Users

Industry observers like The Verge note this move appears targeted at users setting up new Windows PCs, who might initially search for Google through Microsoft Edge’s default Bing search engine.

Many of these users search for Google specifically to switch their default search engine; the redesigned page aims to keep them from leaving Bing.

While tech-savvy users may see through the tactic, it could persuade less experienced users to keep searching on Bing, helping Microsoft retain more users.

Broader Context: The Search Engine Wars

This latest tactic highlights the ongoing competition between Microsoft and Google in the search engine market.

Microsoft has employed various strategies to promote its Bing search engine and Edge browser, including pop-ups and changes to Chrome’s download pages.

In parallel, Google has encouraged users to download Chrome and set Google as their default search engine, though its methods haven’t included outright deception.

Google’s and Microsoft’s rivalry remains heated. As of December, Google’s search engine maintained a dominant global market share of 89.74%, while Microsoft’s Bing held 3.97%.

Final Thoughts

As Microsoft continues to push for greater adoption of Bing, the company’s latest tactic raises questions about user trust and transparency.

While the mimicry may boost Bing’s metrics in the short term, the backlash from users and industry leaders could damage Microsoft’s reputation.

Whether Microsoft will address the criticism or double down on its strategy remains to be seen.


Featured Image: kovop/Shutterstock

Google Clarifies 404 & Redirect Validation In Search Console via @sejournal, @MattGSouthern

Google’s Search Advocate, John Mueller, has provided insights into Search Console’s validation process, addressing how it handles 404 errors and redirects during site migrations.

Key Points

A Reddit user shared their experience with a client’s website migration that led to a loss in rankings.

They explained that they took several steps to address the issues, including:

  • Fixing on-site technical problems.
  • Redirecting 404 pages to the appropriate URLs.
  • Submitting these changes for validation in Google Search Console.

Although they confirmed that all redirects and 404 pages were working correctly, the validation in Search Console still failed.

Feeling frustrated, the user sought advice on what to do next.

This prompted a response from Mueller, who provided insights into how Google processes these changes.

Mueller’s Response

Mueller explained how Google manages 404 errors and redirect validations in Search Console.

He clarified that the “mark as fixed” feature doesn’t speed up Google’s reprocessing of site changes. Instead, it’s a tool for site owners to monitor their progress.

Mueller noted:

“The ‘mark as fixed’ here will only track how things are being reprocessed. It won’t speed up reprocessing itself.”

He also questioned the purpose of marking 404 pages as fixed, noting that no further action is needed if a page intentionally returns a 404 error.

Mueller adds:

“If they are supposed to be 404s, then there’s nothing to do. 404s for pages that don’t exist are fine. It’s technically correct to have them return 404. These being flagged don’t mean you’re doing something wrong, if you’re doing the 404s on purpose.”

For pages that aren’t meant to be 404, Mueller advises:

“If these aren’t meant to be 404 – the important part is to fix the issue though, set up the redirects, have the new content return 200, check internal links, update sitemap dates, etc. If it hasn’t been too long (days), then probably it’ll pick up again quickly. If it’s been a longer time, and if it’s a lot of pages on the new site, then (perhaps obviously) it’ll take longer to be reprocessed.”

Key Takeaways From Mueller’s Advice

Mueller outlined several key points in his response.

Let’s break them down (a quick verification sketch follows the lists):

For Redirects and Content Updates

  • Ensure that redirects are correctly set up and new content returns a 200 (OK) status code.
  • Update internal links to reflect the new URLs.
  • Refresh the sitemap with updated dates to signal changes to Google.

Reprocessing Timeline

  • If changes were made recently (within a few days), Google will likely process them quickly.
  • For larger websites or older issues, reprocessing may take more time.

Handling 404 Pages

  • If a page is no longer meant to exist, returning a 404 error is the correct approach.
  • Seeing 404s flagged in Search Console doesn’t necessarily indicate a problem, provided the 404s are intentional.
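
These checks lend themselves to automation. As a quick sketch in Python (using the third-party requests library; all URLs here are hypothetical), you can verify that old URLs redirect to new URLs returning 200, and that intentionally removed pages still return 404:

    import requests

    # Hypothetical migration map: old URL -> expected final URL.
    redirects = {
        "https://example.com/old-page": "https://example.com/new-page",
    }
    # Pages that were removed on purpose and should return 404.
    removed = ["https://example.com/retired-page"]

    for old, expected in redirects.items():
        resp = requests.get(old, allow_redirects=True, timeout=10)
        ok = resp.url == expected and resp.status_code == 200
        print(f"{old} -> {resp.url} ({resp.status_code}) {'OK' if ok else 'CHECK'}")

    for url in removed:
        resp = requests.get(url, timeout=10)
        status = "OK" if resp.status_code == 404 else "CHECK"
        print(f"{url} ({resp.status_code}) {status}")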

Why This Matters

Website migrations can be complicated and may temporarily affect search rankings if not done correctly.

Google Search Console is useful for tracking changes, but it has limitations.

The validation process tracks whether fixes have been implemented correctly; it doesn’t speed up how quickly Google reprocesses them.

Practice patience and ensure all technical details—redirects, content updates, and internal linking—are adequately addressed.


Featured Image: Sammby/Shutterstock

Google AI Overviews Appear in 18% Of Publisher-Related Queries via @sejournal, @MattGSouthern

New research indicates that Google’s AI Overviews appear in 18% of publisher-related search queries.

Additionally, the findings suggest that traditional search ranking factors may be less relevant for content appearing in AI Overviews.

Here are more highlights from the study released by ZipTie.dev, which analyzed over 500,000 queries across multiple industries.

Key Findings

Data indicates that 63% of sources cited in AI Overviews are not found in the top 10 traditional search results.

This change illustrates a shift in Google’s strategy, as explained by Rudzki, the study’s author:

“In traditional ranking, Google’s job is to send you to pages that you will likely be satisfied with. With AI Overviews the goal is different, it’s about showing you the best answer.”

The analysis found different frequencies of AI Overviews in search results:

  • “How much” queries show AI Overviews 54% of the time.
  • Review-related queries show AI Overviews only 9% of the time.
  • “What is” queries generate AI Overviews 39% of the time.

The study also notes that Google is using YouTube content in AI Overviews. This change could give publishers with video strategies more visibility opportunities.

Questions About Authority

Research shows that some publications are featured prominently in AI Overviews, even when the topics are outside their usual areas of expertise.

For example, Business Insider is often cited for celebrity news, while The Times of India is mentioned in health-related discussions.

This trend indicates that traditional ideas about who has authority on a topic are becoming less important.

Looking Ahead

AI Overviews are now available in over 100 countries and territories, though their use in the EU is limited because of regulations. For now, the feature carries very few ads.

The study expects AI Overviews to grow more in the future, but notes two main factors that could slow this expansion.

Rudzki states:

“Google is not putting ads in AI Overviews, except for very limited usage. Once they will find a good way to earn money, they will likely increase the share of AI Overviews.”

Additionally, he notes that user experience remains crucial:

“Google just can’t put AI Overviews for every keyword. This would translate to extremely low satisfaction rates.”

Methodology

The analysis examined over 500,000 queries across multiple industries between June and December 2024.

The complete study and detailed methodology are available through ZipTie.


Featured Image: Below the Sky/Shutterstock

Mullenweg Criticizes WP Engine For Something He Also Does via @sejournal, @martinibuster

Matt Mullenweg cited a Reddit thread on X to promote the idea that WP Engine makes it difficult to cancel accounts. Turns out that his own hosting company does the exact same thing.

“Money Grab” Post By Redditor

Someone posted that they cancelled a WP Engine account on Friday, December 6th. They subsequently learned that WP Engine has a 30-day advance-notice cancellation policy, so they called customer service and were assured they wouldn’t be charged despite not giving 30 days’ advance notice.

They wrote:

“On Dec 6th, I cancelled my WPEngine service that I’ve had since 2015. …That’s when I discovered that WPEngine requires 30 days notice to cancel. An obvious money-grab. A user should be able to cancel a single-site hosting environment instantly with one click of a button. In fact, this will be the law soon, created because of unscrupulous cancellation tactics like this.

WPEngine support informed me that my site would be cancelled on Jan 3rd. …Surprise, surprise… it’s Jan 5th. My account hasn’t been cancelled, and I was charged $300 today for another year of service.”

Hours after starting the Reddit post to complain about WP Engine, they updated it to say that WP Engine had refunded their money.

They posted:

“UPDATE: WP Engine support got back to me, cancelled the account today, and initiated a refund that’ll take up to 10 business days.”

Mullenweg Dumps On WP Engine

Mullenweg posted on X to compare WP Engine to an unscrupulous gym, accompanying his post with a screenshot of the Reddit post:

“One way @wpengine juices its profits at the expense of its customers is by making it hard to export or cancel your plan, like one of those bad gyms.”

He followed that up with another post touting that WordPress.com has a flexible cancellation policy:

“The WordPress philosophy is to make it easy for people to leave, so they’re more likely to stay. Give freedom and choice.”

But the reality is that WordPress.com’s cancellation policy also requires 30 days’ advance notice:

“You must cancel at least one month before the scheduled end date of any annual subscription…”

What’s Going On?

WP Engine’s contract is clear that they require 30 days notice to cancel a service. But it’s not like once you pay you’re committed to a whole year of hosting. The contract enables customers to cancel their yearly hosting plan at any time (with 30 days notice) and the difference for any remaining months will be refunded.

The Redditor cancelled their account with less than 30 days’ notice (on a Friday), was charged 30 days later, and had a refund initiated on a Sunday, before the weekend was over.

Response On Reddit

While many Redditors were supportive of the person who started the discussion, others pointed out the obvious: it was the weekend, and the poster had failed to give adequate notice.

A Redditor named ThePresidentOfStraya posted:

“Not affiliated with WPEngine. Downvoted. This is a boring billing issue, you’re not being oppressed. Annoying sure. But just call them Monday mate.”

Another Redditor downplayed the events:

“Meh, crap happens. It’s not at all abnormal to have a 30 day opt-out prior to renewal.”

Another Redditor put the original poster’s situation into perspective, commenting:

“Money grab? You know the rules…

Inform yourself. You should have cancelled sooner. Now be polite and ask them for a solution instead of ranting about it online.”

Read the original Reddit post:

UPDATE: WPEngine didn’t follow through on cancelling my account on Jan 3rd and I was charged for another year of service

Featured Image by Shutterstock/Nicoleta Ionescu

WordPress Popular Posts Plugin Vulnerability Affects 100k+ Sites via @sejournal, @martinibuster

An advisory has been issued about a high-severity WordPress vulnerability that makes it possible for attackers to inject arbitrary shortcodes into sites using the WordPress Popular Posts plugin. Attackers do not need a user account to launch an attack.

WordPress Popular Posts is installed on over 100,000 websites. It enables sites to display their most popular posts within any given time period and has been translated into sixteen languages to extend its use around the world. It comes with caching features to improve performance and an admin console that allows website administrators to view popularity statistics.

WordPress Shortcode Vulnerability

Shortcodes are a feature that lets users add functionality to a web page by inserting a predefined snippet in brackets, which automatically runs a script that performs a function, such as adding a contact form with a shortcode like [add_contact_form].

WordPress is gradually moving away from shortcodes in favor of blocks with specific functionalities. The official WordPress developer site encourages plugin and theme developers to replace shortcodes with dedicated blocks, mainly because selecting and inserting a block is a smoother workflow than configuring a shortcode in a plugin and then manually inserting it into a web page.

WordPress advises:

“We would recommend people eventually upgrade their shortcodes to be blocks.”

The vulnerability discovered in the WordPress Popular Posts plugin is due to how the plugin implements its shortcode functionality, specifically around do_shortcode(), a WordPress function for processing and executing shortcodes whose inputs must be sanitized in keeping with standard WordPress plugin and theme security practices.

According to an advisory published by Wordfence:

“The WordPress Popular Posts plugin for WordPress is vulnerable to arbitrary shortcode execution in all versions up to, and including, 7.1.0. This is due to the software allowing users to execute an action that does not properly validate a value before running do_shortcode. This makes it possible for unauthenticated attackers to execute arbitrary shortcodes.”

That part about “validating a value” generally means checking that what the user submits (the “value”), such as the content of a shortcode, is safe and conforms to expected inputs before it’s passed along for use by the website.
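
WordPress plugins are written in PHP, but the validation principle is language-agnostic. Purely as an illustration (this is not the plugin’s actual code), here is the allow-list idea in Python: check a user-supplied value against expected inputs before handing it to anything that executes it:

    import re

    # Hypothetical allow-list of shortcode names this feature should handle.
    ALLOWED_SHORTCODES = {"wpp"}

    def execute_shortcode(tag: str) -> str:
        # Stand-in for a shortcode processor like WordPress's do_shortcode().
        return f"<output of [{tag}]>"

    def safe_execute(user_value: str) -> str:
        # Validate the value before executing anything with it.
        if not re.fullmatch(r"[a-z_]+", user_value) or user_value not in ALLOWED_SHORTCODES:
            raise ValueError(f"rejected unexpected shortcode value: {user_value!r}")
        return execute_shortcode(user_value)

    print(safe_execute("wpp"))       # known value: executes
    try:
        safe_execute("[arbitrary]")  # unexpected value: rejected
    except ValueError as err:
        print(err)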

Official Plugin Changelog

A changelog documents what an update changes, giving plugin users an opportunity to understand what is being updated and to decide whether to update their installation; that transparency is important.

The WordPress Popular Posts plugin is responsibly transparent in their documentation of the update.

The plugin changelog advises:

“Fixes a security issue that allows unintended arbitrary shortcode execution (props to mikemyers and the Wordfence team!)”

Recommended Actions

All versions of the WordPress Popular Posts plugin up to and including version 7.1.0 are vulnerable. Wordfence recommends updating to the latest version of the plugin, 7.2.0.

Read the official Wordfence advisory:

WordPress Popular Posts <= 7.1.0 – Unauthenticated Arbitrary Shortcode Execution

Featured Image by Shutterstock/GrandeDuc

WordPress Backup Plugin Vulnerability Affects 3+ Million Sites via @sejournal, @martinibuster

A high-severity vulnerability in a popular WordPress backup plugin can be exploited by unauthenticated attackers. The vulnerability is rated 8.8 on a scale of 0.0 to 10.

UpdraftPlus: WP Backup & Migration Plugin

The vulnerability affects the popular UpdraftPlus WordPress plugin, installed on over 3 million websites. UpdraftPlus comes in free and paid versions that let users upload backups to their own cloud storage or receive the files by email. Backups can be run manually or on an automatic schedule. The plugin offers tremendous flexibility in what can be backed up, which can make a huge difference when recovering from a catastrophic server issue, and it’s also useful for migrating to a different server altogether.

Wordfence explains the vulnerability:

“The UpdraftPlus: WP Backup & Migration Plugin plugin for WordPress is vulnerable to PHP Object Injection in all versions up to, and including, 1.24.11 via deserialization of untrusted input in the ‘recursive_unserialized_replace’ function. This makes it possible for unauthenticated attackers to inject a PHP Object.

No known POP chain is present in the vulnerable software. If a POP chain is present via an additional plugin or theme installed on the target system, it could allow the attacker to delete arbitrary files, retrieve sensitive data, or execute code. An administrator must perform a search and replace action to trigger the exploit.”
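
PHP object injection is the result of deserializing untrusted input with a format that can instantiate objects. Python’s pickle module has the same failure mode as PHP’s unserialize(), and the safe pattern is the same in both ecosystems: reserve object-capable deserialization for trusted data and use a data-only format such as JSON for anything that crosses a trust boundary. A minimal sketch:

    import json

    untrusted = b"..."  # bytes that arrived from outside your system

    # UNSAFE: like PHP's unserialize(), pickle.loads() can instantiate
    # arbitrary objects, so attacker-controlled bytes can execute code.
    # import pickle
    # data = pickle.loads(untrusted)  # never do this with external input

    # SAFER: JSON yields only plain data types (dicts, lists, strings,
    # numbers), so it cannot inject objects into the application.
    data = json.loads('{"search": "old.example.com", "replace": "new.example.com"}')
    print(data)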

The UpdraftPlus changelog seems to minimize the vulnerability: it doesn’t even call the update a security patch, labeling it a “tweak” instead.

From the official Updraft Plus WordPress plugin changelog:

“TWEAK: Complete the review and removal of calls to the unserialize() PHP function allowing class instantiation begun in 1.24.7. (The final removal involved a theoretical security defect, if your development site allowed an attacker to post content to it which you migrated to another site, and which contained customised code that could perform destructive actions which the attacker knew about, prior to you then cloning the site. The result of this removal is that some search-replaces, highly unlikely to be encountered in practice, will be skipped).”

Updraft Plus Vulnerability Patched

Users should update their installations of UpdraftPlus to the latest version, 1.24.12. All prior versions are vulnerable.

Read the Wordfence advisory:

UpdraftPlus: WP Backup & Migration Plugin <= 1.24.11 – Unauthenticated PHP Object Injection

Featured Image by Shutterstock/Tithi Luadthong

Google Shows How To Confirm Indexing Issues Due To JavaScript via @sejournal, @martinibuster

SearchNorwich recently published an excellent video featuring Google’s Martin Splitt discussing how to debug crawling and indexing issues related to JavaScript. His point is that most of the time it isn’t JavaScript causing the indexing issues; the actual cause is something else. Even if you don’t know how to write JavaScript, the tips Martin shares will give anyone a good start on debugging crawl issues that originate on a website.

JavaScript Is Rarely The Cause Of SEO Issues

Martin’s SearchNorwich video was published a month ago. Just a few days ago, John Mueller advised that too much JavaScript can have a negative impact on SEO, which aligns with Martin’s assertion that JavaScript itself is rarely the reason for SEO issues; the culprit is usually the misuse of JavaScript or something else entirely.

He explains that virtually all suspected JavaScript issues emailed to him turn out to be something else, and he pins the blame on a flawed approach to debugging SEO issues. What he describes is confirmation bias: suspecting that something is the cause, then looking for clues that justify that opinion while ignoring evidence that contradicts it.

Martin explained:

“…it seems to me, as someone on the Google side of things, that SEOs look for clues that allow them to blame things they’re seeing on JavaScript. Then they show up, or someone from their team shows up, in my inbox or on my social media and says, “We found a bug. It’s JavaScript. You say JavaScript works in Google Search, but we have a strong hint that it doesn’t, and you know it’s because of JavaScript.”

He goes on to say that out of the hundreds of times a year he’s approached with a diagnosis that JavaScript is to blame for an SEO problem, he has seen only one instance where an actual JavaScript-related bug was the cause. Just one.

He also says:

“People often claim, “You say it works if you use client-side rendering, but clearly, it is not working. It must be a JavaScript problem and maybe even a bug in Google.” Surprisingly, many of the people who end up in my inbox suspect it’s a Google bug. I find that interesting, especially when a small, niche website claims to be affected by a bug that doesn’t affect any other websites. Most of the time, it’s not us—it’s you.”

Splitt explains that when JavaScript is involved in a crawling or rendering issue, it’s most often not because JavaScript itself is to blame, but because it’s being used incorrectly.

Finding Source Of Rendering Issues

Martin suggests debugging rendering issues by checking how Google “sees” the web page. Rendering, in the context of Googlebot crawling, is the process of downloading all of a web page’s resources, such as fonts, JavaScript, CSS, and HTML, and then constructing a fully functional page similar to what a human user would experience in a web browser.

Debugging how Google renders a page may show that the page renders fine, that certain parts don’t render or that the page cannot be indexed at all.

He recommends using the following tools for debugging possible JavaScript issues:

1. Google Search Console URL Inspection Tool

2. Google Rich Results Test

3. Chrome DevTools

Easy JavaScript Debugging

The first two tools let you submit a URL that Google immediately crawls, and they’ll show you the rendered page: what the page looks like to Google for indexing purposes.

Martin explains the usefulness of the JavaScript console messages in Chrome DevTools:

“There’s also more info that gives you very helpful details about what happened in the JavaScript console messages and what happened in the network. If your content is there and it’s what you expect it to be, then it’s very likely not going to be JavaScript that is causing the problem. If people were doing just that, checking these basics, 90% of the people showing up in my inbox would not show up in my inbox. That’s what I do.”

He also explained that a JavaScript console error doesn’t necessarily mean the problem is with the JavaScript itself. He gives the example of JavaScript that failed to execute because it called an API blocked by robots.txt, which prevented the page from rendering.
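
That robots.txt scenario is easy to check for yourself. Here’s a small sketch using Python’s standard-library urllib.robotparser to test whether Googlebot is allowed to fetch a resource your page depends on (the site and API endpoint are hypothetical):

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the site's robots.txt (hypothetical site).
    rp = RobotFileParser("https://example.com/robots.txt")
    rp.read()

    # An API endpoint the page's JavaScript calls while rendering (hypothetical).
    api_url = "https://example.com/api/content"

    if rp.can_fetch("Googlebot", api_url):
        print("Googlebot can fetch the API; robots.txt isn't the problem.")
    else:
        print("Googlebot is blocked from the API; rendering may fail.")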

Why Do So Many SEOs Blame JavaScript?

Martin implies that not knowing how to debug JavaScript is the source of its reputation as a cause of crawling and indexing issues. I get it; I learned the basics of coding JavaScript by hand 25 years ago, I disliked it then as I do now, and it’s never been my thing.

But Martin’s right that knowing a few tricks for debugging JavaScript will save a lot of wasted time chasing down the wrong problem.

Watch Martin Splitt’s presentation here:

Maybe It Isn’t JavaScript – Martin Splitt at SearchNorwich 18

Featured Image by Shutterstock/Artem Samokhvalov

OpenAI Blames Cloud Provider For ChatGPT Outage via @sejournal, @martinibuster

OpenAI published an incident report detailing the cause of last week’s ChatGPT outage and what the company is doing to prevent a repeat. The outage began on December 26th, 2024, at 10:40 AM and was mostly resolved by 3:11 PM, except for ChatGPT, which was fully recovered by 6:20 PM.

The following services were impacted:

  • ChatGPT
  • Sora video creation
  • APIs: agents, realtime speech, batch, and DALL-E

Cause of OpenAI Outage

The cause of the outage was a cloud provider data center failure that impacted OpenAI’s databases. While the databases are mirrored across regions, switching to a backup database required manual intervention by the cloud provider to redirect operations to a data center in another region. That manual intervention is what fixed the outage, but the scale of the operation was cited as the reason recovery took so long.

A failover is an automated process for switching to a backup system in the event of a system failure. OpenAI announced that it is working on infrastructure changes to improve its response to future cloud database failures.

OpenAI explained:

“In the coming weeks, we will embark on a major infrastructure initiative to ensure our systems are resilient to an extended outage in any region of any of our cloud providers by adding a layer of indirection under our control in between our applications and our cloud databases. This will allow significantly faster failover.”
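
OpenAI hasn’t published the design, but the general idea of an indirection layer can be sketched: the application talks to a small routing layer rather than to any single database, and that layer fails over to replicas in other regions automatically instead of waiting on manual intervention. The following Python sketch is purely illustrative, not OpenAI’s actual architecture; all endpoint names are hypothetical:

    # Purely illustrative; not OpenAI's actual architecture.
    PRIMARY = "db-primary.region-a.internal"
    REPLICAS = ["db-replica.region-b.internal", "db-replica.region-c.internal"]

    def run_on(endpoint: str, sql: str) -> str:
        # Stand-in for a real database client; here the primary is down.
        if endpoint == PRIMARY:
            raise ConnectionError(f"{endpoint} unreachable")
        return f"result of {sql!r} from {endpoint}"

    def query(sql: str) -> str:
        # The indirection layer: try the primary, then fail over to
        # replicas in other regions without manual intervention.
        for endpoint in [PRIMARY] + REPLICAS:
            try:
                return run_on(endpoint, sql)
            except ConnectionError:
                continue  # region is down; try the next one
        raise RuntimeError("all database endpoints unreachable")

    print(query("SELECT 1"))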

Significant ChatGPT Outage

OpenAI said the ChatGPT outage was due to a regional cloud provider database failure, but the effect was global, as evidenced by user reports on social media from across Europe and North America.

Screenshot: Google Trends graph showing the largest-ever spike in searches related to the outage.

Google Trends, which tracks search volume, indicates that this may have been the largest such event, with more people searching for information about it than for any previous outage.

Featured Image by Shutterstock/lilgrapher

Google Podcast Discusses SEO Expertise via @sejournal, @martinibuster

Google’s recent Search Off the Record podcast touched on the issue of SEO expertise and the disconnect between how SEOs think Google ranks websites and how Googlers understand it. The disparity is so great that Gary Illyes remarked that sometimes he doesn’t know what SEOs are talking about.

Googlers Question SEO Expertise

Martin Splitt discussed meeting Turkish publishers and SEOs of varying experience levels at a Google event in Turkey, where attendees complained of poor search results. It turned out the problem wasn’t Google’s search results but how Turkish websites are built, which indirectly called into question the SEO expertise of Turkish-language publishers.

He said:

“And then eventually we worked out as a group as a whole, that there are a lot of problems with the way that content is created in Turkish language websites…”

Gary Illyes expanded on Martin’s comment about experience levels, saying it’s a subjective thing: some people who describe themselves as newbies are actually extremely knowledgeable about the fine details of indexing and crawling, while other self-described SEO gurus ask questions that don’t make sense.

Gary shared:

“The thing you mentioned about experience, I came to realize the past few years that that’s a very subjective thing. Like, when you are asking people, ‘What’s your experience?’ And they are like, ‘Oh, I’m a guru,’ and then on the opposite end of the spectrum, like, ‘I’m a complete newbie.’

And then you start talking to them and the newbie knows way more about like HTTP, for example, than I do and crawling and indexing and whatever, like how it’s perceived externally.

And then you talk to the guru and the guru is like… the questions themselves don’t make sense. Like, you can’t interpret the question that they are asking.”

That part about the questions not making sense describes a disconnect between what SEOs and Googlers believe about SEO. Let’s face it, there’s a disconnect.

The Knowledge And Experience Gap

Sometimes there’s a gap between how SEOs experience the ranking algorithm and how Googlers explain how it works. A classic example is the SEO community’s belief in the concept of domain authority versus Google’s denial that it exists. A few years ago, in a Google Search Central Hangout, a person told John Mueller that a core update had eliminated the rankings of all of their keywords.

They asked:

“How could it be possible that our authority can drop more than 50% overnight? What actions could we take to increase our authority?”

Mueller answered:

“So in general, Google doesn’t evaluate a site’s authority. So it’s not something where we would give you a score on authority and say this is the general score for authority for your website. So that’s not something that we would be applying here.”

That belief in “domain authority” is one example out of many where what SEOs think they know about Google is completely disconnected from what Googlers know about how search rankings work.

What Do SEO Experts Really Know?

Martin Splitt steered the conversation toward proxies for judging SEO expertise, such as the size of the sites an SEO manages, but concluded that those proxy metrics don’t say much about expertise either. Ultimately, the Googlers agreed they need a deeper conversation with the search marketing and publishing community to identify whether there’s something Google could do better to explain what SEOs should be doing.

He explained:

“I mean, we try to gauge experience by asking them how many years have you been doing this kind of job and how many years have you been in this industry, and how many impressions do you manage a month, roughly? And these are proxy metrics. And as you say, it’s super subjective.”

He mentions the wide range of complex technical issues an SEO needs to understand, and John Mueller adds that even specialists in a specific SEO niche can have gaps in fundamental SEO concepts. The point of the conversation is to ask whether the root of the disconnect lies in Google’s documentation or simply in what the SEO experts don’t know.

John commented:

“It’s like someone could be like super focused on web workers or trying to get them indexed and at the same time, like, ‘How do I block a page from being indexed?’”

Martin agreed, saying:

“Yeah. And that’s probably why it is so subjective. And it’s super interesting, super interesting to see how they’re like, ‘Yeah, we got everything nailed down. We are running a tight ship here.’ And then you see, like some of the stuff that is discussed at large in all of the beginner documentation is being missed.

And that left me with a question. Is it that they are not aware that this documentation exists? Is it that they had a hard time fielding the amount of information we put out there? Or is it that they don’t know?”

Lizzi Sassman then asked:

“Did you get a sense, just in conversation with them, if they knew about the documentation or if there was like sort of a, I don’t know, a feeling or a vibe about like that the translation is bad or something like that.”

Martin answered:

“That’s exactly what I don’t know, because we were so busy during the event fielding all the conversations, like everyone wanted to talk to us. And that’s great. That’s fantastic. That’s why we are doing it.

But it doesn’t really give you the space to reflect on things on the spot. So I reflected, basically, on my flight back home, I was like, ‘Hm. I wonder. Dang. I should have asked these questions.’ But, you know, this means we have to go back and ask them again.”

What Is An SEO Expert?

SEO expertise is subjective. Anyone who insists that SEO is one thing is out of touch with the reality that there is no single definition of SEO. I disagree with many SEOs about what they consider good practice, and with more experience, some of them eventually come around to agreeing with me. There are some SEOs whose experience is wildly different from mine, and I sit humbly and listen as they share what they know over dinner.

Many of us work from home, but we’re all members of the search marketing community. We should be able to listen to what others say about SEO and have polite disagreements about the “right way,” expecting that others will disagree without letting it polarize us. Keep an open mind.