Why Google’s Spam Problem Is Getting Worse

Spam is back in search. And in a big way.

Honestly, I don’t think Google can handle this at all. The scale is unprecedented. They went after publishers manually with the site reputation abuse update. More expired domain abuse is reaching the top of the SERPs than at any time in recent memory. They’re fighting a losing battle, and they’ve taken their eye off the ball.

In a microcosm, this is what’s happening (Image Credit: Harry Clarkson-Bennett)

A few years ago, Google was getting on top of the various spam issues “creative” SEOs were trialling. The prospect of being nerfed by a spam update, and Google’s willingness to invest in and care about the quality of search, seemed to be winning the war. Trying to recover from these penalties is nothing short of disastrous. Just ask anybody hit by the Helpful Content update.

But things have shifted. AI is haphazardly rewriting the rules, and big tech has bigger, more poisonous fish to fry. This is not a great time to be a white hat SEO.

TL;DR

  1. Google is currently losing the war against spam, with unprecedented scale driven by AI-generated slop, expired domain abuse, and PBN abuse.
  2. Google’s spam detection monitors four key groups of signals – content, link, reputational, and behavioral.
  3. Data from the Google Leak suggests its most capable detection focuses on link velocity and anchor text.
  4. AI “search” is dozens of times more expensive than traditional search. This enormous cost and focus on new AI products is leading to underinvestment in core spam-fighting.

How Does Google’s Spam Detection System Work?

Via SpamBrain. Previously, the search giant rolled out Penguin, Panda, and RankBrain to make better decisions based on links and keywords.

And right now, badly.

SpamBrain is designed to identify content and websites engaging in spammy activities with apparently “shocking” accuracy. I don’t know whether shocking in this sense is meant in a positive or negative way right now, but I can only parrot what is said.

Over time, the algorithm learns what is and isn’t spam. Once it has clearly established the signals associated with spammy sites, it can encode them in a neural network.

Much like the concept of seed sites, if you have the spammiest websites mapped out, you can accurately score everyone else against them. Then you can analyse signals at scale – content, link, behavioral, and reputational – to group sites together.

  • Inputs (content, link, reputational, and behavioral signals).
  • Hidden layer (clustering and comparing each site to known spam ones).
  • Outputs (spam or not spam).

If your site is bucketed in the same group as obviously spammy sites when it comes to any of the above, that is not a good sign. The algorithm works on thresholds. I imagine you need to sail pretty close to the wind for long enough to get hit by a spam update.
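
To make the threshold idea concrete, here is a minimal sketch of how grouped signals might be scored against a known-spam profile and cut off at a threshold. The four signal groups come from the description above; the weights, threshold, and scoring function are illustrative assumptions, not anything Google has published.

```python
# Minimal sketch of a threshold-based spam classifier in the spirit of the
# description above. The weights, threshold, and scoring function are
# illustrative assumptions, not anything Google has published.

KNOWN_SPAM_PROFILE = {
    "content": 0.9,       # thin, scaled, low-value-add content
    "links": 0.8,         # exact-match anchors gained in sudden spikes
    "reputational": 0.7,  # shared footprints with known spam networks
    "behavioral": 0.6,    # faked clicks and engagement
}

SPAM_THRESHOLD = 0.65  # illustrative: sites scoring above this get bucketed as spam


def spam_score(site_signals: dict[str, float]) -> float:
    """Score a site by its overlap with the known-spam profile (0 = clean, 1 = spam-like)."""
    overlap = [
        min(site_signals.get(group, 0.0), weight)
        for group, weight in KNOWN_SPAM_PROFILE.items()
    ]
    return sum(overlap) / sum(KNOWN_SPAM_PROFILE.values())


if __name__ == "__main__":
    # A site with thin content and aggressive links, but a decent reputation.
    borderline_site = {"content": 0.7, "links": 0.9, "reputational": 0.3, "behavioral": 0.4}
    score = spam_score(borderline_site)
    print(f"spam score: {score:.2f} -> {'spam' if score > SPAM_THRESHOLD else 'not spam'}")
```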

But if your content is relatively thin and low value add, you’re probably halfway there. Add some dangerous links into the mix, some poor business decisions (parasite SEO being the most obvious example), and scaled content abuse, and you’re doomed.

What Type Of Spam Are We Talking About Here?

Google notes the most egregious activities here. We’re talking:

  • Cloaking.
  • Doorway abuse.
  • Expired domain abuse.
  • Hacked content.
  • Hidden text and content.
  • Keyword stuffing.
  • Link spam.
  • Scaled content abuse.
  • Site reputation abuse.
  • Thin affiliate content.
  • UGC spam.

Lots of these are closely intertwined. Expired domain abuse and PBNs go hand in hand. Keyword stuffing is a little old hat, but link spam is still very much alive and well. Scaled content abuse is at an all-time high across the internet.

The more content you have spread across multiple, semantically similar websites, the more effective you can be. The more you use exact and partial match anchors to leverage your authority towards “money” pages, the richer you will become.

Let’s dive into the big ones below.

Fake News

Google Discover – Google’s engagement-baiting, social-network-lite platform – has been hit by unscrupulous spammers in recent times. There have been several instances of fake, AI-driven content reaching the masses. It’s become so prevalent that it has even reached legacy media sites (woohoo).

Millions of page views have been sent to expired and drop domain abusers (Image Credit: Harry Clarkson-Bennett)

From changing the state pension age to free bus passes and TV licenses, the spammers know the market. They know how to incite emotions. Hell hath no fury like a pensioner scorned, and while you can forgive the odd slip-up, nobody can be this generous.

The people who have been working by the book are being sidelined. But the opportunities in the black hat world are booming. Which is, in fairness, quite fun.

Scaled Content Abuse

At the time of writing, over 50% of the content on the internet is AI slop. Some say more. Of nearly a million pages analyzed this year, Ahrefs says 74% contain AI-generated content. What we see is just what slips through the mammoth-sized cracks.

Not hard to see what the problem is… (Image Credit: Harry Clarkson-Bennett)

Award-winning journalist Jean-Marc Manach’s research has found over 8,300 AI-generated news websites in French and over 300 in English (the tip of the iceberg, trust me).

He estimates two of these site owners have become millionaires.

By leveraging authoritative expired domains and PBNs (more on that next), SEOs – the people still ruining the internet – know how to game the system: faking clicks, manipulating engagement signals, and using past link equity effectively.

Expired Domain Abuse

The big daddy. Black hat ground zero.

If you engage even a little bit with a black hat community, you’ll know how easy it is right now to leverage expired domains. In the example below, someone had bought the London Road Safety website (a once highly authoritative domain) and turned it into a single-page “best betting sites not on GamStop” site.

This is just one example of many (Image Credit: Harry Clarkson-Bennett)

Betting and crypto are ground zero for all things black hat, just because there’s so much money involved.

I’m not an expert here, but I believe the process is as follows:

  1. Purchase an expired, valuable domain with a strong, clean backlink history (no manual penalties). Ideally, a few of them.
  2. Then you can begin to create your own PBN with unique hosting providers, nameservers, and IP addresses, with a variety of authoritative, aged, and newer domains.
  3. This domain (or domains) then becomes your equity/authority stronghold.
  4. Spin up multiple TLD variations of the domain, i.e., instead of .com it becomes .org.uk.
  5. Add a mix of exact and partial match anchors from a PBN to the money site to signal its new focus.
  6. Either add a 301 redirect for a short period of time to the money variation of the domain or canonicalize to the variation.

These scams are always short-term plays. But they can be worth tens or hundreds of thousands of pounds when done well. And they are back, and I believe more valuable than ever.

Right now, I think it’s as simple as buying an old charity domain and adding a quick reskin, and voila. A 301 or equity-passing tactic, and your single-page site about ‘best casinos not on GamStop’ is printing money. Even in the English-speaking market.

According to notorious black hat fella Charles Floate, some of these companies are laundering hundreds of thousands of pounds a month.

PBNs

A PBN (or Private Blog Network) is a network of websites that someone controls, all linking back to the money site. The money site is the variation designed to generate revenue, typically from advertising or affiliates.

The sites in a private blog network have to be completely separate from each other. They cannot share breadcrumbs that Google can trace. Each site needs a standalone:

  • Hosting provider.
  • IP address.
  • Nameserver.

The reason PBNs are so valuable is that you can build up an enormous amount of link equity and falsified topical authority while mitigating risk. Expired domains are risky because they’re expensive, and once they get a penalty, they’re doomed. PBNs spread the risk. Like the heads of a Hydra: one dies, another rises up.

Protecting the tier 1 asset (the purchased aged or expired domain) is paramount. Instead of pointing links directly to the money site, you can link to the sites that link to the money site.

This indirectly boosts the value of the money site, protecting it from Google’s prying eyes.

What Does The Google Leak Show About Spam?

As always, this is an inexact science. Barely even pseudo-science really. I’ve got the tinfoil hat on and a lot of string connecting wild snippets of information around the room to make this work. You should follow Shaun Anderson here.

If I take every mention of the word “spam” in the module names and descriptions, there are around 115, once I’ve removed any nonsense. Then we can categorize those into content, links, reputational, and behavioral signals.

Taking it one step further, these modules can be classified as relating to things like link building, anchor text, content quality, et al. This gives us a rough sense of what matters in terms of scale.
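
For what it’s worth, the categorization step is easy to reproduce. A rough sketch, assuming you have extracted the module and attribute names that mention spam; the keyword buckets are my own crude heuristic, not something from the leak itself.

```python
# Rough sketch of categorizing leak modules that mention "spam".
# The keyword buckets are a crude heuristic; swap in the full list of
# module/attribute names extracted from the leak documentation.

from collections import Counter

CATEGORIES = {
    "links": ["anchor", "link", "outlink"],
    "content": ["doc", "content", "keyword"],
    "reputational": ["trust", "source", "site"],
    "behavioral": ["click", "nav", "user"],
}


def categorize(module_name: str) -> str:
    name = module_name.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in name for keyword in keywords):
            return category
    return "uncategorized"


# Module names taken from the examples discussed in this article.
modules = [
    "spambrainTotalDocSpamScore",
    "IndexingDocjoinerAnchorPhraseSpamInfo",
    "IndexingDocjoinerAnchorSpamInfo",
    "GeostoreSourceTrustProto",
]

counts = Counter(categorize(m) for m in modules)
print(counts.most_common())
```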

Anchor text makes up the lion’s share of spammy modules based on data from the Google Leak (and my own flawed categorization)(Image Credit: Harry Clarkson-Bennett)

A few examples:

  • spambrainTotalDocSpamScore calculates a document’s overall spam score.
  • IndexingDocjoinerAnchorPhraseSpamInfo and IndexingDocjoinerAnchorSpamInfo modules identify spammy anchor phrases by looking at the number, velocity, the days the links were discovered, and the time the spike ended.
  • GeostoreSourceTrustProto helps evaluate the trustworthiness of a source.

Really, the takeaway is how important links are from a spam perspective. Particularly anchor text. The velocity at which you gain links matters. As does the text and surrounding content. Linking seems to be where Google’s algorithm is most capable of identifying red and amber flags.

If your link velocity graph spiked with exact match anchors to highly commercial pages, that’s a flag. Once a site is pinged for this type of content or link-related abuse, the behavioral and reputational signals are analysed as part of SpamBrain.

If these corroborate and your site exceeds certain thresholds, you’re doomed. It’s why this has (until recently) been a relatively fine art.
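
As a rough illustration of what a velocity flag might look like in practice, here is a small sketch that flags weeks where new exact-match anchors spike well above a site’s baseline. The window size, multiplier, and sample numbers are arbitrary assumptions, not Google’s thresholds.

```python
# Illustrative sketch: flag weeks where new exact-match anchors spike well
# above the site's baseline. The 4-week baseline, 3x multiplier, and sample
# numbers are arbitrary assumptions.

from statistics import mean

# New links discovered per week: (total_new_links, exact_match_anchor_links).
weekly_links = [(12, 1), (15, 2), (11, 1), (14, 2), (95, 70), (110, 85)]

BASELINE_WEEKS = 4
SPIKE_MULTIPLIER = 3

for week, (total, exact_match) in enumerate(weekly_links):
    history = [t for t, _ in weekly_links[max(0, week - BASELINE_WEEKS):week]]
    baseline = mean(history) if history else total
    if total > SPIKE_MULTIPLIER * baseline and exact_match / max(total, 1) > 0.5:
        print(f"week {week}: {total} new links, {exact_match} exact-match -> flag")
```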

Ultimately, They’re Just Investing Less In Traditional Search

As Martin McGarry pointed out, they just care a bit less … They have bigger, more hallucinogenic fish to fry.

Image Credit: Harry Clarkson-Bennett

In 2025, we have had four updates, running for around 70 days in total. In 2024, we had seven that lasted almost 130 days. Productivity levels we can all aspire to.

It’s Not Hard To Guess Why…

The bleeding-edge search experience is changing. Google is rolling out preferred publisher sources globally and inline linking more effectively in its AI products. Much-needed changes.

I think we’re seeing the real-time moulding of the new search experience in the form of The Google Web Guide. A personalized mix of trusted sources, AI Mode, a more classic search interface, and something inspirational. I suspect this might be a little like a Discover-lite feed. A place in the traditional search interface where content you will almost certainly like is fed to you to keep you engaged.

Unconfirmed, but apparently, Google has added persona-driven recommendation signals and a private publisher entity layer, among other things. Grouping users into cohorts is, I believe, a fundamental part of Discover. It’s what allows content to go viral.

Once you understand enough about a user to bucket them into specific groups, you can saturate a market over the course of a few days on Discover. Less, even. But the problem is the economics of it all. Ten blue links are cheap. AI is not. At any level.

According to Google, when someone chooses a preferred source, they click through to that site twice as often on average. So I suspect it’s worth taking seriously.

Why Are AI Searches So Much More Expensive?

Google is going to spend $10 billion more this year than expected due to the growing demand for cloud services. YoY, Google’s CAPEX spend is nearly double 2024’s $52.5 billion.

It’s not just Google. It’s a Silicon Valley race to the bottom.

2025 has been extrapolated, but on course for $92 billion this year (Image Credit: Harry Clarkson-Bennett)

While Google hasn’t released public information on this, it’s no secret that AI searches are significantly more expensive than the classic 10 blue links. Traditional search is largely static and retrieval-based. It relies on pre-indexed pages to serve a list of links and is very cheap to run.

An AI Overview is generative. Google has to run a large language model to summarize and generate a natural language answer. AI Mode is significantly worse. The multi-turn, conversational interface processes the entire dialogue in addition to the new query.

Given the query fan-out technique – where dozens of searches are run in parallel – this process demands significantly more computational power.
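
A back-of-envelope sketch shows why the economics bite. Every figure here is an assumption for illustration (Google publishes none of its serving costs), but it captures how generation and fan-out multiply the per-query bill.

```python
# Back-of-envelope comparison of per-query cost. All figures are illustrative
# assumptions; Google does not publish its serving costs.

COST_PER_RETRIEVAL_QUERY = 0.0002  # classic ten-blue-links lookup ($, assumed)
COST_PER_LLM_CALL = 0.003          # one generative summarization pass ($, assumed)
FAN_OUT_QUERIES = 12               # parallel sub-queries in AI Mode (assumed)

ai_overview = COST_PER_RETRIEVAL_QUERY + COST_PER_LLM_CALL
ai_mode = FAN_OUT_QUERIES * COST_PER_RETRIEVAL_QUERY + COST_PER_LLM_CALL * 2  # fan-out retrieval plus multi-turn context

print(f"classic search:    ${COST_PER_RETRIEVAL_QUERY:.4f}")
print(f"AI Overview:       ${ai_overview:.4f}  ({ai_overview / COST_PER_RETRIEVAL_QUERY:.0f}x classic)")
print(f"AI Mode (fan-out): ${ai_mode:.4f}  ({ai_mode / COST_PER_RETRIEVAL_QUERY:.0f}x classic)")
```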

Custom chips, efficiencies, and caching can reduce the cost of this. But this is one of Google’s biggest challenges, and I suspect it’s exactly why Barry believes AI Mode won’t be the default search experience. I’d be surprised if it isn’t just applied at a search/personalization level, too. There are plenty of branded and navigational searches where this would be an enormous waste of money.

And these guys really love money.

According to The IET, if the population of London (>9.7 million) asked ChatGPT to write a 100-word email, this would require 4,874,000 litres of water to cool the servers – equivalent to filling over seven 25m swimming pools.

LLMs Already Have A Spam Problem

This is pretty well documented. LLMs seem to be driven at least in part by the sheer volume of mentions in the training data. Everything is ingested and taken as read.

Image Credit: Harry Clarkson-Bennett

When you add a line in your footer describing something you or your business did, it’s taken as read. Spammy, low-quality tactics work more effectively than heavy lifting.

Ideally, we wouldn’t live in a world where low-lift shit outperforms proper marketing efforts. But here we are.

Like in 2012, “best” lists are on the tip of everyone’s tongue. Basic SEO is making a comeback because that’s what is currently working in LLMs. Paid placements, reciprocal link exchanges. You name it.

Image Credit: Harry Clarkson-Bennett

If it’s half-arsed, it’s making a comeback.

As these models rely on Google’s index for searches that the model cannot confidently answer (RAG), Google’s spam engine matters more than ever. In the same way that I think publishers need to take a stand against big tech and AI, Google needs to step up and take this seriously.

I’m Not Sure Anyone Is Going To…

I’m not even sure they want to right now. OpenAI has signed some pretty extraordinary contracts, and its revenue is light-years away from where it needs to be. And Google’s CAPEX expenditure is through the roof.

So, things like quality and accuracy are not at the top of the list. Consumer and investor confidence is not that high. They need to make some money. And private companies can be a bit laissez-faire when it comes to reporting on revenue and profits.

According to HSBC, OpenAI needs to raise at least $207 billion by 2030 so it can continue to lose money. Being described as ‘a money pit with a website on top’ isn’t a great look.

New funding has to be thrown at data centres (Image Credit: Harry Clarkson-Bennett)

Let’s see them post-hoc rationalize their way out of this one. That’s it. Thank you for reading and subscribing to my last update of the year. Certainly been a year.

This post was originally published on Leadership in SEO.


Featured Image: Khaohom Mali/Shutterstock

Eight Overlooked Reasons Why Sites Lose Rankings In Core Updates via @sejournal, @martinibuster

There are multiple reasons why a site can drop in rankings due to a core algorithm update. The reasons may reflect specific changes to the way Google interprets content, a search query, or both. The change could also be subtle, like an infrastructure update that enables finer relevance and quality judgments. Here are eight commonly overlooked reasons for why a site may have lost rankings after a Google core update.

Ranking Where It’s Supposed To Rank?

If the site was previously ranking well and now it doesn’t, it could be what I call “it’s ranking where it’s supposed to rank.” That means that some part of Google’s algorithm has caught up to a loophole that the page was intentionally or accidentally taking advantage of and is currently ranking it where it should have been ranking in the first place.

This is difficult to diagnose because a publisher might believe that the web pages or links were perfect the way they previously were, but in fact there was an issue.

Topic Theming Defines Relevance

A part of the ranking process is determining what the topic of a web page is. Google admitted a year ago that a core topicality system is a part of the ranking process. The concept of topicality as part of the ranking algorithm is real.

The so-called Medic Update of 2018 brought this part of Google’s algorithm into sharp focus. Suddenly, sites that were previously relevant for medical keywords were nowhere to be found because they dealt in folk remedies, not medical ones. What happened was that Google’s understanding of what keyword phrases were about became more topically focused.

Bill Slawski wrote about a Google patent (Website representation vector) that describes a way to classify websites by knowledge domains and expertise levels that sounds like a direct match to what the Medic Update was about.

The patent describes part of what it’s doing:

“The search system can use information for a search query to determine a particular website classification that is most responsive to the search query and select only search results with that particular website classification for a search results page. For example, in response to receipt of a query about a medical condition, the search system may select only websites in the first category, e.g., authored by experts, for a search results page.”

Google’s interpretation of what it means to be relevant became increasingly about topicality in 2018 and continued to be refined in successive updates over the years. Instead of relying on links and keyword similarity, Google introduced a way to identify and classify sites by knowledge domain (the topic) in order to better understand how search queries and content are relevant to each other.

Returning to the medical queries, the reason many sites lost rankings during the Medic Update was that their topics were outside the knowledge domain of medical remedies and science. Sites about folk and alternative healing were permanently locked out of ranking for medical phrases, and no amount of links could ever restore their rankings. The same thing happened across many other topics and continues to affect rankings as Google’s ability to understand the nuances of topical relevance is updated.
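
A minimal sketch of the selection behavior the patent describes: pick the website classification most responsive to the query, then keep only results with that classification. The classifications, sample results, and query mapping are invented for illustration; they are not Google’s actual labels.

```python
# Illustrative sketch of the patent's described behavior: choose the website
# classification most responsive to the query, then keep only results in it.
# The classifications, sample results, and query mapping are invented.

results = [
    {"url": "clinic.example/angina",   "classification": "expert-medical"},
    {"url": "herbs.example/angina",    "classification": "alternative-remedies"},
    {"url": "hospital.example/angina", "classification": "expert-medical"},
]


def classification_for_query(query: str) -> str:
    # Stand-in for the system that maps a query to its most responsive class.
    return "expert-medical" if "symptoms" in query or "treatment" in query else "general"


query = "angina treatment"
wanted = classification_for_query(query)
filtered = [r for r in results if r["classification"] == wanted]
print([r["url"] for r in filtered])  # only the expert-classified sites survive
```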

Example Of Topical Theming

A way to think of topical theming is to consider that keyword phrases can be themed by topic. For example, the keyword phrase “bomber jacket” is related to military clothing, flight clothing, and men’s jackets. At the time of writing, Alpha Industries, a manufacturer of military clothing, is ranked number one in Google. Alpha Industries is closely related to military clothing because the company not only focuses on selling military-style clothing, it started out as a military contractor producing clothing for America’s military, so it’s closely identified by consumers with military clothing.

Screenshot of SERPs showing topical theming – how Google interprets a keyword phrase and web pages

So it’s not surprising that Alpha Industries ranks #1 for bomber jacket because it ticks both boxes for the topicality of the phrase Bomber Jacket:

  • Shopping > Military clothing
  • Shopping > Men’s clothing

If your page was previously ranking and now it isn’t, then it’s possible that the topical theme was redefined more sharply. The only way to check this is to review the top ranked sites, focusing, for example, on the differences between ranges such as position one and two, or sometimes positions one through three or positions one through five. The range depends on how the topic is themed. In the example of the Bomber Jacket rankings, positions one through three are themed by “military clothing” and “Men’s clothing.” Position three in my example is held by the Thursday Boot Company, which is themed more closely with “men’s clothing” than it is with military clothing. Perhaps not coincidentally, the Thursday Boot Company is closely identified with men’s fashion.

This is a way to analyze the SERPs to understand why sites are ranking and why others are not.

Topic Personalization

Sometimes the topical themes are not locked into place because user intents can change. In that case, opening a new browser or searching a second time in a different tab might cause Google to change the topical theme to a different topical intent.

In the case of the “bomber jacket” search results, the hierarchy of topical themes can change to:

  • Informational > Article About Bomber Jackets
  • Shopping > Military clothing
  • Shopping > Men’s clothing

The reason for that is directly related to the user’s information need, which informs the intent and the correct topic. In the above case, it looks like the military clothing theme may be the dominant user intent for this topic, but the informational/discovery intent may be a close tie that’s triggered by personalization. This can vary by previous searches, but also by geographic location, a user’s device, and even the time of day.

The takeaway is that there may not be anything wrong with a site. It’s just ranking for a more specific topical intent. So if the topic is getting personalized so that your page no longer ranks, a solution may be to create another page to focus on the additional topic theme that Google is ranking.

Authoritativeness

In one sense, authoritativeness can be seen as external validation of a website’s expertise as a go-to source for a product, service, or content topic. While the expertise of the author contributes to authoritativeness, and authoritativeness in a topic can be inherent to a website, ultimately it’s third-party recognition from readers, customers, and other websites (in the form of citations and links) that communicates a website’s authoritativeness back to Google as a validating signal.

The above can be reduced to these four points:

  1. Expertise and topical focus originate within the website.
  2. Authoritativeness is the recognition of that expertise.
  3. Google does not assess that recognition directly.
  4. Third-party signals can validate a site’s authoritativeness.

To that we can add the previously discussed Website Representation Vector patent that shows how Google can identify expertise and authoritativeness.

What’s going on then is that Google selects relevant content and then winnows that down by prioritizing expert content.

Here’s how Google explains how it uses E-E-A-T:

“Google’s automated systems are designed to use many different factors to rank great content. After identifying relevant content, our systems aim to prioritize those that seem most helpful. To do this, they identify a mix of factors that can help determine which content demonstrates aspects of experience, expertise, authoritativeness, and trustworthiness, or what we call E-E-A-T.”

Authoritativeness is not about how often a site publishes about a topic; any spammer can do that. It has to be about more than that. E-E-A-T is a standard to hold your site up to.

Stuck On Page Two Of Search Results? Try Some E-E-A-T

Speaking of E-E-A-T, many SEOs have the mistaken idea that it’s something they can add to websites. That’s not how it works. At the 2025 New York City Search Central Live event, Google’s John Mueller confirmed that E-E-A-T is not something you add to web pages.

He said:

“Sometimes SEOs come to us or like mention that they’ve added EEAT to their web pages. That’s not how it works. Sorry, you can’t sprinkle some experiences on your web pages. It’s like, that doesn’t make any sense.”

Clearly, content reflects qualities of authoritativeness, trustworthiness, expertise, and experience, but it’s not something that you add to content. So what is it?

E-E-A-T is just a standard to hold your site up to. It’s also a subjective judgment made by site visitors. A subjective judgment is like how a sandwich can taste great, with the “great” part being the subjective judgment. It is a matter of opinion.

One thing that is difficult for SEOs to diagnose is when their content is missing that extra something to push their site onto the first page of the SERPs. It can feel unfair to see competitors ranking on the first page of the SERPs even though your content is just as good as theirs.

Those differences indicate that their top-ranked web pages are optimized for people. Another reason is that more people know about them because they have a multimodal approach to content, whereas the site on page two of the SERPs mainly communicates via textual content.

In SERPs where Google prefers to rank government and educational sites for a particular keyword phrase, except for one commercial site, I almost always find evidence that the commercial site’s content and outreach are resonating with site visitors in ways that the competitor websites do not. Websites that focus on multimodal, people-optimized content and experiences are usually what I find in those weird outlier rankings.

So if your site is stuck on page two, revisit the top-ranked web pages and identify ways that those sites are optimized for people and multimodal content. You may be surprised to see what makes those sites resonate with users.

Temporary Rankings

Some rankings are not made to last. This is the case with a new site or new page ranking boost. Google has a thing where it tastes a new site to see how it fits with the rest of the Internet. A lot of SEOs crow about their client’s new website conquering the SERPs right out of the gate. What you almost never hear about is when those same sites drop out of the SERPs.

This isn’t a bad thing. It’s normal. It simply means that Google has tried the site and now it’s time for the site to earn its place in the SERPs.

There’s Nothing Wrong With The Site?

Many site publishers find it frustrating to be told that there’s nothing wrong with their site even though it lost rankings. What’s going on may be that the site and web page are fine, but that the competitors’ pages are finer. These kinds of issues are typically where the content is fine and the competitors’ content is about the same but is better in small ways.

This is the one form of ranking drop that many SEOs and publishers easily overlook because SEOs generally try to identify what’s “wrong” with a site, and when nothing obvious jumps out at them, they try to find something wrong with the backlinks or something else.

This inability to find something wrong leads to recommendations like filing link disavows to get rid of spam links or removing content to fix perceived but not actual problems (like duplicate content). They’re basically grasping at straws to find something to fix.

But sometimes it’s not that something is wrong with the site. Sometimes it’s just that there’s something right with the competitors.

What can be right with competitors?

  • Links
  • User experience
  • Image content (for example, site visitors are reflected in image content).
  • Multimodal approach
  • Strong outreach to potential customers
  • In-person marketing
  • Word-of-mouth promotion
  • Better advertising
  • Optimized for people

SEO Secret Sauce: Optimized For People

Optimizing for people is a common blind spot. Optimizing for people is a subset of conversion optimization. Conversion optimization is about subtle signals that indicate a web page contains what the site visitor needs.

Sometimes that need is to be recognized and acknowledged. It can be reassurance that you’re available right now or that the business is trustworthy.

For example, a client’s site featured a badge at the top of the page that said something like “Trusted by over 200 of the Fortune 500.” That badge whispered, “We’re legitimate and trustworthy.”

Another example is how a business identified that most of their site visitors were mothers of boys, so their optimization was to prioritize images of mothers with boys. This subtly recognized the site visitor and confirmed that what’s being offered is for them.

Nobody loves a site because it’s heavily SEO’d, but people do love sites that acknowledge the site visitor in some way. This is the secret sauce that’s invisible to SEO tools but helps sites outrank their competitors.

It may be helpful to avoid mimicking what competitors are doing and instead find ways to differentiate the site and its outreach so that people like your site more. When I say outreach, I mean actively seeking out places where your typical customer might be hanging out and figuring out how you can make your pitch there. Third-party signals have long been strong ranking factors at Google, and now, with AI Search, what people and other sites say about your site is increasingly playing a role in rankings.

Takeaways

  1. Core updates sometimes correct over-ranking, not punish sites
    Ranking drops sometimes reflect Google closing loopholes and placing pages where they should have ranked all along rather than identifying new problems.
  2. Topical theming has become more precise
    Core updates sometimes make existing algorithms more precise. Google increasingly ranks content based on topical categories and intent, not just keywords or links.
  3. Topical themes can change dynamically
    Search results may shift between informational and commercial themes depending on context such as prior searches, location, device, or time of day.
  4. Authoritativeness is externally validated
    Recognition from users, citations, links, and broader awareness can be the reason one site ranks and another does not.
  5. SEO does not control E-E-A-T and can’t be reduced to an on-page checklist
    While concepts of expertise and authoritativeness are inherent in content, they’re subjective judgments that can be inferred from external signals, not something that can be directly added to content by SEOs.
  6. Temporary ranking boosts are normal
    New pages and sites are tested briefly, then must earn long-term placement through sustained performance and reception.
  7. Competitors may simply be better for users
    Ranking losses often occur because competitors outperform in subtle but meaningful ways, not because the losing site is broken.
  8. People-first optimization is a competitive advantage
    Sites that resonate emotionally, visually, and practically with visitors often outperform purely SEO-optimized pages.

Ranking changes after a core update sometimes reflect clearer judgments about relevance, authority, and usefulness rather than newly discovered web page flaws. As Google sharpens how it understands topics, pages increasingly compete on how well they align with what users are actually trying to accomplish and which sources people already recognize and trust. The lasting advantage comes from building a site that resonates with actual visitors, earns attention beyond search, and gives Google consistent evidence that users prefer it over alternatives. Marketing, the old-fashioned tell-people-about-a-business approach to promoting it, should not be overlooked.

Featured Image by Shutterstock/Silapavet Konthikamee

Primer on ChatGPT’s 3 Bots

ChatGPT uses separate bots for training, searching, and taking action:

  • GPTBot provides training data.
  • OAI-SearchBot gathers data to respond to specific prompts.
  • ChatGPT-User accesses pages when requested by users.

Knowing which bot is responsible for which task is essential before attempting to disallow it.
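
For reference, the per-bot decisions described below come down to ordinary robots.txt directives. Here is a minimal sketch that uses Python’s standard urllib.robotparser to sanity-check one possible policy: blocking training while leaving search retrieval open. The policy is an example, not a blanket recommendation.

```python
# Minimal sketch: verify a per-bot robots.txt policy with Python's standard
# library. The example policy blocks training (GPTBot) but leaves search
# retrieval (OAI-SearchBot) open, which is one option discussed below, not a
# blanket recommendation. ChatGPT-User is user-initiated and, per OpenAI,
# is not blocked this way.

from urllib import robotparser

EXAMPLE_ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: OAI-SearchBot
Disallow:

User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(EXAMPLE_ROBOTS_TXT.splitlines())

for bot in ("GPTBot", "OAI-SearchBot", "ChatGPT-User"):
    allowed = parser.can_fetch(bot, "https://example.com/products/widget")
    print(f"{bot:15} -> {'allowed' if allowed else 'blocked'}")
```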

GPTBot

GPTBot locates information to build and update training data, ChatGPT’s knowledge base for providing answers.

ChatGPT doesn’t store training URLs or track where the info comes from. Disallowing this bot will prevent the platform from using your content for training, but it won’t impact your traffic. It may affect what ChatGPT understands about your company, though external sources likely provide that information, too.

Some publishers disallow the bot to prevent ChatGPT from learning from their content and to reduce costs, as AI bots can increase hosting needs and slow down servers, especially for large sites.

I typically suggest allowing access to GPTBot to provide first-hand information about a business and thus control the context.

ChatGPT updates training data regularly, usually with each release.

OAI-SearchBot

OAI-SearchBot searches the web for current information, user reviews, product details, and more.

Opinions differ as to whether the platform indexes the URLs from these searches. (ChatGPT states it “uses a hybrid system that includes limited indexing, plus on-demand retrieval, rather than a single, exhaustive web index.”)

OAI-SearchBot searches Google, Bing, Reddit, and others for info, much like humans, and may independently crawl sites, too.

Disallowing this bot prevents it from visiting your site, but it may still cite your pages via external links. Google does this, too, incidentally. A robots.txt file can prevent Google’s bot from crawling a site, but the search giant can still index and rank its pages.

Still, disallowing OAI-SearchBot will likely reduce or eliminate citations (and traffic), which is why I don’t usually advise it.

ChatGPT-User

ChatGPT-User performs actions as requested by users. For example, a user can prompt ChatGPT to visit a page and summarize its content.

ChatGPT-User does not provide training data or citations. If your server logs include this bot, a human instructed ChatGPT to visit your site. There’s no way to block this bot because it’s user-initiated, per ChatGPT.

Google Explains Why Staggered Site Migrations Impact SEO Outcome via @sejournal, @martinibuster

Google’s John Mueller recently answered a question about how Google responds to staggered site moves where a site is partially moved from one domain to another. He said a standard site move is generally fine, but clarified his position when it came to partial site moves.

Straight Ahead Site Move?

Someone asked about doing a site move, initially giving the impression that they were moving the entire site. The question was in the context of using Google Search Console’s change of address feature.

They asked:

“Do you have any thoughts on this GSC Change of Address question?

Can we submit the new domain if a few old URLs still get traffic and aren’t redirected yet, or should we wait until all redirects are live?”

Mueller initially answered that it should be fine:

“It’s generally fine (for example, some site moves keep the robots.txt on the old domain with “allow: /” so that all URLs can be followed). The tool does check for the homepage redirect though.”

Google Explains Why Partial Site Moves Are Problematic

His opinion changed, however, after the OP responded with additional information indicating that the homepage has been moved while many of the product and category pages will stay on the old domain for now. In other words, they want to move parts of the site now and other parts later, keeping one foot on the new domain and the other firmly planted on the old one.

That’s a different scenario entirely. Unsurprisingly, Mueller changed his opinion.

He responded:

“Practically speaking, it’s not going to be seen as a full site move. You can still use the change of address tool, but it will be a messy situation until you’ve really moved it all over. If you need to do this (sometimes it’s not easy, I get it :)), just know that it won’t be a clean slate.

…You’ll have a hard time tracking things & Google will have a hard time understanding your sites. My recommendation would be to clean it up properly as soon as you can. Even properly planned & executed site migrations can be hard, and this makes it much more challenging.”
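
If you do need to stage a move, it is worth verifying the redirects you have already shipped before touching the change of address tool. A rough sketch, using only the standard library; the URLs and domains are placeholders.

```python
# Rough sketch: before using the change-of-address tool, confirm that old URLs
# 301 to the new domain. The URLs and domains below are placeholders.

import urllib.error
import urllib.request

OLD_URLS = [
    "https://old-domain.example/",
    "https://old-domain.example/category/widgets/",
]
NEW_DOMAIN = "new-domain.example"


class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # stop after the first response so we can inspect it


opener = urllib.request.build_opener(NoRedirect)

for url in OLD_URLS:
    try:
        response = opener.open(url)
        print(f"{url} -> {response.status} (still live, not redirected yet)")
    except urllib.error.HTTPError as err:
        location = err.headers.get("Location", "")
        ok = err.code == 301 and NEW_DOMAIN in location
        print(f"{url} -> {err.code} {location} {'OK' if ok else 'CHECK'}")
    except urllib.error.URLError as err:
        print(f"{url} -> unreachable ({err.reason})")
```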

Google’s Site Understanding

Something that I find intriguing is Mueller’s occasional reference to Google’s understanding of a website. He has mentioned this factor in other contexts before, and it seems to be a catchall for things related to quality, but also for something he has previously referred to as a relevance topic: understanding where a site fits in the Internet.

In this context, Mueller appears to be using the phrase to mean understanding the site relative to the domain name.

Featured Image by Shutterstock/Here

Cloudflare Report: Googlebot Tops AI Crawler Traffic via @sejournal, @MattGSouthern

Cloudflare published its sixth annual Year in Review, offering a comprehensive look at Internet traffic, security, and AI crawler activity across 2025.

The report draws on data from Cloudflare’s network, which spans more than 330 cities across 125 countries and handles over 81 million HTTP requests per second on average.

The AI crawler findings stand out. Googlebot crawled far more web pages than any other AI bot, reflecting Google’s dual-purpose approach to crawling for both search indexing and AI training.

Googlebot Tops AI Crawler Traffic

Cloudflare analyzed successful requests for HTML content from leading AI crawlers during October and November 2025. The results showed Googlebot reached 11.6% of unique web pages in the sample.

That’s more than 3 times the pages seen by OpenAI’s GPTBot at 3.6%. It’s nearly 200 times more than PerplexityBot, which crawled just 0.06% of pages.

Bingbot came in third at 2.6%, followed by Meta-ExternalAgent and ClaudeBot at 2.4% each.

The report noted that because Googlebot crawls for both search indexing and AI model training, web publishers face a difficult choice. Blocking Googlebot’s AI training means risking search discoverability.

Cloudflare wrote:

“Because Googlebot is used to crawl content for both search indexing and AI model training, and because of Google’s long-established dominance in search, Web site operators are essentially unable to block Googlebot’s AI training without risking search discoverability.”

AI Bots Now Account For 4.2% of HTML Requests

Throughout 2025, AI bots (excluding Googlebot) averaged 4.2% of HTML requests across Cloudflare’s customer base. The share fluctuated between 2.4% in early April and 6.4% in late June.

Googlebot alone accounted for 4.5% of HTML requests, slightly more than all other AI bots combined.

The share of human-generated HTML traffic started 2025 at seven percentage points below non-AI bot traffic. By September, human traffic began exceeding non-AI bot traffic on some days. As of December 2, humans generated 47% of HTML requests while non-AI bots generated 44%.

Crawl-to-Refer Ratios Show Wide Variation

Cloudflare tracks how often AI and search platforms send traffic to sites relative to how often they crawl. A high ratio means heavy crawling without sending users back to source sites.

Anthropic had the highest ratios among AI platforms, ranging from approximately 25,000:1 to 100,000:1 during the second half of the year after stabilizing from earlier volatility.

OpenAI’s ratios reached as high as 3,700:1 in March. Perplexity maintained the lowest ratios among leading AI platforms, generally below 400:1 and under 200:1 from September onward.

For comparison, Google’s search crawl-to-refer ratio stayed much lower, generally between 3:1 and 30:1 throughout the year.
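
The ratio itself is trivial to compute from your own server logs: crawler requests divided by referred visits. A quick sketch with made-up counts, chosen to roughly match the magnitudes Cloudflare reports.

```python
# Crawl-to-refer ratio from your own logs: crawler requests divided by referred
# visits. The counts below are made up for illustration, scaled to roughly
# match the magnitudes reported by Cloudflare.

log_counts = {
    # platform: (crawler requests seen, visits referred back to the site)
    "Anthropic":  (250_000, 6),
    "OpenAI":     (74_000, 20),
    "Perplexity": (3_900, 22),
    "Google":     (120_000, 9_500),
}

for platform, (crawls, referrals) in log_counts.items():
    ratio = crawls / max(referrals, 1)
    print(f"{platform:10} crawl-to-refer ratio ~ {ratio:,.0f}:1")
```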

User-Action Crawling Grew Over 20X

Not all AI crawling is for model training. “User action” crawling occurs when bots visit sites in response to user questions posed to chatbots.

This category saw the fastest growth in 2025. User-action crawling volume increased more than 15 times from January through early December. The trend closely matched the traffic pattern for OpenAI’s ChatGPT-User bot, which visits pages when users ask ChatGPT questions.

The growth showed a weekly usage pattern starting in mid-February, suggesting increased use in schools and workplaces. Activity dropped during June through August when students were on break and professionals took vacations.

AI Crawlers Most Blocked In Robots.txt

Cloudflare analyzed robots.txt files across nearly 3,900 of the top 10,000 domains. AI crawlers were the most frequently blocked user agents.

GPTBot, ClaudeBot, and CCBot had the highest number of full disallow directives. These directives tell crawlers to stay away from entire sites.

Googlebot and Bingbot showed a different pattern. Their disallow directives leaned heavily toward partial blocks, likely focused on login endpoints and non-content areas rather than full site blocking.

Civil Society Became Most-Attacked Sector

For the first time, organizations in the “People and Society” vertical were the most targeted by attacks. This category includes religious institutions, nonprofits, civic organizations, and libraries.

The sector received 4.4% of global mitigated traffic, up from under 2% at the start of the year. Attack share jumped to over 17% in late March and peaked at 23.2% in early July.

Many of these organizations are protected by Cloudflare’s Project Galileo.

Gambling and games, the most-attacked vertical in 2024, saw its share drop by more than half to 2.6%.

Other Key Findings

Cloudflare’s report included several additional findings across traffic, security, and connectivity.

Global Internet traffic grew 19% year-over-year. Growth stayed relatively flat through mid-April, then accelerated after mid-August.

Post-quantum encryption now secures 52% of human traffic to Cloudflare, nearly double the 29% share at the start of the year.

ChatGPT remained the top generative AI service globally. Google Gemini, Windsurf AI, Grok/xAI, and DeepSeek were new entrants to the top 10.

Starlink traffic doubled in 2025, with service launching in more than 20 new countries.

Nearly half of the 174 major Internet outages observed globally were caused by government-directed shutdowns. Cable cut outages dropped nearly 50%, while power failure outages doubled.

European countries dominated Internet quality metrics. Spain topped the list for overall Internet quality, with average download speeds above 300 Mbps.

Why This Matters

The AI crawler data should affect how you think about bot access and traffic.

Google’s dual-purpose crawler creates a competitive advantage. You can block other AI crawlers while keeping Googlebot access for search visibility, but you can’t separate Google’s search crawling from its AI training crawling.

The crawl-to-refer ratios help quantify what publishers already suspected. AI platforms crawl heavily but send little traffic back. The gap between crawling and referring varies widely by platform.

The civil society attack data matters if you work with nonprofits or advocacy organizations. These groups now face the highest rate of attacks.

Looking Ahead

Cloudflare expects AI metrics to change as the space continues to evolve. The company added several new AI-related datasets to this year’s report that weren’t available in previous editions.

The crawl-to-refer ratios may change as AI platforms adjust their search features and referral behavior. OpenAI’s ratios already showed some decline through the year as ChatGPT search usage grew.

For robots.txt management, the data shows most publishers are choosing partial blocks for major search crawlers while fully blocking AI-only crawlers. The year-end state of these directives provides a baseline for tracking how publisher policies evolve in 2026.


Featured Image: Mamun_Sheikh/Shutterstock

20 SEO Experts Offer Their Advice For 2026 via @sejournal, @theshelleywalsh

This year has been a continuation of learning and understanding about how AI impacts our industry. It’s been less about the chaos of the initial disruption and more about “how do we leverage this?”

My belief is that SEO is a practice that needs to be adaptable to the end goal and not fixed to any predetermined notions centered around Google, ranking, or keywords. The foundation of SEO is about making yourself visible online wherever your audience can find you.

“S” is for “search engine,” but one of my favorite phrases from the year is from Ashley Liddell, who said “search everywhere optimization,” and that is the perfect approach for the mindset needed to continue into the next level of SEO.

It might be TikTok, YouTube, Google, ChatGPT, or Reddit. Most likely, it’s a combination of all of these.

For the technical side of SEO, it’s fundamental that your pages can be accessed by all search engines and machines. For the content side of SEO, you need to be creating content experiences that can be cited by search engines and machines. And everyone should be thinking about the bottom line: Does this align with the defined business outcome for my client/brand/company? Show me the money.

One critical area I wouldn’t overlook is agentic AI and the development of closed systems completing actions for users. Think booking a holiday, personal shopping for a styled wardrobe, or buying your food shop based on a specific diet. When that happens, you need to ensure you are in the game and included in those closed spaces. Start learning about this now.

As the AI future is coming fast, get ready and go with it rather than resisting it. 2026 is the watershed where you need to get on board to stay in the game.

At Search Engine Journal, we showcase some of the best SEO minds in the industry, and in our usual tradition, we asked 20 of the best practicing SEO experts, including many of our contributors, “In 2026, what should SEOs focus on to maintain visibility and achieve measurable results?”

(Editor’s Note: The following are not in any order of preference and are displayed in the order of who responded first.)

How To Maintain Visibility Online In 2026

1. Be Mentioned In the Right Places

Kevin Indig, Growth Advisor

In 2026, visibility is the result of having the right content, engaging on the right channels, and being mentioned in the right places.

The right content is a mix of hyperlong-tail articles/landing pages tailored to your audience(s) and based on your unique positioning and data stories. People prompt 5x longer than they search on Google, so you want to be the best result for their specific context. LLMs also love fresh, unique datapoints, so you want to create research-driven content.

The right channels are Google, ChatGPT, Reddit, Quora, review sites, LinkedIn, and niche forums. Those are not just the most cited platforms in LLMs but also in Google Search. But being present here takes an engagement strategy rather than an SEO approach.

The right places to be mentioned are authoritative publishers and review sites in your industry. LLMs seem to rely heavily on mentions from other (relevant) sites, so you have to be present in context (surrounding words) that reflect your positioning and market position.

→ Read More: The Alpha Is Not LLM Monitoring

2. We Have To Do More Than Just Appease Google

Cindy Krum, CEO & Founder, MobileMoxie

We have to do more than just appease Google.

Now, to get visibility in all the places where it is needed, having a good website, with high-quality, indexable content, is table stakes; it is the bare minimum, and likely not enough.

For years, Google’s algorithm focused on using content and links to a site to evaluate that particular site, and rank it. AI search utilities and LLMs work very differently. They were designed to find a consensus and synthesize it, and they are looking across all the information that they have access to, to do it.

This means, if you are just relying on your website to create your visibility online, it will not be enough. There is no consensus and minimal synthesis from just one site.

Your branding message needs to be widely distributed across the web to create a consistent but discernibly unique branding message.

→ Read More: Google’s AI Search Journeys Are Reshaping SEO With Cindy Krum

3. Optimize For Systems That Read Like Machines

Duane Forrester, Founder and CEO, UnboundAnswers.com

In 2026, SEOs need to treat visibility as something earned through retrieval, not ranking.

Focus on how content is chunked, cited, and most importantly, trusted by AI systems.

Audit what gets surfaced inside chatbots and answer engines, not just in SERPs.

Build authority signals machines can verify: structured data, consistent sourcing, and entity clarity.

Use embeddings, vector search, and retrieval testing to understand how meaning (not keywords) drives exposure.

Replace “optimize for Google” with “optimize for systems that read like machines.” Your goal isn’t a blue link anymore. It’s being the trusted source those systems turn to when humans ask questions. Trust, in 2026, is paramount.

→ Read More: Ex-Microsoft SEO Pioneer On Why AI’s Biggest Threat To SEO Isn’t What You Think
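
To make the embeddings point concrete, here is a toy sketch of retrieval by meaning rather than keywords: cosine similarity over embedding vectors. Real systems use learned embeddings with hundreds of dimensions; the three-dimensional vectors and passages here are hand-made for illustration.

```python
# Toy illustration of retrieval by meaning rather than keywords: cosine
# similarity over embedding vectors. Real systems use learned embeddings with
# hundreds of dimensions; these 3-d vectors are hand-made for illustration.

from math import sqrt


def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))


passages = {
    "returns policy for damaged goods":      [0.10, 0.90, 0.20],
    "how to ship a parcel overseas":         [0.80, 0.20, 0.10],
    "refunds when an item arrives damaged":  [0.15, 0.85, 0.30],
}

query_embedding = [0.10, 0.80, 0.25]  # e.g. "can I get my money back if it's broken?"

ranked = sorted(passages, key=lambda p: cosine(query_embedding, passages[p]), reverse=True)
print(ranked[0])  # the refunds passage wins on meaning, not on shared keywords
```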

4. Be Retrieved, Cited, And Trusted Wherever Users Search

Carolyn Shelby, Founder, CSHEL Search Strategies

In 2026, SEOs need to refocus on clarity, consistency, and comprehension.

Every channel that describes your brand – your site, feeds, listings, and profiles – must tell the same story, in the same words, in a way both humans *and machines* can understand. That means cleaning up fragmented site structures, removing “hidden” or toggle-buried information, and ensuring the important facts live on the page in visible text. (Note, I did not say Schema doesn’t matter, but I am saying that there are situations where the Schema that is in the JSON-LD is NOT being read, and for those times, it is important that you have valuable product specs and data ON the page, in visible text, and not hidden behind a tab or in a toggle.)

You won’t be penalized or hurt yourself in Google or Bing by *also* optimizing for the lowest-common-denominator crawlers – but you will lose out on that extra visibility if you ignore them. Build pages that are fast (LLMs have a short attention span), crawlable, and semantically clear. Make sure your product, pricing, and positioning statements are consistent across every surface.

The goal isn’t *just* to rank anymore (though ranking is still a necessary first step in most cases). It’s to be retrieved, cited, and trusted wherever users search – whether that’s Google, Bing, or an LLM.

→ Read More: Why AI Search Isn’t Overhyped & What To Focus On Right Now

5. Visibility Will Depend On Agentic Readiness

Andrea Volpini, Co-Founder and CEO, WordLift

In 2026, we are finally designing for the Reasoning Web, where agents will read, decide, and act on our behalf, and SEO becomes the discipline of making these systems effective. Visibility will depend on agentic readiness: clean structured data, stable identifiers, precise ontologies, and knowledge graphs that let agents resolve entities, compare offers, execute tasks, and learn from results.

This is a semantic shift: not simply about being “mentioned” in AI Overviews or ChatGPT, but about exposing products, content, and services as machine-operable assets through feeds, APIs, and tools that make agents smarter every time they interact with us.

The brands that let agents run the show, safely and verifiably, will own the next chapter of search.

→ Read More: How Structured Data Shapes AI Snippets And Extends Your Visibility Quota

6. Search And Product Are Intimately Connected

Ray Grieselhuber, Founder & CEO, DemandSphere

The most important thing, in our view, is understanding that AI search is ubiquitous now across three core experiences: SERPs, LLMs, and agentic experiences.

For the first two, SERPs and LLMs, there is a lot of overlap because they rely on a shared search index (Google in most cases), but the way in which the retrieval process works across these two experiences varies widely. This is why we are hearing that everyone’s No. 1 problem is getting good data, so spend time to make sure your monitoring and data pipelines are accurate and fine-tuned.

For the agentic experience, it’s still early but you should be thinking about how your product strategy will intersect with feeds and APIs (and new, related protocols like MCP). Search and product are intimately connected going forward, and the real ones will know that they always have been.

→ Read More: AI Platform Founder Explains Why We Need To Focus On Human Behavior, Not LLMs

7. Have A Relentless Focus On Being The Best

Barry Adams, Polemic Digital

Whatever you do, don’t lose your mind to the AI hype and try to radically reinvent your SEO efforts. Yes, it will be tougher to grow traffic and revenue from search, but too many SEOs have been coasting along and relying on Google’s own growth to fuel their figures. Now that clicks from Google have stagnated, you’ll need to be smarter about your SEO.

Spend less time and effort on “busywork,” those minor little things that don’t bring any measurable improvement to your traffic. Do the stuff that actually works. Don’t compromise on quality, have a relentless focus on being the best, and make sure you capitalize on your site’s strengths and eradicate its weaknesses.

Sites that are significantly suboptimal, either technically or editorially, will simply not succeed. You have to be all-in on search, without cutting corners and “that will do” concessions. Anything less than that and you will end up on the wrong side of the zero-sum game that Google search has become.

→ Read More: AI Survival Strategies For Publishers

8. Focus On Quality And Conversion Over The Quantity Of Content

Lily Ray, Vice President, SEO Strategy & Research, Amsive

For many years, I’ve answered this question with some version of “focusing on E-E-A-T,” and believe it or not, I think this answer *still* applies in 2026 with the rise of AI search.

Why? Because being mentioned in AI search is all about reputability, experience, and trust. The more your brand is well-known and well-respected in your industry, the more likely LLMs will be to cite you as a trusted and recommended brand. This requires earning mentions and positive reviews in all the places where it matters; having a well-known and well-respected team of individuals who contribute authentic, expert insights into the brand’s content, etc.

As homogenous, AI-generated content floods the internet, users will continue to want to follow real human creators engaged in honest and authentic conversations. Also, focus on the quality and conversion potential of content over the quantity of content, as the latter can cause major SEO headaches over time.

→ Read More: The Role Of E-E-A-T In AI Narratives: Building Brand Authority For Search Success

9. Maintain A Strong Focus On Retrieval Systems And Search Overall

Pedro Dias, Technical SEO/AI Discoverability Consultant, Visively

I believe that, in the current scenario where a significant amount of new (AI) technologies have been introduced between users and how we interact with the web, and are currently being seen through a disruptive lens, it’s more important than ever to maintain objectivity and pragmatism in our approach to organic visibility as a whole, and search in particular. As professionals, we need to understand in depth the changes that we’re being faced with, both from a technical point of view, but also (and maybe more importantly) from a behavioral point of view.

It’s tempting to cling to old habits and metrics to chase around, instead of assessing if and how we need to rework our strategies and tactics. We’re currently being bombarded with an insane amount of tools claiming to “give you insights into AI answers” and promising that they can give you directional “data” – and in some cases even bold claims of outcomes – but we haven’t even started to understand if any kind of optimization can be performed on AI, or even if inference can be influenced in any controlled and desirable way. So far, everyone is mostly just poking around, guessing, and hoping.

So, that said, in 2026, I believe SEOs should maintain a strong focus on retrieval systems and search overall. Make sure your SEO strategy didn’t get stuck in 2005 and that you’re considering all areas that contribute to consistency in visibility, be it content, branding, technical, etc.

Above all, make sure your share-of-voice strategy is omnichannel and isn’t siloed. All this while keeping your curiosity sharp, aiming your critical thinking at the inconsistencies, and being wary of diving in head-first.

Watch out for overpromising claims, outdated methodologies sitting on top of baseless assumptions, and vanity metrics.

→ Read More: AI Overviews – How Will They Impact The Industry? We Ask Pedro Dias

10. Remain Focused On What Drives Impact

Montserrat Cano, MC. International SEO & Digital Strategy

In 2026, SEOs and digital marketers need to combine a deep understanding of how AI platforms work with a strong knowledge of their user base across every market.

As search becomes more personalized, AI-driven, and fragmented, visibility may also depend on understanding local search behaviors, expectations, cultural nuances, and how audiences interact with SERP features and LLMs along the purchase path, often in different ways.

The real value comes from embedding this research into ongoing internal processes such as content planning, prioritization, and testing. This ensures teams remain focused on what drives impact, e.g., the queries and content formats that matter, and the AI experiences users actually engage with.

Grounding strategies in first-party data, current market insights, and continuous learning may protect visibility and help build sustainable growth. In 2026, this becomes a core capability for effective SEO and marketing strategy.

→ Read More: Why The Build Process Of Custom GPTs Matters More Than The Technology Itself

11. Review How Content Is Organized, Linked, And Surfaced

Alex Moss, Principal SEO, Yoast

Site speed, UX, and IA are obvious and constant, but structure is something that needs to be audited and improved in the coming months, as we now need to accommodate for both agents and humans. Review how content is organized, linked, and surfaced.

Schema is essential; in 2026, it will be used more to understand entities and their relationships, which in turn reduces hallucinated responses from agents.
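As a hedged illustration of the kind of entity markup being described here, the sketch below emits Organization JSON-LD from Python. Every name, URL, and property value is a placeholder invented for the example; the point is simply that properties such as sameAs and founder spell out entities and their relationships explicitly.

```python
import json

# Minimal sketch of entity-focused Organization markup.
# All names and URLs are invented placeholders.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Brand",
    "url": "https://www.example.com",
    # sameAs ties the brand to its other entity "homes" across the web.
    "sameAs": [
        "https://www.linkedin.com/company/example-brand",
        "https://en.wikipedia.org/wiki/Example_Brand",
    ],
    # Relationships between entities: the founder and the editorial standards page.
    "founder": {"@type": "Person", "name": "Jane Doe"},
    "publishingPrinciples": "https://www.example.com/editorial-standards",
}

# Serialize into a JSON-LD block for a <script type="application/ld+json"> tag.
print(json.dumps(organization, indent=2))
```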

Also concentrate on IA, query grouping, and internal linking. These strategies have existed for some time, but they need revisiting if you haven’t looked at them recently.

For brand and offsite, shift from old-hat link acquisition and instead focus on brand sentiment through third-party perspectives, including native digital PR (unlinked brand mentions are welcome).

Finally, take advantage of multi-modal content – invest in imagery, video, and platforms beyond traditional search to increase discoverability.

→ Read More: The Same But Different: Evolving Your Strategy For AI-Driven Discovery

12. Focus On Evaluating The Revenue Impact Of Your Strategies

Helen Pollitt, Head of SEO, Getty Images

In 2026, SEOs should be focusing on evaluating the revenue impact of their strategies. Too often, SEOs fall into the trap of trying to optimize for traffic or following the newest advice or fancy tactic.

In reality, the most effective SEO strategies are those that constantly drive towards revenue or other commercial goals. Keeping this premise front and center in your SEO strategy in 2026 will ensure you don’t get sidetracked by the latest SEO fad rather than working on a plan that delivers genuine value to your business.

This means setting out your priorities based on their likelihood of success and their revenue-generating potential. This simple calculation can help you identify which projects or activities are worth focusing on in 2026. You will be able to tell whether the latest “reverse-meta-optimization-deindexing” fad, or whatever it ends up being, is really worth your budget and resources to pursue.
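As a rough sketch of that calculation (with entirely invented project names, probabilities, and revenue figures), you could score each candidate initiative on likelihood of success and revenue potential, then rank by the product:

```python
# Hypothetical projects scored on likelihood of success (0-1) and
# estimated revenue potential if they succeed. All numbers are invented.
projects = [
    {"name": "Refresh top 20 commercial pages", "likelihood": 0.6, "revenue_potential": 90_000},
    {"name": "Fix faceted-navigation crawl waste", "likelihood": 0.8, "revenue_potential": 40_000},
    {"name": "Reverse-meta-optimization-deindexing fad", "likelihood": 0.1, "revenue_potential": 25_000},
]

# Expected value = likelihood x revenue potential; rank projects by it.
for project in projects:
    project["expected_value"] = project["likelihood"] * project["revenue_potential"]

for project in sorted(projects, key=lambda p: p["expected_value"], reverse=True):
    print(f"{project['name']}: expected value = ${project['expected_value']:,.0f}")
```

In this made-up data, the fad lands at the bottom of the list, which is exactly the kind of evidence that makes prioritization conversations easier.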

→ Read More: Ask An SEO: How Can You Distinguish Yourself In This Era Of AI Search Engines?

13. Treat The Website Like An Enterprise System

Bill Hunt, Global Strategist with Bisan Digital

In 2026, SEOs must stop optimizing solely for pages and singular phrases and start optimizing for topical understanding.

AI-driven search systems are no longer ranking documents but evaluating entities, synthesizing answers, and choosing which brands they trust enough to cite. Visibility now depends on three things: clean, authoritative data; deep topical coverage; and systems that make your content easy to retrieve, understand, and reuse. If your site architecture, structured data, and feeds aren’t aligned to these eligibility gates, you’re invisible before the ranking discussion even begins.

The SEOs who will win in 2026 are the ones who treat the website like an enterprise system, not a collection of pages. That means building durable information architecture, improving data reliability, collaborating with product and engineering teams, and creating content designed for synthesis across formats – not just the blue link.

If you’re not strengthening your site’s underlying information integrity and cross-functional alignment, you’re not competing in the new search environment; you’re just publishing.

→ Read More: Industry Pioneer Reveals Why SEO Isn’t Working & What To Refocus On

14. Develop A Distributed Revenue Strategy

John Shehata, CEO & Founder, NewzDash

In 2026, Brand Authority takes the front seat, replacing traffic volume as the primary metric. AI platforms prioritize trusted entities, so you must prove you are one. SEOs need a dual-speed strategy: a short-term strategy that maximizes today’s Google reality, and a long-term plan for a world where traffic and attention are more fragmented.

In the short term, Google is still the primary traffic driver, so optimize for multi-surface and multi-modal visibility. That means targeting AI Overviews, Discover, Top Stories, video, and short-form reels, not just traditional text results.

Convert every visitor into a direct connection through email, apps, and owned communities. At the same time, double down on entity and topic authority, publish useful and unique content that is hard for AI to replicate, such as strong opinion, investigative work, and proprietary data, and strengthen technical SEO, structured data, and answer-ready formatting.

Long-term: Prepare for a post-click reality. Develop a distributed revenue strategy driven by a creator network that monetizes attention directly on social platforms and AI interfaces, accepting that success means revenue generated off-site, not just on your domain.

→ Read More: Google Discover, AI Mode, And What It Means For Publishers: Interview With John Shehata

15. Really Focus On Your Audience

Harry Clarkson-Bennett, SEO Director, The Telegraph

This is very brand and customer-dependent. My best advice is to really focus on your audience. Speak to them. Understand the impact SEO should have vs the impact it currently has. There may still be easy wins on the table. Don’t neglect it.

If you use a last-click attribution system, I suspect SEO is over-valued. Work with your analytics team to trial multi-touch attribution and figure out the value of each channel. Then work with your PPC, social, and newsletter teams to create a proper marketing and acquisition strategy. Build your owned channels. Improve your blended CPA and solve real business problems.
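To see why the attribution model matters, here is a toy comparison of last-click versus linear multi-touch credit over a few invented customer journeys. It’s a sketch of the mechanic, not a substitute for a proper analytics implementation.

```python
from collections import defaultdict

# Invented journeys: ordered channel touchpoints ending in a conversion worth `value`.
journeys = [
    {"touchpoints": ["social", "email", "seo"], "value": 100},
    {"touchpoints": ["newsletter", "seo"], "value": 80},
    {"touchpoints": ["ppc", "social"], "value": 60},
]

last_click = defaultdict(float)
linear = defaultdict(float)

for journey in journeys:
    # Last-click: the final touchpoint gets all of the credit.
    last_click[journey["touchpoints"][-1]] += journey["value"]
    # Linear multi-touch: credit is split evenly across every touchpoint.
    share = journey["value"] / len(journey["touchpoints"])
    for channel in journey["touchpoints"]:
        linear[channel] += share

print("Last-click credit:", dict(last_click))
print("Linear multi-touch credit:", dict(linear))
```

In these invented journeys, SEO collects 180 of the 240 units of value under last-click but only around 73 under a linear model, which is the kind of gap worth putting in front of your analytics team.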

This is the year you manage up more effectively and stop silo-ing channels and people. Make SEO Great Again.

→ Read More: The Impact AI Is Having On The Marketing Ecosystem

16. Transform Metrics Into Strategic Levers

Motoko Hunt, International SEO Consultant, AJPR

Audit and evolve your measurement framework. Many organizations track extensive data points without translating insights into actionable optimization strategies. The key differentiation lies not in data collection, but in strategic application.

Adapt your metrics architecture for the fragmented SERP landscape. With AI Overviews, featured snippets, and expanding SERP features fragmenting traditional organic visibility, implement granular tracking that isolates performance by SERP element. This segmentation reveals where you’re capturing attention and, more critically, where competitors are intercepting traffic before users reach your listings.

Balance emerging channels with revenue-driving fundamentals. AI search warrants monitoring – track share of voice in AI-generated responses and assess brand mention quality. However, at current adoption rates, AI search primarily serves upper-funnel awareness objectives. Your core optimization efforts should remain anchored to proven conversion pathways: traditional organic search, site experience optimization, and technical excellence that drives qualified traffic and revenue.

Transform metrics into strategic levers. Don’t just report CTR decline from position 3 to 5 – quantify the revenue impact, and identify the ranking factors at play. Connect performance gaps directly to business outcomes, then prioritize initiatives that close those gaps with the highest ROI potential.
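As a back-of-the-envelope example of that last point, here is how a position drop could be translated into revenue. Every figure (search volume, CTRs, conversion rate, order value) is invented; swap in your own query-level data.

```python
# All inputs below are invented placeholders; replace them with your own data.
monthly_searches = 20_000            # monthly search volume for the query
ctr_before, ctr_after = 0.11, 0.06   # assumed CTR at position 3 vs. position 5
conversion_rate = 0.02               # share of organic visits that convert
average_order_value = 120.0          # revenue per conversion

lost_clicks = monthly_searches * (ctr_before - ctr_after)
lost_revenue = lost_clicks * conversion_rate * average_order_value

print(f"Estimated lost clicks per month:  {lost_clicks:,.0f}")
print(f"Estimated lost revenue per month: ${lost_revenue:,.2f}")
```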

→ Read More: Effective SEO Organizational Structure For A Global Company

17. Be Aware Of Falsehoods Which Will Continue To Circulate

Dawn Anderson, International SEO Consultant, Bertey

In 2026, SEOs should accept that we continue to have a steeper-than-ordinary SEO learning curve ahead of us. How AI is going to fully impact our industry over time continues to be largely an educated guessing game.

LLMs and agentic search provide a considerable opportunity, but it is important not to presume that producing copy-and-paste AI LLM slop will make the cut for performative SEO in 2026, since this is a degenerative downward quality spiral. Instead, we must prioritize adding authentic value beyond the norm, standing head and shoulders above competitors, and using AI predominantly for efficiency, kick-starting ideation, prototype generation, and concept testing.

Building an increasingly robust reputation and authority through quality and connections should remain a key priority, particularly as consensus opinion in verticals will continue to build via cumulative LLM extractions, shaping competitive narratives.

We should also be aware of falsehoods, which will continue to circulate in the vacuum of genuine knowledge that these severe industry changes create. Don’t end up going down wrong paths that may be very difficult to return from in the short to medium term.

→ Read More: Building Trust In The AI Era: Content Marketing Ethics And Transparency

18. Understand The User And How They Make Decisions

Giulia Panozzo, Founder, Neuroscientive

I believe that our key to achieving measurable results in 2026 is looking beyond the tactics and the new shiny tools: we need to get back to basics and really understand the user, their motivations, their frustrations, and mostly how they make decisions.

When customers decide to engage with a brand, a product, or a service, they do so by leveraging a number of micro-decisions that have very little to do with our marketing tactics and a lot more to do with their expectations and needs, their personal experiences, and the perception they have about us. A lot of those choices are made subconsciously, before they are even aware of them – and as a result, they are not visible by looking at traditional metrics.

So, focus on the bigger picture by working cross-functionally to understand not only how people get to your site, but also what underlying needs and expectations they bring. Leverage social listening, CX logs, and on-site behavioral metrics to learn what they need to see and engage with before they even click on your result in the SERP.

→ Read More: The Behavioral Data You Need To Improve Your Users’ Search Journey

19. Find Ways To Differentiate Yourself From The Noise

Alli Berry, SEO Director, Marketwise

Looking into 2026 and beyond, I think SEOs need to be focused heavily on digital PR efforts and getting brand mentions and links from influential sites and people. I think we’re going to hit a point where what others say about your brand is going to have more weight than what you say about your own brand.

We’re already starting to see that with Reddit and forums, and as LLMs gain more traction, that is only becoming a more important factor in gaining visibility.

I’d also be focused on finding unique content angles that can’t be easily replicated by AI. Whether it’s telling customer stories or doing primary research, you’re going to need to find ways to differentiate yourself from the noise.

→ Read More: How To Get Brand Mentions In Generative AI

20. Have Influence Where Your Audience Is

Shelley Walsh, Managing Editor, Search Engine Journal & IMHO

During times of significant flux, go to the fundamentals and hold on: Know where your audience is finding its trusted information and have influence in those spaces.

If you embrace this core maxim, it will guide you through all the changes that Google, discovery engines, LLMs, and whatever comes next can throw at you.

However, don’t overlook the significant changes happening with technology that do influence the channels through which audiences can find us. Also, pay attention to how agentic SEO is developing so that you can consider now how you could apply it to your niche.

Don’t get caught up in pointless arguments over nomenclature or in hype cycles chasing distractions. Keep focusing on what a user wants, and apply your brand presence and message where they can see it. Everyone is running around like the sky is falling, but it’s all just SEO.

→ Read More: Google’s Old Search Era Is Over – Here’s What 2026 SEO Will Really Look Like

SEO In 2026

Most of our experts are saying the same thing: what is changing is not so much the how, but the where.

Search is happening everywhere, and you need to ensure your brand narrative is accessible and consistent across all the channels where your audience is.

However, that means being mentioned in the right places, and constantly asking: “Does this move the needle for revenue, or is it just more noise?”

The future of search is being built in real time, so make sure you’re not just watching it happen, but actively shaping how your brand shows up in it.



Featured Image: Paulo Bobita/Search Engine Journal

Why Your AI Agent Keeps ‘Hallucinating’ (Hint: It’s Your Data, Not The AI) via @sejournal, @purnavirji

If it looks like an AI hallucination problem, and sounds like an AI hallucination problem, it’s probably a data hygiene problem.

I’ve sat through dozens of demos this year where marketing leaders show me their shiny new AI agent, ask it a basic question, and watch it confidently spit out information that’s either outdated, conflicting, or flat-out wrong.

The immediate reaction is to blame the AI: “Oh, sorry the AI hallucinated. Let’s try something different.”

But was it really the AI hallucinating?

Don’t shoot the messenger, as the saying goes. While the AI is the messenger bringing you what looks like inaccurate data or hallucination, it’s really sending a deeper message: Your data is a mess.

The AI is simply reflecting that mess back to you at scale.

The Data Crisis Hiding Behind “AI Hallucinations”

An Adverity study found that 45% of marketing data is inaccurate.

Almost half of the data feeding your AI systems, your reporting dashboards, and your strategic decisions is wrong. And we wonder why AI agents give vague answers, contradict themselves, or pull messaging that no one’s used since 2022.

Here’s what I see in nearly every enterprise:

  • Three teams operating with three different definitions of ideal customer profile (ICP).
  • Marketing defines “conversion” one way, sales defines it another.
  • Buyer data scattered across six systems that barely acknowledge each other’s existence.
  • A battlecard last updated in 2019 still floating around, treated like gospel by your AI agent.

When your foundational data argues with itself, AI doesn’t know which version to believe. So it picks one. Sometimes correctly. Often not.

Why Clean Data Matters More Than Smart AI

AI isn’t magic. It reflects whatever you feed it: the good, the bad, and the three-years-outdated.

Everyone wants the “build an agent” sexy moment. The product demo that has everyone applauding. The efficiency gains that guarantee a great review, heck, maybe even a raise.

But the thing that makes AI useful is the boring, unsexy, foundational work of data discipline.

I’ve watched companies spend six figures on AI infrastructure while their product catalog still has duplicate entries from a 2021 migration. I’ve seen sales teams adopt AI coaching tools while their CRM defines “qualified lead” three different ways depending on which region you ask.

The AI works exactly as designed. The problem is what it’s designed to work with.

If your system is messy, AI can’t clean it up (at least, not yet). It amplifies the mess at scale, across every interaction. As much as we would like for it to, even the sexiest AI model in the world won’t save you if your data foundation is broken.

The Real Cost Of Bad Data Hygiene

When your data is inaccurate, inconsistent, or outdated, mistakes are inevitable. These can get risky quickly, especially if they negatively impact customer experience or revenue.

Here’s what that looks like in practice:

Your sales agent gives prospects pricing that changed six months ago because nobody updated the product sheet it’s trained on.

Your content generation tool pulls brand messaging from 2020 because the 2026 messaging framework lives in a deck on someone’s desktop.

Your lead scoring AI uses ICP criteria that marketing and sales never agreed on, so you’re nurturing the wrong prospects while ignoring the right ones.

Your sales enablement agent recommends a case study for a product you discontinued last quarter because nobody archived the old collateral.

This is happening every single week in enterprises that have invested millions in AI transformation. And most teams don’t even realize it until a customer or prospect points it out.

Where To Start: 5 Steps To Fix Your Data Foundation

The good news: You don’t need a massive transformation initiative to fix this. You need discipline and ownership.

1. Audit What Your AI Can Actually See

Before you can fix your data problem, you need to understand its scope.

Pull every document, spreadsheet, presentation, and database your AI systems have access to. Don’t assume. Actually look.

You’ll more than likely find:

  • Conflicting ICP definitions across departments.
  • Outdated pricing from previous years.
  • Messaging from three rebrand cycles ago.
  • Competitive intel that no longer reflects market reality.
  • Case studies for products you no longer sell.

Retire what’s wrong. Update what’s salvageable. Be ruthless about what stays and what goes.
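A hedged starting point for that audit is a small script that walks whatever folders your AI tooling indexes and flags files nobody has touched in years. The path and the two-year threshold below are placeholders; adjust them to your own environment.

```python
from datetime import datetime, timedelta
from pathlib import Path

# Placeholder: point this at the folders your AI systems can actually see.
CONTENT_ROOT = Path("/shared/enablement")
STALE_AFTER = timedelta(days=730)  # flag anything untouched for roughly two years

now = datetime.now()
for path in CONTENT_ROOT.rglob("*"):
    if path.is_file():
        modified = datetime.fromtimestamp(path.stat().st_mtime)
        if now - modified > STALE_AFTER:
            print(f"STALE ({modified:%Y-%m-%d}): {path}")
```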

2. Create One Source Of Truth

This is non-negotiable. Pick one system for every definition that matters to your business:

  • ICP criteria.
  • Conversion stage definitions.
  • Territory assignments.
  • Product positioning.
  • Competitive differentiators.

Everyone pulls from it. No exceptions. No “but our team does it differently.”

When marketing and sales use different definitions, your AI can’t arbitrate. It picks one randomly. Sometimes it picks both and contradicts itself across interactions.

One source of truth eliminates that chaos.
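In practice, that source of truth can start as something as simple as one version-controlled definitions file that every team (and every agent) reads from. The fields and values below are purely illustrative.

```python
# definitions.py - a single, version-controlled source of truth.
# Every value below is an illustrative placeholder, not a recommendation.
SOURCE_OF_TRUTH = {
    "icp": {
        "company_size": "200-2,000 employees",
        "industries": ["SaaS", "ecommerce"],
        "buyer_roles": ["VP Marketing", "Head of Sales Ops"],
    },
    "qualified_lead": "Fits the ICP and has booked a discovery call",
    "conversion_stages": ["MQL", "SQL", "Opportunity", "Closed Won"],
}

def get_definition(key: str):
    """Fail loudly when a team asks for a definition that hasn't been agreed on."""
    if key not in SOURCE_OF_TRUTH:
        raise KeyError(f"No agreed definition for '{key}' - add it here first.")
    return SOURCE_OF_TRUTH[key]
```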

3. Set Expiration Dates For Everything

Every asset your AI can access should have a “valid until” date.

Battlecards. Case studies. Competitive intelligence. Messaging frameworks. Product specs.

When it expires, it automatically disappears from AI access. No manual cleanup required. No hoping someone remembers to archive old content.

Stale data is worse than no data. At least with no data, your AI admits it doesn’t know. With stale data, it confidently delivers wrong information.
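A minimal version of that rule, assuming each asset record carries a valid_until date, might look like the sketch below: anything past its date simply never reaches the agent.

```python
from datetime import date

# Hypothetical asset records; in practice these would come from your CMS or DAM.
assets = [
    {"name": "Competitor battlecard", "valid_until": date(2024, 6, 30)},
    {"name": "2026 messaging framework", "valid_until": date(2026, 12, 31)},
]

def retrievable(assets, today=None):
    """Return only unexpired assets; expired ones are excluded from AI access."""
    today = today or date.today()
    return [asset for asset in assets if asset["valid_until"] >= today]

for asset in retrievable(assets):
    print("Still visible to the agent:", asset["name"])
```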

4. Test What Your AI Actually Knows

Don’t assume your AI is working correctly. Test it.

Ask basic questions:

  • “What’s our ICP?”
  • “How do we define a qualified lead?”
  • “What’s our current pricing for [product]?”
  • “What differentiates us from [competitor]?”

If the answers conflict with what you know is true, you just found your data hygiene problem.

Run these tests monthly. Your business changes. Your data should change with it.
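Those monthly checks can be automated as a tiny regression suite. In the sketch below, ask_agent is a hypothetical stand-in for however you actually query your agent, and the expected answers come from your source of truth.

```python
def ask_agent(question: str) -> str:
    """Hypothetical stand-in: wire this up to your agent's API or chat interface."""
    raise NotImplementedError

# Expected answers should be pulled from your single source of truth.
CHECKS = {
    "How do we define a qualified lead?": "booked a discovery call",
    "What's our current pricing for the Pro plan?": "$99 per seat per month",
}

def run_monthly_checks():
    failures = []
    for question, expected in CHECKS.items():
        answer = ask_agent(question)
        # Simple containment check; a human should still review anything flagged.
        if expected.lower() not in answer.lower():
            failures.append((question, expected, answer))
    return failures
```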

5. Assign Someone To Own It

Data discipline without ownership is a Slack thread that goes nowhere.

One person needs to be explicitly responsible for maintaining your source of truth. Not as an “additional responsibility.” As a core part of their role.

This person:

  • Reviews and approves all updates to the source of truth.
  • Sets and enforces expiration dates for assets.
  • Runs monthly audits of what AI can access.
  • Coordinates with teams to retire outdated content.
  • Reports on data quality metrics.

Without ownership, your data hygiene initiative dies in three months when everyone gets busy with other priorities.

The Bottom Line: Foundation Before Flash

If you don’t fix the mess, AI will scale the mess.

Deploying powerful AI on top of chaotic data is inefficient at best; at worst, it can actively damage your brand, your customer relationships, and your competitive position.

You can have the most sophisticated AI model in the world. The best prompts. The most expensive infrastructure. None of it matters if you’re feeding it garbage. It takes a disciplined foundation to make it work.

It’s like seeing someone with perfectly white teeth and thinking they just got lucky. What you don’t see is the daily flossing, the regular dental cleanings, the discipline of avoiding sugar and brushing twice a day for years.

Or watching an Olympic athlete make a performance look effortless. You’re not seeing the 5 a.m. training sessions, the strict diet, the thousands of hours of practice that nobody applauds.

The same applies to AI.

To get real value and ROI from AI, start with setting it up for success with the right data foundation. Yes, it might not be the most glamorous or exciting work. But it is what makes the glamorous and exciting possible.

Remember, your AI isn’t hallucinating. It’s telling you exactly what your data looks like.

The question is: Are you ready to fix it?



Featured Image: BestForBest/Shutterstock

Google Updates Search Live With Gemini Model Upgrade via @sejournal, @martinibuster

Google has updated Search Live with Gemini 2.5 Flash Native Audio, upgrading how voice functions inside Search while also extending the model’s use across translation and live voice agents. The update introduces more natural spoken responses in Search Live and reflects Google’s effort to improve natural voice queries and treat voice as a core interface: users get everything they would from regular search, can ask questions about the physical world around them, and can receive immediate voice translations between two people speaking different languages.

The updated voice capabilities, rolling out this week in the United States, enable Google’s voice responses to sound more natural, and they can even be slowed down for instructional content.

According to Google:

“When you go Live with Search, you can have a back-and-forth voice conversation in AI Mode to get real-time help and quickly find relevant sites across the web. And now, thanks to our latest Gemini model for native audio, the responses on Search Live will be more fluid and expressive than ever before.”

Broader Gemini Native Audio Rollout

This Search upgrade is part of a broader update to Gemini 2.5 Flash Native Audio rolling out across Google’s ecosystem, including Gemini Live (in the Gemini App), Google AI Studio, and Vertex AI. The model processes spoken audio in real time and produces fluid spoken responses, reducing friction in live interactions. Google’s announcement didn’t say whether the model is a speech-to-speech model (as opposed to speech-to-text followed by text-to-speech), but the update follows Google’s October announcement of Speech-to-Retrieval (S2R), which it described as “a neural network-based machine-learning model trained on large datasets of paired audio queries.”

These changes show Google treating native audio as a core capability across consumer-facing products, making it easier for users to ask for and receive information about the physical world around them in a natural manner that wasn’t previously possible.

Improvements For Voice-Based Systems

For developers and enterprises building voice-based systems, Google says the updated model improves reliability in several areas. Gemini 2.5 Flash Native Audio more consistently triggers external functions during conversations, follows complex instructions, and maintains context across multiple turns. These improvements make live voice agents more dependable in real-world workflows, where misinterpreted instructions or broken conversational flow reduce usability.

Smooth Conversational Translation

Beyond Search and voice agents, the update introduces native support for “live speech-to-speech translation.” Gemini translates spoken language in real time, either by continuously translating ambient speech into a target language or by handling conversations between speakers of different languages in both directions. The system preserves vocal characteristics such as speech rhythm and emphasis, supporting translation that sounds smoother and conversational.

Google highlights several capabilities supporting this translation feature, including broad language coverage, automatic language detection, multilingual input handling, and noise filtering for everyday environments. These features reduce setup friction and allow translation to occur passively during conversation rather than through manual controls. The result is a translation experience that behaves much like an actual person in the middle translating between two people.

Voice Search Realizing Google’s Aspirations

The update reflects Google’s continued iteration of voice search toward an ideal that was originally inspired by the science fiction voice interactions between humans and computers in the popular Star Trek television and movie series.

Read More:

  • Google Announces A New Era For Voice Search
  • You can now have more fluid and expressive conversations when you go Live with Search
  • Improved Gemini audio models for powerful voice interactions
  • Gemini Live
  • 5 ways to get real-time help by going Live with Search

Featured Image by Shutterstock/Jackbin

SEO Pulse: December Core Update, Preferred Sources & Social Data via @sejournal, @MattGSouthern

The December 2025 core update is the main story this week.

Google confirmed a new broad ranking update, clarified how often core changes happen, expanded Preferred Sources in Top Stories, and started testing social performance data in Search Console Insights.

Here’s what matters for your work.

Google Releases December 2025 Core Update

Google has released the December 2025 core update, its third core update of the year.

Key Facts

The rollout started on December 11, and Google says it may take up to three weeks to complete. This follows the March and June core updates and comes two days after Google refreshed its core updates documentation to explain smaller, ongoing changes.

Why SEOs Should Pay Attention

If you see big swings in rankings or traffic over the next few weeks, this update is probably the cause.

Core updates are broad changes to how Google evaluates content. Pages can move up or down even if you haven’t changed anything on the site, because Google is reassessing your content against everything else in the index.

The timing matters. Earlier in the week, Google reminded everyone that smaller core updates happen all the time. The December core update sits on top of that layer. You’re dealing with both a visible event and quieter, continuous adjustments running underneath.

Right now, the best move is to watch your data rather than panic. Mark the rollout dates in your reporting. Track when things start to move for your key sections. Compare this behavior with what you saw during the March and June updates. That helps you separate core-update effects from seasonality, technical issues, or campaign changes.
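One lightweight way to do that is to label your daily click data by rollout period and compare averages. The sketch below assumes a CSV export with date and clicks columns; the filename is a placeholder, and only the December 11 start date comes from Google’s announcement.

```python
import pandas as pd

# Placeholder export (e.g., from Search Console) with `date` and `clicks` columns.
df = pd.read_csv("daily_clicks.csv", parse_dates=["date"])

ROLLOUT_START = pd.Timestamp("2025-12-11")
ROLLOUT_END = ROLLOUT_START + pd.Timedelta(weeks=3)  # "may take up to three weeks"

def label_period(day):
    if day < ROLLOUT_START:
        return "before"
    if day <= ROLLOUT_END:
        return "during"
    return "after"

df["period"] = df["date"].apply(label_period)
print(df.groupby("period")["clicks"].mean().round(1))
```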

Over the longer term, this is another nudge toward content that shows clear expertise, purpose, and useful detail. The documentation change earlier in the week suggests those improvements can be recognized over time, not only when Google names a new core update.

What SEO Professionals Are Saying

Reactions on X focused on timing, expectations, and the kind of content that might come out ahead.

Some SEO professionals leaned into the holiday angle, joking that Google’s “Christmas update” could either deliver a gift or push sites “off a cliff” right before peak season. Others used the announcement to talk about human-written work, saying they hope this is the update where stronger, human-generated content gets more visibility.

There were also practical reads. A few people tied the update to recent delays in Search Console data, saying the backlog now makes more sense. Others pointed out that this is the third broad update in a year where Google is also investing heavily in AI systems, and that core updates now sit inside a bigger stack of changes rather than defining everything on their own.

Read our full coverage: Google Releases December 2025 Core Update

Google Confirms Smaller Core Updates Happen Continuously

Earlier in the week, Google updated its core updates documentation to spell out that ranking changes can happen between the named core updates.

Key Facts

The documentation now says Google makes smaller core updates on an ongoing basis, alongside the larger core updates it announces a few times a year. Google explained that this change is meant to clarify that sites can see ranking gains after making improvements without waiting for the next big announcement.

Smaller core updates were mentioned in a 2019 blog post, but this is the first time the concept appears directly in the core updates documentation.

Why SEOs Should Pay Attention

This answers a question that has been hanging over SEO for years. Recovery isn’t limited to moments when Google announces a core update. The new wording confirms that Google can reward improvements at any time as smaller updates roll out in the background.

If you’ve been holding back on site fixes or content work until “the next core update,” this is a good time to drop that pattern. You can ship improvements now, knowing there’s more than one window where Google might reassess your content.

The timing is interesting given this year’s release pattern. Until this week, the only named core updates in 2025 were the March and June releases, with several months between them. For sites hit early in the year, those gaps made it hard to know when changes might start to pay off. The December update adds another obvious checkpoint, but the documentation makes it clear that it isn’t the only one.

For reporting and communication, this supports a change from “wait for the next update” to “improve steadily and monitor continuously.” You still don’t need to chase every drop, but you can be more confident that sustained work has more than one chance to show up in the data.

What SEO Professionals Are Saying

Former Google search team member Pedro Dias summed up one common read, saying he thinks Google has finally reached a place where it doesn’t need to announce every core update separately. Others have connected the change to Google’s move toward layered ranking systems, where visible events are only one part of an ongoing stream of tweaks.

For you, that supports a slower, steadier approach. Instead of waiting for one moment to “fix” everything, you can keep tuning content and UX, and treat named core updates as checkpoints rather than the only chance to move.

Read our full coverage: Google Confirms Smaller Core Updates Happen Continuously

Google Expands Preferred Sources In Top Stories

Google is expanding Preferred Sources globally for English-language users, giving people more control over which outlets show up in Top Stories and similar news surfaces.

Key Facts

Preferred Sources lets people pick specific outlets they want to see more often when they browse news in Google Search. The feature is now rolling out to English-language users worldwide, with other supported languages planned for early next year. Google says people have already selected close to 90,000 different sources, from local blogs to large international publishers, and that users who mark a site as preferred tend to click through to it about twice as often.

Why SEOs Should Pay Attention

Preferred Sources gives you a direct way to turn casual readers into regulars inside Google’s own interfaces. If your site publishes timely coverage, you can now build a segment of people who have chosen to see more of your work in Top Stories.

That makes “choose us as a preferred source” another call to action you can test alongside email sign-ups and follow buttons. Some publishers are already creating simple guides that show readers how to add them and what changes once they do. You can take a similar approach, especially if you already have a loyal audience on site or through newsletters.

It’s also a signal that Google wants users to have more say in which outlets they see. For you, that means brand perception, clarity of coverage, and consistency matter a bit more, because people are deciding which sources they want in their feed instead of relying on a default mix.

What SEO Professionals Are Saying

On LinkedIn, several SEO professionals and content strategists pointed out that Preferred Sources mostly reinforces behavior that already exists.

Garrett Sussman notes that people tend to stick with outlets they trust. This feature simply makes that choice more visible and gives publishers another growth lever inside Google’s ecosystem.

If you work on news or frequently updated content, you can start treating Preferred Sources selection as its own metric. Watch how often people choose you, which articles tend to drive that choice, and how those readers behave over time.

Read our full coverage: Google Expands Preferred Sources & Publisher AI Partnerships

Google Tests Social Channel Insights In Search Console

Search Console is testing a feature that shows how your social channels perform in Google Search results.

Key Facts

Google announced a new experimental feature in Search Console that adds social performance data to the Search Console Insights report. It covers social profiles that Google has automatically associated with your site. For each connected profile, you can see clicks, impressions, top queries, trending content, and audience location.

The experiment is limited to a small set of properties, and you can’t manually add profiles. The feature only appears if Search Console detects your channels and prompts you to link them.

Why SEOs Should Pay Attention

Up to now, you’ve probably watched search performance for your site and your social channels in separate tools. This experiment pulls both into one place, which can save time and make it easier to see how people move between your website and your social profiles.

The new data shows which queries lead people to your social profiles, which posts tend to surface in search, and which markets use Google to find you on social platforms. That’s useful if you run campaigns where organic search, social content, and creator work all overlap.

The main limitation is access. If you don’t see a prompt in Search Console Insights asking you to connect detected social channels, your site isn’t in the initial test group. Still, it’s worth logging as a feature to watch, especially if you already spend time explaining how social content shows up for branded and navigational queries.

What SEO Professionals Are Saying

Reactions on LinkedIn focused on two main points. People liked the idea of a single view of website and social performance, and they quickly started asking when similar data might be available for AI Overviews, AI Mode, and other search experiences.

Others raised questions about coverage. Some practitioners want to know whether this data will stay limited to Google-owned properties or expand to platforms like Instagram, LinkedIn, and X. There’s also curiosity about how Google detects and links social profiles in the first place, and whether structured data or Knowledge Graph entities play a role.

Read our full coverage: Google Tests Social Channel Insights In Search Console

Theme Of The Week: Core Updates At Two Speeds

The common thread this week is movement at two speeds.

At one speed, you have the December 2025 core update. It’s a visible event with a clear start date, a multi-week rollout, and a lot of attention. At the other speed, you have the quieter changes around it.

Google has now said directly that smaller core updates happen all the time. Preferred Sources gives users more control over which outlets they see. Social insights start to connect website and social performance in one view.

For you, this means there’s no single moment when everything gets decided. Core updates still matter and can cause sharp movements, but they sit inside an environment where improvements can pay off gradually and where readers are making more explicit choices about who they want to hear from.

The practical response is to treat this as an ongoing feedback loop. Keep improving content and UX. Watch how those changes behave during calm periods and during core updates. Encourage your most engaged readers to mark you as a preferred source where they can. Keep an eye on how search and social interact for your brand. That way, you’re ready for both speeds.




Featured Image: Pixel-Shot/Shutterstock

Google Releases December 2025 Core Update via @sejournal, @MattGSouthern

Google has released the December 2025 core update, the company confirmed through its Search Status Dashboard.

The rollout began at 9:25 a.m. Pacific Time on December 11, 2025.

This marks Google’s third core update of 2025, following the March and June core updates earlier this year.

What’s New

Google lists the update as an “incident affecting ranking” on its status dashboard.

The company states the rollout “may take up to three weeks to complete.”

Core updates are broad changes to Google’s ranking systems designed to improve search results overall. Unlike specific updates targeting spam or particular ranking factors, core updates affect how Google’s systems assess content across the web.

2025 Core Update Timeline

The December update follows two previous core updates this year.

The March 2025 core update rolled out from March 13-27, taking 14 days to complete. Data from SEO tracking providers suggested volatility similar to the December 2024 core update.

The June 2025 core update ran from June 30 to July 17, lasting about 16 days. SEO data providers indicated it was one of the larger core updates in recent memory. Some sites previously hit by the September 2023 Helpful Content Update saw partial recoveries during this rollout.

Documentation Update On Continuous Changes

Two days before this core update, Google updated its core updates documentation with new language about ongoing algorithm changes.

The updated documentation now states:

“However, you don’t necessarily have to wait for a major core update to see the effect of your improvements. We’re continually making updates to our search algorithms, including smaller core updates. These updates are not announced because they aren’t widely noticeable, but they are another way that your content can see a rise in position (if you’ve made improvements).”

Google explained that the addition was meant to clarify that content improvements can lead to ranking changes without waiting for the next announced update.

Why This Matters

If you notice ranking fluctuations over the coming weeks, this update is likely a major factor.

Core updates can shift rankings for pages that weren’t doing anything wrong. Google has consistently stated that pages losing visibility after a core update don’t necessarily have problems to fix. The systems are reassessing content relative to what else is available.

The documentation update is a reminder that rankings can change between major updates as Google rolls out smaller core changes behind the scenes.

Looking Ahead

Google will update the Search Status Dashboard when the rollout is complete.

Monitor your rankings and traffic over the next three weeks. If you see changes, document when they occurred relative to the rollout timeline.

Based on 2025’s previous updates, completion typically takes two to three weeks. Google will confirm completion through the dashboard and its Search Central social accounts.