Google Podcast Discusses SEO Expertise via @sejournal, @martinibuster

Google’s recent Search Off the Record podcast touched on the issue of SEO expertise and the disconnect between how SEOs think Google ranks websites and how Googlers understand it. The disparity is so great that Gary Illyes remarked that sometimes he doesn’t know what SEOs are talking about.

Googlers Question SEO Expertise

Martin Splitt discussed meeting Turkish publishers and SEOs of different experience levels at a Google event in Turkey, where attendees complained of poor search results. It turned out that the problem wasn’t Google’s search results; it was an issue with how Turkish websites are created, which indirectly called into question the SEO expertise of Turkish-language publishers.

He said:

“And then eventually we worked out as a group as a whole, that there are a lot of problems with the way that content is created in Turkish language websites…”

Gary Illyes expanded on Martin’s comment about experience levels, saying that it’s a subjective thing: some people who describe themselves as newbies are actually extremely knowledgeable about the fine details of indexing and crawling, while some self-described SEO gurus ask questions that don’t make sense.

Gary shared:

“The thing you mentioned about experience, I came to realize the past few years that that’s a very subjective thing. Like, when you are asking people, ‘What’s your experience?’ And they are like, ‘Oh, I’m a guru,’ and then on the opposite end of the spectrum, like, ‘I’m a complete newbie.’

And then you start talking to them and the newbie knows way more about like HTTP, for example, than I do and crawling and indexing and whatever, like how it’s perceived externally.

And then you talk to the guru and the guru is like… the questions themselves don’t make sense. Like, you can’t interpret the question that they are asking.”

That part about the questions not making sense describes a disconnect between what SEOs and Googlers believe about SEO. Let’s face it, there’s a disconnect.

The Knowledge And Experience Gap

Sometimes there’s a gap between how SEOs experience the ranking algorithm and how Googlers explain how it works. A classic example is the disconnect between the SEO belief in the concept of domain authority and Google’s denial that any such metric exists. A few years ago, in a Google Search Central Hangout, a person told John Mueller that a core update eliminated the rankings of all of their keywords.

They asked,

“How could it be possible that our authority can drop more than 50% overnight? What actions could we take to increase our authority?”

Mueller answered:

“So in general, Google doesn’t evaluate a site’s authority. So it’s not something where we would give you a score on authority and say this is the general score for authority for your website. So that’s not something that we would be applying here.”

That belief in “domain authority” is one example out of many where what SEOs think they know about Google is completely disconnected from what Googlers know about how search rankings work.

What Do SEO Experts Really Know?

Martin Splitt steered the conversation to proxies for judging the expertise of SEOs, such as how big the sites they manage are, but concluded that those proxy metrics don’t say much about SEO expertise, either. Ultimately, they concluded that they need to engage in a deeper conversation with the search marketing and publishing community to identify whether there’s something Google could do better to explain what SEOs should be doing.

He explained:

“I mean, we try to gauge experience by asking them how many years have you been doing this kind of job and how many years have you been in this industry, and how many impressions do you manage a month, roughly? And these are proxy metrics. And as you say, it’s super subjective.”

He mentions the wide range of complexity of technical issues an SEO needs to understand, and John Mueller adds that even specialists in a specific SEO niche can have gaps in fundamental SEO concepts. The point of the conversation is to ask whether the root of the disconnect lies in Google’s documentation or whether the SEO experts simply don’t know.

John commented:

“It’s like someone could be like super focused on web workers or trying to get them indexed and at the same time, like, ‘How do I block a page from being indexed?’”

Martin agreed, saying:

“Yeah. And that’s probably why it is so subjective. And it’s super interesting, super interesting to see how they’re like, ‘Yeah, we got everything nailed down. We are running a tight ship here.’ And then you see, like some of the stuff that is discussed at large in all of the beginner documentation is being missed.

And that left me with a question. Is it that they are not aware that this documentation exists? Is it that they had a hard time fielding the amount of information we put out there? Or is it that they don’t know?”

Lizzi Sassman then asked:

“Did you get a sense, just in conversation with them, if they knew about the documentation or if there was like sort of a, I don’t know, a feeling or a vibe about like that the translation is bad or something like that.”

Martin answered:

“That’s exactly what I don’t know, because we were so busy during the event fielding all the conversations, like everyone wanted to talk to us. And that’s great. That’s fantastic. That’s why we are doing it.

But it doesn’t really give you the space to reflect on things on the spot. So I reflected, basically, on my flight back home, I was like, ‘Hm. I wonder. Dang. I should have asked these questions.’ But, you know, this means we have to go back and ask them again.”

What Is An SEO Expert?

SEO expertise is subjective. Anyone who insists that SEO is one thing is out of touch with the reality that there is no single definition of SEO. I disagree with many SEOs about what they think is a good practice and with more experience some of them eventually come around to agreeing with me. There are some SEOs whose experience is wildly different than mine and I sit humbly and listen to them as they share what they know over dinner.

Many of us work from home, but we’re all members of the search marketing community, and we should be able to listen to what others say about SEO. That means not only having polite disagreements about the “right way” but also expecting that others will disagree, and not letting it polarize you. Keep an open mind.

Google Speculates If SEO ‘Is On A Dying Path’ via @sejournal, @martinibuster

Google’s latest Search Off the Record podcast discussed whether ‘SEO is on a dying path’ because of AI Search. Their assessment sought to explain that SEO remains unchanged by the introduction of AI Search, revealing a divide between their ‘nothing has changed’ outlook for SEO and the actual experiences of digital marketers and publishers.

Google Speculates If SEO Is On A Dying Path

At a certain point in the podcast they started talking about AI after John Mueller introduced the topic of the impact of AI on SEO.

John asked:

“So do you think AI will replace SEO? Is SEO on a dying path?”

Gary Illyes expressed skepticism, asserting that SEOs have been predicting the decline of SEO for decades.

Gary expressed optimism that SEO is not dead, observing:

“I mean, SEO has been dying since 2001, so I’m not scared for it. Like, I’m not. Yeah. No. I’m pretty sure that, in 2025, the first article that comes out is going to be about how SEO is dying again.”

He’s right. Google began putting the screws to the popular SEO tactics of the day around 2004, gaining momentum in 2005 with things like statistical analysis.

It was a shock to SEOs when reciprocal links stopped working. Some refused to believe Google could suppress those tactics, speculating instead about a ‘Sandbox’ that arbitrarily kept sites from ranking. The point is, speculation has always been the fallback for SEOs who can’t explain what’s happening, fueling the decades-long fear that SEO is dying.

What the Googlers avoided discussing are the thousands of large and small publishers that have been wiped out over the last year.

More on that below.

RAG Is How SEOs Can Approach SEO For AI Search

Google’s Lizzi Sassman then asked how SEO is relevant in 2025, and after some off-topic banter John Mueller raised the topic of RAG, Retrieval Augmented Generation. RAG is a technique that helps keep answers generated by a large language model (LLM) up to date and grounded in facts. The system retrieves information from an external source, like a search index and/or a knowledge graph, and the large language model then generates the answer from what was retrieved, hence the name retrieval augmented generation. The chatbot interface then presents the answer in natural language.
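The retrieve-then-generate flow can be sketched in a few lines. This is a deliberately toy illustration, not how Google or any LLM vendor implements it: the retriever here is simple keyword overlap (real systems use vector search), and the “generator” just quotes its sources instead of calling an LLM.

```python
# Toy sketch of Retrieval Augmented Generation (RAG).
# Hypothetical example: keyword-overlap retrieval plus a placeholder
# generator; production systems use embeddings and a real LLM.

def retrieve(query, documents, top_k=2):
    """Score each document by word overlap with the query."""
    query_words = set(query.lower().split())
    scored = []
    for doc in documents:
        overlap = len(query_words & set(doc.lower().split()))
        scored.append((overlap, doc))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

def generate_answer(query, documents):
    """Ground the 'generated' answer in the retrieved documents."""
    context = retrieve(query, documents)
    if not context:
        return "No grounded answer found."
    # A real LLM would synthesize these; here we simply quote the sources.
    return "Based on retrieved sources: " + " | ".join(context)

docs = [
    "Crawling is how a search engine discovers pages.",
    "Indexing stores page content for later retrieval.",
    "Ranking orders indexed pages by relevance.",
]
print(generate_answer("how does crawling discover pages", docs))
```

The point of the sketch is the separation of concerns: the retrieval step is exactly where crawlable, indexable content matters, which is why Mueller ties RAG back to everyday SEO work.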

When Gary Illyes confessed he didn’t know how to explain it, Googler Martin Splitt stepped in with an analogy of documents (representing the search index or knowledge base), search and retrieval of information from those documents, and an output of that information “out of the bag.”

Martin offered this simplified analogy of RAG:

“Probably nowadays it’s much better and you can just show that, like here, you upload these five documents, and then based on those five documents, you get something out of the bag.”

Lizzi Sassman commented:

“Ah, okay. So this question is about how the thing knows its information and where it goes and gets the information.”

John Mueller picked up this thread of the discussion and started weaving a bigger concept of how RAG ties SEO practices to AI search engines, saying that there is still a crawling, indexing, and ranking part to an AI search engine. He’s right: even an AI search engine like Perplexity AI uses an updated version of Google’s old PageRank algorithm.

Mueller explained:

“I found it useful when talking about things like AI in search results or combined with search results where SEOs, I feel initially, when they think about this topic, think, “Oh, this AI is this big magic box and nobody knows what is happening in there.

And, when you talk about kind of the retrieval augmented part, that’s basically what SEOs work on, like making content that’s crawlable and indexable for Search and that kind of flows into all of these AI overviews.

So I kind of found that angle as being something to show, especially to SEOs who are kind of afraid of AI and all of these things, that actually, these AI-powered search results are often a mix of the existing things that you’re already doing. And it’s not that it suddenly replaces crawling and indexing.”

Mueller is correct that the traditional process of crawling, indexing, and ranking still exists, keeping SEO relevant and necessary for ensuring websites are discoverable and optimized for search engines.

However, the Googlers avoided discussing the obvious situation today, which is the thousands of large and small publishers in the greater web ecosystem that have been wiped out by Google’s AI algorithms on the backend.

The Real Impacts Of AI On Search

What’s changed (and wasn’t addressed) is that the important part of AI in Search isn’t on the front end with AI Overviews. It’s the part on the back end making determinations based on opaque signals of authority and topicality, plus the somewhat ironic situation that an artificial intelligence is deciding whether content is made for search engines or humans.

Organic SERPs Are Explicitly Obsolete

The traditional ten blue links have been implicitly obsolete for about 15 years but AI has made them explicitly obsolete.

Natural Language Search Queries

The shift to search users asking precise conversational questions across several back-and-forth turns is a huge change to search queries. Bing claims that this makes it easier to understand search queries and provide increasingly precise answers. That’s the part that unsettles SEOs and publishers because, let’s face it, a significant amount of content was created to rank in the keyword-based query paradigm, which is gradually disappearing as users shift to more complex queries. How content creators optimize for that is a big concern.

Backend AI Algorithms

The word “capricious” describes a tendency to make sudden and unexplainable changes in behavior. It’s not a quality publishers and SEOs desire in a search engine. Yet capricious back-end algorithms that suddenly throttle traffic and then change their virtual minds months later are a reality.

Is Google Detached From Reality Of The Web Ecosystem?

AI-based algorithms that are still “improving” have unquestionably harmed a considerable segment of the web ecosystem. Immense amounts of traffic to publishers of all sizes have been wiped out since the increased integration of AI into Google’s backend, an issue that the recent Google Search Off The Record avoided discussing.

Many hope Google will address this situation in 2025 with greater nuance than their CEO Sundar Pichai who struggled to articulate how Google supports the web ecosystem, seemingly detached from the plight of thousands of publishers.

Maybe the question isn’t whether SEO is on a dying path but whether publishing itself is in decline because of AI on both the backend and the front end of Google’s search box and Gemini apps.

Check out these related articles:

Google CEO’s 2025 AI Strategy Deemphasizes The Search Box

Google Gemini Deep Research May Erode Website Earnings

Google CEO: Search Will Change Profoundly In 2025

Featured Image by Shutterstock/Shutterstock AI Generator

Google AI Overviews Claims More Pixel Height in SERPs via @sejournal, @martinibuster

New data from BrightEdge reveals that Google’s AI Overviews is increasingly blocking organic search results. If this trend continues, Google AI Overviews and advertisements could cover well over half of the available space in search results.

Organic Results Blocking Creeping Up

Google’s AI Overviews feature, launched in May 2024, has been a controversial feature among publishers and SEOs since day one. Many publishers resent that Google is using their content to create answers in the search results that discourage users from clicking through and reading more, thereby negatively influencing earnings.

Many publishers, including big brand publishers, have shut down from a combination of declining traffic from Google and algorithmic suppression of rankings. AI Overviews only added to publisher woes and has caused Google to become increasingly unpopular with publishers.

Google AIO Taking Over Entire Screen

BrightEdge’s research shows that AI Overviews started out in May 2024 taking up to 600 pixels of screen space, crowding out the organic search results, formerly known as the ten blue links. When advertising is factored in there isn’t much space left over for links to publisher sites.

By the end of summer the amount of space taken over by Google’s AIO increased to 800 pixels and continued to climb. At this pace BrightEdge predicts that Google could eventually reach 1,000 pixels of screen space. To put that in perspective, 600 pixels is considered “above the fold,” what users typically see without scrolling.

Graph Showing Growth Of AIO Pixel Height

Percentage Of Queries Showing AIOs

The percentage of queries that display Google’s AI Overviews has also been creeping up. Health-related search queries have been trending higher than any other niche. B2B technology, ecommerce, and finance queries are also increasingly showing AI Overview search results.

Healthcare search queries initially triggered AIO around 70% of the time. Health-related queries now trigger AIO over 80% of the time.

B2B technology queries started out in May 2024 showing AIO results about 30% of the time. Now those same queries trigger AIO results almost 50% of the time.

Finance queries that trigger AI Overviews have grown from around 5% to 20% of the time. BrightEdge data shows that Google AIO coverage is trending upwards and is predicted to cover an increasing amount of search queries across other topics, specifically in travel, restaurants, and entertainment.

BrightEdge’s data shows:

“Finance shows most dramatic trajectory: starting at just 5.3% but projected to reach 15-20% by June 2025

- Healthcare led (67.5% in June)
- B2B Tech: 33.2% → 38.4%, projected 45-50%
- eCommerce: 26.9% → 35.1%, projected 40-45%
- Emerging sectors showing dramatic growth:
  - Entertainment (shows, events, venues): 0.3% → 5.2%
  - Travel (destinations, lodging, activities): 0.1% → 4.1%
  - Restaurants (dining, menus, reservations): ~0% → 6.0%”

BrightEdge explains that restaurant search query coverage started out small, focusing on long-tail search queries like “restaurants with vegetarian food for groups,” but is now rolling out in higher amounts, suggesting that Google is feeling more comfortable with its AIO results and is expected to roll them out across more search queries in 2025.

They explain:

“AIO’s evolved from basic definitions to understanding complex needs combining multiple requirements (location + features + context)

In 2025, expect AIO’s to handle even more sophisticated queries as they shift from informational to actionable responses.

-Healthcare stable at 65-70%
-B2B Tech/eCommerce will reach 40-50%
-Finance sector will surge from 5.3% to 25%
-Emerging sectors could see a 50-100x growth potential
-AIOs will evolve from informational to actionable (reservations, bookings, purchases)
-Feature complexity: 2.5x current levels”

The Takeaway

I asked BrightEdge for a comment about what they feel publishers should get ahead of for 2025.

Jim Yu, CEO of BrightEdge, responded:

“Publishers will need to adapt to the complexity of content creation and optimization while leaning into core technical SEO to guarantee their sites are seen and valued as authoritative sources.

Citations are a new form of ranking. As search and AI continue to converge, brands need to send the right signals to search and AI engines to help them decide if the content is helpful, unique, and informative. In a multi-modal world, this means schema tags about a publisher’s company, products, images, videos, overall site and content structure, reviews, and more!

In 2025, content, site structure, and authority will matter more than ever, and SEO has a huge role to play in that.

Key Questions marketers need to address in 2025

  • Is your content ready for 4-5 layered intents?
  • Can you match Google’s growing complexity?
  • Have you mapped your industry’s intent combinations?

Key Actions for 2025

The Pattern is clear: Simple answers → rich, context-aware responses!

  • Intent Monitoring: See which intents AIO’s are serving for your space
  • Query Evolution: Identify what new keyword patterns are emerging that AIO’s are serving
  • Citation Structure: Align content structure to intents and queries AIO’s are focused on to ensure you are cited
  • Competitive Intelligence: Track which competitor content AIOs select and why

AIOs aren’t just displaying content differently – they’re fundamentally changing how users find and interact with information.
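The schema tags Yu refers to are typically published as JSON-LD structured data embedded in the page. Here is a minimal sketch; the publisher and article details are hypothetical placeholders, and real markup should follow the schema.org vocabulary for the content type in question.

```python
import json

# Minimal sketch of JSON-LD schema markup for an article.
# All names, dates, and organization details below are hypothetical.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "author": {"@type": "Organization", "name": "Example Publisher"},
    "datePublished": "2025-01-15",
}

# Serialized, this JSON goes inside a <script type="application/ld+json">
# tag in the page's HTML so crawlers can parse it.
snippet = json.dumps(article_schema, indent=2)
print(snippet)
```

Equivalent markup exists for products, reviews, videos, and organizations, which is the “multi-modal” breadth Yu alludes to.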

The takeaway from the data is that publishers are encouraged to create unambiguous content that directly addresses topics in order to rank for complex search queries. A careful eye on how AI Overviews are displayed, and on what kinds of content are cited and linked to, is also encouraged.

Google’s CEO, Sundar Pichai, recently emphasized increasing the amount of coverage that AI assistants like Gemini handle, which implies that Google’s focus on AI, if successful, may begin to eat into the amount of traffic from the traditional search box. That’s a trend to be on the watch for and a wakeup call to get on top of creating content that resonates with today’s AI Search.

The source of AIO data is from the proprietary BrightEdge Generative Parser™ and DataCubeX, which regularly informs the BrightEdge guide to AIO.

How Long Should An SEO Migration Take? [Study Updated] via @sejournal, @TaylorDanRW

Website migrations, specifically domain migrations, are often seen as one of the more complex parts of SEO.

They are becoming increasingly common as businesses consolidate websites and assets to reduce costs and streamline efforts as more channels and platforms come into play.

As SEO professionals, we are tasked with mitigating as many risks and variables as possible so that the business doesn’t see organic performance issues – either at all or for a longer than necessary period of time.

When we ran this study in 2023, we looked at 171 migrations and found that it took 229 days (on average) for third-party tools to reflect organic traffic for the new domain to return to the same pre-migration levels of the original domain; 42% didn’t return at all.

The reason we’ve repeated this study is that we think it’s important that businesses (and SEO marketers) have data to work off to make informed decisions when planning domain migrations.

Over the years, I’ve been in a number of pitch meetings where the other agencies pitching have promised no traffic loss at all during migration, and more often than not, no adequate preparation work, monitoring, or expectation setting has been done.

Study Methodology

This study aims to research and provide a data-led answer to “How long should an SEO migration take,” to help both in-house SEOs and consultants provide better estimations and communications with non-SEO stakeholders of their migration projects.

This is building off of last year’s study in which we looked at 171 domain migrations. This year, we’ve expanded the dataset to 892, thanks to fellow SEO professionals responding to information requests on various Slack channels and X (Twitter).

Using third-party tools, we then measured the number of days it took Domain B (the new domain) to achieve the same estimated organic traffic volume as Domain A (the old domain).

Data was collated on October 22nd, 2024.

Bias Factors

Bias in a quantitative study refers to systematic errors that can significantly skew the results, leading to incorrect conclusions.

These biases can arise at any stage of the research process, such as in the design, data collection, analysis, and interpretation stages.

Bias undermines the validity and reliability of the study’s results and can lead to misleading conclusions.

Where possible, we have taken steps to mitigate any bias from impacting the study.

  • Selection Bias: We have worked to eliminate this as much as possible, with a high percentage of the data coming from Ahrefs (unfiltered) and an open request to the SEO community. This should negate our own inputs from sectors we specialize in. This has led to a variety of domains in a variety of sectors.
  • Measurement Bias: As we’re using a third-party tool, the bias here is limited to the scope of the tool, database updates, and the keyword set. As we’re comparing two domains and making the assumption that they match in terms of the target keyword set, this bias should be mitigated.
  • Confounding Bias: As we’re comparing date periods within the same tool, no correlations are being made in terms of data analysis.
  • Publication Bias: This study was going to be published and submitted regardless of percentage/data findings. So, this bias is mitigated.

The data set contained domains of varying usage, from lead generation in SaaS, legal, and finance to blogs, local retail, and ecommerce.

Study Findings

The key takeaways from the domain study are:

  • On average, it took 523 days for Domain B to show the same level of organic traffic as Domain A.
  • The shortest times recorded were 19, 22, 23, and 33 days.
  • 17% of domain migrations in the sample didn’t see organic traffic return to the same levels after 1,000 days. This is a significant improvement versus the 42% from the previous study.
  • We classified three migrations as “In progress” as they were less than two years old, and traffic was returning slowly.
  • We classified 25 migrations (2.8%) as “Inconclusive” as Domain B traffic had hit the levels of Domain A, but wasn’t stable.

From the original data set, a number of domains dropped and are now redirected to domain squatters/private domain sellers.

As these aren’t “genuine” domain migrations, and the new domain was never intended to maintain the same keywords and traffic, these have been discounted and not included in the data.

Why Do Migration Results Differ?

No two websites are the same, and there are several variables in a website migration we can control – and several we can’t.

The discourse around migrations in the SEO industry hasn’t really changed for a number of years: basic best practices are established, then layered with situation-dependent measures to mitigate risks.

Google rebuilds its index on a page-by-page basis, so opening up new crawl paths and URLs ahead of time could speed up the initial Discover and Crawl phases.

From experience, launching the new domain and URL structure 24-48 hours ahead of performing the migration, i.e., implementing redirects, can help speed up the process because Google has already begun to crawl and process the new URL paths in the majority of cases. This, coupled with the Change of Address tool in Google Search Console, can smooth a lot of early migration lag.
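A pre-launch sanity check on the redirect map is a common way to catch problems before flipping the switch. The sketch below uses entirely hypothetical URLs; a real audit would also issue HTTP requests and confirm each old URL returns a single 301 hop to the new one.

```python
# Hypothetical pre-migration redirect-map audit.
# Checks for redirect chains and for redirects pointing at URLs
# that don't exist on the new site.

redirect_map = {
    "https://old.example.com/pricing": "https://new.example.com/pricing",
    "https://old.example.com/blog/seo": "https://new.example.com/blog/seo",
    "https://old.example.com/contact": "https://new.example.com/contact-us",
}

def find_redirect_chains(mapping):
    """Flag targets that are themselves redirected (chains slow crawling)."""
    return [src for src, dst in mapping.items() if dst in mapping]

def find_missing_targets(mapping, launched_urls):
    """Flag redirects pointing at URLs not live on the new domain."""
    return [src for src, dst in mapping.items() if dst not in launched_urls]

launched = {
    "https://new.example.com/pricing",
    "https://new.example.com/blog/seo",
    "https://new.example.com/contact-us",
}
print("chains:", find_redirect_chains(redirect_map))
print("missing:", find_missing_targets(redirect_map, launched))
```

Catching chains and dead targets before launch means Googlebot spends its early crawl budget discovering the new URLs rather than resolving redirect errors.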

Backlink Profiles & Migrations

While crowd-sourcing domains for this study, I also asked the community why there are “time lags” in migrations.

Natalia Witczyk proposed the idea that it’s related to backlink profiles and how long it takes Google to process profile transference:

“From my experience, for the rankings and traffic to be back to normal levels, the backlink profile has to be recrawled and the redirects have to be reflected.

That takes close to no time if the backlink profile is non-existent, so the return to normal traffic levels happens fast. If the backlink profile is extensive, there’s more to recrawl and that would take Google more time.”

This prompted me to look at the total number of referring domains each domain had, and there is some correlation to this being the case, but with a large number of outliers – likely due to how the migration was carried out.



Featured Image: ParinPix/Shutterstock

Google CEO Describes A 2025 Beyond A Search Box via @sejournal, @martinibuster

Google’s Sundar Pichai outlined the 2025 strategy, emphasizing consumer-focused AI, rapid development of agentic apps, a Chrome AI prototype called Project Mariner, and upgrades to Gemini and Project Astra, signaling a shift toward AI apps as the user interface for search.

Although Pichai did not say Google is de-emphasizing the Google Search box, he did emphasize that in 2025 AI apps will increasingly be the main point of contact between users and Google.

For example, Project Mariner is a Chrome AI extension that can do things like take a top ten restaurants list from TripAdvisor and drop it into Google Maps.

This focus on AI shows that Google is transitioning toward AI-based user experiences that represent a larger interpretation of what search means, a search experience that goes far beyond textual question answering.

Google’s Future Hinges On AI

Google CEO Sundar Pichai outlined a vision for 2025 that emphasizes an urgency to go back to the company’s roots as one that innovates quickly, what Pichai referred to as being “scrappy,” meaning tough and resourceful, able to accomplish a lot in a short amount of time (and with fewer resources). Most importantly, he emphasized solving real-world problems.

He also prioritizes “building big, new business” which could mean creating new business opportunities with AI, reflecting a strong focus on AI as the engine for innovation in 2025.

Gemini App

Pichai also cited Gemini App as a central focus for 2025, commenting that they’re experiencing growth with Gemini and that scaling broader adoption of Gemini will be a focus in 2025. This aligns with the observation that Google is increasingly focusing on a Search-adjacent approach to consumer focused AI products and services.

What this means for SEO is that we really need to start thinking in terms of a bigger picture of what search means. Perhaps 2025 will be the year, after more than 15 years of Google’s departure from the ten-blue-links paradigm, when the SEO community thinks more deeply about what search means when it’s multimodal.

Pichai was quoted as saying:

“With the Gemini app, there is strong momentum, particularly over the last few months… But we have some work to do in 2025 to close the gap and establish a leadership position there as well. …Scaling Gemini on the consumer side will be our biggest focus next year.”

AI Products Will “Evolve Massively”

The co-founder of Google DeepMind was quoted as saying that Google was going to “turbo charge” the Gemini app, saying that:

“…the products themselves are going to evolve massively over the next year or two.”

That means that the Gemini app is going to gain more functionalities in a bid to make it more ubiquitous as the interface between potential website visitors and Google Search, a significant departure from interfacing with the search box.

This is something that publishers and SEOs need to think really hard about as we enter 2025. Google is focusing on increasing user adoption of the Gemini app. If that happens then that will mean more people interfacing with that instead of the Google Search box.

Universal Assistant (Project Astra)

Another thing the SEO industry seriously needs to consider is Google’s universal assistant, code-named Project Astra. The DeepMind co-founder is reported to have discussed their Universal Assistant, which is how Project Astra is referred to.

Screenshot of DeepMind Project Astra web page showing how it is referred to as a Universal AI Assistant.

He’s quoted as saying that it can:

“…seamlessly operate over any domain, any modality or any device.”

The word “domain” there means that it can function across any subject, answering questions about healthcare, directions, entertainment, or any other topic. The part about modality is a reference to text, voice, images, and video.

This is a serious situation for SEO. Google’s new Deep Research agentic search is an example of a disruptive technology that may have a negative impact on the web ecosystem.

One of the Google Deep Mind researchers cited as working on Project Astra is also listed as a co-inventor on a patent about controlling interactive AI agents through multi-modal inputs.

The patent is titled, Controlling Interactive Agents Using Multi-Modal Inputs. The description of the invention is:

“Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for controlling agents. In particular, an interactive agent can be controlled based on multi-modal inputs that include both an observation image and a natural language text sequence.”

That’s just one of dozens of researchers cited as having worked on Project Astra. Astra is another one of the projects Google is working on that could replace the traditional search box as people’s point of contact for interacting with web data.

Takeaway About Google’s Plans For 2025

The takeaway from all this is that publishers and SEOs need to take a break from focusing solely on the search box and give some time to considering what’s going on in multimodal AI. In 2025, AI is not just AI Overviews. AI is Gemini, the new features coming to Gemini, and possibly features developed from Project Astra, a multimodal universal agent. Agentic search is already here in the form of Gemini Deep Research. All of these are a departure from the traditional search box as the point of contact between users, Google, and websites.

Read the report on CNBC

Google CEO Pichai tells employees to gear up for big 2025: ‘The stakes are high’

Google Gemini Deep Research May Erode Website Earnings via @sejournal, @martinibuster

There’s a compelling theory floating around that Google’s AI agent, called Deep Research, could negatively impact affiliate sites. If true, not only would this impact affiliate site earnings, it could also decrease ad revenues and web traffic to informational sites, including the “lucky” sites that are linked to by Google’s AI research assistant.

Gemini Deep Research

Gemini Deep Research is a new tool available to premium subscribers of Gemini Advanced. Deep Research takes a user’s query, researches an answer on the web, and then generates a report. The research can be further refined to produce increasingly precise results.

Google rolled out Deep Research on December 11th, describing it as a time-saver that creates a research plan and, once approved, carries out the research.

Google explains:

“Deep Research uses AI to explore complex topics on your behalf and provide you with findings in a comprehensive, easy-to-read report, and is a first look at how Gemini is getting even better at tackling complex tasks to save you time.

Under your supervision, Deep Research does the hard work for you. After you enter your question, it creates a multi-step research plan for you to either revise or approve. Once you approve, it begins deeply analyzing relevant information from across the web on your behalf.”

Deep Research presents a report that features a summary and recommendations. If a user is researching a product, it summarizes the pros and cons with enough detail that the user won’t need to click through to a review site; they can go directly to a retailer and purchase the product. That eliminates the chance of a visitor clicking an affiliate link on a review website, depriving that informational site of revenue.

According to an article by Marie Haynes on YouKnowAI, the thoroughness of the summary generated by Gemini Deep Research negates the need to visit a website, thereby depriving the site of affiliate link revenue.

YouKnowAI explains:

“…perhaps sites like foodnetwork.com will get clicks and subsequent affiliate sales. I’ve found in my own research so far that I’m not clicking on sites as I get what I need to know from the research and then go to official sites or perhaps Amazon, or stores near me to purchase.

…The obvious question here is what happens when sites like foodnetwork.com and seriouseats.com see a reduction in traffic? “

If it’s true that Gemini Deep Research users won’t need to visit sites to make up their minds then it’s possible that this new tool will also negatively affect web traffic and advertising revenue.

Is Google Out Of Touch With The Web Ecosystem?

In a recent interview, Google’s CEO, Sundar Pichai, insisted that Google cares about the web ecosystem. When asked how Google supports the web ecosystem he struggled to articulate an answer. After a long series of uhms and false starts he started talking about how Google’s own YouTube platform enables multinational media corporations to monetize their intellectual properties on YouTube.

He avoids mentioning websites, speaking in the abstract about the “ecosystem,” and when he runs out of things to say he changes course and begins speaking about how Google compensates copyright holders who sign up for YouTube’s Content ID program.

He answered:

“Look I… uh… It’s a… very important question… uhm… look I… I… think… I think more than any other company… look you know… we for a long time through… you know… be it in search making sure… while it’s often debated, we spend a lot of time thinking about the traffic we send to the ecosystem.

Even through the moment through the transition over the past couple of years. It’s an important priority for us.”

This Is Why Google CEO’s Explanation Falls Short

1. YouTube is not the web ecosystem, it’s Google’s own platform.

2. Multinational mega corporations are not web creators.

Pichai’s answer sent the unintended message that Google is out of touch with web creators. If the author of the article about Gemini’s Deep Research tool is correct, it is further proof that Google continues to focus on providing information to users at the expense of creators.

Is Gemini Deep Research Harvesting Data Without Giving Back?

There’s an old television episode of The Twilight Zone called To Serve Man that relates the story of a seemingly benevolent race of aliens who bring advanced technologies that allow humans to live in peace, with food security and prosperity for everyone. As evidence of their good intentions they give the world a book written in an alien language that’s titled To Serve Man. The episode ends when government cryptographers translate the book and discover that it’s a cookbook and that the aliens’ true intention is to farm humans as a food source.

Google’s mission statement promising “to organize the world’s information and make it universally accessible and useful” also seems like proof of their good intentions. However, the mission statement doesn’t explicitly say that Google will refer users to the sources of information. It only promises to organize and provide the information itself in a way that’s accessible and useful. While referring users to the creators of the information could be a part of making information accessible and useful, it’s not explicitly stated; it’s not even implied in the mission statement.

Is Google Gemini Deep Research further proof that Google is harvesting websites as an information source?

If you’re a creator, does it make you feel farmed?

Featured Image by Shutterstock/Nomad_Soul

Google Explains How CDNs Impact Crawling & SEO via @sejournal, @martinibuster

Google published an explainer that discusses how Content Delivery Networks (CDNs) influence search crawling and improve SEO but also how they can sometimes cause problems.

What Is A CDN?

A Content Delivery Network (CDN) is a service that caches a web page and serves it from the data center that’s closest to the browser requesting it. Caching a web page means that the CDN creates a copy of the page and stores it. This speeds up web page delivery because the page is served from a server closer to the site visitor, requiring fewer “hops” across the Internet from the origin server to the destination (the visitor’s browser).
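The caching behavior described above can be sketched in a few lines of Python. This is a toy model under assumed latency numbers, not how any particular CDN is implemented:

```python
# A hypothetical sketch of an edge cache; the latency numbers and function
# names are illustrative assumptions, not any CDN vendor's implementation.
ORIGIN_LATENCY_MS = 180  # assumed cost of a full round trip to the origin
EDGE_LATENCY_MS = 25     # assumed cost of serving from a nearby edge node

edge_cache = {}  # maps URL -> cached page body

def fetch_from_origin(url):
    # Stand-in for the real request back to the publisher's server.
    return f"<html>page body for {url}</html>"

def serve(url):
    """Serve a URL, filling the edge cache on the first ("cold") request."""
    if url in edge_cache:
        return edge_cache[url], EDGE_LATENCY_MS  # cache hit: short hop
    body = fetch_from_origin(url)                # cache miss: full trip
    edge_cache[url] = body                       # warm the cache
    return body, ORIGIN_LATENCY_MS

# The first request pays the origin round trip; repeats are served locally.
_, first_ms = serve("/widgets")
_, second_ms = serve("/widgets")
```

This also illustrates the “cold cache” point: every URL pays the origin-server cost at least once before the cache can help.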

CDNs Unlock More Crawling

One of the benefits of using a CDN is that Google automatically increases the crawl rate when it detects that web pages are being served from one. This makes a CDN attractive to SEOs and publishers who want to increase the number of pages crawled by Googlebot.

Normally Googlebot reduces its rate of crawling, called throttling, if it detects that it’s reaching a threshold that causes the server to slow down. That throttling threshold is higher when a CDN is detected, resulting in more pages crawled.

Something to understand about serving pages from a CDN is that the first time a page is requested, it must be served directly from your server. Google uses the example of a site with over a million web pages:

“However, on the first access of a URL the CDN’s cache is “cold”, meaning that since no one has requested that URL yet, its contents weren’t cached by the CDN yet, so your origin server will still need serve that URL at least once to “warm up” the CDN’s cache. This is very similar to how HTTP caching works, too.

In short, even if your webshop is backed by a CDN, your server will need to serve those 1,000,007 URLs at least once. Only after that initial serve can your CDN help you with its caches. That’s a significant burden on your “crawl budget” and the crawl rate will likely be high for a few days; keep that in mind if you’re planning to launch many URLs at once.”

When Using A CDN Backfires For Crawling

Google advises that there are times when a CDN may put Googlebot on a blacklist and subsequently block crawling. Google describes two kinds of blocks:

1. Hard blocks

2. Soft blocks

Hard blocks happen when a CDN responds with a server error. A 500 (internal server error) response signals a major problem with the server; a 502 (bad gateway) is another bad server error response. Both will trigger Googlebot to slow down its crawl rate. Indexed URLs are saved internally at Google, but continued 500/502 responses can eventually cause Google to drop the URLs from the search index.

The preferred response is a 503 (service unavailable), which indicates a temporary error.

Another hard block to watch out for is what Google calls “random errors,” which is when a server sends a 200 response code, meaning the response was successful, even though it’s actually serving an error page. Google will interpret those error pages as duplicates and drop them from the search index. This is a big problem because it can take time to recover from this kind of error.

A soft block can happen if the CDN shows one of those “Are you human?” pop-ups (bot interstitials) to Googlebot. Bot interstitials should send a 503 server response so that Google knows that this is a temporary issue.

Google’s new documentation explains:

“…when the interstitial shows up, that’s all they see, not your awesome site. In case of these bot-verification interstitials, we strongly recommend sending a clear signal in the form of a 503 HTTP status code to automated clients like crawlers that the content is temporarily unavailable. This will ensure that the content is not removed from Google’s index automatically.”
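As a rough sketch of the status-code logic Google describes, the key decision is to pair any bot-verification interstitial with a 503 rather than a 200. The function and flag names below are illustrative assumptions, not a real WAF API:

```python
# Hedged sketch: serve bot-verification interstitials with a 503 so crawlers
# treat the page as temporarily unavailable. Names here are hypothetical.

def response_for(is_verification_challenge):
    """Return (status_code, body) for an incoming request."""
    if is_verification_challenge:
        # A 200 here would let Google index the interstitial in place of the
        # real page (or treat it as a duplicate "random error"); 503 signals
        # "temporarily unavailable, retry later" and keeps the indexed copy.
        return 503, "Verification required, please retry shortly."
    return 200, "<html>the actual page content</html>"

status, _ = response_for(is_verification_challenge=True)
# A 503 keeps the URL in the index; persistent 500/502 responses can
# eventually get it dropped.
```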

Debug Issues With URL Inspection Tool And WAF Controls

Google recommends using the URL Inspection Tool in Search Console to see how the CDN is serving your web pages. If the CDN’s firewall, called a Web Application Firewall (WAF), is blocking Googlebot by IP address, you should be able to pull the blocked IP addresses and compare them against Google’s official list of crawler IPs to see if any of them appear on it.

Google offers the following CDN-level debugging advice:

“If you need your site to show up in search engines, we strongly recommend checking whether the crawlers you care about can access your site. Remember that the IPs may end up on a blocklist automatically, without you knowing, so checking in on the blocklists every now and then is a good idea for your site’s success in search and beyond. If the blocklist is very long (not unlike this blog post), try to look for just the first few segments of the IP ranges, for example, instead of looking for 192.168.0.101 you can just look for 192.168.”
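Google’s prefix-matching tip can be sketched with Python’s standard `ipaddress` module. The CIDR ranges and blocklist entries below are made-up examples (Google publishes the real crawler ranges in its Search Central documentation):

```python
# Hedged sketch of the blocklist check Google suggests: rather than scanning
# a long WAF blocklist for exact IPs, test each blocked address against the
# published Googlebot CIDR ranges. Sample data below is illustrative only.
import ipaddress

# Assumed sample of Google's published crawler CIDR ranges.
googlebot_ranges = ["66.249.64.0/19", "192.178.4.0/22"]

# Assumed entries pulled from a CDN/WAF blocklist.
waf_blocklist = ["66.249.66.1", "203.0.113.9"]

networks = [ipaddress.ip_network(r) for r in googlebot_ranges]

blocked_googlebot_ips = [
    ip for ip in waf_blocklist
    if any(ipaddress.ip_address(ip) in net for net in networks)
]
# blocked_googlebot_ips now lists addresses the WAF is blocking that fall
# inside a Googlebot range -- each one is a crawl problem to fix.
```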

Read Google’s documentation for more information:

Crawling December: CDNs and crawling

Featured Image by Shutterstock/JHVEPhoto

Google Says Temporal Anomalies Affect Googlebot Crawl via @sejournal, @martinibuster

Google’s John Mueller answered a question on Reddit about Googlebot crawling and the completeness of the snapshot. The person asking the question received a response that touched on edge cases and temporal anomalies in crawling.

A Googlebot “screenshot” refers to a representation of what a web page looks like to Googlebot.

What a web page looks like depends on how Googlebot renders the page after executing JavaScript, loading CSS, and downloading necessary images.

Google Search Console’s URL inspection tool gives an idea of what a web page looks like to Google. This tool helps publishers and SEOs understand how Google “sees” a web page.

Question About Knowing What Googlebot “Sees”

The person asking the question was talking about Googlebot screenshots. What they apparently meant was the rendered page as Googlebot itself sees it.

This is the question the Redditor asked:

“Is the Googlebot screenshot a complete picture of what Google can see?”

They later clarified with the following answers to questions:

“How can I know what google see in my article? …I want to know what Googlebot see in my website.”

Is Googlebot Screenshot A Complete Picture?

Returning to the original question of whether the “Googlebot screenshot a complete picture of what Google can see,” Google’s John Mueller offered the following answer.

“For the most part, yes. But there are some edge cases and temporal anomalies. Tell us more about what you’re trying to check.”

Mueller’s response acknowledges that, for the most part, the Googlebot screenshot represents what Google sees when it crawls a page.

Temporal Anomalies In Googlebot Screenshot

The person asking the question referred to a Googlebot screenshot as what Googlebot “sees” when it visits a web page. That also seems to be the context of Mueller’s answer.

Mueller’s answer referred to temporal anomalies, which could be a reference to temporary issues at the time the web page was crawled that affected which resources were downloaded and, consequently, how the page looked to Googlebot in that moment.

Google Search Console’s URL Inspection Tool also provides a snapshot that shows a live preview of how a web page appears to Google. It’s a good way to check if everything is rendered by Google the way it’s supposed to look.

Read the discussion on Reddit:

Is the Googlebot screenshot a complete picture of what Google can see?

Featured Image by Shutterstock/Sammby

Ecommerce SEO Pro on AI, 2025 Tactics

Jeff Oxford’s initial attempt at ecommerce was selling dropshipped beer pong tables in 2013. The business, he says, didn’t survive, but his love for optimizing organic search traffic did. Thus began his SEO career and the launch of 180 Marketing, his agency.

Fast forward to 2024, and Jeff is an ecommerce SEO authority. Link building was on his mind when he appeared on the podcast in 2022. He now advises engagement — getting folks to click an organic search listing and consume the page’s content — and reminds merchants that AI search has helped ecommerce rankings.

He and I discussed those tactics and more in our recent conversation. Our entire audio is embedded below. The transcript is edited for clarity and length.

Eric Bandholz: Give us a quick rundown of who you are.

Jeff Oxford: I’ve been an ecommerce SEO nerd for 13 years. I got into it by trying to start my own dropshipping sites, like selling beer pong tables or 3D printers, but things didn’t go as planned. That’s when I fell in love with search engine optimization, particularly for ecommerce websites, and I’ve stuck with it. SEO can be complicated, but at its core, it’s about a few key activities to help sites rank better.

If you’re looking to rank well in 2025, user engagement is crucial. Google observes whether visitors click on your site, stay, or quickly hit the back button. User engagement is a strong signal for rankings. You could have a perfectly optimized page, but you won’t rank well if people aren’t staying.

Google’s data leak earlier this year confirmed what many SEOs suspected — user engagement plays a significant role in rankings. SEO practitioners theorize that Google collects data from Android devices and Chrome to assess how long people stay on a site, influencing rankings.

Bandholz: How do you protect against bots that attempt to manipulate engagement data by visiting and bouncing off competitor sites?

Oxford: The click-through rate on search result pages is a major ranking factor. More clicks should increase rankings. Rand Fishkin, a prominent SEO expert, demonstrated this with live experiments in which people clicked on a site and boosted its ranking from 5th to 1st or 2nd in real time.

But manipulating the click-through rate via bots isn’t that easy. Google has advanced technology to detect fake clicks, primarily to protect its ad system. Google monitors patterns, IPs, and behaviors. So, if you tried to flood your site with bot traffic, Google would likely detect it. Some people use platforms such as Mechanical Turk to pay for manual clicks, but making the pattern look natural is hard. Spikes in traffic are red flags, and Google uses pattern recognition to detect anomalies.

Bandholz: With AI evolving, how can companies detect if competitors are using malicious AI tactics?

Oxford: Negative SEO attacks were a more prominent issue before 2014. Back then, bad backlinks could penalize a site, and competitors could build spammy links to hurt you.

However, in 2014 Google changed its approach. Instead of penalizing sites for bad links, it devalues or ignores them. The same applies to click manipulation. Google can’t always tell if the click manipulation is from the site owner or a competitor. It’s more about neutralizing the effects rather than handing out penalties.

Bandholz: What are some tactics for increasing click-throughs and dwell time?

Oxford: Dwell time is easier to control since it depends on the user experience. To improve it, focus on fast page loads and usability. Basic conversion rate optimization techniques, such as A/B and usability testing, can help keep visitors engaged.

To increase branded traffic, though, it’s more about good digital marketing than manipulation. If you’re running strong branding and marketing campaigns, your branded search traffic will increase naturally. You might run Facebook ads, sponsor events, or create content. Branded search is a symptom of good digital marketing, not the direct goal.

Bandholz: Is there still an opportunity for bottom-of-funnel SEO strategies to capture sales for people searching for products?

Oxford: Absolutely. Backlinks still work surprisingly well despite many thinking they’re dead. Most of the top-ranking sites still have backlinks. For ecommerce sites, getting backlinks to category pages can help with rankings. Google confirmed this in its algorithm leak — backlinks remain a confirmed ranking factor.

It’s harder to get backlinks today than it was 10 years ago. Back then, we could send bulk emails to journalists and get backlinks quickly. But now, bloggers know their value. They’re more likely to ask for money in exchange for backlinks. However, ecommerce sites can still build backlinks through product reviews. You can contact bloggers, offer free products for honest reviews, and earn backlinks.

Bandholz: What else should entrepreneurs and operators consider when improving their SEO?

Oxford: Besides engagement metrics and link building, optimized content is essential. Many ecommerce sites overlook category pages and have only a product grid and a heading. Adding 200 to 300 words of relevant content that discusses product benefits, the target audience, and usage tips can help these pages rank better. You don’t want fluffy content. Instead, focus on helpful information for potential buyers.

Bandholz: How much should brands budget for optimal SEO?

Oxford: It depends on your niche. In a competitive space like weight loss supplements or CBD, you’re looking at tens of thousands of dollars a month. For less competitive niches, $2,000 to $3,000 might suffice. For ecommerce sites with seven or eight figures in annual revenue, a budget of $3,000 to $8,000 per month is typical. Most of the budget will go toward link building, content creation, and technical SEO.

Bandholz: What other trends do you see in the ecommerce SEO space?

Oxford: Many merchants panicked over artificial intelligence, worrying it might take market share from Google or hurt SEO. But AI has been beneficial for ecommerce sites. Google recently cracked down on low-quality affiliate blogs trying to rank for “best of” keywords, wiping out around 90% of these sites. Now, ecommerce sites are benefiting as they’re absorbing a lot of this traffic.

Google is prioritizing brands over generic affiliate sites. Ecommerce brands with physical locations, verified reviews, and a trustworthy presence get favored in rankings. So, if you have an ecommerce site, blogging will be more impactful now than ever.

Bandholz: Where can people learn more?

Oxford: Our website is 180marketing.com. I’m on LinkedIn.

Google’s December 2024 Updates: Final Results via @sejournal, @martinibuster

Google launched two updates in December that rolled out in succession like waves, carrying happy surfers to the beach and a tsunami to the unlucky few. Many SEOs who braced for SERP drama were instead pleasantly surprised.

Facebook Groups: The Update Was Good!

A common theme in Facebook and forum posts is that the Google update had a positive effect: many sites hit by previous updates were returning to life all by themselves. One person reported that a number of their dormant affiliate sites suddenly awakened and were attracting traffic. Then there were some random SEOs taking credit for de-ranked client sites returning to the SERPs, leading me to wonder if they also take credit for their clients losing rankings in the first place…

Black Hat Response To Google’s Update

The forums on Black Hat World (BHW) generally provide an indicator of how much hurt Google’s updates are handing out. Some members of the BHW forum were posting about how their sites were coming back.

One member posted:

“This new update is the first update in past years for my websites to grow 🙂 For now, traffic has increased 100%+ and keeps going up…”

Another one echoed that post:

“For me it is going good for 2 days. Hope not to be bad…”

A third member shared:

“My website ranking returned today after 9 months from March update. So seams the keywords and links showing again for first time since. I was affected by the march and thought it wouldn’t recover ever again hopefully this December update once finished rolling out my site won’t be removed again”

And yet another one shared a similar experience:

“My dead website is starting to pick up keywords on ahrefs, increasing impressions on search console, but still no major change regarding traffic.”

There was one outlier who shared that their financial site lost rankings likely due to having been built on expired domains (though it’s easy to imagine other reasons).

Among the celebratory sharing was this outlier post from someone whose glass is half empty:

“Lots of sites got tanked including 3 of mine.”

Overall there was a positive tone to the black hat forum members sharing their actual experiences with Google’s December updates. That fits the pattern of what was being shared in Facebook groups, that the December updates were bringing some sites back from whatever algorithm hits they suffered in previous updates.

Then There Is The X Response.

On X, every Google update announcement is met by a large number of comments about how big brands are the winners, spam sites are dominating the SERPs (which contradicts the first complaint), and Google is destroying small businesses. These kinds of comments have been around for decades, from before X/Twitter. The defining characteristic of these remarks is that they are general in nature and don’t reflect anything specific about the update; they’re just venting about Google.

The response to Google’s update announcement on X was predictably negative. I’m not being snarky when I say the responses are predictably negative, that’s literally the consistent tone for every update announcement that Google makes on X.

Representative posts:

Google is destroying small businesses:

“…How is it possible every update since September 23 has destroyed the traffic of small/medium publishers while boosting corp websites that put out top 10s trash content.”

Google updates consistently benefit Reddit:

“Just call it what it is: The Gift More Traffic to Reddit Update #94”

And the Google destroyed my business post:

“I used to work independently, writing blogs and earning a decent income. I even believed I would never need a traditional job in my life. But thanks to Google, which wiped out my websites one after another, I’m now left with no income, no motivation and no job. Thank you Google”

Again, I am not minimizing the experiences of the people making those posts. I am just pointing out that none of those posts reflect experiences specific to the December Google updates. They are general statements about Google’s algorithms.

There are some outliers who were posting that the update was big but the same people say that for every update. Although 2024 was a year of massive change, those outlier posts have been consistent from well before 2024. I don’t know what’s going on there.

Google’s Core Algorithm Update: What Happened?

It seems clear that Google dialed back something in the algorithm that was suppressing the rankings of many websites. It’s been my opinion that Google’s algorithms that determine whether a site is “made for search engines” have been overly hard on expert sites whose owners’ poor understanding of SEO left otherwise high-quality sites laden with high-traffic keywords, sometimes exact matches of keywords shown in People Also Asked. That, in my opinion, results in a “made for search engines” kind of look. Could it be that Google tweaked that algorithm to be a little more forgiving of content spam, just as it is forgiving of link spam?

Something to consider is that this update was followed by an anti-spam update, which could be an improved classifier to catch the spam sites that may have been set loose by the core algorithm update, while leaving the expert sites in the search results.

What About 2025?

Google’s CEO recently stated that 2025 would be a year of major changes. If the two December updates are representative of what’s coming, the heralded changes may not be as harsh as the series of updates in 2024. We can hope, right?