Google AI Overviews Claims More Pixel Height in SERPs via @sejournal, @martinibuster

New data from BrightEdge reveals that Google’s AI Overviews is increasingly blocking organic search results. If this trend continues, Google AI Overviews and advertisements could cover well over half of the available space in search results.

Organic Results Blocking Creeping Up

Google’s AI Overviews feature, launched in May 2024, has been controversial among publishers and SEOs since day one. Many publishers resent that Google is using their content to create answers in the search results that discourage users from clicking through and reading more, thereby negatively affecting earnings.

Many publishers, including big brand publishers, have shut down due to a combination of declining traffic from Google and algorithmic suppression of rankings. AI Overviews only added to publisher woes and has made Google increasingly unpopular with publishers.

Google AIO Taking Over Entire Screen

BrightEdge’s research shows that AI Overviews started out in May 2024 taking up to 600 pixels of screen space, crowding out the organic search results, formerly known as the ten blue links. When advertising is factored in, there isn’t much space left over for links to publisher sites.

By the end of summer the amount of space taken over by Google’s AIO increased to 800 pixels and continued to climb. At this pace BrightEdge predicts that Google could eventually reach 1,000 pixels of screen space. To put that in perspective, 600 pixels is considered “above the fold,” what users typically see without scrolling.

Graph Showing Growth Of AIO Pixel Height

Percentage Of Queries Showing AIOs

The percentage of queries that display Google’s AI Overviews has also been creeping up. Health-related search queries have been trending higher than any other niche. B2B technology, ecommerce, and finance queries are also increasingly showing AI Overview search results.

Healthcare search queries initially triggered AIOs around 70% of the time. Health-related queries now trigger them over 80% of the time.

B2B technology queries started out in May 2024 showing AIO results about 30% of the time. Now those same queries trigger AIO results almost 50% of the time.

Finance queries that trigger AI Overviews have grown from around 5% to 20% of the time. BrightEdge data shows that Google AIO coverage is trending upward and is predicted to cover an increasing number of search queries across other topics, specifically travel, restaurants, and entertainment.

BrightEdge’s data shows:

“Finance shows most dramatic trajectory: starting at just 5.3% but projected to reach 15-20% by June 2025

-Healthcare led (67.5% in June)
-B2B Tech: 33.2% → 38.4%, projected 45-50%
-eCommerce: 26.9% → 35.1%, projected 40-45%
-Emerging sectors showing dramatic growth:

Entertainment (shows, events, venues): 0.3% → 5.2%
Travel (destinations, lodging, activities): 0.1% → 4.1%
Restaurants (dining, menus, reservations): ~0% → 6.0%”

BrightEdge explains that restaurant search query coverage started out small, focusing on long-tail search queries like “restaurants with vegetarian food for groups,” but is now rolling out in higher amounts. This suggests that Google is feeling more comfortable with its AIO results, which are expected to roll out across more search queries in 2025.

They explain:

“AIO’s evolved from basic definitions to understanding complex needs combining multiple requirements (location + features + context)

In 2025, expect AIO’s to handle even more sophisticated queries as they shift from informational to actionable responses.

-Healthcare stable at 65-70%
-B2B Tech/eCommerce will reach 40-50%
-Finance sector will surge from 5.3% to 25%
-Emerging sectors could see a 50-100x growth potential
-AIOs will evolve from informational to actionable (reservations, bookings, purchases)
-Feature complexity: 2.5x current levels”

The Takeaway

I asked BrightEdge for a comment about what they feel publishers should get ahead of for 2025.

Jim Yu, CEO of BrightEdge, responded:

“Publishers will need to adapt to the complexity of content creation and optimization while leaning into core technical SEO to guarantee their sites are seen and valued as authoritative sources.

Citations are a new form of ranking. As search and AI continue to converge, brands need to send the right signals to search and AI engines to help them decide if the content is helpful, unique, and informative. In a multi-modal world, this means schema tags about a publisher’s company, products, images, videos, overall site and content structure, reviews, and more!

In 2025, content, site structure, and authority will matter more than ever, and SEO has a huge role to play in that.

Key Questions marketers need to address in 2025

  • Is your content ready for 4-5 layered intents?
  • Can you match Google’s growing complexity?
  • Have you mapped your industry’s intent combinations?

Key Actions for 2025

The Pattern is clear: Simple answers → rich, context-aware responses!

  • Intent Monitoring: See which intents AIO’s are serving for your space
  • Query Evolution: Identify what new keyword patterns are emerging that AIO’s are serving
  • Citation Structure: Align content structure to intents and queries AIO’s are focused on to ensure you are cited
  • Competitive Intelligence: Track which competitor content AIOs select and why

AIOs aren’t just displaying content differently – they’re fundamentally changing how users find and interact with information.”

The takeaway from the data is that publishers are encouraged to create unambiguous content that directly addresses topics in order to rank for complex search queries. A careful eye on how AI Overviews are displayed and what kinds of content are cited and linked to is also encouraged.

Google’s CEO, Sundar Pichai, recently emphasized increasing the amount of coverage that AI assistants like Gemini handle, which implies that Google’s focus on AI, if successful, may begin to eat into traffic from the traditional search box. That’s a trend to watch for and a wakeup call to get on top of creating content that resonates with today’s AI search.

The source of AIO data is from the proprietary BrightEdge Generative Parser™ and DataCubeX, which regularly informs the BrightEdge guide to AIO.

How Long Should An SEO Migration Take? [Study Updated] via @sejournal, @TaylorDanRW

Website migrations, specifically domain migrations, are often seen as one of the more complex parts of SEO.

They are becoming increasingly common as businesses consolidate websites and assets to reduce costs and streamline efforts as more channels and platforms come into play.

As SEO professionals, we are tasked with mitigating as many risks and variables as possible so that the business doesn’t see organic performance issues – either at all or for a longer than necessary period of time.

When we ran this study in 2023, we looked at 171 migrations and found that it took 229 days (on average) for third-party tools to show the new domain’s organic traffic returning to the pre-migration levels of the original domain; 42% didn’t return at all.

The reason we’ve repeated this study is that we think it’s important that businesses (and SEO marketers) have data to work off to make informed decisions when planning domain migrations.

Over the years, I’ve been in a number of pitch meetings where the other agencies pitching have promised no traffic loss at all during migration, and more often than not, no adequate preparation work, monitoring, or expectation setting has been done.

Study Methodology

This study aims to research and provide a data-led answer to “How long should an SEO migration take,” to help both in-house SEOs and consultants provide better estimations and communications with non-SEO stakeholders of their migration projects.

This is building off of last year’s study in which we looked at 171 domain migrations. This year, we’ve expanded the dataset to 892, thanks to fellow SEO professionals responding to information requests on various Slack channels and X (Twitter).

Using third-party tools, we then measured the number of days it took Domain B (the new domain) to achieve the same estimated organic traffic volume as Domain A (the old domain).

Data was collated on October 22nd, 2024.

Bias Factors

Bias in a quantitative study refers to systematic errors that can significantly skew the results, leading to incorrect conclusions.

These biases can arise at any stage of the research process, such as in the design, data collection, analysis, and interpretation stages.

Bias undermines the validity and reliability of the study’s results and can lead to misleading conclusions.

Where possible, we have taken steps to mitigate any bias from impacting the study.

  • Selection Bias: We have worked to eliminate this as much as possible, with a high percentage of the data coming from Ahrefs (unfiltered) and an open request to the SEO community. This should negate our own inputs from sectors we specialize in. This has led to a variety of domains in a variety of sectors.
  • Measurement Bias: As we’re using a third-party tool, the bias here is limited to the scope of the tool, database updates, and the keyword set. As we’re comparing two domains and making the assumption that they match in terms of the target keyword set, this bias should be mitigated.
  • Confounding Bias: As we’re comparing date periods within the same tool, no correlations are being made in terms of data analysis.
  • Publication Bias: This study was going to be published and submitted regardless of percentage/data findings. So, this bias is mitigated.

The data set contained domains of varying usage, from lead generation in SaaS, legal, and finance to blogs, local retail, and ecommerce.

Study Findings

The key takeaways from the domain study are:

  • On average, it took 523 days for Domain B to show the same level of organic traffic as Domain A.
  • The shortest times recorded were 19, 22, 23, and 33 days.
  • 17% of domain migrations in the sample didn’t see organic traffic return to the same levels after 1,000 days. This is a significant improvement versus the 42% from the previous study.
  • We classified three migrations as “In progress” as they were less than two years old, and traffic was returning slowly.
  • We classified 25 migrations (2.8%) as “Inconclusive” as Domain B traffic had hit the levels of Domain A, but wasn’t stable.

From the original data set, a number of domains dropped and are now redirected to domain squatters/private domain sellers.

As these aren’t “genuine” domain migrations, and the new domain was never intended to maintain the same keywords and traffic, these have been discounted and not included in the data.

Why Do Migration Results Differ?

No two websites are the same, and there are several variables in a website migration we can control – and several we can’t.

The discourse around migrations in the SEO industry hasn’t really changed for a number of years: basic best practices are established, then layered with situation-dependent measures to mitigate risk.

Google rebuilds its index on a page-by-page basis, so opening up new crawl paths and URLs ahead of time can speed up the initial discovery and crawl phases.

From experience, launching the new domain and URL structure 24-48 hours ahead of performing the migration (i.e., implementing redirects) can help speed up the process, because in the majority of cases Google has already begun to crawl and process the new URL paths. This, coupled with the Change of Address tool in Google Search Console, can smooth a lot of early migration lag.
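Once the redirects are live, it’s also worth spot-checking that old URLs return a single 301 hop to the equivalent path on the new domain. Below is a minimal sketch of such a check, assuming Python’s requests library; the domain names and sample paths are placeholders you would swap for your own sitemap or analytics export.

```python
import requests

# Hypothetical old/new domains and sample paths; replace with your own list,
# e.g., exported from your XML sitemap or analytics.
OLD_DOMAIN = "https://old-domain.example"
NEW_DOMAIN = "https://new-domain.example"
PATHS = ["/", "/category/widgets/", "/blog/best-widgets-2024/"]

def check_redirect(path):
    """Request the old URL without following redirects and report the first hop."""
    old_url = OLD_DOMAIN + path
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    is_301 = response.status_code == 301
    maps_correctly = location.rstrip("/") == (NEW_DOMAIN + path).rstrip("/")
    return response.status_code, location, is_301 and maps_correctly

if __name__ == "__main__":
    for path in PATHS:
        status, location, ok = check_redirect(path)
        flag = "OK" if ok else "CHECK"
        print(f"[{flag}] {path}: {status} -> {location or '(no redirect)'}")
```

A script like this flags redirect chains, 302s, or mappings that point at the wrong page, all of which can add to the migration lag described above.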

Backlink Profiles & Migrations

While crowd-sourcing domains for this study, I also asked the community why there are “time lags” in migrations.

Natalia Witczyk proposed the idea that it’s related to backlink profiles and how long it takes Google to process profile transference:

From my experience, for the rankings and traffic to be back to normal levels, the backlink profile has to be recrawled and the redirects have to be reflected.

That takes close to no time if the backlink profile is non-existent, so the return to normal traffic levels happens fast. If the backlink profile is extensive, there’s more to recrawl and that would take Google more time.

This prompted me to look at the total number of referring domains each domain had, and there is some correlation to this being the case, but with a large number of outliers – likely due to how the migration was carried out.

For more information and best practices on website migrations, I’d recommend reading the below articles:


Featured Image: ParinPix/Shutterstock

Google CEO Describes A 2025 Beyond A Search Box via @sejournal, @martinibuster

Google’s Sundar Pichai outlined the 2025 strategy, emphasizing consumer-focused AI, rapid development of agentic apps, a Chrome AI prototype called Project Mariner, and upgrades to Gemini and Project Astra, signaling a shift toward AI apps as the user interface for search.

Although Pichai did not say Google is de-emphasizing the Google Search box, he did emphasize that 2025 will increase the focus on AI apps as the main point of contact between users and how they interact with Google.

For example, Project Mariner is a Chrome AI extension that can do things like take a top ten restaurants list from TripAdvisor and drop it into Google Maps.

This focus on AI shows that Google is transitioning toward AI-based user experiences that represent a broader interpretation of what search means, a search experience that goes far beyond textual question-and-answering.

Google’s Future Hinges On AI

Google CEO Sundar Pichai outlined a vision for 2025 that emphasizes an urgency to go back to its roots as a company that innovates quickly, what Pichai referred to as being “scrappy,” meaning tough and resourceful, able to accomplish a lot in a short amount of time (and with fewer resources). Most importantly, he emphasized solving real-world problems.

He also prioritizes “building big, new business” which could mean creating new business opportunities with AI, reflecting a strong focus on AI as the engine for innovation in 2025.

Gemini App

Pichai also cited Gemini App as a central focus for 2025, commenting that they’re experiencing growth with Gemini and that scaling broader adoption of Gemini will be a focus in 2025. This aligns with the observation that Google is increasingly focusing on a Search-adjacent approach to consumer focused AI products and services.

What this means for SEO is that we really need to start thinking in terms of a bigger picture of what search means. Perhaps 2025 will be the year, after more than 15 years of Google’s departure from the ten-blue-links paradigm, that the SEO community thinks more deeply about what search means when it’s multimodal.

Pichai was quoted as saying:

“With the Gemini app, there is strong momentum, particularly over the last few months… But we have some work to do in 2025 to close the gap and establish a leadership position there as well. …Scaling Gemini on the consumer side will be our biggest focus next year.”

AI Products Will “Evolve Massively”

The co-founder of Google DeepMind was quoted as saying that Google was going to “turbo charge” the Gemini app and that:

“…the products themselves are going to evolve massively over the next year or two.”

That means that the Gemini app is going to gain more functionalities in a bid to make it more ubiquitous as the interface between potential website visitors and Google Search, a significant departure from interfacing with the search box.

This is something that publishers and SEOs need to think hard about as we enter 2025. Google is focusing on increasing user adoption of the Gemini app. If that happens, more people will be interfacing with the app instead of the Google Search box.

Universal Assistant (Project Astra)

Another thing the SEO industry seriously needs to consider is Google’s universal assistant, code-named Project Astra. The DeepMind co-founder is reported to have discussed their Universal Assistant, which is what Project Astra is referred to as.

Screenshot of DeepMind Project Astra web page showing how it is referred to as a Universal AI Assistant.

He’s quoted as saying that it can:

“…seamlessly operate over any domain, any modality or any device.”

What the word “domain” means is that it can function across any subject, answering questions about healthcare, directions, entertainment, or any other topic. The part about modality is a reference to text, voice, images, and video.

This is a serious situation for SEO. Google’s new Deep Research agentic search is an example of a disruptive technology that may have a negative impact on the web ecosystem.

One of the Google Deep Mind researchers cited as working on Project Astra is also listed as a co-inventor on a patent about controlling interactive AI agents through multi-modal inputs.

The patent is titled, Controlling Interactive Agents Using Multi-Modal Inputs. The description of the invention is:

“Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for controlling agents. In particular, an interactive agent can be controlled based on multi-modal inputs that include both an observation image and a natural language text sequence.”

That’s just one of dozens of researchers cited as having worked on Project Astra. Astra is another one of the projects that Google is working on that replaces the traditional search box as people’s point of contact for interacting with web data.

Takeaway About Google’s Plans For 2025

The takeaway from all this is that publishers and SEOs need to take a break from focusing solely on the search box and give some time to considering what’s going on in multimodal AI.  In 2025, AI is not just AI Overviews. AI is Gemini, it’s new features coming to Gemini and possibly the release of features developed from Project Astra, a multimodal universal agent. Agentic Search is already here in the form of Gemini Deep Research. All of these are a departure from the traditional search box as the point of contact between users, Google and websites.

Read the report on CNBC

Google CEO Pichai tells employees to gear up for big 2025: ‘The stakes are high’

Google Gemini Deep Research May Erode Website Earnings via @sejournal, @martinibuster

There’s a compelling theory floating around that Google’s AI agent, called Deep Research, could negatively impact affiliate sites. If true, it would not only hurt affiliate site earnings, it could also decrease ad revenue and web traffic to informational sites, including the “lucky” sites that are linked to by Google’s AI research assistant.

Gemini Deep Research

Gemini Deep Research is a new tool available to premium subscribers of Gemini Advanced. Deep Research takes a user’s query, researches an answer on the web, and then generates a report. The research can be further refined to produce increasingly precise results.

Google rolled out Deep Research on December 11th, describing it as a time-saver that creates a research plan and, once approved, carries out the research.

Google explains:

“Deep Research uses AI to explore complex topics on your behalf and provide you with findings in a comprehensive, easy-to-read report, and is a first look at how Gemini is getting even better at tackling complex tasks to save you time.

Under your supervision, Deep Research does the hard work for you. After you enter your question, it creates a multi-step research plan for you to either revise or approve. Once you approve, it begins deeply analyzing relevant information from across the web on your behalf.”

Deep Research presents a report that features a summary and recommendations. If searching for a product, it will summarize the pros and cons with enough detail that a user won’t need to click a link to visit a site; they can go directly to a retailer and purchase the product. That eliminates the possibility of a site visitor clicking an affiliate link from a review website and deprives that informational site of revenue.

According to an article by Marie Haynes on YouKnowAI, the thoroughness of the summary generated by Gemini Deep Research negates the need to visit a website, thereby depriving the site of affiliate link revenue.

YouKnowAI explains:

“…perhaps sites like foodnetwork.com will get clicks and subsequent affiliate sales. I’ve found in my own research so far that I’m not clicking on sites as I get what I need to know from the research and then go to official sites or perhaps Amazon, or stores near me to purchase.

…The obvious question here is what happens when sites like foodnetwork.com and seriouseats.com see a reduction in traffic? “

If it’s true that Gemini Deep Research users won’t need to visit sites to make up their minds then it’s possible that this new tool will also negatively affect web traffic and advertising revenue.

Is Google Out Of Touch With The Web Ecosystem?

In a recent interview, Google’s CEO, Sundar Pichai, insisted that Google cares about the web ecosystem. When asked how Google supports the web ecosystem he struggled to articulate an answer. After a long series of uhms and false starts he started talking about how Google’s own YouTube platform enables multinational media corporations to monetize their intellectual properties on YouTube.

“He avoids mentioning websites, speaking in the abstract about the “ecosystem” and then when he runs out of things to say changes course and begins speaking about how Google compensates copyright holders who sign up for YouTube’s Content ID program.

He answered:

‘Look I… uh… It’s a… very important question… uhm… look I… I… think… I think more than any other company… look you know… we for a long time through… you know… be it in search making sure… while it’s often debated, we spend a lot of time thinking about the traffic we send to the ecosystem.

Even through the moment through the transition over the past couple of years. It’s an important priority for us.’”

This Is Why Google CEO’s Explanation Falls Short

1. YouTube is not the web ecosystem, it’s Google’s own platform.

2. Multinational mega corporations are not web creators.

Pichai’s answer sent the unintended message that Google is completely out of touch with web creators, and if the author of the article about Gemini’s Deep Research tool is correct, this is further proof that Google continues to focus on providing information to users at the expense of creators.

Is Gemini Deep Research Harvesting Data Without Giving Back?

There’s an old television episode of The Twilight Zone called To Serve Man that relates the story of a benevolent race of aliens who bring advanced technologies that allow humans to live in peace, with food security and prosperity for everyone. As evidence of their good intentions, they give the world a book written in an alien language that’s titled To Serve Man. The episode ends when government cryptographers translate the book and discover that it’s a cookbook and that the aliens’ true intention is to farm humans as a food source.

Google’s mission statement promising “to organize the world’s information and make it universally accessible and useful” also seems like proof of their good intentions. However, the mission statement doesn’t explicitly say that Google will refer users to the sources of information. It only promises to organize and provide the information itself in a way that’s accessible and useful. While referring users to the creators of the information could be a part of making information accessible and useful, it’s not explicitly stated; it’s not even implied in the mission statement.

Is Google Gemini Deep Research further proof that Google is harvesting websites as an information source?

If you’re a creator, does it make you feel farmed?

Featured Image by Shutterstock/Nomad_Soul

Google Explains How CDNs Impact Crawling & SEO via @sejournal, @martinibuster

Google published an explainer that discusses how Content Delivery Networks (CDNs) influence search crawling and improve SEO but also how they can sometimes cause problems.

What Is A CDN?

A Content Delivery Network (CDN) is a service that caches a web page and serves it from a data center that’s closest to the browser requesting that web page. Caching a web page means that the CDN creates a copy of the page and stores it. This speeds up web page delivery because the page is now served from a server that’s closer to the site visitor, requiring fewer “hops” across the Internet from the origin server to the destination (the site visitor’s browser).

CDNs Unlock More Crawling

One of the benefits of using a CDN is that Google automatically increases the crawl rate when it detects that web pages are being served from a CDN. This makes using a CDN attractive to SEOs and publishers who are concerned about increasing the number of pages crawled by Googlebot.

Normally, Googlebot will reduce the amount of crawling from a server if it detects that it’s reaching a threshold that causes the server to slow down, a behavior called throttling. That throttling threshold is higher when a CDN is detected, resulting in more pages crawled.

Something to understand about serving pages from a CDN is that the first time a page is requested, it must be served directly from your origin server. Google uses the example of a site with over a million web pages:

“However, on the first access of a URL the CDN’s cache is “cold”, meaning that since no one has requested that URL yet, its contents weren’t cached by the CDN yet, so your origin server will still need serve that URL at least once to “warm up” the CDN’s cache. This is very similar to how HTTP caching works, too.

In short, even if your webshop is backed by a CDN, your server will need to serve those 1,000,007 URLs at least once. Only after that initial serve can your CDN help you with its caches. That’s a significant burden on your “crawl budget” and the crawl rate will likely be high for a few days; keep that in mind if you’re planning to launch many URLs at once.”

When Using CDNs Backfires For Crawling

Google advises that there are times when a CDN may put Googlebot on a blocklist and subsequently block crawling. Google describes two kinds of blocks:

1. Hard blocks

2. Soft blocks

Hard blocks happen when a CDN responds that there’s a server error. A bad server error response can be a 500 (internal server error) which signals a major problem is happening with the server. Another bad server error response is the 502 (bad gateway). Both of these server error responses will trigger Googlebot to slow down the crawl rate. Indexed URLs are saved internally at Google but continued 500/502 responses can cause Google to eventually drop the URLs from the search index.

The preferred response is a 503 (service unavailable), which indicates a temporary error.

Another hard block to watch out for is what Google calls “random errors,” which is when a server sends a 200 response code, which means the response was good, even though it’s actually serving an error page. Google will interpret those error pages as duplicates and drop them from the search index. This is a big problem because it can take time to recover from this kind of error.

A soft block can happen if the CDN shows one of those “Are you human?” pop-ups (bot interstitials) to Googlebot. Bot interstitials should send a 503 server response so that Google knows that this is a temporary issue.

Google’s new documentation explains:

“…when the interstitial shows up, that’s all they see, not your awesome site. In case of these bot-verification interstitials, we strongly recommend sending a clear signal in the form of a 503 HTTP status code to automated clients like crawlers that the content is temporarily unavailable. This will ensure that the content is not removed from Google’s index automatically.”
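To illustrate the status-code behavior Google describes, here is a minimal sketch assuming a Python/Flask origin server. In practice the interstitial is usually served at the CDN edge rather than by application code, and real crawler verification should use reverse DNS or Google’s published IP ranges rather than the User-Agent string alone; the example is only meant to show a 503 with a Retry-After header instead of a 200 interstitial page.

```python
from flask import Flask, Response, request

app = Flask(__name__)

# Hypothetical flag; in practice the CDN or WAF decides when a
# bot-verification interstitial is being shown.
INTERSTITIAL_ACTIVE = True

@app.route("/<path:page>")
def serve(page):
    # Simplistic crawler check for illustration only; verify crawlers via
    # reverse DNS or published IP ranges in a real setup.
    is_crawler = "Googlebot" in request.headers.get("User-Agent", "")
    if INTERSTITIAL_ACTIVE and is_crawler:
        # Signal "temporarily unavailable" rather than serving the
        # interstitial HTML with a 200 status.
        return Response(
            "Temporarily unavailable",
            status=503,
            headers={"Retry-After": "3600"},
        )
    return f"Normal page content for {page}"
```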

Debug Issues With URL Inspection Tool And WAF Controls

Google recommends using the URL Inspection Tool in Search Console to see how the CDN is serving your web pages. If the CDN’s firewall, called a Web Application Firewall (WAF), is blocking Googlebot by IP address, you should be able to check the blocked IP addresses and compare them to Google’s official list of IPs to see if any of them are on the list.

Google offers the following CDN-level debugging advice:

“If you need your site to show up in search engines, we strongly recommend checking whether the crawlers you care about can access your site. Remember that the IPs may end up on a blocklist automatically, without you knowing, so checking in on the blocklists every now and then is a good idea for your site’s success in search and beyond. If the blocklist is very long (not unlike this blog post), try to look for just the first few segments of the IP ranges, for example, instead of looking for 192.168.0.101 you can just look for 192.168.”
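As a practical aid, here is a minimal sketch in Python that compares a WAF blocklist export against Googlebot’s published IP ranges. The blocklist filename is a hypothetical placeholder, and the googlebot.json URL and its JSON structure should be verified against Google’s current crawler documentation.

```python
import ipaddress
import json
import urllib.request

# Googlebot's published IP ranges (verify this URL against Google's current
# crawler documentation; it may change).
GOOGLEBOT_RANGES_URL = (
    "https://developers.google.com/static/search/apis/ipranges/googlebot.json"
)

def load_googlebot_networks(url=GOOGLEBOT_RANGES_URL):
    """Fetch and parse Googlebot's published IPv4/IPv6 CIDR ranges."""
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    networks = []
    for prefix in data.get("prefixes", []):
        cidr = prefix.get("ipv4Prefix") or prefix.get("ipv6Prefix")
        if cidr:
            networks.append(ipaddress.ip_network(cidr))
    return networks

def blocked_googlebot_ips(blocklist_path, networks):
    """Return blocklisted IPs that fall inside a Googlebot range."""
    flagged = []
    with open(blocklist_path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            try:
                ip = ipaddress.ip_address(line)
            except ValueError:
                continue  # skip malformed entries
            if any(ip in net for net in networks):
                flagged.append(line)
    return flagged

if __name__ == "__main__":
    nets = load_googlebot_networks()
    # "waf_blocklist.txt" is a hypothetical export of your CDN/WAF blocklist,
    # one IP address per line.
    for ip in blocked_googlebot_ips("waf_blocklist.txt", nets):
        print(f"Blocked IP {ip} belongs to a Googlebot range")
```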

Read Google’s documentation for more information:

Crawling December: CDNs and crawling

Featured Image by Shutterstock/JHVEPhoto

Google Says Temporal Anomalies Affect Googlebot Crawl via @sejournal, @martinibuster

Google’s John Mueller answered a question on Reddit about Googlebot crawling and whether the Googlebot screenshot is a complete picture of what Google sees. The person asking the question received a response that touched on edge cases and temporal anomalies in crawling.

A Googlebot “screenshot” refers to a representation of what a web page looks like to Googlebot.

What a web page looks like to Googlebot depends on how Googlebot renders the page after executing JavaScript, loading CSS, and downloading necessary images.

Google Search Console’s URL inspection tool gives an idea of what a web page looks like to Google. This tool helps publishers and SEOs understand how Google “sees” a web page.

Question About Knowing What Googlebot “Sees”

The person asking the question was talking about Googlebot screenshots. What they apparently meant was the rendered page as Googlebot itself sees it.

This is the question the Redditor asked:

“Is the Googlebot screenshot a complete picture of what Google can see?”

They later clarified with the following answers to questions:

“How can I know what google see in my article? …I want to know what Googlebot see in my website.”

Is Googlebot Screenshot A Complete Picture?

Returning to the original question of whether the “Googlebot screenshot a complete picture of what Google can see,” Google’s John Mueller offered the following answer.

“For the most part, yes. But there are some edge cases and temporal anomalies. Tell us more about what you’re trying to check.”

Mueller’s response acknowledges that the Googlebot screenshot represents what Google sees when it crawls a page.

Temporal Anomalies In Googlebot Screenshot

The person asking the question referred to a Googlebot screenshot as what Googlebot “sees” when it visits a web page. That also seems to be the context of Mueller’s answer.

Mueller’s answer referred to temporal anomalies, which could be a reference to temporary issues at the time the web page was crawled that could have affected which resources were downloaded and, consequently, how the web page looked to Googlebot in that moment.

Google Search Console’s URL Inspection Tool also provides a snapshot that shows a live preview of how a web page appears to Google. It’s a good way to check if everything is rendered by Google the way it’s supposed to look.

Read the discussion on Reddit:

Is the Googlebot screenshot a complete picture of what Google can see?

Featured Image by Shutterstock/Sammby

Ecommerce SEO Pro on AI, 2025 Tactics

Jeff Oxford’s initial attempt at ecommerce was selling dropshipped beer pong tables in 2013. The business, he says, didn’t survive, but his love for optimizing organic search traffic did. Thus began his SEO career and the launch of 180 Marketing, his agency.

Fast forward to 2024, and Jeff is an ecommerce SEO authority. Link building was on his mind when he appeared on the podcast in 2022. He now advises engagement — getting folks to click an organic search listing and consume the page’s content — and reminds merchants that AI search has helped ecommerce rankings.

He and I discussed those tactics and more in our recent conversation. Our entire audio is embedded below. The transcript is edited for clarity and length.

Eric Bandholz: Give us a quick rundown of who you are.

Jeff Oxford: I’ve been an ecommerce SEO nerd for 13 years. I got into it by trying to start my own dropshipping sites, like selling beer pong tables or 3D printers, but things didn’t go as planned. That’s when I fell in love with search engine optimization, particularly for ecommerce websites, and I’ve stuck with it. SEO can be complicated, but at its core, it’s about a few key activities to help sites rank better.

If you’re looking to rank well in 2025, user engagement is crucial. Google determines whether visitors click on your site, stay, or quickly hit the back button. User engagement is a strong signal for rankings. You could have a perfectly optimized page, but you won’t rank well if people aren’t staying.

Google’s data leak earlier this year confirmed what many SEOs suspected — user engagement plays a significant role in rankings. SEO practitioners theorize that Google collects data from Android devices and Chrome to assess how long people stay on a site, influencing rankings.

Bandholz: How do you protect against bots that attempt to manipulate engagement data by visiting and bouncing off competitor sites?

Oxford: The click-through rate on search result pages is a major ranking factor. More clicks should increase rankings. Rand Fishkin, a prominent SEO expert, demonstrated this with live experiments in which people clicked on a site and boosted its ranking from 5th to 1st or 2nd in real time.

But manipulating the click-through rate via bots isn’t that easy. Google has advanced technology to detect fake clicks, primarily to protect its ad system. Google monitors patterns, IPs, and behaviors. So, if you tried to flood your site with bot traffic, Google would likely detect it. Some people use platforms such as Mechanical Turk to pay for manual clicks, but making the pattern look natural is hard. Spikes in traffic are red flags, and Google uses pattern recognition to detect anomalies.

Bandholz: With AI evolving, how can companies detect if competitors are using malicious AI tactics?

Oxford: Negative SEO attacks were a more prominent issue before 2014. Back then, bad backlinks could penalize a site, and competitors could build spammy links to hurt you.

However, in 2014 Google changed its approach. Instead of penalizing sites for bad links, they devalue or ignore them. The same applies to click manipulation. Google can’t always tell if the click manipulation is from the site owner or a competitor. It’s more about neutralizing the effects rather than handing out penalties.

Bandholz: What are some tactics for increasing click-throughs and dwell time?

Oxford: Dwell time is easier to control since it depends on the user experience. To improve it, focus on fast page loads and usability. Basic conversion rate optimization techniques, such as A/B and usability testing, can help keep visitors engaged.

To increase branded traffic, though, it’s more about good digital marketing than manipulation. If you’re running strong branding and marketing campaigns, your branded search traffic will increase naturally. You might run Facebook ads, sponsor events, or create content. Branded search is a symptom of good digital marketing, not the direct goal.

Bandholz: Is there still an opportunity for bottom-of-funnel SEO strategies to capture sales for people searching for products?

Oxford: Absolutely. Backlinks still work surprisingly well despite many thinking they’re dead. Most of the top-ranking sites still have backlinks. For ecommerce sites, getting backlinks to category pages can help with rankings. Google confirmed this in its algorithm leak — backlinks remain a confirmed ranking factor.

It’s harder to get backlinks today than it was 10 years ago. Back then, we could send bulk emails to journalists and get backlinks quickly. But now, bloggers know their value. They’re more likely to ask for money in exchange for backlinks. However, ecommerce sites can still build backlinks through product reviews. You can contact bloggers, offer free products for honest reviews, and earn backlinks.

Bandholz: What else should entrepreneurs and operators consider when improving their SEO?

Oxford: Besides engagement metrics and link building, optimized content is essential. Many ecommerce sites overlook category pages and have only a product grid and a heading. Adding 200 to 300 words of relevant content, discussing the benefits of products, target audience, and usage tips can help these pages rank better. You don’t want fluffy content. Instead, focus on helpful information for potential buyers.

Bandholz: How much should brands budget for optimal SEO?

Oxford: It depends on your niche. In a competitive space like weight loss supplements or CBD, you’re looking at tens of thousands of dollars a month. For less competitive niches, $2,000 to $3,000 might suffice. For ecommerce sites with seven or eight figures in annual revenue, a budget of $3,000 to $8,000 per month is typical. Most of the budget will go toward link building, content creation, and technical SEO.

Bandholz: What other trends do you see in the ecommerce SEO space?

Oxford: Many merchants panicked over artificial intelligence, worrying it might take market share from Google or hurt SEO. But AI has been beneficial for ecommerce sites. Google recently cracked down on low-quality affiliate blogs trying to rank for “best of” keywords, wiping out around 90% of these sites. Now, ecommerce sites are benefiting as they’re absorbing a lot of this traffic.

Google is prioritizing brands over generic affiliate sites. Ecommerce brands with physical locations, verified reviews, and a trustworthy presence get favored in rankings. So, if you have an ecommerce site, blogging will be more impactful now than ever.

Bandholz: Where can people learn more?

Oxford: Our website is 180marketing.com. I’m on LinkedIn.

Google’s December 2024 Updates: Final Results via @sejournal, @martinibuster

Google launched two updates in December that rolled out in succession like waves, carrying happy surfers to the beach and a tsunami to the unlucky few. Many SEOs who braced for SERP drama were instead met with a surprisingly positive impact.

Facebook Groups: The Update Was Good!

A common theme in Facebook and forum posts is that the Google update had a positive effect, with many sites hit by previous updates returning to life all by themselves. One person reported that a number of their dormant affiliate sites suddenly awakened and were attracting traffic. Then there were some random SEOs taking credit for de-ranked client sites returning to the SERPs, leading me to wonder if they also take credit when their clients lose rankings in the first place…

Black Hat Response To Google’s Update

The forums on Black Hat World (BHW) generally provide an indicator of how much hurt Google’s updates are handing out. Some members of BHW forum were posting about how their sites were coming back.

One member posted:

“This new update is the first update in past years for my websites to grow 🙂 For now, traffic has increased 100%+ and keeps going up…”

Another one echoed that post:

“For me it is going good for 2 days. Hope not to be bad…”

A third member shared:

“My website ranking returned today after 9 months from March update. So seams the keywords and links showing again for first time since. I was affected by the march and thought it wouldn’t recover ever again hopefully this December update once finished rolling out my site won’t be removed again”

And yet another one shared a similar experience:

“My dead website is starting to pick up keywords on ahrefs, increasing impressions on search console, but still no major change regarding traffic.”

There was one outlier who shared that their financial site lost rankings likely due to having been built on expired domains (though it’s easy to imagine other reasons).

Among the celebratory sharing was this outlier post from someone whose glass is half empty:

“Lots of sites got tanked including 3 of mine.”

Overall there was a positive tone to the black hat forum members sharing their actual experiences with Google’s December updates. That fits the pattern of what was being shared in Facebook groups, that the December updates were bringing some sites back from whatever algorithm hits they suffered in previous updates.

Then There Is The X Response

On X, every Google update announcement is met by a large number of comments about how big brands are the winners, spam sites are dominating the SERPs (which contradicts the first complaint), and how Google is destroying small businesses. These kinds of comments have been around for decades, from before X/Twitter. The defining characteristic of these remarks is that they are general in nature and don’t reflect anything specific about the update; they’re just venting about Google.

The response to Google’s update announcement on X was predictably negative. I’m not being snarky when I say the responses are predictably negative, that’s literally the consistent tone for every update announcement that Google makes on X.

Representative posts:

Google is destroying small businesses:

“…How is it possible every update since September 23 has destroyed the traffic of small/medium publishers while boosting corp websites that put out top 10s trash content.”

Google updates consistently benefit Reddit:

“Just call it what it is: The Gift More Traffic to Reddit Update #94”

And the Google destroyed my business post:

“I used to work independently, writing blogs and earning a decent income. I even believed I would never need a traditional job in my life. But thanks to Google, which wiped out my websites one after another, I’m now left with no income, no motivation and no job. Thank you Google”

Again, I am not minimizing the experiences of the people making those posts. I am just pointing out that none of those posts reflect experiences specific to the December Google updates. They are general statements about Google’s algorithms.

There are some outliers who were posting that the update was big but the same people say that for every update. Although 2024 was a year of massive change, those outlier posts have been consistent from well before 2024. I don’t know what’s going on there.

Google’s Core Algorithm Update: What Happened?

It feels clear that Google dialed back something in the algorithm that was suppressing the rankings of many websites. It’s been my opinion that Google’s algorithms for determining whether a site is “made for search engines” have been overly hard on expert sites created by people whose poor understanding of SEO resulted in otherwise high-quality sites laden with high-traffic keywords, sometimes exactly matching keywords shown in People Also Asked. That, in my opinion, results in a “made for search engines” kind of look. Could it be that Google tweaked that algorithm to be a little more forgiving of content spam, just as it’s forgiving of link spam?

Something to consider is that this update was followed by an anti-spam update, which could be an improved classifier to catch the spam sites that may have been set loose by the core algorithm update, while leaving the expert sites in the search results.

What About 2025?

Google’s CEO recently stated that 2025 would be a year of major changes. If the two December updates are representative of what’s coming in the future it could be that the heralded changes may not be as harsh as the series of updates in 2024. We can hope, right?

Wikipedia And SEO: Everything You Need To Know via @sejournal, @MattGSouthern

Many people misunderstand how Wikipedia relates to SEO.

Wikipedia doesn’t provide direct SEO benefits like followed backlinks or promotional content. However, it’s a valuable resource for digital marketers and content creators.

This article examines how Wikipedia affects Google’s Knowledge Graph, provides keyword research guidance, supports content planning, and demonstrates effective internal linking practices.

Key topics include:

  • Wikipedia’s content guidelines and their impact on SEO.
  • The platform’s role in Google’s Knowledge Graph.
  • How to use Wikipedia for keyword research and content planning.
  • Lessons from Wikipedia’s internal linking structure.
  • Best ways to include Wikipedia in SEO strategies.

We will also explore how to add insights from Wikipedia into your SEO strategy without breaking its terms of use.

Wikipedia Guidelines & SEO

Wikipedia has strict rules about what content it allows.

These rules include being notable, maintaining a neutral point of view, being verifiable, and using reliable sources.

Following these rules is essential; otherwise, your content may be removed, and your account could be banned.

Many people mistakenly believe that creating a Wikipedia page for their business or adding links to their website will improve their search engine rankings.

However, Wikipedia doesn’t allow entries made for advertising purposes. Also, all external links are labeled as “nofollow,” which means they don’t help with SEO.

John Mueller, a Google Search Advocate, has stated:

“Randomly dropping a link into Wikipedia has no SEO value and will do nothing for your site. All you’re doing is creating extra work for the Wikipedia maintainers, who will remove your link drops. It’s a waste of your time and theirs.”

While you can’t use Wikipedia for direct SEO benefits, you can still find several ways to use the platform to support your overall SEO strategy.

Wikipedia’s Role In Google’s Knowledge Graph

Google’s Knowledge Graph is a system that understands facts and entities and how they relate. It was originally informed by Freebase, the CIA World Factbook, and Wikipedia.

The Knowledge Graph also informs Google Knowledge Panels, which display for known entities on the right hand side of SERPs.

For example, when you search for a historical figure like Leonardo da Vinci, the panel gives an overview of da Vinci’s life, key facts, and related entities, with much of the information coming from Wikipedia.

Screenshot from Google, December 2024

One of the most powerful ways to use Wikipedia is to understand how Google connects different topics and entities.

When creating content, this can help you understand which entities are related to your topics and optimize your content to align with how search engines interpret and display information, increasing your visibility in search results.

Using Wikipedia For Keyword Research

Keyword research is an integral part of SEO, and Wikipedia can help you find useful terms and phrases.

Subject experts often write Wikipedia articles, so they use specific language that your audience may use when searching for information.

For example, if you are writing about renewable energy, look at Wikipedia articles on solar, wind, and geothermal energy. These articles can help you find key terms to include.

Studying the language in these articles can improve your keyword strategy and ensure your content connects with your audience.

Wikipedia also provides valuable insights into how popular specific topics are. You can access traffic statistics that reveal how many users have visited a page.

If a topic has many page views, it shows a strong interest in that subject. You can use this information to choose which topics to focus on, helping you attract more organic traffic.
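One way to pull those traffic statistics programmatically is Wikimedia’s public pageviews API. Below is a minimal sketch assuming Python’s requests library; the endpoint format, date range, and article title are illustrative, so verify them against the current Wikimedia API documentation before relying on the output.

```python
import requests

# Wikimedia's pageviews REST API (format shown here is as of writing;
# verify against current Wikimedia documentation).
PAGEVIEWS_URL = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    "en.wikipedia/all-access/all-agents/{article}/monthly/{start}/{end}"
)

def monthly_pageviews(article, start="20240101", end="20241201"):
    """Return (month, views) pairs for a Wikipedia article title."""
    url = PAGEVIEWS_URL.format(article=article, start=start, end=end)
    # Wikimedia asks API clients to identify themselves with a User-Agent.
    headers = {"User-Agent": "seo-research-script/0.1 (you@example.com)"}
    response = requests.get(url, headers=headers, timeout=10)
    response.raise_for_status()
    return [(item["timestamp"][:6], item["views"])
            for item in response.json().get("items", [])]

if __name__ == "__main__":
    # Article titles use underscores, e.g. "Content_marketing".
    for month, views in monthly_pageviews("Content_marketing"):
        print(month, views)
```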

Wikipedia As A Content Planning Tool

Wikipedia is a goldmine of ideas and inspiration for content planning.

By looking at the citations, external links, and related pages in Wikipedia articles, you can find helpful information and potential topics for your website.

For example, suppose you have a blog about digital marketing and research “content marketing” on Wikipedia. In that case, you may discover links to articles about the history of content marketing, different content formats, and successful case studies.

These resources can inspire blog posts like “The Evolution of Content Marketing: From Print to Digital” or “10 Proven Content Formats to Engage Your Audience.”

Wikipedia can also help you find content gaps and topics that are not thoroughly covered in your field.

Look for stubs, short articles that lack detailed information, and pages with missing citations or broken links. These areas are good opportunities to create in-depth content that provides value to your audience.

By filling these gaps, you can attract more visitors and make your website a trusted resource in your industry.

Learning From Wikipedia’s Internal Linking Structure

Wikipedia’s internal linking structure is an excellent example of how to organize and connect related information. It links articles extensively, creating an easy-to-navigate web of knowledge.

You can learn effective ways to organize and connect your content by looking at how Wikipedia structures its content and links.

To create a clear information hierarchy, Wikipedia uses categories, subcategories, and hyperlinks.

For example, the “Search Engine Optimization” article falls under the category “Search Engines.” This structure helps users see how different topics relate to one another and makes navigation easier.

Similar principles can be used for your website to keep your content organized, easy to navigate, and connected.

Creating a clear structure and linking related pages improves the user experience and helps search engines understand your content’s context. This can enhance your search engine rankings and overall SEO performance.

Summary: Using Wikipedia As A Tool For SEO

Wikipedia may not directly affect search engine rankings, but it is an essential resource for your SEO strategy.

You can use Wikipedia’s wealth of information to improve your keyword research, content planning, and understanding of how information connects online.

Instead of trying to manipulate Wikipedia for quick SEO wins, use it as a tool for research and learning.

Use the insights you gain from Wikipedia for your website and content strategy.


Featured Image: Antlii/Shutterstock

10 Strategic SEO Insights & Tactical Advice For 2025 And Beyond via @sejournal, @gregjarboe

At the beginning of 2002, there were more than a dozen “search engines,” including crawlers, directories, and paid/PPC/CPC engines.

But by the end of that year, Google had emerged from the pack to become the leading player. (According to Nielsen//NetRatings, Google had a 39% share of 47 million “search hours” in December 2002.1)

I recently called the past 20 years “The Age of the One-Trick Pony.” Back at the beginning of 2003, if you figured out how to improve a website’s visibility in Google’s organic search results, then you could get a well-paying job at an SEO agency.

But, SEO professionals need to prepare for a paradigm shift as that age is about to end.

This means you must invest more time learning innovative marketing disciplines, and demonstrate prudent judgment to manage change.

As we step into the future of search, I can share five strategic insights and five pieces of tactical advice.

Strategic Insights

According to Sun Tzu, an ancient Chinese military strategist, “Tactics without strategy is the noise before defeat.”

And far too many SEOs have spent too little time exploring and evaluating different digital marketing strategies.

This explains Why There Are So Few Vice Presidents of Search Engine Optimization.

So, what can you do to outline a strategy for 2025 and beyond?

1. Embrace AI As A Powerful Tool

You’ve already learned how to use SEO tools that help you improve your company or client’s search engine rankings by analyzing keywords, content, and backlinks.

So, instead of feeling threatened by AI, embrace it as just another tool to add to your toolbox.

Jensen Huang, the founder, president, and CEO of Nvidia, has said, “AI is not going to take your job. The person who uses AI is going to take your job.”

Almost two years later, we’ve learned by comparing the content output generated by ChatGPT, Gemini, and Claude that generative AI tools may be smarter than newbies at times, but not smarter than people with more education, expertise, and experience.

For example, researchers at the University of Reading in England created over 30 fake psychology student accounts and used them to submit ChatGPT-4-produced answers to examination questions.

On average, ChatGPT-4 scored better than human students in the first- and second-year exams, where the questions were easier.

In their last year at the university, students are expected to provide deeper insights and use more elaborate analytical skills. Generative AI isn’t particularly good at that, which is why third-year human students got better grades than ChatGPT-4.

So, embrace AI as a powerful tool – but one that requires someone with education, expertise, and experience to use it effectively. And whatever you do, don’t become a tool of your tools.

2. Conduct Audience Research

I’ve been using keyword research tools since 2002, but I started using an audience research tool in 2020.

With classic keyword research, you learn how many searches a keyword gets. With an audience research tool, you also learn about the audience that searches for a keyword, uses words in their bio, or visits a website.

This is a game changer – and it’s arriving at the precise moment when SEO pros need to start creating the kind of user, buyer, marketing, and customer personas that can transform SEO, PPC, and content marketing.

To get a seat at the big table, SEO marketers must move beyond optimizing their sites, pages, articles, and content for an undifferentiated group of search engine users.

Why? Because one size does not fit all.

Digital marketers have been targeting ads at segments of people with specific demographics, intents, and interests for decades.

However, SEO professionals seemed satisfied with trying to guess the intent of users based on the words or phrases in their queries.

Who can blame them? Previously, keyword research tools could only tell SEOs “what” people searched for, but not “who” they were.

Now, SEO and content marketers can get surprising insights into the demographics, interests, and information sources that influence their intent.

For example, let’s say that the B2B company or client you work for wants to know who searches for “customer retention”?

Well, SparkToro tells you that 5,000 to 6,000 people search for “customer retention” each month in the United States.

Screenshot from SparkToro, December 2024

The tool also tells you that 52.6% are female, and 46.8% are male. Surprised?

They also visit websites like HelpScout.com and Userpilot.com, as well as search other keywords like “lifetime customer value formula” and “tools for customer success.”

Or, let’s say that the target audience that your B2C company or client wants to reach is a “nutritionist.”

Screenshot from SparkToro, December 2024

The tool tells you that 15,000 people have “nutritionist” in their social media bios. The tool also tells you that 81.9% are female, and 174.3% are male.

They also visit websites like Cenegenics.com and CleanPlates.com. And they’re searching for keywords like “fat content of foods” and “how much eggs have protein.”

Finally, let’s say you’ve just started doing work for TheSill.com.

You could tell Eliza that 125,000 people globally visit TheSill.com each month.

Screenshot from SparkToro, December 2024

And the audience located in the United States is 52.4% female, and 47.2% male.

This audience also visits BHG.com and FoodandWine.com, and they search for “how to clip plants” and “cheap tall plants.”

That’s why I think conducting audience research gives you a competitive advantage over keyword research when it comes to crafting more resonant, effective content.

3. Focus On High-Quality, Original Content

It’s worth re-reading Google’s guidance on building high-quality sites published in 2011, following the first Panda algorithm change.

It’s also worth reading the Google Search Central documentation for creating helpful, reliable, people-first content.

If you have time, then it’s also worth checking out Leveraging YouTube, LinkedIn, And Cross-Channel Strategies For Success.

What will you learn from all this background reading? Content is still king.

And while AI may help you crank out content more efficiently, it still doesn’t create the high-quality, original content that readers crave and Google rewards.

However, it’s important to recognize that there are different perspectives on creating high-quality, original content.

Some experts are focused on increasing efficiency (doing things right), while others are focused on improving effectiveness (doing the right things).

Spoiler alert: I’m in the second camp, along with a lot of content marketers. But, a lot of senior executives, especially chief financial officers (CFOs), are in the first camp.

So, SEO marketers won’t be surprised when they read what Stephanie Stahl said: “Creating content that prompts a desired action isn’t easy.”

SEO professionals may feel a surge of empathetic pain when they read that the biggest challenge that 54% of B2B content creation teams face is “lack of resources.”

But, SEO pros may also feel a surge of hope when they read that Stahl also said, “a group of top performers has found a way to surge ahead. They’ve figured out how to understand their audience’s needs, produce high-quality content, and use AI to create more efficient workflows.”

So, how do the most successful content marketers differ from their less successful peers? Well, according to Stahl, top performers are more likely to:

  • Have the right technology to manage content across the organization.
  • Have a scalable model for content creation.
  • Say their scalable model is creating the desired outcomes.

But the factors that B2B top performers say contribute to their content marketing success are:

  • Understand our audience (82%).
  • Produce high-quality content (77%).
  • Possess industry expertise (70%).
  • Have high-performing team members (69%).
  • Set goals that align with their organization’s objectives (62%).
  • Measure and demonstrate content performance effectively (53%).

So, the debate between efficiency and effectiveness isn’t over. You don’t need to pick sides, but you should be aware that executives at your company are probably debating this topic, too.

4. Build Strong Backlinks

I don’t need to tell SEO pros they need to build strong backlinks. I also don’t need to tell you that this is getting harder to do.

Back in 2002, all you needed to do was submit your URL to the Yahoo! Directory and the Open Directory Project (also known as DMOZ). But, both directories have since been shut down, with the Yahoo! Directory closing in 2014 and the Open Directory Project in 2017.

Meanwhile, Google’s Penguin algorithm update, which rolled out from 2012 to 2016, targeted link spam and manipulative link-building practices.

So, how do you build strong backlinks these days?

You can start by reading What Links Should You Build For A Natural Backlink Profile?

Or,  download “Link Building For SEO: A Complete Guide.”

5. Prioritize User Experience (UX)

Finally, read about how AI is transforming user experiences and influencing SEO rankings.

Every SEO I’ve met over the past 20 years knows how to evaluate the usability of webpages.

For any of the new SEO experts that I haven’t met yet, here’s what Google has to say about the usability of webpages:

“Our systems also consider the usability of content. When all things are relatively equal, content that people will find more accessible may perform better.”

Google adds:

“For example, our systems would look at page experience aspects, such as if content is mobile-friendly, so that those on mobile devices can easily view it. Similarly, they look to see if content loads quickly, also important to mobile users.”

Get it? Got it? Good.

Tactical Advice

It’s worth knowing that Sun Tzu also said, “Strategy without tactics is the slowest route to victory.”

This quote reminds me of the scene from the superhero comedy film, Mystery Men (1999), where Mr. Furious says, “If you want to push something down, you have to pull it up. If you want to go left, you have to go right.”

But I must admit that the ancient Chinese military strategist is right. You need tactical advice as well as strategic insights to achieve your professional goals and advance in this field or industry.

1. Stay Updated With Algorithm Changes

In November 2003, Google surprised SEO professionals with its first major algorithm update. It was called the “Florida Update” because it hit the industry like a hurricane.

Since then, SEO pros have monitored Google’s algorithm updates and adjusted their strategies accordingly.

If you want to stay updated about algorithm changes, start by reading, Why & How To Track Google Algorithm Updates.

2. Leverage Schema Markup

You’ll also want to implement schema markup to help search engines understand your content and display rich snippets.

If you don’t already know how to do that, then read What Is Schema Markup & Why Is It Important For SEO?

After you’ve done your homework, use tools like Google’s Schema Markup Testing Tool to test your structured data.
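For illustration, here is a minimal sketch of generating a JSON-LD block with Python. The Product fields shown are placeholders, and the properties your page actually needs depend on the content type, so check schema.org and Google’s structured data documentation for the relevant required and recommended fields.

```python
import json

# A minimal Product schema example; the values are placeholders and the
# properties you actually need depend on the page (see schema.org/Product).
product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "description": "A sample product used to illustrate JSON-LD markup.",
    "brand": {"@type": "Brand", "name": "Example Brand"},
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the serialized JSON-LD anywhere in the page's HTML, typically the <head>.
script_tag = (
    '<script type="application/ld+json">\n'
    + json.dumps(product_schema, indent=2)
    + "\n</script>"
)
print(script_tag)
```

Whatever you generate, run it through a structured data testing tool before deploying, as suggested above.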

3. Optimize For Core Web Vitals

You already know that improving page load speed, interactivity, and visual stability enhance your user experience.

You can brush up on reading about page speed and Core Web Vitals.
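If you want to monitor real-user Core Web Vitals programmatically, the PageSpeed Insights API exposes Chrome UX Report field data. Here is a minimal sketch assuming Python’s requests library; the endpoint and response structure should be verified against Google’s current API documentation, and an API key is recommended for regular use.

```python
import requests

# PageSpeed Insights API v5 endpoint (as of writing; an API key, passed as a
# "key" parameter, is recommended for anything beyond occasional checks).
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def field_core_web_vitals(page_url, strategy="mobile"):
    """Fetch real-user (field) metrics for a URL via the PageSpeed Insights API."""
    params = {"url": page_url, "strategy": strategy}
    response = requests.get(PSI_ENDPOINT, params=params, timeout=60)
    response.raise_for_status()
    data = response.json()
    # loadingExperience holds field data when Google has enough of it for the URL.
    experience = data.get("loadingExperience", {})
    return {name: metric.get("percentile")
            for name, metric in experience.get("metrics", {}).items()}

if __name__ == "__main__":
    # Hypothetical URL; substitute a page you want to monitor.
    results = field_core_web_vitals("https://www.example.com/")
    for metric, percentile in results.items():
        print(metric, percentile)
```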

4. Track And Analyze Your Performance

It goes without saying that you need to use digital analytics tools to track and analyze your performance. But it’s well worth reading: Beyond Rankings and Beyond Pageviews.

SEO professionals need to have a seat at the table when digital marketing teams decide which events to turn into “key events” in GA4.

Why? So, we can go beyond tracking which default channel was the source of website traffic and begin measuring which channels are generating leads or driving online sales.

5. Adapt To Emerging Trends

SEO marketers have been doing this for more than two decades. But it won’t hurt you to download the “State of SEO 2025.”

Summary: Adapting To The New Age Of Search

By following these strategic SEO insights and tactical advice, you can position yourself for success in the ever-evolving digital landscape – whether you remain at your current company or need to re-invent yourself at another one.

To close with another quote from Sun Tzu: “In the midst of chaos, there is also opportunity.”

Footnote:

1 Sullivan, D. (2002, March 4-6). Search Engine Strategies 2003 [Conference Handbook (p. 42)]. Hilton Boston Park Plaza, MA, United States.



Featured Image: Buravleva stock/Shutterstock