Google Search Relations team members recently shared insights about web standards on the Search Off the Record podcast.
Martin Splitt and Gary Illyes explained how these standards are created and why they matter for SEO. Their conversation reveals details about Google’s decisions that affect how we optimize websites.
Why Some Web Protocols Become Standards While Others Don’t
Google has formally standardized robots.txt through the Internet Engineering Task Force (IETF). However, they left the sitemap protocol as an informal standard.
This difference illustrates how Google determines which protocols require official standards.
“With robots.txt, there was a benefit because we knew that different parsers tend to parse robots.txt files differently… With sitemap, it’s like ‘eh’… it’s a simple XML file, and there’s not that much that can go wrong with it.”
This statement from Illyes reveals Google’s priorities. Protocols that are parsed inconsistently across platforms receive more attention than those that work well without formal standards.
The Benefits of Protocol Standardization for SEO
The standardization of robots.txt created several clear benefits for SEO:
Consistent implementation: Robots.txt files are now interpreted more consistently across search engines and crawlers.
Open-source resources: “It allowed us to open source our robots.txt parser and then people start building on it,” Illyes noted.
Easier to use: According to Illyes, standardization means “there’s less strain on site owners trying to figure out how to write the damned files.”
These benefits make technical SEO work more straightforward and more effective, especially for teams managing large websites.
Inside the Web Standards Process
The podcast also revealed how web standards are created.
Standards groups, such as the IETF, W3C, and WHATWG, work through open processes that often take years to complete. This slow pace ensures security, clear language, and broad compatibility.
Illyes explained:
“You have to show that the thing you are working on actually works. There’s tons of iteration going on and it makes the process very slow—but for a good reason.”
Both Google engineers emphasized that anyone can participate in these standards processes. This creates opportunities for SEO professionals to help shape the protocols they use on a daily basis.
Security Considerations in Web Standards
Standards also address important security concerns. When developing the robots.txt standard, Google included a 500-kilobyte limit specifically to prevent potential attacks.
Illyes explained:
“When I’m reading a draft, I would look at how I would exploit stuff that the standard is describing.”
This demonstrates how standards establish security boundaries that safeguard both websites and the tools that interact with them.
Why This Matters
For SEO professionals, these insights indicate several practical strategies to consider:
Be precise when creating robots.txt directives, since Google has invested heavily in this protocol.
Use Google’s open-source robots.txt parser to check your work.
Know that sitemaps offer more flexibility with fewer parsing concerns.
Consider joining web standards groups if you want to help shape future protocols.
As search engines continue to prioritize technical quality, understanding the underlying principles behind web protocols becomes increasingly valuable for achieving SEO success.
This conversation shows that even simple technical specifications involve complex considerations around security, consistency, and ease of use, all factors that directly impact SEO performance.
How can you structure content so that AI pulls your site into the response?
Do you really need to change your SEO strategy?
For years, SEO teams followed a familiar SEO playbook:
Optimize upper-funnel content to capture awareness,
mid-funnel content to drive consideration,
lower-funnel content to convert.
One page, one keyword, one intent.
But with the rise of ChatGPT, Perplexity, Copilot, Gemini, and now Google’s AI Mode, that linear model is increasingly outdated.
So, how do you move forward and keep your visibility high in modern search engine results pages (SERPs)?
We’ve reverse-engineered AI Overviews, so you don’t have to. Let’s dive in.
What We’ve Discovered Through Reverse Engineering Google’s AI Overviews (AIO)
From what we’re seeing across client industries and in how AI-driven results behave, the traditional funnel model – the idea of users moving cleanly from awareness to consideration to conversion – feels increasingly out of step with how people actually search.
How Today’s Search Users Actually Search
Today’s users jump between channels, devices, and questions.
They skim, abandon, revisit, and decide faster than ever.
AI Overviews don’t follow a tidy funnel because most people don’t either.
They surface multiple types of information at once, not because it’s smarter SEO, but because it’s closer to how real decisions get made.
AIOs & AI Mode Aren’t Just Answering Queries – They’re Expanding Them
Traditionally, SEO strategy followed a structured framework. Take a travel-related topic, for example:
Informational (Upper-Funnel) – “How to plan a cruise?”
Commercial (Mid-Funnel) – “Best cruise lines for families”
Transactional (Lower-Funnel) – “Find Best Alaska Cruise Deals”
However, AI Overviews don’t stick to that structure.
Instead, they blend multiple layers of intent into a single, comprehensive response.
How AI Overviews Answer & Expand Search Queries
Let’s stay with the travel theme. A search for “Mediterranean cruise” might return an AI Overview that includes:
Best Time to go (Informational).
Booking Your Cruise (Commercial).
Cruise Lines (Navigational).
AI Mode Example for ‘Mediterranean Cruise’
What’s Happening Here?
In this case, Google isn’t just answering the query.
It anticipates what the user will want to know next, acting more like a digital concierge than a traditional search engine.
We started noticing this behavior showing up more often, so we wanted to see how common it actually is.
To get a clearer picture, we pulled 200 cruise-related informational queries from SEMrush and ran them through our custom-built AI SERP scraper. The goal was to see how often these queries triggered AI Overviews, and what kind of intent those Overviews covered.
The patterns were hard to miss:
88% of those queries triggered an AI Overview
More than half didn’t just answer the initial question.
52% mixed in other layers of intent, like brand suggestions, booking options, or comparisons, right alongside the basic information someone might’ve been looking for.
Using a different query related to Mediterranean Cruises, the AIO response acts as a travel agent, guiding the user on topics like:
How to fly,
Destinations within the region,
Cruise prices,
Cruise lines that sail to that destination.
While it’s an informational, non-brand search query, the AIO response is lower-funnel as well.
Again, fewer than half of the queries received a response that matched only their original intent.
Here are some examples of queries that were identified as Informational and provided only the top-of-funnel response without driving the user further down the funnel.
The Verdict
Even when someone asks a simple, top-of-funnel question, AI is already steering them toward what to do next, whether that’s comparing prices, picking a provider, or booking a trip.
What Does This Mean for SEO Strategies Moving Forward?
If AI Overviews and AI Mode are blending intent types, content and SEO strategies need to catch up:
It’s no longer enough to rank for high-volume informational keywords. If your content doesn’t address multiple layers of intent, AI will fill the gaps with someone else’s content.
SEO teams need to analyze how AI handles their most important queries. What related questions is it pulling in? Are those answers coming from your site or your competitors?
Think beyond keyword volume. Long-tail queries may have lower search traffic, but they often align better with AI-cited content. Structure your pages with clear headings, bullets, and concise, helpful language—that’s what AI models prefer to surface.
The Future of SEO in an AI World: Hybrid Intent Optimization
The fundamentals of technical and on-page SEO still matter. But if your content is still built around single keywords and single intent types, you’re likely to lose visibility as AI continues to reshape the SERP.
The brands that adapt to this shift by creating content that mirrors the blended, fast-moving behavior of actual users are the ones that will continue to own key moments across the funnel, even as the funnel itself evolves.
As AI transforms search behavior, it’s crucial to adapt your SEO strategies accordingly. At DAC, we specialize in aligning your content with the latest search trends to enhance visibility and engagement. Reach out to us today to future-proof your strategy with our award-winning TotalSERP approach and stay ahead in the evolving digital landscape.
Optimize Your SEO For AI Search, Now: https://www.dacgroup.com/
Image Credits
Featured Image: Image by DAC. Used with permission.
In-Post Image: Images by DAC. Used with permission.
Site migration issues happen. You plan, create a staging site, and then when the site goes live, there’s bound to be something wrong.
Quality assurance gets thrust into overdrive the moment that migrations are complete.
You sift through thousands of pages, metadata, and more to fix any problems before someone else notices.
It takes a lot of time and work to feel confident that a site migration is complete without issues.
But, I’m going to show you how to identify migration issues quickly using Google Sheets and AI. You still have a lot to do (migration experts, rejoice!), but this script is going to help you:
Compare old and new ScreamingFrog crawls.
Identify immediate issues that you need to resolve.
SEOs have their own strategies and practices that they follow, and this script is going to allow you to QA migrations quickly based on your own requirements.
You can adapt the script below to make this work for you, whether you’re working on a small local business site or an enterprise.
Setting Everything Up With Screaming Frog And Google Sheets
I’m using Screaming Frog for this example because it makes it easy for me to export data for both sites.
We’re going to assume the following:
Your first version is your live website, which we’ll call the Old Crawl.
Your second version is your new site on a staging environment, which we’ll call New Crawl.
You’re going to create a Google Sheets file with the following sheets:
Overview.
Old Crawl.
New Crawl.
Once your sheet is set up properly, run your Screaming Frog scan using any settings that you like.
You’ll run the scan for both your Old and New Crawl and then import the data into the Old Crawl and New Crawl tabs in your sheet.
Your sheets will look something like this:
ScreamingFrog Export Crawl Results (Screenshot of Google Sheet, March 2025)
The New Crawl will look very similar.
Once you fill in both the New and Old Crawl sheets, you’ll need to populate your Overview sheet.
The table that you create in this sheet should contain the following columns:
Existing (old) URL.
New URL.
Status Code.
Indexability.
Title 1.
Meta Description 1.
H1-1.
H2-1.
Column 3.
Column 4.
Your Overview sheet will look something like this:
Migration QA Overview Sheet (Screenshot of Google Sheet, March 2025)
Once you have your sheets set up, it’s time to put your favorite AI to work to compare your data.
I used ChatGPT, but you can use any AI you like. I’m sure Claude, Deepseek, or Gemini would do equally as well as long as you use similar prompts.
Prompts To Create Your Google Sheets Data
You can fill in your Google Sheet formulas by hand if you’re a formula guru, but it’s easier to let AI do it for you since we’re making basic comparisons.
Remember, the Old Crawl is the live site, and the New Crawl is my staging site.
Now, go to your AI tool and prompt it with the following:
I need a Google Sheets formula that compares values between two sheets: "Old Crawl" and "New Crawl." The formula should:
Look up a value in column A of "Old Crawl" using the value in column A of the current sheet.
Look up a value in column A of "New Crawl" using the value in column B of the current sheet.
Find the corresponding column in both sheets by matching the column header in row 1 with the current column header.
If the values match, return "Pass".
If they don't match, return "Error (old<>new)" with the differing values shown.
Use TEXTJOIN("<>", TRUE, ...) to format the error message.
Ensure compatibility with Google Sheets by specifying explicit ranges instead of full-column references.
You can adjust these prompt points on your own.
For example, you can change “Old Crawl” to “Live Site,” but be sure that the sheet names match up properly.
ChatGPT generated code for me that looks something like this:
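The exact formula will vary with your ranges, but a version consistent with the prompt above (assuming URLs sit in column A of both crawl sheets, headers sit in row 1, and the data fits within A1:Z10000) looks roughly like this:

=IF(
  INDEX('Old Crawl'!$A$1:$Z$10000, MATCH($A2,'Old Crawl'!$A$1:$A$10000,0), MATCH(C$1,'Old Crawl'!$A$1:$Z$1,0)) =
  INDEX('New Crawl'!$A$1:$Z$10000, MATCH($B2,'New Crawl'!$A$1:$A$10000,0), MATCH(C$1,'New Crawl'!$A$1:$Z$1,0)),
  "Pass",
  "Error (" & TEXTJOIN("<>", TRUE,
    INDEX('Old Crawl'!$A$1:$Z$10000, MATCH($A2,'Old Crawl'!$A$1:$A$10000,0), MATCH(C$1,'Old Crawl'!$A$1:$Z$1,0)),
    INDEX('New Crawl'!$A$1:$Z$10000, MATCH($B2,'New Crawl'!$A$1:$A$10000,0), MATCH(C$1,'New Crawl'!$A$1:$Z$1,0))
  ) & ")"
)

The header in row 1 tells the formula which field to compare, so the same formula works across the Status Code, Title, and other comparison columns.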
You can use these basic formulas to start comparing rows by pasting the formula in row 2.
Adding the formula is as simple as double-clicking the field and pasting it in.
I know that you’ll want to make this a little more complex. You can do a lot of things with Google Sheets and formulas, so tweak things as needed.
Ideas For Expanding Your Migration Sheet
Your formulas will depend on the settings of your Screaming Frog crawl, but here are a few that I think will work well:
Create a function to compare all of the status codes between the Old Crawl and New Crawl to identify key issues. For example, if a page returns anything but a 200 code, you can highlight the issue to fix it quickly (see the sketch after this list).
Add a formula to highlight metadata that is too long or short, so that you can add it to your task list for when the audit is over.
Create a function to monitor Response Time between both the Old and New Crawl so that you can identify any issues that the new crawl may have or report speed increases if switching to a new host or server.
Create another function to compare the URL structure of each URL. You might compare trailing slashes, structure and more.
Develop a new function for Inlinks to be sure that no internal links were lost in the migration. You can also check external links using the same concept.
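As a rough illustration of the first two ideas (the column letters, the 70-160 character window for meta descriptions, and the sheet layout are all assumptions; adjust them to your own export), a status-code flag and a meta description length check might look like:

=IF('New Crawl'!C2=200, "OK", "Fix: " & 'New Crawl'!A2 & " returns " & 'New Crawl'!C2)

=IF(LEN('New Crawl'!F2)>160, "Too long", IF(LEN('New Crawl'!F2)<70, "Too short", "OK"))

The same pattern extends to response times, URL structure, or inlink counts: pick the column, define the rule, and flag anything that falls outside it.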
Google’s Martin Splitt recently shared insights on how JavaScript mistakes can hurt a website’s search performance.
His talk comes as Google Search Advocate John Mueller also urges SEO pros to learn more about modern client-side technologies.
Mistake 1: Rendered HTML vs. Source HTML
During the SEO for Paws Conference, a live-streamed fundraiser by Anton Shulke, Splitt drew attention to a trend he’s noticing.
Many SEO professionals still focus on the website’s original source code even though Google uses the rendered HTML for indexing. Rendered HTML is what you see after JavaScript has finished running.
Splitt explains:
“A lot of people are still looking at view source. That is not what we use for indexing. We use the rendered HTML.”
This is important because JavaScript can change pages by removing or adding content. Understanding this can help explain some SEO issues.
Mistake 2: Error Pages Being Indexed
Splitt pointed out a common error with single-page applications and JavaScript-heavy sites: they often return a 200 OK status for error pages.
This happens because the server sends a 200 response before the JavaScript checks if the page exists.
Splitt explains:
“Instead of responding with 404, it just responds with 200 … always showing a page based on the JavaScript execution.”
When error pages get a 200 code, Google indexes them like normal pages, hurting your SEO.
Splitt advises checking server settings to handle errors properly, even when using client-side rendering.
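Where fixing the server response isn’t immediately possible, one widely used stopgap is to have the client-side code flag missing content for crawlers. A minimal sketch, assuming a product route whose data comes from an API (the endpoint and render functions are placeholders, not part of Splitt’s talk):

function loadProduct(productId) {
  fetch('/api/products/' + productId).then((response) => {
    if (response.status === 404) {
      // The data is gone: mark the soft 404 as noindex and show an error view
      // instead of rendering a normal-looking page that Google would index.
      const meta = document.createElement('meta');
      meta.setAttribute('name', 'robots');
      meta.setAttribute('content', 'noindex');
      document.head.appendChild(meta);
      renderNotFoundView(); // placeholder: your app's "not found" view
    } else {
      response.json().then(renderProduct); // placeholder: your app's normal render
    }
  });
}

The cleaner fix, as Splitt suggests, is still to configure the server or prerendering layer to return a real 404 for URLs that don’t exist.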
Mistake 3: Geolocation Request Issue
Another problem arises when sites ask users for location or other permissions.
Splitt says Googlebot always refuses these permission requests, so sites that rely on geolocation (or similar prompts) without a fallback run into trouble.
Splitt explains:
“Googlebot does not say yes on that popup. It says no on all these requests … so if you request geolocation, Googlebot says no.”
Without alternative content, the page can appear blank to Googlebot, meaning nothing gets indexed. This can turn into a serious SEO mistake.
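A minimal sketch of the kind of fallback that avoids this (the two render functions are placeholders for whatever your page actually shows):

// Googlebot declines permission prompts, so the error callback is the path it will take.
navigator.geolocation.getCurrentPosition(
  (position) => {
    // Permission granted: show location-specific content.
    renderNearbyResults(position.coords.latitude, position.coords.longitude); // placeholder
  },
  () => {
    // Permission denied or unavailable (including Googlebot): show useful default content, not a blank page.
    renderDefaultResults(); // placeholder
  }
);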
How to Debug JavaScript for SEO
Splitt shared a few steps to help diagnose and fix JavaScript issues:
Start with Search Console: Use the URL Inspection tool to view the rendered HTML.
Check the Content: Verify if the expected content is there.
Review HTTP Codes: Look at the status codes in the “More info” > “Resources” section.
Use Developer Tools: Open your browser’s developer tools. Check the “initiator” column in the Network tab to see which JavaScript added specific content.
Splitt adds:
“The initiator is what loaded it. If it’s injected by JavaScript, you can see which part of the code did it.”
Following these steps can help you find the problem areas and work with your developers to fix them.
See Splitt’s full talk in the recording below:
A Shift in SEO Skills
Splitt’s advice fits with Mueller’s call for SEOs to broaden their skill set.
Mueller recently suggested that SEO professionals learn about client-side frameworks, responsive design, and AI tools.
Mueller stated:
“If you work in SEO, consider where your work currently fits in … if your focus was ‘SEO at server level,’ consider that the slice has shrunken.”
Modern JavaScript techniques create new challenges that old SEO methods cannot solve alone. Splitt’s real-world examples show why understanding these modern web practices is now critical.
What This Means For SEO Professionals
Both Google Advocates point to a clear trend: SEO now requires more technical skills. As companies look for professionals who can blend SEO and web development, the demand for these modern skills is growing.
To keep up, SEO pros should:
Learn How JavaScript Affects Indexing: Know the difference between source and rendered HTML.
Master Developer Tools: Use tools like Search Console and browser developer tools to spot issues.
Collaborate with Developers: Work together to build sites that serve users and search engines well.
Broaden Your Skillset: Add client-side techniques to your traditional SEO toolkit.
Looking Ahead
As the web evolves, so must the skills of SEO professionals. However, leveling up your knowledge doesn’t have to be intimidating.
This fresh look at JavaScript’s role in SEO shows that even simple changes can have a big impact.
“Investigated a number of 404s recorded on a client website.
And a significant amount were generated by an AI service, which appears to have just made up articles, and URLs, in citations. It isn’t even using the right URL structure 🤦‍♂️
Debating the value of redirects and any potential impact.”
Thornton adds:
“On one hand, mistakes by more obscure AI bots might not seem worth correcting for the sake of adding more redirects. On the other, if it’s a relatively small client with a high value for conversions, even a couple of lost sales due to the damage to the brand will be noticeable.”
Google’s Perspective
Predicting an increase in hallucinated links, Google Search Advocate John Mueller offers guidance that can help navigate this issue.
First, he recommends having a good 404 page in place, stating:
“A good 404 page could help explain the value of the site, and where to go for more information. You could also use the URL as a site-search query & show the results on the 404 page, to get people closer.”
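A rough sketch of that site-search idea on a 404 template could look like this (the element ID and the /search endpoint are placeholders for your own site, not Mueller’s implementation):

// Turn the broken URL's path into a site-search query and offer it as a link on the 404 page.
const query = decodeURIComponent(window.location.pathname)
  .replace(/[\/_-]+/g, ' ')
  .trim();
const searchLink = document.querySelector('#search-suggestion'); // placeholder element in your 404 template
if (searchLink && query) {
  searchLink.href = '/search?q=' + encodeURIComponent(query);
  searchLink.textContent = 'Search this site for "' + query + '"';
}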
Before investing in solutions, he recommends collecting data.
Mueller states:
“I wonder if this is going to be a more common thing? It’s tempting to extrapolate from one off [incidents], but perhaps it makes sense to collect some more data before spending too much on it.”
In a follow-up comment, Mueller predicted:
“My tea leaves say that for the next 6-12 months we’ll see a slight uptick of these hallucinated links being clicked, and then they’ll disappear as the consumer services adjust to better grounding on actual URLs.”
Don’t Hope For Accidental Clicks
Mueller provided a broader perspective, advising SEO professionals to avoid focusing on minor metrics.
“I know some SEOs like to over-focus on tiny metrics, but I think sites will be better off focusing on a more stable state, rather than hoping for accidental by-clicks. Build more things that bring real value to the web, that attract & keep users coming back on their own.”
What This Means
As AI adoption grows, publishers may need to develop new strategies for mitigating hallucinations.
Ammon Johns, recognized as a pioneer in the SEO industry, offers a potential solution to consider.
“I think any new custom 404 page should include a note to anyone that arrived there from an AI prompt to explain hallucinations and how AI makes so many of them you’ve even updated your site to warn people. Always make your market smarter – education is the ultimate branding.”
It’s too early to recommend a specific strategy at this time.
Mueller advises monitoring these errors and their impact before making major changes.
John Mueller, a Google Search Advocate, suggests that SEO professionals should reconsider how their work fits into the modern web stack.
He references a “vibes-based” visualization highlighting how developers’ focus areas have shifted.
Mueller notes a disconnect between what industry pros pay attention to (such as JavaScript frameworks, performance optimizations, or new AI-driven tech) and what online businesses need.
However, he sees this as an opportunity for SEO professionals. He provides advice on staying relevant amid shifting business priorities.
Changing Business Priorities
Laurie Voss, VP of Developer Relations at Llama Index, shared a chart showing the areas of focus of software professionals from 1990 to 2025.
Screenshot from: Seldo.com, March 2025.
In the early days, developers were mainly concerned with hardware and networking. By the mid-2000s, the focus shifted to HTML, CSS, and server technologies. More recently, we’ve seen a move toward client frameworks, responsive design, and AI-powered development.
Although the data is subjective, Mueller highlights its value for SEOs. It shows how quickly areas like server-level work have become less critical for average web developers.
Mueller’s Take
Mueller’s point is straightforward: as web development changes, SEO must change, too. The skills that made you valuable five years ago might not be enough today.
Screenshot from: Seldo.com, March 2025.
Mueller says:
“If you work in SEO, consider where your work currently fits in with a graph like this. It’s not an objective graph based on data, but I think it’s worth thinking about how your work could profit from adding or shifting “tracks.””
He adds:
“What the average web developer thinks about isn’t necessarily what’s relevant for the “online business” (in whichever form you work). Looking at the graph, if your focus was “SEO at server level,” consider that the slice has shrunken quite a bit already.”
This matches Voss’s argument in the article “AI’s effects on programming jobs.”
Voss believes AI won’t kill development jobs but will create a new abstraction layer, changing how work is done. The same likely applies to SEO work.
What Should SEO Pros Focus On?
Reading between the lines of Mueller’s comment and the chart, several areas stand out for SEOs to develop:
Mobile performance skills
Working with AI tools
Understanding responsive design
Knowledge of client-side frameworks and how they affect SEO
Prompt engineering
In other words, step outside server-level optimizations and focus on client-side rendering and user experience elements.
Our Take At Search Engine Journal
Mueller’s advice hits home for us at SEJ. We’ve watched SEO evolve firsthand.
Not long ago, technical SEO mostly meant handling sitemaps, robots.txt files, and basic schema markup. Now, we’re writing about JavaScript rendering, Core Web Vitals, and AI content evaluation.
The most successful industry pros are those who expand their technical knowledge rather than stick to outdated practices. Those who understand traditional optimization and new web technologies will continue to thrive as our industry changes.
Mueller’s reminder to adapt isn’t just sound advice; it’s essential for staying relevant in search.
If you stay organized, leverage tools to scale, and pay attention to details, you will have done everything you can to ensure business continuity in the short, medium, and long term pertaining to organic performance.
Aside from the technical aspects, an enterprise migration more often than not comes with the added pressures of:
Strong levels of C-level/VP-level attention and communications.
Multiple project teams and stakeholders making SEO-impacting decisions.
SEO pros needing to be involved in “non-traditional” SEO calls and planning meetings.
In a large site migration, there is also the increased potential for something known as “migration lag.”
What Is Migration Lag?
Migration lag refers to the time period after launching a new website where traffic and rankings drop as search engines discover and index the new site.
For huge enterprise sites with hundreds of thousands of URLs, this lag can last for months.
To minimize migration lag, you must have a solid redirect strategy before the new site launches. This means:
Prioritizing redirects for high-traffic and high-value pages. Focus on redirecting pages that drive the most traffic and revenue first.
Using wildcards to redirect categories of pages. For example, redirect /product/* to /new-site/all-products/.
Including URL parameters in redirects. Make sure redirects pass on any query parameters, like /product/123?color=red to /new-site/product/123?color=red.
Breaking redirect chains. If a page has been redirected multiple times, point the final redirect to the new destination URL.
Redirecting backlinks. Find all links pointing to the old site and set up redirects so they point to the proper new pages. This preserves the link equity you’ve built up.
Accounting for recent redirects. If you’ve done any redirects in the past six months, set up new redirects to point those pages to the proper new URLs.
With technical SEO savvy and patience, you can navigate an enterprise website migration with minimal traffic and rankings loss.
Stay on top of your redirects and keep optimizing and reacting to your data and Google’s ever-changing search engine results pages (SERPs), and search traffic will return to normal.
Soft-Launch Pre-Migration
In June 2023, John Mueller floated the idea of launching a new domain “early” before the official migration switchover.
This was interesting, as the general best practice narrative has been to not let the new domain be open to crawling before the migration switchover date.
As with any new recommendation, this is something I’ve tested since on personal project sites and with client moves.
Testing has shown that indexing has happened faster for the new domain, especially when compared to the domains in my “How Long Should A Migration Take” study.
In the Google Search Console screenshot below, I migrated a domain on January 28, but I put the new domain live and crawlable/indexable from January 21.
By February 1, the new domain was 100% indexed, and Google had even crawled and processed all the /feed URLs that were set to noindex.
Screenshot from Google Search Console, February 2025
While this was a small website (1,300 URLs), the data was similar to other domain migrations and subdomain to subfolder migrations I’ve taken this approach with.
The most common pushback I’ve had to this approach has been the wider business desire to “make a splash” with PR around the launch, and the risk of an existing customer finding and sharing the new site early, which could cause problems that diminish any benefits gained.
The second most common pushback, which is valid, is if there have been substantial changes to content, product, or brand that need to remain under embargo until the scheduled launch date.
Defining The Migration Strategy
Once you’ve audited your existing site and redirects, it’s time to map out how you want to handle the migration.
The strategy you develop now will determine how seamless this transition is for both your users and search engines.
Define Goals
What do you want to achieve with this migration? Are you aiming to consolidate domains, move to a new content management system (CMS), restructure content, or some combination of these?
Be very clear on your objectives so you can develop the best approach.
Prioritize Redirects
With hundreds of thousands of URLs, you’ll need to determine which redirects are most critical to implement first. Focus initially on:
Your most important pages (homepage, product pages, etc.).
Pages that generate a substantial amount of website leads/revenue, either directly or indirectly.
Pages that generate the most organic traffic for the website.
Pages with strong backlink profiles. Those that are crawled frequently by Google/other search engines should be prioritized above those with bigger backlink clusters – but this is a judgment call you will need to make for your own site.
Once the high-priority redirects are handled, work your way down from there. Don’t worry about redirecting every single URL right away.
As long as you have the majority of important pages and traffic accounted for, the remaining redirects can be added over time.
A great way to prioritize redirects is to create a dashboard of all the relevant data you wish to prioritize by (such as the examples I’ve given above), build a ranking matrix with RANK.EQ in Google Sheets, and then add a prioritization categorizer.
The example below is a very simplified version of this. First, you want to collate all your data at the URL level:
Image by author, February 2025
You then want to rank these values against their individual metric data sets. To do this, you use =RANK.EQ(VALUE,VALUE RANGE).
This lets you see which URLs are in the higher percentile and which ones are in the lower percentile:
Image by author, February 2025
You then want to automate batch assignment, and this requires three steps.
First, a “reverse RANK.EQ”, which would be:
=COUNT(A:A) - RANK.EQ(A1, A:A) + 1
This will tell you which URLs are “the best” based on all four metric ranks combined:
Image by author, February 2025
From here, you can either convert the Overall EQ to percentages in another column and then apply a rule so that values above or below certain thresholds fall into different batches (see the sketch below).
You can also split the rows up by volume ordered by the Overall EQ if you have redirect limits (like when moving to Salesforce Commerce Cloud or SAP Hybris; read more below).
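As a simplified sketch of the threshold rule (assuming the combined Overall EQ sits in column F, rows 2 to 1000, and that a higher score means a higher priority; adjust the ranges and cut-offs to your own data):

=IFS(
  PERCENTRANK($F$2:$F$1000, F2) >= 0.9, "Batch 1 - launch day",
  PERCENTRANK($F$2:$F$1000, F2) >= 0.6, "Batch 2 - week one",
  TRUE, "Batch 3 - post-launch"
)

PERCENTRANK keeps the thresholds stable as you add or remove URLs, which is handy when the redirect list is still changing close to launch.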
Map Content And URL Structure
Determine how you want to reorganize or restructure your content on the new site.
Map out which existing URLs will redirect to which new destinations. Group related content and consolidate where possible.
In some cases, temporary 302 redirects may make sense, especially if the page content is still being migrated.
Be very careful when using wildcards, and always do spot checks to ensure there are no 404 errors. Redirect parameters whenever possible to avoid duplicate content issues.
Backlinks
Make a list of any pages with strong backlink profiles and ensure they are redirected properly. Reach out to webmasters linking to those pages and let them know the new URL to see if they will update the link on their page.
This helps to preserve the SEO value built up over time.
With careful planning and strategic prioritizing, you can migrate an enterprise website and put the necessary redirects in place without (too much) chaos. But go slowly; this is not a task to rush!
Think through each step and check your work along the way.
Establishing The Migration Project Timelines
When managing a large website migration, establishing realistic timelines is crucial.
Trying to redirect hundreds of thousands of URLs in a short timeframe is a recipe for disaster.
You need to plan ahead and be strategic in how you phase the work.
It’s also very important that migration timelines are a collaborative effort involving all stakeholders.
Far too often, the business sets an arbitrary deadline without considering whether all teams can comfortably complete the necessary actions in time.
Avoid Phased/Partial Migrations
Avoiding phased or partial migrations is crucial when managing redirects for an enterprise website. Piecemealing your migration will only create more work and headaches down the road.
I worked on a migration in the past two years that was consolidating multiple domains (products) under a new umbrella domain, and the original plan was to do one after the other in a phased approach.
More than a year later, the second domino still hasn’t fallen. Google has started to rank the umbrella domain for products in the group it isn’t optimized for – causing internal domain cannibalization and performance issues as the brand entity is “fractured” across multiple domains.
Prior to this, I’d never witnessed a phased or partial migration mitigate the performance risks that cautious decision-makers believed it would.
Do It All At Once
The best approach is to redirect all URLs at the same time. This ensures:
No pages are left orphaned without a redirect in place.
There are no redirect chains created that need to be cleaned up later. Redirect chains can negatively impact SEO and user experience.
All backlinks point to the proper new destination page. If done in phases, old pages may accumulate new backlinks that then need to be redirected.
Setting Up 301 Redirects At Scale
At an enterprise level, setting up 301 redirects for tens or hundreds of thousands of URLs requires some strategic planning.
Here are some tips for tackling this at scale:
Using Wildcards And Handling Parameter URLs
When managing redirects for an enterprise website, wildcards and parameters become your best friends. With so many URLs, creating individual redirects for each would be an endless task.
Wildcards allow you to redirect groups of pages at once.
Say you have product pages like /product/abc123, /product/def456, /product/ghi789. You can set up a wildcard redirect like /product/* to point to the new /products page.
This single redirect will capture all product pages and send visitors to the right place.
Parameters, like IDs, SKUs, or dates, often change when site content gets updated or reorganized.
Rather than tracking down each instance of an old parameter to redirect it, use a redirect that includes the parameter.
For example, if you have a URL like /blog/post?id=123 that is now /news/story/123, set up the redirect /blog/post?id= to point to /news/story/.
This will catch any page with that parameter pattern and send visitors to the new structure.
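To make that concrete, here is a minimal sketch of both patterns as Apache mod_rewrite rules, using the illustrative paths from above (this assumes an Apache server; Nginx, CDNs, and most CMS platforms use their own syntax):

RewriteEngine On

# Wildcard: send every old /product/... URL to the new products hub
RewriteRule ^/?product/.+$ /products [R=301,L]

# Parameter: send /blog/post?id=123 to /news/story/123 (and drop the old query string)
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^/?blog/post$ /news/story/%1? [R=301,L]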
When used properly at an enterprise scale, wildcards and parameters can:
Save countless hours of manual redirect creation and maintenance.
Ensure no page is left behind during a migration or site architecture change.
Continue to capture new pages that match the pattern as the site grows and evolves.
Be very careful when using wildcards and parameters in your redirects. Test them thoroughly to ensure no unintended pages are caught in the net.
Monitor them regularly, even after launch, to catch any issues early. Used responsibly, though, they are indispensable tools for managing redirects at an enterprise level.
Breaking Redirect Chains
Redirect chains can easily form when you have a high volume of redirects on an enterprise website.
A redirect chain occurs when a URL redirects to another URL that also redirects, creating a chain of multiple redirects to reach the final destination page.
To avoid redirect chains, you’ll need to trace back through your recent redirect history to find the original source URL. Once you identify the initial URL that started the chain, redirect it directly to the final destination page.
This will cut out all the middle redirects in the chain and provide a much better user experience.
Check your server log files to view URL redirect histories from the past three to six months. Look for any patterns of the same URL redirecting multiple times.
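If you have raw access logs in the standard combined format, a quick command-line pass can surface the most frequently redirected paths (the field positions below assume that format; adjust them to your own logs):

awk '$9 ~ /^30[1278]$/ {print $7}' access.log | sort | uniq -c | sort -rn | head -50

Feeding the most common redirected URLs into a crawler then shows whether any of them hop more than once before reaching their final destination.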
Use a redirect crawler tool to automatically detect redirect chains on your site. These tools will crawl your site and log any series of multiple redirects for the same URL.
For recent redirects less than 180 days old, double-check that the original URL is now redirecting properly to the correct final destination. Newer redirects have a higher chance of issues, so verifying them will help avoid future problems.
If you discover broken redirect chains, fix them by redirecting the initial source URL directly to the last destination URL in the chain. Remove any middle redirects that are no longer needed.
Test all fixes to ensure the redirect chain is fully broken and the user experience is improved. Check that SEO rankings and traffic have stabilized for the URLs involved.
By diligently detecting and breaking redirect chains, you’ll provide a much better overall experience for your users and site visitors.
Your enterprise website will function more efficiently, and you’ll avoid potential drops in search rankings and traffic.
Historic Redirects
When migrating an enterprise website, it’s easy to forget about redirects that were already in place. These historic redirects, especially those under six months old, still need to be accounted for to avoid traffic loss.
As you audit your site’s current redirects, make a list of any that point to pages that will be changing or removed in the migration.
These redirects will need to be updated to point to the new destination URLs. Some things to look for include:
Temporary event pages that now redirect to a general section.
Product pages that now redirect to an updated model.
Blog posts that redirect to a category archive.
Double-check that any historic redirects over six months old still need to be in place. Some may be sending signals to search engines that are no longer needed.
Removing unnecessary historic redirects will also help to simplify your site’s redirect structure and make it easier to manage going forward.
When setting up your migration’s redirect plan, be sure to factor in updating any historic redirects to their new destination URLs.
Leaving these behind could result in lost traffic and rankings for important pages on your site.
Staying on top of your enterprise website’s historic and new redirects during migration is key to a successful transition with minimal SEO impact.
Overcoming Redirect Limits
If you have an enterprise website with hundreds of thousands of pages, you may run into issues with redirect limits from your CMS or ecommerce platform.
Many systems like SAP Hybris and Salesforce Commerce Cloud cap the number of redirects you can have at 50,000 to 100,000. For a major website migration (especially enterprise ecommerce websites), this likely won’t cut it.
To get around these constraints, you’ll need to get creative. A few options to consider:
Use wildcard redirects to capture categories of pages. For example, redirect /products/* to /shop/*. This single redirect will capture all pages that start with /products.
Exclude parameters from redirects when possible. If you have pages like /product-name?color=red and /product-name?size=large, redirect only /product-name to the new URL. The parameters are often not indexed or linked to, so you can leave them out of the redirect.
Break up redirect chains. If you have a series of three+ redirects for a single page, break up the chain and create direct redirects from the initial URLs to the final destination. Historically, chained redirects were thought to pass along link juice, but this has been proven false. Keep redirects as direct as possible.
Prioritize mission-critical pages. When you start to reach the redirect limit, focus on redirecting pages that drive significant traffic and revenue. You can leave less important pages unredirected or with a 404 error temporarily.
Ask your CMS vendor about increasing limits. Many systems will increase redirect limits on an enterprise website if you ask and explain your needs. Be prepared to pay additional fees for this add-on.
With creative thinking and persistence, you can overcome most redirect limits and complete an enterprise website migration without losing a big chunk of your organic traffic.
The key is having a well-thought-out redirect strategy and implementing it well before you hit your CMS’s limits.
Once the redirects have been implemented, it’s time to see how your organic traffic and rankings have been impacted.
Benchmarking Your Progress
This will help determine if any further optimization is needed. Here are a few key metrics to monitor:
Organic search traffic. Compare traffic from major search engines like Google before and after the migration. Expect some initial drop in traffic, but it should start recovering within one to two months. If traffic is still significantly lower after three months, revisit your redirect implementation.
Keyword rankings. Track the rankings of your most important keywords to see if their positions have changed. Drops in ranking are common after a migration, but rankings should start improving again over time as search engines recrawl your new site. Major drops that don’t improve could signal redirect or content issues that need to be addressed.
Indexation. Use a tool like Google Search Console to check how much of your new site has been indexed. A large, complex site can take three to six months for Google to fully recrawl and re-index. Look for steady increases in indexation over time. If indexation stalls or drops, there may be technical issues preventing Google from accessing parts of your site.
404 errors. Monitor your 404 errors to ensure any broken links are redirecting properly. High numbers of 404s, especially old URLs, indicate redirects that need to be created or fixed.
Backlinks. Do a backlink audit to verify that any links pointing to your old site now redirect to the proper new URLs. Failure to redirect backlinks is a common cause of traffic and ranking loss after a website migration.
Regularly benchmarking these key metrics will give you valuable insight into how well your enterprise website migration and redirect implementation is going.
Make adjustments as needed to get your new site’s organic performance back on track.
Communicating Migration Performance To The C-Level
Communicating migration performance to leadership is crucial for continued support and investment in your enterprise website.
Even if the migration itself goes smoothly, problems can arise after launch if the C-suite isn’t on board.
Not all metrics need to matter directly to SEO, but giving the C-level more data and clarity can help prevent knee-jerk reactions and bad decisions from being imposed on the migrations team.
Be transparent that there may be an initial dip in metrics as the new site establishes itself. Having targets will help determine if the migration is meeting business needs after things settle in.
Share Detailed Reports
In the months following the migration, provide regular reports on how the new site performs compared to the old site and the established KPIs.
Compare these same metrics from the old site to give context on progress. Be open about any issues, and have solutions and next steps ready to propose.
It often helps to create a Looker Studio report so the C-level has instant access to data and feels as though they have some control over the situation.
Finally, Don’t Forget To Highlight Wins
While reporting on challenges is important, it is also important to showcase successes from the migration.
Promoting wins, big and small, demonstrates the value of the investment in the migration and builds confidence in your team.
Keeping leadership regularly informed about how the new enterprise website is performing is essential.
With open communication and a mix of progress reports and wins, executives will remain supportive and engaged in optimizing the site to achieve the best results.
Google has released a new episode in its “SEO Office Hours Shorts” video series, in which Developer Advocate Martin Splitt addresses a question many website owners face: Should all 404 error pages be redirected to the homepage?
The Clear Answer: Don’t Do It
In the latest installment of the condensed Q&A format, Splitt responds to a question from a user named Chris about whether “redirecting all 404 pages to the homepage with 301 redirects can have a negative impact on rankings or overall website performance in search.”
Splitt’s response was unambiguous: “Yes, and also it annoys me as a user.”
Why 404s Serve A Purpose
404 error pages signal to users and search engine crawlers that a URL is broken or nonexistent. This transparency helps people understand what they’re dealing with rather than being unexpectedly redirected to an unrelated page.
Splitt explained:
“A 404 is a very clear signal this link is wrong and broken or this URL no longer exists because maybe the product doesn’t exist or something has changed.”
Impact on Search Crawlers
Splitt says blanket redirects to the homepage can disrupt search engine crawlers’ efficiency.
When crawlers encounter a legitimate 404, they recognize that the content no longer exists and can move on to other URLs. However, redirecting them to the homepage creates a confusing loop.
Splitt noted:
“For a crawler, they go like homepage and then click through or basically crawl through your website, finding content, and eventually they might run into a URL that doesn’t exist.
But if you redirect, they’re kind of like being redirected, and then it all starts over again.”
Best Practices for Handling Missing Content
Splitt offered clear guidance on proper redirects:
If content has moved to a new location, use a redirect to that specific new URL
If content is truly gone, maintain the 404 status code
Don’t redirect to the homepage or what you think is the “closest” match
Splitt emphasized:
“If it moved somewhere else, use a redirect. If it’s gone, don’t redirect me to the homepage.”
This latest guidance aligns with Google’s longstanding recommendation to maintain accurate HTTP status codes to help users and search engines understand your site structure.
New Format
The SEO Office Hours Shorts format is a new approach from Google’s Search Relations team.
The original format was a live show where anyone could call in and get their questions answered in real time.
This format then transitioned to recorded sessions where Google personnel responded to a selection of pre-approved questions.
Now, SEO Office Hours is presented as short videos. If you prefer one of the previous formats, Splitt encourages feedback in the comments section of the video below:
Featured Image: Screenshot from YouTube.com/GoogleSearchCentral, March 2025.
Google has updated its Search Central documentation to include details about AI Mode in its robots meta tag specifications.
This update coincides with the rollout of AI Mode via Google Labs.
Document Updates For AI Mode
The updated document now specifies that you can control the appearance of your content in Google’s AI-powered features, including AI Overviews and the new AI Mode.
Key additions to the document include:
Nosnippet rule expansion: The nosnippet directive “applies to all forms of search results (at Google: web search, Google Images, Discover, AI Overviews, AI Mode) and will also prevent the content from being used as a direct input for AI Overviews and AI Mode.”
Max-snippet controls: The max-snippet rule specifies that limitations apply to “all forms of search results (such as Google web search, Google Images, Discover, Assistant, AI Overviews, AI Mode) and will also limit how much of the content may be used as a direct input for AI Overviews and AI Mode.”
What Is AI Mode?
AI Mode is an experimental feature initially available to Google One AI Premium subscribers. It leverages a custom version of Gemini 2.0 to deliver a search experience fully powered by AI.
The feature employs a “query fan-out” technique that issues multiple related searches across subtopics and data sources before synthesizing a comprehensive response.
Google highlights the following advantages of AI Mode over standard AI Overviews:
Handles complex, multi-part questions that might otherwise require multiple searches
Supports follow-up questions for continued conversations
Synthesizes information from multiple data sources simultaneously
Provides multimodal interaction capabilities through voice, text, or images
What This Means For Your Website
Publishers rely on website traffic to generate revenue through ads, subscriptions, or conversions. However, people may be less inclined to visit the original site when AI features summarize the content.
To counter this, you can use the “nosnippet” tag to block content from AI responses or use “max-snippet:[number]” to limit the amount of text displayed, motivating searchers to visit the site for complete information.
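Both are standard robots meta tags placed in the page’s head; the 160-character value below is just an illustrative limit:

<meta name="robots" content="nosnippet">
<meta name="robots" content="max-snippet:160">

The nosnippet version keeps the page’s content out of snippets, AI Overviews, and AI Mode entirely, while max-snippet allows a snippet but caps how much text can be shown or reused.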
Looking Ahead
The robots meta tag controls offer ways to manage content in traditional search results and AI experiences.
Google’s cautious rollout of AI Mode shows that the company is aware of publishers’ concerns about content use in generative AI applications.
This update reflects Google’s effort to balance new features with publishers’ control over their content.