A Guide To Enterprise-Level Migrations (100k + URLs) via @sejournal, @TaylorDanRW

An enterprise website migration is no small feat.

We’re talking hundreds of thousands of URLs and years of SEO equity on the line.

To pull it off without traffic loss, you need a solid redirect strategy.

With the right approach, you can migrate an enterprise website without losing traffic or search rankings.

If you stay organized, leverage tools to scale, and pay attention to the details, you will have done everything you can to ensure business continuity for organic performance in the short, medium, and long term.

Aside from the technical aspects of migration, an enterprise migration, more often than not, comes with the added pressures of:

  • High levels of C-level/VP-level attention and communication.
  • Multiple project teams and stakeholders making SEO-impacting decisions.
  • SEO pros needing to be involved in “non-traditional” SEO calls and planning meetings.

In a large site migration, there is also the increased potential for something known as “migration lag.”

What Is Migration Lag?

Migration lag refers to the time period after launching a new website where traffic and rankings drop as search engines discover and index the new site.

For huge enterprise sites with hundreds of thousands of URLs, this lag can last for months.

To minimize migration lag, you must have a solid redirect strategy before the new site launches. This means:

  • Prioritizing redirects for high-traffic and high-value pages. Focus on redirecting pages that drive the most traffic and revenue first.
  • Using wildcards to redirect categories of pages. For example, redirect /product/* to /new-site/all-products/.
  • Including URL parameters in redirects. Make sure redirects pass on any query parameters, like /product/123?color=red to /new-site/product/123?color=red.
  • Breaking redirect chains. If a page has been redirected multiple times, point the final redirect to the new destination URL.
  • Redirecting backlinks. Find all links pointing to the old site and set up redirects so they point to the proper new pages. This preserves the link equity you’ve built up.
  • Accounting for recent redirects. If you’ve done any redirects in the past six months, set up new redirects to point those pages to the proper new URLs.

With technical SEO savvy and patience, you can navigate an enterprise website migration with minimal traffic and rankings loss.

Stay on top of your redirects and keep optimizing and reacting to your data and Google’s ever-changing search engine results pages (SERPs), and search traffic will return to normal.

Soft-Launch Pre-Migration

In June 2023, John Mueller floated the idea of launching a new domain “early” before the official migration switchover.

This was interesting, as the general best practice narrative has been to not let the new domain be open to crawling before the migration switchover date.

As with any new recommendation, this is something I’ve tested since on personal project sites and with client moves.

Testing has shown that indexing has happened faster for the new domain, especially when compared to the domains in my “How Long Should A Migration Take” study.

In the Google Search Console screenshot below, I migrated a domain on January 28, but I put the new domain live and crawlable/indexable from January 21.

By February 1, the new domain was 100% indexed, and Google had even crawled and processed all the /feed URLs that were set to noindex.

Screenshot from Google Search Console, February 2025

While this was a small website (1,300 URLs), the data was similar to other domain migrations and subdomain to subfolder migrations I’ve taken this approach with.

The most common pushback I’ve had to this approach has been the wider business desire to “make a splash” with PR around the launch, combined with the risk of an existing customer finding the new site early. If a customer finds and shares the new site, the resulting problems can diminish any benefits gained.

The second most common pushback, which is valid, is if there have been substantial changes to content, product, or brand that need to remain under embargo until the scheduled launch date.

Defining The Migration Strategy

Once you’ve audited your existing site and redirects, it’s time to map out how you want to handle the migration.

The strategy you develop now will determine how seamless this transition is for both your users and search engines.

Define Goals

What do you want to achieve with this migration? Are you aiming to consolidate domains, move to a new content management system (CMS), restructure content, or some combination of these?

Be very clear on your objectives so you can develop the best approach.

Prioritize Redirects

With hundreds of thousands of URLs, you’ll need to determine which redirects are most critical to implement first. Focus initially on:

  • Your most important pages (homepage, product pages, etc.).
  • Pages that generate a substantial amount of website leads/revenue, either directly or indirectly.
  • Pages that generate the most organic traffic for the website.
  • Pages with strong backlink profiles. Those that are crawled frequently by Google/other search engines should be prioritized above those with bigger backlink clusters – but how you weight these factors is a judgment call you will need to make for your own site.

Once the high-priority redirects are handled, work your way down from there. Don’t worry about redirecting every single URL right away.

As long as you have the majority of important pages and traffic accounted for, the remaining redirects can be added over time.

A great way to prioritize redirects is to create a dashboard of all the relevant data you wish to consider and prioritize by (such as the examples I’ve given above), build a matrix with RANK.EQ in Google Sheets, and then add a prioritization categorizer.

The example below is a very simplified version of this. First, you want to collate all your data at the URL level:

Image by author, February 2025

You then want to rank these values against their individual metric data sets. To do this, you use =RANK.EQ(VALUE,VALUE RANGE).

This lets you see which URLs are in the higher percentile and which ones are in the lower percentile:

Image by author, February 2025

You then want to automate batch assignment, and this requires three steps.

First, a “reverse RANK.EQ”, which would be:

=COUNT(A:A) - RANK.EQ(A1, A:A) + 1

This will tell you which URLs are “the best” based on all four metric ranks combined:

Image by author, February 2025

From here, you can convert the Overall EQ scores to percentages in another column, and then run a rule against them so that scores above or below certain thresholds fall into different batches.

You can also split the rows into fixed-size batches ordered by the Overall EQ if you have redirect limits (like when moving to Salesforce Commerce Cloud or SAP Hybris; read more below).
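If you’d rather script this step than maintain it in Sheets, below is a minimal sketch of the same idea in Python with pandas. The file name, metric columns, and the four-batch split are assumptions for illustration only; swap in whatever data you’ve collated at the URL level.

# Minimal sketch of the RANK.EQ-style prioritization matrix in pandas.
# Assumed input: a CSV with one row per URL and the metric columns below.
import pandas as pd

df = pd.read_csv("url_metrics.csv")  # hypothetical export of your collated URL-level data

metrics = ["sessions", "revenue", "backlinks", "crawl_hits"]  # assumed column names

# Rank each metric so that 1 = best, mirroring =RANK.EQ() per column.
for col in metrics:
    df[f"{col}_rank"] = df[col].rank(ascending=False, method="min")

# "Overall EQ": sum the per-metric ranks (lower = more important URL).
df["overall_eq"] = df[[f"{c}_rank" for c in metrics]].sum(axis=1)

# Convert to a percentile and assign batches against thresholds,
# the rule-based categorizer described above (the thresholds here are arbitrary).
df["percentile"] = df["overall_eq"].rank(pct=True)
df["batch"] = pd.cut(
    df["percentile"],
    bins=[0, 0.25, 0.5, 0.75, 1.0],
    labels=["batch_1", "batch_2", "batch_3", "batch_4"],
)

df.sort_values("overall_eq").to_csv("redirect_priority.csv", index=False)

Sorting by the Overall EQ score also gives you the volume-based split mentioned above if your platform caps the number of redirects you can upload.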

Map Content And URL Structure

Determine how you want to reorganize or restructure your content on the new site.

Map out which existing URLs will redirect to which new destinations. Group related content and consolidate where possible.

The new information architecture should be intuitive and user-friendly.

Redirect Types

For the bulk of redirects, use 301 permanent redirects.

In some cases, temporary 302 redirects may make sense, especially if the page content is still being migrated.

Be very careful when using wildcards, and always do spot checks to ensure there are no 404 errors. Redirect parameters whenever possible to avoid duplicate content issues.
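As a rough illustration of those spot checks, the sketch below requests a sample of old URLs and flags anything that doesn’t answer with a 301 and a Location header. It assumes the requests library is available and that your legacy URLs sit in a plain text file.

# Spot-check a sample of legacy URLs: each should answer with a 301 and a Location header.
import random
import requests

with open("old_urls.txt") as f:  # hypothetical list of pre-migration URLs
    old_urls = [line.strip() for line in f if line.strip()]

for url in random.sample(old_urls, min(50, len(old_urls))):
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if resp.status_code == 301 and "Location" in resp.headers:
        print(f"OK    {url} -> {resp.headers['Location']}")
    else:
        print(f"CHECK {url} returned {resp.status_code}")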

Backlinks

Make a list of any pages with strong backlink profiles and ensure they are redirected properly. Reach out to webmasters linking to those pages and let them know the new URL to see if they will update the link on their page.

This helps to preserve the SEO value built up over time.

With careful planning and strategic prioritizing, you can migrate an enterprise website and put the necessary redirects in place without (too much) chaos. But go slowly; this is not a task to rush!

Think through each step and check your work along the way.

Establishing The Migration Project Timelines

When managing a large website migration, establishing realistic timelines is crucial.

Trying to redirect hundreds of thousands of URLs in a short timeframe is a recipe for disaster.

You need to plan ahead and be strategic in how you phase the work.

It’s also very important that migration timelines are a collaborative effort involving all stakeholders.

Far too often, the business sets an arbitrary deadline without considering whether every team can comfortably complete all the necessary work in time.

Avoid Phased/Partial Migrations

Avoiding phased or partial migrations is crucial when managing redirects for an enterprise website. Piecemealing your migration will only create more work and headaches down the road.

I worked on a migration in the past two years that was consolidating multiple domains (products) under a new umbrella domain, and the original plan was to do one after the other in a phased approach.

More than a year later, the second domino still hasn’t fallen. Google has started to rank the umbrella domain for products in the group it isn’t optimized for – causing internal domain cannibalization and performance issues as the brand entity is “fractured” across multiple domains.

Prior to this, I’d never witnessed a phased or partial migration mitigate the performance risks that cautious decision-makers felt it would.

Do It All At Once

The best approach is to redirect all URLs at the same time. This ensures:

  • No pages are left orphaned without a redirect in place.
  • There are no redirect chains created that need to be cleaned up later. Redirect chains can negatively impact SEO and user experience.
  • All backlinks point to the proper new destination page. If done in phases, old pages may accumulate new backlinks that then need to be redirected.

Setting Up 301 Redirects At Scale

At an enterprise level, setting up 301 redirects for tens or hundreds of thousands of URLs requires some strategic planning.

Here are some tips for tackling this at scale:

Using Wildcards And Handling Parameter URLs

When managing redirects for an enterprise website, wildcards and parameters become your best friends. With so many URLs, creating individual redirects for each would be an endless task.

Wildcards allow you to redirect groups of pages at once.

Say you have product pages like /product/abc123, /product/def456, /product/ghi789. You can set up a wildcard redirect like /product/* to point to the new /products page.

This single redirect will capture all product pages and send visitors to the right place.

Parameters, like IDs, SKUs, or dates, often change when site content gets updated or reorganized.

Rather than tracking down each instance of an old parameter to redirect it, use a redirect that includes the parameter.

For example, if you have a URL like /blog/post?id=123 that is now /news/story/123, set up the redirect /blog/post?id= to point to /news/story/.

This will catch any page with that parameter pattern and send visitors to the new structure.
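One way to sanity-check wildcard and parameter rules before they go into your server or CDN configuration is to model them as ordered regex rewrites and test the mapping logic in isolation. The sketch below does exactly that in Python; the patterns mirror the examples above and are illustrative assumptions, not a drop-in server config.

# Model wildcard/parameter redirect rules as ordered (pattern, replacement) pairs
# so the mapping logic can be tested before it goes into server/CDN configuration.
import re
from urllib.parse import urlsplit, urlunsplit

RULES = [
    (re.compile(r"^/blog/post$"), "/news/story/{id}"),          # parameter-based move (assumed pattern)
    (re.compile(r"^/product/(.+)$"), r"/new-site/product/\1"),  # wildcard rule, keeps the slug
]

def map_url(old_url: str) -> str | None:
    parts = urlsplit(old_url)
    query = dict(pair.split("=", 1) for pair in parts.query.split("&") if "=" in pair)
    for pattern, target in RULES:
        if pattern.match(parts.path):
            if "{id}" in target:  # consume a query parameter into the new path
                new_path = target.format(id=query.pop("id", ""))
            else:
                new_path = pattern.sub(target, parts.path)
            new_query = "&".join(f"{k}={v}" for k, v in query.items())  # pass remaining params through
            return urlunsplit((parts.scheme, parts.netloc, new_path, new_query, ""))
    return None  # no rule matched: decide between a specific redirect and a clean 404

print(map_url("https://example.com/blog/post?id=123"))          # -> https://example.com/news/story/123
print(map_url("https://example.com/product/abc123?color=red"))  # -> .../new-site/product/abc123?color=red

Running your full crawl list through a model like this before launch surfaces URLs that would otherwise fall through to a 404.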

When used properly at an enterprise scale, wildcards and parameters can:

  • Save countless hours of manual redirect creation and maintenance.
  • Ensure no page is left behind during a migration or site architecture change.
  • Continue to capture new pages that match the pattern as the site grows and evolves.

Be very careful when using wildcards and parameters in your redirects. Test them thoroughly to ensure no unintended pages are caught in the net.

Monitor them regularly, even after launch, to catch any issues early. Used responsibly, though, they are indispensable tools for managing redirects at an enterprise level.

Breaking Redirect Chains

Redirect chains can easily form when you have a high volume of redirects on an enterprise website.

A redirect chain occurs when a URL redirects to another URL that also redirects, creating a chain of multiple redirects to reach the final destination page.

To avoid redirect chains, you’ll need to trace back through your recent redirect history to find the original source URL. Once you identify the initial URL that started the chain, redirect it directly to the final destination page.

This will cut out all the middle redirects in the chain and provide a much better user experience.

  • Check your server log files to view URL redirect histories from the past three to six months. Look for any patterns of the same URL redirecting multiple times.
  • Use a redirect crawler tool to automatically detect redirect chains on your site. These tools will crawl your site and log any series of multiple redirects for the same URL.
  • For recent redirects less than 180 days old, double-check that the original URL is now redirecting properly to the correct final destination. Newer redirects have a higher chance of issues, so verifying them will help avoid future problems.
  • If you discover broken redirect chains, fix them by redirecting the initial source URL directly to the last destination URL in the chain. Remove any middle redirects that are no longer needed.
  • Test all fixes to ensure the redirect chain is fully broken and the user experience is improved. Check that SEO rankings and traffic have stabilized for the URLs involved.
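The detection steps above are straightforward to automate. Here’s a minimal sketch that follows each audited URL hop by hop and reports chains of two or more redirects, so you can point the first URL straight at the final destination; it assumes the requests library and a plain text list of URLs to audit.

# Follow each URL hop by hop and report redirect chains (2+ hops) that should be collapsed.
import requests
from urllib.parse import urljoin

MAX_HOPS = 10

def trace(url: str) -> list[str]:
    hops = [url]
    for _ in range(MAX_HOPS):
        resp = requests.head(hops[-1], allow_redirects=False, timeout=10)
        location = resp.headers.get("Location")
        if resp.status_code not in (301, 302, 307, 308) or not location:
            break
        hops.append(urljoin(hops[-1], location))
    return hops

with open("urls_to_audit.txt") as f:  # hypothetical audit list
    for line in f:
        url = line.strip()
        if not url:
            continue
        hops = trace(url)
        if len(hops) > 2:  # original URL plus two or more redirects = a chain to collapse
            print(f"CHAIN ({len(hops) - 1} hops): {url} -> {hops[-1]}")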

By diligently detecting and breaking redirect chains, you’ll provide a much better overall experience for your users and site visitors.

Your enterprise website will function more efficiently, and you’ll avoid potential drops in search rankings and traffic.

Historic Redirects

When migrating an enterprise website, it’s easy to forget about redirects that were already in place. These historic redirects, especially those under six months old, still need to be accounted for to avoid traffic loss.

As you audit your site’s current redirects, make a list of any that point to pages that will be changing or removed in the migration.

These redirects will need to be updated to point to the new destination URLs. Some things to look for include:

  • Temporary event pages that now redirect to a general section.
  • Product pages that now redirect to an updated model.
  • Blog posts that redirect to a category archive.

Double-check that any historic redirects over six months old still need to be in place. Some may be sending signals to search engines that are no longer needed.

Removing unnecessary historic redirects will also help to simplify your site’s redirect structure and make it easier to manage going forward.

When setting up your migration’s redirect plan, be sure to factor in updating any historic redirects to their new destination URLs.

Leaving these behind could result in lost traffic and rankings for important pages on your site.

Staying on top of your enterprise website’s historic and new redirects during migration is key to a successful transition with minimal SEO impact.

Overcoming Redirect Limits

If you have an enterprise website with hundreds of thousands of pages, you may run into issues with redirect limits from your CMS or ecommerce platform.

Many systems like SAP Hybris and Salesforce Commerce Cloud cap the number of redirects you can have at 50,000 to 100,000. For a major website migration (especially enterprise ecommerce websites), this likely won’t cut it.

To get around these constraints, you’ll need to get creative. A few options to consider:

  • Use wildcard redirects to capture categories of pages. For example, redirect /products/* to /shop/*. This single redirect will capture all pages that start with /products.
  • Exclude parameters from redirects when possible. If you have pages like /product-name?color=red and /product-name?size=large, redirect only /product-name to the new URL. The parameters are often not indexed or linked to, so you can leave them out of the redirect.
  • Break up redirect chains. If you have a series of three+ redirects for a single page, break up the chain and create direct redirects from the initial URLs to the final destination. Redirects do pass link equity, but long chains waste crawl budget and add points of failure, so keep redirects as direct as possible.
  • Prioritize mission-critical pages. When you start to reach the redirect limit, focus on redirecting pages that drive significant traffic and revenue. You can leave less important pages unredirected or with a 404 error temporarily.
  • Ask your CMS vendor about increasing limits. Many systems will increase redirect limits on an enterprise website if you ask and explain your needs. Be prepared to pay additional fees for this add-on.
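To make the prioritization point concrete, here’s a small sketch that keeps explicit redirects for the most valuable URLs up to an assumed platform cap and hands the remainder over to wildcard/pattern rules (or a clean 404). The 50,000 limit and the column names are assumptions; it builds on the prioritized sheet from earlier and assumes it carries old and new URL columns.

# Split a prioritized redirect map into explicit rules (up to the platform cap)
# and a remainder to be covered by wildcard/pattern rules or left to a clean 404.
import pandas as pd

PLATFORM_LIMIT = 50_000  # assumed cap; check your CMS/commerce platform's real limit

df = pd.read_csv("redirect_priority.csv")  # output of the prioritization sketch above
df = df.sort_values("overall_eq")          # lower score = more important URL

explicit = df.iloc[:PLATFORM_LIMIT]
remainder = df.iloc[PLATFORM_LIMIT:]

explicit[["url", "new_url"]].to_csv("explicit_redirects.csv", index=False)  # assumed columns
remainder[["url"]].to_csv("cover_with_patterns.csv", index=False)
print(f"{len(explicit)} explicit redirects, {len(remainder)} URLs for pattern rules")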

With creative thinking and persistence, you can overcome most redirect limits and complete an enterprise website migration without losing a big chunk of your organic traffic.

The key is having a well-thought-out redirect strategy and implementing it well before you hit your CMS’s limits.

Benchmarking Organic Performance (Traffic, Rankings, Indexation)

Once the redirects have been implemented, it’s time to see how your organic traffic and rankings have been impacted.

Benchmarking Your Progress

This will help determine if any further optimization is needed. Here are a few key metrics to monitor:

  • Organic search traffic. Compare traffic from major search engines like Google before and after the migration. Expect some initial drop in traffic, but it should start recovering within one to two months. If traffic is still significantly lower after three months, revisit your redirect implementation.
  • Keyword rankings. Track the rankings of your most important keywords to see if their positions have changed. Drops in ranking are common after a migration, but rankings should start improving again over time as search engines recrawl your new site. Major drops that don’t improve could signal redirect or content issues that need to be addressed.
  • Indexation. Use a tool like Google Search Console to check how much of your new site has been indexed. A large, complex site can take three to six months for Google to fully recrawl and re-index. Look for steady increases in indexation over time. If indexation stalls or drops, there may be technical issues preventing Google from accessing parts of your site.
  • 404 errors. Monitor your 404 errors to ensure any broken links are redirecting properly. High numbers of 404s, especially old URLs, indicate redirects that need to be created or fixed.
  • Backlinks. Do a backlink audit to verify that any links pointing to your old site now redirect to the proper new URLs. Failure to redirect backlinks is a common cause of traffic and ranking loss after a website migration.
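If your analytics and redirect data can be exported to CSV, a short script can handle the before/after comparison at the landing page level. The sketch below assumes hypothetical exports with landing_page and sessions columns plus a redirect map to translate old URLs to new ones; adjust the names to whatever your tools actually produce.

# Compare pre- and post-migration organic sessions per landing page and flag big drops.
import pandas as pd

before = pd.read_csv("organic_before.csv")  # columns assumed: landing_page, sessions
after = pd.read_csv("organic_after.csv")
mapping = pd.read_csv("redirect_map.csv")   # columns assumed: old_url, new_url

# Translate old landing pages to their new URLs so the comparison is like-for-like.
before = before.merge(mapping, left_on="landing_page", right_on="old_url", how="left")
before["landing_page"] = before["new_url"].fillna(before["landing_page"])

merged = (
    before.groupby("landing_page")["sessions"].sum().rename("sessions_before").reset_index()
    .merge(after.rename(columns={"sessions": "sessions_after"}), on="landing_page", how="outer")
    .fillna(0)
)
merged["change_pct"] = (
    (merged["sessions_after"] - merged["sessions_before"])
    / merged["sessions_before"].replace(0, float("nan"))
    * 100
)

# Flag pages that lost more than 30% of their organic sessions (threshold is arbitrary).
drops = merged[merged["change_pct"] < -30].sort_values("change_pct")
drops.to_csv("post_migration_drops.csv", index=False)
print(drops.head(20))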

Regularly benchmarking these key metrics will give you valuable insight into how well your enterprise website migration and redirect implementation is going.

Make adjustments as needed to get your new site’s organic performance back on track.

Communicating Migration Performance To The C-Level

Communicating migration performance to leadership is crucial for continued support and investment in your enterprise website.

Even if the migration itself goes smoothly, problems can arise after launch if the C-suite isn’t on board.

Set Clear Expectations

Before the migration, sit down with executives and set concrete goals and key performance indicators (KPIs) for the new site.

Not all metrics need to matter directly to SEO, but giving the C-level more data and clarity can help prevent knee-jerk reactions and bad decisions from being imposed on the migrations team.

Be transparent that there may be an initial dip in metrics as the new site establishes itself. Having targets will help determine if the migration is meeting business needs after things settle in.

Share Detailed Reports

In the months following the migration, provide regular reports on how the new site performs compared to the old site and the established KPIs.

Compare these same metrics from the old site to give context on progress. Be open about any issues, and have solutions and next steps ready to propose.

It often helps to create a Looker Studio report so the C-level has instant access to data and feels as though they have some control over the situation.

Finally, Don’t Forget To Highlight Wins

While reporting on challenges is important, it is also important to showcase successes from the migration.

Promoting wins, big and small, demonstrates the value of the investment in the migration and builds confidence in your team.

Keeping leadership regularly informed about how the new enterprise website is performing is essential.

With open communication and a mix of progress reports and wins, executives will remain supportive and engaged in optimizing the site to achieve the best results.



Featured Image: Munthita/Shutterstock

Google’s Martin Splitt Warns Against Redirecting 404s To Homepage via @sejournal, @MattGSouthern

Google has released a new episode in its “SEO Office Hours Shorts” video series, in which Developer Advocate Martin Splitt addresses a question many website owners face: Should all 404 error pages be redirected to the homepage?

The Clear Answer: Don’t Do It

In the latest installment of the condensed Q&A format, Splitt responds to a question from a user named Chris about whether “redirecting all 404 pages to the homepage with 301 redirects can have a negative impact on rankings or overall website performance in search.”

Splitt’s response was unambiguous: “Yes, and also it annoys me as a user.”

Why 404s Serve A Purpose

404 error pages signal to users and search engine crawlers that a URL is broken or nonexistent. This transparency helps people understand what they’re dealing with rather than being unexpectedly redirected to an unrelated page.

Splitt explained:

“A 404 is a very clear signal this link is wrong and broken or this URL no longer exists because maybe the product doesn’t exist or something has changed.”

Impact on Search Crawlers

Splitt says blanket redirects to the homepage can disrupt search engine crawlers’ efficiency.

When crawlers encounter a legitimate 404, they recognize that the content no longer exists and can move on to other URLs. However, redirecting them to the homepage creates a confusing loop.

Splitt noted:

“For a crawler, they go like homepage and then click through or basically crawl through your website, finding content, and eventually they might run into a URL that doesn’t exist.

But if you redirect, they’re kind of like being redirected, and then it all starts over again.”

Best Practices for Handling Missing Content

Splitt offered clear guidance on proper redirects:

  1. If content has moved to a new location, use a redirect to that specific new URL
  2. If content is truly gone, maintain the 404 status code
  3. Don’t redirect to the homepage or what you think is the “closest” match

Splitt emphasized:

“If it moved somewhere else, use a redirect. If it’s gone, don’t redirect me to the homepage.”

This latest guidance aligns with Google’s longstanding recommendation to maintain accurate HTTP status codes to help users and search engines understand your site structure.

New Format

The SEO Office Hours Shorts format is a new approach from Google’s Search Relations team.

The original format was a live show where anyone could call in and get their questions answered in real time.

This format then transitioned to recorded sessions where Google personnel responded to a selection of pre-approved questions.

Now, SEO Office Hours is presented as short videos. If you prefer one of the previous formats, Splitt encourages feedback in the comments section of the video below:


Featured Image: Screenshot from YouTube.com/GoogleSearchCentral, March 2025.

Google Updates Robots Meta Tag Document To Include AI Mode via @sejournal, @MattGSouthern

Google has updated its Search Central documentation to include details about AI Mode in its robots meta tag specifications.

This update coincides with the rollout of AI Mode via Google Labs.

Document Updates For AI Mode

The updated document now specifies that you can control the appearance of your content in Google’s AI-powered features, including AI Overviews and the new AI Mode.

Key additions to the document include:

Nosnippet rule expansion
The nosnippet directive “applies to all forms of search results (at Google: web search, Google Images, Discover, AI Overviews, AI Mode) and will also prevent the content from being used as a direct input for AI Overviews and AI Mode.”

Max-snippet controls
The max-snippet rule specifies that limitations apply to “all forms of search results (such as Google web search, Google Images, Discover, Assistant, AI Overviews, AI Mode) and will also limit how much of the content may be used as a direct input for AI Overviews and AI Mode.”

What Is AI Mode?

AI Mode is an experimental feature initially available to Google One AI Premium subscribers. It leverages a custom version of Gemini 2.0 to deliver a search experience fully powered by AI.

The feature employs a “query fan-out” technique that issues multiple related searches across subtopics and data sources before synthesizing a comprehensive response.

Google highlights the following advantages of AI Mode over standard AI Overviews:

  • Handles complex, multi-part questions that might otherwise require multiple searches
  • Supports follow-up questions for continued conversations
  • Synthesizes information from multiple data sources simultaneously
  • Provides multimodal interaction capabilities through voice, text, or images

What This Means For Your Website

Publishers rely on website traffic to generate revenue through ads, subscriptions, or conversions. However, people may be less inclined to visit the original site when AI features summarize the content.

To counter this, you can use the “nosnippet” tag to block content from AI responses or use “max-snippet:[number]” to limit the amount of text displayed, motivating searchers to visit the site for complete information.

Looking Ahead

The robots meta tag controls offer ways to manage content in traditional search results and AI experiences.

Google’s cautious rollout of AI Mode shows that the company is aware of publishers’ concerns about content use in generative AI applications.

This update reflects Google’s effort to balance new features with publishers’ control over their content.


Featured Image Credit: Google. 

9 Trends You Should Watch To Keep Your Website Afloat in 2025

This post was sponsored by Bluehost. The opinions expressed in this article are the sponsor’s own.

Is my website ready for 2025’s tech and SEO changes?

How can I keep my site fast, secure, and user-friendly?

What makes a hosting provider future-proof?

In 2025, the extent to which you adapt to emerging technologies, changing user expectations, and evolving search engine algorithms will determine if you’ll thrive or struggle to stay relevant.

Staying ahead of emerging trends is essential for maintaining a fast, secure, and user-friendly website.

Optimizing performance, strengthening security measures, and enhancing user experience will be key factors in staying competitive.

The first step to ensuring your website remains resilient and future-ready is choosing a reliable hosting provider with scalable infrastructure and built-in optimization tools.

1. AI-Powered User Experience

Artificial intelligence has transformed how websites interact with visitors, making online experiences more personalized, engaging, and efficient.

Use AI For Higher Conversion Rates

AI-driven personalization allows websites to deliver tailored content and product recommendations based on user behavior, preferences, and past interactions to create an intuitive experience.

The result? Visitors remain engaged, increasing conversions.

Chatbots and AI-powered customer support are also becoming essential for websites looking to provide instant, 24/7 assistance.

These tools answer common questions, guide users through a website, and even process transactions, reducing the need for human intervention while improving response times.

And they’re gaining in popularity.

71% of businesses in a recent survey either already have a chatbot integrated into their sites and customer service processes or plan to get one in the near future.

And they’re reaping the benefits of this technology; 24% of businesses with a chatbot already installed report excellent ROI.

Use AI For Speeding Up Website Implementation

AI is also revolutionizing content creation and website design.

Based on user data, automated tools can generate blog posts, optimize layouts, and suggest design improvements.

This streamlines website management, making it easier for you to maintain a professional and visually appealing online presence.

For example, many hosting providers now include AI-powered website builders, offering tools that assist with design and customization. These features, such as responsive templates and automated suggestions, can make building and optimizing a website more efficient.

2. Voice Search & Conversational Interfaces

Voice search is becoming a major factor in how users interact with the web, with more people relying on smart speakers, mobile assistants, and voice-activated search to find information.

To put this into perspective, ChatGPT from OpenAI reportedly holds 60% of the generative AI market, performing more than one billion searches daily. If just 1% of those are via its voice search, that equates to 10 million voice searches every day on ChatGPT alone.

Reports estimate 20.5% of people globally use voice search daily. And these numbers are increasing.

You need to adapt by optimizing for conversational SEO and natural language queries, which tend to be longer and more specific, making long-tail keywords and question-based content more important than ever.

To stay ahead, websites should structure content in a way that mimics natural conversation:

  • FAQ-style pages.
  • Featured snippet optimization.
  • Ensuring fast-loading, mobile-friendly experiences.

If this is an upgrade that makes sense for your industry, be sure that your host supports SEO-friendly themes and plugins that help websites rank for voice queries.

3. Core Web Vitals & SEO Best Practices

Google continues to refine its ranking algorithms, with Core Web Vitals playing a critical role in determining search visibility.

Implement Core Web Vital Data & Monitor Website Speed

These performance metrics, Largest Contentful Paint (LCP), Interaction to Next Paint (INP), which replaced First Input Delay (FID) as a Core Web Vital in 2024, and Cumulative Layout Shift (CLS), measure how quickly a page loads, how responsive it is to interaction, and how stable its layout appears to users.

Websites that meet these benchmarks not only rank higher in search results but also provide a better overall user experience.

One study found that pages ranking in the top spots in the SERPs were 10% more likely to pass CWV scores than URLs in position 9.

Ensure Your Website Is Faster Than Your Competitors To Rank Higher

As part of this prioritization of performance, a mobile-first approach remains essential; Google prioritizes sites that are fast and responsive on smartphones and tablets.

Ensuring faster load times through optimized images, efficient coding, and proper caching techniques can make a significant impact on search rankings.

Leverage Structured Data To Tell Google What Your Website Is About

Structured data, on the other hand, helps search engines better understand a website’s content, improving the chances of appearing in rich snippets and voice search results.

4. Mobile-First & Adaptive Design

With mobile devices accounting for the majority of web traffic, mobile optimization remains a top priority in 2025.

Google’s mobile-first indexing means that search engines primarily evaluate the mobile version of a site when determining rankings.

A website that isn’t optimized for mobile results in overall poor performance, lower search rankings, and a frustrating user experience.

To keep up, many websites are adopting:

  • Adaptive design – Ensures that websites adjust dynamically to different screen sizes, providing an optimal layout on any device.
  • Progressive Web Apps (PWAs) – Combine the best features of websites and mobile apps, offering faster load times, offline capabilities, and app-like functionality without requiring a download.

Best practices for a seamless mobile experience include responsive design, fast-loading pages, and touch-friendly navigation.

Optimizing images, minimizing pop-ups, and using mobile-friendly fonts and buttons can also greatly enhance usability.

5. Enhanced Website Security & Data Privacy

Cyber threats are becoming more sophisticated.

You must take proactive measures to protect your websites from attacks, data breaches, and unauthorized access.

Implementing strong security protocols not only safeguards sensitive information but also builds trust with visitors.

Key security measures include:

  • SSL certificates – Encrypt data transmitted between users and a website, ensuring secure connections—something that search engines and users now expect as a standard feature.
  • Multi-Factor Authentication (MFA) – Adds an extra layer of security by requiring multiple verification steps before granting access, reducing the risk of compromised credentials.
  • Zero-trust security models – Ensure that all access requests, even from within a network, are continuously verified, minimizing potential security gaps.

Beyond technical defenses, compliance with evolving privacy laws such as GDPR and CCPA is essential.

You must be transparent about how you collect, store, and process user data, providing clear consent options and maintaining privacy policies that align with current regulations.

6. Sustainability & Green Web Hosting

Every website, server, and data center requires energy to function, contributing to global carbon emissions.

Optimizing websites through lighter code, efficient caching, and reduced server load also plays a role in minimizing environmental impact.

Choosing a hosting provider that values sustainability is an important step toward a greener web.

For example, Bluehost has taken steps to improve energy efficiency, ensuring that website owners can maintain high-performance sites while supporting environmentally friendly initiatives.

7. AI-Generated & Interactive Content

AI tools can assist in creating blog posts, product descriptions, and videos with minimal manual input, helping businesses maintain a steady content flow efficiently.

Beyond static content, interactive features like quizzes, calculators, and AR are becoming key for user engagement.

These elements encourage participation, increasing time on site and improving conversions.

To integrate interactive features smoothly, a hosting provider that supports interactive plugins and flexible tools can help keep websites engaging and competitive.

8. The Role of Blockchain in Web Security

Blockchain is emerging as a tool for web hosting and cybersecurity, enhancing data security, decentralization, and content authenticity.

Unlike traditional hosting, decentralized networks distribute website data across multiple nodes, reducing risks like downtime, censorship, and cyberattacks. Blockchain-powered domains also add security by making ownership harder to manipulate.

Beyond hosting, blockchain improves data verification by storing information in a tamper-proof ledger, benefiting ecommerce, digital identity verification, and intellectual property protection.

9. The Importance of Reliable Web Hosting

No matter how advanced a website is, it’s only as strong as the hosting infrastructure behind it. In 2025, website performance and uptime will remain critical factors for success, impacting everything from user experience to search engine rankings and business revenue.

Scalable hosting solutions play a crucial role in handling traffic spikes, ensuring that websites remain accessible during high-demand periods.

Whether it’s an ecommerce store experiencing a surge in holiday traffic or a viral blog post drawing in thousands of visitors, having a hosting plan that adapts to these changes is essential.

Reliable hosting providers help mitigate these challenges by offering scalable infrastructure, 100% SLA uptime guarantees, and built-in performance optimizations to keep websites running smoothly.

Features like VPS and dedicated hosting provide additional resources for growing businesses, ensuring that increased traffic doesn’t compromise speed or stability. Investing in a hosting solution that prioritizes reliability and scalability helps safeguard a website’s long-term success.

Future-Proof Your Website Today

The digital landscape is changing fast, and staying ahead is essential to staying competitive.

From AI-driven personalization to enhanced security and sustainable hosting, adapting to new trends ensures your site remains fast, secure, and engaging. Investing in performance and user experience isn’t optional; it’s the key to long-term success.

Whether launching a new site or optimizing an existing one, the right hosting provider makes all the difference.

Bluehost offers reliable, high-performance hosting with built-in security, scalability, and guaranteed uptime, so your website is ready for the future.

Get started today and build a website designed to thrive.


Image Credits

Featured Image: Image by Bluehost. Used with permission.

How To Create a Certified Fast Website To Compete In 2025

This post was sponsored by Bluehost. The opinions expressed in this article are the sponsor’s own.

Imagine clicking on a website only to wait several seconds for it to load.

Frustrating, right?

Your prospective customers think so, too.

In a world where attention spans are shrinking, even a one-second delay can lead to lost visitors, lower rankings, and missed revenue opportunities.

Research finds that B2C websites that load in one second or less have conversion rates three times higher than those that load in five seconds or more.

Conversion rates are 2.5 times higher for B2C websites that load in one second or less.

In other words, speed is no longer a luxury.

Speed is a necessity.

A fast-loading website enhances user experience, boosts SEO rankings, and drives higher conversions.

And with search engines and consumer expectations continuing to evolve, businesses must prioritize performance to stay ahead of the competition.

Implementing the right strategies ensures that websites remain fast, competitive, and ready for the demands of 2025.

A trusted partner like Bluehost provides the robust infrastructure, advanced caching mechanisms, and built-in performance enhancements needed to help websites reach peak efficiency.

1. How To Select The Right Hosting Plan

A website’s performance starts with selecting the right hosting plan. The plan should align with the site’s current and future needs to effectively accommodate growth and traffic fluctuations.

Assess Your Website’s Needs

Before settling on a hosting plan, it’s crucial to evaluate key factors like traffic expectations, content types, and scalability.

For example, websites with heavy multimedia content require more resources than text-based sites, and anticipated visitor numbers influence server capacity needs.

Additionally, selecting a plan that supports future growth ensures smooth scaling without performance bottlenecks.

Match Your Website’s Needs To What The Host Provides

Different hosting solutions cater to different website requirements, ranging from budget-friendly shared hosting to more robust, performance-driven plans. Bluehost offers multiple hosting options tailored to various business needs.

Shared Hosting can work well for smaller websites with moderate traffic, offering a cost-effective way to get started.

Bluehost’s VPS hosting offers more power and flexibility by providing dedicated resources, making it an excellent choice for growing websites that need additional performance.

For large-scale websites demanding maximum speed and control, our dedicated hosting plans deliver exclusive server access with top-tier performance for optimal speed and scalability.

2. Implement Caching Mechanisms

Caching is an essential tool for optimizing website speed by reducing the need to load the same data repeatedly. By storing frequently accessed files, caching decreases server load, enhances response times, and ensures visitors experience faster page loads.

Websites that effectively utilize caching experience better performance, lower bounce rates, and improved search rankings.

Use Built-In Caching Features

For instance, Bluehost provides multiple caching mechanisms to enhance website performance, such as PHP APC (Alternative PHP Cache). A powerful opcode caching system, PHP APC improves database query speed and optimizes PHP script execution, ensuring that frequently accessed data is retrieved faster.

On the other hand, edge caching minimizes latency by delivering content from servers closest to the user, reducing server response times and improving load speeds.

Bluehost makes it easy to use caching to enhance website speed. Caching can be enabled directly through the Bluehost control panel, ensuring seamless implementation.

Additionally, Bluehost is powered by Dell rack-mount servers, which use AMD EPYC chips, DDR5 RAM, and ultrafast NVMe storage. With caching plugins like W3 Total Cache or WP Rocket, your web pages will load faster, improving the user experience, SEO, traffic, and conversion rates.

3. Absolutely Leverage Content Delivery Networks (CDNs)

Another way to speed up websites is to examine how content is delivered to users. A Content Delivery Network (CDN) enhances website performance by distributing content across multiple servers worldwide. This reduces latency and ensures visitors load pages faster, regardless of location.

CDNs minimize the physical distance between the server and the user by caching static assets like images, stylesheets, and scripts at various data centers worldwide. This results in faster load times and reduced bandwidth usage.

Beyond speed improvements, CDNs also enhance website security by protecting against DDoS attacks, traffic spikes, and malicious bots. Some CDNs offer additional features, such as image optimization, automated compression, and firewall rules, that further improve performance and security.

CDNs & Bluehost

Bluehost offers built-in CDN solutions, including Cloudflare integration, to help websites achieve optimal performance and security.

Activating a CDN through Bluehost’s dashboard is straightforward, and configuring settings that best suit a website’s needs significantly improves speed and reliability.

4. Optimize Images & Media

Impact of Media Files on Load Times

Large images and unoptimized videos can significantly slow down a website. Why? High-resolution media files require more bandwidth and processing power, leading to slower page loads and a poorer user experience.

This is particularly problematic for mobile users and those with slower internet connections since heavy media files can take significantly longer to load, frustrating visitors and increasing bounce rates.

Additionally, media files that are not optimized can consume excessive server resources, potentially affecting overall website performance. If too many large files are loaded simultaneously, the hosting environment can come under strain, causing slowdowns for all users.

Image- and media-based slowdowns are widespread on websites that rely heavily on visual content, such as e-commerce platforms, portfolios, and media-heavy blogs.

Reducing file sizes, choosing appropriate formats, and leveraging compression techniques can greatly enhance website speed while maintaining visual quality.

How To Size Images The Right Way

First, while it may be common and easy to do, avoid relying on the width and height attributes in HTML to shrink oversized images: the browser still downloads the full-size file and then has to scale it down, which wastes bandwidth and hurts performance.

Instead, resize images before uploading them using graphic editing tools such as Photoshop, GIMP, or online compression services. Scaling images improperly can lead to pixelation and a stretched appearance, negatively impacting user experience.

By resizing images to their intended display size before uploading, websites can significantly reduce the amount of data a browser needs to process, resulting in faster page loads and a more visually appealing layout.

Appropriately resized images will also have a higher visual quality because they are sized for the right display dimensions.

How To Compress Images For Better Website Performance

Compressing images using tools like Squoosh, TinyPNG, or plugins like Smush helps reduce file sizes without sacrificing quality.

Implementing lazy loading ensures that off-screen images and videos only load when needed, reducing initial load times and enhancing overall site performance.
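If you prefer to batch this locally rather than rely on a plugin, here is a hedged sketch that resizes images to a maximum display width and re-encodes them as WebP using the Pillow library. The folder names, the 1,200px ceiling, and the quality setting are assumptions to adjust for your own layout.

# Resize images to their intended display width and re-encode them before upload.
from pathlib import Path
from PIL import Image  # Pillow

SOURCE = Path("originals")   # hypothetical input folder
OUTPUT = Path("optimized")   # hypothetical output folder
MAX_WIDTH = 1200             # assumed largest display width in the layout
OUTPUT.mkdir(exist_ok=True)

for path in SOURCE.glob("*.[jp][pn]g"):  # .jpg and .png files
    with Image.open(path) as img:
        if img.mode not in ("RGB", "RGBA"):
            img = img.convert("RGBA")  # WebP-friendly mode
        if img.width > MAX_WIDTH:
            img = img.resize((MAX_WIDTH, round(img.height * MAX_WIDTH / img.width)))
        out_path = OUTPUT / f"{path.stem}.webp"
        img.save(out_path, "WEBP", quality=80)
        print(f"{path.name}: {path.stat().st_size // 1024} KB -> {out_path.stat().st_size // 1024} KB")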

5. Minimize Plugins & External Scripts

How To Discover Your Plugins’ Usage

Overloading a website with excessive plugins and external scripts can severely impact performance. Therefore, it’s essential to regularly assess installed plugins and remove outdated, redundant, or unnecessary ones.

Limiting the number of external scripts running on a page can also help reduce loading times and improve efficiency.

How To Choose Efficient Plugins

Selecting the right plugins is crucial for maintaining website performance. First, look for lightweight, well-coded plugins that prioritize speed and efficiency.

Then, regularly auditing your plugins and removing outdated or redundant ones can prevent conflicts and minimize resource usage.

Bluehost provides hosting environments tailored for WordPress users, ensuring compatibility with essential caching, security, and SEO plugins.

By hosting your website on a reliable platform like Bluehost, you can benefit from a stable infrastructure that complements the best WordPress plugins. This will help you enhance functionality without compromising speed.

6. Tips For Compression, Minification & Technical Tweaks

Beyond caching and CDNs, additional technical optimizations can further improve site speed and performance. Compression and minification techniques help reduce file sizes, while other backend optimizations ensure web pages load efficiently.

Implementing these strategies can significantly improve desktop and mobile user experiences.

Benefits Of Compression

Reducing the size of HTML, CSS, and JavaScript files significantly improves page speed. Compressed files require less bandwidth and load faster, creating a smoother user experience.

Effortless Compression & Technical Optimization With Bluehost

Bluehost makes compression easy. GZIP compression can be enabled via Bluehost’s control panel or by modifying the .htaccess file.

Plugins like Autoptimize help minify code by removing unnecessary characters, ensuring that files remain lightweight and optimized for performance.

Utilizing ETags & Expires Headers

Another important aspect of page speed optimization involves using ETags and Expires headers, which help streamline browser requests and improve overall efficiency.

These settings instruct a visitor’s browser on how to handle cached content, preventing unnecessary reloads and reducing the number of requests made to the server.

ETags (Entity Tags) are used by browsers to determine whether cached resources have been modified since the last visit. If the content remains unchanged, the browser loads the local copy instead of downloading it again, minimizing bandwidth usage and speeding up load times.

On the other hand, Expires headers specify a timeframe for when specific resources should be refreshed.

By setting an appropriate expiration date for static files like images, CSS, and JavaScript, web developers can ensure that repeat visitors are not unnecessarily reloading content that has not changed.

For example, a website logo that remains consistent across pages can be cached efficiently so that users do not have to download it every time they navigate the site.

Properly configuring these settings enhances website performance, particularly for sites with recurring visitors. It prevents redundant data transfers and reduces the workload on the browser and server.

Many hosting providers, including Bluehost, offer tools and support to help website owners implement these optimizations effectively. This ensures a faster and more seamless user experience.

7. Regularly Monitor & Execute Maintenance

Practice Continuous Performance Assessment

Technology changes and slows down. Websites are no exception.

Therefore, websites should undergo regular performance assessments to ensure they’re continually optimized for the best user experience.

Routine speed testing helps identify areas where performance can be improved, whether by addressing slow-loading elements, optimizing server response times, or refining backend processes.

Various tools can assist in performance evaluation. Google PageSpeed Insights, for example, provides detailed reports on website speed and offers specific recommendations for improvements.

Lighthouse, a Google open-source tool, analyzes performance, accessibility, and SEO, helping site owners fine-tune their pages.

Beyond automated tools, ongoing monitoring through website analytics platforms, such as Google Analytics, can offer valuable insights into user behavior.

High bounce rates and low engagement metrics may indicate slow performance, guiding further refinements.

Businesses running ecommerce platforms or large applications should consider integrating application performance monitoring (APM) tools to track performance bottlenecks in real time.
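Routine speed testing like this is easy to script against the PageSpeed Insights API. The sketch below is a minimal example that assumes you have an API key; the response fields pulled out are the Lighthouse performance score and a few common audits, so verify the field paths against the current API documentation before relying on them.

# Query the PageSpeed Insights API for a URL and print the headline numbers.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: create one in Google Cloud Console
ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

params = {
    "url": "https://www.example.com/",
    "strategy": "mobile",
    "key": API_KEY,
}
data = requests.get(ENDPOINT, params=params, timeout=60).json()

lighthouse = data["lighthouseResult"]
print("Performance score:", lighthouse["categories"]["performance"]["score"])
# The audit IDs below are common Lighthouse audits; check the docs if the response shape changes.
for audit in ("largest-contentful-paint", "cumulative-layout-shift", "server-response-time"):
    print(audit, "->", lighthouse["audits"][audit]["displayValue"])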

Maintenance Tips

Regular updates to website software, regardless of the platform used, are essential for security and performance.

Content management systems (CMS) like WordPress, Joomla, and Drupal require frequent updates to core files, themes, and plugins to prevent compatibility issues and vulnerabilities. Similarly, frameworks and libraries for custom-built sites must be kept up to date to ensure efficiency and security.

Database optimization is another crucial maintenance task. Over time, databases accumulate redundant data, slowing down query execution.

Periodic optimizations, such as removing unused tables, cleaning up post revisions, and properly indexing databases, can enhance efficiency.

Server maintenance is equally important. Websites hosted on dedicated or VPS servers should have automated backups, uptime monitoring, and log analysis configured.

Cloud-based hosting solutions like Bluehost Cloud provide performance-tracking tools that can help identify and mitigate slowdowns at the infrastructure level, a 100% uptime SLA, and more to ensure websites run smoothly.

Lastly, implementing a proactive security strategy ensures ongoing performance stability. Regular malware scans, security patches, and SSL certificate renewals help prevent vulnerabilities that could slow down or compromise a website.

Security plugins and firewalls, such as Cloudflare, add an extra layer of protection while minimizing unwanted traffic that could strain server resources.

That’s what makes Bluehost the superior choice. We offer automated backups, performance monitoring tools, and dedicated 24/7 support professionals who can help keep your website running at peak efficiency.

And with a range of hosting plans tailored to different needs, Bluehost ensures that your website will remain fast, secure, and scalable as it grows.

Building a certified fast website in 2025 requires strategic hosting, caching, content delivery, and ongoing maintenance.

Leveraging Bluehost’s robust hosting plans, integrated CDN, and performance optimization tools ensures your website remains fast, competitive, and ready for the evolving digital landscape.

Bluehost’s hosting solutions provide an easy and reliable way to optimize performance.


Image Credits

Featured Image: Image by Bluehost. Used with permission.

Google Updates Product Markup To Support Member Pricing & Sales via @sejournal, @MattGSouthern

Google updates structured data guidelines, enabling merchants to showcase sale, member, and strikethrough pricing in search results.

  • Google added support for new structured data properties to show different price types.
  • Merchants can now display loyalty program pricing tiers in their product markup.
  • These changes align Google Search with existing Merchant Center capabilities.
Google CrUX Report Update Targets LCP & Network Delays via @sejournal, @MattGSouthern

Google Chrome has released its latest Chrome User Experience Report (CrUX).

This update zeroes in on individual components of page speed, offering fresh data on largest contentful paint (LCP) image subparts and real-world network conditions (RTT).

For SEO professionals, this means you’ll have a better understanding of what needs improvement.

Barry Pollard, Web Performance Developer Advocate at Google Chrome, made the announcement on Bluesky.

Key Updates to CrUX Report

Speed and user experience are known to impact search visibility, and Google’s latest CrUX update breaks down site performance barriers with greater detail:

Granular LCP Details
New “image subparts” let you pinpoint what’s slowing down your largest image element.

With Time to First Byte, Resource Load Delay, Resource Load Duration, and Element Render Delay all in one mix, you can see if your bottleneck is server lag, render delays, or even how late the browser finds your image.

RTT Over ECT
Rather than lumping user connections into outdated “3G/4G” bins, Google’s new round trip time (RTT) tri-bins highlight the true speed of your audience’s networks.

Identifying high-latency segments can guide you toward optimizing for users in specific regions or network conditions.

BigQuery & CrUX Vis Updates
SEO professionals can access updated BigQuery datasets, which include more granular RTT information and broader coverage for metrics like Interaction to Next Paint (INP).

Additionally, the CrUX Vis tool (cruxvis.withgoogle.com) now shows everything from streamlined LCP subparts to country-level RTT stats, ideal for spotting speed issues at a glance.

Better Data Coverage, More Competitive Edge
By retiring the Effective Connection Type (ECT) dimension, Google can report richer data for a broader range of sites.

How To Leverage The New Metrics

  • Zero In on Server Delays: Pinpoint high Time to First Byte if your hosting setup or backend scripts are slowing that crucial first load.
  • Check Image Discovery Times: A Resource Load Delay might indicate a late-discovered hero image or JavaScript injection issues, which are common trouble spots in modern frameworks.
  • Optimize Media Files: Long Resource Load Duration can highlight oversized or unoptimized images that hamper page speed, a frequent cause of poor LCP.
  • Tailor Solutions for Different Regions: With RTT tri-bins now informing you if certain countries or networks face big delays, you can serve region-specific content faster (e.g., via CDNs or local servers).

Why This Matters

The latest CrUX update provides new data and identifies where your page may fail visitors, potentially harming your rankings. This can help you fix issues quickly, potentially improving site performance and visibility.

To analyze your site, visit cruxvis.withgoogle.com or check the CrUX BigQuery dataset.
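If you’d rather pull the field data programmatically than eyeball it in CrUX Vis, the sketch below queries the CrUX API for an origin and prints the p75 of every metric it returns. Leaving out the metrics filter avoids hard-coding the newer metric names (RTT, LCP image subparts), which you should confirm against the CrUX documentation; the API key and origin are placeholders.

# Pull CrUX field data for an origin and print the p75 values it returns,
# including any newer metrics available for that origin.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

resp = requests.post(
    ENDPOINT,
    json={"origin": "https://www.example.com", "formFactor": "PHONE"},
    timeout=30,
)
resp.raise_for_status()
metrics = resp.json()["record"]["metrics"]

# Omitting the "metrics" field in the request returns everything CrUX has for this origin,
# so there is no need to hard-code the newer metric names here.
for name, data in sorted(metrics.items()):
    p75 = data.get("percentiles", {}).get("p75")
    if p75 is not None:
        print(f"{name}: p75 = {p75}")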


Featured Image: julslst/Shutterstock

How To Use XML Sitemaps To Boost SEO via @sejournal, @jes_scholz

What was considered best practice yesterday does not hold true today, and this is especially relevant when it comes to XML sitemaps, which are almost as old as SEO itself.

The problem is, it’s time-consuming to sort valuable advice from all the misinformation on forums and social media about how to optimize XML sitemaps.

So, while most of us recognize the importance of submitting sitemaps to Google Search Console and Bing Webmaster Tools, and referencing them in the robots.txt file – for faster content discovery and refresh, more efficient crawling of SEO-relevant pages, and valuable indexing reporting to identify SEO issues – the finer details of implementing sitemaps to improve SEO performance may be missed.

Let’s clear up the confusion and dive into the current best practices for sitemap optimization.

In this article, we cover:

  • What is an XML sitemap?
  • How to create a sitemap.
  • Valid XML sitemap format.
  • Types of sitemaps.
  • Optimization of XML sitemaps.
  • XML sitemap best practice checklist.

What Is An XML Sitemap?

An XML sitemap is a file that lists all of your website’s URLs.

It acts as a roadmap to tell the crawlers of indexing platforms (like search engines, but also large language models (LLMs)) what content is available and how to reach it.

Sitemap vs. website crawling: Image from author, February 2025

In the example above, a search engine will find all nine pages in a sitemap with one visit to the XML sitemap file.

On the website, it will have to jump through five internal links on five pages to find page 9.

This ability of XML sitemaps to assist crawlers in faster indexing is especially important for websites that:

  • Have thousands of pages and/or a deep website architecture.
  • Frequently add new pages.
  • Frequently change the content of existing pages.
  • Suffer from weak internal linking and orphan pages.
  • Lack a strong external link profile.

Even though indexing platforms could technically find your URLs without it, by including pages in an XML sitemap, you’re indicating that you consider them to be quality landing pages.

And while there is no guarantee that an XML sitemap will get your pages crawled faster, let alone indexed or ranked, submitting one certainly increases your chances.

How To Create A Sitemap

There are two ways to create a sitemap: static sitemaps, which must be manually updated, and dynamic sitemaps, which are updated in real time or by a regular cron job.

Static sitemaps are simple to create using a tool such as Screaming Frog.

The problem is that as soon as you create or remove a page, your sitemap is outdated. If you modify the content of a page, the sitemap won’t automatically update the lastmod tag.

So, unless you love manually creating and uploading sitemaps for every single change, it’s best to avoid static sitemaps.

Dynamic XML sitemaps, on the other hand, are automatically updated by your server to reflect relevant website changes.

To create a dynamic XML sitemap you can do one of the following:

  • Ask your developer to code a custom script, being sure to provide clear specifications.
  • Use a dynamic sitemap generator tool.
  • Install a plugin for your content management system (CMS), for example, Yoast plugin for WordPress.

Valid XML Sitemap Format

Image from author, February 2025

Your sitemap needs three items to introduce itself to indexing platforms:

  • XML Version Declaration: Specifies the file type to inform indexing platforms what they can expect from the file.
  • UTF-8 Encoding: Ensures all the characters used can be understood.
  • Namespace Declaration: Communicates what rules the sitemap follows. Most sitemaps use the “http://www.sitemaps.org/schemas/sitemap/0.9” namespace to show that the file conforms to standards set by sitemaps.org.

This is followed by a URL container for each page. In a standard XML sitemap, there are only two tags that should be included for a URL:

  1. Loc (a.k.a. Location) Tag: This compulsory tag contains the absolute, canonical version of the URL location. It should accurately reflect your site protocol (http or https) and whether you have chosen to include or exclude www.
  2. Lastmod (a.k.a. Last Modified) Tag: An optional but highly recommended tag to communicate the date and time the page was published or the last meaningful change. This helps indexing platforms understand which pages have fresh content and prioritize them for crawling.

Google’s documentation on sitemaps states:

“Google uses the value if it’s consistently and verifiably (for example by comparing to the last modification of the page) accurate. The value should reflect the date and time of the last significant update to the page. For example, an update to the main content, the structured data, or links on the page is generally considered significant, however an update to the copyright date is not.”

Bing’s documentation agrees on the importance of the lastmod tag:

“The “lastmod” tag is used to indicate the last time the web pages linked by the sitemaps were modified. This information is used by search engines to determine how frequently to crawl your site, and to decide which pages to index and which to leave out.”

Mistakes, such as updating the value when the sitemap is generated rather than when the individual page was last modified, or worse, trying to manipulate crawlers by updating the date without significantly altering the page, may result in search engines ignoring this signal for your website, damaging your ability to have your content efficiently crawled.

Do not include the Changefreq (a.k.a. Change Frequency) or Priority tags. Once upon a time, these hinted at how often a page should be crawled and how important it was, but they are now ignored by search engines.
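
To make those requirements concrete, here is a minimal sketch of the kind of custom script a developer might write to generate a dynamic sitemap (the custom-script option mentioned in the previous section). The get_published_pages() function and its url/updated_at fields are hypothetical stand-ins for your own CMS or database query.

    from datetime import datetime, timezone
    import xml.etree.ElementTree as ET

    SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

    def get_published_pages():
        # Hypothetical data source - in practice, query your CMS or database
        # for each page's canonical URL and its last meaningful edit.
        return [
            {"url": "https://www.example.com/", "updated_at": datetime(2025, 1, 21, tzinfo=timezone.utc)},
            {"url": "https://www.example.com/category/shoes/", "updated_at": datetime(2025, 1, 28, tzinfo=timezone.utc)},
        ]

    def build_sitemap(pages):
        ET.register_namespace("", SITEMAP_NS)  # namespace declaration on <urlset>
        urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
        for page in pages:
            url_el = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
            ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc").text = page["url"]
            # lastmod reflects the page's own modification time, never the time
            # the sitemap happens to be generated - and no changefreq or priority.
            ET.SubElement(url_el, f"{{{SITEMAP_NS}}}lastmod").text = page["updated_at"].isoformat()
        return ET.ElementTree(urlset)

    # write() adds the XML declaration and UTF-8 encoding automatically.
    build_sitemap(get_published_pages()).write("sitemap.xml", encoding="utf-8", xml_declaration=True)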

Types Of Sitemaps

There are many different types of sitemaps. Let’s look at the ones you actually need.

XML Sitemap Index

XML sitemaps have a couple of limitations:

  • A maximum of 50,000 URLs.
  • An uncompressed file size limit of 50 MB.

Sitemaps can be compressed using gzip to save bandwidth for your server. But once unzipped, the sitemap still can’t exceed either limit.

Whenever you exceed either limit, you will need to split your URLs across multiple XML sitemaps.

Those sitemaps can then be combined into a single XML sitemap index file, often named sitemap-index.xml. Essentially, it is a sitemap for sitemaps.

You can create multiple sitemap index files. But be aware that you cannot nest sitemap index files.

For indexing platforms to easily find every one of your sitemap files, you will want to:

  • Submit your sitemap index to Google Search Console and Bing Webmaster Tools.
  • Specify your sitemap or sitemap index URL(s) in your robots.txt file, pointing indexing platforms directly to your sitemap as soon as they arrive to crawl.
Image from author, February 2025
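
As a rough illustration of the mechanics, the sketch below splits a large URL list into files of up to 50,000 URLs and builds the index that references them. The domain, file names, and all_urls list are placeholders, and lastmod is omitted for brevity (include it as in the earlier sketch).

    import xml.etree.ElementTree as ET

    SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    MAX_URLS_PER_FILE = 50000  # protocol limit; smaller chunks are also fine

    def write_urlset(urls, path):
        ET.register_namespace("", SITEMAP_NS)
        urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
        for url in urls:
            url_el = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
            ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc").text = url
        ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

    def write_sitemap_index(sitemap_locs, path):
        ET.register_namespace("", SITEMAP_NS)
        index = ET.Element(f"{{{SITEMAP_NS}}}sitemapindex")
        for loc in sitemap_locs:
            entry = ET.SubElement(index, f"{{{SITEMAP_NS}}}sitemap")
            ET.SubElement(entry, f"{{{SITEMAP_NS}}}loc").text = loc
        ET.ElementTree(index).write(path, encoding="utf-8", xml_declaration=True)

    # Hypothetical URL list - in practice this comes from your database or CMS.
    all_urls = [f"https://www.example.com/product/{i}/" for i in range(1, 120001)]

    sitemap_locs = []
    for start in range(0, len(all_urls), MAX_URLS_PER_FILE):
        filename = f"sitemap-products-{start // MAX_URLS_PER_FILE + 1}.xml"
        write_urlset(all_urls[start:start + MAX_URLS_PER_FILE], filename)
        sitemap_locs.append(f"https://www.example.com/{filename}")

    write_sitemap_index(sitemap_locs, "sitemap-index.xml")

    # robots.txt then needs only a single reference:
    # Sitemap: https://www.example.com/sitemap-index.xml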

Image Sitemap

Image sitemaps were designed to improve the indexing of image content, originally offering additional tags.

In modern-day SEO, however, it’s best practice to utilize JSON-LD schema.org/ImageObject markup to call out image properties to indexing platforms, as it provides more attributes than an image XML sitemap.

Because of this, a dedicated XML image sitemap is unnecessary. Simply add the image XML namespace declaration and the image tags directly to the main XML sitemap within the associated URL.

Image from author, February 2025

Know that images don’t have to be on the same domain as your website to be submitted in a sitemap. You can use a CDN as long as it’s verified in Google Search Console.
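
For illustration, a URL entry carrying the image extension might look like the snippet below (held here in a Python string); the page and CDN image URLs are placeholders.

    # The image namespace is declared on <urlset>, and the image tags sit
    # inside the <url> entry for the page that hosts the image.
    SITEMAP_WITH_IMAGES = """<?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>https://www.example.com/product/trail-shoe/</loc>
        <lastmod>2025-02-01T09:30:00+00:00</lastmod>
        <image:image>
          <image:loc>https://cdn.example.com/images/trail-shoe-hero.jpg</image:loc>
        </image:image>
      </url>
    </urlset>"""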

Video Sitemap

Similar to images, Google says video sitemap tags can be added within an existing sitemap.

However, unlike images, video extensions in sitemaps offer a multitude of additional tags.

Image from author, February 2025

If you leverage these tags extensively, consider a dedicated video sitemap.

Adding these extensions significantly increases the file size of your sitemap and may push you past the file size limits.

Either method will help Google discover, crawl, and index your video content as long as the video is related to the content of the host page and is accessible to Googlebot.

While Bing does support video extensions in XML sitemaps, Fabrice Canel confirmed to me that they prefer submission via IndexNow, although Bing’s documentation still mentions the mRSS format.
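
For illustration, a video entry inside the main sitemap might look like the placeholder snippet below, with hypothetical URLs and metadata (see Google’s video sitemap documentation for the full tag list).

    # The video namespace is declared on <urlset>; thumbnail, title,
    # description, and a content (or player) location are the core tags.
    SITEMAP_WITH_VIDEO = """<?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
      <url>
        <loc>https://www.example.com/guides/lacing-techniques/</loc>
        <video:video>
          <video:thumbnail_loc>https://cdn.example.com/video/lacing-thumb.jpg</video:thumbnail_loc>
          <video:title>Five lacing techniques for trail runners</video:title>
          <video:description>A short walkthrough of lacing patterns that reduce heel slip.</video:description>
          <video:content_loc>https://cdn.example.com/video/lacing.mp4</video:content_loc>
        </video:video>
      </url>
    </urlset>"""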

Google News Sitemap

Google News sitemaps can only be used for article content that was created in the last two days. Once the articles are older than 48 hours, remove the URLs from the sitemap.

Again, while Google News sitemap tags can be included in your regular sitemap, this is not recommended.

Unlike the image and video extensions, the news sitemap extension is leveraged only by Google, not by Bing or other indexers.

Image from author, February 2025

Contrary to some online advice, Google News sitemaps don’t support image URLs.

HTML Sitemap

XML sitemaps take care of indexing platform needs. HTML sitemaps were designed to assist human users in finding content.

The question becomes: If you have a good user experience and well-crafted internal links, do you need an HTML sitemap?

Check the page views of your HTML sitemap in Google Analytics. Chances are, they’re very low. If they aren’t, that’s a good indication you need to improve your website navigation.

HTML sitemaps are generally linked in website footers, taking link equity from every single page of your website.

Ask yourself: Is that the best use of that link equity? Or are you including an HTML sitemap as a nod to legacy website best practices?

If few humans use it, and indexing platforms don’t need it as you have strong internal linking and an XML sitemap, does that HTML sitemap have a reason to exist? I would argue no.

XML Sitemap Optimization

XML sitemap optimization involves how you structure your sitemaps and what URLs are included.

How you choose to do this impacts how efficiently indexing platforms crawl your website and, thus, your content visibility.

Here are four ways to optimize XML sitemaps:

1. Only Include SEO Relevant Pages In XML Sitemaps

An XML sitemap is a list of pages you want to be crawled (and subsequently given visibility to by indexing platforms), which isn’t necessarily every page of your website.

A bot arrives at your website with an “allowance” for how many pages it will crawl.

The XML sitemap indicates that you consider the included URLs more important than pages that are crawlable but not listed in the sitemap.

You’re using it to tell indexing platforms, “I’d really appreciate it if you’d focus on these URLs in particular.”

To help them crawl your site more intelligently and reap the benefits of faster (re)indexing, do not include:

  • 301 redirect URLs.
  • 404 or 410 URLs.
  • Non-canonical URLs.
  • Pages with noindex tags.
  • Pages blocked by robots.txt.
  • Paginated pages.
  • Parameter URLs that aren’t SEO-relevant.
  • Resource pages accessible by a lead gen form (e.g., white paper PDFs).
  • Utility pages that are useful to users, but not intended to be landing pages (login page, contact us, privacy policy, account pages, etc.).

I’ve seen recommendations to add 3xx, 4xx, or non-indexable pages to sitemaps in the hope it will speed up deindexing.

But similar to manipulating the lastmod date, such attempts to get these pages processed faster may result in search engines ignoring your sitemaps as a signal, damaging your ability to have your valuable content efficiently crawled.

But remember, Google is going to use your XML sitemap submission only as a hint about what’s important on your site.

Just because it’s not in your XML sitemap doesn’t necessarily mean that Google won’t index those pages.
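
If your sitemap is generated from a crawl export or CMS data, a simple eligibility filter can enforce the exclusions above before any XML is written. The sketch below is illustrative only; the field names on each page record are hypothetical placeholders for whatever your own data source exposes.

    # Hypothetical page inventory - e.g., rows exported from a crawler or CMS.
    all_pages = [
        {"url": "https://www.example.com/category/shoes/", "status_code": 200,
         "canonical_url": "https://www.example.com/category/shoes/",
         "noindex": False, "blocked_by_robots": False, "is_landing_page": True},
    ]

    def is_sitemap_eligible(page):
        """Keep only indexable, canonical, SEO-relevant landing pages."""
        return (
            page["status_code"] == 200                    # drops 3xx, 4xx, and 410 URLs
            and page["canonical_url"] == page["url"]      # drops non-canonical variants
            and not page["noindex"]                       # drops noindexed pages
            and not page["blocked_by_robots"]             # drops robots.txt-blocked pages
            and page["is_landing_page"]                   # drops utility, paginated, and gated pages
        )

    sitemap_urls = [p["url"] for p in all_pages if is_sitemap_eligible(p)]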

2. Ensure Your XML Sitemap Is Valid

XML sitemap validators can tell you if the XML code is valid. But this alone is not enough.

There might be another reason why Google or Bing can’t fetch your sitemap, such as robots directives. Third-party tools won’t be able to identify this.

As such, the most efficient way to ensure your sitemap is valid is to submit it directly to Google Search Console and Bing Webmaster Tools.

Image from author, February 2025

When valid in GSC and BWT, you will see the green “Success” status.

Image from author, February 2025

If you get a red message instead, click on the error to find out why, fix it, and resubmit.

But in Google Search Console and Bing Webmaster Tools, you can do so much more than simple validation.

3. Leverage Sitemap Reporting For Indexing Analysis

Image from author, February 2025

Say you submit 80,000 pages all in one sitemap index, and 9,000 are excluded by both Google and Bing.

Sitemap reporting will help you understand the overarching why, but it provides limited detail on which URLs are problematic.

So, while it’s valuable information, it’s not easily actionable. You need to discover which types of pages were left out.

What if you use descriptive sitemap names that reflect the sections of your website – one for categories, products, articles, etc.?

Image from author, February 2025

Then, you can drill down to see that 7,000 of the 9,000 non-indexed URLs are category pages – and clearly know where to focus attention.

This can also be done within a sitemap index file.

Now, I know both Google and sitemaps.org show examples encouraging numbered naming. So, you may have ended up with a /sitemap-products-index.xml file containing something like this:

  • /products-1.xml
  • /products-2.xml

That is not the most insightful naming convention. What if we break it down by parent category instead? For example:

  • /products-mens.xml
  • /products-womens.xml
  • /products-kids.xml

And if your website is multilingual, be sure to leverage language as an additional separation layer.

Such smart structuring of sitemaps to group by page type allows you to dive into the data more efficiently and isolate indexing issues.

Just remember, for this to effectively work, sitemaps need to be mutually exclusive, with each URL existing in only one sitemap. The exception is the Google News sitemap.

4. Strategize Sitemap Size

As mentioned before, search engines impose a limit of 50,000 URLs per sitemap file.

Some SEO specialists intentionally reduce this number, say to 10,000. This can be helpful to speed up indexing.

However, you can only download 1,000 URLs in GSC. So, if 2,000 URLs in a certain sitemap are not indexed, you can only access half of them. If you are trying to do content cleanup, this will not be enough.

Image from author, February 2025

To gain full visibility on all URLs causing issues, break sitemaps down into groups of 1,000.

The obvious downside is a higher setup time, as every sitemap needs to be submitted in Google Search Console and Bing Webmaster Tools. This may also require high levels of ongoing management.
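
If you go down this route, scripting the submissions can take the manual pain out of it, at least on the Google side. The sketch below assumes the Search Console API’s sitemaps.submit method, a service account with access to the property, and the google-api-python-client library; the file paths, property identifier, and sitemap URLs are placeholders, so verify the details against the current API documentation.

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/webmasters"]
    credentials = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES  # placeholder key file
    )
    service = build("searchconsole", "v1", credentials=credentials)

    # Placeholder property - use "https://www.example.com/" for a URL-prefix property.
    site_url = "sc-domain:example.com"

    # Submit each small sitemap file; re-submitting an existing one is harmless.
    sitemap_urls = [f"https://www.example.com/sitemaps/products-{i}.xml" for i in range(1, 81)]
    for sitemap_url in sitemap_urls:
        service.sitemaps().submit(siteUrl=site_url, feedpath=sitemap_url).execute()
        print(f"Submitted {sitemap_url}")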

XML Sitemap Best Practice Checklist

Do invest time to:

✓ Dynamically generate XML sitemaps.

✓ Compress sitemap files.

✓ Use a sitemap index file.

✓ Include the loc and lastmod tags.

✓ Use image tags in existing sitemaps.

✓ Use video and Google News sitemaps if relevant.

✓ Reference sitemap URLs in robots.txt.

✓ Submit sitemaps to both Google Search Console and Bing Webmaster Tools.

✓ Include only SEO-relevant pages in XML sitemaps.

✓ Ensure URLs are included only in a single sitemap.

✓ Ensure the sitemap code is error-free.

✓ Group URLs in descriptively named sitemaps based on page type.

✓ Strategize how to break down large sitemap files.

✓ Use Google Search Console and Bing Webmaster Tools to analyze indexing rates.

Now, go check your own sitemaps and make sure you’re doing it right.


Featured Image: BEST-BACKGROUNDS/Shutterstock

Google’s JavaScript SERPs Impact Trackers, AI

Google’s search engine results pages now require JavaScript, effectively “hiding” the listings from organic rank trackers, artificial intelligence models, and other optimization tools.

The world’s most popular search engine began requiring JavaScript on search pages last month. Google stated the move aimed to protect its services from bots and “abuse,” perhaps a thinly veiled allusion to competitive AI.

These changes could complicate search engine optimization in at least three ways: rank tracking, keyword research, and AI visibility.

Google Search now requires browsers to have JavaScript enabled.

Impact of JavaScript

Web crawlers can scrape and index JavaScript-enabled pages even when the JavaScript itself renders the content. Googlebot does this, for example.

A web-scraping bot grabs the content of an HTML page in four steps, more or less:

  • Request. The crawler sends a simple HTTP GET request to the URL.
  • Response. The server returns the HTML content.
  • Parse. The crawler parses (analyzes) the HTML, gathering the content.
  • Use. The content is passed on for storage or use.

For example, before the JavaScript switch, bots from Ahrefs and Semrush crawled Google SERPs. A bot could visit the SERP for, say, “men’s running shoes,” parse the HTML, and use the data to produce rank-tracking and traffic reports.
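
Those four steps map onto only a few lines of code. The sketch below uses Python’s requests and BeautifulSoup libraries against a placeholder URL standing in for any page that serves its content as static HTML.

    import requests
    from bs4 import BeautifulSoup

    url = "https://www.example.com/blog/"  # placeholder static-HTML page

    # 1. Request + 2. Response: a single HTTP GET returns the full HTML.
    response = requests.get(url, headers={"User-Agent": "example-crawler/1.0"}, timeout=10)
    response.raise_for_status()

    # 3. Parse: extract structured data straight from the markup.
    soup = BeautifulSoup(response.text, "html.parser")
    links = [
        {"title": a.get_text(strip=True), "href": a.get("href")}
        for a in soup.select("a[href]")
    ]

    # 4. Use: hand the extracted data off for storage or reporting.
    print(links[:10])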

The process is more complicated with JavaScript.

  • Request. The crawler sends a simple HTTP GET request to the URL.
  • Response. The server returns a basic HTML skeleton, often with little or no visible content (e.g., an empty placeholder element that JavaScript later fills in).
  • Execute. To run the JavaScript and load dynamic content, the crawler renders the page in a headless browser such as Puppeteer, Playwright, or Selenium.
  • Wait. The crawler waits for the page to load, including API calls and data updates. A few milliseconds might seem insignificant, but it slows down the crawlers and adds costs.
  • Parse. The crawler parses the dynamic and static HTML, gathering the content as before.
  • Use. The content is passed on for storage or use.

The two additional steps — Execute and Wait — are far from trivial since they require full browser simulation and thus much more CPU and RAM. Some have estimated that JavaScript-enabled crawling takes three to 10 times more computing resources than scraping static HTML.
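
The extra Execute and Wait steps are where that cost comes from. A minimal sketch using Playwright’s headless Chromium, with a placeholder URL (and subject to whatever terms of service the target site imposes), might look like this.

    from bs4 import BeautifulSoup
    from playwright.sync_api import sync_playwright

    url = "https://www.example.com/app/"  # placeholder JavaScript-rendered page

    with sync_playwright() as p:
        # Execute: a full headless browser runs the page's JavaScript.
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        # Wait: hold until network activity settles so API-driven content has loaded.
        page.goto(url, wait_until="networkidle", timeout=30000)
        rendered_html = page.content()
        browser.close()

    # Parse + Use: only now does the familiar static-HTML workflow resume.
    soup = BeautifulSoup(rendered_html, "html.parser")
    print(len(soup.select("a[href]")), "links found after rendering")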

Feature              | HTML Scraping                   | JavaScript Scraping
Initial response     | Full HTML content               | Minimal HTML with placeholders
JavaScript execution | Not required                    | Required
Tools                | Requests, BeautifulSoup, Scrapy | Puppeteer, Playwright, Selenium
Performance          | Faster, lightweight             | Slower, resource-heavy
Content availability | Static content only             | Both static and dynamic content
Complexity           | Low                             | High

It is worth clarifying that Google does not render the entire SERP with JavaScript, instead requiring that visitors’ browsers enable JavaScript — essentially the same impact.

The time and resources needed to crawl a SERP vary greatly. Hence, one cannot easily assess the impact of Google’s new JavaScript requirement on any given tool beyond an educated guess.

Rank tracking

Marketers use organic rank-tracking tools to monitor where a web page appears on Google SERPs — listings, featured snippets, knowledge panels, local packs — for target keywords.

Semrush, Ahrefs, and other tools crawl millions, if not billions, of SERPs monthly. Rendering and parsing those dynamic results pages could raise costs significantly, perhaps fivefold.

For marketers, this potential increase might mean tracking tools become more expensive or relatively less accurate if they crawl SERPs infrequently.

Keyword research

Google’s JavaScript requirement may also impact keyword research since identifying relevant, high-traffic keywords could become imprecise and more costly.

These changes may force marketers to find other ways to identify content topics and keyword gaps. Kevin Indig, a respected search engine optimizer, suggested that marketers turn to page- or domain-level traffic metrics if keyword data becomes unreliable.

AI models

The hype surrounding AI engines reminds me of voice search a few years ago, although the former is becoming much more transformative.

AI models likely crawled Google results to discover pages and content. An AI model asked to find the best running shoe for a 185-pound male might scrape a Google SERP and follow links to the top 10 sites. Thus, some marketers expected a halo effect from ranking well on Google.

But AI models must now spend extra time and computing power to parse Google’s JavaScript-driven results pages.

Wait and Adapt

As is often the case with Google’s changes, marketers must wait to gauge the JavaScript effect, but one thing is certain: SEO is changing.