Optimizing Interaction To Next Paint (INP): A Step-By-Step Guide via @sejournal, @DebugBear

This post was sponsored by DebugBear. The opinions expressed in this article are the sponsor’s own.

Keeping your website fast is important for user experience and SEO.

The Core Web Vitals initiative by Google provides a set of metrics to help you understand the performance of your website.

The three Core Web Vitals metrics are:

  • Largest Contentful Paint (LCP)
  • Cumulative Layout Shift (CLS)
  • Interaction to Next Paint (INP)

This post focuses on the recently introduced INP metric and what you can do to improve it.

How Is Interaction To Next Paint Measured?

INP measures how quickly your website responds to user interactions – for example, a click on a button. More specifically, INP measures the time in milliseconds between the user input and when the browser has finished processing the interaction and is ready to display any visual updates on the page.

Your website needs to complete this process in under 200 milliseconds to get a “Good” score. Values over half a second are considered “Poor”. A poor score in a Core Web Vitals metric can negatively impact your search engine rankings.
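To make those thresholds concrete, here is a minimal sketch – not Google's implementation – of how INP is derived from a page's interaction latencies: for every 50 interactions, the single worst outlier is ignored, and the next-highest latency (in milliseconds) becomes the reported value.

```javascript
// Minimal sketch of how INP is derived from interaction latencies (ms).
// For every 50 interactions, the single worst outlier is ignored;
// the next-highest latency becomes the INP value.
function computeINP(durations) {
  if (durations.length === 0) return null;
  const sorted = [...durations].sort((a, b) => b - a); // slowest first
  const outliersToSkip = Math.floor(durations.length / 50);
  const index = Math.min(outliersToSkip, sorted.length - 1);
  return sorted[index];
}

// Fewer than 50 interactions: INP is simply the slowest one.
console.log(computeINP([40, 120, 600, 90])); // → 600
```

With 600 milliseconds as the slowest interaction, this hypothetical page would land in the "Poor" bucket; under 200 milliseconds it would score "Good."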

Google collects INP data from real visitors on your website as part of the Chrome User Experience Report (CrUX). This CrUX data is what ultimately impacts rankings.

Image created by DebugBear, May 2024

How To Identify & Fix Slow INP Times

The factors causing poor Interaction to Next Paint can often be complex and hard to figure out. Follow this step-by-step guide to understand slow interactions on your website and find potential optimizations.

1. How To Identify A Page With Slow INP Times

Different pages on your website will have different Core Web Vitals scores. So you need to identify a slow page and then investigate what’s causing it to be slow.

Using Google Search Console

One easy way to check your INP scores is using the Core Web Vitals section in Google Search Console, which reports data based on the Google CrUX data we’ve discussed before.

By default, page URLs are grouped into URL groups that cover many different pages. Be careful here – not all pages might have the problem that Google is reporting. Instead, click on each URL group to see if URL-specific data is available for some pages and then focus on those.

Screenshot of Google Search Console, May 2024

Using A Real-User Monitoring (RUM) Service

Google won’t report Core Web Vitals data for every page on your website, and it only provides the raw measurements without any details to help you understand and fix the issues. To get those details, you can use a real-user monitoring tool like DebugBear.

Real-user monitoring works by installing an analytics snippet on your website that measures how fast your website is for your visitors. Once that’s set up you’ll have access to an Interaction to Next Paint dashboard like this:

Screenshot of the DebugBear Interaction to Next Paint dashboard, May 2024

You can identify pages you want to optimize in the list, hover over the URL, and click the funnel icon to look at data for that specific page only.

Image created by DebugBear, May 2024

2. Figure Out What Element Interactions Are Slow

Different visitors on the same page will have different experiences. A lot of that depends on how they interact with the page: if they click on a background image there’s no risk of the page suddenly freezing, but if they click on a button that starts some heavy processing then that’s more likely. And users in that second scenario will experience much higher INP.

To help with that, RUM data provides a breakdown of what page elements users interacted with and how big the interaction delays were.

Screenshot of the DebugBear INP Elements view, May 2024

The screenshot above shows different INP interactions sorted by how frequent these user interactions are. To make optimizations as easy as possible you’ll want to focus on a slow interaction that affects many users.

In DebugBear, you can click on the page element to add it to your filters and continue your investigation.

3. Identify What INP Component Contributes The Most To Slow Interactions

INP delays can be broken down into three different components:

  • Input Delay: Background code that blocks the interaction from being processed.
  • Processing Time: The time spent directly handling the interaction.
  • Presentation Delay: Displaying the visual updates to the screen.
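These three components map directly onto fields of the browser's PerformanceEventTiming entries. A small sketch, with invented timing values:

```javascript
// Split an Event Timing entry into the three INP components.
// Field names mirror PerformanceEventTiming; the sample values are invented.
function breakdownINP(entry) {
  return {
    inputDelay: entry.processingStart - entry.startTime,
    processingTime: entry.processingEnd - entry.processingStart,
    presentationDelay: entry.startTime + entry.duration - entry.processingEnd,
  };
}

const example = { startTime: 1000, processingStart: 1030, processingEnd: 1250, duration: 320 };
console.log(breakdownINP(example));
// → { inputDelay: 30, processingTime: 220, presentationDelay: 70 }
```

In this invented example, processing time (220 ms) dominates the 320 ms total.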

Identify which INP component is the biggest contributor to the slow INP time, and keep that in mind throughout your investigation.

Screenshot of the DebugBear INP Components, May 2024

In this scenario, Processing Time is the biggest contributor to the slow INP time for the set of pages you’re looking at, but you need to dig deeper to understand why.

High processing time indicates that code intercepting the user interaction is running slowly. If you instead saw a high input delay, that would suggest background tasks are blocking the interaction from being processed – for example, due to third-party scripts.

4. Check Which Scripts Are Contributing To Slow INP

Sometimes browsers report specific scripts that are contributing to a slow interaction. Your website likely contains both first-party and third-party scripts, both of which can contribute to slow INP times.

A RUM tool like DebugBear can collect and surface this data. The main thing you want to look at is whether you mostly see your own website code or code from third parties.

Screenshot of the INP Primary Script Domain Grouping in DebugBear, May 2024

Tip: When you see a script or source code function marked as “N/A”, this can indicate that the script comes from a different origin and has additional security restrictions that prevent RUM tools from capturing more detailed information.

This now begins to tell a story: it appears that analytics/third-party scripts are the biggest contributors to the slow INP times.

5. Identify Why Those Scripts Are Running

At this point, you now have a strong suspicion that most of the INP delay, at least on the pages and elements you’re looking at, is due to third-party scripts. But how can you tell whether those are general tracking scripts or if they actually have a role in handling the interaction?

DebugBear offers a breakdown that helps you see why the code is running, called the INP Primary Script Invoker breakdown. That’s a bit of a mouthful – multiple different scripts can be involved in slowing down an interaction, and here you just see the biggest contributor. The “Invoker” is simply a value the browser reports about what caused this code to run.

Screenshot of the INP Primary Script Invoker Grouping in DebugBear, May 2024

The following invoker names are examples of page-wide event handlers:

  • onclick
  • onmousedown
  • onpointerup

You can see those a lot in the screenshot above, which tells you that the analytics script is tracking clicks anywhere on the page.

In contrast, if you saw invoker names like these that would indicate event handlers for a specific element on the page:

  • .load_more.onclick
  • #logo.onclick

6. Review Specific Page Views

A lot of the data you’ve seen so far is aggregated. It’s now time to look at the individual INP events, to form a definitive conclusion about what’s causing slow INP in this example.

Real user monitoring tools like DebugBear generally offer a way to review specific user experiences. For example, you can see what browser they used, how big their screen is, and what element led to the slowest interaction.

Screenshot of a Page View in DebugBear Real User Monitoring, May 2024

As mentioned before, multiple scripts can contribute to overall slow INP. The INP Scripts section shows you the scripts that were run during the INP interaction:

Screenshot of the DebugBear INP script breakdown, May 2024

You can review each of these scripts in more detail to understand why they run and what’s causing them to take longer to finish.

7. Use The DevTools Profiler For More Information

Real user monitoring tools have access to a lot of data, but for performance and security reasons they can only capture a subset of what the browser knows. That’s why it’s a good idea to also use Chrome DevTools to measure your page performance.

To debug INP in DevTools you can measure how the browser processes one of the slow interactions you’ve identified before. DevTools then shows you exactly how the browser is spending its time handling the interaction.

Screenshot of a performance profile in Chrome DevTools, May 2024

How You Might Resolve This Issue

In this example, you or your development team could resolve this issue by:

  • Working with the third-party script provider to optimize their script.
  • Removing the script if it is not essential to the website, or finding an alternative provider.
  • Adjusting how your own code interacts with the script.
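A common pattern behind fixes like these – sketched generically here, not taken from DebugBear's recommendations – is to break long tasks into chunks and yield back to the main thread between them, so the browser can handle input and paint:

```javascript
// Yield to the main thread by scheduling the continuation in a new task.
function yieldToMain() {
  return new Promise((resolve) => setTimeout(resolve, 0));
}

// Process a list in chunks, yielding between items so an interaction
// handler never blocks the main thread for long. Names are illustrative.
async function processChunks(items, handleItem) {
  const results = [];
  for (const item of items) {
    results.push(handleItem(item));
    await yieldToMain(); // browser can paint / respond to input here
  }
  return results;
}

processChunks([1, 2, 3], (n) => n * 2).then((r) => console.log(r)); // → [ 2, 4, 6 ]
```

Newer browsers also offer `scheduler.yield()` for the same purpose, but the `setTimeout` fallback above works everywhere.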

How To Investigate High Input Delay

In the previous example most of the INP time was spent running code in response to the interaction. But often the browser is already busy running other code when a user interaction happens. When investigating the INP components you’ll then see a high input delay value.

This can happen for various reasons, for example:

  • The user interacted with the website while it was still loading.
  • A scheduled task is running on the page, for example an ongoing animation.
  • The page is loading and rendering new content.

To understand what’s happening, you can review the invoker name and the INP scripts section of individual user experiences.

Screenshot of the INP Component breakdown within DebugBear, May 2024

In this screenshot, you can see that a timer is running code that coincides with the start of a user interaction.

The script can be opened to reveal the exact code that is run:

Screenshot of INP script details in DebugBear, May 2024

The source code shown in the previous screenshot comes from a third-party user tracking script that is running on the page.

At this stage, you and your development team can continue with the INP workflow presented earlier in this article. For example, debugging with browser DevTools or contacting the third-party provider for support.

How To Investigate High Presentation Delay

Presentation delay tends to be more difficult to debug than input delay or processing time. Often it’s caused by browser behavior rather than a specific script. But as before, you still start by identifying a specific page and a specific interaction.

You can see an example interaction with high presentation delay here:

Screenshot of an interaction with high presentation delay, May 2024

You see that this happens when the user enters text into a form field. In this example, many visitors pasted large amounts of text that the browser had to process.

Here the fix was to delay the processing, show a “Waiting…” message to the user, and then complete the processing later on. You can see how the INP score improves from May 3:

Screenshot of an Interaction to Next Paint timeline in DebugBear, May 2024
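The "acknowledge now, process later" pattern behind that fix can be sketched generically with a debounce (the handler name and code are illustrative, not the site's actual implementation):

```javascript
// Debounce: postpone expensive work until input pauses, so the browser can
// paint quick feedback (e.g. a "Waiting…" message) first. Illustrative only.
function debounce(fn, delayMs) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Hypothetical handler: heavy text processing runs 250 ms after input stops.
const processPastedText = debounce((text) => {
  console.log(`Processed ${text.length} characters`);
}, 250);
```

Because the interaction handler now only schedules work instead of doing it, the browser can paint the "Waiting…" message immediately, which is what the INP metric measures.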

Get The Data You Need To Improve Interaction To Next Paint

Setting up real user monitoring helps you understand how users experience your website and what you can do to improve it. Try DebugBear now by signing up for a free 14-day trial.

Screenshot of the DebugBear Core Web Vitals dashboard, May 2024

Google’s CrUX data is aggregated over a 28-day period, which means that it’ll take a while before you notice a regression. With real-user monitoring you can see the impact of website changes right away and get alerted automatically when there’s a big change.

DebugBear monitors lab data, CrUX data, and real user data. That way you have all the data you need to optimize your Core Web Vitals in one place.

This article has been sponsored by DebugBear, and the views presented herein represent the sponsor’s perspective.

Ready to start optimizing your website? Sign up for DebugBear and get the data you need to deliver great user experiences.

Image Credits

Featured Image: Image by Redesign.co. Used with permission.

International SEO For 2024: 9-Point Checklist For Success via @sejournal, @LidiaInfanteM

Getting your international SEO strategy right can be an elusive feat.

There are a lot more factors at play than people give credit for, and it’s often a thankless job.

A successful international SEO strategy requires a deep knowledge of your company’s commercial strategy as well as technical SEO knowledge, cultural sensitivity, and excellent data skills.

Yet the industry often regards international SEO as just your hreflang setup.

In this article, I will distill the complexities of international SEO success into an actionable step-by-step list that will take you from beginner to advanced practitioner. Let’s begin!

Part I: Be Commercially Aware

1. Understand Why Your Company Is Going International

Companies can grow by expanding their products and services, focusing on gaining market penetration or expanding into new markets.

While your team’s goal might be traffic, leads, or revenue, the leadership team is likely working under a different set of parameters. Most of the time, leadership’s ultimate goal is to maximize shareholder value.

  • In founder-owned companies, growth goals might be slower and more sustainable, usually aimed at maintaining and growing profitability.
  • VC-owned companies have high growth goals because they must provide their investors with a return that’s higher than the stock market. This is what is known as the alpha, or your company’s ability to beat the market in growth.
  • Publicly traded companies are likely aiming to grow their share value.
  • Startups, depending on their maturity stage, are likely looking to prove product-market fit or expand their reach fast to show that their operations are scalable and have the potential to be profitable in the future. The goal of this is to aid in raising further capital from investors.

Understanding why businesses go international is essential for informing your SEO decisions. What’s best practice for SEO isn’t always what’s best for business.

You must adapt your strategy to your company’s growth model.

  • Companies choosing to grow sustainably and maintain profitability will likely expand more slowly to a market that resembles their core market.
  • VC-owned companies will be able to invest in a wider range of countries, with a smaller concern for providing their users with an experience on par with that of their core markets.
  • Startups can try to beat their competitors to market by expanding quickly and throwing a lot of money at the project, or they might be concerned with cash flow and try to expand fast but cut corners by using automatic translation.

2. Stack Rank Your Target Markets To Prioritize Your Investment

I promise I’ll get to hreflang implementation soon, but so much about international SEO has to do with commercial awareness – so bear with me; this will make you a better professional.

Many companies have different market tiers to reflect how much of a priority each market is. Market prioritization can happen using many different metrics, such as:

  • Average order value or lifetime customer value.
  • Amount of investment required.
  • Market size.
  • And market similarity.

American companies often prioritize developed English-speaking countries such as the UK, Canada, or Australia. These are most similar to their core market, and most of their market knowledge will be transferable.

After that, companies are likely to target large European economies, such as Germany and France. They might also target the LatAm market and Spain in the same effort.

The last prioritization tier can vary widely among companies, with a focus on the Nordic, Brazilian, or Asian markets.

Part II: Know Your Tech

3. Define Your International URL Structure

When doing international SEO, there are four possible URL structures, each with its pros and cons.

ccTLD Structure

A ccTLD structure is set up to target different countries based on the domain type.

This structure is not ideal for companies that target different languages rather than different countries. For example, a .es website is targeting Spain, not the Spanish language.

An advantage of this structure is that the ccTLD sends a very strong localization signal to search engines about which market you are targeting, and it can improve trust and CTR in your core country.

On the other hand, ccTLDs can dilute your site’s authority, as links will be spread across domains rather than concentrated on the .com.

gTLD With Subdirectories

This is my personal favorite when it comes to international SEO.

These URL structures can look like website.com/en if they’re targeting languages or website.com/en-gb if they’re targeting countries.

This configuration aggregates the authority you gain across your different territories into a single domain, it’s cheaper to maintain, and the .com TLD is widely recognizable by users worldwide.

On the other hand, this setup can look less personalized to people outside the US, who might wonder if you can service their markets.

gTLD With Subdomains

This setup involves placing international content on a subdomain like us.website.com. While once popular, it’s falling out of favor because it no longer brings anything unique to the table.

This setup offers a clear signal to users and search engines about the intended audience of a specific subdomain.

However, subdomains often face issues with SEO, as Google tends to view them as separate entities. This separation can dilute link equity, similar to the ccTLD approach but without the geo-targeting advantages.

gTLD With Parameters

This is the setup where you add parameters at the end of the URL to indicate the language of the page, such as website.com/?lang=en.

I strongly advise against this setup, as it can present multiple technical SEO challenges and trust issues.

4. Understand Your Hreflang Setup

In the words of John Mueller: hreflang can be one of the most complex aspects of SEO.

Tweet by John Mueller talking about how hreflang can be one of the more complex aspects of SEO. Screenshot from Twitter, May 2024

Hreflang reminds me of a multilingual form of a canonical tag, where we tell search engines that one document is a version of the other and explain the relationship between them.

I find hreflang implementation very interesting from a technical point of view, because development teams mostly manage it – and it can be very much hit or miss.

Often, hreflang is constructed from existing fields in your content management system (CMS) or content database.

You might find that your development team is pulling the HTML lang tag, which follows a different ISO standard than hreflang, leading to a broken implementation.

Other times, there is a field in your CMS that your development team pulls from to build your hreflang setup.

Finding out how your hreflang tags are generated can be extremely helpful in identifying the sources of different issues or mitigating potential risks.

So speak to your engineering team and ask them how you’re currently generating hreflang.

5. Implement Hreflang Without Errors

There are three ways to implement hreflang on your site:

  • On your sitemap.
  • Through your HTTP header.
  • On your HTML head.

The method most of us are most familiar with is the HTML head. And while you can use more than one method, they should match each other perfectly. Otherwise, you risk confusing search engines.

Here are some basic rules for getting it done correctly:

  • In your hreflang implementation, the URL must include domain and protocol.
  • You must follow the ISO 639-1 language codes (and, for country targeting, ISO 3166-1 Alpha-2 country codes) – don’t go around making up your own.
  • Hreflang tags must be reciprocal. If the page you’re listing as a language alternative does not list you back, your implementation won’t work.
  • Audit your hreflang regularly. My favorite tool for this, since it added the hreflang cluster analysis and link graphs, is Ahrefs. For the record, Ahrefs is not paying me to say this; it’s a genuine recommendation and has helped me a lot in my work.
  • You should only have one page per language.
  • Your hreflang URLs should be self-canonicalizing and respond with a 200 code.

Follow the above rules, and you’ll avoid the most common hreflang mistakes that SEO pros make.
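Putting those rules together, a reciprocal hreflang cluster in the HTML head might look like this (the domain and URLs are illustrative). Every page in the cluster must carry the same full set of tags, including a self-referencing one:

```html
<!-- Absolute URLs with protocol and domain; one URL per language/region -->
<link rel="alternate" hreflang="en-us" href="https://website.com/en-us/pricing/" />
<link rel="alternate" hreflang="en-gb" href="https://website.com/en-gb/pricing/" />
<link rel="alternate" hreflang="es" href="https://website.com/es/pricing/" />
<!-- Fallback for visitors whose language/region has no dedicated page -->
<link rel="alternate" hreflang="x-default" href="https://website.com/pricing/" />
```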

And if you’re interested in the technical SEO aspect beyond hreflang, I recommend reading Mind your language by Rob Owen.

Part III: Invest In Content Incrementally

6. Translate Your Top-performing Content Topics

Now that you have the basic commercial and technical knowledge covered, you’re ready to start creating a content strategy.

You likely have a wealth of content in your core market that can be recycled. But you want to focus on translating high-converting topics, not just any topic; otherwise, you might be wasting your budget!

Let’s go step by step.

Cluster Your Website’s Content By Topic

  • Crawl your site using your favorite SEO tool and extract the URL and H1.
  • Use ChatGPT to classify that list of URLs into topics. You might already know what you usually write about, so include those topics in your prompt. You don’t want a classification that’s too granular, so you can prompt ChatGPT to only create groups with a minimum of 10 URLs (adjust this to reflect the size of your website) and classify everything else as “other.” This is an example of what your prompt might look like: “I will provide you with a list of article titles and their corresponding URL. Classify this list into the following topics: survey best practices, research and analysis, employee surveys, market research and others. Return this in a table format with the URL, title and group name.”
  • Start a spreadsheet with all your URLs in the first column, titles in the second column, and the group they belong to in the third column.

Measure Your Performance By Topic

  • Export your GSC data and use a =VLOOKUP formula to match your clicks to your URLs.
  • Export your conversion data and use a =VLOOKUP formula to match your conversions (leads, sales, sign-ups, or revenue) to the right URL.
  • You can then copy your topics column onto a new sheet. Remove duplicates and use the =SUMIF formula to aggregate your click data and conversion data by topic.
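If you prefer code to spreadsheet formulas, the same sum-by-topic aggregation looks like this in JavaScript (the rows are invented sample data standing in for your joined GSC and conversion exports):

```javascript
// Aggregate clicks and conversions per topic, mirroring the SUMIF step.
// The rows below are invented sample data.
const rows = [
  { topic: 'employee surveys', clicks: 1200, conversions: 30 },
  { topic: 'employee surveys', clicks: 800, conversions: 12 },
  { topic: 'market research', clicks: 500, conversions: 25 },
];

const byTopic = {};
for (const row of rows) {
  const t = byTopic[row.topic] ?? { clicks: 0, conversions: 0 };
  t.clicks += row.clicks;
  t.conversions += row.conversions;
  byTopic[row.topic] = t;
}

console.log(byTopic);
// → { 'employee surveys': { clicks: 2000, conversions: 42 },
//     'market research': { clicks: 500, conversions: 25 } }
```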

Choose What Topics You’ll Be Translating First

Using this data, you can now choose what topics are most likely to drive conversions based on your core market data. Choose how many topics or pieces of content you’ll be translating based on your budget.

Personally, I like translating one topic at a time because I’ve found that generating topical authority on one specific topic makes it easier for me to rank on an adjacent topic that I write about next.

7. Localize Your English Content

Once you’re set up with all your key pages and a few content topics, it’s time to evaluate your investment and see where you could be getting a bigger return.

At this stage, many companies have translated their content into a few different languages and likely copied the US content into their UK and Australian sites. Now that you’ve done some translation, it’s time to work on localization.

If you’ve just copied your US content into your UK and Australian sites, your Google Search Console indexing report might be screaming at you, “Duplicate, Google selected a different canonical than the user.”

A very easy fix that could yield great returns is to localize your English content to the nuances of those English-speaking markets.

You will want to instruct your translation and localization providers to adapt the spellings of certain words, change the choice of words, introduce local expressions, and update any cited US statistics with their local equivalents.

For example, if I’m targeting a British audience, “analyze” becomes “analyse,” a “stroller” becomes a “pram,” and “soccer” becomes “football.”

8. Invest In In-market Content

Once you’ve got the basics in place, you can start tackling the specific needs of other markets. This strategy is expensive, and you should only use it in your priority markets, but it can really set you apart from your competitors.

For this, you will need to work with a local linguist to identify pain points, use cases, or needs exclusive to your target market.

For example, if France suddenly made it mandatory to run a diversity and inclusion study for companies with over 250 employees, I’d want to know this and create some content on DEI surveys at SurveyMonkey.

9. Integrate With Other Content Workflows

In step six, we evaluated our top-performing content, chose the best articles to translate, and got it all done. But wait. Some of these source articles have been updated since. And there is even more content now!

To run a successful international SEO campaign you must integrate with all the other teams publishing content within your organization.

Usually, the teams creating content in an organization are SEO, content, PR, product marketing, demand generation, customer marketing, customer service, customer education, or solutions engineering.

That’s a lot, and you won’t be able to integrate with everyone all at once. Prioritize the teams that create the most revenue-generating content, such as SEO, content, or product marketing.

Working with these teams, you will have to establish a process for what happens when they create a new piece, update some content, or remove an existing piece.

These processes can differ for everyone, but I can tell you what I do with my team and hope it inspires you.

  • When a piece of content that’s already been localized into international markets is updated, we get the content in a queue to be re-localized the next quarter.
  • When they create a new piece of content, we evaluate its performance, and if it’s performing above average, we add it to a localization queue for the next quarter.
  • When they change the URL of a piece of content or delete it, all international sites must follow suit at the same time, since due to some technical limitations, not making the change globally would create some hreflang issues.

Wrapping Up

International SEO is vast and complex, and no article can cover it all, but many interesting resources have been created by SEO pros across the community for those who want to learn more.

Navigating the complexities of international SEO is no small feat. It’s an intricate dance of aligning commercial strategies with technical precision, cultural insights, and data-driven decisions.

From understanding your company’s core motives for global expansion to meticulously implementing hreflang tags and localizing content, every step plays a crucial role in building a successful international presence.

Featured Image: BritCats Studio/Shutterstock

Early Look at Google’s AI Overviews

AI Overviews, Google’s generative AI search feature, is now live for all users. Google tested it for months, calling it Search Generative Experience.

AI Overviews summarizes search results for some queries, but Google has not disclosed the percentage. The feature has two versions:

  • An expandable AI answer on top of search results, pushing organic search listings further down the page.
  • A “Generate” button that creates an Overview when clicked.

The latter is less intrusive, but I’ve seen no statistics on which is more frequent.

AI Overviews often contain links referencing the sources. Google claims those links are more “clickable” — i.e., prominent — than conventional organic listings. We cannot verify this information because Google has provided no AI click data in Search Console or elsewhere.

Yet I doubt Google’s claim because frequently the links are not visible without expanding an AI answer. For example, searching “how to choose a career” produces an AI answer but no immediately visible source link.

Searching “how to choose a career” produces an AI answer but no immediately visible source link. Click image to enlarge.

Content providers can block Google from showing their info in Overviews using the nosnippet or max-snippet robots meta tags, or the data-nosnippet HTML attribute. But any of those could also impact organic search listings. I suggest waiting a bit before deciding, although it’s worth experimenting if you see a drop in overall organic clicks for an important query.
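For reference, those controls look roughly like this in markup (a sketch – verify current behavior against Google's documentation before relying on it):

```html
<!-- Block all snippets (and Overview excerpts) for the whole page -->
<meta name="robots" content="nosnippet">

<!-- Or cap snippets at 50 characters -->
<meta name="robots" content="max-snippet:50">

<!-- Or exclude only a specific section of the page -->
<p>Visible summary. <span data-nosnippet>Excluded from snippets.</span></p>
```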

Keep an eye on your key pages for traffic losses via Search Console and Google Analytics 4. On Search Console, identify the queries that bring fewer clicks and then search on them for potential AI Overviews.

Despite widespread angst, the traffic impact of AI Overviews is impossible to evaluate at this early stage. Last month I addressed a third-party study of SGE’s impact, as it was then partially public.

Organic search traffic has been declining for years owing to all the new sections in search result pages. AI Overviews will likely continue this trend.

Still, monitoring traffic losses is important.

It’s possible to optimize a page to appear in Overviews. Last year, in “SEO for SGE,” I listed a few basics:

  • Create relevant content addressing all kinds of problems of your target audience.
  • Optimize product pages and categories based on users’ needs.
  • Structure the site to surface popular topics.
  • Obtain external links to key pages. Links drive discovery and the Knowledge Graph, among other things. Co-citation links are especially important: they place your site next to known entities, and through those associations your site gradually becomes one.
  • Use Google’s submission tools. Ranking organically is the only way to appear in Gemini, hence AI Overviews.

Indexed and Ranked

My main takeaway is this. AI Overviews rely on current rankings for each query. In that respect, SEO isn’t changing. It is still about getting pages indexed and ranked for relevant queries.

Google Revs Ecommerce SERPs

Google is revving up its product search results, making it easier for consumers to price shop without leaving the search engine results pages.

Search for an unbranded product such as “buy blue womens sun dress” and scroll past sponsored listings and local results. Below that, on the primary SERP, Google added a grid of tile-like product boxes triggered by purchase-intent queries. Each tile can include a product name, images, price, store name, average star ratings, and review count.

Screenshot of product-grid boxes

Product boxes appear on primary SERPs and can include product names, images, prices, store names, average star ratings, and review counts. Click image to enlarge.

The tiles function differently from conventional organic results. Instead of sending shoppers to a product detail page on an ecommerce site, the tiles link to shopping knowledge panels that load in the SERP. The panels are similar to product detail pages but with one big difference: Google tacks on a merchant list with pricing.

“This is particularly useful for users because they can compare prices much more easily,” says ecommerce SEO consultant Aleyda Solis. But for online stores, it’s yet another hurdle to get the click.

How Google ranks product tiles remains unclear. But they are populated by structured data — Schema.org markup or similar. SEO consultants and ecommerce store owners have wrestled for years over which structured data types are worth publishing, since Google wasn’t paying attention to all of them.

But last February, Google expanded support for product structured data, announcing new shipping and returns classes and product variants such as sizes, colors, and materials. This will likely bury skirmishes about the value of structured data since visibility in product grids and shopping knowledge panels depends on it.

Shopping Knowledge Panels

In shopping knowledge panels, the store name on the product tile gets the top ranking on the merchant list. But size, color, and other sort-by options let shoppers reshuffle the merchant list by those variants.

Screenshot of a shopping knowledge panel

Shopping knowledge panels load directly in SERPs and contain sort-by options that reorder the list of merchants.

The sort-by feature will likely incent store owners to get their Schema act together or risk disappearing from the merchant list. Shoppers using the feature could unwittingly filter out merchants that ignore product variants.

“If you have technical constraints or don’t have a developer, there are tools that facilitate the implementation of product Schema markup. Wordlift is one. Schema App is another,” says Solis. You can also use ChatGPT to generate product Schema.
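
For illustration, here is a minimal JSON-LD sketch of the kind of product markup these tiles draw on. The product, URL, and values are hypothetical; Google's product structured data documentation lists the full required and recommended properties:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Women's Sun Dress",
  "image": "https://www.example.com/images/blue-sun-dress.jpg",
  "offers": {
    "@type": "Offer",
    "price": "59.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
```

Note that the fields above map directly onto what the tiles display: name, image, price, average star rating, and review count.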

For ecommerce merchants, the shopping knowledge panel lessens the importance of unique landing pages. Many searchers will likely go straight from the product grid to the shopping knowledge panel to a merchant’s product detail page.

The development could be a win for Amazon, which will appear in more product knowledge panels due to the breadth and depth of its catalog. Moreover, Amazon could use predatory pricing to undercut smaller ecommerce stores in merchant lists.

Last September, Google’s domain name registrar business was acquired by Squarespace. “Maybe Google thinks we won’t need domains anymore,” speculates Ross Kernez, a digital strategist. “If everything gets converted to SGE [Search Generative Experience] and only ecommerce survives, the top of the funnel will be gone. Transactional queries will still be here, but that means people could need fewer domains,” says Kernez.

Mike King, CEO of marketing agency iPullRank, disagrees. “We’ve heard of the death of websites when mobile apps appeared. People were like, we’re not going to need websites anymore. Everything’s going to be an app. Well, that didn’t happen,” says King.

Diminished Value?

Either way, conventional organic listings are getting pushed further below the fold. With AI results, paid shopping, pay-per-click ads, map packs, forums, image carousels, and now product grids, it is possible to secure top traditional organic rankings and receive less traffic.

With the rise of ChatGPT, the growth of product review search on TikTok and Instagram, and the recent completion of its March core update, Google appears to be reinventing web search and, perhaps, diminishing the value of organic search as a marketing channel.

The result could force marketers to prioritize other traffic sources such as social networks, email marketing, and generative AI optimization.

Google’s enormous audience cannot be ignored. But with so much volatility in the SERPs, diversifying ecommerce traffic sources is becoming increasingly important. I see no evidence of ecommerce merchants shifting resources from organic search to TikTok, ChatGPT, Reddit, and Facebook. But it does appear that relying on organic traffic is getting riskier.

Using Python To Explain Homepage Redirection To C-Suite (Or Any SEO Best Practise) via @sejournal, @artios_io

If you’re an SEO professional, you might be working on a site that redirects the home page to a subdirectory (for example, to a country-specific version of the site), or one that hosts placeholder content on the root linking to the main home page in a subdirectory.

In both cases, you could be struggling to convince your client or colleagues to follow best practices. If so, then this article is for you.

I will show you a way to communicate to C-suite that makes sense to them technically and commercially by providing data-driven examples.

To do this, I will show you how to use Python to calculate the TIPR (true internal PageRank) of all site pages to provide a comparative before-and-after scenario to justify your requested changes.

We will cover:

First of all, let’s discuss why the home page should be merged with the root.

Hosting Placeholder Pages On The Root And Root Redirects

Some sites host a placeholder page on the root URL or, worse, redirect the root to a subdirectory or page. 

Many global brands, especially those in the fashion sector, will operate multi-regional sites where each regional store resides in their regional folder.

For example, if I went to Gucci, I’d expect to find the following stores:

…and so on.

In Gucci’s case, not only is there a regional folder, but there’s also a language folder, which is all very logical.

Because I reside in London, the root folder https://www.gucci.com/ redirects me to the UK store.

A site search for Gucci.com (site:gucci.com) shows that the root folder is indexed and offers a regional store selection menu.

For many sites, the root folder will permanently redirect to their default or most popular regional store.

Why The Home Page Should Be Merged With The Root

Search engines use authority (a measure of a page’s probability of being discovered via hyperlinks) to determine a page’s relative importance on the web. The more authority a page has, the higher its ranking potential in the search results (SERPs).

Given most sites accrue the most links to their root URL, this is where the search engine rank power resides.

This isn’t ideal for the site architecture, as it means that all the product listing pages (PLPs) and product description pages (PDPs) are an extra hop away from the home page.

This extra hop sounds small; however, it’s not inconsequential, as we’ll illustrate now and quantify later.

Let’s visualize the link graph of sites not serving their home page from the root folder.

Below is a real site where its root URL has a page-level authority score (according to Ahrefs) of 40 PR, redirecting to its main English language store /en (21 PR) before linking to all of the PLPs and PDPs.

Root URL has a page-level authority score (according to Ahrefs) of 40. Image from author, April 2024

Naturally, all of the pages (blue) link via the logo to their regional store home page (for their users) and to other regional home pages (shown in pink) instead of linking to the root URL, which artificially inflates the value of the regional home page.

Note the site pages in site level 2 (which are directly linked from the home page) have a page-level rating of 19 PR, and the other pages in site level 3 have 18 PR.

What also happens is that the pages are one step removed from the root URL and thus don’t receive all of the authority.

Think of the deterioration of musical sound quality when making copies of a copy instead of a copy of the original music.

That’s the experience your site is offering to search engines when they’re trying to evaluate the relative importance of your site content!

Having the stores link directly to the root URL instead would also be undesirable, as it would create a load of redirects sitewide, further wasting the distribution of authority.

The best practice approach would be to cut out the middle man by merging the root with the home page so that all site pages are one less hop removed, as shown below:

Merging the root with the home page. Image from author, April 2024

Following the merge of the home page and the root URL, the home page PR is now 72, which is much closer to the site’s domain authority of 75 DR, and each of the pages got an additional 1 PR, increasing their potential to rank.

The Struggles Of Communicating The Benefits To Non-SEO Expert Leadership

To a non-SEO expert audience, such as your marketing and IT colleagues, this all sounds rather academic and abstract – and probably quite unbelievable.

Even if you used the diagrams above, they’re naturally more interested in the traffic impact, if not the revenue implications.

They probably have no idea of Google’s PageRank metric for measuring page authority and don’t care unless you provide the numbers.

Using Python To Estimate PageRank Uplift

Fortunately, with the power of data science, we can make those complex calculations in Python to estimate the new PR values following the best practice move to the root URL.

Take the PageRank formula:

PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

As explained in The Anatomy of a Large-Scale Hypertextual Web Search Engine by the founders of Google:

“We assume page A has pages T1…Tn which point to it (i.e., are citations). The parameter d is a damping factor which can be set between 0 and 1. We usually set d to 0.85. … Also C(A) is defined as the number of links going out of page A.

Note that the PageRanks form a probability distribution over web pages, so the sum of all web pages’ PageRanks will be one.”

The main gist of the formula is that the amount of PageRank a URL (A) has is mainly determined by the PageRank (PR(Ti)) of the pages (Ti) linking to it and the number of outgoing links on those pages (C(Ti)).

The Python version of the PageRank formula may be found here.
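
To make the calculation concrete, here is a minimal, self-contained sketch in Python. The four-page site and its links are hypothetical, and this is the non-normalized variant from the paper (at convergence, the scores sum to the number of pages):

```python
def pagerank(links, d=0.85, iterations=50):
    """Iteratively apply PR(A) = (1-d) + d * sum(PR(Ti)/C(Ti))."""
    pages = list(links)
    pr = {page: 1.0 for page in pages}  # initial guess
    for _ in range(iterations):
        pr = {
            page: (1 - d) + d * sum(
                pr[t] / len(links[t])   # PR(Ti) / C(Ti)
                for t in pages if page in links[t]
            )
            for page in pages
        }
    return pr

# Hypothetical four-page site: a home page, two listing pages (PLPs),
# and one product page (PDP), each linking back to home.
links = {
    "home": ["plp1", "plp2"],
    "plp1": ["home", "pdp"],
    "plp2": ["home", "pdp"],
    "pdp":  ["home"],
}
scores = pagerank(links)
```

In this toy graph, "home" accumulates the most PageRank because every other page links back to it.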

As a thought experiment armed with the knowledge of the above formula, we’d expect:

  • The new home page to benefit from all pages linking to the root URL, with each linking page’s PageRank (PR(Ti)) shared among its outbound internal links (C(Ti)).
  • All of the site pages to benefit from their more powerful parent URL (the new merged home page in the root URL).

With the above in mind, all we need to do now is recalculate the sitewide impact of merging the /en folder with the root URL on the whole site, which is done in several phases:

  • Calculate TIPR of all site pages: As explained earlier in what data science can do for site architectures, while site auditing software gives the relative PageRank internally, this needs to be combined with the external PageRank from the internet using link intelligence tools like Ahrefs.
  • Calculate the new TIPR of the new home page: i.e. /en merged or migrated with the root URL.
  • Calculate the new TIPR of all subsequent and remaining pages on the website.
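
Site auditors expose only relative internal PageRank, so the external authority has to be blended in. As a rough sketch of that first phase (an assumed approach for illustration, not the exact method any tool uses), you can rerun the PageRank iteration with each page's externally earned authority, e.g. an Ahrefs-style page rating, as the teleport term:

```python
def tipr(links, external_pr, d=0.85, iterations=50):
    """Sketch: internal PageRank iteration seeded by external authority.
    links: internal link graph; external_pr: backlink-earned rating per URL."""
    pages = list(links)
    pr = {p: external_pr.get(p, 0.0) for p in pages}
    for _ in range(iterations):
        pr = {
            p: (1 - d) * external_pr.get(p, 0.0)
               + d * sum(pr[t] / len(links[t]) for t in pages if p in links[t])
            for p in pages
        }
    return pr

# Hypothetical graph mirroring the earlier example: the root (40 PR from
# backlinks) passes through /en (21 PR), which links to a PLP and a PDP.
links = {"root": ["en"], "en": ["plp", "pdp"], "plp": ["en"], "pdp": ["en"]}
external = {"root": 40.0, "en": 21.0}
scores = tipr(links, external)
```

Running the same function on the merged before-and-after graphs gives the comparative TIPR values for every page.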

As shown in the diagrams above, the best practice configuration yields new TIPR values for all pages.

Once the TIPR calculation steps are followed, your next job is to translate the technical benefits of SEO into the commercial impact to secure buy-in from your colleagues.

One outcome metric to model would be the organic search traffic as a function of TIPR. With sufficient data points (say 10,000), this can be achieved using machine learning (ML).

Your input would be the dataset prior to the TIPR recalculation, where you’d feed in the TIPR column and the search clicks (presumably joined from Google Search Console).

The chart below visualizes the relationship between TIPR and clicks.

Relationship between TIPR and clicks: blue line model. Image from author, April 2024

The blue line is an approximate model indicating how many more clicks a page would receive with an increase in unit PageRank.

Mathematicians are fond of saying, “All models are wrong but some are useful.” However, the science can be quite persuasive, lending credibility to uplifts forecast from your ML model with Python’s predict() function. You can find an example here.

In the above case, we see that up to 20 TIPR, there’s a 0.35 visits per month traffic uplift per page, and beyond 20 TIPR, it’s 0.75 visits.
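
For illustration, the two slopes above can be turned into a hypothetical piecewise uplift model (the 0.35 and 0.75 figures are the ones read off the chart; your own fitted model will differ):

```python
def forecast_uplift(tipr_before, tipr_after):
    """Estimated extra monthly clicks per page for a TIPR change, using
    illustrative slopes: 0.35 visits/PR unit up to 20 TIPR, 0.75 beyond."""
    def clicks(tipr):
        if tipr <= 20:
            return 0.35 * tipr
        return 0.35 * 20 + 0.75 * (tipr - 20)
    return clicks(tipr_after) - clicks(tipr_before)
```

Summing forecast_uplift across every indexable URL produces the sitewide traffic figure to present to stakeholders.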

Using A Data-Driven Approach Is More Persuasive To C-Suite

This might not sound like much. However, aggregated across hundreds of thousands of indexable URLs, we forecasted an additional 200,000 monthly visits for one client.

This forecast gave them the confidence and willingness to finally follow through on the repeated recommendation of setting the home page to root, which the company received from numerous SEO consultants.

The difference is the quantification, both technically and commercially.

By combining TIPR and applying the PageRank formula to simulate a before-and-after scenario for your technical SEO recommendation – in this case, setting the root URL as the home page – your SEO is data-driven and, more importantly, much more persuasive.

Not only technically but also commercially, which will help you implement more of your SEO recommendations and, hopefully, promote your career.

That aside, taking a data-driven approach can also help you sense-check your best practice recommendations based on ideas you’ve read online.

It’s as true today as it was 20 years ago: The best SEO pros constantly test ideas rather than unquestioningly follow best practice dogma.

More resources:

Featured Image: BestForBest/Shutterstock

Extending Your Schema Markup From Rich Results To A Knowledge Graph via @sejournal, @marthavanberkel

You probably know Schema Markup for its ability to help pages achieve a rich result. But did you also know that you can use Schema Markup to build a reusable content knowledge graph for your organization’s web content?

When SEO professionals do Schema Markup with the sole intention of achieving a rich result, they miss the opportunity to create semantic Schema Markup and build their content knowledge graph.

Content knowledge graphs enable search engines to contextualize the content on your site and make inferences more easily.

Let me illustrate the power of Schema Markup for inferencing by introducing myself. My name is Martha van Berkel, and this is my knowledge graph. It explains who I am and how I relate to other things.

  • I studied at MIT and have a degree in Mathematics and Engineering.
  • I am Canadian, the co-founder and CEO of Schema App, and I know a lot about Schema Markup.
  • I worked at Cisco for 14 years.
  • I also used to own a 1965 Austin Healey Sprite, and my car was in the movie “Losing Chase,” which Kevin Bacon directed. In fact, Kevin Bacon used to drive my car.
Martha van Berkel's knowledge graph. Image created by author, March 2024

What can you infer from my knowledge graph? Are you thinking about how you can win the “6 degrees of separation from Kevin Bacon” game? Or are you thinking about how it makes sense that I know about Schema Markup and am writing about it because I am the CEO of Schema App?

You can make these inferences because of the understanding you’ve developed from reading the specific relationships within my knowledge graph. I used the properties for a Person Schema.org type to describe and define my relationship with these things, the same language we use to optimize our web pages.

If I didn’t specifically define the relationship between me and these other things, you might think that I work at Cisco, am a user of Schema App, and am related to Kevin Bacon. This is why being specific and adding context is important!

So, just like I used Schema Markup to bring context and clarity to who I am, you can use Schema Markup to add more context to your website content so that search engines and AI can make accurate, powerful inferences about it.

In this article, we will discuss why it is important to start thinking about Schema Markup for its semantic value and how you can use your Schema Markup to build a reusable content knowledge graph.

Why Is It Important To Start Thinking About Schema Markup For Its Semantic Value?

The search landscape is evolving quickly. Search engines are racing to provide a new search experience that leverages inferencing and chat experiences.

We see this in Google’s Gemini and Bing’s ChatGPT and from new entrants, such as Perplexity AI. In the chat experience, search engines need to be able to provide users with quick, accurate answers and deal with evolving contexts.

Consumers are now also using more hyper-longtail queries in their searches. Instead of searching for [female doctor Nashville women’s health], they are searching for [find me a female doctor who can help me with my cramps and has an appointment available within the next 2 days].

Search engines and large language models (LLMs) cannot easily infer the answer to this query across a company’s website data without understanding how the information is connected. This contextual inference is why search engines have moved from lexical to semantic search.

So, how do you make it easy for these machines to understand and infer things from your content? By translating your web content into a standardized vocabulary understood by humans and search engines – Schema Markup.

When you implement Schema Markup, you can identify and describe the things, also known as entities, on your site and use the schema.org properties to explain how they are related.

Entities are unique, well-defined, and distinguishable things or concepts. An entity can be a person, a place, or even a concept, and it has attributes and characteristics.

Your website content discusses entities related to your organization (e.g., brand, products, services, people, locations, etc.), and you can use Schema Markup to describe your entities and connect them with other entities on your site.

Entities are the foundational building blocks of a content knowledge graph.

The value of a content knowledge graph far exceeds SEO. Gartner’s 2024 Emerging Tech Impact Radar report identified knowledge graphs as a key software enabler and important investment to enable generative AI adoption.

Many AI projects are powered by large language models prone to hallucination and errors. Research shows that when paired with an LLM, a knowledge graph can supply factual knowledge, resulting in more accurate answers from the LLM.

By creating a content knowledge graph through Schema Markup, SEO pros can enable search engine understanding and help prepare their organization to be leaders in innovations with AI.

Read More: Entities & Ontologies: The Future Of SEO?

Implementing Schema Markup To Build A Content Knowledge Graph Vs. Just Rich Results

You might wonder: How is this different from implementing Schema Markup to achieve a rich result?

When an SEO pro’s goal is to achieve a rich result, they tend to only add Schema Markup to the pages and content that are eligible for the rich results. As such, they are only telling search engines small parts of the organization’s story.

They are not providing search engines with detailed information or context about the entities on their site and how they are connected to one another.

This leaves search engines to guess the intentions and meaning of their content – just like you might think I’m related to Kevin Bacon and work at Cisco if I didn’t establish my relationship with these things in my introduction.

When an SEO pro’s goal is to build a content knowledge graph, they use Schema Markup to identify, describe, and explain the relationship between the entities on their site so that search engines can truly understand and contextualize the information on their organization’s website.

So, how do you start creating your Schema Markup with the intention of building a knowledge graph?

How To Implement Schema Markup To Build Your Content Knowledge Graph

1. Identify The Pages On Your Website That Describe Your Key Entities

Your website can contain thousands of entities (like specific products, individuals, services, locations, and more).

However, certain entities are important to your business goals and outcomes. This content is often what you need people and search engines to know about so that you can convert them into customers or inform them about your brand.

Common key entities often include your organization, services, products, people, and brand, but this ultimately depends on your business objectives.

For example, if you are a healthcare provider looking to establish a trusted reputation and drive appointment bookings through your website, your key entities could include your organization, medical facilities, physicians, and services offered.

Once you’ve identified which entities are important to your organization, you can find the page on your site that best represents them. Ideally, each page would define one entity and how it relates to other entities on the site.

2. Use The Schema.org Vocabulary To Describe The Entities

When you implement Schema Markup on a page, you are using the Schema.org vocabulary to make a series of statements that describe the entity. The Schema.org type categorizes the entity, while the Schema.org property describes the entity.

For example, a physician detail page might include information about the physician’s name, medical specialty, who they work for, the hospital or medical clinic they work at, the medical services they provide, and the geographical area they serve.

You can use Schema Markup to describe these aspects of the entity and express it as a graph with specific connections.

Schema example. Image from author, March 2024

This helps search engines understand details about the physician to provide answers to a detailed query like [find me a cardiologist near me who can perform an EKG and has an appointment available in the next 2 days].
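
A minimal JSON-LD sketch of such a physician detail page might look like this (the URLs, names, and values are hypothetical; the properties are standard Schema.org Physician properties):

```json
{
  "@context": "https://schema.org",
  "@type": "Physician",
  "@id": "https://www.example.com/physicians/jane-doe#physician",
  "name": "Dr. Jane Doe",
  "medicalSpecialty": "https://schema.org/Cardiovascular",
  "hospitalAffiliation": { "@id": "https://www.example.com/locations/main-hospital#hospital" },
  "availableService": { "@id": "https://www.example.com/services/ekg#service" },
  "areaServed": { "@type": "City", "name": "Nashville" }
}
```

The @id references point at entities defined in the markup of other pages on the same site, which is what turns isolated page markup into a graph.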

Every page on your website describes something about your business.

Implementing Schema Markup on each page clearly tells search engines what the page is about and how its concepts relate to other concepts on your website. Now, search engines and large language models can use this data to make inferences and confidently answer specific queries.

3. Connect The Entities On Your Website

Even though each web page is home to a unique entity, the content on your webpage might mention other entities that you’ve defined on other pages of your site.

If you want to build a content knowledge graph, you have to showcase how the entities on your website are connected and provide context using the right schema.org property.

This goes beyond a hyperlink connecting both pages using anchor text. With Schema Markup, you use the Schema.org properties that best describe the relationship to connect the entities.

For example, if the Physician works for the organization HealthNetwork, we can use the memberOf property to state that the Physician is a memberOf the Organization HealthNetwork.
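
Expressed in JSON-LD, that statement might look like this (the physician’s name is hypothetical; HealthNetwork is the organization from the example):

```json
{
  "@context": "https://schema.org",
  "@type": "Physician",
  "name": "Dr. Jane Doe",
  "memberOf": {
    "@type": "MedicalOrganization",
    "name": "HealthNetwork"
  }
}
```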

Example of Physician-Organization relationship using Schema Markup. Image from author, March 2024

When you look at the content on the page, if URLs link to another step in the user journey, those destinations are entities that should also be connected within the Schema Markup. For physicians, this may be the service line pages, the hospitals where they practice, etc.

This provides search engines with more contextual information about the physician, which enables them to answer more complex queries.

Using these basics, you have started building your content knowledge graph. This should be done in addition to trying to achieve rich results. However, the properties you use to connect your entities are likely different from Google’s required properties for the rich result.

4. Link Your Entities To Other External Authoritative Knowledge Bases To Disambiguate Them

In addition to connecting the entities on your site, you can further define the entities mentioned on your pages by linking them to known entities on external authoritative knowledge bases like Wikipedia, Wikidata, and Google’s Knowledge Graph.

This is known as entity linking.

Entity linking can help you define the entities mentioned in your text more explicitly so that search engines can disambiguate the entity identified on your site with greater confidence and showcase your page for more relevant queries.

At Schema App, we’ve tested how entity linking can impact SEO. We found that disambiguating entities like places resulted in pages performing better on [near me] and other location-based search queries.

Our experiments also showed that entity linking can help pages show up for more relevant non-branded search queries, increasing click-through rates to the pages.

Here’s an example of entity linking. If your page talks about “Paris”, it can be confusing to search engines because there are several cities in the world named Paris.

If you are talking about the city of Paris in Ontario, Canada, you can use the sameAs property to link the Paris entity on your site to the known Paris, Ontario entity on Wikipedia, Wikidata, and Google’s Knowledge Graph.
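
In JSON-LD, that sameAs statement might look like this (illustrative markup; the matching Wikidata and Google Knowledge Graph identifiers would be added alongside the Wikipedia URL but are omitted here):

```json
{
  "@context": "https://schema.org",
  "@type": "Place",
  "name": "Paris",
  "sameAs": [
    "https://en.wikipedia.org/wiki/Paris,_Ontario"
  ]
}
```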

Example of entity linking. Image from author, March 2024

Content Knowledge Graph Brings Context To Your Content

If your organization is using Schema Markup on select pages for the purpose of achieving a rich result, it is time to rethink your Schema Markup strategy.

Rich results can come and go.

However, the content knowledge graph you create using your Schema Markup can help search engines better understand and infer things about your organization through your content and prepare your organization to innovate with AI.

Like it or not, knowledge graphs are here to stay, and you can start building yours by implementing proper semantic Schema Markup on your site.

More resources:

Featured Image: Gracia Chua/Schema App

Google Clarifies Vacation Rental Structured Data via @sejournal, @martinibuster

Google’s structured data documentation for vacation rentals was recently updated to require more specific data in a change that is more of a clarification than it is a change in requirements. This change was made without any formal announcement or notation in the developer pages changelog.

Vacation Rentals Structured Data

These structured data types make vacation rental information eligible for rich results that are specific to these kinds of rentals. However, the feature isn’t available to all websites: vacation rental owners are required to be connected to a Google Technical Account Manager and have access to the Google Hotel Center platform.

VacationRental Structured Data Type Definitions

The primary changes were made to the structured data property type definitions where Google defines what the required and recommended property types are.

The changes to the documentation are in the section governing the recommended properties and represent a clarification of the recommendations rather than a change in what Google requires.

Address Schema.org property

This is a subtle change but it’s important because it now represents a recommendation that requires more precise data.

This is what was recommended before:

“streetAddress”: “1600 Amphitheatre Pkwy.”

This is what it now recommends:

“streetAddress”: “1600 Amphitheatre Pkwy, Unit 6E”

Address Property Change Description

The most substantial change is to the description of the “address” property, which is now more descriptive and precise about what is recommended.

The description before the change:

Information about the street address of the listing. Include all properties that apply to your country.

The description after the change:

The full, physical location of the vacation rental.
Provide the street address, city, state or region, and postal code for the vacation rental. If applicable, provide the unit or apartment number.
Note that P.O. boxes or other mailing-only addresses are not considered full, physical addresses.

This is repeated in the section for the address.streetAddress property.

This is what it recommended before:

address.streetAddress Text
The full street address of your vacation listing.

And this is what it recommends now:

address.streetAddress Text
The full street address of your vacation listing, including the unit or apartment number if applicable.
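
Put together, a recommended address block might look like this (illustrative markup using the documentation’s example street address; the rental name is hypothetical):

```json
{
  "@context": "https://schema.org",
  "@type": "VacationRental",
  "name": "Example Garden Cottage",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1600 Amphitheatre Pkwy, Unit 6E",
    "addressLocality": "Mountain View",
    "addressRegion": "CA",
    "postalCode": "94043",
    "addressCountry": "US"
  }
}
```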

Clarification And Not A Change

Although these updates don’t represent a change in Google’s guidance, they are nonetheless important because they offer clearer, less ambiguous direction about what is recommended.

Read the updated structured data guidance:

Vacation rental (VacationRental) structured data

Featured Image by Shutterstock/New Africa

SEO Takeaways from SGE’s Partial Rollout

Google extended Search Generative Experience last month beyond Labs, its testing program. A limited number of searchers now see AI snapshots in results regardless of whether they signed up.

Many observers believe it’s the first step to SGE becoming fully public this year. Google hasn’t much changed SGE in Labs for months, perhaps signaling its satisfaction thus far.

I’ve closely monitored SGE developments. Here are my observations and expectations.

Traffic Losses Overestimated

Authoritas, a search-engine-optimization platform, has been testing SGE in Labs and publishing the results. In March, the tests found that SGE appears in 91% of U.S. search results for brand and product terms, in one of two ways.

First, the results could contain a “Generate” button that produces an AI-powered answer only when clicked, as with the query “best laptops.”

Clicking the “Generate” button produces AI-powered answers.

Second, the search results could contain an instant answer, such as the example below for “How healthy is spaghetti squash?” Clicking “Show more” expands the explanation.

Instant SGE answers such as this example appear automatically. Clicking “Show more” expands the explanation.

The instant answer takes much more SERP space and will likely steal more clicks from organic listings because it pushes them further down the page.

Fortunately, according to the same Authoritas study, 81.4% of SGE responses require clicking the “Generate” button, far more than appear as instant answers. This indicates that organic listings won’t be hugely impacted, at least for now, since there’s no page disruption unless the button is clicked.

Organic Listings Displaced

However, when SGE is triggered, organic results appear far below the initial screen.

For example, searching for “smart TV” and clicking “Generate” produces an AI answer that occupies an entire screen on mobile and desktop.

Authoritas estimated an average organic listings drop of 1,243 pixels, depending on the search term. SGE results more or less eliminate the visibility of organic listings, especially for queries seeking consumables such as household goods.

Even before SGE, organic visibility was increasingly limited owing to the various features Google inserts at or near the top, such as ads, “People also ask” boxes, image packs, local packs, and more.

Opportunities in SGE

The good news is that SGE answers contain links, an opportunity for organic visibility. Authoritas states that SGE’s snapshots, on average, contain five unique links, and just one matches the top 10 organic listings below it. Perhaps it’s because the snapshots often address related queries.

For example, SGE snapshots for the search of “best laptops” list many makes and models as well as links for the best laptops for students, budgets, and coding. Organic listings for “best laptops” do not include those additional choices.

SGE snapshots for “best laptops” list many makes and models as well as related links, such as “The Best Laptops for Sims 4” (for students).

Thus, after optimizing important keywords, consider creating supporting content for related queries to increase the chances of showing up in SGE results. For ideas, keep an eye on “Related searches,” because those keywords seem to appear in SGE.

“Related searches” keywords seem to appear in SGE.

How do you improve your mobile site?

Your site should be mobile-friendly, because nowadays, most people search Google on their phones. But what does it mean to have a mobile-friendly site? And where do you start? In this SEO basics article, you’ll find an overview of what you can do to improve your mobile site.


When is a site mobile-friendly?

A site is mobile-friendly when it:

  • helps users get their tasks done quickly and joyfully.
  • loads correctly on a mobile device like a smartphone or tablet.
  • loads lightning-fast.
  • presents content in a readable fashion, without users having to pinch and zoom.
  • offers enough space to navigate by touch.
  • offers added value for mobile users.
  • is instantly understandable for search engines.

Why is mobile SEO important?

Mobile SEO makes sure your mobile site offers the best possible presentation of your content to a mobile device user. Since our world is increasingly mobile-oriented, it’s important that your site is mobile-friendly. If your site isn’t (properly) available for mobile users, you’ll miss out on a decent ranking in the search engines, and thus miss out on income. Therefore, you should do everything in your power to optimize the mobile view of your site and make it as good as possible. In fact, it should be excellent!

Important to Google

Since 2016, Google has used the mobile version of a site to determine the site’s rankings. So if your mobile site isn’t up to scratch, or shows less content than your desktop site, you’ll find it difficult to get good rankings. That’s why it’s so important to create a fully functioning and responsive design for your (mobile) site.

Luckily, Google has a great getting started guide to help you improve your mobile site. Plus, they’ve also set up a Page Experience initiative that gives you metrics — the so-called Core Web Vitals — on how humans and machines perceive your site’s performance. So, use these metrics to help you figure out what to focus on while improving your mobile site.

Treat it as one website

Don’t forget to treat your site as a single entity. You shouldn’t have a ‘mobile site’ that’s distinct from your ‘desktop site.’ You should have one site that adapts to whatever screen it’s being viewed on. That also means that the content of the different views should be the same.

How to improve your mobile website

To improve your mobile SEO, you need to focus on a couple of things:

  • Make a joyful user experience.
  • Make sure your site is responsive.
  • Improve your site speed.
  • Use structured data.
  • Don’t block JavaScript, HTML and CSS code.
  • Don’t use too many redirects.
  • Choose the correct viewport.
  • Don’t use interstitials or pop-ups.
  • Verify mobile-friendliness.
  • Tell Google about your site.

Let’s go over these topics in more detail.

Focus on making your site easy and joyful to use with mobile SEO

Offer a great user experience to your users, and you’ll notice that Google will enjoy it too. So, how do you do that? First, figure out what you want users to do on your site. Then, make sure that it’s easy for people to do. Do you want people to call you? Make sure you put your phone number front and center, so it’s easy to find. Want to enhance conversions? Make that buy button stand out and function properly! In other words: bring focus to your site, and helpfully guide your visitors through the steps you want them to take.

But don’t just focus on your intent. Look at your users too! Figure out why they visit and which tasks they mostly do on your site. Then make sure it’s easy for them, because if something frustrates your users, it hurts you and your results. That’s why you should test, improve, and fully optimize your mobile site.

Responsive design

There are multiple ways to improve your site so it’s available for mobile users. The most important one is to create a responsive design. This is also the technology that Google advocates. With a responsive design, your site lives on one URL, which makes it easier for Google to understand and index it.

If you use WordPress, chances are your theme is already responsive and can adapt to all screens. Still, it’s good to check how your site scales in Google Chrome’s Developer Tools. Because if it doesn’t scale correctly, you should talk to your web developer about fixing it – or choose a different theme.

Improve your site speed

One of the most important things you can do to improve your site’s mobile SEO is to improve the site’s loading speed. Time after time, studies have shown that people leave sites that load slowly, and probably never return. That’s why site speed has been a ranking factor for years, and why Google is increasingly focusing on fixing this common issue. See the Page Experience update and the Core Web Vitals metrics’ introduction for more proof.

If you need more tips, we have a post on how to improve your site speed and which tools might help you.

Get better web hosting for your site

The number one tip to optimize the speed of your mobile site is to invest in better web hosting. Many sites run on budget hosts that share server space among many other websites, which can cause those sites to slow down. That’s why it really is essential to stay away from cheap hosting and get a good plan at a renowned host — it truly pays for itself!

Don’t know where to start? We have a page with WordPress web hosting companies that we vouch for, as we vetted them personally.

Optimize images

If there is one quick win to improve your site speed, it’s this: optimize your images. Don’t load those 3000 x 2000 pixel HD images on your site. Scale them to the correct size, then make them smaller with tools like ImageOptim, Squoosh, or WordPress plugins like WP Smush. You can also look into serving those images in next-gen image formats like WebP.

Minify code

Every request your site has to make has an impact on your site speed. That’s why you have to work on reducing these requests to improve your mobile site. One way to do this is by minifying code.

Minifying code means stripping unnecessary characters (whitespace, comments, overly long names) from assets like JavaScript and CSS; combining those assets into fewer files also reduces the number of requests the browser has to make. As a result, the browser has to load and parse less, which leads to a faster site. This sounds hard to implement, but a plugin like WP Rocket can take care of it along with all your caching needs. Or you can use Cloudflare’s Automatic Platform Optimization for WordPress to get a load of enhancements in one go.

Browser caching

By using browser caching, you’re telling the browser that page elements that don’t change often can be saved inside its cache. This way, the browser only has to download new and dynamic content whenever it visits again. Again, this is something that a plugin like WP Rocket can help you with. Or you can also do it yourself if you like.
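If you manage your own server, caching lifetimes can also be set directly in the server configuration. A minimal Apache sketch (this assumes mod_expires is enabled; the lifetimes are illustrative examples, not recommendations):

```apacheconf
# Tell browsers how long they may cache static assets.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/webp "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```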

Reduce redirects

A redirect leads a visitor from one requested page to another because the requested page was moved or deleted. While this makes for a good user experience if done well, the more redirects you use, the slower your site will be. Don’t chain endless redirects, and avoid keeping links around that point to deleted posts which then redirect to new ones. Always link directly to the final destination.

Use structured data to improve your mobile site

Structured data is essential for every site. With structured data, you can describe your content in a way that search engines can understand. It gives you a direct line of communication with search engines, so to speak. In return, search engines might reward you with awesome rich results.

Your mobile site needs to have the same structured data as your desktop variant — otherwise, Google might get confused. Yoast SEO automatically adds structured data for the most important parts of your site, which you can fine-tune to your liking.
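For illustration, structured data is usually embedded as a JSON-LD script in the page. A minimal sketch for an article page (all values here are placeholders, not real data):

```html
<!-- Minimal JSON-LD sketch; replace the placeholder values with your own. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "author": { "@type": "Person", "name": "Example Author" },
  "datePublished": "2024-01-01"
}
</script>
```

Plugins like Yoast SEO generate this markup for you, so hand-writing it is only needed for custom content types.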

Don’t block assets like JavaScript, HTML and CSS

We’ve said it before, and we’re going to keep saying it: Don’t block assets like JavaScript, HTML and CSS. Doing so makes it harder for Google to access your site, and that could lead to bad rankings. Check your Google Search Console to see if you’re blocking resources. If so, we advise removing those blocks if you want to truly optimize your mobile site.

Improve legibility

Make sure that your mobile site is readable on mobile devices. Use different devices to check if your typography is in order and make changes when necessary. Typography can make or break the user experience of your site.

Improve tap target sizes

People hate it when their fingers can’t reliably hit a button, link, or menu item, and they get frustrated when navigation is hard or unnatural. Make sure tap targets are large enough and well spaced to improve your mobile site.

Choose the correct viewport

The viewport determines the width of the page for the device used to view it. By specifying a correct viewport, you make sure that visitors with specific devices get the right version of your site. Fail to do this, and you might show your desktop site to a small-screen smartphone user — a big no-no.
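The standard viewport declaration, placed in the page’s `<head>`, looks like this:

```html
<!-- Match the page width to the device width and start at 100% zoom. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Most responsive themes already include this tag, so check before adding it twice.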

Don’t use interstitials or pop-ups

Google will penalize sites that use large pop-ups or interstitials to promote newsletters, sign-up forms, or ads. These often get in the way of the user quickly accessing the content they requested. Don’t use these. If you must use them, make sure you abide by Google’s rules.

Test your site and tell Google about it

Before you start working on your mobile SEO, you should run a mobile usability test on Google to see where you should start. As you work, you should keep testing to see if you’re making progress. If your mobile site is optimized, you need to tell Google so your site will be checked and indexed. Use Search Console to stay on top of the performance of your site.

Investigate other technologies

There are other ways to improve the performance of your mobile site. One of these technologies is the Accelerated Mobile Pages (AMP) framework. This is an initiative by Google and others to get web pages to load super fast on mobile devices. By wrapping your content in AMP’s restricted HTML, you enable optimizations that Google can use to improve performance. Keep in mind that AMP is not without its drawbacks, and not every project will benefit from it.

AMP is not the only technology that helps you optimize your mobile site. Other companies offer similar solutions, like Cloudflare’s various optimized delivery technologies. There are so many options these days!


Mobile is the new baseline, the new default. Do everything you can to fix your mobile site and make it perfect, not just in Google’s eyes but, more importantly, in your visitors’. Mobile SEO is not just about great content and a flawless technical presentation. It’s more about creating an excellent user experience. Once you’ve achieved that, you’re on your way to the top!

Read more: Mobile SEO: the ultimate guide »

Coming up next!

All about Core Web Vitals: INP (Interaction to Next Paint)

Google’s Core Web Vitals have emerged as critical metrics for SEO. These metrics help you optimize your websites for a superior user experience. A new player is making headlines among these vital metrics: Interaction to Next Paint (INP), which replaces First Input Delay (FID). This post will explain what INP entails, its significance, and how to improve your site’s performance for SEO.

Essence of Interaction to Next Paint (INP)

Interaction to Next Paint measures the responsiveness of a web page to user inputs, such as clicks, taps, and keypresses. It represents the time from when a user interacts with your page to when they see a response on the screen. Unlike its predecessor, First Input Delay, which only accounted for the first input, INP provides a broader view by capturing the responsiveness throughout the life of the page.

Google is dedicated to enhancing the user experience that sites offer. To measure this, it keeps introducing more nuanced and comprehensive metrics, and it has now added Interaction to Next Paint to the Core Web Vitals. INP measures a critical aspect of the user’s experience: how responsive a page is to user interaction.

By integrating INP into Core Web Vitals, Google aims to provide developers with a complete picture of their page’s performance. In addition, it encourages improvements that genuinely enhance the user experience.

Why INP matters

A seamless user experience is the cornerstone of successful SEO. Interaction to Next Paint directly influences how users perceive the efficiency and usability of a webpage. Pages that respond swiftly to user interactions are more likely to engage visitors, and better responsiveness can reduce bounce rates and, ultimately, lead to higher rankings in search results.

As the transition from FID to INP unfolds, webmasters and SEO experts must embrace this broader metric. Understanding and optimizing for INP will be crucial for maintaining and improving search rankings.

Real-world improvements for yoast.com

Despite the challenges in optimizing for INP, our team at Yoast has remarkably improved responsiveness. By focusing on efficient code execution and minimizing render-blocking resources, we have significantly enhanced our site’s performance.

Google Search Console already provides INP reports, split into mobile and desktop issues. At Yoast, we’ve used these to guide our optimizations. In addition, Screaming Frog now includes an INP pass/fail check in its crawl reports, which helps as well.

The charts below show how the work we did in December and January dramatically reduced the number of issues:

INP score on desktop
INP score on mobile

But remember, while it’s always great to have zero errors, don’t obsess about shaving off milliseconds to get there. If there are significant performance issues, then solve these as soon as you can. Always keep in mind, though: don’t spend dollars to save pennies! Focus on the general page experience; things will naturally progress from there.

Improving Interaction to Next Paint

The shift to INP necessitates a fresh approach to measuring and enhancing web performance. Tools like Google’s Lighthouse, PageSpeed Insights, and the Chrome User Experience Report offer valuable insights into INP scores and opportunities for optimization.

Practical strategies to enhance your INP score

Improving your Interaction to Next Paint (INP) score benefits your site’s user experience, and it’s an important part of staying competitive in SEO. Here are actionable tips to help you enhance your INP score:

1. Optimize event callbacks

Event callbacks are at the heart of user interactions. Reducing the time these callbacks take to process can significantly improve your INP score. Assess the complexity of your event handlers and streamline their code to ensure quick execution.
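For example, a handler for a high-frequency event (such as `input` or `resize`) can be debounced so the expensive work runs once per burst of events rather than on every keystroke. A minimal sketch:

```javascript
// Debounce: collapse a burst of events into a single callback run,
// so each individual event handler stays cheap.
function debounce(fn, delayMs) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer); // cancel the previously scheduled run, if any
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Hypothetical usage: run an expensive search only after typing pauses.
const onSearchInput = debounce((query) => {
  // ...expensive filtering or network request here...
}, 250);
```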

2. Avoid blocking the main thread

The main thread is where the browser processes user events, executes JavaScript, and renders updates to the screen. Keeping it unblocked ensures that the page can respond to user inputs promptly. Avoid heavy computations or long-running tasks on the main thread to prevent delays in responsiveness.
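One way to keep the main thread responsive is to yield back to the event loop between pieces of work. A sketch, using a `setTimeout`-based yield (where supported, the newer `scheduler.yield()` API serves the same purpose):

```javascript
// Yield control back to the event loop so the browser can paint and
// handle pending input between pieces of work.
const yieldToMain = () => new Promise((resolve) => setTimeout(resolve, 0));

// Sketch of a handler that does its work in small pieces,
// yielding after each one instead of blocking the main thread.
async function handleWork(pieces) {
  const results = [];
  for (const piece of pieces) {
    results.push(piece()); // one small unit of work
    await yieldToMain();   // let the browser respond to the user
  }
  return results;
}
```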

3. Break up long tasks

Tasks taking more than 50 milliseconds can interfere with the page’s ability to respond to user inputs effectively. Breaking these long tasks into smaller chunks allows the browser to intersperse input handling between these tasks, improving the overall responsiveness.
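The chunking idea can be sketched like this; the `yieldToMain` parameter stands in for whatever scheduling primitive you use in the browser, such as `(cb) => setTimeout(cb, 0)`:

```javascript
// Process a large array in chunks so the browser gets a chance to
// handle input and paint between chunks, instead of one long task.
function processInChunks(items, chunkSize, processItem, yieldToMain) {
  let index = 0;
  function runChunk() {
    const end = Math.min(index + chunkSize, items.length);
    while (index < end) {
      processItem(items[index]);
      index += 1;
    }
    if (index < items.length) {
      yieldToMain(runChunk); // schedule the next chunk after a yield point
    }
  }
  runChunk();
}
```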

4. Optimize JavaScript execution

JavaScript can significantly impact your page’s responsiveness. Optimizing how JavaScript is loaded and executed on your page can improve INP scores. Techniques include deferring non-critical JavaScript, using async scripts, and removing unused code.
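Deferring can also happen inside your own code. One common pattern is lazy initialization: pay the cost of an expensive setup only when a feature is first used, not during page load. A small sketch:

```javascript
// Lazy initialization: run an expensive setup function only once, the
// first time the value is needed, rather than during page load.
function lazy(init) {
  let value;
  let initialized = false;
  return () => {
    if (!initialized) {
      value = init();
      initialized = true;
    }
    return value;
  };
}

// Hypothetical usage: the chart widget is only set up on first use.
const getChart = lazy(() => ({ render: (el) => { /* draw into el */ } }));
```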

5. Minimize unnecessary tasks

Evaluate the tasks on your page and identify any that are not essential to the immediate user experience. Postponing or eliminating unnecessary tasks can free up resources, allowing the browser to prioritize user interactions.

6. Prioritize important actions

Not all tasks are created equal. By prioritizing important actions — such as those directly related to user interactions — you ensure that these tasks are executed first, leading to a smoother and more responsive experience.

7. Leverage requestIdleCallback

The requestIdleCallback API allows you to schedule background tasks to run when the browser is idle. This is particularly useful for tasks not critical to the immediate user experience. By using requestIdleCallback, you ensure these tasks do not interfere with the page’s responsiveness to user inputs.
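As a sketch, low-priority tasks can be queued up and drained during idle periods. The `setTimeout` fallback here is an assumption for environments without `requestIdleCallback` (such as Safari or Node), not part of the API:

```javascript
// Use requestIdleCallback where available; otherwise fall back to a
// setTimeout-based shim with a fixed, fake idle budget.
const idle = typeof requestIdleCallback === 'function'
  ? requestIdleCallback.bind(globalThis)
  : (cb) => setTimeout(() => cb({ timeRemaining: () => 50, didTimeout: false }), 1);

// Drain a queue of low-priority tasks while there is idle time left.
function runWhenIdle(tasks, onDone) {
  idle(function work(deadline) {
    while (tasks.length > 0 && deadline.timeRemaining() > 0) {
      tasks.shift()(); // run one queued background task
    }
    if (tasks.length > 0) {
      idle(work); // tasks remain: wait for the next idle period
    } else if (onDone) {
      onDone();
    }
  });
}
```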

Continuous improvements

Implementing these strategies requires a thoughtful approach to web development and an understanding of how user interactions are processed. Tools like Lighthouse and PageSpeed Insights can provide insights into your Interaction to Next Paint score. In addition, these can identify specific areas for improvement.

You can significantly enhance your site’s responsiveness by optimizing event callbacks, minimizing main thread blockage, breaking up long tasks, and prioritizing user-centric actions. This leads to a better user experience. It also aligns with Google’s emphasis on page responsiveness as a critical SEO component in the Core Web Vitals era.

Improving INP is a continuous process that can lead to substantial gains in user satisfaction and engagement. As you implement these changes, monitor your site’s performance. Check the impact on your INP scores and refine your strategies for even better results.

Looking ahead

The introduction of INP signals Google’s ongoing commitment to refining its page experience signals. Staying informed and proactive in optimizing for INP and other Core Web Vitals is imperative for you to excel in SEO.

Interaction to Next Paint is a pivotal metric for assessing and enhancing web page responsiveness. Understand its nuances, embrace the available tools, and implement data-driven optimization strategies. Ensure that your website meets the ever-changing standards of user experience and SEO.

Let’s continue the conversation in the comments below. Share your experiences, challenges, and successes in working to improve INP. Together, let’s prepare those sites for lift-off!

Coming up next!