Google Updates Crawl Budget Best Practices via @sejournal, @MattGSouthern

Google has updated its crawl budget guidelines, stressing the need to maintain consistent link structures between mobile and desktop websites.

  • Large websites must ensure mobile versions contain all desktop links or risk slower page discovery.
  • The update mainly impacts sites with over 10,000 pages or those experiencing indexing issues.
  • Link structure consistency across mobile and desktop is now a Google-recommended best practice for crawl budget optimization.

The SEO Agency Guide To Efficient WordPress Hosting & Management via @sejournal, @kinsta

This post was sponsored by Kinsta. The opinions expressed in this article are the sponsor’s own.

Managing client sites can quickly become costly in terms of time, money, and expertise, especially as your agency grows.

You’re constantly busy fixing slow WordPress performance, handling downtime, or regularly updating and backing up ecommerce sites and small blogs.

The solution to these challenges might lie in fully managed hosting for WordPress sites.

Opting for a fully managed hosting provider that specializes in WordPress and understands agency needs can save you both time and money. By making the switch, you can focus on what truly matters: serving your current clients and driving new business into your sales funnel.

WordPress Worries & How To Keep Clients Happy

For SEO agencies managing multiple client sites, ensuring consistently fast performance across the board is essential. Websites with poor performance metrics are more likely to see a dip in traffic, increased bounce rates, and lost conversion opportunities.

Managed hosting, especially hosting that specializes in and is optimized for WordPress, offers agencies a way to deliver high-speed, well-performing sites without constantly battling technical issues.

Clients expect seamless performance, but handling these technical requirements for numerous websites can be a time-consuming process. While WordPress is versatile and user-friendly, it does come with performance challenges.

SEO agencies must deal with frequent updates, plugin management, security vulnerabilities, and optimization issues.

Challenges like bloated themes, inefficient plugins, and poor hosting infrastructure can lead to slow load times. You also need to ensure that client WordPress sites are secured against malware and hackers, which requires regular monitoring and updates.

With managed hosting, many of these tasks are automated, significantly reducing the workload on your team.

Managed hosting for WordPress simplifies the process by providing a full suite of performance, security, and maintenance services.

Instead of spending valuable time on manual updates, backups, and troubleshooting, you can rely on your hosting provider to handle these tasks automatically, resulting in reduced downtime, improved site performance, and a more efficient use of resources.

Ultimately, you can focus your energy on SEO strategies that drive results for your clients.

Basics Of Managed Hosting For WordPress

Managed hosting providers like Kinsta take care of all the technical aspects of running WordPress websites, including performance optimization, security, updates, backups, and server management.

We take over these responsibilities to ensure the platform runs smoothly and securely without the constant need for manual intervention.

Kinsta also eliminates common WordPress performance bottlenecks, including slow-loading themes, outdated plugins, inefficient database queries, and suboptimal server configurations.

Key Benefits Of Efficient Managed Hosting For SEO

1. Performance & Speed

Core Web Vitals, Google’s user experience metrics, play a significant role in determining search rankings. Managed hosting improves metrics like LCP, FID, and CLS by offering high-performance servers and built-in caching solutions.

CDNs reduce latency by serving your website’s static files from servers closest to the user, significantly improving load times.

Kinsta, for example, uses Google Cloud’s premium tier network and C2 virtual machines, ensuring the fastest possible load times for WordPress sites. We also provide integrated CDN services, along with advanced caching configurations, which ensure that even resource-heavy WordPress sites load quickly.

And the benefits are instantly noticeable.

Before the switch, Torro Media faced performance issues, frequent downtimes, and difficulties scaling their websites to handle traffic growth. These issues negatively affected their clients’ user experience and SEO results.

After migrating to Kinsta, Torro Media saw notable improvements:

  • Faster website performance – Site load times significantly improved, contributing to better SEO rankings and overall user experience.
  • Reduced downtime – Kinsta’s reliable infrastructure ensured that Torro Media’s websites experienced minimal downtime, keeping client websites accessible.
  • Expert support – Our support team helped Torro Media resolve technical issues efficiently, allowing the agency to focus on growth rather than troubleshooting.

As a result, Torro was able to scale its operations and deliver better results for its clients.

2. WP-Specific Security

Security is a critical component of managed hosting. Platforms like Kinsta offer automatic security patches, malware scanning, and firewalls tailored specifically for WordPress.

These features are vital to protecting your clients’ sites from cyber threats, which, if left unchecked, can lead to ranking drops due to blacklisting by search engines.

Downtime and security breaches negatively impact SEO. Google devalues sites that experience frequent downtime or security vulnerabilities.

Managed hosting providers minimize these risks by maintaining secure, stable environments with 24/7 monitoring, helping ensure that your clients’ sites remain online and safe from attacks.

3. Automatic Backups & Recovery

Automatic daily backups are a standard feature of managed hosting, protecting against data loss due to server crashes or website errors. For agencies, this means peace of mind, knowing that they can restore their clients’ sites quickly in case of a problem. The ability to quickly recover from an issue helps maintain SEO rankings, as prolonged downtime can hurt search performance.

Managed hosting providers often include advanced tools such as one-click restore points and robust disaster recovery systems. Additionally, having specialized support means that you have access to experts who understand WordPress and can help troubleshoot complex issues that affect performance and SEO.

Importance Of An Agency-Focused Managed WordPress Hosting Provider

For SEO agencies, uptime guarantees are essential to maintaining site availability. Managed hosting providers like Kinsta that specialize in serving agencies offer a 99.9% uptime SLA and multiple data center locations, ensuring that websites remain accessible to users across the globe.

Scalability and flexibility matter, too. As your agency grows, your clients’ hosting needs may evolve. Managed hosting platforms designed for agencies offer scalability, allowing you to easily add resources as your client portfolio expands.

With scalable solutions, you can handle traffic surges without worrying about site downtime or slowdowns.

Agency Dashboard - Managed Hosting for WordPress

1. The Right Dashboards

A user-friendly dashboard is crucial for managing multiple client sites efficiently. Kinsta’s MyKinsta dashboard, for example, allows agencies to monitor performance, uptime, and traffic across all sites in one centralized location, providing full visibility into each client’s website performance.

Hosting dashboards like Kinsta’s MyKinsta provide real-time insights into key performance metrics such as server response times, resource usage, and traffic spikes. These metrics are essential for ensuring that sites remain optimized for SEO.

2. Balance Costs With Performance Benefits

For agencies, managing hosting costs is always a consideration. While managed hosting may come with a higher price tag than traditional shared hosting, the benefits, such as faster performance, reduced downtime, and enhanced security, translate into better client results and long-term cost savings.

Kinsta offers flexible pricing based on traffic, resources, and features, making it easier for agencies to align their hosting solutions with client budgets.

By automating tasks like backups, updates, and security management, managed hosting allows agencies to significantly reduce the time and resources spent on day-to-day maintenance. This frees up your team to focus on delivering SEO results, ultimately improving efficiency and client satisfaction.

Don’t think it makes that big of a difference? Think again.

After migrating to Kinsta, 5Tales experienced:

  • Improved site speed – Load times dropped by over 50%, which enhanced user experience and SEO performance.
  • Better support – Kinsta’s specialized support team helped troubleshoot issues quickly and provided expert-level advice.
  • Streamlined management – With our user-friendly dashboard and automated features, 5Tales reduced the time spent on maintenance and troubleshooting.

Overall, 5Tales saw an increase in both client satisfaction and SEO rankings after moving to Kinsta.

3. Managed Hosting & Page Speed Optimization

Tools like Kinsta’s Application Performance Monitoring (APM) provide detailed insights into website performance, helping agencies identify slow-loading elements and optimize them. This level of transparency enables faster troubleshooting and more precise optimization efforts, which are critical for maintaining fast page speeds.

It’s also easy to integrate managed hosting platforms with your existing tech stack. Kinsta works seamlessly with SEO tools like Google Analytics, DebugBear, and others, allowing agencies to track site performance, analyze traffic patterns, and ensure sites are running at peak efficiency.

Conclusion

Managed hosting is not just a convenience. It’s a critical component of success for SEO agencies managing WordPress sites.

By leveraging the performance, security, and time-saving benefits of a managed hosting provider like Kinsta, agencies can improve client results, enhance their relationships, and streamline their operations.

When it comes to SEO, every second counts. A fast, secure, and well-maintained website will always perform better in search rankings. For agencies looking to deliver maximum value to their clients, investing in managed hosting is a smart, long-term decision.

Ready to make the switch?

Kinsta offers a no-shared-hosting guarantee, a 99.99% uptime SLA, and 24/7/365 support, so we’re here when you need us. Plus, we make it easy, effortless, and free to move to Kinsta.

Our team of migration experts has experience switching sites from all web hosts. And when you make the switch to Kinsta, we’ll give you up to $10,000 in free hosting to ensure you avoid paying double hosting bills.


Image Credits

Featured Image: Image by Kinsta. Used with permission.

In-Post Image: Images by Kinsta. Used with permission.

Google Revises URL Parameter Best Practices via @sejournal, @MattGSouthern

In a recent update to its Search Central documentation, Google has added specific guidelines for URL parameter formatting.

The update brings parameter formatting recommendations from a faceted navigation blog post into the main URL structure documentation, making these guidelines more accessible.

Key Updates

The new documentation specifies that developers should use the following:

  • Equal signs (=) to separate key-value pairs
  • Ampersands (&) to connect multiple parameters

Google recommends against using alternative separators such as:

  • Colons and brackets
  • Single or double commas
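
As a quick illustration, here is a hedged sketch; the URL and parameter names are hypothetical. It simply shows how a standard URL utility produces the key-value format Google recommends, next to a separator style it advises against:

```typescript
// Hypothetical faceted-navigation filters; names and values are illustrative.
const filters = new URLSearchParams({
  category: "shoes",
  color: "blue",
  sort: "price_asc",
});

// Recommended: key=value pairs joined with ampersands.
const recommended = `https://example.com/products?${filters.toString()}`;
// -> https://example.com/products?category=shoes&color=blue&sort=price_asc

// Discouraged: colons, brackets, or commas used as separators.
const discouraged = "https://example.com/products?category:shoes,color:blue";
```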

Why This Matters

URL parameters play a role in website functionality, particularly for e-commerce sites and content management systems.

They control everything from product filtering and sorting to tracking codes and session IDs.

While powerful, they can create SEO challenges like duplicate content and crawl budget waste.

Proper parameter formatting ensures better crawling efficiency and can help prevent common indexing issues that affect search performance.

The documentation addresses broader URL parameter challenges, such as managing dynamic content generation, handling session IDs, and effectively implementing sorting parameters.

Previous Guidance

Before this update, developers had to reference an old blog post about faceted navigation to find specific URL parameter formatting guidelines.

Consolidating this information into the main guidelines makes it easier to find.

The updated documentation can be found in Google’s Search Central documentation under the Crawling and Indexing section.

Looking Ahead

If you’re using non-standard parameter formats, start planning a migration to the standard format. Ensure proper redirects, and monitor your crawl stats during the switch.

While Google has not said non-standard parameters will hurt rankings, this update clarifies what they prefer. New sites and redesigns should adhere to the standard format to avoid future headaches.
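
If a migration like this is on your roadmap, the sketch below shows one hypothetical way to normalize a legacy colon/comma query string into the standard format before issuing a 301 redirect. The legacy format, function name, and paths are assumptions for illustration, not a prescribed approach:

```typescript
// Rewrites a hypothetical legacy query ("key:value,key:value") into the
// standard "key=value&key=value" form described in Google's guidelines.
function normalizeLegacyQuery(legacyQuery: string): string {
  const params = new URLSearchParams();
  for (const pair of legacyQuery.split(",")) {
    const [key, value] = pair.split(":");
    if (key && value) params.set(key.trim(), value.trim());
  }
  return params.toString();
}

// Example: /products?category:shoes,color:blue could be 301-redirected to
// /products?category=shoes&color=blue
console.log(normalizeLegacyQuery("category:shoes,color:blue"));
// -> "category=shoes&color=blue"
```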


Featured Image: Vibe Images/Shutterstock

Google’s Mueller Dismisses Core Web Vitals Impact On Rankings via @sejournal, @MattGSouthern

Google Search Advocate John Mueller has reaffirmed that Core Web Vitals are not major ranking factors, responding to data that suggested otherwise.

His statements come amid growing industry discussion about the immediate impact of site performance on search visibility.

Mueller’s Stance

Mueller stated on LinkedIn:

“We’ve been pretty clear that Core Web Vitals are not giant factors in ranking, and I doubt you’d see a big drop just because of that.”

The main benefit of improving website performance is providing a better user experience.

A poor experience could naturally decrease traffic by discouraging return visitors, regardless of how they initially found the site.

Mueller continues:

“Having a website that provides a good experience for users is worthwhile, because if users are so annoyed that they don’t want to come back, you’re just wasting the first-time visitors to your site, regardless of where they come from.”

Small Sites’ Competitive Edge

Mueller believes smaller websites have a unique advantage when it comes to implementing SEO changes.

Recalling his experience of trying to get a big company to change a robots.txt line, he explains:

“Smaller sites have a gigantic advantage when it comes to being able to take advantage of changes – they can be so much more nimble.”

Mueller noted that larger organizations may need extensive processes for simple changes, while smaller sites can update things like robots.txt in just 30 minutes.

He adds:

“None of this is easy, you still need to figure out what to change to adapt to a dynamic ecosystem online, but I bet if you want to change your site’s robots.txt (for example), it’s a matter of 30 minutes at most.”

Context

Mueller’s response followed research presented by Andrew Mcleod, who documented consistent patterns across multiple websites indicating rapid ranking changes after performance modifications.

In one case, a site with over 50,000 monthly visitors experienced a drop in traffic within 72 hours of implementing advertisements.

Mcleod’s analysis, which included five controlled experiments over three months, showed:

  • Traffic drops of up to 20% within 48 hours of enabling ads
  • Recovery periods of 1-2 weeks after removing ads
  • Consistent patterns across various test cases

Previous Statements

This latest guidance aligns with Mueller’s previous statements on Core Web Vitals.

In a March podcast, Mueller confirmed that Core Web Vitals are used in “ranking systems or in Search systems,” but emphasized that perfect scores won’t notably affect search results.

Mueller’s consistent message is clear: while Core Web Vitals are important for user experience and are part of Google’s ranking systems, you should prioritize content quality rather than focus on metrics.

Looking Ahead

Core Web Vitals aren’t major ranking factors, per Mueller.

While Google’s stance on ranking factors remains unchanged, the reality is that technical performance and user experience work together to influence traffic.


Featured Image: Ye Liew/Shutterstock

Google’s Mueller On How To Handle Legacy AMP Subdomains via @sejournal, @MattGSouthern

Google’s John Mueller advises site owners on managing outdated AMP subdomains, suggesting redirects or complete DNS removal.

  • Mueller recommends either keeping 301 redirects or removing the AMP subdomain entirely from DNS.
  • For sites with 500,000 pages, crawl budget impact from legacy AMP URLs isn’t a major concern.
  • AMP subdomains have their own separate crawl budget from the main domain.

How Page Performance Hurts UX & How You Can Fix It via @sejournal, @DebugBear

This post was sponsored by DebugBear. The opinions expressed in this article are the sponsor’s own.

From a user’s perspective, a slow website can be incredibly frustrating, creating a poor experience. But the impact of sluggish load times goes deeper than just user frustration.

Poor page performance affects search rankings, overall site engagement, E-E-A-T, and conversion rates, resulting in abandoned sessions, lost sales, and damaged trust.

Even if Google’s Core Web Vitals (CWV) Report is all green.

Sure, the Chrome UX Report (CrUX) and Google’s CWV reports can indicate there’s an issue, but that’s it. They don’t provide you with enough details to identify, troubleshoot, and fix the issue.

And fixing these issues is vital to your digital success.

Core Web Vitals - DebugBear Page Performance Tool. Image from DebugBear, October 2024

This article explores why slow websites are bad for user experience (UX), the challenges that cause them, and how advanced page performance tools can help fix these issues in ways that basic tools can’t.

UX, Brand Perception & Beyond

While often at the bottom of a technical SEO checklist, site speed is critical for UX. Sites that load in one second convert 2.5 to 3 times more than sites that require five seconds to load.

And yet, today, an estimated 14% of B2C ecommerce websites require five seconds or more to load.

These numbers become even more pronounced for mobile users, for whom pages load 70.9% slower. Mobile users have 31% fewer pageviews and an average of 4.8% higher bounce rate per session.

According to a recent Google study, 53% of mobile users will abandon a page if it takes more than three seconds to load.

Poor page experience can negatively affect other aspects of your site, too:

  • Search Rankings – Google includes page experience, of which CWV and page performance are factors, when ranking web pages.
  • User Trust – Poorly performing pages fail to meet a potential customer’s expectations. Users often perceive them as the brand inconveniencing them, introducing stress, negative emotions, and a loss of control into the buying process. Slower pages can also cause users to forget information gained from previous pages, reducing the effectiveness of advertising, copy, and branding campaigns between clicks.
  • User Retention – Site visitors who experience slow load times may never return, reducing retention rates and customer loyalty.

Why Basic Page Performance Tools Don’t Fully Solve The Problem

Tools like Google PageSpeed Insights or Lighthouse give valuable insights into how your website performs, but they can often be limited. They tell you that there’s an issue but often fall short of explaining what caused it or how to fix it.

Google’s Chrome User Experience Report (CrUX) and Core Web Vitals have become essential in tracking website performance and user experience.

These metrics—Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS)—offer valuable insights into how users perceive a website’s speed and stability.

However, CrUX and Core Web Vitals only tell part of the story. They indicate that a problem exists but don’t show the root cause or offer an immediate path for improvement.

For instance, your LCP might be poor, but without deeper page speed analysis, you wouldn’t know whether it’s due to an unoptimized image, a slow server response, or third-party scripts.

Page Performance Broken Down By Geolocation - DebugBear. Image from DebugBear, October 2024

Here’s where DebugBear stands out. DebugBear digs deeper, offering more granular data and unique features that basic tools don’t provide.

Continuous Monitoring and Historical Data – Many speed testing tools only offer snapshots of performance data. DebugBear, on the other hand, allows for continuous monitoring over time, providing an ongoing view of your site’s performance. This is crucial for detecting issues that crop up unexpectedly or tracking the effectiveness of your optimizations.

Granular Breakdown by Device, Location, and Browser – Basic tools often provide aggregated data, which hides the differences between user experiences across various devices, countries, and network conditions. DebugBear lets you drill down to see how performance varies, allowing you to optimize for specific user segments.

Pinpointing Content Elements Causing Delays – One of DebugBear’s standout features is its ability to show exactly which content elements—images, scripts, or third-party code—are slowing down your website. Rather than wasting hours digging through code and experimenting with trial and error, DebugBear highlights the specific elements causing delays, allowing for targeted, efficient fixes.

Why You Need Continuous Page Speed Testing

One of the biggest pitfalls in web performance optimization is relying on single-point speed tests.

Page Performance Breakdown - Content Elements in DebugBear. Image from DebugBear, October 2024

Running a one-time test may give you a snapshot of performance at that moment, but it doesn’t account for fluctuations caused by different factors, such as traffic spikes, varying user devices, or changes to site content.

Without continuous testing, you risk spending hours (or even days) trying to identify the root cause of performance issues.

DebugBear solves this problem by continuously tracking page speed across different devices and geographies, offering detailed reports that can be easily shared with team members or stakeholders.

If a performance dip occurs, DebugBear provides the data necessary to quickly identify and rectify the issue, saving you from the endless trial-and-error process of manual debugging.

Without tools like DebugBear, you’re left with only a high-level view of your website’s performance.

This means hours of trying to guess the underlying issues based on broad metrics, with no real insight into what’s dragging a site down.

Different Users Experience Performance Differently

Not all users experience your website’s performance in the same way.

Device type, geographic location, and network speed can significantly affect load times and interaction delays.

For example, a user on a fast fiberoptic connection in the U.S. may have a completely different experience than someone on a slower mobile network in India.

This variance in user experience can be hidden in aggregate data, leading you to believe your site is performing well when a significant portion of your audience is actually struggling with slow speeds.

Here’s why breaking down performance data by device, country, and browser matters:

  • Device-Specific Optimizations – Some elements, like large images or animations, may perform well on desktop but drag down speeds on mobile.
  • Geographic Performance Variations – International users may experience slower speeds due to server location or network conditions. DebugBear can highlight these differences and help you optimize your content delivery network (CDN) strategy.
  • Browser Differences – Different browsers may handle elements like JavaScript and CSS in different ways, impacting performance. DebugBear’s breakdown by browser ensures you’re not overlooking these subtleties.

Without this granular insight, you risk alienating segments of your audience and overlooking key areas for optimization.

And troubleshooting these issues becomes an expensive nightmare.

Just ask SiteCare.

WordPress web development and optimization service provider SiteCare uses DebugBear to quickly troubleshoot a full range of WordPress sites, solve performance issues faster, and monitor them for changes. This lets SiteCare provide high-quality service to its clients while saving thousands of hours and dollars every year.

DebugBear offers these breakdowns, providing a clear view of how your website performs for all users, not just a select few.

Real User Monitoring: The Key To Accurate Performance Insights

In addition to synthetic testing (which mimics user interactions), real user monitoring (RUM) is another powerful feature technical SEOs and marketing teams will find valuable.

While synthetic tests offer valuable controlled insights, they don’t always reflect the real-world experiences of your users.

RUM captures data from actual users as they interact with your site, providing real-time, accurate insights into what’s working and what isn’t.

For instance, real user monitoring can help you:

  • Identify performance issues unique to specific user segments.
  • Detect trends that may not be visible in synthetic tests, such as network issues or slow third-party scripts.
  • Measure the actual experience users are having on your website, not just the theoretical one.
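
As a generic illustration of how this kind of field data gets collected (this is not DebugBear’s own snippet; it’s a minimal sketch using Google’s open-source web-vitals library, and the "/rum" endpoint is a placeholder):

```typescript
import { onCLS, onINP, onLCP } from "web-vitals";

// Forward each metric measured on a real visitor's device to a
// collection endpoint ("/rum" is a placeholder for a RUM service).
function sendToAnalytics(metric: { name: string; value: number; id: string }) {
  navigator.sendBeacon("/rum", JSON.stringify(metric));
}

onCLS(sendToAnalytics); // layout stability real users experience
onINP(sendToAnalytics); // responsiveness to real interactions
onLCP(sendToAnalytics); // perceived load speed of the main content
```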

Without real user monitoring, you might miss critical issues that only surface under specific conditions, like a heavy user load or slow mobile networks.

If you’re not using continuous page speed testing and in-depth reports, you’re flying blind.

You may see an overall decline in performance without understanding why, or you could miss opportunities for optimization that only reveal themselves under specific conditions.

The result?

Wasted time, frustrated users, lost conversions, and a website that doesn’t perform up to its potential.

DebugBear solves this by offering both continuous monitoring and granular breakdowns, making it easier to troubleshoot issues quickly and accurately.

With detailed reports, you’ll know exactly what to fix and where to focus your optimization efforts, significantly cutting down on the time spent searching for problems.


Image Credits

Featured Image: Image by Shutterstock. Used with permission.

In-Post Images: Images by DebugBear. Used with permission.

Google To Retire Sitelinks Search Box In November via @sejournal, @MattGSouthern

Google has announced the retirement of the sitelinks search box feature.

This change, set to take effect on November 21, marks the end of a tool that has been part of Google Search for over a decade.

The sitelinks search box, introduced in 2014, allowed users to perform site-specific searches directly from Google’s search results page.

It appeared above the sitelinks for certain websites, usually when searching for a company by name.

Declining Usage

Google cites declining usage as the reason for this decision, stating:

“Over time, we’ve noticed that usage has dropped.”

Potential Impact

Google affirms that removing the sitelinks search box won’t affect search rankings or the display of other sitelinks.

This change is purely visual and doesn’t impact a site’s position in search results.

Implementation

This update will be rolled out globally, affecting search results in all languages and countries.

Google has confirmed that the change won’t be listed in the Search status dashboard, indicating that it’s not considered a significant algorithmic update.

Search Console & Rich Results Test

Following the removal of the sitelinks search box, Google plans to update the following tools:

  1. The Search Console rich results report for sitelinks search box will be removed.
  2. The Rich Results Test will no longer highlight the related markup.

Structured Data Considerations

While site owners can remove the sitelinks search box structured data from their sites, Google says that’s unnecessary.

Unsupported structured data won’t cause issues in Search or trigger errors in Search Console reports.

It’s worth noting that the ‘WebSite’ structured data, also used for site names, continues to be supported.
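
For reference, here is a minimal sketch of the kind of ‘WebSite’ markup that remains supported for site names; the name and URL are placeholders. The ‘potentialAction’/‘SearchAction’ block that previously powered the sitelinks search box is the part that no longer has any effect, though per Google it can be left in place without causing errors.

```json
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "Example Site",
  "url": "https://www.example.com/"
}
```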

Historical Context

The sitelinks search box was initially announced in September 2014 as an improvement to help users find specific website content more easily.

It supported features like autocomplete and allowed websites to implement schema markup for better integration with their own search pages.

Looking Ahead

Website owners and SEO professionals should take note of this update, though no immediate action is required.


Featured Image: MrB11/Shutterstock

Google Explains How Cumulative Layout Shift (CLS) Is Measured via @sejournal, @MattGSouthern

Google’s Web Performance Developer Advocate, Barry Pollard, has clarified how Cumulative Layout Shift (CLS) is measured.

CLS quantifies how much unexpected layout shift occurs when a person browses your site.

This metric matters to SEO as it’s one of Google’s Core Web Vitals. Pages with low CLS scores provide a more stable experience, potentially leading to better search visibility.

How is it measured? Pollard addressed this question in a thread on X.

Understanding CLS Measurement

Pollard began by explaining the nature of CLS measurement:

“CLS is ‘unitless’ unlike LCP and INP which are measured in seconds/milliseconds.”

He further clarified:

“Each layout shift is calculated by multiplying two percentages or fractions together: what moved (impact fraction) and how much it moved (distance fraction).”

This calculation method helps quantify the severity of layout shifts.

As Pollard explained:

“The whole viewport moves all the way down – that’s worse than just half the viewport moving all the way down. The whole viewport moving down a little? That’s not as bad as the whole viewport moving down a lot.”
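
To put hypothetical numbers on that: if a late-loading banner pushes content covering half the viewport (impact fraction 0.5) down by a quarter of the viewport height (distance fraction 0.25), that single shift scores 0.5 × 0.25 = 0.125, which is already past the 0.1 “good” threshold on its own.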

Worst-Case Scenario

Pollard described the worst-case scenario for a single layout shift:

“The maximum layout shift is if 100% of the viewport (impact fraction = 1.0) is moved one full viewport down (distance fraction = 1.0).

This gives a layout shift score of 1.0 and is basically the worst type of shift.”

However, he reminds us of the cumulative nature of CLS:

“CLS is Cumulative Layout Shift, and that first word (cumulative) matters. We take all the individual shifts that happen within a short space of time (max 5 seconds) and sum them up to get the CLS score.”
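
Continuing the hypothetical example above: if that 0.125 shift were followed within the same five-second window by two smaller shifts of 0.04 and 0.03, the reported score for that window would be 0.125 + 0.04 + 0.03 = 0.195.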

Pollard explained the reasoning behind the 5-second measurement window:

“Originally we cumulated ALL the shifts, but that didn’t really measure the UX—especially for pages opened for a long time (think SPAs or email). Measuring all shifts meant, given enough time, even the best pages would fail!”

He also noted the theoretical maximum CLS score:

“Since each element can only shift when a frame is drawn and we have a 5 second cap and most devices run at 60fps, that gives a theoretical cap on CLS of 5 secs * 60 fps * 1.0 max shift = 300.”

Interpreting CLS Scores

Pollard addressed how to interpret CLS scores:

“… it helps to think of CLS as a percentage of movement. The good threshold of 0.1 means the page moved about 10%—which could mean the whole page moved 10%, or half the page moved 20%, or lots of little movements were equivalent to either of those.”

Regarding the specific threshold values, Pollard explained:

“So why is 0.1 ‘good’ and 0.25 ‘poor’? That’s explained here as a combination of what we’d want (CLS = 0!) and what is achievable … 0.05 was actually achievable at the median, but for many sites it wouldn’t be, so we went slightly higher.”

See also: What is CLS and How to Optimize It?

Why This Matters

Pollard’s insights provide web developers and SEO professionals with a clearer understanding of measuring and optimizing for CLS.

As you work with CLS, keep these points in mind:

  • CLS is unitless and calculated from impact and distance fractions.
  • It’s cumulative, measuring shifts over a 5-second window.
  • The “good” threshold of 0.1 roughly equates to 10% of viewport movement.
  • CLS scores can exceed 1.0 due to multiple shifts adding up.
  • The thresholds (0.1 for “good”, 0.25 for “poor”) balance ideal performance with achievable goals.

With this insight, you can make adjustments to achieve Google’s threshold.


Featured Image: Piscine26/Shutterstock

Google Expands Structured Data Support For Product Certifications via @sejournal, @MattGSouthern

Google has announced an update to its Search Central documentation, introducing support for certification markup in product structured data.

This change will take full effect in April and aims to provide more comprehensive and globally relevant product information.

New Certification Markup For Merchant Listings

Google has added Certification markup support for merchant listings in its product structured data documentation.

This addition allows retailers and ecommerce sites to include detailed certification information about their products.

Transition From EnergyConsumptionDetails to Certification Type

A key aspect of this update is replacing the EnergyConsumptionDetails type with the more versatile Certification type.

The new type can support a wider range of countries and broader certifications.

Google recommends that websites using EnergyConsumptionDetails in their structured data switch to the Certification type before April.

This will ensure product pages remain optimized for Google’s merchant listing experiences.

Expanded Capabilities & Global Relevance

The move to the Certification type represents an expansion in the types of product certifications that can be communicated through structured data.

While energy efficiency ratings were a primary focus of the EnergyConsumptionDetails type, the new Certification markup can encompass a much wider array of product certifications and standards.

This change is relevant for businesses operating in multiple countries, as it allows for more nuanced and locally applicable certification information to be included.

Implementation Guidelines

Google has provided examples in its updated documentation to guide webmasters in implementing the new Certification markup.

These examples include specifying certifications such as CO2 emission classes for vehicles and energy efficiency labels for electronics.

The structured data should be added to product pages using JSON-LD format, with the Certification type nested within the product’s structured data.

Review the full documentation to ensure proper implementation.
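
As a rough sketch of what such markup might look like (property names and values below are placeholders, and the exact required fields should be taken from Google’s documentation rather than this example), a certification can be nested in the product’s JSON-LD roughly like this:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example refrigerator",
  "hasCertification": {
    "@type": "Certification",
    "issuedBy": {
      "@type": "Organization",
      "name": "Example certification body"
    },
    "name": "Example energy efficiency label",
    "certificationRating": {
      "@type": "Rating",
      "ratingValue": "C"
    }
  }
}
```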

Including certification information in structured data could lead to more informative product listings, potentially influencing user click-through rates and purchase decisions.

For consumers, this update means access to more detailed and standardized product information directly in search results, particularly regarding certifications and compliance with various standards.

Next Steps

Website owners and SEO professionals should take the following steps:

  1. Review current use of EnergyConsumptionDetails in product structured data.
  2. Plan for the transition to the Certification type before April.
  3. Implement the new Certification markup on product pages, following Google’s guidelines.
  4. Test the implementation using Google’s Rich Results Test tool.

As with any significant change to structured data implementation, it is advisable to monitor search performance and rich result appearances after making these updates.


Featured Image: lilik ferri yanto/Shutterstock