GA4 Update Brings Alignment With Google Ads Targeting via @sejournal, @MattGSouthern

Google announced an update to the advertising section within Google Analytics 4 (GA4).

The enhancement aims to clarify and align user counts eligible for remarketing and ad personalization.

Under the change, advertisers can now quickly view the size of their “Advertising Segments” within GA4’s interface.

These segments represent the pool of users whose data can be leveraged for remarketing campaigns and personalized ad targeting through products like Google Ads.

Improved Synchronization For Unified Insights

Previously, there could be discrepancies between the user counts shown as eligible for advertising use cases in GA4 and the Google Ads Audience Manager.

With this update, Google says the numbers will be fully aligned, allowing marketers to confidently make data-driven advertising decisions.

Expanding Advertising Segment Visibility

Along with the alignment fix, the update expands visibility into advertising segment sizes within the GA4 interface.

A new “Advertising segments” panel under the “Advertising” section reports the number of users GA4 collects and sends to ad products for personalization.

An “advertising segment” is a list of GA4 users synchronized with Google advertising products for remarketing and personalized ad targeting purposes.

Segment sizes can vary based on targeting requirements for different ad networks.

Why SEJ Cares

This update from Google addresses a key pain point for advertisers utilizing GA4 and Google Ads.

Full alignment between advertising audience sizes across products eliminates confusion and enables more data-driven strategies.

The added transparency into advertising segment sizes directly in GA4 is also a welcome upgrade.

How This Can Help You

With aligned user counts, advertisers can plan and forecast remarketing campaigns with greater precision using GA4 data.

This unified view means you can make media investment decisions based on accurate reach projections.

Additionally, the new advertising segments panel provides extra context about the scope of your audiences for ad personalization.

This visibility allows for more informed strategies tailored to your specific segment sizes.


Featured Image: Lightspring/Shutterstock

Google Launches Custom Event Data Import For GA4 via @sejournal, @MattGSouthern

Google announced a new feature for Google Analytics 4 (GA4), rolling out support for custom event data import.

This allows you to combine external data sources with existing GA4 data for more comprehensive reporting and analysis.

Google’s announcement reads:

“With this feature, you can use a combination of standard fields and event-scoped custom dimensions to join and analyze imported event metadata with your existing Analytics data.

You can then create custom reports for a more complete view of your Analytics data and imported event metadata.”

Custom Event Data Import: How It Works

Google’s help documentation describes the new capability:

“Custom event data import allows you to import and join data in ways that make sense to you. You have more flexibility in the choice of key and import dimensions.”

You begin the process by defining reporting goals and identifying any relevant external data sources not collected in Google Analytics.

You can then set up custom, event-scoped dimensions to use as “join keys” to link the imported data with Analytics data.
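To make the join concrete, here is a hedged sketch of preparing the metadata file for upload. The `article_id` join key and the metadata fields are hypothetical, standing in for whatever event-scoped custom dimension and external fields you define:

```python
import csv
import io

# Hypothetical example: GA4 events carry an event-scoped custom
# dimension "article_id"; an external CMS holds extra metadata about
# each article that never reaches Analytics.
external_metadata = [
    {"article_id": "A-1001", "author": "Jane Doe", "content_type": "tutorial"},
    {"article_id": "A-1002", "author": "John Roe", "content_type": "news"},
]

def build_import_csv(rows, join_key="article_id"):
    """Serialize metadata rows into a CSV keyed by the join dimension."""
    buffer = io.StringIO()
    fieldnames = [join_key] + [k for k in rows[0] if k != join_key]
    writer = csv.DictWriter(buffer, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

print(build_import_csv(external_metadata))
```

The first column carries the join key, so each imported row can be matched against events that recorded the same custom dimension value.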

Mapping Fields & Uploading Data

Once the custom dimensions are configured, Google provides a detailed mapping interface for associating the external data fields with the corresponding Analytics fields and parameters.

This allows seamless integration of the two data sources.

Google’s help documentation reads:

“In the Key fields table, you’ll add the Analytics fields to join your imported data. In the Import fields table, you’ll select the external fields to include via the join key across both standard Analytics fields/dimensions and custom typed-in event parameters.”

After the data is uploaded through the import interface, Google notes it can take up to 24 hours for the integrated data set to become available in Analytics reports, audiences, and explorations.

Why SEJ Cares

GA4’s custom event data import feature creates opportunities for augmenting Google Analytics data with a business’s proprietary sources.

This allows you to leverage all available data, extract actionable insights, and optimize strategies.

How This Can Help You

Combining your data with Google’s analytics data can help in several ways:

  1. You can create a centralized data repository containing information from multiple sources for deeper insights.
  2. You can analyze user behavior through additional lenses by layering your internal data, such as customer details, product usage, marketing campaigns, etc., on top of Google’s engagement metrics.
  3. Combining analytics data with supplementary data allows you to define audience segments more granularly for targeted strategies.
  4. Using the new data fields and dimensions, you can build custom reports and dashboards tailored to your specific business.

For businesses using GA4, these expanded reporting possibilities can level up your data-driven decision-making.


Featured Image: Muhammad Alimaki/Shutterstock

Google Analytics Update To Improve Paid Search Attribution via @sejournal, @MattGSouthern

Google has announced an update to the attribution models in Google Analytics 4 (GA4) to improve the accuracy of paid search campaigns.

Google plans to roll out adjustments over the next two weeks to address a longstanding issue where conversions originating from paid search were mistakenly attributed to organic search traffic.

According to the company’s statement, this misattribution occurs with single-page applications when the “gclid” parameter — a unique identifier for paid search clicks — fails to persist across multiple page views.

As a result, conversions that should have been credited to paid search campaigns were incorrectly assigned to organic search channels.
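For illustration, here is what the parameter looks like in practice. This sketch only extracts `gclid` from a landing-page URL; keeping that value alive across SPA route changes is the part Google's update now handles. The URL and click ID are made up:

```python
from urllib.parse import urlparse, parse_qs

def extract_gclid(landing_url):
    """Return the gclid click identifier from a landing-page URL, or None."""
    query = parse_qs(urlparse(landing_url).query)
    return query.get("gclid", [None])[0]

# Made-up landing URL from a paid search click.
url = "https://example.com/products?utm_source=google&gclid=Cj0KCQ_example"
print(extract_gclid(url))  # the click ID that must survive route changes
```

If this identifier is lost before a conversion fires, the conversion has no link back to the ad click, which is how the organic misattribution described above occurs.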

Improved Conversion Attribution Methodology

To address this problem, Google is modifying how it attributes conversions to ensure campaign information is captured from the initial event on each page.

Under the new methodology, the attribution will be updated to reflect the appropriate traffic source if a user exits the site and returns through a different channel.

This change is expected to increase the number of conversions attributed to paid search campaigns, potentially impacting advertising expenditures for marketers leveraging Google Ads.

Preparation & Review Recommended

In light of the impending update, Google strongly advises advertisers to review their budget caps and make necessary adjustments before the changes take effect.

As more conversions may be assigned to paid search efforts, campaign spending levels could be affected.

Managing budgets proactively will help you stay aligned with the evolving performance data.

Why SEJ Cares

Improved attribution accuracy gives you a clearer picture of how well your paid search advertising works.

This will allow you to make smarter decisions about where to spend your marketing budget and how to improve your paid search campaigns based on precise data.

How This Can Help You

With more accurate conversion data, you can:

  • Gain a clearer picture of your paid search campaigns’ actual impact and return on investment (ROI).
  • Optimize campaigns based on reliable performance metrics, allowing for more effective budget allocation and targeting strategies.
  • Identify areas for improvement or expansion within your paid search efforts, informed by precise attribution data.
  • Make data-driven decisions regarding budget adjustments, bid strategies, and overall campaign management.

To get the most out of these changes, review your budget caps and make necessary adjustments to anticipate the potential increase in conversions attributed to paid search campaigns.

Staying ahead will make it easier to adapt to the new attribution method and leverage the improved data.


Featured Image: Piotr Swat/Shutterstock

MozCon 2024 Announcements: AI SEO Tools & Affordable API Plans via @sejournal, @MattGSouthern

Today, at its MozCon 2024 conference, Moz announced a range of new AI-driven features and product updates.

The announcements focused on modernizing tools to address the changing needs of SEO professionals.

Ethan Hays, Moz’s General Manager, acknowledges that “the search landscape has seen more turbulence this last year than at almost any point in Moz’s 20-year history. Our goal is to help our customers face that head-on.”

Modern Tools For The Future Of SEO

Moz showcased a revamped version of STAT, its enterprise SERP tracking solution, and several usability improvements to the Moz Pro interface.

The company introduced Moz AI, a suite of tools designed to streamline workflows and enhance enterprise-grade data.

Additionally, Moz unveiled new affordable API plans, including Beta endpoints for keyword, intent, brand, and link metrics.

Moz AI

Moz AI is a collection of three features integrated into Moz Pro:

  • Search Intent in Keyword Suggestions
  • Domain Search Theme
  • Domain Keyword Topics in Domain Overview

These AI-driven capabilities allow Moz Pro customers to optimize their SEO strategies and make data-informed decisions with greater efficiency.

Accessible & Affordable Data

Moz announced the launch of new Moz API plans, making enterprise-quality data more accessible and affordable.

Starting at $5 per month, customers can now access an expanded set of Beta endpoints, including:

  • Keyword Metrics
  • Keyword Suggestions
  • Search Intent
  • Ranking Keywords
  • Brand Authority

These plans offer a cost-effective solution for businesses looking to build custom tools, dashboards, SaaS products, and apps using Moz’s robust data.

Hays emphasized Moz’s commitment to its customers, stating, “Our core commitment is unchanged: empowering SEO practitioners through software, education, and community.”

MozCon 2024

MozCon 2024, which is currently underway, will feature live demonstrations of these product enhancements. Attendees will be able to experience the new features firsthand and interact with Moz experts.

Following Moz’s presentation, we will update this article with first-hand details straight from MozCon.

In Summary

As Moz celebrates its 20th anniversary, it remains dedicated to empowering SEO professionals with the tools, education, and community support needed to navigate the constantly changing search environment.

How To Find Competitors’ Keywords: Tips & Tools

This post was sponsored by SE Ranking. The opinions expressed in this article are the sponsor’s own.

Wondering why your competitors rank higher than you?

The secret to your competitors’ SEO success might be as simple as targeting the appropriate keywords.

Since these keywords are successful for your competitors, there’s a good chance they could be valuable for you as well.

In this article, we’ll explore the most effective yet simple ways to find competitors’ keywords so that you can guide your own SEO strategy and potentially outperform your competitors in SERPs.

Benefits Of Competitor Keyword Analysis

Competitor keywords are the search terms your competitors target within their content to rank high in SERPs, either organically or through paid ads.

Collecting search terms that your competitors rely on can help you:

1. Identify & Close Keyword Gaps.

The list of high-ranking keywords driving traffic to your competitors may include valuable search terms you’re currently missing out on.

To close these keyword gaps, you can either optimize your existing content with these keywords or use them as inspiration for creating new content with high traffic potential.
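Conceptually, a keyword gap is just a set difference: the terms a competitor ranks for that your site does not. A minimal sketch, with made-up keyword lists:

```python
# Illustrative only: keyword lists are invented for the example.
competitor_keywords = {"seo audit", "rank tracker", "backlink checker", "serp api"}
your_keywords = {"seo audit", "rank tracker"}

# The gap: terms the competitor ranks for that you don't target yet.
keyword_gap = competitor_keywords - your_keywords
print(sorted(keyword_gap))
```

Gap-analysis tools add ranking and volume data on top, but the underlying comparison is this simple.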

2. Adapt To Market Trends & Customer Needs.

You may notice a shift in the keywords your competitors optimize content for. This could be a sign that market trends or customer expectations are changing.

Keep track of these keywords to jump on emerging trends and align your content strategy accordingly.

3. Enhance Visibility & Rankings.

Analyzing your competitors’ high-ranking keywords and pages can help you identify their winning patterns (e.g., content format, user intent focus, update frequency, etc).

Study what works for your rivals (and why) to learn how to adapt these tactics to your website and achieve higher SERP positions.

How To Identify Your Competitors’ Keywords

There are many ways to find keywords used by competitors within their content. Let’s weigh the pros and cons of the most popular options.

Use SE Ranking

SE Ranking is a complete toolkit that delivers unique data insights. These insights help SEO pros build and maintain successful SEO campaigns.

Here’s the list of pros that the platform offers for agency and in-house SEO professionals:

  1. Huge databases. SE Ranking has one of the web’s largest keyword databases. It features over 5 billion keywords across 188 regions. Also, the number of keywords in their database is constantly growing, with a 30% increase in 2024 compared to the previous year.
  2. Reliable data. SE Ranking collects keyword data, analyzes it, and computes core SEO metrics directly from its proprietary algorithm. The platform also relies on AI-powered traffic estimations that have up to a 100% match with GSC data.

Thanks to SE Ranking’s recent major data quality update, the platform boasts even fresher and more accurate information on backlinks and referring domains (both new and lost).

As a result, by considering the website’s backlink profile, authority, and SERP competitiveness, SE Ranking now makes highly accurate calculations of keyword difficulty. This makes it easy to see how likely your own website or page is to rank at the top of the SERPs for a particular query.

  3. Broad feature set. Beyond conducting competitive (& keyword) research, you can also use this tool to track keyword rankings, perform website audits, handle all aspects of on-page optimization, manage local SEO campaigns, optimize your content for search, and much more.
  4. Great value for money. The tool offers premium features with generous data limits at a fair price. This eliminates the need to choose between functionality and affordability.

Let’s now review how to use SE Ranking to discover the keywords your competitors are targeting for both organic search and paid advertising.

First, open the Competitive Research Tool and input your competitor’s domain name into the search bar. Select a region and click Analyze to initiate analysis of this website.

Image created by SE Ranking, May 2024

Depending on your goal, go either to the Organic Traffic Research or Paid Traffic Research tab on the left-hand navigation menu.

Here, you’ll be able to see data on estimated organic clicks, total number of keywords, traffic cost, and backlinks.

Image created by SE Ranking, May 2024

Scroll down the page and you'll see a table with all the keywords the website ranks for, along with data on search volume, keyword difficulty, user intent, SERP features triggered by each keyword, ranking position, URLs ranking for the analyzed keyword, and more.

Image created by SE Ranking, May 2024

What’s more, the tool allows you to find keywords your competitors rank for but you don’t.

To do this, head to the Competitor Comparison tab and add up to two websites for comparison.

Image created by SE Ranking, May 2024

Within the Missing tab, you’ll be able to see existing keyword gaps.

Image created by SE Ranking, May 2024

While the platform offers many benefits, there are also some downsides to be aware of, such as:

  1. Higher-priced plans are required for some features. For instance, historical data on keywords is only available to Pro and Business plan users.
  2. Data is limited to Google only. SE Ranking’s Competitor Research Tool only provides data for Google.

Use Google Keyword Planner

Google Keyword Planner is a free service you can use to find competitors’ paid keywords.

Here’s the list of benefits this tool offers in terms of competitive keyword analysis:

  1. Free access. Keyword Planner is completely free to use, which makes it a great option for SEO newbies and businesses with limited budgets.
  2. Core keyword data. The tool shows core SEO metrics like search volume, competition, and suggested bid prices for each identified keyword.
  3. Keyword categorization. Keyword Planner allows you to organize keywords into different groups, which may be helpful for creating targeted ad campaigns.
  4. Historical data. The tool has four years of historical data available.

Once you log into your Google Ads account, navigate to the Tools section and select Keyword Planner.

Screenshot from Google Ads, May 2024

Now, click on the Discover new keywords option.

Screenshot from Google Ads, May 2024

Choose the Start with a website option, enter your competitor’s website domain, region, and language, then choose whether to analyze the whole site (recommended for deeper insights) or a specific URL.

Screenshot from Google Ads, May 2024

And there you have it — a table with all keywords that your analyzed website uses in its Google Ads campaigns.

Screenshot from Google Ads, May 2024

Although Keyword Planner can be helpful, it’s not the most effective and data-rich tool for finding competitors’ keywords. Its main drawbacks are the following:

  1. No organic data. The tool offers data on paid keywords, which is mainly suitable for advertising campaigns.
  2. Broad search volume data. Since it’s displayed in ranges rather than exact numbers, it might be difficult to precisely assess the demand for identified keywords.
  3. No keyword gap feature. The tool doesn’t let you compare your keywords with your competitors’ side by side, so you can’t identify missing keyword opportunities.

So, if you want to access more reliable and in-depth data on competitors’ keywords, you’ll most likely need to consider other dedicated SEO tools.

Use SpyFu

SpyFu is a comprehensive SEO and PPC analysis tool created with the idea of “spying” on competitors.

Its main pros in terms of competitor keyword analysis are the following:

  1. Database with 10+ years of historical data. Although available only in a Professional plan, SpyFu offers long-term insights to monitor industry trends and adapt accordingly.
  2. Keyword gap analysis. Using this tool, you can easily compare your keywords to those of your competitors using metrics like search volume, keyword difficulty, organic clicks, etc.
  3. Affordability. It’s suitable for businesses on a tight budget.

To explore competitor data, simply visit their website and enter your competitor’s domain in the search bar.

You’ll be presented with valuable insights into their SEO performance, from estimated traffic to the list of their top-performing pages and keywords. Navigate to the Top Keywords section and click the View All Organic Keywords button to see the search terms they rank for.

Screenshot of SpyFu, May 2024

However, the free version only shows the top 5 keywords for a domain, along with metrics like search volume, rank change, SEO clicks, and so on. To perform a more comprehensive analysis, you’ll need to upgrade to a paid plan.

As for the tool’s cons, it’s worth mentioning:

  1. Keyword data may be outdated. On average, SpyFu updates data on keyword rankings once a month.
  2. Limited number of target regions. Keyword data is available for just 14 countries.

Wrapping Up

There’s no doubt that finding competitors’ keywords is a great way to optimize your own content strategy and outperform your rivals in SERPs.

By following the step-by-step instructions described in this article, we’re sure you’ll be able to find high-value keywords you haven’t considered before.

Ready to start optimizing your website? Sign up for SE Ranking and get the data you need to deliver great user experiences.


Image Credits

Featured Image: Image by SE Ranking. Used with permission.

Optimizing Interaction To Next Paint (INP): A Step-By-Step Guide via @sejournal, @DebugBear

This post was sponsored by DebugBear. The opinions expressed in this article are the sponsor’s own.

Keeping your website fast is important for user experience and SEO.

The Core Web Vitals initiative by Google provides a set of metrics to help you understand the performance of your website.

The three Core Web Vitals metrics are:

  • Largest Contentful Paint (LCP)
  • Interaction to Next Paint (INP)
  • Cumulative Layout Shift (CLS)

This post focuses on the recently introduced INP metric and what you can do to improve it.

How Is Interaction To Next Paint Measured?

INP measures how quickly your website responds to user interactions – for example, a click on a button. More specifically, INP measures the time in milliseconds between the user input and when the browser has finished processing the interaction and is ready to display any visual updates on the page.

Your website needs to complete this process in under 200 milliseconds to get a “Good” score. Values over half a second are considered “Poor”. A poor score in a Core Web Vitals metric can negatively impact your search engine rankings.
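These thresholds can be expressed as a small classifier. The “Needs Improvement” label for the middle band follows Google’s standard Core Web Vitals terminology:

```python
def inp_rating(inp_ms):
    """Classify an INP value (milliseconds) using the thresholds above:
    under 200 ms is "Good", over 500 ms is "Poor"."""
    if inp_ms < 200:
        return "Good"
    if inp_ms <= 500:
        return "Needs Improvement"
    return "Poor"

for value in (150, 350, 600):
    print(value, inp_rating(value))
```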

Google collects INP data from real visitors on your website as part of the Chrome User Experience Report (CrUX). This CrUX data is what ultimately impacts rankings.

Image created by DebugBear, May 2024

How To Identify & Fix Slow INP Times

The factors causing poor Interaction to Next Paint can often be complex and hard to figure out. Follow this step-by-step guide to understand slow interactions on your website and find potential optimizations.

1. How To Identify A Page With Slow INP Times

Different pages on your website will have different Core Web Vitals scores. So you need to identify a slow page and then investigate what’s causing it to be slow.

Using Google Search Console

One easy way to check your INP scores is using the Core Web Vitals section in Google Search Console, which reports data based on the Google CrUX data we’ve discussed before.

By default, page URLs are grouped into URL groups that cover many different pages. Be careful here – not all pages might have the problem that Google is reporting. Instead, click on each URL group to see if URL-specific data is available for some pages and then focus on those.

Screenshot of Google Search Console, May 2024

Using A Real-User Monitoring (RUM) Service

Google won’t report Core Web Vitals data for every page on your website, and it only provides the raw measurements without any details to help you understand and fix the issues. To get that, you can use a real-user monitoring tool like DebugBear.

Real-user monitoring works by installing an analytics snippet on your website that measures how fast your website is for your visitors. Once that’s set up you’ll have access to an Interaction to Next Paint dashboard like this:

Screenshot of the DebugBear Interaction to Next Paint dashboard, May 2024

You can identify pages you want to optimize in the list, hover over the URL, and click the funnel icon to look at data for that specific page only.

Image created by DebugBear, May 2024

2. Figure Out What Element Interactions Are Slow

Different visitors on the same page will have different experiences. A lot of that depends on how they interact with the page: if they click on a background image there’s no risk of the page suddenly freezing, but if they click on a button that starts some heavy processing then that’s more likely. And users in that second scenario will experience much higher INP.

To help with that, RUM data provides a breakdown of what page elements users interacted with and how big the interaction delays were.

Screenshot of the DebugBear INP Elements view, May 2024

The screenshot above shows different INP interactions sorted by how frequent these user interactions are. To make optimizations as easy as possible you’ll want to focus on a slow interaction that affects many users.

In DebugBear, you can click on the page element to add it to your filters and continue your investigation.

3. Identify What INP Component Contributes The Most To Slow Interactions

INP delays can be broken down into three different components:

  • Input Delay: Background code that blocks the interaction from being processed.
  • Processing Time: The time spent directly handling the interaction.
  • Presentation Delay: Displaying the visual updates to the screen.

You should focus on which INP component is the biggest contributor to the slow INP time, and ensure you keep that in mind during your investigation.

Screenshot of the DebugBear INP Components, May 2024

In this scenario, Processing Time is the biggest contributor to the slow INP time for the set of pages you’re looking at, but you need to dig deeper to understand why.

High processing time indicates that there is code intercepting the user interaction and running slow-performing logic. If you instead see a high input delay, that suggests background tasks are blocking the interaction from being processed, for example due to third-party scripts.
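The component breakdown can be sketched as a small helper that flags the biggest contributor; the millisecond values in the example are made up:

```python
def biggest_inp_component(input_delay_ms, processing_time_ms, presentation_delay_ms):
    """Return the dominant INP component and the total interaction latency."""
    components = {
        "Input Delay": input_delay_ms,
        "Processing Time": processing_time_ms,
        "Presentation Delay": presentation_delay_ms,
    }
    total = sum(components.values())  # the three components add up to the INP time
    name = max(components, key=components.get)
    return name, total

# Example breakdown: processing time dominates this interaction.
print(biggest_inp_component(30, 250, 40))
```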

4. Check Which Scripts Are Contributing To Slow INP

Sometimes browsers report specific scripts that are contributing to a slow interaction. Your website likely contains both first-party and third-party scripts, both of which can contribute to slow INP times.

A RUM tool like DebugBear can collect and surface this data. The main thing you want to look at is whether you mostly see your own website code or code from third parties.

Screenshot of the INP Primary Script Domain Grouping in DebugBear, May 2024

Tip: When you see a script or source-code function marked as “N/A”, this can indicate that the script comes from a different origin and has additional security restrictions that prevent RUM tools from capturing more detailed information.

This now begins to tell a story: it appears that analytics/third-party scripts are the biggest contributors to the slow INP times.

5. Identify Why Those Scripts Are Running

At this point, you now have a strong suspicion that most of the INP delay, at least on the pages and elements you’re looking at, is due to third-party scripts. But how can you tell whether those are general tracking scripts or if they actually have a role in handling the interaction?

DebugBear offers a breakdown that helps see why the code is running, called the INP Primary Script Invoker breakdown. That’s a bit of a mouthful – multiple different scripts can be involved in slowing down an interaction, and here you just see the biggest contributor. The “Invoker” is just a value that the browser reports about what caused this code to run.

Screenshot of the INP Primary Script Invoker Grouping in DebugBear, May 2024

The following invoker names are examples of page-wide event handlers:

  • onclick
  • onmousedown
  • onpointerup

You can see those a lot in the screenshot above, which tells you that the analytics script is tracking clicks anywhere on the page.

In contrast, if you saw invoker names like these that would indicate event handlers for a specific element on the page:

  • .load_more.onclick
  • #logo.onclick

6. Review Specific Page Views

A lot of the data you’ve seen so far is aggregated. It’s now time to look at the individual INP events, to form a definitive conclusion about what’s causing slow INP in this example.

Real user monitoring tools like DebugBear generally offer a way to review specific user experiences. For example, you can see what browser they used, how big their screen is, and what element led to the slowest interaction.

Screenshot of a Page View in DebugBear Real User Monitoring, May 2024

As mentioned before, multiple scripts can contribute to overall slow INP. The INP Scripts section shows you the scripts that were run during the INP interaction:

Screenshot of the DebugBear INP script breakdown, May 2024

You can review each of these scripts in more detail to understand why they run and what’s causing them to take longer to finish.

7. Use The DevTools Profiler For More Information

Real user monitoring tools have access to a lot of data, but for performance and security reasons they can access nowhere near all the available data. That’s why it’s a good idea to also use Chrome DevTools to measure your page performance.

To debug INP in DevTools you can measure how the browser processes one of the slow interactions you’ve identified before. DevTools then shows you exactly how the browser is spending its time handling the interaction.

Screenshot of a performance profile in Chrome DevTools, May 2024

How You Might Resolve This Issue

In this example, you or your development team could resolve this issue by:

  • Working with the third-party script provider to optimize their script.
  • Removing the script if it is not essential to the website, or finding an alternative provider.
  • Adjusting how your own code interacts with the script.

How To Investigate High Input Delay

In the previous example most of the INP time was spent running code in response to the interaction. But often the browser is already busy running other code when a user interaction happens. When investigating the INP components you’ll then see a high input delay value.

This can happen for various reasons, for example:

  • The user interacted with the website while it was still loading.
  • A scheduled task is running on the page, for example an ongoing animation.
  • The page is loading and rendering new content.

To understand what’s happening, you can review the invoker name and the INP scripts section of individual user experiences.

Screenshot of the INP Component breakdown within DebugBear, May 2024

In this screenshot, you can see that a timer is running code that coincides with the start of a user interaction.

The script can be opened to reveal the exact code that is run:

Screenshot of INP script details in DebugBear, May 2024

The source code shown in the previous screenshot comes from a third-party user tracking script that is running on the page.

At this stage, you and your development team can continue with the INP workflow presented earlier in this article. For example, debugging with browser DevTools or contacting the third-party provider for support.

How To Investigate High Presentation Delay

Presentation delay tends to be more difficult to debug than input delay or processing time. Often it’s caused by browser behavior rather than a specific script. But as before, you still start by identifying a specific page and a specific interaction.

You can see an example interaction with high presentation delay here:

Screenshot of an interaction with high presentation delay, May 2024

You see that this happens when the user enters text into a form field. In this example, many visitors pasted large amounts of text that the browser had to process.

Here the fix was to delay the processing, show a “Waiting…” message to the user, and then complete the processing later on. You can see how the INP score improves from May 3:

Screenshot of an Interaction to Next Paint timeline in DebugBear, May 2024

Get The Data You Need To Improve Interaction To Next Paint

Setting up real user monitoring helps you understand how users experience your website and what you can do to improve it. Try DebugBear now by signing up for a free 14-day trial.

Screenshot of the DebugBear Core Web Vitals dashboard, May 2024

Google’s CrUX data is aggregated over a 28-day period, which means that it’ll take a while before you notice a regression. With real-user monitoring you can see the impact of website changes right away and get alerted automatically when there’s a big change.

DebugBear monitors lab data, CrUX data, and real user data. That way you have all the data you need to optimize your Core Web Vitals in one place.

This article has been sponsored by DebugBear, and the views presented herein represent the sponsor’s perspective.

Ready to start optimizing your website? Sign up for DebugBear and get the data you need to deliver great user experiences.


Image Credits

Featured Image: Image by Redesign.co. Used with permission.

Google Launches New ‘Saved Comparisons’ Feature For Analytics via @sejournal, @MattGSouthern

Google announced a new tool for Analytics to streamline data comparisons.

The ‘saved comparisons’ feature allows you to save filtered user data segments for rapid side-by-side analysis.

Google states in an announcement:

“We’re launching saved comparisons to help you save time when comparing the user bases you care about.

Learn how you can do that without recreating the comparison every time!”

Google links to a help page that lists several benefits and use cases:

“Comparisons let you evaluate subsets of your data side by side. For example, you could compare data generated by Android devices to data generated by iOS devices.”

“In Google Analytics 4, comparisons take the place of segments in Universal Analytics.”

Saved Comparisons: How They Work

The new comparisons tool allows you to create customized filtered views of Google Analytics data based on dimensions like platform, country, traffic source, and custom audiences.

These dimensions can incorporate multiple conditions combined with logical operators.

For example, you could generate a comparison separating “Android OR iOS” traffic from web traffic. Or you could combine location data like “Country = Argentina OR Japan” with platform filters.
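To make the filtering logic concrete, here is a small, hypothetical model of how such a comparison could be evaluated. This is an illustration, not the GA4 interface or API: values listed within one dimension condition are combined with OR, while separate dimension conditions are combined with AND.

```typescript
// Illustrative model of a saved comparison (not the actual GA4 API).
type Row = { platform: string; country: string };

// One condition per dimension; the listed values are ORed together.
type Condition = { dimension: keyof Row; values: string[] };

// Conditions on different dimensions are ANDed.
function matchesComparison(row: Row, conditions: Condition[]): boolean {
  return conditions.every((c) => c.values.includes(row[c.dimension]));
}

const rows: Row[] = [
  { platform: "Android", country: "Argentina" },
  { platform: "iOS", country: "Japan" },
  { platform: "Web", country: "Argentina" },
];

// "Platform = Android OR iOS" AND "Country = Argentina OR Japan"
const comparison: Condition[] = [
  { dimension: "platform", values: ["Android", "iOS"] },
  { dimension: "country", values: ["Argentina", "Japan"] },
];

const matched = rows.filter((r) => matchesComparison(r, comparison));
// matched keeps the Android/Argentina and iOS/Japan rows; the Web row is excluded
```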

These customized comparison views can then be saved to the property level in Analytics.

Users with access can quickly apply saved comparisons to any report for efficient analysis without rebuilding filters.

Google’s documentation states:

“As an administrator or editor…you can save comparisons to your Google Analytics 4 property. Saved comparisons enable you and others with access to compare the user bases you care about without needing to recreate the comparisons each time.”

Rollout & Limitations

The saved comparisons feature is rolling out gradually. There’s a limit of 200 saved comparisons per property.

For more advanced filtering needs, such as sequences of user events, Google recommends creating a custom audience first and saving a comparison based on that audience definition.

Some reports may be incompatible if they don’t include the filtered dimensions used in a saved comparison. In that case, the documentation suggests choosing different dimensions or conditions for that report type.

Why SEJ Cares

The ability to create and apply saved comparisons addresses a time-consuming aspect of analytics work.

Analysts must view data through different lenses, segmenting by device, location, traffic source, etc. Manually recreating these filtered comparisons for each report can slow down production.

Any innovation streamlining common tasks is welcome in an arena where data teams are strapped for time.

How This Can Help You

Saved comparisons mean less time getting bogged down in filter recreation and more time for impactful analysis.

Here are a few key ways this could benefit your work:

  • Save time by avoiding constant recreation of filters for common comparisons (e.g. mobile vs desktop, traffic sources, geo locations).
  • Share saved comparisons with colleagues for consistent analysis views.
  • Switch between comprehensive views and isolated comparisons with a single click.
  • Break down conversions, engagement, audience origins, and more by your saved user segments.
  • Use thoughtfully combined conditions to surface targeted segments (e.g. paid traffic for a certain product/location).

The new saved comparisons in Google Analytics may seem like an incremental change. However, simplifying workflows and reducing time spent on mundane tasks can boost productivity in a big way.


Featured Image: wan wei/Shutterstock