Google AI Overviews: New Research Offers Insights via @sejournal, @martinibuster

New research by BrightEdge offers a snapshot of the kinds of queries that tend to show Google AI Overviews (AIO) and provides insights into the kinds of queries and verticals where AIO are more prevalent.

The findings show dramatic differences in the amount of AI Overviews shown across different verticals, reflecting the kinds of queries that are common in each. The effect works in reverse as well: some verticals see far fewer AIO search features.

Is This A Paradigm Shift?

While BrightEdge calls it the greatest paradigm shift in decades, I think that understates earlier shifts to Google Search, not just the ones in 2024. Something that’s not widely understood is that Google Search has been an AI search engine since at least 2015, with the introduction of RankBrain and subsequent changes to the backend side of search.

The big change in Search this year is that AI is more obvious on the front-end as a Feature in Search, largely replacing the role that Featured Snippets once played. Perhaps more importantly there may have been an infrastructure change at the beginning of 2024.

BrightEdge Generative Parser

BrightEdge has a technology, called the Generative Parser, which tracks and analyzes patterns in Google’s AI search features. BrightEdge used their Generative Parser to produce research findings about Google’s new AI Overviews (AIO) search feature.

Albert Gouyet, VP of Operations at BrightEdge said this about the BrightEdge Generative Parser:

“It’s fascinating to see the BrightEdge Generative Parser™ giving marketers a front-row seat into how AI in search is developing and giving the community a glimpse into the future. For marketers who rely on organic traffic, early indications suggest that AI will help reach new customers and present new opportunities to create content that serves multiple needs and elevates brand performance.”

What Triggers AIO

BrightEdge’s report indicates that Featured Snippets and questions were likely to trigger the AIO feature. Featured Snippets are answers to questions that are created with direct quotes from websites. BrightEdge found that AI Overviews were more likely to appear when there was also a Featured Snippet.

What Doesn’t Trigger AI Overviews

The research showed that local search queries were the least likely to trigger an AI Overview search result. That makes sense because a user is looking for a structured search result (business names, addresses, phone numbers), information that can’t be usefully summarized.

Similarly, search queries that generate sitelinks were also less likely to trigger AIO. Sitelinks are search results related to branded searches which feature multiple links to inner pages of a website. For example, searching for the name of a clothing store can generate a search result that features inner pages for women’s clothes, men’s clothes, etc. This also makes sense because it’s the kind of search query that is best answered with direct data and not a summary.

Verticals Most Likely To Contain AIO

Search results that tended to feature AI Overviews differed wildly when compared by vertical (a vertical being a specific industry or topic). This likely doesn’t mean that Google was targeting specific verticals for showing more AIO. Search features are always tied to the helpfulness of those features, which is tested with the Search Quality Raters, workers who try out new kinds of search results and rate them for helpfulness and other criteria.

Search queries related to healthcare tended to generate AI Overviews 63% of the time. That makes sense for search queries that are information-seeking.

B2B technology queries tended to generate AIO results 32% of the time while Ecommerce search queries triggered AI Overviews 23% of the time.

Interestingly, restaurants and travel related queries did not tend to trigger AIO results.

AIO Shown Less Often Than SGE

Another interesting data point is that AIO is triggered 20% less often than Search Generative Experience (SGE) answers were.

BrightEdge offered three insights related to why AIO is shown less than the experimental SGE was.

  1. “This indicates that AI is getting more precise when generating helpful experiences.
  2. This is likely because AI now caters better to people’s needs, such as looking for summaries, recommendations, or conversational experiences.
  3. Ultimately, Google is getting better at selecting answers.”

BrightEdge’s research pointed out that Google is improving its ability to anticipate follow-up questions by providing AI search summaries that more completely answer a question.

They write:

“Since Google I/O, the overlap between citations in AI and traditional results has diminished. Google is ensuring users do not get the same results in the two types of different results. It is also now delivering on its promise to do the second, third, and fourth search for you. AI is beginning to anticipate the following question and give options before a user even asks. This often happens with ‘what,’ ‘where,’ and ‘how’ intent-based queries.”

Early Days Of AIO

Google has received overwhelmingly negative reviews from users and the news media about the quality of Google’s AI Overviews, which in turn can lead to trust issues. BrightEdge’s report can be considered a snapshot of Google AIO today and I’m certain BrightEdge will be back with new data in the future when Google’s (AI) SERPs eventually change again.

Featured image by Shutterstock/Marco Lazzarini

2,596: How To Make The Most Out Of Google’s Leaked Ranking Factors via @sejournal, @Kevin_Indig

Over the last week, I observed many arguments against digging deep into the 2,596 pages.

But the only question we should ask ourselves is, “How can I test and learn as much as possible from these documents?”

SEO is an applied science where theory is not the end goal but the basis for experiments.

Image Credit: Lyna ™

Boost your skills with Growth Memo’s weekly expert insights. Subscribe for free!

14,000 Test Ideas

You couldn’t ask for a better breeding ground for test ideas. But we cannot test every factor the same way. They have different types (number/integer: range, Boolean: yes/no, string: word/list) and reaction times (meaning the speed at which they lead to a change in organic rank).

As a result, we can A/B test fast and active factors while we have to before/after test slow and passive ones.

A 2x2 grid with axes labeled Prioritize tests by speed. (Image Credit: Kevin Indig)

Test ranking factors systematically by:

  1. Selecting a ranking factor.
  2. Selecting the impacted (success) metric.
  3. Defining where you test.
  4. Defining the type of test.

Flowchart detailing four steps of testing ranking factors systematically. (Image Credit: Kevin Indig)

Ranking Factors

Most ranking factors in the leak are integers, meaning they work on a spectrum, but some Boolean factors are easy to test:

  • Image compression: Yes/No?
  • Intrusive interstitials: Yes/No?
  • Core Web Vitals: Yes/No?

Factors you can directly control:

  • UX (navigation, font size, line spacing, image quality).
  • Content (fresh, optimized titles, not duplicative, rich in relevant entities, focus on one user intent, high effort, crediting original sources, using canonical forms of a word instead of slang, high-quality UGC, expert author).
  • User engagement (high rate of task completion).

Demoting (negative) ranking factors:

  • Links from low-quality pages and domains.
  • Aggressive anchor text (unless you have an extremely strong link profile).
  • Poor navigation.
  • Poor user signals.

Factors you can only influence passively:

  • Title match and relevance between source and linked document.
  • Link clicks.
  • Links from new and trusted pages.
  • Domain authority.
  • Brand mentions.
  • Homepage PageRank.

Start with an assessment of your performance in the area you want to test in. A straightforward use case would be Core Web Vitals.

Metrics

Pick the right metric for the right factor based on the description in the leaked document or your understanding of how a factor might impact a metric:

  • Crawl rate.
  • Indexing (Yes/No).
  • Rank (for main keyword).
  • Click-through rate (CTR).
  • Engagement.
  • Keywords a page ranks for.
  • Organic clicks.
  • Impressions.
  • Rich snippets.

Where To Test

Find the right place to test:

  • If you’re skeptical, use a country-specific domain or a site where you can test with low risk. If you have a site in many languages, you can roll out changes based on the leaks in one country and compare relative performance against your core country.
  • You can limit tests to one page type or subdirectory to isolate the impact as much as possible.
  • Limit tests to pages addressing a specific type of keyword (e.g., “Best X”) or user intent (e.g., ”Read reviews”).

Some ranking factors are sitewide signals, like site authority, and others are page-specific, like click-through rates.

Considerations

Ranking factors can work with or against each other since they’re part of an equation.

Humans are notoriously bad at intuitively understanding functions with many variables, which means we most likely underestimate how much goes into achieving a high rank score, but also how a few variables can significantly impact the outcome.

The high complexity of the relationship between ranking factors shouldn’t keep us from experimenting.

Aggregators can test more easily than Integrators because they have more comparable pages, which lead to more significant outcomes. Integrators, which have to create content themselves, have differences between every page that dilute test results.

My favorite test: One of the best things you can do for your understanding of SEO is scoring ranking factors by your own perception and then systematically challenge and test your assumptions. Create a spreadsheet with each ranking factor, give it a number between zero and one based on your idea of its importance, and multiply all factors.
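The scoring exercise above can be sketched in a few lines of Python. The factor names and weights here are my own hypothetical assumptions for illustration, not values from the leak:

```python
from math import prod

# Hypothetical perceived-importance weights (0 to 1) for a handful of
# ranking factors -- illustrative assumptions only, not leaked values.
perceived_importance = {
    "title_relevance": 0.9,
    "link_quality": 0.8,
    "user_engagement": 0.7,
    "content_freshness": 0.6,
    "core_web_vitals": 0.5,
}

# Multiplying the weights shows how quickly a composite score collapses
# when several factors are weak.
composite = prod(perceived_importance.values())
print(f"Composite score: {composite:.4f}")  # 0.1512

# Sensitivity check: which single factor moves the score the most?
for factor in perceived_importance:
    without = prod(v for k, v in perceived_importance.items() if k != factor)
    print(f"Without {factor}: {without:.4f}")
```

Even with generous weights, the product drops fast, which is the point of the exercise: it forces you to confront how many assumptions your mental ranking model contains.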

Monitoring Systems

Testing only gives us an initial answer to the importance of ranking factors. Monitoring allows us to measure relationships over time and come to more robust conclusions.

The idea is to track metrics that reflect ranking factors (e.g., CTR could reflect title optimization) and chart them over time to see whether optimization bears fruit. The idea is no different from regular (or what should be regular) monitoring, except for the new metrics.

You can build monitoring systems in:

  • Looker.
  • Amplitude.
  • Mixpanel.
  • Tableau.
  • Domo.
  • Geckoboard.
  • GoodData.
  • Power BI.

The tool is not as important as the right metrics and URL path.

Example Metrics

Track metrics by page type or a set of URLs over time to measure the impact of optimizations.

Note: I’m using thresholds based on my personal experience that you should challenge.

User Engagement:

  • Average number of clicks on navigation.
  • Average scroll depth.
  • CTR (SERP to site).

Backlink Quality:

  • % of links with high topic-fit/title-fit between source and target.
  • % of links of pages that are younger than 1 year.
  • % of links from pages that rank for at least one keyword in the top 10.

Page Quality:

  • Average dwell time (compared between pages of the same type).
  • % users who spend at least 30 seconds on the site.
  • % of pages that rank in the top 3 for their target keyword.

Site Quality:

  • % of pages that drive organic traffic.
  • % of zero-click URLs over the last 90 days.
  • Ratio between indexed and non-indexed pages.

It’s ironic that the leak happened shortly after Google started showing AI for results (AI Overviews) because we can use AI to find SEO gaps based on the leak.

One example is title matching between source and target for backlinks. With common SEO tools, we can pull titles, anchor text, and surrounding content of the link for referring and target pages.

We can then rate the topical proximity or token overlap with common AI tools, Google Sheets/Excel integrations, or local LLMs and basic prompts like “Rate the topical proximity of the title (column B) compared to the anchor (column C) on a scale of 1 to 10 with 10 being exactly the same and 1 having no relationship at all.”
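If you’d rather not call an LLM, a simple token-overlap baseline can approximate the same rating. This is a sketch under my own assumptions (Jaccard similarity of lowercase tokens, scaled to a 1-10 score), not a method from the leak or from any SEO tool:

```python
def token_overlap_score(title: str, anchor: str) -> int:
    """Rate topical proximity on a 1-10 scale via Jaccard token overlap."""
    a, b = set(title.lower().split()), set(anchor.lower().split())
    if not a or not b:
        return 1  # no tokens to compare -> lowest score
    jaccard = len(a & b) / len(a | b)
    return max(1, round(jaccard * 10))

# Example rows (made-up titles and anchors):
print(token_overlap_score("Best Running Shoes 2024", "best running shoes"))    # 8
print(token_overlap_score("Best Running Shoes 2024", "cheap flights to Paris"))  # 1
```

A crude baseline like this is useful for triage: run it over the full backlink export first, then send only the ambiguous middle scores to an LLM for a more nuanced rating.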

A spreadsheet displaying SEO page titles, anchors, AI ratings, and explanations. Using AI to rate title-match between link sources and targets. (Image Credit: Kevin Indig)

A Leak Of Their Own

Google’s ranking factor leak isn’t the first time the inner workings of a big platform’s algorithm became available to the public:

1. In January 2023, a Yandex leak revealed many ranking factors that we also found in the latest Google leak. The underwhelming reaction surprised me just as much back then as today.

2. In March 2023, Twitter published most parts of its algorithm. Similar to the Google leak, it lacked “context” between the factors, but it was insightful nonetheless.

Twitter’s algorithm in a system chart. (Image Credit: Kevin Indig)

3. Also in March 2023, Instagram’s chief Adam Mosseri published an in-depth follow-up post on how the platform ranks content in different parts of its product.

Despite the leaks, there are no known cases of a user or brand hacking the platform in a clean, ethical way.

The more a platform rewards engagement in its algorithm, the harder it is to game. And yet, the Google algorithm leak is quite interesting because it’s an intent-driven platform where users indicate their interest through searches instead of behavior.

As a result, knowing the ingredients for the cake is a big step forward, even without knowing how much of each to use.

I cannot understand why Google has been so secretive about ranking factors all along. I’m not saying it should have published them to the degree the leak did. But it could have incentivized a better web with fast, easy-to-navigate, good-looking, informative sites.

Instead, it left people guessing too much, which led to a lot of poor content, which led to algorithm updates that cost many businesses a lot of money.




Featured Image: Paulo Bobita/Search Engine Journal

Google Ads: A Quick Guide To Every AI-Powered Ad Creative Feature (And What’s Coming Soon) via @sejournal, @adsliaison

I know, we can’t go two minutes without hearing “AI” – much like in the early days (years) of “mobile” and “social media.”

In these early days of the generative AI era, creatives is an area where we’re seeing rapid development for advertisers.

We saw this at Google Marketing Live this year, too, with new AI creative capabilities announced for Search, Shopping, Performance Max, and YouTube ads.

If you’re feeling overwhelmed, I get it. With so many changes and new applications of AI in ad creatives, it can be hard to keep up with what’s available, how these features are designed to work – and how they’re evolving.

Want to maintain control of your ad creatives? I get that, too.

In this primer, we’ll look at the ways these features can support your unique creative strategy.

It’s important to remember the overarching goals of all of these features are to:

  • Help generate ideas and scale relevant ad creatives to reach more customers.
  • Save you time and resources.

What they are not designed to be are:

  • Set-it-and-forget-it campaign tools.
  • Replacements for your own creative inputs and insights.

Marketers need to bring their expertise to ensure the creatives that are generated are accurate and represent their brand.

This is also why these features are grounded in your own inputs, such as landing pages, manual assets, and keywords.

Here, we’ll break down each of the creatives features that use AI in Google Ads today, look at what’s coming soon, and provide an overview of how these tools can play a role in your creative efforts to drive performance.

Automatically Created Assets

First up, automatically created assets. When the automatically created assets (ACA) setting is enabled, it will create headlines and descriptions for your responsive search ads (RSAs) and Performance Max asset groups.

The aim is to augment your manual assets to provide incremental conversion opportunities by creating more relevant ads for more queries.

When enabled, RSAs assemble ad combinations that are predicted to perform best using both the assets you’ve provided and ACAs.

As of February, automatically created assets can use generative AI for advertisers with English language assets. We plan to expand to more languages.

Applicable campaign types: Search, Performance Max.

What it generates: Text assets for responsive search ads and Performance Max asset groups.

Setting location: Campaign level settings.

When it generates assets: After the campaign is live.

What it sources from: Automatically created assets are generated based on your ad’s unique context, which includes your landing page, existing ads, and keywords in your ad group.

Reporting: In the ad level asset report, these assets are labeled “Automatically created” in the “Asset source” column. Combination reports also include automatically created assets.

Controls:

  • Opt in or out at the campaign level.
  • Review and remove assets you no longer want to serve from the asset details report. The system removes any automatically created assets with low performance or if the asset source changes, well, automatically.

What else you need to know:

  • Website content is used to generate ACA, so you need to be sure your website content is accurate, up-to-date, not misleading and in compliance with Google Ads policies and any applicable laws when automatically created assets are enabled in your campaigns.
  • Automatically created assets should augment – not replace – your own headlines and descriptions. They can show alongside your manual headlines and descriptions and also don’t count toward asset limits.
  • Draft and Experiments are compatible with ACA in Search campaigns. You will see automatically created assets on both the test and control arm. However, the control side will serve zero automatically created assets impressions.
  • Ad Strength ratings also now reflect automatically created assets in responsive search ads and in Performance Max campaigns.
  • Learn more about automatically created assets for RSAs here and Performance Max here.

Dynamic Assets

There are a number of automated assets (formerly known as extensions) options offered at the account level, such as seller ratings, automated locations, and longer headlines.

Here, we’re going to focus on the five dynamic asset types available:

  • Dynamic Image assets.
  • Dynamic Sitelinks.
  • Dynamic Structured snippets.
  • Dynamic Callouts.

Applicable campaign types: Most are eligible to show with all ad types across multiple campaigns.

Setting location: Campaigns > Assets > three-dot menu icon > Account level automated assets settings tab.

When it generates assets: When an asset is predicted to improve your ad’s performance, Google Ads automatically creates and displays it below your ad.

What it sources from: Dynamic assets are sourced from content and images on your landing page. Dynamic assets don’t currently use generative AI.

Reporting: Available from the Assets tab.

Controls: You may remove individual dynamic assets you no longer want to show from the Assets tab. Specific dynamic assets can be turned off entirely at the account level (see settings location above).

What else you need to know:

  • Dynamic assets should be used in addition to, not as replacements for, your manual assets.
  • As of March, dynamic assets can show alongside your manual assets. For example, if you only have two eligible manual sitelinks, your ad can show two additional dynamic sitelinks for four sitelinks in total.
  • Review account-level automated assets, including dynamic assets, and their performance from the Assets tab.
  • Learn more about account-level automated assets here.

Asset Generation In Performance Max

This feature is available when creating or editing your Performance Max campaigns. You can generate custom assets using text-to-text and text-to-image prompts.

Applicable campaign types: Performance Max. This was launched globally in May for customers whose Google Ads language is set to English.

What it generates: Text and image assets.

Setting location: Optional during campaign creation and asset group editing workflows.

When it generates assets: This option is available when creating new campaigns and asset groups and when editing existing asset groups.

Start by entering your final URL (a webpage or a social media page) in the “Asset generation panel,” and it will generate a summary of the business, including products or services being advertised in the asset group and unique selling points.

You then choose to “generate assets” for review and editing before publishing.

What it sources from: Gemini models are used to generate text assets based on your website.

For images, it will show assets curated from your website and social channels, our stock image inventory, and those generated via AI. You can also generate unique images using text prompts.

Autogenerated videos are created from the other assets in the asset group.

With image references, announced at GML this year, soon you’ll be able to describe the type of image you want in a text prompt and then upload a reference image. It will generate original images that match the reference’s theme, style, and context. (English only to start.)

Reporting: The assets you approve and publish will be reflected in your asset reports.

Controls: During campaign and asset group construction, you can review, edit, and remove any assets prior to launching your campaign.

What else you need to know:

  • Asset generation may not be available if your site is related to a sensitive category, is not in English, and/or recently launched online.
  • There are also some content restrictions when generating images. For example, the tools will avoid generating content with individual people or products with brand names and logos. Often, this means you’ll receive no results for requests like this, but sometimes, you might see an irrelevant response for a prompt that requests people or brands. More on generated images here.
  • Assets created by generative AI go through an Ads Policy review and may not be approved.
  • You should review all assets for accuracy and compliance with Google advertising policies or applicable laws before publishing them.
  • Learn more about building asset groups with generative AI here.

Conversational Experience For Search Ads

Introduced in November, this is the chat-based feature powered by large language models to help when creating new Search campaigns. It’s somewhat similar to asset generation in PMax.

Applicable campaign types: Eligible Search campaigns where the language is set to English.

What it generates: Keywords, headlines and descriptions, images, sitelinks.

Setting location: You’ll see the option when you reach the “Keywords and ad” step during Search campaign construction, if eligible.

When it generates assets: During campaign construction.

What it sources from: Your text prompts and landing page. Image suggestions are either sourced from your landing page or generated by Google AI.

Reporting: The keywords and assets you approve will be reflected in their respective reports.

Controls: You’re able to review and revise suggested assets created in the conversational experience to ensure they are accurate and reflect your brand voice. You can also enter new prompts.

What else you need to know:

  • Not available for domains in sensitive verticals or sub-verticals (for example, sexual content and gambling).
  • Chat in natural language and rephrase your prompt if you aren’t satisfied with the response.
  • The more unique details you use to guide and inform the AI, the more likely the generated ad assets will be unique and set your ads apart.
  • It won’t always get it right. Be sure to carefully review all suggestions for accuracy before publishing.
  • Learn more about using the conversational experience here.

Coming Soon … AI Creatives News From Google Marketing Live

Brand Guidelines For Performance Max

With brand guidelines for PMax creative assets, you’ll soon be able to set your brand colors, either by providing them directly or confirming the colors we detect from your website.

We’ll also aim to closely match the brand font detected on your website.

These guidelines will then be applied to auto-generated videos and responsive display ads, which are also getting a much-needed, modern facelift.

You will be able to review and edit your brand guidelines at any time. Brand guidelines is now in beta, and will be rolling out generally in the coming months.

AI Image Editor In The Asset Library And In Performance Max

Retailers will soon be able to integrate products from their Google Merchant Feed into the image editor in Google Ads (in the U.S. to start).

Editing tools now include cropping, background removal, adjusting color, and removing or restoring aspect ratio. You’ll also be able to create more variations of your assets.

You’ll find the image editor when you create a new image from the Asset library in your Shared library under Tools.

Animated Image Ads For Shorts

Animated image ads in Shorts will be built automatically from the images in your account for a more Shorts-native feel.

For example, when you add your feed to Demand Gen, it will be able to pull product images together and animate them into one ad.

Shopping Ads: 3D Spin & Try on

With 3D Spin, you’ll be able to show 360-degree views of your products in Shopping ads. Shoes will be our first product available with 3D Spin.

You’ll need to have high-resolution images from at least five angles for the 3D view to be assembled. These can show in both Shopping ads and free listings. More details here.

Already available for apparel in free listings, try-on ads are launching with tops. Users can choose to see apparel in this category on models of all different sizes right from the ad.

You’ll need high-resolution images that are at least 512 x 512 pixels, but ideally 1024 pixels or higher. The image should feature one garment on one front-facing model or mannequin in a simple pose. More details here.

Ok, But Are “AI Ads” All Going To Look The Same?

If everyone uses AI, will ads all look the same? It’s a good question.

As we’ve covered, most of these tools use your own unique assets as their primary source to help steer Google AI.

That means, the more unique your landing pages, existing assets, and prompts are, the more that generated assets will uniquely reflect your business.

Hopefully, this primer will give you a clearer picture of what’s available, where these tools appear in your account – and where we’re headed with ad creatives capabilities – to help spark your creativity and ultimately improve campaign performance.

More resources: 


Featured Image: BestForBest/Shutterstock

How to Change User Agents in Chrome, Edge, Safari & Firefox via @sejournal, @vahandev

Whether you are an SEO pro, marketer, or web developer, you might often need to change your browser’s user-agent to test different things.

For example, imagine you’re running a macOS-specific campaign. To confirm your campaign is running properly and not targeting Linux users, changing your browser’s user-agent can help you test.

Changing user-agents is almost a daily task for web developers, as they need to test how websites behave in different browsers and devices.

What Is A User-Agent?

A user-agent is an HTTP request header string identifying browsers, applications, or operating systems that connect to the server.

Browsers have user-agents, and so do bots and crawlers, such as Googlebot and the Google AdSense crawler.

What Is a User-AgentScreenshot by author, May 2024

Here, we will learn how to change your browser’s user-agent.

The process is called user-agent spoofing.

Spoofing occurs when a browser or client sends a different user-agent HTTP header from what it is and fakes it.

While the term may be alarming, this is not a dangerous activity and will not cause you any problems. (Feel free to spoof your user-agent as much as you want.)

How To Change Your User-Agent On Chrome & Edge

Since Microsoft Edge is now using Chromium, the settings for both Chrome and Edge are the same.

1. Right Click Anywhere On Webpage > Inspect

Alternatively, you can use Ctrl + Shift + I on Windows and Cmd + Opt + J on Mac.

Chrome Screenshot by author, May 2024

2. Choose More Tools > Network Conditions


Click on the three vertical dots in the upper right corner.

Chrome settingsScreenshot by author, May 2024

3. Uncheck Select Automatically Checkbox

Chrome instructionScreenshot by author, May 2024

4. Choose One Among The Built-In User-Agents List

Chrome settings for user agentScreenshot by author, May 2024

If the user-agent you want doesn’t exist, you can enter any string you want in the field below the list.

For example, you can enter the following (Googlebot’s user-agent) into the custom field:

Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

This may be useful for SEO professionals to identify whether there is cloaking on a website, where the webpage shows specific content to Googlebot and different content to website visitors.
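That cloaking check can also be scripted. Below is a hedged sketch that fetches a page twice, once with the Googlebot user-agent and once with a browser user-agent, and reports how different the responses are. The URL is a placeholder, and legitimate differences (ads, timestamps, personalization) are common, so treat a large difference as a signal to investigate, not proof of cloaking:

```python
import difflib
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"

def fetch(url: str, user_agent: str) -> str:
    """Fetch a URL with a spoofed User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def difference_ratio(a: str, b: str) -> float:
    """0.0 = identical responses, 1.0 = completely different."""
    return 1 - difflib.SequenceMatcher(None, a, b).ratio()

# Usage (placeholder URL -- substitute the page you want to audit):
# bot_html = fetch("https://example.com/", GOOGLEBOT_UA)
# browser_html = fetch("https://example.com/", BROWSER_UA)
# print(f"Difference ratio: {difference_ratio(bot_html, browser_html):.2f}")
```

Note that many sites verify Googlebot by reverse DNS rather than trusting the header, so a spoofed user-agent won’t always see exactly what the real crawler sees.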

Alternatively, you can use the Chrome extension User-Agent Switcher and Manager.

That said, I try not to use browser extensions when the browser can perform the action I want. This is to avoid overloading the browser with add-ons.

Also, extensions sometimes have a habit of breaking websites unexpectedly.

While you might think the website you visited has an issue, the root cause can be one of the add-ons you’re using.

How To Change User-Agent On Safari

1. Go To Preferences

Safari browser preferencesScreenshot by author, May 2024

2. Enable Develop Menu Bar

Go to Advanced and check Show Develop menu in menu bar.

Settings on MacScreenshot by author, May 2024

3. Navigate To Develop > User-Agent

Safari's develop menu. User agentScreenshot by author, May 2024

You can again select from a predefined list or enter a custom user-agent string by choosing “Other…”

Safari custom user agent setting

How To Change User-Agent In Firefox

In Firefox, it is possible to change user-agents via the browser’s built-in settings.

However, the built-in feature is not as user-friendly as in Chrome or Safari, and it is a real pain to use.

Instead, we will use a Firefox add-on called User-Agent Switcher.

After installing the add-on, you will see an icon in the upper right corner.

You can select one of the predefined user-agents or enter a custom user-agent by clicking on the pen icon below.

Firefox settingsScreenshot by author, May 2024

Another extension you can use is User-Agent Switcher and Manager.

User-Agents Are Easy To Spoof

User-agents are easy to spoof, and anyone can use these simple tricks to alter them.

This feature is useful for testing web apps against various devices, especially when the HTML is different for mobile or tablet devices.

It is a cost-efficient way to test websites, as you don’t need many physical devices.

However, certain issues may appear on the real device but not when testing by changing the user agent and using a browser emulator.

In that case, if you want to test on multiple real devices, I suggest using Browserstack, which offers testing opportunities on almost all devices.

FAQ

What is a user agent?

A user-agent is an HTTP request header string identifying the browser, application, or operating system that connects to the server. Not only browsers have user-agents; so do bots and search engine crawlers such as Googlebot, the Google AdSense crawler, etc.

What is user-agent spoofing?

Spoofing is when a browser or any other client sends a user-agent HTTP header that misrepresents what the client actually is.

How does changing the user-agent help SEO professionals?

SEO professionals may find changing the user-agent to be a critical part of their audit process. It is beneficial for several reasons:

  • Identifying cloaking issues: By mimicking different user-agents, such as Googlebot, SEO experts can uncover whether a website presents different content to search engines than users, which violates search engine guidelines.
  • Compatibility: It ensures web applications are compatible across various browsers and devices.
  • User Experience: Developers can optimize the user experience by understanding how content is rendered on different systems.
  • Debugging: Changing the user-agent can help pinpoint browser-specific issues.
  • Quality Assurance: It’s an essential step in quality assurance and helps maintain the integrity and performance of a website.

Can changing your browser’s user-agent pose a security risk?

No, changing your browser’s user-agent, commonly called user-agent spoofing, does not inherently pose a security risk. While the term “spoofing” might suggest malicious intent, this practice in the context of user-agents is harmless. It is a tool for developers and marketers to test how websites and applications interact with various devices and browsers.


Featured Image: /Shutterstock

All screenshots taken by author, May 2024

How To Increase Website Traffic After The Google Update: 5 Blogger-Tested Tips via @sejournal, @getStay22

This post was sponsored by Stay22. The opinions expressed in this article are the sponsor’s own.

The latest Google update hit our partners – travel bloggers – hard. Despite the challenges presented by this update, we are amazed by their resilience.

Although we cannot influence Google, we can definitely take charge of the things within our reach.

Here are five travel blogger-tested ways to increase website traffic after the latest Google update.

Get the most out of each visitor. Check out our guide, Ten Tips for Maximizing Revenue for Travel Bloggers.

1. Use Pinterest As An Additional Search Engine

Many people think of Pinterest as a social media platform, but it functions much more as a search engine.

Pinterest’s tools make it a fantastic way to get visitors to your website.

How To Get Started With Pinterest

  1. Create a new business account or link/convert a personal account to gain access to business tools like Pinterest Business Hub and Pinterest Analytics, which allow you to monitor searches and keywords.
  2. Write a catchy bio for your profile. If you have niche keywords you use on your website, include them here.
  3. Claim your website, allowing you to track whenever someone shares your website content on Pinterest. Additionally, it will automatically include your Pinterest profile and a follow button so you can attract more visitors.

How To Use Pinterest Like A Pro

Pinterest is built around pictures, so having great visuals is critical. You want to make yours stand out on the page.

  1. When you travel, snap many pictures and pick the best ones later.
  2. Use design tools like Canva to add text, create interesting layouts, and add branding elements.
  3. Create boards to organize your content into categories, such as by destination or blog post type.
  4. Just like any other search engine, keyword research is necessary on Pinterest. Use business tools like Pinterest Trends, which lets you see keyword search volume and which pins are most popular for them.
  5. Use your selected keywords everywhere possible, including pin titles, pin descriptions, and board descriptions.
Screenshot from Pinterest Trends, June 2024

Before you start pinning, optimize your blog posts to maximize revenue to ensure every visitor counts. See how to do that in our guide Ten Tips for Maximizing Revenue for Travel Bloggers.

2. Keep Visitors Engaged With Email Marketing

Email marketing is important because you own it. Instagram could ban your account and Google could derank you, but your email list is always yours.

Use this strategy to build consistent, less volatile traffic rather than relying solely on new traffic.

How To Get Started With Email Marketing

  1. Select an email marketing provider. Travel bloggers love MailerLite and MailChimp because they are low-cost and simple.
  2. Start collecting email addresses. Create a mailing list signup form with your website host and insert it at the end of posts to collect your readers’ information.

How To Use Email Marketing Like A Pro

  1. Create a lead magnet that entices readers to share their information in exchange for it. Popular lead magnets in the travel blogging community are travel guides, ebooks, and itineraries.
  2. The goal of email is to get your reader onto your site. Keep your emails short and include impressive pictures and a clear link to your website. Leave them wanting more so that they click through.
  3. Before sending an email directing traffic to one of your pages, ensure the page is set up to maximize revenue. See how in our guide, Ten Tips for Maximizing Revenue for Travel Bloggers.

“While SEO is often touted as the king of traffic, good email marketing is the absolute queen. We regularly see spikes in our web analytics when we email our audience, but those spikes aren’t just temporary. They help our users build habits. We simply wouldn’t have nearly the business we have today without activating our email list.”

Beth Stanos, Wanderful CEO & Founder

    3. Add Your Content To Flipboard To Grab More Website Visits

    Flipboard is an article curation website that lets users flip through articles from different publications, like magazines.

    How To Get Started With Flipboard

    1. Download the Flipboard app on mobile and create an account and profile.
    2. Create Flipboard magazines. These are topic categories that group your content (much like Pinterest boards).
    3. Add your blog posts to them from the magazine page on Flipboard or the Flipit Chrome bookmark.
    Screenshot taken from Flipboard, June 2024

    How To Use Flipboard Like A Pro

    1. Add your fresh blog posts to Flipboard right after publishing.
    2. Add relevant content to your magazines that isn’t your own. Engaging with other bloggers’ content and maintaining new content in your magazines helps your reach.
    3. Include your target keywords in your magazine descriptions, post captions, and profile.

    “Flipboard has become a major traffic source for all 3 of my websites. They have fantastic tools for creators, like storyboards, scheduling, and RSS feeds for magazines, making creating and sharing compelling content easy.”

    Amanda O’Brien, theboutiqueadventurer.com Founder

    4. Get Discovered With Short-Form Videos On Social Media

    Did you know 77% of travelers use social media when planning their trips? Get their eyes on your content with short-form videos on platforms like TikTok, Instagram, and YouTube Shorts.

    These platforms have algorithms to serve your content to people who don’t follow you, making them ideal for readers to discover you.

    How To Get Started Making Videos

    1. Create your accounts on YouTube, Instagram, and TikTok.
    2. Download Capcut, a free mobile app for video editing. It’s better than the in-app editing functionality, with pre-installed templates and effects.
    3. Create a Linktree account or an alternative. All three platforms make it difficult for users to leave and visit your site, so a place where users can go for all your links is necessary.
    “PLACES ON EARTH WITH THE BLUEST WATER” by @thetravelshack, June 2024

    How To Make Short Video Content Like A Pro

    1. While traveling, keep video creation in the back of your mind. When you see eye-catching scenery or an interesting environment, be ready to capture it in video. Your video’s “wow” factor differentiates between a mediocre and a fantastic video.
    2. On Instagram, write detailed captions. On YouTube and TikTok, keep captions as brief as possible.
    3. Voice-over, on-screen text, and music are crucial to your engagement rate. You can add these with Capcut or in-app.
    4. Pick your video thumbnails carefully. Make sure they stand out in a grid of other videos.
    5. Finally, keep your short video short. The ideal length for a Reel is seven to eleven seconds, a TikTok 21 to 34 seconds, and a YouTube Short 25 to 35 seconds.

    “TikTok helped us to get our second blog on Mediavine. We shared detailed videos about “how to do something” and always included a call to action, directing viewers to our blog for more information. In recent years, we’ve had videos go viral while encouraging followers to visit the link in our bio. This has resulted in thousands of clicks to our websites.”

    Natasha Alden, theworldpursuit.com Co-Founder

    5. Expose Yourself To New Audiences With Guest Posting

    Having your writing published on another website gives you access to a whole new audience, making it an excellent strategy for attracting new readers.

    How To Start Guest Posting

    1. Find places that accept guest posts. Many publications have open submissions that you can submit to.
    2. If you see a site that has guest posts but no open submissions, find their contact information and send an inquiry email.
    3. Create a pitch by explaining your travel expertise and provide writing samples.

    How To Guest Post Like A Pro

    1. When creating your pitch, understand the site’s needs. What content gaps do they have? Focus your pitch there.
    2. Organic guest posting opportunities arise if you take the time to network. Meeting and developing friendships with travel bloggers is a great way to share audiences. You can network on social media, as well as in-person events like TBEX, TravelCon, and Traverse.
    3. Create a clear link back to your blog and include a catchy description so readers will be enticed to read more of your content.
    4. Strategize by monitoring the traffic of potential sites with tools like SEMRush and pick the best option.
    Screenshot from japan.travel, June 2024

    “Creating content for another publication to share as a standalone article allows you to share new expertise about a certain area. Whether through the article itself or subsequent social shares, a new audience gets exposure to you as a writer.”

    William Tang, goingawesomeplaces.com Founder

    See how William Tang made travel blogging his full-time job with the help of Stay22’s revenue-boosting tools. Read the case study.

    Compensate For Traffic Dips By Maximizing Your Revenue

    Maximize your conversions with strategic CTA placement and Stay22’s cutting-edge AI-powered affiliate tools! Skyrocket your affiliate sales with these proven tips:

    DOWNLOAD OUR GUIDE: Ten Tips for Maximizing Revenue for Travel Bloggers.


    Image Credits

    Featured Image: Image by Shutterstock. Used with permission.

    Yes, Clicks and Engagement Impact SEO

    For years Google denied using clicks on organic listings as a ranking signal. We now know otherwise. Recently revealed documents show Google indeed uses click-related signals in its algorithm.

    Here’s an overview of those documents with my takeaways on the search engine optimization impact.

    Organic Clicks

    At Google’s federal antitrust trial last year, Google Search V.P. Pandu Nayak testified the search engine uses a ranking system called “Navboost” (also called “Glue”) that analyzes user interactions on search results (such as clicks, hovers, scrolls, and swipes) and keeps the data for 13 months.

    Navboost delivers results based on users’ interactions with related search result pages. The system also determines whether search results should include other features, such as videos, featured snippets, “People also ask,” and local packs.

    The same trial produced an internal Google PDF from 2016 naming “user interactions” (i.e., clicks) as one of the top three ranking factors.

    Many in the search engine community have long suspected that Google uses clicks in its ranking algorithm. The signal also appears highly limited, existing only for higher-ranking pages (which generate many clicks). Thus, other ranking signals, such as inbound links and on-page relevance and quality, remain fundamental for all but the very top listings.

    Chrome Browser Clicks

    Last month, documents leaked from Google Search’s internal engineering department. The contents revealed the importance of organic clicks on rankings. Moreover, the documents described Google’s monitoring of users’ Chrome browser interactions with websites. Google has denied that for years.

    So, no matter the traffic source, Google knows how visitors using Chrome interact with a website.

    ‘Long clicks’

    SEO pros have theorized for years that click data informs content quality. “Dwell time” and “pogo-sticking” are oft-discussed bounce-rate metrics.

    Thanks to the engineering leak, we know Google does have such a metric, calling it “long clicks.” The longer a ranking page holds users, the better.

    SEO Takeaways

    • Keep a close eye on your highest-traffic pages. Google has the most data about them. Make sure they are engaging and trigger meaningful interactions. Use heatmaps and other usability tools to understand visitors’ actions and optimize accordingly.
    • Focus on meaningful clicks, not just those from search results. Try to improve click-throughs from email newsletters and even ads; Google accumulates this data from Chrome. Aim link-building campaigns at sources that drive clicks.
    • Optimize page titles as they impact click-throughs on search result pages. Review top SERP listings and snippets regularly to improve clicks. Use structured data where possible to earn rich snippets.
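
On the structured-data point above: a minimal JSON-LD snippet of the kind that can make a listing eligible for rich results might look like this (all values are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Headline That Earns the Click",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2024-06-01",
  "image": "https://example.com/featured.jpg"
}
```
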

    5 Content Marketing Ideas for July 2024

    This July offers plenty of content topics thanks to Independence Day and the Summer Olympics in Paris. But even less apparent ideas — must-have items, humor, giveaways, junk food — can make for compelling content.

    Content marketing involves creating, publishing, and promoting articles, videos, and podcasts to attract, engage, and retain customers. It is crucial for search engine optimization, social media marketing, and more.

    Unfortunately, coming up with new content topics month in and month out can be a task. What follows are five ideas for July 2024.

    Summer Product Tutorials

    In many ways, July in the Northern Hemisphere is summer itself.

    Celebrations such as U.S. Independence Day on July 4, with picnics, parades, fireworks displays, and community events, make us all feel like summer has arrived.

    The weather calls us outdoors, making July an excellent time to publish summer product tutorials, showing shoppers how to use your business’s items to work or play in the sun.

    The Home Depot sells products to build raised garden beds, as explained in this tutorial.

    Consider The Home Depot. The company’s ecommerce site has a do-it-yourself project section, complete with video tutorials explaining a project and the products. Examples include the building of garden beds, playground equipment, and more.

    This concept could work for many products.

    Must-have Items

    Not every product needs a tutorial. Knowing how to use Velcro from The Home Depot in a garden project might be helpful, but instructions are not required for everyday items.

    Mr. Porter addresses why its sneakers are a must-have.

    Take shoes, for example. Content marketers could try a “must-have” article or video, such as the article from clothier Mr. Porter, “5 Must-Have Sneakers For Stepping Out In Style This Summer.”

    A must-have article puts an editorial spin on product recommendations. The Mr. Porter article starts with a brief introduction, making the case for summertime sneakers.

    “Summer dressing is all about fun and experimentation. It’s about trying new things and breaking out of the style comfort zone we’ve been stuck in through winter.”

    Each of the five recommendations includes vivid descriptions and photography, emphasizing the shoe’s design elements and versatility. That’s critical, given that the prices are upwards of $990 a pair.

    Must-have content is useful for organic search rankings, social media posts, and email newsletters.

    Here are hypothetical examples:

    • “10 Must-have Tools for the Perfect BBQ,”
    • “5 Must-have Hats for Summer Sun,”
    • “10 Must-have Dresses for Stylish Summer Soirees.”

    International Joke Day

    July 1, 2024, is International Joke Day. This pseudo-holiday reminds us to celebrate humor and laughter. It encourages folks to share jokes, puns, and funny stories with friends, family, and even strangers.

    Tell a few jokes on July 1.

    Jokes are prime, entertainment-focused content marketing. Consider assembling jokes endemic to your industry or products. Then publish them in a blog post, newsletters, and social media.

    Jokes are one exception to the general advice of avoiding AI-generated content. Here are five ChatGPT-produced jokes (with minor edits) for this article.

    • In our store, we sell emotions — mainly frustration and impatience.
    • I tried to buy a camouflage shirt online but couldn’t find it.
    • I ordered a chicken and an egg from Amazon. I’ll let you know which comes first.
    • I bought a belt online. It was a waist of money.
    • Why don’t online shoppers get lost? They follow their carts.

    National Give Something Away Day

    The goal of ecommerce content marketing is to sell products. But too much selling can be counterproductive. An exception could be National Give Something Away Day on July 15, 2024, when marketers can combine charity giving with a sales promotion.

    Use National Give Something Away Day for charitable giving.

    Here is the idea.

    • Identify a charity that makes sense for your business.
    • Connect with the charity for a sales-based contribution, such as 5% of revenue on July 15.
    • Interview folks at the charity about their work and impact.
    • Produce an article, video, or podcast of the interview.
    • Run ads and public-relations campaigns to promote the event.

    Junk Food Day

    Content can promote junk food or discourage it.

    National Junk Food Day on July 21, 2024, allows content marketers to praise the fun of junk food or the opposite, healthy eating.

    Content about healthy eating and lifestyles could include:

    • Power equipment retailer’s newsletter: “Healthy Lunchbox Tips for Construction Pros.”
    • Kitchen shop’s blog: “From Junk to Joy: Transform Meals with Healthy Ingredients.”
    • Apparel brand’s podcast: “Dress for Success: 10 Nutritious Tips for a Busy Lifestyle.”

    Conversely, content marketing could praise the temporary enjoyment of junk food. Here are more examples.

    • Power equipment retailer: “Break Time Bliss. 35 Junk Food Picks for Hardworking Builders.”
    • Kitchen shop’s blog: “15 Forbidden Meals for National Junk Food Day.”
    • Apparel brand’s podcast: “Stylish Snacks for National Junk Food Day.”

    MozCon 2024 Announcements: AI SEO Tools & Affordable API Plans via @sejournal, @MattGSouthern

    Today, at its MozCon 2024 conference, Moz announced a range of new AI-driven features and product updates.

    The announcements focused on modernizing tools to address the changing needs of SEO professionals.

    Ethan Hays, Moz’s General Manager, acknowledges that “the search landscape has seen more turbulence this last year than at almost any point in Moz’s 20-year history. Our goal is to help our customers face that head-on.”

    Modern Tools For The Future Of SEO

    Moz showcased a revamped version of STAT, its enterprise SERP tracking solution, and several usability improvements to the Moz Pro interface.

    The company introduced Moz AI, a suite of tools designed to streamline workflows and enhance enterprise-grade data.

    Additionally, Moz unveiled new affordable API plans, including Beta endpoints for keyword, intent, brand, and link metrics.

    Moz AI

    Moz AI is a collection of three features integrated into Moz Pro:

    • Search Intent in Keyword Suggestions
    • Domain Search Theme
    • Domain Keyword Topics in Domain Overview.

    These AI-driven capabilities allow Moz Pro customers to optimize their SEO strategies and make data-informed decisions with greater efficiency.

    Accessible & Affordable Data

    Moz announced the launch of new Moz API plans, making enterprise-quality data more accessible and affordable.

    Starting at $5 per month, customers can now access an expanded set of Beta endpoints, including:

    • Keyword Metrics
    • Keyword Suggestions
    • Search Intent
    • Ranking Keywords
    • Brand Authority.

    These plans offer a cost-effective solution for businesses looking to build custom tools, dashboards, SaaS products, and apps using Moz’s robust data.

    Hays emphasized Moz’s commitment to its customers, stating, “Our core commitment is unchanged: empowering SEO practitioners through software, education, and community.”

    MozCon 2024

    MozCon 2024, which is currently underway, will feature live demonstrations of these product enhancements. Attendees will be able to experience the new features firsthand and interact with Moz experts.

    Following Moz’s presentation, we will update this article with first-hand details straight from MozCon.

    In Summary

    As Moz celebrates its 20th anniversary, it remains dedicated to empowering SEO professionals with tools, education, and community support.

    Top SEO & Content Marketing Trends To Future-Proof Your Strategy via @sejournal, @hethr_campbell

    What are the key SEO and content marketing trends you should prioritize in 2024 and beyond?

    Join us and find out! 

    With search taking on a more human-first and user-centric approach, powered by AI intelligence and collaboration, now’s the time to adapt your strategy.  

    That’s why on June 18, we’re hosting a live webinar with Conductor about which SEO trends to focus on this year (and which to ignore) in order to drive sustainable growth and success. 

    Here are some key takeaways from the presentation:

    • Predictions on SGE’s Potential Impact on SEO: Understand how Google’s Search Generative Experience (SGE) could transform the SEO landscape and what it means for your strategy.
    • How to Adapt to Industry-Wide Disruptions: Learn best practices for navigating significant changes in search, while maintaining a competitive edge.
    • Deep Dive into E-E-A-T: Discover everything you need to know about Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T) and how to integrate these principles into your content.
    • Agile Approach to Algorithm Updates: Find out why flexibility and agility are crucial in responding to rapid and significant changes in Google’s algorithms.  
    • Staying Indexed in 2024: Learn why staying indexed is becoming a new challenge and how to ensure your content remains visible.
    • Trends to Ignore: Save time and resources by identifying which SEO trends aren’t worth your focus.

    In this webinar, Zack Kadish, Sr. SEO Strategy Director at Conductor, and Alex Carchietta, Sr. Customer Success Manager at Conductor, will provide the expert insights you need to thrive in this ever-evolving industry.

    Don’t miss this opportunity to stay ahead of the curve and ensure your SEO and content marketing strategies are primed for the future.

    Plus, if you join us live, Zack and Alex will answer any questions you have at the end of the presentation. 

    Can’t make it to the live event? Sign up anyway, and we’ll send you a recording of the webinar to watch at your convenience.

    How SEO Experts Can Utilize ChatGPT For BigQuery With Examples via @sejournal, @vahandev

    AI is reshaping every field by making skills that weren’t widely accessible in the past, such as coding or data visualization, available to everyone.

    An AI operator who can run the right prompts can perform low- and medium-level difficulty tasks, allowing more focus on strategic decision-making.

    In this guide, we will walk you through, step by step, how to use an AI chatbot (with ChatGPT as an example) to run complex BigQuery queries for your SEO reporting needs.

    We will review two examples, which will also give you an overall idea of how you can use chatbots to reduce the burden of running SEO reports.

    Why Do You Need To Learn BigQuery?

    SEO tools like Google Search Console or Google Analytics 4 have accessible user interfaces you can use to access data. But often, they limit what you can do and show incomplete data, which is usually called data sampling.

    In GSC, this happens because the tool omits anonymized queries and limits exports to 1,000 rows.

    Screenshot from Google Search Console, May 2024

    By using BigQuery, you can solve that problem and run any complex reports you want, eliminating the data sampling issue that occurs quite often when working with large websites.

    (Alternatively, you may try using Looker Studio, but the purpose of this article is to illustrate how you can operate ChatGPT for BigQuery.)

    For this article, we assume you have already connected your GSC and GA4 accounts to BigQuery. If you haven’t done it yet, you may want to check our guides on how to do it:

    SQL Basics

    If you know Structured Query Language (SQL), you may skip this section. But for those who don’t, here is a quick reference to SQL statements:

    Statement Description
    SELECT Retrieves data from tables
    INSERT Inserts new data into a table
    UNNEST Flattens an array into a set of rows
    UPDATE Updates existing data within a table
    DELETE Deletes data from a table
    CREATE Creates a new table or database
    ALTER Modifies an existing table
    DROP Deletes a table or a database

    The conditions we will be using so you can familiarize yourself:

    Condition Description
    WHERE Filters records for specific conditions
    AND Combines two or more conditions where all conditions must be true
    OR Combines two or more conditions where at least one condition must be true
    NOT Negates a condition
    LIKE Searches for a specified pattern in a column
    IN Checks if a value is within a set of values
    BETWEEN Selects values within a given range
    IS NULL Checks for null values
    IS NOT NULL Checks for non-null values
    EXISTS Checks if a subquery returns any records
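
To see several of these pieces working together, here is a small runnable sketch. It uses Python’s built-in sqlite3 as a stand-in for BigQuery (the SQL dialects differ slightly), with made-up rows mimicking the searchdata_url_impression table:

```python
import sqlite3

# A tiny in-memory stand-in for a GSC-style table (illustrative data,
# not real BigQuery; column names mirror searchdata_url_impression).
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE searchdata_url_impression "
    "(data_date TEXT, url TEXT, clicks INTEGER, impressions INTEGER)"
)
rows = [
    ("2024-05-10", "/page-a", 12, 300),
    ("2024-05-15", "/page-a", 8, 250),
    ("2024-05-12", "/page-b", 3, 90),
    ("2024-04-20", "/page-a", 20, 400),  # outside the date range below
]
con.executemany("INSERT INTO searchdata_url_impression VALUES (?, ?, ?, ?)", rows)

# SELECT + WHERE + BETWEEN + GROUP BY, the statements from the tables above.
query = """
SELECT url, SUM(clicks) AS clicks, SUM(impressions) AS impressions
FROM searchdata_url_impression
WHERE data_date BETWEEN '2024-05-08' AND '2024-05-20'
GROUP BY url
ORDER BY url
"""
for url, clicks, impressions in con.execute(query):
    print(url, clicks, impressions)
# → /page-a 20 550
# → /page-b 3 90
```

The April row is filtered out by BETWEEN, and GROUP BY rolls each URL’s metrics up, which is the same pattern the generated BigQuery reports rely on.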

    Now, let’s dive into examples of how you can use BigQuery via ChatGPT.

    1. How To Analyze Traffic Decline Because Of Google Algorithm Impact 

    If you have been affected by a Google algorithm update, the first thing you should do is run reports on affected pages and analyze why you have been impacted.

    Remember, the worst thing you can do is start changing something on the website right away in panic mode. This may cause fluctuations in search traffic and make analyzing the impact even harder.

    If you have few pages in the index, you may find GSC UI data satisfactory for analysis, but if you have tens of thousands of pages, the UI won’t let you export more than 1,000 rows (either pages or queries) of data.

    Say you have a week of data since the algorithm update has finished rolling out and want to compare it with the previous week’s data. To run that report in BigQuery, you may start with this simple prompt:

    Imagine you are a data analyst experienced in Google Analytics 4 (GA4), Google Search Console, SQL, and BigQuery.
    Your task is to generate an SQL query to compare 'WEB' Search Console data for the periods '2024-05-08' to '2024-05-20' and '2024-04-18' to '2024-04-30'. 
    Extract the total clicks, impressions, and average position for each URL for each period. 
    Additionally, calculate the differences in these metrics between the periods for each URL 
    (where average position should be calculated as the sum of positions divided by the sum of impressions).
    
    Details:
    
    BigQuery project name: use_your_bigquery_projectname
    Dataset name: searchconsole
    Table name: searchdata_url_impression
    Please provide the SQL query that meets these requirements.
    

    Once you get the SQL code, copy and paste it into the BigQuery SQL editor. Chances are the initial code will contain errors; for example, table column names may not match what is in your BigQuery dataset.

    Error in BigQuery SQL when the column name doesn’t match the dataset column.

    Things like this happen quite often when performing coding tasks via ChatGPT. Now, let’s dive into how you can quickly fix issues like this.

    Simply click on your dataset in the left-hand panel, select all columns on the right side, and click Copy as Table.

    How to select all columns of the table in BigQuery.

    Once you have it, just copy and paste it as a follow-up prompt and hit enter.

    Generate SQL again by taking into account that my table schema is as follows : 
    "fullname mode type description
    data_date DATE 
    site_url STRING 
    url STRING 
    query STRING 
    is_anonymized_query BOOLEAN 
    is_anonymized_discover BOOLEAN 
    country STRING 
    search_type STRING 
    device STRING 
    is_amp_top_stories BOOLEAN 
    is_amp_blue_link BOOLEAN 
    is_job_listing BOOLEAN 
    is_job_details BOOLEAN 
    is_tpf_qa BOOLEAN 
    is_tpf_faq BOOLEAN 
    is_tpf_howto BOOLEAN 
    is_weblite BOOLEAN 
    is_action BOOLEAN 
    is_events_listing BOOLEAN 
    is_events_details BOOLEAN 
    is_search_appearance_android_app BOOLEAN 
    is_amp_story BOOLEAN 
    is_amp_image_result BOOLEAN 
    is_video BOOLEAN 
    is_organic_shopping BOOLEAN 
    is_review_snippet BOOLEAN 
    is_special_announcement BOOLEAN 
    is_recipe_feature BOOLEAN 
    is_recipe_rich_snippet BOOLEAN 
    is_subscribed_content BOOLEAN 
    is_page_experience BOOLEAN 
    is_practice_problems BOOLEAN 
    is_math_solvers BOOLEAN 
    is_translated_result BOOLEAN 
    is_edu_q_and_a BOOLEAN 
    impressions INTEGER 
    clicks INTEGER 
    sum_position INTEGER 
    is_product_snippets BOOLEAN 
    is_merchant_listings BOOLEAN 
    is_learning_videos BOOLEAN "

    This will fix the generated SQL according to the GSC table structure as follows:

    WITH period1 AS (
      SELECT
        url,
        SUM(clicks) AS clicks,
        SUM(impressions) AS impressions,
        IFNULL(SUM(sum_position) / NULLIF(SUM(impressions), 0), 0) AS avg_position
      FROM
        `use_your_bigquery_projectname.searchconsole.searchdata_url_impression`
      WHERE
        data_date BETWEEN '2024-05-08' AND '2024-05-20'
        AND search_type = 'WEB'
      GROUP BY
        url
    ),
    period2 AS (
      SELECT
        url,
        SUM(clicks) AS clicks,
        SUM(impressions) AS impressions,
        IFNULL(SUM(sum_position) / NULLIF(SUM(impressions), 0), 0) AS avg_position
      FROM
        `use_your_bigquery_projectname.searchconsole.searchdata_url_impression`
      WHERE
        data_date BETWEEN '2024-04-18' AND '2024-04-30'
        AND search_type = 'WEB'
      GROUP BY
        url
    ),
    comparison AS (
      SELECT
        COALESCE(p1.url, p2.url) AS url,
        IFNULL(p1.clicks, 0) AS clicks_period1,
        IFNULL(p2.clicks, 0) AS clicks_period2,
        IFNULL(p1.impressions, 0) AS impressions_period1,
        IFNULL(p2.impressions, 0) AS impressions_period2,
        IFNULL(p1.avg_position, 0) AS avg_position_period1,
        IFNULL(p2.avg_position, 0) AS avg_position_period2,
        IFNULL(p1.clicks, 0) - IFNULL(p2.clicks, 0) AS clicks_diff,
        IFNULL(p1.impressions, 0) - IFNULL(p2.impressions, 0) AS impressions_diff,
        IFNULL(p1.avg_position, 0) - IFNULL(p2.avg_position, 0) AS avg_position_diff
      FROM
        period1 p1
      FULL OUTER JOIN
        period2 p2
      ON
        p1.url = p2.url
    )
    SELECT
      url,
      clicks_period1,
      clicks_period2,
      clicks_diff,
      impressions_period1,
      impressions_period2,
      impressions_diff,
      avg_position_period1,
      avg_position_period2,
      avg_position_diff
    FROM
      comparison
    ORDER BY
      url;
    

    Now, run it and enjoy the data, either by exporting it to a CSV or to Google Sheets.
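    If you script the export instead of downloading from the BigQuery console, the result rows can be serialized to CSV in a few lines of Python. This is a minimal sketch assuming the rows have already been fetched (for example, with the google-cloud-bigquery client); the URL and numbers are made-up placeholders.

```python
import csv
import io

def rows_to_csv(rows, fieldnames):
    """Serialize query result rows (dicts) into CSV text,
    ready to save to a file or import into Google Sheets."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical rows shaped like the comparison query's output:
rows = [{"url": "https://example.com/a",
         "clicks_period1": 120, "clicks_period2": 150, "clicks_diff": -30}]
print(rows_to_csv(rows, ["url", "clicks_period1", "clicks_period2", "clicks_diff"]))
```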

    How to run SQL in BigQuery.

    In case you have millions of URLs, a CSV export or Google Sheets may not work because the dataset is too big, and those apps limit how many rows a single document can hold. In that case, you can save the results as a BigQuery table and connect Looker Studio to it to view the data.

    But please remember that BigQuery is a freemium service. It is free up to 1 TB of processed query data a month. Once you exceed that limit, your credit card will be automatically charged based on your usage.

    That means if you connect your BigQuery to Looker Studio and browse your data there, it will count against your billing every time you open your Looker dashboard.

    That is why, when exports have a few tens of thousands or hundreds of thousands of rows, I like using Google Sheets. I can easily connect it to Looker Studio for data visualization and blending, and this will not count against my billing.

    If you have ChatGPT Plus, you can simply use this custom GPT I’ve made, which takes into account table schemas for GA4 and Search Console. In the above guide, I assumed you were using the free version, and it illustrated how you can use ChatGPT overall for running BigQuery.

    In case you want to know what is in that custom GPT, here is the screenshot of the backend.

    Custom GPT with BigQuery table schemas.

    Nothing complicated – you just need to copy tables from BigQuery as JSON in the step explained above and upload them into the custom GPT so it can refer to the table structure. Additionally, there is a prompt that asks GPT to refer to the JSON files attached when composing queries.

    This is another illustration of how you can use ChatGPT to perform tasks more effectively, eliminating repetitive tasks.

    If you need to work with another dataset (different from GA4 or GSC) and you don’t know SQL, you can upload the table schema from BigQuery into ChatGPT and compose SQL queries specific to that table structure. Easy, isn’t it?

    As homework, I suggest you analyze which queries have been affected by AI Overviews.

    There is no differentiator in the Google Search Console table to do that, but you can run a query to see which pages didn’t lose ranking but had a significant CTR drop after May 14, 2024, when Google introduced AI Overviews.

    You can compare the two-week period after May 14th with the two weeks prior. There is still a possibility that the CTR drop happened because of other search features, like a competitor getting a Featured Snippet, but you should find enough valid cases where your clicks were affected by AI Overviews (formerly Search Generative Experience or “SGE”).
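    As a rough sketch of that homework logic in Python (assuming you have already exported per-URL clicks, impressions, and average position for both periods; the URLs, numbers, and thresholds below are made up for illustration):

```python
def flag_ctr_drops(after, before, ctr_drop=0.3, max_pos_change=1.0):
    """Flag URLs whose average position held roughly steady between the
    two periods but whose CTR fell sharply -- a pattern consistent with
    an AI Overview absorbing clicks.
    Each dict maps url -> (clicks, impressions, avg_position)."""
    flagged = []
    for url, (clicks_a, impr_a, pos_a) in after.items():
        if url not in before or not impr_a:
            continue
        clicks_b, impr_b, pos_b = before[url]
        if not impr_b:
            continue
        ctr_a, ctr_b = clicks_a / impr_a, clicks_b / impr_b
        # Ranking barely moved, but CTR dropped by more than the threshold.
        if abs(pos_a - pos_b) <= max_pos_change and ctr_b and (ctr_b - ctr_a) / ctr_b >= ctr_drop:
            flagged.append(url)
    return flagged

after_may14 = {"https://example.com/a": (50, 1000, 3.1),
               "https://example.com/b": (90, 1000, 5.0)}
before_may14 = {"https://example.com/a": (100, 1000, 3.0),
                "https://example.com/b": (100, 1000, 5.2)}
print(flag_ctr_drops(after_may14, before_may14))  # ['https://example.com/a']
```

    Here page `/a` kept its ranking but lost half its CTR, so it gets flagged, while `/b`’s small dip stays below the threshold.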

    2. How To Combine Search Traffic Data With Engagement Metrics From GA4 

    When analyzing search traffic, it is vital to understand how much users engage with content because user engagement signals are ranking factors. Please note that I don’t mean the exact metrics defined in GA4.

    However, GA4’s engagement metrics – such as “average engagement time per session,” which is the average time your website was in focus in a user’s browser – may hint at whether your articles are good enough for users to read.

    If it is too low, it means your blog pages may have an issue, and users don’t read them.

    If you combine that metric with Search Console data, you may find that pages with low rankings also have a low average engagement time per session.

    Please note that GA4 and GSC have different attribution models. GA4 uses data-driven or last-click attribution, which means that if a user visits an article page from Google once and then returns directly two more times, GA4 may attribute all three visits to Google, whereas GSC will report only one click.

    So, it is not 100% accurate and may not be suitable for corporate reporting, but having engagement metrics from GA4 alongside GSC data provides valuable information to analyze your rankings’ correlations with engagement.
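    To make that attribution difference concrete, here is a toy Python model (not GA4’s actual algorithm) of last-click attribution: a direct return visit inherits the most recent non-direct source, so all three visits get credited to Google, while GSC logs only the single search click.

```python
def last_click_attribution(visits):
    """Toy model of last-click attribution: each 'direct' visit
    inherits the most recent non-direct traffic source."""
    attributed, last_source = [], "(direct)"
    for source in visits:
        if source != "direct":
            last_source = source
        attributed.append(last_source)
    return attributed

# One organic click from Google, then two direct return visits:
visits = ["google", "direct", "direct"]
print(last_click_attribution(visits))  # ['google', 'google', 'google']
print(visits.count("google"))          # 1 -- the only click GSC would report
```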

    Using ChatGPT with BigQuery requires a little preparation. Before we jump into the prompt, I suggest you read how GA4 tables are structured, as it is not as simple as GSC’s tables.

    It has an event_params column, which is a record type and contains dimensions like page_location, ga_session_id, and engagement_time_msec, which tracks how long a user actively engages with your website.

    The event_params key engagement_time_msec is not the total time on the site but the time spent on specific interactions (like clicking or scrolling), where each interaction adds a new piece of engagement time. It is like adding up all the little moments when users are actively using your website or app.

    Therefore, if we sum that metric and average it across sessions for the pages, we obtain the average engagement time per session.
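    That computation can be sketched in Python as a simplified model of the SQL we are about to build (the session IDs and durations are made up):

```python
def avg_engagement_sec_per_session(events):
    """events: (session_id, engagement_time_msec) pairs for one page.
    Mirrors SUM(engagement_time_msec) / COUNT(DISTINCT session_id),
    converted from milliseconds to seconds."""
    valid = [(sid, msec) for sid, msec in events if msec is not None]
    if not valid:
        return 0.0
    total_msec = sum(msec for _, msec in valid)
    sessions = {sid for sid, _ in valid}
    return total_msec / 1000 / len(sessions)

# Two sessions: one with 8s + 4s of interactions, one with 6s (plus a null event).
events = [("s1", 8000), ("s1", 4000), ("s2", 6000), ("s2", None)]
print(avg_engagement_sec_per_session(events))  # (8000 + 4000 + 6000) / 1000 / 2 = 9.0
```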

    Now, once you understand engagement_time_msec, let’s ask ChatGPT to help us construct a query that pulls GA4 “average engagement time per session” for each URL and combines it with GSC search performance data for articles.

    The prompt I would use is:

    Imagine you are a data analyst experienced in Google Analytics 4 (GA4), Google Search Console, SQL, and BigQuery.
    Compose a SQL query that pulls the following data from Google Search Console for each URL for the previous 7 days, excluding the current day:
    
    1. Clicks,
    2. Impressions,
    3. Average position (calculated as the sum of positions divided by the sum of impressions).
    
    From the GA4 BigQuery table, unnest ga_session_id, engagement_time_msec, and page_location from event_params.
    Select only rows where engagement_time_msec is not null, group all sessions with the same IDs and page_location, SUM engagement_time_msec, and divide it by the number of sessions.
    
    Join GA4 and Google Search Console data by URLs for the same period. 
    Additionally, optimize the query to pull from GA4's table partitions and not query the entire table.
    
    Details:
    1. BigQuery project name: use_your_bigquery_projectname 
    2. GA4 dataset name: use_your_ga4_bigquery_dataset_name
    3. GA4 table name: use_your_ga4_bigquery_table_name_under_dataset
    4. Google Search Console dataset name: use_your_gsc_bigquery_dataset_name
    5. Google Search Console table name: use_your_gsc_bigquery_table_name_under_dataset
    6. Here is the BigQuery table schema for GA4: [copy table schema here]
    7. Here is the BigQuery table schema for Google Search Console: [copy table schema here]
    
    
    

    Once I copied and pasted the generated query into BigQuery, it returned results with “average engagement time per session” values that were all null. So, apparently, ChatGPT needed more context and guidance on how GA4 works.

    I helped by providing additional knowledge in a follow-up question, taken from GA4’s official documentation on how it calculates engagement_time_msec. I copied and pasted the documentation into the follow-up prompt and asked ChatGPT to refer to that knowledge when composing the query, which helped. (If you get a syntax error, just copy/paste it as a follow-up question and ask ChatGPT to fix it.)

    As a result of 15 minutes of effort, I got the right SQL:

    
    WITH
    -- Step 1: Filter GA4 data for the last 7 days and unnest event_params
    ga4_data AS (
      SELECT
        event_date,
        event_timestamp,
        (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'ga_session_id') AS ga_session_id,
        (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'engagement_time_msec') AS engagement_time_msec,
        (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page_location
      FROM
        `your_bq_project_name.your_bq_ga4_dataset.events_*`
      WHERE
         _TABLE_SUFFIX BETWEEN FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 8 DAY))
        AND FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
        AND (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'engagement_time_msec') IS NOT NULL
    ),
    
    -- Step 2: Aggregate GA4 data by session ID and page location
    aggregated_ga4_data AS (
      SELECT
        page_location,
        -- engagement_time_msec is in milliseconds; dividing by 1000 yields seconds per session
        SUM(engagement_time_msec/1000) / NULLIF(COUNT(DISTINCT ga_session_id), 0) AS avg_engagement_time_msec
      FROM
        ga4_data
      GROUP BY
        page_location
    ),
    -- Step 3: Filter GSC data for the last 7 days and select urls which had clicks
    gsc_data AS (
      SELECT
        url,
        SUM(clicks) AS clicks,
        SUM(impressions) AS impressions,
        SUM(sum_position) / SUM(impressions) AS avg_position
      FROM
        `your_bq_project_name.searchconsole.searchdata_url_impression`
      WHERE
        data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 8 DAY) AND DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
        AND clicks > 0
      GROUP BY
        url
    )
    
    -- Joining Google Search Console data with GA4 data by page_location and url 
    SELECT
      gsc.url,
      gsc.clicks,
      gsc.impressions,
      gsc.avg_position,
      ga4.avg_engagement_time_msec
    FROM
      gsc_data AS gsc
    LEFT JOIN
      aggregated_ga4_data AS ga4
    ON
      gsc.url = ga4.page_location
    ORDER BY
      gsc.clicks DESC;
    

    This pulls GSC data with engagement metrics from GA4.
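    If you later run this query from a script rather than the BigQuery console (for example, via the google-cloud-bigquery client), the _TABLE_SUFFIX window can be computed in Python before the SQL is sent. A small sketch mirroring the query’s DATE_SUB bounds:

```python
from datetime import date, timedelta

def table_suffix_window(today=None):
    """Return the (start, end) _TABLE_SUFFIX strings used in the query:
    from 8 days ago through yesterday, formatted as YYYYMMDD."""
    today = today or date.today()
    start = today - timedelta(days=8)
    end = today - timedelta(days=1)
    return start.strftime("%Y%m%d"), end.strftime("%Y%m%d")

print(table_suffix_window(date(2024, 6, 15)))  # ('20240607', '20240614')
```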

    Search Console combined data with GA4.

    Please note that you might notice discrepancies between the numbers in the GA4 UI and the data queried from BigQuery tables.

    This happens because GA4 focuses on “Active Users” and groups rare data points into an “(other)” category, while BigQuery shows all raw data. GA4 also uses modeled data for gaps when consent isn’t given, which BigQuery doesn’t include.

    Additionally, GA4 may sample data for quicker reports, whereas BigQuery includes all data. These variations mean GA4 offers a quick overview, while BigQuery provides detailed analysis. You can find a more detailed explanation of why this happens in this article.

    You may try modifying the queries to include only active users to bring the results one step closer to the GA4 UI.

    Alternatively, you can use Looker Studio to blend data, but it has limitations with very large datasets. BigQuery offers scalability by processing terabytes of data efficiently, making it ideal for large-scale SEO reports and detailed analyses.

    Its advanced SQL capabilities allow complex queries for deeper insights that Looker Studio or other dashboarding tools cannot match.

    Conclusion

    Using ChatGPT’s coding abilities to compose BigQuery queries for your reporting needs elevates your skill set and opens new horizons, letting you combine multiple sources of data.

    This demonstrates how ChatGPT can streamline complex data analysis tasks, enabling you to focus on strategic decision-making.

    At the same time, these examples showed that humans still need to supervise AI chatbots, which may hallucinate or produce wrong answers.

    Featured Image: NicoElNino/Shutterstock