Charts: U.S. Retail Ecommerce Sales Q3 2023

The Census Bureau of the U.S. Department of Commerce reports quarterly domestic retail ecommerce sales. Newly released figures (PDF) for Q3 2023 show sales of $281.1 billion, up 2.3% from the prior quarter.

Per the DoC, ecommerce sales are for “goods and services where the buyer places an order (or the price and terms of the sale are negotiated) over an Internet, mobile device, extranet, electronic data interchange network, electronic mail, or other comparable online system. Payment may or may not be made online.”

The DoC’s estimated total retail sales (online and in-store) for Q3 2023 stood at $1,825.3 billion, an increase of 1.5% from Q2 2023.

Ecommerce accounted for 15.6% of total U.S. retail sales in Q3 2023, up slightly from 15.5% in the prior quarter.

The DoC estimates U.S. ecommerce retail sales in Q3 2023 grew by 7.6% compared to Q3 2022, while total quarterly retail sales experienced a 2.3% annual rise in the same period.

GA4: User Acquisition vs. Sessions

The reports in Google Analytics 4 can be confusing. Take, for example, acquisition reports — the channels sending traffic to your site — at Reports > Acquisition. The section lists two types of acquisition: user and traffic (i.e., sessions).

I’ll address those reports in this post.

Traffic Sources in GA4

“User acquisition” vs. “Traffic acquisition”

These two reports in the Acquisition section disclose where each visitor came from. User acquisition represents the initial source that brought the person to your site. Traffic acquisition is the most recent.

Say a first-time visitor came to your site from clicking an organic search listing. GA4 will group that initial visit in the “Organic search” channel for both reports. But if she leaves and returns a few days later from a Facebook ad, GA4 will list the second visit in the “Paid social” channel in the “Traffic acquisition” report and “Organic search” in “User acquisition.”

In other words, the “Traffic acquisition” report doesn’t differentiate between new and returning users. It shows the total sessions and where each originated. The “User acquisition” report shows the total users and their initial (first-time) source.
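
The two counting rules can be sketched in a few lines of Python. The session records below are hypothetical; in GA4 the grouping happens server-side, but the aggregation logic is the same idea.

```python
# Sketch: how "Traffic acquisition" (session source) and "User acquisition"
# (first-ever source) count the same visits differently. Data is invented.
from collections import Counter

sessions = [
    {"user": "anna", "source": "Organic Search"},  # first visit
    {"user": "anna", "source": "Paid Social"},     # return visit via a Facebook ad
    {"user": "ben",  "source": "Direct"},
]

# Traffic acquisition: every session counts under its own source.
traffic_acquisition = Counter(s["source"] for s in sessions)

# User acquisition: each user counts once, under their first-ever source.
first_source = {}
for s in sessions:  # sessions assumed to be in chronological order
    first_source.setdefault(s["user"], s["source"])
user_acquisition = Counter(first_source.values())

print(traffic_acquisition)  # 3 sessions across 3 channels
print(user_acquisition)     # 2 users: Organic Search and Direct
```

Anna's return visit shows up as "Paid Social" in the traffic view but stays "Organic Search" in the user view, which is exactly the Facebook-ad example above.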

Channels

Both reports show “default channel groups” — traffic-source categories — including:

  • “Direct.” Visitors who typed your URL in their browser’s address bar.
  • “Organic Search.” Visitors who clicked an organic listing on Google, Bing, or another search engine.
  • “Organic Social.” Visitors who clicked an organic post on a social media site.
  • “Email.”
  • “Paid Search.”
  • “Paid Social.”
  • “Referral.” Visitors who clicked non-ad links on third-party sites.
  • “Organic Video.” Visitors who clicked non-ad links on sites such as YouTube, TikTok, and Vimeo.

For details on each channel, click the drop-down menus on each report. In “User Acquisition,” click the “First user default channel group” menu and select “First user source” for the list of originating domains.


In the “Traffic Acquisition” report, click “Session default channel group” and select “Session source.”


The numbers may be different owing to new versus returning visits.

You can limit both reports to a single page via the “Add filter” option at the top.

  • Select “Add filter.”
  • Click “Select dimensions.”
  • Select “Landing page + query string.”
  • Select “contains” below “Match type.”
  • Paste your page’s URL slug (the part after the “/”) or any word (such as “smaller”) from that URL.

The traffic and user acquisition reports will now show data for that URL.
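
Conceptually, the filter is a simple substring match against the landing page plus query string. Here is a minimal sketch with made-up session rows (the URLs and channel labels are hypothetical):

```python
# Sketch of what the "Landing page + query string contains <slug>" filter does.
# These rows are invented session records, not a real GA4 export.
rows = [
    {"landing_page": "/blog/make-images-smaller", "channel": "Organic Search"},
    {"landing_page": "/pricing", "channel": "Direct"},
    {"landing_page": "/blog/make-images-smaller?utm_source=fb", "channel": "Paid Social"},
]

slug = "smaller"  # "contains" match: any part of the path or query string qualifies
filtered = [r for r in rows if slug in r["landing_page"]]

print(len(filtered))  # 2 sessions landed on a matching URL
```

Because the match type is “contains,” the same filter catches the page with and without tracking parameters appended.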


Limit the acquisition reports to a single page, such as this example for a URL containing “smaller.”

In many cases, the user and traffic acquisition reports are interchangeable. The differences, again, result from returning visitors. Regardless, the reports show the sources of traffic — channels that introduce your brand and drive return visits.

100M Phone Call Insights: Your Key To Data-Driven Marketing Strategies via @sejournal, @hethr_campbell

Looking to create golden sales opportunities efficiently and easily?

Do you think you’re attracting good leads, but results aren’t showing it?

What could you do with voice-of-customer insights summarized across calls?

The key might lie within phone calls.

CallRail has analyzed more than 100 million phone calls and consumer communication trends, and they’re ready to share their findings with you.

On November 1, I moderated a webinar with Jason Tatum, CallRail’s VP of Product. Tatum covered the past and future state of phone calls and how you can use AI to gain valuable insights that will transform your business.

Here’s a summary of the webinar. To access the entire presentation, complete the form.

What We Uncovered

Consumers prefer to call businesses for many types of transactions — particularly high-stakes purchases such as in healthcare or insurance, where two-thirds of consumers prefer calling over any other type of contact.

[Find out why inbound callers are your best leads] Instantly access the on-demand webinar →

Main Insight: Mining The Conversation Is Untapped Gold

Phone conversations provide some of the richest insights you can get about buyers’ needs and motivations.

This type of conversation data could improve:

  • Lead conversion.
  • Customer experience.
  • Agent performance.
  • Marketing optimization.

Screenshot by CallRail, Nov 2023

8 Golden Insights From Call Tracking

The strategy of mining calls is helping companies gain massive insights for strengthening their marketing strategy.

Call tracking, alone, can offer you insights into:

  • Call attribution.
  • Call recording & transcription.
  • Lead qualification.
  • CRM integration.
  • Marketing ROI.

With 100 million calls backing this up, we know that call tracking helps marketers drive the most high-quality leads because call tracking:

  • Identifies which ads are performing the best.
  • Allows marketers to double down on performance.
  • Saves money and increases ROI.

[See the data] Instantly access the on-demand webinar →

7 More Golden Insights From Call Tracking With AI

We’ve also discovered that call tracking paired with artificial intelligence provides companies with:

  • Keywords and phrases.
  • Buying intent.
  • Product and service interests.
  • Sentiment.
  • Agent performance.
  • Call outcomes.
  • Patterns across calls.

From our data, we’re confirming that phone calls are, in fact, one of the richest sources of untapped intelligence compared to SMS, chat, and forms.

How To Extract The Gold

Now that you see the data that thousands of companies are mining from phone calls, it’s time to learn how to get that same data for yourself.

However, manually identifying patterns across thousands of calls, agents, locations, or campaigns is virtually impossible without these steps:

  1. [Get the steps] Instantly access the on-demand webinar →

100M Phone Call Insights: Your Key To Data-Driven Marketing Strategies [Slides]

Here’s the presentation:

Join Us For Our Next Webinar!

The Evolution Of Search & SERPs 2024

In this roundtable discussion with Shelley Walsh, Ben Steele, and Matt Southern, you’ll get expert insights into the evolution of search to give you a competitive advantage in 2024.

Charts: Ecommerce in the U.K.

Insider Intelligence projects U.K. retail ecommerce sales to decrease by 0.6% in 2023, reducing ecommerce’s share of total retail sales to 32.0%, down from its peak of 37.6% in 2021.

Amazon.co.uk secured the top U.K. ecommerce position in 2022 with net sales of $15.36 billion, followed by Sainsburys.co.uk at $7.59 billion. Tesco.com claimed the third spot with net sales of $7.16 billion.

That’s according to Statista, which defines ecommerce as the sale of physical goods via a digital channel to a private end user.

Per Insider Intelligence (formerly eMarketer), China will remain the world’s largest ecommerce market in 2023, followed by the U.S. The U.K. will finish third with total annual ecommerce revenue in 2023 of $196 billion.

According to Semrush, in September 2023, Amazon.co.uk was the most trafficked retail site in the United Kingdom, with 412.93 million monthly visits. Ebay.co.uk secured second with 176.08 million, while Argos.co.uk drew 43.45 million visits.

A Look At Today’s Marketing Data Standards & What They Mean For Your Strategy via @sejournal, @sejournal

The data-driven realm of marketing is dynamic – what worked for you yesterday might not work today. 

And with constant search algorithm shifts, privacy regulations, and the unstoppable rise of AI, staying informed is the key to getting results.

So if you’re eager to unlock the true potential of your marketing efforts, our upcoming webinar has the latest data standard insights to keep you in the know.

Claravine teamed up with Advertiser Perceptions this year to conduct a sweeping survey of marketers and agencies – and the results are in!

Their findings reveal where data standards have the most impact on marketing data, as well as how companies are navigating new privacy laws, harnessing the power of AI, and fine-tuning their data organization strategies.

Join us on Wednesday, November 8 to discover what 140 marketers and agencies had to say, and how you can incorporate these insights into your strategy. 

You’ll leave this webinar with: 

  • A better understanding of how your marketing data management compares to enterprise advertisers. If you want to know how you stack up against the competition, we’ll give you a comprehensive view of how your data management strategies, practices, and tools compare to those used by enterprise advertisers. You can then identify areas where you excel and pinpoint opportunities for improvement.
  • An overview of the current state of data standards and analytics, and how marketers are managing risk while improving the ROI of their programs. This webinar will provide you with a real-time snapshot of the data standards landscape. You’ll understand the latest trends and emerging practices. Plus, you’ll see how marketers are tackling the challenges of managing data while maximizing results. 
  • Tactics and best practices that you can use to improve your marketing data now, including how to measure success and define data standards. This live session is packed with actionable insights that you can put to work immediately. Learn how to measure the success of your marketing campaigns and define data standards more effectively, giving you the power to optimize your strategies. 

Chris Comstock, Chief Growth Officer at Claravine, will dive into the latest marketing data trends among top advertisers and discuss how poor data standards skew insights.

He’ll also share actionable tips to help you benchmark your performance, mitigate privacy risks, and boost ROI through better data.

If you’ve ever wondered how your approach to managing marketing data compares to the top marketers and enterprise advertisers, this webinar will help you size up your game. 

Be sure to mark your calendar and set your alarm – you don’t want to miss this one! 

And no worries if you can’t make the live event – sign up now, and we’ll send you a recording after the webinar.

How To Perform Website Experiments [+ SEJ Experiment Walk-Through & Results] via @sejournal, @Juxtacognition

How do you know what to change on your website or landing page to make it more effective?

Are the tweaks and tests you made to your site really successful? Or do they just look successful?

Could the best practices you’ve been following for years actually be what’s holding you back from achieving your goals and KPIs?

If you’ve ever asked yourself these questions or ones like them, it’s time to run your own website experiment.

Learn a step-by-step process for conducting website experiments to help you get the data you need to make good decisions.

We’ll show you how to set up and design experiments to collect data that will have the most value and impact for you and your audience.

You’ll learn:

  • When to run an experiment: Learn how and when to run your experiments to get the most informative and accurate data.
  • How to avoid data disasters: Learn how to prevent other factors from influencing your results and what to do if things don’t go as expected.
  • Experiment design: Tips for working with low-traffic sites, reducing costs caused by testing, and other challenges you’ll face when testing and running experiments.

Join our very own Angie Nikoleychuk, Content Marketing Manager, and learn the key factors to focus on when running website experiments, how to manage your data collection, and how to learn about the things that matter to your audience.

PLUS: You’ll get an exclusive walk-through of our new ad experiment, where we examined how ad types and layouts affect user behavior on our website. You’ll never guess what we found. We sure didn’t!

Don’t miss out on this opportunity!

View the slides below, download your copy of the resources, and watch on demand to learn how to run easy experiments on your site.

Join Us For Our Next Webinar!

How To Boost 2024 SEO Performance With Pillar Pages & Topic Clusters

Join SEO experts and Conductor’s Customer Success Managers, Alex Carchietta and Zack Kadish, to learn how effective pillar pages and clustered content improve site structure, internal linking, and on-page SEO.

Brand Metrics for Ecommerce Companies

Brand metrics describe how shoppers feel about a business and its products.

Brand metrics are often predictive, indicating how likely shoppers will purchase, and prescriptive, defining the features to emphasize and promote.

What marketers learn from brand metrics could impact content marketing, search engine optimization, advertising, and even personalization on a store’s website or in email and text messages.

What Are Brand Metrics?

Brand metrics are measurements — quantitative and qualitative assessments allowing marketers to track and analyze a brand’s performance, strength, and perception over time.

These metrics offer insights into brand awareness, loyalty, and overall brand health relative to the company’s target audience and the broader market.

For the most part, brand metrics are subjective indicators of potential performance based on what shoppers say they believe or what they say they will do.

The metrics often include:

  • Brand awareness. Measures the percentage of a target market that is familiar with a brand. A relatively high level of brand awareness can be a competitive advantage and is frequently the first step in the buyer’s journey. Brand recall, recognition, and share of voice are related metrics.
  • Brand equity. The value that a brand adds to a product. It is why some folks pay more for an iPhone. Brand equity is often gauged in terms of buyer perceptions and associations.
  • Brand loyalty. Assesses the likelihood that customers will repeatedly purchase a brand’s products regardless of whether it is a retailer or direct-to-consumer business. Loyalty indicates a stable customer base and can forecast sales. High brand loyalty often reduces marketing costs.
  • Brand consistency. Measures how uniformly a brand presents itself across channels and touchpoints. Marketers can use this measurement to ensure shoppers receive a consistent brand experience, which, in turn, can reinforce brand identity and trust.
  • Shopper satisfaction. Evaluates how happy customers are with a brand’s products or services. Often, this metric is associated with a Net Promoter Score.
  • Purchase intent. Gauges the likelihood that a shopper will buy a brand’s product. This metric helps forecast sales.
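
As one concrete example, the Net Promoter Score mentioned under shopper satisfaction is simple to compute: respondents rate how likely they are to recommend the brand from 0 to 10, and NPS is the percentage of promoters (9 or 10) minus the percentage of detractors (0 through 6). A minimal sketch, with invented survey responses:

```python
# Sketch: computing a Net Promoter Score from 0-10 survey responses.
# Promoters score 9-10, detractors 0-6; NPS = %promoters - %detractors.
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 4 promoters and 2 detractors out of 8 responses -> (4 - 2) / 8 = 25
print(nps([10, 9, 8, 7, 6, 10, 3, 9]))  # 25
```

The score ranges from -100 (all detractors) to 100 (all promoters); passives (7 or 8) dilute the score without moving it either way.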

Brand metrics often derive from surveys and conjoint analysis, where respondents consider multiple features jointly.

How to Use Brand Metrics

In general, brand-related measures provide three types of insights:

  • Familiarity,
  • Perception,
  • Investment.

Consumers familiar with a store or product are easier to convert into customers. Thus, boosting brand awareness can lead to lower customer acquisition costs.

Moreover, shoppers familiar with a product or brand often have favorite features. Thus, familiarity with a product or brand can lead to relatively more personalized messaging.

Perception metrics tell marketers if their brand adds value to the selling proposition. Shoppers might pay more for a product from a store they trust; Amazon is a common example. Shoppers often prefer to purchase from Amazon because they believe the company will deliver quickly, manage returns well, and stand behind the shopper if there is a problem.

Measures of investment indicate the likelihood of a purchase. Shoppers likely to buy can be targeted with specialized campaigns distinct from those aimed at low-intent shoppers.

Integrating these insights into marketing decisions ensures a more targeted approach.

Imagine, for example, a DTC business specializing in bar and cocktail products. If it has high brand loyalty — meaning its customers love the brand — the company could run a refer-a-friend campaign. But a similar campaign for a lesser-known brand would not likely work since shoppers would presumably not recommend an unfamiliar product.


A company with loads of brand loyalty could try a refer-a-friend campaign.

Measurement Frameworks

Brand measures, while indicative, complement a broader marketing framework.

For example, the Data & Marketing Association in the U.K. released a comprehensive marketing measurement framework that encourages using brand metrics with other performance indicators.

Such frameworks are not a rote way to measure every promotion but rather a method for developing campaign measurements wherein brand metrics play a role.

New Features Coming To GA4 For Analytics 360 Properties via @sejournal, @kristileilani

Google announced new GA4 features for Analytics 360 properties, designed to meet the diverse reporting needs of large enterprises and agencies that make the switch.

Rolled out in response to a fast-evolving technology and regulatory landscape, GA4’s new features extend beyond its core function of understanding consumer behavior across digital platforms.

The updated analytics tool uses Google AI to provide more relevant insights, predict future purchasing behaviors, and offer solutions for unknowns in the consumer process.

Custom Reporting Experiences

To make reporting more efficient, Google announced it would soon allow users to assign customized reporting experiences to different organizational roles and teams.

No more one-size-fits-all reports for everyone in your organization. With reporting experiences, advertising teams can view campaign-specific reports instead of the default GA4 interface.

New Account Structuring

The upcoming update will also bring in new data governance features. Users can structure their accounts with subproperties and roll-up properties, facilitating better access control and reporting.

Google also plans to integrate subproperties and roll-up properties with Display & Video 360, Campaign Manager 360, and Search Ads 360.

Faster Export To BigQuery

Another notable change is the faster and more reliable export of GA4 data to BigQuery, Google’s cloud data warehouse.

To ensure a standardized data analysis process, Google will introduce a service level agreement (SLA) in the coming year to make insights available for export at the same time daily.

Making The Switch To GA4

The data from Universal Analytics will no longer be accessible starting July 1, 2024. Google urges users to complete the transition to GA4 by March 2024 to ensure they retain the data they need. Failure to switch will result in the deletion of all Universal Analytics properties and data.

This sweeping change appeared to be a necessity given the ever-changing technology and regulatory conditions, as certain advertising functionalities of Universal Analytics will cease to operate for traffic in the European Economic Area (EEA).

To simplify the transition process, Google is rolling out an API-based upgrade option for GA4 properties to assist with migrations of thousands of sites.


Featured image: Travis Wolfe/Shutterstock

7 Key Metrics In Measuring Content Effectiveness via @sejournal, @hethr_campbell

Measuring content performance is the compass that guides successful digital marketing endeavors.

It provides invaluable insights into what resonates with your audience – enabling you to refine strategies, optimize resources, and maximize ROI.

To do this successfully, you need to understand the metrics behind evaluating a successful customer journey.

If you’re looking to maximize your marketing efforts and create the kind of content that both captivates and converts, watch this webinar on demand now.

Join Wayne Cichanski from iQuanti, along with Rachel Schardt and Rayan Nahas from Conductor, as they dive into 7 key metrics for measuring content effectiveness that often go overlooked.

In this webinar, our guests dive into:

  • Content Relevancy: Learn how to measure the relevancy of your content and ensure it aligns perfectly with your audience’s needs and interests.
  • Content Depth: Your content should do more than just scratch the surface. Explore ways to gauge the depth of your content and ensure it offers genuine value to your readers.
  • Influenced & Direct Conversion Rate: Conversion rates are the ultimate litmus test for content quality. Discover how to measure the true impact of your content on your business’ bottom line.
  • Engagement Metrics: Discover the engagement metrics you should be paying close attention to, and learn how to leverage them to boost audience interaction and retention.
  • Consumer Demand/Volume: Learn how to measure consumer demand for your content, helping you tailor your strategies to meet their expectations.
  • Tone-Matching Demographics and Persona: Matching your content’s tone with your target demographics and personas is crucial for effective communication. Learn how to ensure your messaging is resonating with your intended audience.

If you’re looking to serve up relevant and compelling content that engages your ideal audience, then this webinar is for you.

View the slides:

7 Key Metrics In Measuring Content Effectiveness [Slides]

Join Us For Our Next Webinar!

Trends In Paid Search: Navigating The Digital Landscape In 2024

Join Sreekant Lanka from iQuanti and Irina Klein from OneMain Financial as they dive into the future of paid search and explore the trends, strategies, and technologies that will shape the search marketing landscape.


Image Credits:

Featured Image: Paulo Bobita/Search Engine Journal

Google Search Console Data & BigQuery For Enhanced Analytics

Google Search Console is a great tool for SEO pros.

But as many of us know, using the interface exclusively comes with some limitations.

In the past, you often had to have specific knowledge or the help of a developer to overcome some of them by pulling the data from the API directly.

Around 2018, Google started offering a native Google Search Console (GSC) connector for what was then Google Data Studio (now Looker Studio).

This integration allows users to directly pull data from GSC into Looker Studio (Google Data Studio) to create customizable reports and dashboards without needing third-party connectors or additional API configurations.

But then, in February 2023, things got interesting.

Google now allows you to put in place an automated, built-in bulk data export to BigQuery, Google’s data warehouse storage solution.

Let’s get candid for a minute: most of us still rely on the GSC interface to do many of our activities.

This article will dive into why the bulk data export to BigQuery is a big deal.

Be warned: This is not a silver bullet that will solve all of the limitations we face as SEO pros. But it’s a great tool if you know how to set it up and use it properly.

Break Free From Data Constraints With BigQuery Bulk Exports

Initially, the bulk data export was meant for websites that received traffic to tens of thousands of pages and/or from tens of thousands of queries.

Data Volumes

Currently, you have three data export options besides the BigQuery bulk data export:

  • Most of the reports in GSC allow you to export up to 1,000 rows.
  • You can get up to 50,000 rows via a Looker Studio integration.
  • With the API, you get up to 50,000 rows, enabling you to pull a few more elements beyond the performance data: URL Inspection, sitemaps, and sites’ data.

Daniel Waisberg, Search Advocate at Google, explains it this way:

“The most powerful way to export performance data is the bulk data export, where you can get the biggest amount of data.”

There are no row limits when you use the BigQuery bulk export.

BigQuery’s bulk data export allows you to pull all rows of data available in your GSC account.

This makes BigQuery much more suitable for large websites or SEO analyses requiring a complete dataset.

Data Retention

Google BigQuery enables unlimited data retention, allowing SEO pros to perform historical trend analyses that are not restricted by the 16-month data storage limit in Google Search Console.

Looker Studio and the API do not offer this on their own. With unlimited retention, you gain a real capacity to track evolution over multiple years and to better understand and analyze progress.

As a storage solution, BigQuery allows you to stock your data for as long as you wish and overcome this limitation.

The ability to retain and access unlimited historical data is a game-changer for SEO professionals for several reasons:

  • Comprehensive long-term analysis: Unlimited data retention means that SEO analysts can conduct trend analyses over extended periods. This is great news for those of us who want a more accurate assessment of how our SEO strategies are performing in the long term.
  • Seasonal and event-driven trends: If your website experiences seasonal fluctuations or events that cause periodic spikes in traffic, the ability to look back at longer historical data will provide invaluable insights.
  • Customized reporting: Having all of your data stored in BigQuery makes it easier to generate custom reports tailored to specific needs. You can create a report to answer virtually any question.
  • Improved troubleshooting: The ability to track performance over time makes it easier to identify issues, understand their root causes, and implement effective fixes.
  • Adaptability: Unlimited data retention gives you the flexibility to adapt your SEO strategies while maintaining a comprehensive historical perspective for context.

Data Caveats

Just like most data tracking tools, you won’t be surprised to learn that there is no retroactivity.

Keep in mind that the GSC bulk data export starts sending data daily to BigQuery only after you set it up. This means that you won’t be able to store and access the data before that.

It’s a “from this point forward” system, meaning you need to plan ahead if you want to make use of historical data later on. And even if you plan ahead, the data exports will start up to 48 hours later.

While the bulk data export does include significant metrics such as site and URL performance data, not all types of data are exported.

For example, coverage reports and other specialized reports available in GSC are not part of what gets sent to BigQuery.

Two primary tables are generated: searchdata_site_impression and searchdata_url_impression. The former aggregates data by property, so if two pages show up for the same query, it counts as one impression.

The latter table provides data aggregated by URL, offering a more granular view. In plain English, when you use Google Search Console’s bulk data export to BigQuery, two main tables are created:

  • searchdata_site_impression: This table gives you an overview of how your entire website is doing in Google Search. For example, if someone searches for “best sausage dog costume” and two pages from your website appear in the results, this table will count it as one “impression” (or one view) for your entire site rather than two separate views for each page.
  • searchdata_url_impression: This table is more detailed and focuses on individual web pages. Using the same example of “best sausage dog costume,” if two pages from your site show up in the search results, this table will count it as two separate impressions, one for each page that appears.
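
The counting difference can be sketched as follows; the SERP data is invented, but the aggregation mirrors how the two tables treat the same search results:

```python
# Sketch: why searchdata_site_impression and searchdata_url_impression differ.
# Each entry below is one (hypothetical) SERP listing the pages from your site
# that appeared for a query.
serps = [
    {"query": "best sausage dog costume",
     "urls": ["/costumes/sausage-dog", "/blog/dog-costume-ideas"]},
    {"query": "dog costume ideas",
     "urls": ["/blog/dog-costume-ideas"]},
]

# Site table: one impression per SERP, no matter how many of your pages appear.
site_impressions = len(serps)

# URL table: one impression per page per SERP.
url_impressions = sum(len(s["urls"]) for s in serps)

print(site_impressions, url_impressions)  # 2 site impressions, 3 URL impressions
```

This is why totals in the two tables rarely match: whenever two of your pages rank for the same query, the URL table counts both while the site table counts one.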

Another important element is that you are dealing with partitioned data tables. The data in BigQuery is organized into partition tables based on dates.

Each day’s data gets an update, and it’s crucial to be mindful of this when formulating your queries, especially if you want to keep your operations efficient.

If this is still a bit obscure for you, just remember that the data comes in daily and that it has an impact on how you go about things when doing data analysis.

Why Set This Up?

There are advantages to setting up BigQuery bulk exports:

Joining GSC Data With Other Data Sources

Getting the Google Search Console data into a data warehouse means that you can enjoy the advantages of joining it with other data sources (either directly in BigQuery or in your own data warehouse).

You could, for instance, blend data from the GSC and Google Analytics 4 and have more insightful information regarding conversions and behaviors driven by organic Google traffic.

Run Complex Calculations/Operations Using SQL

A solution such as BigQuery allows you to query your data in order to run complex calculations and operations to drive your analysis deeper.

Using SQL, you can segment, filter, and run your own formulas.
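
As a rough illustration of that workflow, the query below computes click-through rate per URL from a table shaped loosely like the searchdata_url_impression export. SQLite stands in for BigQuery here (BigQuery’s SQL dialect differs in places), and all figures are made up:

```python
# Sketch: the kind of segmentation you can run once GSC data lives in a SQL
# warehouse. SQLite is used as a local stand-in for BigQuery; the table and
# column names loosely mirror the searchdata_url_impression export.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE searchdata_url_impression
               (data_date TEXT, url TEXT, query TEXT,
                impressions INTEGER, clicks INTEGER)""")
con.executemany(
    "INSERT INTO searchdata_url_impression VALUES (?, ?, ?, ?, ?)",
    [("2023-11-01", "/blog/a", "ga4 tips", 1000, 50),
     ("2023-11-01", "/blog/b", "ga4 tips", 400, 4),
     ("2023-11-02", "/blog/a", "bigquery export", 600, 30)],
)

# Custom formula: click-through rate per URL, aggregated across days.
rows = con.execute("""
    SELECT url,
           SUM(clicks) * 1.0 / SUM(impressions) AS ctr
    FROM searchdata_url_impression
    GROUP BY url
    ORDER BY ctr DESC
""").fetchall()

for url, ctr in rows:
    print(url, round(ctr, 3))  # /blog/a 0.05, then /blog/b 0.01
```

The same GROUP BY pattern extends naturally to segmenting by query, country, device, or date partition once the real export is in place.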

Anonymized Queries

BigQuery deals with anonymized queries differently from other ETL vendors that access the data via the API.

It aggregates all the metrics for the anonymized queries per site/URL per day.

It doesn’t just omit the rows, which helps analysts get complete sums of impressions and clicks when you aggregate the data.
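
The difference can be sketched like this, with invented rows standing in for a day’s worth of query data:

```python
# Sketch: BigQuery keeps anonymized queries as rows with a NULL query value
# instead of dropping them, so daily totals still add up. Figures are invented.
rows = [
    {"query": "ga4 tutorial", "impressions": 500, "clicks": 40},
    {"query": "rare personal search", "impressions": 3, "clicks": 1},  # too rare: anonymized
]

# API-style export: the rare-query row is simply omitted.
api_rows = [r for r in rows if r["query"] != "rare personal search"]
api_total = sum(r["impressions"] for r in api_rows)

# BigQuery-style export: the row survives with its query set to NULL (None here).
bq_rows = [dict(r, query=None) if r["query"] == "rare personal search" else r
           for r in rows]
bq_total = sum(r["impressions"] for r in bq_rows)

print(api_total, bq_total)  # 500 vs. 503: the BigQuery sums stay complete
```

In practice the gap between the two totals grows with the share of long-tail queries, which is why aggregate metrics from API-based tools can undercount.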

What’s The Catch?

Unfortunately, no tool or solution is perfect. This new built-in integration has some drawbacks. Here are the main ones:

It Means Developing Expertise Beyond SEO

You should get familiar with Google Cloud Platform, BigQuery, and SQL on top of your GSC knowledge.

Starting a bulk data export entails carrying out tasks in both GSC and Google Cloud.

An SQL-Based Platform Requiring Specific Expertise

With BigQuery, you need SQL to access and make the most of your data.

You therefore need to write SQL queries or have someone in-house do it for you.

The platform also has its own way of functioning.

Using it efficiently requires knowing how to use it, which requires time and experience.

While Looker Studio does allow SQL-like data manipulation, it may not offer the full power and flexibility of SQL for complex analyses.

API data would need to be further processed to achieve similar results.

URL Impressions Contain More Anonymized Queries

“One thing to be mindful of is the difference in anonymized query volume between the searchdata_url_impression table and the searchdata_site_impression table.

Like the GSC interface, some queries for particular URLs in particular countries might be so infrequent that they could potentially identify the searcher.

As a result, you’ll see a greater portion of anonymized queries in your searchdata_url_impression table than in your searchdata_site_impression table.” Source: Trevor Fox.

Potential Costs

Even though this feature is free for now, it may not stay that way forever.

BigQuery is billed based on the amount of data stored in a project and the queries that you run.

The solution has free-tier thresholds beyond which you start paying each month.

Over time, it might then become costly – but that depends on the amount of data exported (websites with many pages and queries will probably be heavier in that regard) and the queries you run to access and manipulate it.
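To keep an eye on storage, you can check how large the export dataset has grown using BigQuery’s legacy __TABLES__ metadata view (again, "your-project" and the default dataset name are placeholders):

```sql
-- Approximate storage used by each export table, in gigabytes.
SELECT
  table_id,
  ROUND(size_bytes / POW(1024, 3), 2) AS size_gb
FROM `your-project.searchconsole.__TABLES__`
ORDER BY size_bytes DESC;
```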

How To Get Your GSC Data In BigQuery

1. Create A Google Cloud Project With BigQuery And Billing Enabled

The first step is to create a project in Google Cloud with BigQuery and billing enabled.

Access the Console. On the top left, click on the project you are currently in (or Select a project if you have none); this will open a popup.

Click on NEW PROJECT and follow the steps. Be careful when you choose the region because you will have to pick the same one when you set up the bulk export in the GSC.

This point is rarely mentioned: if you wish to query two datasets together, such as GSC and GA4 data, they need to be in the same region.

Sarah Crooke, BigQuery consultant at Melorium, Australia, explains:

“For some areas like Europe and North America, you can query across the wider continental region, but in places like Australia you can’t query across Melbourne and Sydney. Both datasets need to be in the exact same location.”

Once the project is created, go to the Billing section. Use the search bar at the top to find it. Google Cloud does not have the most user-friendly interface without the search bar.

You need to create a billing account. Piece of advice before you proceed: Take the time to investigate if you don’t already have a billing account set up by someone else in the company.

Once that’s done, you can assign the billing account to your project. You need a billing account in order to set up the bulk export.

Please follow the instructions provided by the Google Cloud documentation to do so.

Then, you need to go to the APIs & Services section (again, you can use the search bar to find it).

Look for the BigQuery API and enable it for the project you created.

One more step: You need to add a user. This will enable Google Search Console to dump the data in BigQuery. Here is the official documentation to do this.

Let’s break it down quickly: 

  • Navigate in the sidebar to IAM and Admin. The page should say Permissions for your project.
  • Click + GRANT ACCESS.
  • It will open a panel with Add principals.
  • In New Principals, put search-console-data-export@system.gserviceaccount.com
  • Select two roles: BigQuery Job User and BigQuery Data Editor. You can use the search bar to find them.
  • Save.
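If you prefer the command line, the same setup can be sketched with the gcloud CLI. This assumes gcloud is installed and authenticated; replace PROJECT_ID with your own project ID.

```shell
# Enable the BigQuery API for the project.
gcloud services enable bigquery.googleapis.com --project=PROJECT_ID

# Grant the Search Console export service account the two required roles.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:search-console-data-export@system.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:search-console-data-export@system.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"
```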

Lastly, select your project and copy the Cloud project ID associated with it.

You’re done in Google Cloud!

2. Set Up The Bulk Data Export In The GSC Property Of Your Choice

Once the Google Cloud part is completed, you will need to activate the bulk data export to your new Google Cloud project directly in the Google Search Console.

To do so, go to the Settings section of the property you want to export data from and click on Bulk data export.

Paste the Cloud project ID of the project you created before. You can also customize the name of the dataset that the GSC will create in your project (it is “searchconsole” by default).

Lastly, pick the same dataset location that you chose for your Google Cloud project.

Once you are all set, click on Continue. The GSC will let you know if this initial setup is functional or not. The dataset will also be created in your project.

The data exports will start up to 48 hours later.

They are daily and include the data for the day of the setup. While the API can be set up to do scheduled pulls, that often requires additional programming.

This is why the bulk data export works well for many big websites.

Keep in mind that the GSC can run into data export issues after this initial setup, in which case it is supposed to retry an export the following day.

We recommend you query your data in the first days to check if it is being stored properly.
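A quick sanity check along these lines confirms that daily partitions are arriving (assuming the default dataset name, with "your-project" as a placeholder):

```sql
-- Confirm the last seven daily exports have landed.
SELECT data_date, COUNT(*) AS row_count
FROM `your-project.searchconsole.searchdata_site_impression`
GROUP BY data_date
ORDER BY data_date DESC
LIMIT 7;
```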

So, What Next?

You can get started querying data now! Here are some analyses that are hard to do any other way:

  • Query multiple pages at once: In BigQuery, you can run a single SQL query to get metrics for all pages (or a subset of pages) without having to click through each one individually.
  • Traffic seasonality report: Compare performance metrics by season to identify trends and optimize campaigns accordingly.
  • Bulk analysis across multiple sites: If you manage a brand with more than one website, this allows you to look at clicks across all these sites at once.
  • Click-through rate (CTR) by page and query: Instead of just looking at the average CTR, you could calculate the CTR for each individual page and search query.
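As an example of the last point, this sketch calculates CTR per page and query over the last 28 days from the URL-level table (anonymized rows carry a NULL query, so they are filtered out; "your-project" is a placeholder):

```sql
-- CTR per page and query over the last 28 days.
SELECT
  url,
  query,
  SUM(clicks) AS clicks,
  SUM(impressions) AS impressions,
  SAFE_DIVIDE(SUM(clicks), SUM(impressions)) AS ctr
FROM `your-project.searchconsole.searchdata_url_impression`
WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY)
  AND query IS NOT NULL
GROUP BY url, query
ORDER BY impressions DESC
LIMIT 100;
```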

In Summary

The built-in bulk data export feature from Google Search Console to Google’s BigQuery offers a more robust solution for data analytics in SEO.

However, there are limitations, such as the need to develop expertise in Google Cloud and SQL, and potential costs associated with BigQuery storage and queries.

Featured Image: Suvit Topaiboon/Shutterstock