Google Universal Analytics 360 Sunsetting Soon: Migration Tips & Top Alternative Inside via @sejournal, @PiwikPro

This post was sponsored by Piwik PRO. The opinions expressed in this article are the sponsor’s own.

This year, Google will finally phase out Universal Analytics 360, requiring paid users to switch to Google Analytics 360.

This is not something you can skip or postpone, and the clock is ticking.

The new analytics differ significantly from the previous version, and you can’t migrate data between them, so the transition can be challenging for organizations.

Since you’ll be starting from scratch, now is a good time to explore other options and determine if there are better solutions for your needs.

The three main areas to consider when deciding if you want to stay with Google or move to another platform are: the migration process, privacy and compliance, and ease of use.

When Is Google Universal Analytics 360 Sunsetting?

July 1, 2024 is when Google will phase out Universal Analytics 360.

What Should I Do Next?

Google encourages you to migrate to Google Analytics 360 as quickly as possible.

If you don’t, you could:

  • Lose critical advertising capabilities.
  • Lose the ability to export historical data.
  • Face delays in setting up Google Analytics 360.

    How To Migrate To Your Next Analytics Platform

    Moving to a new platform is much more than just implementation; it is vital to plan your migration properly. Below are five steps to help you through the entire process.

    Step 1. Evaluate Your Stack & Resources

    Before you switch analytics tools, take the time to evaluate your entire stack, not just the tool you’re changing. Ensure that your stack is up-to-date and meets your current business needs. Migrating to a new analytics vendor almost always requires more people and more time than originally estimated. It’s a good occasion to remove redundant tools from your stack; it might also allow you to integrate with new ones that can help you run your analytics and collect data more comprehensively.

    Step 2. Tidy Your Data

    Over time, data collection may get messy, and you find yourself tracking data that isn’t relevant to your business. A migration gives you a chance to clean up your data taxonomy. Ensure that your new tool allows you to use the same categories of data as the previous one. Pay close attention to any data that needs to be collected automatically, like location data (country, region, city), and device details (device type, browser). Finally, make sure the SDKs you need are supported by your new tool.

    Step 3. Implement A New Platform

    This step involves setting up the tracking code that collects data about visitors to your website or app and making any necessary modifications. Remember to set up tags to gather more detailed data through events or connect third-party tools.

    Speed Up The Transition: If you switch to Piwik PRO, you can use a migration tool to easily transfer your settings from Universal Analytics (GA3) and Google Tag Manager.

Step 4. Evaluate Your New Data

Once you’re done implementing your new platform, you should run it in parallel with your existing tool for a few months before finalizing the migration. During this time, you can audit your new data and correct any errors. In this manner, you can retain your historical data while simultaneously generating new data segments on the new platform.

    Step 5. Provide Training For Your Team

    All end users need training to comprehend the platform’s operations, retrieve necessary data, and generate reports. This step is frequently missed as it falls at the end of the project.

    Upon finishing this step, you will be set to switch to your new platform fully. If you find the migration process challenging, consider getting help from outside sources. Some analytics vendors offer hands-on onboarding and user training, which accelerates product adoption.

    Is Switching To Google Analytics 360 Worth The Hassle?

    You might be thinking, “Migrating to the successor of UA 360 won’t be a walk in the park,” especially if you work for a large organization.

    In addition to subscription and data migration costs, you may also need to train your staff or increase fees for external marketing agencies that will face new challenges.

    While Analytics 360 has incredible use cases, there may be other tools that better suit your needs.

    Switching to alternative solutions may be a good option for you.

    How To Pick A Replacement For Universal Analytics 360

    To decide whether to choose a new platform or stick with Google, consider a few important factors:

1. Because GA 360 is a different piece of software, your marketing and analytics departments will need to allocate extra resources to learn the new platform. You will also need the support of analysts, developers, and data architects to help you reconstruct reports based on the data architecture of the chosen platform. Choosing a solution with features and a user experience similar to UA 360 can be a good option, because it saves resources and makes onboarding faster and easier.
2. You will also need to redesign your entire customer journey, because the data model in GA 360 has changed from sessions to events. This process can be more challenging and costly than choosing a session-based platform or one that lets you use both models.
    3. Another important consideration is the level of support offered by the vendor. This can greatly affect the quality of the migration and onboarding to a new platform. Although Google Analytics is currently the most popular tool for analyzing web traffic, the level of support it provides is limited. Other companies like Piwik PRO can offer more in this area, including personalized onboarding, product implementation, training, and dedicated customer support at every step.

    Consideration 1: Think About Privacy & Compliance

Organizations around the world are increasingly concerned with data privacy and compliance. A 2023 Thomson Reuters survey found that 80% of business professionals acknowledge the importance of compliance as a crucial advisory function for their organizations. Gartner, meanwhile, predicts that by 2025, 60% of large enterprises will use at least one privacy-enhancing computation (PEC) technique in analytics, business intelligence, and/or cloud computing.

    This is due to a growing number of new regulations that place greater control over personal data at the forefront. The EU’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) are two of the most prominent examples. The landscape has been further complicated by events such as the Schrems II case, Brexit, and China’s Personal Data Protection Law. Data protection is also increasingly important in some sectors, such as healthcare, where regulations like HIPAA are mandatory.

If your company operates globally or has ambitions to do so, the first thing to consider is who has full ownership of the data, where the servers hosting the data are located, and who owns them. Google Analytics 360 only offers cloud deployment in an unspecified location, which means that data might be transferred between data centers in the Americas, Europe, and Asia. This makes it difficult to know exactly where the data is stored, and ownership is unclear. For now, the issue of data transfers between the US and the EU has been addressed by the EU-US Data Privacy Framework, but the future remains unclear. Last year, NOYB, led by Max Schrems, announced that it would challenge this decision before the Court of Justice of the European Union (CJEU).

    To meet privacy and compliance requirements in different countries and industries, choose a platform that allows you to customize your hosting plan and set specific parameters for data collection and analysis. Platforms like Piwik PRO Analytics Suite enable you to store your data on servers in Europe, the US, and Asia, based on your preferences. This translates into flexibility and security of your data.

    Consideration 2: Ease Of Use & Integration

    This may sound counterintuitive, but the new GA 360 might be too complex for many. While it offers numerous advanced functions for data analysts, it lacks features specifically designed for marketers. As a result, marketers may need help in configuring the system to efficiently use the data.

In addition, the data model in GA 360 shifts from session-based to event-based. This is especially important if your teams depend on UA 360 behavioral reporting, benchmarking, and e-commerce flow reports, as these features are unavailable in the new release. You will also need to revise all the reports for all the stakeholders.

    Conversely, Piwik PRO strongly emphasizes simplicity and enables marketers to quickly access the necessary data. Additionally, the data model combines both session-based and event-based structures. This approach ensures that you can start working with the data faster and deliver the reports that stakeholders are used to. Another big advantage of Piwik PRO is its model for working with raw data, which is a valuable source of knowledge about users and provides richer reporting in more contexts. Google Analytics does not provide raw data exports, so you have to use various services and tools to accomplish this. To be fair, however, exporting large raw data packets with Piwik PRO software may take longer than with Google solutions.

    The new GA 360 is most effective when used mainly with products from the Google ecosystem. When considering data activation, Google Ads is the most suitable option. When it comes to Piwik PRO, you still have this option, but integrating with other solutions is much easier. The platform offers four modules: Analytics, Tag Manager, Consent Manager and Customer Data Platform (CDP). The CDP module, available in the paid plan, lets you create detailed customer profiles and categorize your data into various audience segments. You can activate them to provide a personalized experience and run effective campaigns across multiple channels.

    The landscape of modern analytics is constantly changing. On the one hand, there are ongoing discussions about privacy and compliance regulations, while on the other, companies are trying out various methods to collect and analyze data. In the end, your choice of analytics platform will impact the performance of your marketing and sales efforts. So take the time to explore, and you may find other solutions that better suit your organization’s needs.

    Piwik PRO is a solid choice to explore for your next primary analytics solution. Book a personalized demo of the Enterprise version and see the benefits of introducing Piwik PRO Analytics Suite in your organization.


    Image Credits

Featured Image: Image by Piwik PRO. Used with permission.

    Charts: U.S. Manufacturing Trends Q1 2024

The Institute for Supply Management is a leading not-for-profit global procurement organization. Founded in 1915, ISM has 50,000 members in 100 countries who manage about $1 trillion annually in corporate and government supply chain procurement.

    The monthly “ISM Report On Manufacturing” is among the most reliable economic indicators for supply management professionals, economists, analysts, and government and business leaders.

    The report includes the closely watched purchasing managers index — “Manufacturing PMI” — based on data compiled from purchasing and supply executives. Participants report activity in multiple categories, such as new orders, inventories, and production. Those indices are then combined to create the PMI. A PMI above 50 designates an overall expansion of the manufacturing economy, whereas below 50 is a contraction.

    In March 2024, U.S. manufacturing activity expanded for the first time in 16 months, with ISM’s Manufacturing PMI at 50.30, up from 47.80 in February and 46.30 one year ago.

    ISM’s data also shows new U.S. manufacturing orders in February 2024 were $576.76 billion, up from $568.57 billion in January and $571.26 billion year over year — an increase of 1.44% and 0.96%, respectively.

ISM’s March 2024 U.S. Employment Index for the manufacturing sector was 47.40, still in contraction territory but declining more slowly than February’s 45.90 and the 46.90 reading one year ago.

    Act Now: Your Ads Measurement & Privacy Readiness Plan For 2024 & Beyond via @sejournal, @adsliaison

    The PPC ecosystem is about to undergo significant changes driven by regulation.

    With regulation updates such as regional consent requirements and Chrome’s deprecation of third-party cookies later this year (see the timeline from Chrome), as well as other shifts such as Apple’s App Tracking Transparency (ATT) policy and cross-device customer journeys, the amount of visible data available to marketers is on the decline.

    With that, Google’s ad measurement products and the ecosystem as a whole must evolve to meet this moment and be positioned for the next era.

    So much change can feel overwhelming, but with a solid plan, you’ll be ready.

    When I joined Google in early 2021, it was clear that regulatory and privacy changes and AI advancements would be key focus areas for marketers over the next several years. Fast-forward three years, and we’re now at the inflection point.

    In this article, we’ll walk through the big pieces to put in place to ensure your measurement capabilities continue now and in the years ahead.

    Preparation Is Key

    AI has been playing a critical role by enabling predictive and analytical capabilities and filling measurement gaps where data is not available.

    AI-powered conversion modeling, for example, is essential for maintaining measurement, campaign optimization, and improved bidding capabilities.

As I wrote last year about GA4, for example, these shifts were a major driver for developing a measurement platform that can account for less data being observable via third-party cookies and more data being aggregated to protect user anonymity.

    Many marketers are still deeply reliant on third-party cookies.

    As our products have evolved, there are important actions you should take now to ensure you’re taking advantage of new capabilities designed to help you maintain ad measurement in 2024 and beyond.

    Let’s dive in.

    Sitewide Tagging

    This may sound basic, but the very first step you should take is to implement sitewide tagging with either the Google Tag or Google Tag Manager.

    And if you have tagging set up, do a double-check to ensure it’s implemented correctly and collecting the data you need to measure conversions. Here’s how to get started with conversion tracking.

    To check that you are tracking conversions correctly, check the “Status” column for each of your conversion actions in the summary table (Goals > Conversions > Summary). You can then troubleshoot if you think there may be problems – Tag Assistant is also a helpful tool for this.

    Once your tagging is implemented fully, you have your measurement foundation in place and can start building on top of it. Which leads me to…

First-Party Data

    For years, discussions have been going on about the growing importance of first-party data – consented information you have collected directly from your visitors and customers – as a key part of building a durable measurement plan.

    The need to focus on building your first-party data strategy may still have felt abstract, but with the deprecation of third-party cookies and less observable data, first-party data is what will power your advertising strategy in this new landscape.

    Of course, better ads measurement is just one reason to have a first-party data strategy. When thinking about your first-party data plan, it’s important to start with a customer-centric point of view.

    What’s the value exchange you’ll be able to deliver for your customers?

    It could be early access to new products or services, special discounts, bonus content, loyalty rewards, or other offers that can help you build stronger customer relationships, improve customer lifetime value, and grow your customer base.

    We’ve discussed how enhanced conversions for leads uses first-party data.

Additionally, you can connect CRM and customer data platform (CDP) tools with Google Ads, Google Analytics 4, Campaign Manager 360, and Search Ads 360.

    First-party audience lists like Customer Match can help improve audience modeling, expansion, and remarketing. Working with a Customer Match partner can make this process simpler.

    Additionally, we introduced Google Ads Data Manager last year to make it much easier to connect and use your first-party data, including Customer Match lists, offline conversions and leads, store sales, and app data.

    It’s continuing to roll out and will reach general availability this quarter. You’ll be able to access it in a new “Data manager” section within “Tools” when it becomes available in your account.

    When you connect your customer and product data to Google’s advertising and measurement tools, you’ll have a more holistic view of the impact of your advertising.

    This is also where AI comes in to enable conversion modeling, predictive targeting, and analytics solutions, even when user-level data isn’t available.

    Enhanced Conversions

    Enhanced conversions is an increasingly important feature as the privacy landscape evolves.

It can help provide a more accurate, aggregated view of how people convert after engaging with your ads (including post-view and cross-device conversions) than is possible with site tagging alone.

    Enhanced conversions work by sending hashed, user-provided data from your website to Google, which is then matched to signed-in Google accounts. Sales originating from Google Search and YouTube can then be attributed to ads in a privacy-safe way.

Supplementing your existing conversion tags with more observable data also strengthens conversion modeling. It provides more comprehensive data for measuring conversion lift, understanding the incremental impact of your advertising, and better informing Smart Bidding.

    There are two flavors of enhanced conversions:

    Enhanced Conversions For Web In Google Ads And GA4

    Already available in Google Ads, we recently rolled out support for enhanced conversions for web in GA4 as well.

    An advantage of implementing enhanced conversions in GA4 rather than only in Google Ads is that user-provided data can be used for additional purposes (such as demographics and interests, as well as paid and organic measurement).

    Wondering if you should set up enhanced conversions in one or both? Here’s some guidance:

    • If you are using Google Ads conversion actions, you should use Google Ads enhanced conversions.
• If you’re using GA4 for cross-channel conversion measurement, you should use Google Analytics enhanced conversions.
• If you’re doing both, you can opt to set them both up on the same property. However, you need to be aware of which one you are bidding to and including in your conversion counts to avoid double-counting conversions. Be sure your Google Ads conversion tracking setup only includes the appropriate conversions in the Conversions column. In other words, be sure you’re not including the same action from both Ads and GA4.

    You’ll find details on setting up enhanced conversions in GA4 and/or Google Ads here.

    Enhanced Conversions For Leads

    If you’re tracking offline conversions, enhanced conversions for leads in Google Ads enable you to upload or import conversion data into Google Ads using first-party customer data from your website lead forms.

If you’re using offline conversion imports (OCI) to measure offline leads (i.e., lead gen), we recommend upgrading to enhanced conversions for leads.

    Unlike OCI, with enhanced conversions for leads, you don’t need to modify your lead forms or CRM to receive a Google Click ID (GCLID).

    Instead, enhanced conversions for leads uses information already captured about your leads – like email addresses – to measure conversions in a way that protects user privacy.

It’s also easy to set up with the Google Tag, Google Tag Manager, or via the API if you want additional flexibility. It can then be configured right from within your Google Ads account.

    Learn more about enhanced conversions for leads here. Note there are policy requirements and restrictions for using enhanced conversions.

    Consent Mode

    The accuracy of conversion measurement can also be improved with consent mode.

    Consent choice requirements are part of regulatory changes and evolving privacy expectations (your legal and/or privacy teams can provide further guidance). Consent mode is the mechanism for passing your users’ consent choices to Google.

    Consent mode has become especially relevant for advertisers with end-users in the European Economic Area (EEA) and the UK as Google strengthens enforcement of its EU user consent policy in March.

    As part of this, consent mode (v2) now includes two new parameters – ad_user_data and ad_personalization – to send consent signals for ad personalization and remarketing purposes to Google.

    You can find more details on consent mode v2 here. The simplest way to implement consent mode is to work with a Google CMP Partner.

    If you have consent mode implemented but don’t update to v2, you will not have the option to remarket/personalize ads to these audiences in the future. To retain measurement for these audiences, you should implement consent mode by the end of 2024.

    Consent mode also enables conversion tracking when consent is provided and conversion modeling when users don’t consent to ads or analytics cookies.

    In Google Ads, when conversion modeling becomes available after you’ve met the thresholds, you’ll be able to view your conversion modeling uplift on “domain x country level” in the conversion Diagnostics tab.

    You may have seen a notification in Google Ads asking you to check your consent settings. This message will appear to all customers globally to alert you to the new Google Services selection in your account and to check your settings.

    We recommend all relevant Google services be configured to receive data labeled with consent to maintain campaign performance.

    Conversion Modeling

    Conversion modeling has long been used in Google’s measurement solutions and is increasingly important with the deprecation of individual identifiers like cookies on the web and device IDs in apps.

    Additionally, Google privacy policies prohibit the use of fingerprinting and other tactics that use heuristics to identify and track individual users.

    How it works:

    Google’s conversion modeling uses AI/machine learning trained on a set of observable data sources – including first-party data; data from platform APIs like Apple’s SKAdNetwork and Chrome’s Privacy Sandbox Attribution Reporting API; and data sets of users similar to those interacting with your ads – to help fill in the gaps when those signals are missing.

    Conversions are categorized as “observable” (conversions that can be tied directly to an ad interaction) and “unobservable” (conversions that can’t be directly linked to specific ad interactions).

We then identify an observable group of conversions with similar behaviors and characteristics (again, based on a diverse set of observable data sources noted above) and train the campaign model to arrive at a total number of conversions made by all users who interacted with your ad.

    To validate model accuracy, we apply the conversion models to a portion of traffic that’s held back.

    We then compare modeled and actual observed conversions from this traffic to check that there are no significant discrepancies and ensure our models can correctly quantify the number of conversions that took place on each campaign channel.

    This information is also used to tune the models. You can read more about how conversion modeling works here.

    You’ll find modeled data in your conversions and cross-device conversions reporting columns.

    How To Improve Your Conversion Modeling

    This is where everything we’ve discussed so far comes together! 

    The following steps will ensure you’re capturing as many “observable” conversions as possible. This will provide a more solid foundation for your conversion modeling.

    The first step to improving your conversion modeling, no surprise, is to be sure your conversion tracking is set up properly with Google Tag or Google Tag Manager.

Next, implement enhanced conversions for web. For conversions affected by Apple’s Intelligent Tracking Prevention (ITP), enhanced conversions help advertisers recover up to 15% additional conversions compared to advertisers who haven’t implemented enhanced conversions.

    Advertisers who implement enhanced conversions also see a conversion uplift of 17% on YouTube and a 3.5% impact on Search bidding.

    Then, consider using consent mode. Again, this is particularly relevant for advertisers in the EEA, UK, and CH regions whose measurement is affected by the ePrivacy Directive.

    Additionally, for app developers, on-device conversion measurement helps increase the number of observable app install or in-app conversions from your iOS App campaigns in a privacy-centric manner.

Data-driven attribution looks at all of your ad interactions account-wide and compares the paths of customers who convert to those of users who don’t convert to identify conversion patterns. It identifies the steps in the journey that have a higher predictability of leading to a conversion. The model then gives more credit to those ad interactions.

    Each data-driven model is specific to each advertiser. Those who switch to a data-driven attribution model from a non-data-driven one typically see a 6% average increase in conversions.

    That additional conversion data also helps inform Smart Bidding.

    GA4 properties began including paid and organic channel-modeled conversions around the end of July 2021.

    Reports such as the Event, Conversions, and Attribution reports and Explorations will include modeled data and automatically attribute conversion events across channels based on a mix of observed data where possible and modeled data where necessary.

    Marketing Mix Modeling

    With the loss of visible event-level data, many CMOs are also taking a fresh look at aggregated measurement methods such as marketing mix modeling (MMM).

    While MMMs aren’t new, they are privacy-friendly and have become increasingly accessible for companies with robust first-party data strategies.

    This month, we introduced an open-source MMM called Meridian to help advertisers get a more holistic picture across channels.

    By open-sourcing the model, advertisers can choose to use the MMM solution as it is, build on top of it, or use whichever pieces they find most useful.

    It’s launching with three primary methodologies to help marketers:

    • Get better video measurement by modeling reach and frequency in MMMs.
• Improve lower-funnel measurement by accounting for organic search volume.
    • Calibrate MMMs for accuracy by integrating incrementality experiments across channels.

    Meridian is currently in closed beta, but all eligible non-Meridian MMM users can now review and use any of these three methodologies in their own models.

    Take Action Now

    Now is the time to ensure you have an action plan for durable, future-proof, privacy-first measurement.

    I know these may sound like a bunch of buzzwords, but the aim is to have a plan that will prepare you for third-party cookie deprecation and can evolve with future changes.

    More resources:


    Featured Image: Photon photo/Shutterstock

    21 AI Use Cases For Turning Inbound Calls Into Marketing Data [+Prompts] via @sejournal, @calltrac

    This post was sponsored by CallTrackingMetrics. The opinions expressed in this article are the sponsor’s own.

    If you’ve been enjoying having random conversations with ChatGPT, or trying your hand at tricking a car dealership chatbot into giving you a new car for $1, just wait until you start using safe AI professionally.

    Marketers are finding lots of ways to use generative AI for things like SEO research, copywriting, and summarizing survey results.

    But one of the most natural and safe fits for AI is marketing data discovery during conversational call tracking.

    Don’t believe us?

    Here are a ton of AI marketing use cases that make perfect sense for your teams to start using.

    A Quick Call Tracking Definition

    Call tracking is the act of using unique phone numbers to tie a conversation to its marketing source, and collect other caller data, such as:

    • Location of caller.
    • New or returning caller.
    • Website activity associated with the caller.

    It can help attribute sales to:

    • Best performing marketing materials.
    • Best performing local website landing pages.
    • Best performing PPC campaigns.

    Manually tracking and analyzing each conversation can take hours, and often, important nuances are missed.

    This is where AI can help speed up marketing insight discovery and automatically update contact and sales pipelines.

    All you need is a prompt.

    What Prompt Or Quick Recipe Can I Use To Get AI Insights From Call Tracking?

    Your automatically logged call transcriptions + an AI prompt = automated conversation intelligence.

    Once you have this setup configured, you can drastically speed up your first-party data collection.

To get more specific, prompts have two main parts: the question you want answered, and how you want AI to answer it. As an example:

    The question: What prompted the Caller to reach out?

    The prompt [how should AI answer]: You are a helpful Sales agent responsible for identifying what marketing channel prompted the contact to call. If the contact did not identify what prompted their call please only respond with “None”.

    Below are some example responses on what a contact might say:

    • Podcast ad.
    • Social post.
    • Friend or family recommendation.
    • Stopped by event booth.
    • Read reviews online.

    1 – 18. How To Use AI To Update Customer Contact Fields

    Starting off boring, but powerful: Generative AI can take your customer conversations and automate data entry tasks, such as updating caller profiles to keep them relevant and qualified.

Image created by CallTrackingMetrics, March 2024

    Impressive? No.

    But the time savings add up quickly, and let your team work on the things they like (that make the company money) instead of manually filling out wrap-up panels after a call.

    What Contact Information Can AI Automatically Update?

    1. Name – You’re going to get a name from caller ID which is a great start, but is it the name your caller prefers? Is it up to date or is it still the name of a former customer who left their company to chase their dreams? With a quick AI prompt, you can make sure you’re greeting the right person when they call back.
    2. Email Address – It might be a default value for form submissions, but getting an email address from a caller can take a lot of back and forth. AI isn’t going to ask for that last part again, or require you to read it back to them to verify. It’s just going to do it.
    3. Company Name – You might be using a sales intelligence tool like ZoomInfo to pull this kind of thing from a database. Still, you might also enjoy the accuracy of extracting directly from the words of your prospect.
    4. Buyer Role – Maybe not a basic field, but one AI can fill out nonetheless (much like other custom fields below!). Give your AI a list to choose from like a researcher, influencer, or decision maker. Sure would be nice to know how much influence they actually have without having to ask directly.

    Can AI Automatically Tag Conversations In My CRM?

    Of course!

    In CRMs and sales enablement tools, tags are used to categorize and segment your conversations for further analysis or follow-up.

Some popular tags for call tracking are marking someone as a new or returning caller.

    You can set a tag manually. You can set a tag using an if/then trigger. And because of what this whole thing is about, you can update tags using AI.

Image created by CallTrackingMetrics, March 2024

    Use AI to automatically add tags to your prospect’s profile, based on their actual calls.

    1. Spam – Sure, you can mark something spam yourself, but why not let AI do it for you so you can move on to real work?
    2. Product Tags – What was the caller asking about? Add product tags to calls for further analysis, or to jump right into the sales pitch when they call back.
    3. Lifecycle Tags – Have AI examine what kinds of questions your prospect is asking and qualify them along a scale of just learning to ready to buy. Or even, mark them as an existing customer.
    4. Target Account – Did the caller mention their company size? Maybe you asked them about revenue or tech stack. If you let AI know what your ideal customer looks like, it’ll help you quickly identify them when you’re talking to one.

    Can Generative AI Score Leads In My CRM?

    Yes! However, if 100% of your calls end in sales, skip this part.

    For the rest of us, phone, text, and chat leads range from “never going to buy anything” to “ready to give you my credit card info.”

    You need a way to gauge which leads are closer to “ready.” This is where lead scoring comes in.

Image created by CallTrackingMetrics, March 2024

    While there are lots of ways to score your conversations, you can use AI to sift through the transcription and qualify a lead for you.

    For call scoring, this often looks like a score of 1 to 5.

    So, here are a few examples of how AI can automatically score your leads from transcripts and chat logs.

    1. Readiness to Buy – The most classic approach to scoring is asking, “How likely is this lead to buy?” A score of 1 is unqualified, and a score of 5 is they’re already paying us.
    2. Ideal Customer Fit – Just like adding a target account tag above, train your AI on what a good customer looks like, and it can also give you a score. How closely does this caller fit your ideal profile?
    3. Coaching – Not everything has to be about the lead. Sometimes we want to grade our own team. How well did your sales team stick to the script? Were they friendly? Let AI roll it up into a score for you.
    4. Follow-up Priority – Aggregate readiness to buy, customer fit, and other inputs to decide on how aggressively to follow up with your leads.

    Can Generative AI Capture & Update Custom Fields From Phone Calls & Chat Logs?

    Your company is likely not the same as every other company using call tracking to get customer insights.

    You’ll want some flexibility to determine what’s important to you, not what your call-tracking provider has determined to be important.

    With custom fields, you get to put your creativity and strategy together with AI’s scalability to automate pretty much anything.

Image created by CallTrackingMetrics, March 2024

    AI can accurately assess and notate:

    1. Product Familiarity – You’ve tagged a call with a product name, but how much time do you need to spend educating the prospect vs. selling them?
    2. Related Products – What else could you be selling this person?
    3. Appointments – If your team runs on appointments or demos, having an AI add a calendar date to a custom field opens up a world of automated possibilities.
    4. Next Steps – Follow up with an email, a call, or an appointment confirmation text. Have AI pull the best next step from your conversation.

    19 – 21. How To Use Generative AI To Take Action On Automatically Updated Sales Contacts

    Ok, so there are some time-savings when you use call tracking and AI to update fields.

    If that’s not quite exciting enough, let’s see what you can actually do with those automated fields.

Image created by CallTrackingMetrics, March 2024

    19. Automate Advertising Optimization

    Use conversion data to inform your decisions.

    Throw AI into the mix, and you go from A to optimized without lifting a finger.

    How?

The tags and fields your AI just updated become qualifiers to send only the signals that matter to your business over to platforms like Google Ads, where their machine learning will go wild to find more of the same. Where you might have been stuck sending a simple conversion (like any call with talk time over 90 seconds), you can now send only conversions that have a readiness-to-buy score of three or better, plus a product tag.

    20. Better Personalization In Your CRM

To kick things off, your AI automatically scraped the conversation for an email address, so now you can add a new contact to an email-centric tool like HubSpot immediately at the end of the conversation.

Have you updated product tags? Use that as a great trigger to enroll them in a highly relevant email drip.

    Feed your call scores and product tags into your CRM’s lead scoring system and add complexity to a usually surface-level approach. Or do something as easy as sync their company name to their record so you can personalize outreach.

    21. Following Up & Closing Deals

You’re not having AI fill out custom fields for fun; you’re doing it to make your job easier.

    And one of your primary jobs is following up after a conversation to get someone closer to purchasing.

    Agreed on a time for your next meeting? Send that date field to your favorite scheduling tool and get a calendar invite in their inbox. Or maybe you had a softer “call me next week” agreement? Use that to send the caller to an outbound dialer that’s set to call as soon as you log in the next week.

    How To Use AI For Analyzing Calls

    Moving beyond data entry, when you give AI a call transcription to work with, it can pull out insights to help your team get better.

    In the time it would take you to read through one eight-minute phone conversation, AI has analyzed your whole day’s worth of calls and is off taking whatever the robot equivalent of a coffee break is.

    What can AI do to upgrade your conversation intelligence? Unfortunately, after 16 use cases, we’re bumping up against our word count and we’ll have to save that for part two: Another Ton of AI Use Cases for Call Tracking.


    Image Credits

Featured Image: Image by CallTrackingMetrics. Used with permission.

    Charts: Fastest Growing Ecommerce Companies 2023

    The first challenge in ranking fast-growing ecommerce companies is the definition. Should “ecommerce” include only companies that sell their own inventory? Or does it also include platforms and tech providers that serve those sellers?

    Yahoo Finance adopted the latter last month when it ranked the fastest-growing, publicly traded ecommerce companies. The list includes retailers and platforms with at least $100 million in annual revenue in fiscal 2023.

    According to Yahoo, the Chinese firm PDD Holdings Inc. (owners of Temu, the consumer marketplace) experienced the greatest annual percentage revenue increase in 2023 at 51.91%. Turkey-based D-Market Elektronik Hizmetler ve Ticaret A.S. (an electronics marketplace) was second with a growth of 36.25%.

    In terms of market capitalization (stock price times the number of outstanding shares), Amazon holds the lead by far with $1.85 trillion in March 2024.

    According to Insider Intelligence, in 2024 global retail ecommerce sales will surpass $6 trillion, accounting for approximately 20.1% of all retail sales.

    Charts: U.S. Retail Ecommerce Sales Q4 2023

    The U.S. Department of Commerce reports quarterly total domestic retail sales and ecommerce only. Newly released figures (PDF) for Q4 2023 show ecommerce sales of $285.2 billion, a growth of 0.8% over the prior quarter.

    Per the DoC, ecommerce sales are for “goods and services where the buyer places an order (or the price and terms of the sale are negotiated) over an Internet, mobile device, extranet, electronic data interchange network, electronic mail, or other comparable online system. Payment may or may not be made online.”

    The DoC’s estimated total retail sales (online and in-store) for Q4 2023 stood at $1,831.4 billion, an increase of 0.4% from Q3 2023.

    Ecommerce accounted for 15.6% of total U.S. retail sales in Q4 2023, up slightly from 15.5% in the prior quarter.

    The DoC estimates U.S. ecommerce retail sales in Q4 2023 grew by 7.5% compared to Q4 2022, while total quarterly retail sales experienced a 2.8% annual rise in the same period.

    Charts: Global Economic Outlook Q1 2024

    Global growth is projected to stand at 3.1% in 2024 and 3.2% in 2025. That’s according to the International Monetary Fund’s January 2024 “World Economic Outlook” report, subtitled “Moderating Inflation and Steady Growth Open Path to Soft Landing.”

    The IMF updates its economic outlook twice a year. The IMF’s forecasts use a “bottom-up” approach, starting with individual countries and then aggregating into overall global projections.

    According to the IMF, growth in the United States is projected to fall from 2.5% in 2023 to 2.1% in 2024 and 1.7% in 2025. The euro region is expected to rebound from its low growth rate of 0.5% in 2023, which was influenced by exposure to the conflict in Ukraine, to 0.9% in 2024 and 1.7% in 2025.

    The IMF projects growth in advanced economies will decline slightly from 1.6% in 2023 to 1.5% in 2024 before rising to 1.8% in 2025.

Meanwhile, in emerging markets and developing economies, growth is expected to remain at 4.1% in 2024 and rise to 4.2% in 2025.

    According to the IMF, the global consumer inflation rate, including food and energy, will fall from an estimated 6.8% in 2023 to 5.8% in 2024 and 4.4% in 2025.

    Get Started With GSC Queries In BigQuery

    BigQuery has a number of advantages not found with other tools when it comes to analyzing large volumes of Google Search Console (GSC) data.

    It lets you process billions of rows in seconds, enabling deep analysis across massive datasets.

    This is a step up from Google Search Console, which only allows you to export 1,000 rows of data and may have data discrepancies.

You’ve read all about why you should be using BigQuery as an SEO pro. You’ve figured out how to connect GSC to BigQuery. Data is flowing!

    Now what?

    It’s time to start querying the data. Understanding and effectively querying the data is key to gaining actionable SEO insights.

    In this article, we’ll walk through how you can get started with your queries.

    Understanding GSC Data Structure In BigQuery

    Data is organized in tables. Each table corresponds to a specific Google Search Console report. The official documentation is very extensive and clear.

    However, if you are reading this, it’s because you want to understand the context and the key elements before diving into it.

    Taking the time to figure this out means that you will be able to create better queries more efficiently while keeping the costs down.

    GSC Tables, Schema & Fields In BigQuery

    Schema is the blueprint that maps what each field (each piece of information) represents in a table.

    You have three distinct schemas presented in the official documentation because each table doesn’t necessarily hold the same type of data. Think of tables as dedicated folders that organize specific types of information.

    Each report is stored separately for clarity. You’ve got:

    • searchdata_site_impression: Contains performance data for your property aggregated by property.
    • searchdata_url_impression: Contains performance data for your property aggregated by URL.
    • exportLog: each successful export to either table is logged here.

    A few important notes on tables:

    • You’ll find in the official documentation that things don’t run the way we expect them to: “Search Console exports bulk data once per day, though not necessarily at the same time for each table.”
    • Tables are retained forever, by default, with the GSC bulk export.
• In the URL-level table (searchdata_url_impression), you also have Discover data. The field is_anonymized_discover specifies if the data row is subject to the Discover anonymization threshold (a short example follows this list).
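For instance, here is a minimal hedged sketch for pulling Discover performance by URL, assuming a placeholder project named yourproject, the default searchconsole dataset, and the lowercase search_type values listed in the schema later in this article:

-- Hedged sketch: Discover clicks and impressions by URL over the last 28 days.
-- `yourproject` is a placeholder; the dataset name defaults to "searchconsole".
SELECT
  url,
  SUM(impressions) AS impressions,
  SUM(clicks) AS clicks
FROM `yourproject.searchconsole.searchdata_url_impression`
WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY)
  AND search_type = 'discover'  -- value assumed to match the schema described below
GROUP BY url
ORDER BY clicks DESC
LIMIT 100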

    Fields are individual pieces of information, the specific type of data in a table. If this were an Excel file, we’d refer to fields as the columns in a spreadsheet.

    If we’re talking about Google Analytics, fields are metrics and dimensions. Here are key data fields available in BigQuery when you import GSC data:

    • Clicks – Number of clicks for a query.
    • Impressions – Number of times a URL was shown for a query.
    • CTR – Clickthrough rate (clicks/impressions).
    • Position – Average position for a query.

    Let’s take the searchdata_site_impression table schema as an example. It contains 10 fields:

• data_date – The day when the data in this row was generated, in Pacific Time.
• site_url – URL of the property, either sc-domain:property-name or the full URL, depending on your validation.
• query – The user’s search query.
• is_anonymized_query – If true, the query field will return null.
• country – Country from which the search query originated.
• search_type – Type of search (web, image, video, news, discover, googleNews).
• device – The device used by the user.
• impressions – The number of times a URL was shown for a particular search query.
• clicks – The number of clicks a URL received for a search query.
• sum_top_position – This calculation figures out where your website typically ranks in search results. It looks at the highest position your site reaches in different searches and calculates the average.
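To turn sum_top_position into a readable number, the export documentation describes average position as sum_top_position divided by impressions, plus 1 (the stored position is zero-based). Here is a minimal sketch under that assumption, again using a placeholder project name and the default dataset:

-- Hedged sketch: average position per query for a single day.
-- `yourproject` is a placeholder project ID; "searchconsole" is the default dataset name.
SELECT
  query,
  SUM(clicks) AS clicks,
  SUM(impressions) AS impressions,
  SUM(sum_top_position) / NULLIF(SUM(impressions), 0) + 1 AS avg_position
FROM `yourproject.searchconsole.searchdata_site_impression`
WHERE data_date = '2023-12-31'
GROUP BY query
ORDER BY impressions DESC
LIMIT 25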

    Putting It Together

    In BigQuery, the dataset for the Google Search Console (GSC) bulk export typically refers to the collection of tables that store the GSC data.

    The dataset is named “searchconsole” by default.

    BigQuery search console tables

    Unlike the performance tab in GSC, you have to write queries to ask BigQuery to return data. To do that, you need to click on the “Run a query in BigQuery” button.

Run SQL query option among three other options on the welcome screen. Screenshot from Google Cloud Console, January 2024.

    Once you do that, you should have access to the BigQuery Studio, where you will be creating your first SQL query. However, I don’t recommend you click on that button yet.

The BigQuery Studio screen where you will create your first SQL query. Screenshot of BigQuery Studio, January 2024.

In the Explorer pane, when you open your project, you will see the datasets (the icon with squares and dots in them). This is where you can check whether you have GA4 and GSC data, for instance.

    data set for search impression table

    When you click on the tables, you get access to the schema. You can see the fields to confirm this is the table you want to query.

    If you click on “QUERY” at the top of the interface, you can create your SQL query. This is better because it loads up some information you need for your query.

It will fill out the FROM clause with the proper table, set a default LIMIT, and add a date filter that you can change if you need to.

Screenshot from Google Cloud Console, January 2024.

    Getting Started With Your First Query

    The queries we are going to discuss here are simple, efficient, and low-cost.

    Disclaimer: The previous statement depends on your specific situation.

    Sadly, you cannot stay in the sandbox if you want to learn how to use BigQuery with GSC data. You must enter your billing details. If this has you freaked out, fear not; costs should be low.

    • The first 1 TiB per month of query data is free.
    • If you have a tight budget, you can set cloud billing budget alerts — you can set a BigQuery-specific alert and get notified as soon as data usage charges occur.

In SQL, the SELECT statement is the command used to retrieve data from a table: “SELECT *” returns every column, while naming specific columns returns only those.

    This statement enables you to view the entire dataset or a subset based on your selection criteria.

    A table comprises rows, each representing a unique record, and columns, storing different attributes of the data. Using “SELECT *,” you can examine all fields in a table without specifying each column individually.

    For instance, to explore a Google Search Console table for a specific day, you might employ a query like:

SELECT *
FROM `yourdata.searchconsole.searchdata_site_impression`
WHERE data_date = '2023-12-31'
LIMIT 5;

    You always need to make sure that the FROM clause specifies your searchdata_site_impression table. That’s why it is recommended to start by clicking the table first, as it automatically fills in the FROM clause with the right table.

Important: We limit the data scanned by filtering on the data_date field, which is good practice for keeping costs down. Since the tables are partitioned by date, this filter prunes partitions; the LIMIT clause only caps the number of rows returned.

    results from the first query we made shown in a table format
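One more note on cost before moving on: BigQuery scans only the columns a query references, so listing the fields you actually need instead of using SELECT * generally reduces the bytes billed. A minimal sketch, using the same placeholder table as above:

-- Hedged sketch: select only the columns you need to keep scanned bytes down.
-- The project and dataset names are placeholders.
SELECT data_date, query, clicks, impressions
FROM `yourdata.searchconsole.searchdata_site_impression`
WHERE data_date = '2023-12-31'
LIMIT 5;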

    Your First URL Impression Query

If you want to see information for each URL on your site, you’d ask BigQuery to pull information from the ‘searchdata_url_impression’ table, selecting fields such as ‘url’, ‘clicks’, and ‘impressions’.

    This is what the query would look like in the console:

SELECT
  url,
  SUM(clicks) AS clicks,
  SUM(impressions) AS impressions
FROM `yourtable.searchdata_url_impression`
WHERE data_date = '2023-12-25'
GROUP BY url
ORDER BY clicks DESC
LIMIT 100

    You always need to make sure that the FROM clause specifies your searchdata_url_impression table.

    When you export GSC data into BigQuery, the export contains partition tables. The partition is the date.

    This means that the data in BigQuery is structured in a way that allows for quick retrieval and analysis based on the date.

    That’s why the date is automatically included in the query. However, you may have no data if you select the latest date, as the data may not have been exported yet.

    Breakdown Of The Query

    In this example, we select the URL, clicks, and impressions fields for the 25th of December, 2023.

    We group the results based on each URL with the sum of clicks and impressions for each of them.

    Lastly, we order the results based on the number of clicks for each URL and limit the number of rows (URLs) to 100.

    Recreating Your Favorite GSC Report

    I recommend you read the GSC bulk data export guide. You should be using the export, so I will not be providing information about table optimization. That’s a tad bit more advanced than what we are covering here.

GSC’s performance tab shows one dimension at a time, limiting context. BigQuery allows you to combine multiple dimensions for better insights.

    Using SQL queries means you get a neat table. You don’t need to understand the ins and outs of SQL to make the best use of BigQuery.

This query is courtesy of Chris Green. You can find some of his SQL queries on GitHub.

SELECT
  query,
  is_anonymized_query AS anonymized,
  SUM(impressions) AS impressions,
  SUM(clicks) AS clicks,
  SUM(clicks) / NULLIF(SUM(impressions), 0) AS CTR
FROM `yourtable.searchdata_site_impression`
WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY)
GROUP BY query, anonymized
ORDER BY clicks DESC

    This query provides insights into the performance of user queries over the last 28 days, considering impressions, clicks, and CTR.

    It also considers whether the queries are anonymized or not, and the results are sorted based on the total number of clicks in descending order.

    This recreates the data you would normally find in the Search Console “Performance” report for the last 28 days of data, results by query, and differentiating anonymized queries.

    Feel free to copy/paste your way to glory, but always make sure you update the FROM clause with the right table name. If you are curious to learn more about how this query was built, here is the breakdown:

    • SELECT clause:
      • query: Retrieves the user queries.
      • is_anonymized_query AS anonymized: Renames the is_anonymized_query field to anonymized.
      • SUM(impressions) AS impressions: Retrieves the total impressions for each query.
      • SUM(clicks) AS clicks: Retrieves the total clicks for each query.
      • SUM(clicks)/NULLIF(SUM(impressions), 0) AS CTR: Calculates the Click-Through Rate (CTR) for each query. The use of NULLIF prevents division by zero errors.
    • FROM clause:
• Specifies the source table as yourtable.searchdata_site_impression (replace this with your own project and dataset).
    • WHERE clause:
      • Filters the data to include only rows where the data_date is within the last 28 days from the current date.
    • GROUP BY clause:
      • Groups the results by query and anonymized. This is necessary since aggregations (SUM) are performed, and you want the totals for each unique combination of query and anonymized.
    • ORDER BY clause:
      • Orders the results by the total number of clicks in descending order.

    Handling The Anonymized Queries

    According to Noah Learner, the Google Search Console API delivers 25 times more data than the GSC performance tab for the same search, providing a more comprehensive view.

    In BigQuery, you can also access the information regarding anonymized queries.

The export doesn’t omit those rows, which helps analysts get complete sums of impressions and clicks when aggregating the data.

    Understanding the volume of anonymized queries in your Google Search Console (GSC) data is key for SEO pros.

    When Google anonymizes a query, it means the actual search query text is hidden in the data. This impacts your analysis:

    • Anonymized queries remove the ability to parse search query language and extract insights about searcher intent, themes, etc.
    • Without the query data, you miss opportunities to identify new keywords and optimization opportunities.
    • Not having query data restricts your capacity to connect search queries to page performance.

    The First Query Counts The Number Of Anonymized Vs. Not Anonymized Queries

SELECT
  CASE
    WHEN query IS NULL AND is_anonymized_query = TRUE THEN "no query"
    ELSE "query"
  END AS anonymized_query,
  COUNT(is_anonymized_query) AS query_count
FROM `yourtable.searchdata_url_impression`
GROUP BY anonymized_query

    Breakdown Of The Query

In this example, we use a CASE statement to check, for each row, whether the query is anonymized.

    If so, we return “no query” in the query field; if not, “query.”

    We then count the number of rows each query type has in the table and group the results based on each of them. Here’s what the result looks like:

    anonymized queries shown in results
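If you want to go a step further than counting rows, a hedged sketch like the one below (placeholder table name again) aggregates clicks and impressions by the is_anonymized_query flag and computes each group’s share, so you can see how much of your traffic lacks query text:

-- Hedged sketch: share of impressions and clicks coming from anonymized queries.
-- `yourtable` is a placeholder for your project/dataset.
SELECT
  is_anonymized_query AS anonymized,
  SUM(impressions) AS impressions,
  SUM(clicks) AS clicks,
  ROUND(100 * SUM(impressions) / NULLIF(SUM(SUM(impressions)) OVER (), 0), 2) AS pct_of_impressions
FROM `yourtable.searchdata_url_impression`
WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY)
GROUP BY anonymized
ORDER BY impressions DESC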

    Advanced Querying For SEO Insights

    BigQuery enables complex analysis you can’t pull off in the GSC interface. This means you can also create customized intel by surfacing patterns in user behavior.

    You can analyze search trends, seasonality over time, and keyword optimization opportunities.
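As one illustration of the trend angle, here is a hedged sketch (placeholder project, default dataset) that rolls clicks, impressions, and CTR up by week:

-- Hedged sketch: weekly clicks, impressions, and CTR for the last 12 months.
-- `yourproject` is a placeholder; adjust the interval to the history you have exported.
SELECT
  DATE_TRUNC(data_date, WEEK(MONDAY)) AS week_start,
  SUM(clicks) AS clicks,
  SUM(impressions) AS impressions,
  SUM(clicks) / NULLIF(SUM(impressions), 0) AS ctr
FROM `yourproject.searchconsole.searchdata_site_impression`
WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 12 MONTH)
GROUP BY week_start
ORDER BY week_start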

    Here are some things you should be aware of to help you debug the filters you put in place:

• The date could be an issue. It may take up to two days for the data you want to query to be available. If BigQuery says in the top-right corner that your query would require 0 MB to run, it means the data you want isn’t there yet or that there is no data for your query.
    • Use the preview if you want to see what a field will return in terms of value. It shows you a table with the data.
• The country abbreviations you will get in BigQuery are in ISO-3166-1-Alpha-3 format, which is different from what you may be used to. Some examples: FRA for France, UKR for Ukraine, USA for the United States (a short example query follows this list).
    • Want to get “pretty” queries? Click on “more” within your query tab and select “Format query.” BigQuery will handle that part for you!
    • If you want more queries right away, I suggest you sign up for the SEOlytics newsletter, as there are quite a few SQL queries you can use.
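To illustrate the country format mentioned above, here is a hedged sketch (placeholder table name; the exact letter case of the stored country values may vary, so the filter normalizes it):

-- Hedged sketch: top queries from the United States over the last 7 days.
-- Country codes are ISO-3166-1-Alpha-3; UPPER() guards against case differences.
SELECT
  query,
  SUM(clicks) AS clicks
FROM `yourtable.searchdata_site_impression`
WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
  AND UPPER(country) = 'USA'
GROUP BY query
ORDER BY clicks DESC
LIMIT 50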

    Conclusion

    Analyzing GSC data in BigQuery unlocks transformative SEO insights, enabling you to track search performance at scale.

    By following the best practices outlined here for querying, optimizing, and troubleshooting, you can get the most out of this powerful dataset.

    Reading this isn’t going to make you an expert instantly. This is the first step in your adventure!

    If you want to know more, check out Jake Peterson’s blog post, start practicing for free with Robin Lord’s Lost at SQL game, or simply stay tuned because I have a few more articles coming!

    If you have questions or queries, do not hesitate to let us know.

    More resources:


    Featured Image: Tee11/Shutterstock

    Charts: 2024 Outlook of Global CEOs

    Forty-five percent of global CEOs believe their company will not remain viable in the next decade if it continues on its current trajectory. That’s according to the 27th annual global CEO survey issued earlier this month by PwC, which queried 4,702 CEOs in 105 countries and territories in November 2023.

    However, CEOs are now twice as likely to anticipate an improvement in the global economy this year compared to a year ago.

    CEOs expect greater impacts from technology, customer preferences, and climate change in the coming three years versus the past five.

    As of November 2023, CEOs perceived fewer imminent threats in the short term, with inflation being the top concern.

    The Federal Reserve Bank of New York’s monthly “Business Leaders Survey” asks executives about recent and expected trends in key business indicators. The January 2024 edition (PDF) queried roughly 200 service firms in the New York City region from Jan. 3 to 10.

The survey solicits the views of executives of those firms on multiple indicators from the prior month, such as revenue, employee count, forecasts, and more. The result is a “Business Activity Index”: the percentage of favorable responses minus the percentage of unfavorable ones. If 50% of respondents answered favorably and 20% unfavorably, the index would be 30.

    In January 2024, the index climbed 12 points to 24.5, suggesting that firms were more optimistic about future conditions compared to the previous month.

    Charts: U.S. Consumer Outlook Q1 2024

    U.S. consumers are carefully monitoring their spending amid ongoing global uncertainty. That’s according to “US consumer sentiment: Caution heading into 2024,” a research post last month from McKinsey & Company.

    The economic outlook of U.S. consumers remained relatively stable throughout 2023, with a slight increase in optimism.

    Consumers’ spending plans in Q1 2024 signaled a priority on essential items such as baby supplies, gasoline, and food.

    Seventy-seven percent of surveyed Americans reported engaging in some form of spending reduction in Q4 2023.

    The U.S. Bureau of Economic Analysis, a division of the Department of Commerce, publishes the monthly “Personal Consumption Expenditures Index,” a gauge of U.S. consumer spending. According to the BEA, consumer spending on goods and services increased by 0.3% in November 2023 over the prior month.