The Impact Of AI And Other Innovations On Data Storytelling via @sejournal, @InsightNarrator

This edited extract is from Data Storytelling in Marketing by Caroline Florence ©2024 and is reproduced and adapted with permission from Kogan Page Ltd.

Storytelling is an integral part of the human experience. People have been communicating observations and data to each other for millennia using the same principles of persuasion that are being used today.

However, the means by which we can generate data and insights and tell stories has shifted significantly and will continue to do so, as technology plays an ever-greater role in our ability to collect, process, and find meaning from the wealth of information available.

So, what is the future of data storytelling?

I think we’ve all talked about data being the engine that powers business decision-making. And there’s no escaping the role that AI and data are going to play in the future.

So, I think the more data literate and aware you are, the more informed and evidence-led you can be about our decisions, regardless of what field you are in – because that is the future we’re all working towards and going to embrace, right?

It’s about relevance and being at the forefront of cutting-edge technology.

Sanica Menezes, Head of Customer Analytics, Aviva

The Near Future Scenario

Imagine simply applying a generative AI tool to your marketing data dashboards to create audience-ready copy. The tool creates a clear narrative structure, synthesized from the relevant datasets, with actionable and insightful messages relevant to the target audience.

The tool isn’t just producing vague and generic output with questionable accuracy but is sophisticated enough to help you co-author technically robust and compelling content that integrates a level of human insight.

Writing stories from vast and complex datasets will not only drive efficiency and save time, but free up the human co-author to think more creatively about how they deliver the end story to land the message, gain traction with recommendations and influence decisions and actions.

There is still a clear role for the human to play as co-author, including the quality of the prompts given, expert interpretation, nuance of language, and customization for key audiences.

But the human co-author is no longer bogged down by the complex and time-consuming process of gathering different data sources and analysing data for insights. The human co-author can focus on synthesizing findings to make sense of patterns or trends and perfect their insight, judgement, and communication.

In my conversations with expert contributors, the consensus was that AI would have a significant impact on data storytelling but would never replace the need for human intervention.

This vision for the future of storytelling is (almost) here. Tools like this already exist and are being further improved, enhanced, and rolled out to market as I write this book.

But the reality is that the skills involved in leveraging these tools are no different from the skills needed to currently build, create, and deliver great data stories. If anything, the risks involved in not having human co-authors mean that acquiring the skills covered in this book becomes even more valuable.

In the AI storytelling exercise WIN conducted, the tool came up with “80 per cent of people are healthy” as its key point. Well, it’s just not an interesting fact.

Whereas the humans looking at the same data were able to see a trend of increasing stress, which is far more interesting as a story. AI could analyse the data in seconds, but my feeling is that it needs a lot of really good prompting in order for it to seriously help with the storytelling bit.

I’m much more positive about it being able to create 100 slides for me from the data and that may make it easier for me to pick out what the story is.

Richard Colwell, CEO, Red C Research & Marketing Group

We did a recent experiment with the Inspirient AI platform taking a big, big, big dataset, and in three minutes, it was able to produce 1,000 slides with decent titles and design.

Then you can ask it a question about anything, and it can produce 110 slides, 30 slides, whatever you want. So, there is no reason why people should be wasting time on the data in that way.

AI is going to make a massive difference – and then we bring in the human skill which is contextualization, storytelling, thinking about the impact and the relevance to the strategy and all that stuff the computer is never going to be able to do.

Lucy Davison, Founder And CEO, Keen As Mustard Marketing

Other Innovations Impacting On Data Storytelling

Besides AI, there are a number of other key trends that are likely to have an impact on our approach to data storytelling in the future:

Synthetic Data

Synthetic data is data that has been created artificially through computer simulation to take the place of real-world data. Whilst already used in many data models to supplement real-world data or when real-world data is not available, the incidence of synthetic data is likely to grow in the near future.

According to Gartner (2023), by 2024, 60 per cent of the data used in training AI models will be synthetically generated.

Speaking in Marketing Week (2023), Mark Ritson cites around 90 per cent accuracy for AI-derived consumer data, when triangulated with data generated from primary human sources, in academic studies to date.

This means that it has a huge potential to help create data stories to inform strategies and plans.

Virtual And Augmented Reality

Virtual and augmented reality will enable us to generate more immersive and interactive experiences as part of our data storytelling. Audiences will be able to step into the story world, interact with the data, and influence the narrative outcomes.

This technology is already being used in the world of entertainment to blur the lines between traditional linear television and interactive video games, creating a new form of content consumption.

Within data storytelling, we can easily imagine a world with simulated customer conversations while the customer navigates the website or retail environment.

Instead of static visualizations and charts showing data, the audience will be able to overlay data onto their physical environment and embed data from different sources accessed at the touch of a button.

Transmedia Storytelling

Transmedia storytelling will continue to evolve, with narratives spanning multiple platforms and media. Data storytellers will be expected to create interconnected storylines across different media and channels, enabling audiences to engage with the data story in different ways.

We are already seeing these tools being used in data journalism where embedded audio and video, on-the-ground eyewitness content, live-data feeds, data visualization and photography sit alongside more traditional editorial commentary and narrative storytelling.

For a great example of this in practice, look at the Pulitzer Prize-winning “Snow Fall: The Avalanche at Tunnel Creek” (Branch, 2012), which changed the way The New York Times approached data storytelling.

In the marketing world, some teams are already investing in high-end knowledge share portals or embedding tools alongside their intranet and internet to bring multiple media together in one place to tell the data story.

User-Generated Content

User-generated content will also have a greater influence on data storytelling. With the rise of social media and online communities, audiences will actively participate in creating and sharing stories.

Platforms will emerge that enable collaboration between storytellers and audiences, allowing for the co-creation of narratives and fostering a sense of community around storytelling.

Tailoring narratives to the individual audience member based on their preferences, and even their emotional state, will lead to greater expectations of customization in data storytelling to enhance engagement and impact.

Moving beyond the traditional “You said, so we did” communication with customers to demonstrate how their feedback has been actioned, user-generated content will enable customers to play a more central role in sharing their experiences and expectations.

These advanced tools are a complement to, and not a substitution for, the human creativity and critical thinking that great data storytelling requires. If used appropriately, they can enhance your data storytelling, but they cannot do it for you.

Whether you work with Microsoft Excel or access reports from more sophisticated business intelligence tools, such as Microsoft Power BI, Tableau, Looker Studio, or Qlik, you will still need to take those outputs and use your skills as a data storyteller to curate them in ways that are useful for your end audience.

There are some great knowledge-sharing platforms out there that can integrate outputs from existing data storytelling tools and help curate content in one place. Some can be built into existing platforms that might be accessible within your business, like Confluence.

Some can be custom-built using external tools for a bespoke need, such as creating a micro-site for your data story using WordPress. And some can be brought in at scale to integrate with existing Microsoft or Google tools.

The list of what is available is extensive, but your options will typically depend on the IT infrastructure within your own organization.

The Continuing Role Of The Human In Data Storytelling

In this evolving world, the role of the data storyteller doesn’t disappear but becomes ever more critical.

The human data storyteller still has many important roles to play, and the skills necessary to influence and engage cynical, discerning, and overwhelmed audiences become even more valuable.

Now that white papers, marketing copy, internal presentations, and digital content can all be generated faster than humans could ever manage on their own, the risk of information overload becomes inevitable without a skilled storyteller to curate the content.

Today, the human data storyteller is crucial for:

  • Ensuring we are not telling “any old story” just because we can and that the story is relevant to the business context and needs.
  • Understanding the inputs being used by the tool, including limitations and potential bias, as well as ensuring data is used ethically and that it is accurate, reliable, and obtained with the appropriate permissions.
  • Framing queries in the right way to incorporate the relevant context, issues, and target audience needs to inform the knowledge base.
  • Cross-referencing and synthesizing AI-generated insights or synthetic data with human expertise and subject domain knowledge to ensure the relevance and accuracy of recommendations.
  • Leveraging the different VR, AR, and transmedia tools available to ensure the right one is chosen for the job.

To read the full book, SEJ readers have an exclusive 25% discount code and free shipping to the US and UK. Use promo code SEJ25 at koganpage.com.

Featured Image: PopTika/Shutterstock

Introduction To LLMs Text Embeddings For SEO With Examples via @sejournal, @vahandev

If you are an SEO practitioner or digital marketer reading this article, you may have experimented with AI and chatbots in your everyday work.

But the question is, how can you make the most out of AI other than using a chatbot user interface?

For that, you need a profound understanding of how large language models (LLMs) work and a basic level of coding. And yes, coding is absolutely necessary to succeed as an SEO professional nowadays.

This is the first of a series of articles that aim to level up your skills so you can start using LLMs to scale your SEO tasks. We believe that in the future, this skill will be required for success.

We need to start from the basics. It will include essential information, so later in this series, you will be able to use LLMs to scale your SEO or marketing efforts for the most tedious tasks.

Unlike other similar articles you’ve read, we will start here from the end. The video below illustrates what you will be able to do after reading all the articles in the series on how to use LLMs for SEO.

Our team uses this tool to make internal linking faster while maintaining human oversight.

Did you like it? This is what you will be able to build yourself very soon.

Now, let’s start with the basics and equip you with the required background knowledge in LLMs.

What Are Vectors?

In mathematics, vectors are objects described by an ordered list of numbers (components) corresponding to the coordinates in the vector space.

A simple example of a vector is a vector in two-dimensional space, which is represented by (x,y) coordinates as illustrated below.

Sample two-dimensional vector with x=13 and y=8 coordinates, notated as (13,8).

In this case, the coordinate x=13 represents the length of the vector’s projection on the X-axis, and y=8 represents the length of the vector’s projection on the Y-axis.

Vectors that are defined with coordinates have a length, which is called the magnitude of a vector or norm. For our two-dimensional simplified case, it is calculated by the formula:

L = \sqrt{x_1^2 + y_1^2}
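
To make this concrete, here is a minimal Python sketch (assuming NumPy is installed) that computes the magnitude of the example vector above:

import numpy as np

# The two-dimensional vector from the illustration above: x=13, y=8.
v = np.array([13, 8])

# Magnitude (norm): the square root of the sum of squared components.
magnitude = np.sqrt(np.sum(v ** 2))

# np.linalg.norm computes the same value directly.
print(magnitude, np.linalg.norm(v))  # both print ~15.264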

However, mathematicians went ahead and defined vectors with an arbitrary number of abstract coordinates (X1, X2, X3 … Xn), which is called an “N-dimensional” vector.

In the case of a vector in three-dimensional space, that would be three numbers (x,y,z), which we can still interpret and understand, but anything above that is out of our imagination, and everything becomes an abstract concept.

And here is where LLM embeddings come into play.

What Is Text Embedding?

Text embeddings are a subset of LLM embeddings, which are abstract high-dimensional vectors representing text that capture semantic contexts and relationships between words.

In LLM jargon, “words” are called data tokens, with each word being a token. More abstractly, embeddings are numerical representations of those tokens, encoding relationships between any data tokens (units of data), where a data token can be an image, sound recording, text, or video frame.

In order to calculate how close words are semantically, we need to convert them into numbers. Just as you can subtract numbers (e.g., 10-6=4) and tell that the distance between 10 and 6 is 4 points, it is possible to subtract vectors and calculate how close two vectors are.

Thus, understanding vector distances is important in order to grasp how LLMs work.

There are different ways to measure how close vectors are:

  1. Euclidean distance.
  2. Cosine similarity or distance.
  3. Jaccard similarity.
  4. Manhattan distance.

Each has its own use cases, but we will discuss only commonly used cosine and Euclidean distances.

What Is The Cosine Similarity?

It measures the cosine of the angle between two vectors, i.e., how closely those two vectors are aligned with each other.

Euclidean distance vs. cosine similarity.

It is defined as follows:

\cos(\alpha) = \frac{A \cdot B}{\|A\| \|B\|}

Where the dot product of two vectors is divided by the product of their magnitudes, a.k.a. lengths.

Its values range from -1, which means completely opposite, to 1, which means identical. A value of ‘0’ means the vectors are perpendicular.
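
To make the definition concrete, here is a minimal Python sketch of cosine similarity; the vectors are made-up toy values, not real embeddings:

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Dot product divided by the product of the vectors' magnitudes.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

a = np.array([1.0, 2.0, 3.0])
print(cosine_similarity(a, a))                           # 1.0, identical direction
print(cosine_similarity(a, np.array([-2.0, 1.0, 0.0])))  # 0.0, perpendicular
print(cosine_similarity(a, -a))                          # -1.0, opposite direction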

In terms of text embeddings, achieving the exact cosine similarity value of -1 is unlikely, but here are examples of texts with 0 or 1 cosine similarities.

Cosine Similarity = 1 (Identical)

  1. “Top 10 Hidden Gems for Solo Travelers in San Francisco”
  2. “Top 10 Hidden Gems for Solo Travelers in San Francisco”

​These texts are identical, so their embeddings would be the same, resulting in a cosine similarity of 1.

Cosine Similarity = 0 (Perpendicular, Which Means Unrelated)

  1. “Quantum mechanics”
  2. “I love rainy day”

​These texts are totally unrelated, resulting in a cosine similarity of 0 between their BERT embeddings.

However, if you run Google Vertex AI’s embedding model ‘text-embedding-preview-0409’, you will get 0.3. With OpenAI’s ‘text-embedding-3-large’ model, you will get 0.017.

(Note: In the next chapters, we will practice with embeddings in detail using Python and Jupyter.)
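
If you want a taste before those chapters, here is a hedged sketch of how the 0.017 figure above could be reproduced with the OpenAI Python client. It assumes the openai and numpy packages are installed and the OPENAI_API_KEY environment variable is set; exact scores may vary as models are updated.

import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Request embeddings for the two unrelated example texts.
response = client.embeddings.create(
    model="text-embedding-3-large",
    input=["Quantum mechanics", "I love rainy day"],
)
a = np.array(response.data[0].embedding)
b = np.array(response.data[1].embedding)

# Cosine similarity of the two embedding vectors (the article reports ~0.017).
print(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))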

We are skipping the case with cosine similarity = -1 because it is highly unlikely to happen.

If you try to get cosine similarity for text with opposite meanings like “love” vs. “hate” or “the successful project” vs. “the failing project,” you will get 0.5-0.6 cosine similarity with Google Vertex AI’s ‘text-embedding-preview-0409’ model.

It is because the words “love” and “hate” often appear in similar contexts related to emotions, and “successful” and “failing” are both related to project outcomes. The contexts in which they are used might overlap significantly in the training data.

Cosine similarity can be used for the following SEO tasks:

  1. Classification.
  2. Keyword clustering.
  3. Implementing redirects.
  4. Internal linking.
  5. Duplicate content detection.
  6. Content recommendation.
  7. Competitor analysis.

Cosine similarity focuses on the direction of the vectors (the angle between them) rather than their magnitude (length). As a result, it can capture semantic similarity and determine how closely two pieces of content align, even if one is much longer or uses more words than the other.
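
A toy sketch makes this property easy to see: scaling a vector changes its magnitude (and its Euclidean distance from other vectors) but not its cosine similarity.

import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = 10 * a  # same direction, ten times the magnitude

# Cosine similarity ignores magnitude: still a perfect 1.0.
print(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Euclidean distance is large despite the identical direction.
print(np.linalg.norm(a - b))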

Deep diving and exploring each of these will be a goal of upcoming articles we will publish.

What Is The Euclidean Distance?

In case you have two vectors A(X1,Y1) and B(X2,Y2), the Euclidean distance is calculated by the following formula:

D = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}

It is like using a ruler to measure the distance between two points (the red line in the chart above).
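
Here is the same calculation as a minimal Python sketch (toy two-dimensional points, not embeddings):

import numpy as np

A = np.array([1.0, 2.0])
B = np.array([4.0, 6.0])

# Square root of the summed squared coordinate differences.
d = np.sqrt(np.sum((B - A) ** 2))

# Equivalent shortcut: the norm of the difference vector.
print(d, np.linalg.norm(B - A))  # both print 5.0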

Euclidean distance can be used for the following SEO tasks:

  1. Evaluating keyword density in the content.
  2. Finding duplicate content with a similar structure.
  3. Analyzing anchor text distribution.
  4. Keyword clustering.

Here is an example of a Euclidean distance calculation for duplicate content where paragraphs are simply swapped. The resulting value of 0.08 is very close to 0, meaning the compared content is practically the same.

Euclidean distance calculation example for duplicate content.

Of course, you can use cosine similarity instead, and it will detect the duplicate content with a cosine similarity of 0.9 out of 1 (almost identical).

Here is a key point to remember: You should not merely rely on cosine similarity but use other methods, too, as Netflix’s research paper suggests that using cosine similarity can lead to meaningless “similarities.”

We show that cosine similarity of the learned embeddings can in fact yield arbitrary results. We find that the underlying reason is not cosine similarity itself, but the fact that the learned embeddings have a degree of freedom that can render arbitrary cosine-similarities.

As an SEO professional, you don’t need to fully comprehend that paper, but remember that the research shows other distance methods, such as Euclidean distance, should be considered based on your project needs and the outcomes you get, in order to reduce false-positive results.

What Is L2 Normalization?

L2 normalization is a mathematical transformation applied to vectors to make them unit vectors with a length of 1.

To explain in simple terms, let’s say Bob and Alice walked a long distance. Now, we want to compare their directions. Did they follow similar paths, or did they go in completely different directions?

“Alice” is represented by a red dot in the upper right quadrant, and “Bob” is represented by a green dot.

However, since they are far from the origin, we will have difficulty measuring the angle between their paths.

On the other hand, we can’t claim that if they are far from each other, it means their paths are different.

L2 normalization is like bringing both Alice and Bob back to the same closer distance from the starting point, say one foot from the origin, to make it easier to measure the angle between their paths.

Now, we see that even though they are far apart, their path directions are quite close.

A Cartesian plane with a circle centered at the origin.

This means that we’ve removed the effect of their different path lengths (a.k.a. vector magnitudes) and can focus purely on the direction of their movements.

In the context of text embeddings, this normalization helps us focus on the semantic similarity between texts (the direction of the vectors).

Most embedding models, such as OpenAI’s ‘text-embedding-3-large’ or Google Vertex AI’s ‘text-embedding-preview-0409’, return pre-normalized embeddings, which means you don’t need to normalize them yourself.

But, for example, the BERT model ‘bert-base-uncased’ returns embeddings that are not pre-normalized.
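
For such models, a minimal L2 normalization sketch in Python looks like this:

import numpy as np

def l2_normalize(v: np.ndarray) -> np.ndarray:
    # Divide the vector by its magnitude so the result has length 1.
    norm = np.linalg.norm(v)
    return v / norm if norm > 0 else v

v = np.array([3.0, 4.0])
unit = l2_normalize(v)
print(unit, np.linalg.norm(unit))  # [0.6 0.8] 1.0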

Conclusion

This was the introductory chapter of our series of articles to familiarize you with the jargon of LLMs, which I hope made the information accessible without needing a PhD in mathematics.

If you still have trouble memorizing these, don’t worry. As we cover the next sections, we will refer to the definitions introduced here, and you will be able to understand them through practice.

The next chapters will be even more interesting:

  • Introduction To OpenAI’s Text Embeddings With Examples.
  • Introduction To Google’s Vertex AI Text Embeddings With Examples.
  • Introduction To Vector Databases.
  • How To Use LLM Embeddings For Internal Linking.
  • How To Use LLM Embeddings For Implementing Redirects At Scale.
  • Putting It All Together: LLMs-Based WordPress Plugin For Internal Linking.

The goal is to level up your skills and prepare you to face challenges in SEO.

Many of you may say that there are tools you can buy that do these types of things automatically, but those tools will not be able to perform many specific tasks based on your project needs, which require a custom approach.

Using SEO tools is always great, but having skills is even better!

Featured Image: Krot_Studio/Shutterstock

Bluehost Launches AI WordPress Website Creator via @sejournal, @martinibuster

Bluehost launched an AI Website Creator that enables users to quickly create professional websites. It is an evolution of the click-and-build website builder, making it easy for anyone to create a WordPress website and benefit from the power and freedom of the open source community.

The importance of what this means for businesses and agencies cannot be overstated because it allows agencies to scale WordPress site creation and puts the ability to create professional WordPress sites within reach of virtually everyone.

Point And Click Website Creation

Bluehost offers an easy website building experience that combines the ease of point-and-click site creation with the freedom of the WordPress open source content management system. The heart of this system is called WonderSuite.

WonderSuite comprises multiple components, such as a user interface that walks users through site creation with a series of questions. There is also a library of patterns and templates and an easy-to-configure shopping cart: essentially all the building blocks for creating a site and doing business online quickly and easily.

The new AI Website Creator functionality is the newest addition to the WonderSuite site builder.

AI Website Builder

An AI website builder is the natural evolution of the point-and-click site creation process. Rather than moving a cursor around on a screen, the new way to build a website is with an AI that acts as a designer, responding to a user’s website needs.

The AI asks questions and starts building the website using open source WordPress components and plugins. Fonts, professional color schemes, and plugins are all installed as needed, completely automatically. Users can also save custom generated options for future use which should be helpful for agencies that need to scale client website creation.

Ed Jay, President of Newfold Digital, the parent company of Bluehost, commented:

“Efficiency and ease are what WordPress entrepreneurs and professionals need and our team at Bluehost is dedicated to deliver these essentials to all WordPress users across the globe. With AI Website Creator, any user can rely on the Bluehost AI engine to create their personalized website in just minutes. After answering a few simple questions, our AI algorithm leverages our industry leading WordPress experience, features and technology, including all aspects of WonderSuite, to anticipate the website’s needs and ensure high quality outcomes.

The AI Website Creator presents users with multiple fully functional, tailored and customizable website options that provide a powerful but flexible path forward. It even generates images and content aligned with the user’s brief input, expediting the website off the ground and ready for launch.”

Future Of Website Creation

Bluehost’s innovative AI site creator represents the future of how businesses get online and how entrepreneurs who service clients can streamline site creation and scale their business with WordPress.

Read more about Bluehost’s new AI Website Creator:

WordPress made wonderful with AI

Featured Image by Shutterstock/Simple Line

How SEO Experts Can Utilize ChatGPT For BigQuery With Examples via @sejournal, @vahandev

AI is shaping every field by making skills that weren’t widely accessible in the past (such as coding or data visualization) available to everyone.

An AI operator who can run the right prompts can perform tasks of low and medium difficulty, allowing more focus on strategic decision-making.

In this guide, we will walk you through step by step how to use AI chatbots with ChatGPT as an example to run complex BigQuery queries for your SEO reporting needs.

We will review two examples:

1. How to analyze traffic decline because of a Google algorithm update.
2. How to combine search traffic data with engagement metrics from GA4.

This will also give you an overall idea of how you can use chatbots to reduce the burden when running SEO reports.

Why Do You Need To Learn BigQuery?

SEO tools like Google Search Console or Google Analytics 4 have accessible user interfaces you can use to access data. But often, they limit what you can do and show incomplete data, which is usually called data sampling.

In GSC, this happens because the tool omits anonymized queries and limits table rows to up to 1,000 rows.

Screenshot from Google Search Console, May 2024.

By using BigQuery, you can solve that problem and run any complex reports you want, eliminating the data sampling issue that occurs quite often when working with large websites.

(Alternatively, you may try using Looker Studio, but the purpose of this article is to illustrate how you can operate ChatGPT for BigQuery.)

For this article, we assume you have already connected your GSC and GA4 accounts to BigQuery. If you haven’t done it yet, you may want to check our guides on how to do it:

SQL Basics

If you know Structured Query Language (SQL), you may skip this section. But for those who don’t, here is a quick reference to SQL statements:

SELECT – Retrieves data from tables.
INSERT – Inserts new data into a table.
UNNEST – Flattens an array into a set of rows.
UPDATE – Updates existing data within a table.
DELETE – Deletes data from a table.
CREATE – Creates a new table or database.
ALTER – Modifies an existing table.
DROP – Deletes a table or a database.

And here are the conditions we will be using, so you can familiarize yourself with them:

WHERE – Filters records for specific conditions.
AND – Combines two or more conditions where all conditions must be true.
OR – Combines two or more conditions where at least one condition must be true.
NOT – Negates a condition.
LIKE – Searches for a specified pattern in a column.
IN – Checks if a value is within a set of values.
BETWEEN – Selects values within a given range.
IS NULL – Checks for null values.
IS NOT NULL – Checks for non-null values.
EXISTS – Checks if a subquery returns any records.

Now, let’s dive into examples of how you can use BigQuery via ChatGPT.

1. How To Analyze Traffic Decline Because Of Google Algorithm Impact 

If you have been affected by a Google algorithm update, the first thing you should do is run reports on affected pages and analyze why you have been impacted.

Remember, the worst thing you can do is start changing something on the website right away in panic mode. This may cause fluctuations in search traffic and make analyzing the impact even harder.

If you have fewer pages in the index, you may find using GSC UI data satisfactory for analyzing your data, but if you have tens of thousands of pages, it won’t let you export more than 1,000 rows (either pages or queries) of data.

Say you have a week of data since the algorithm update has finished rolling out and want to compare it with the previous week’s data. To run that report in BigQuery, you may start with this simple prompt:

Imagine you are a data analyst experienced in Google Analytics 4 (GA4), Google Search Console, SQL, and BigQuery.
Your task is to generate an SQL query to compare 'WEB' Search Console data for the periods '2024-05-08' to '2024-05-20' and '2024-04-18' to '2024-04-30'. 
Extract the total clicks, impressions, and average position for each URL for each period. 
Additionally, calculate the differences in these metrics between the periods for each URL 
(where average position should be calculated as the sum of positions divided by the sum of impressions).

Details:

BigQuery project name: use_your_bigquery_projectname
Dataset name: searchconsole
Table name: searchdata_url_impression
Please provide the SQL query that meets these requirements.

Once you get the SQL code, copy and paste it into the BigQuery SQL editor, but I bet the initial code you get will have errors. For example, table column names may not match what is in your BigQuery dataset.

Error in BigQuery SQL when a column name doesn’t match the dataset column.

Things like this happen quite often when performing coding tasks via ChatGPT. Now, let’s dive into how you can quickly fix issues like this.

Simply click on your dataset in the left-hand panel, select all columns on the right side, and click Copy as Table.

How to select all columns of a table in BigQuery.

Once you have it, just copy and paste it as a follow-up prompt and hit enter.

Generate SQL again, taking into account that my table schema is as follows:
"fullname mode type description
data_date DATE 
site_url STRING 
url STRING 
query STRING 
is_anonymized_query BOOLEAN 
is_anonymized_discover BOOLEAN 
country STRING 
search_type STRING 
device STRING 
is_amp_top_stories BOOLEAN 
is_amp_blue_link BOOLEAN 
is_job_listing BOOLEAN 
is_job_details BOOLEAN 
is_tpf_qa BOOLEAN 
is_tpf_faq BOOLEAN 
is_tpf_howto BOOLEAN 
is_weblite BOOLEAN 
is_action BOOLEAN 
is_events_listing BOOLEAN 
is_events_details BOOLEAN 
is_search_appearance_android_app BOOLEAN 
is_amp_story BOOLEAN 
is_amp_image_result BOOLEAN 
is_video BOOLEAN 
is_organic_shopping BOOLEAN 
is_review_snippet BOOLEAN 
is_special_announcement BOOLEAN 
is_recipe_feature BOOLEAN 
is_recipe_rich_snippet BOOLEAN 
is_subscribed_content BOOLEAN 
is_page_experience BOOLEAN 
is_practice_problems BOOLEAN 
is_math_solvers BOOLEAN 
is_translated_result BOOLEAN 
is_edu_q_and_a BOOLEAN 
impressions INTEGER 
clicks INTEGER 
sum_position INTEGER 
is_product_snippets BOOLEAN 
is_merchant_listings BOOLEAN 
is_learning_videos BOOLEAN "

This will fix the generated SQL according to the GSC table structure as follows:

WITH period1 AS (
 SELECT 
   url,
   SUM(clicks) AS clicks,
   SUM(impressions) AS impressions,
   IFNULL(SUM(sum_position) / NULLIF(SUM(impressions), 0), 0) AS avg_position
 FROM 
   `use_your_bigquery_projectname.searchconsole.searchdata_url_impression`
 WHERE 
   data_date BETWEEN '2024-05-08' AND '2024-05-20' 
 AND search_type = 'WEB'
  GROUP BY 
  url
), 
period2 AS (
 SELECT 
  url,
  SUM(clicks) AS clicks,
  SUM(impressions) AS impressions,
  IFNULL(SUM(sum_position) / NULLIF(SUM(impressions), 0), 0) AS avg_position
 FROM 
  `use_your_bigquery_projectname.searchconsole.searchdata_url_impression`
  WHERE 
  data_date BETWEEN '2024-04-18' AND '2024-04-30'
 AND search_type = 'WEB'
 GROUP BY 
 url
),
comparison AS (
 SELECT
  COALESCE(p1.url, p2.url) AS url,
  IFNULL(p1.clicks, 0) AS clicks_period1,
  IFNULL(p2.clicks, 0) AS clicks_period2,
  IFNULL(p1.impressions, 0) AS impressions_period1,
  IFNULL(p2.impressions, 0) AS impressions_period2,
  IFNULL(p1.avg_position, 0) AS avg_position_period1,
  IFNULL(p2.avg_position, 0) AS avg_position_period2,
  IFNULL(p1.clicks, 0) - IFNULL(p2.clicks, 0) AS clicks_diff,
  IFNULL(p1.impressions, 0) - IFNULL(p2.impressions, 0) AS impressions_diff,
  IFNULL(p1.avg_position, 0) - IFNULL(p2.avg_position, 0) AS avg_position_diff
 FROM
  period1 p1
 FULL OUTER JOIN
   period2 p2
 ON
  p1.url = p2.url
)
SELECT 
 url,
 clicks_period1,
 clicks_period2,
 clicks_diff,
 impressions_period1,
 impressions_period2,
 impressions_diff,
 avg_position_period1,
 avg_position_period2,
 avg_position_diff
FROM 
 comparison
ORDER BY 
 url;

Now, run it and enjoy the data either by exporting it into the CSV or Google Sheets.

How to run SQL in BigQuery.
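
If you prefer to run the generated SQL programmatically rather than in the BigQuery console, here is a minimal sketch using the official google-cloud-bigquery Python client. It assumes the google-cloud-bigquery, pandas, and db-dtypes packages are installed and that you are authenticated (e.g., via gcloud); the project name is a placeholder.

from google.cloud import bigquery

client = bigquery.Client(project="use_your_bigquery_projectname")

sql = """
-- paste the ChatGPT-generated comparison query here
SELECT 1 AS placeholder
"""

# Run the query and load the results into a pandas DataFrame.
df = client.query(sql).to_dataframe()

# Export for further analysis in Google Sheets or elsewhere.
df.to_csv("gsc_period_comparison.csv", index=False)
print(df.head())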

In case you have millions of URLs, you may not be able to work in Google Sheets or CSV export because the data is too big. Plus, those apps have limitations on how many rows you can have in a single document. In that case, you can save results as a BigQuery table and connect to it with Looker Studio to view the data.

But please remember that BigQuery is a freemium service. It is free up to 1 TB of processed query data a month. Once you exceed that limit, your credit card will be automatically charged based on your usage.

That means if you connect your BigQuery to Looker Studio and browse your data there, it will count against your billing every time you open your Looker dashboard.

That is why, when exports have a few tens of thousands or hundreds of thousands of rows, I like using Google Sheets. I can easily connect it to Looker Studio for data visualization and blending, and this will not count against my billing.
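
To stay within that 1 TB free tier, you can estimate how much data a query will scan before actually running it. Here is a minimal sketch using BigQuery’s dry-run mode (project and table names are placeholders):

from google.cloud import bigquery

client = bigquery.Client(project="use_your_bigquery_projectname")

# A dry run validates the query and reports bytes scanned without executing it.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    "SELECT url, clicks FROM `use_your_bigquery_projectname.searchconsole.searchdata_url_impression`",
    job_config=job_config,
)

print(f"This query would process {job.total_bytes_processed / 1e9:.2f} GB.")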

If you have ChatGPT Plus, you can simply use this custom GPT I’ve made, which takes into account table schemas for GA4 and Search Console. In the above guide, I assumed you were using the free version, and it illustrated how you can use ChatGPT overall for running BigQuery.

In case you want to know what is in that custom GPT, here is the screenshot of the backend.

Custom GPT with BigQuery table schemas.

Nothing complicated – you just need to copy tables from BigQuery as JSON in the step explained above and upload them into the custom GPT so it can refer to the table structure. Additionally, there is a prompt that asks GPT to refer to the JSON files attached when composing queries.

This is another illustration of how you can use ChatGPT to perform tasks more effectively, eliminating repetitive tasks.

If you need to work with another dataset (different from GA4 or GSC) and you don’t know SQL, you can upload the table schema from BigQuery into ChatGPT and compose SQLs specific to that table structure. Easy, isn’t it?

As homework, I suggest you analyze which queries have been affected by AI Overviews.

There is no differentiator in the Google Search Console table to do that, but you can run a query to see which pages didn’t lose ranking but had a significant CTR drop after May 14, 2024, when Google introduced AI Overviews.

You can compare the two-week period after May 14th with the two weeks prior. There is still a possibility that the CTR drop happened because of other search features, like a competitor getting a Featured Snippet, but you should find enough valid cases where your clicks were affected by AI Overviews (formerly Search Generative Experience or “SGE”).

2. How To Combine Search Traffic Data With Engagement Metrics From GA4 

When analyzing search traffic, it is vital to understand how much users engage with content because user engagement signals are ranking factors. Please note that I don’t mean the exact metrics defined in GA4.

However, GA4’s engagement metrics – such as “average engagement time per session,” which is the average time your website was in focus in a user’s browser – may hint at whether your articles are good enough for users to read.

If it is too low, it means your blog pages may have an issue, and users don’t read them.

If you combine that metric with Search Console data, you may find that pages with low rankings also have a low average engagement time per session.

Please note that GA4 and GSC have different attribution models. GA4 uses data-driven or last-click attribution models, which means that if someone visits an article page from Google once and then comes back directly two more times, GA4 may attribute all three visits to Google, whereas GSC will report only one.

So, it is not 100% accurate and may not be suitable for corporate reporting, but having engagement metrics from GA4 alongside GSC data provides valuable information to analyze your rankings’ correlations with engagement.

Using ChatGPT with BigQuery requires a little preparation. Before we jump into the prompt, I suggest you read how GA4 tables are structured, as it is not as simple as GSC’s tables.

It has an event_params column, which is a record type and contains dimensions like page_location, ga_session_id, and engagement_time_msec, which tracks how long a user actively engages with your website.

The event_params key engagement_time_msec is not the total time on the site but the time spent on specific interactions (like clicking or scrolling), where each interaction adds a new piece of engagement time. It is like adding up all the little moments when users are actively using your website or app.

Therefore, if we sum that metric and average it across sessions for the pages, we obtain the average engagement time per session.
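
To see what that aggregation means, here is a tiny pandas sketch on made-up event rows (three interactions across two sessions of the same page):

import pandas as pd

events = pd.DataFrame({
    "ga_session_id": [1, 1, 2],
    "page_location": ["/blog/post"] * 3,
    "engagement_time_msec": [4000, 6000, 8000],  # per-interaction engagement
})

# Sum engagement per page, then divide by the number of distinct sessions.
per_page = events.groupby("page_location").agg(
    total_msec=("engagement_time_msec", "sum"),
    sessions=("ga_session_id", "nunique"),
)
per_page["avg_engagement_sec_per_session"] = (
    per_page["total_msec"] / 1000 / per_page["sessions"]
)
print(per_page)  # (4000 + 6000 + 8000) / 1000 / 2 sessions = 9.0 seconds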

Now, once you understand engagement_time_msec, let’s ask ChatGPT to help us construct a query that pulls GA4 “average engagement time per session” for each URL and combines it with GSC search performance data of articles.

The prompt I would use is:

Imagine you are a data analyst experienced in Google Analytics 4 (GA4), Google Search Console, SQL, and BigQuery.
Compose a SQL query that pulls the following data from Google Search Console for each URL for the previous 7 days, excluding the current day:

1. Clicks,
2. Impressions,
3. Average position (calculated as the sum of positions divided by the sum of impressions).

From the GA4 BigQuery table, unnest ga_session_id, engagement_time_msec, and page_location from event_params.
Select only rows where engagement_time_msec is not null, group all sessions with the same IDs and page_location, SUM engagement_time_msec, and divide by the SUM of the session count.

Join GA4 and Google Search Console data by URLs for the same period.
Additionally, optimize the query to pull from GA4's table partitions and not query the entire table.

Details:
1. BigQuery project name: use_your_bigquery_projectname
2. GA4 dataset name: use_your_ga4_bigquery_dataset_name
3. GA4 table name: use_your_ga4_bigquery_table_name_under_dataset
4. Google Search Console dataset name: use_your_gsc_bigquery_dataset_name
5. Google Search Console table name: use_your_gsc_bigquery_table_name_under_dataset
6. Here is the BigQuery table schema for GA4: [copy table schema here]
7. Here is the BigQuery table schema for Google Search Console: [copy table schema here]


Once I copied and pasted it into BigQuery, it gave me results with “average engagement time per session” being all nulls. So, apparently, ChatGPT needs more context and guidance on how GA4 works.

I provided additional knowledge as a follow-up: I copied GA4’s official documentation on how it calculates engagement_time_msec into the follow-up prompt and asked ChatGPT to refer to that knowledge when composing the query, which helped. (If you get any syntax error, just copy/paste it as a follow-up question and ask ChatGPT to fix it.)

As a result of 15 minutes of effort, I got the right SQL:


WITH
-- Step 1: Filter GA4 data for the last 7 days and unnest event_params
ga4_data AS (
  SELECT
    event_date,
    event_timestamp,
    (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'ga_session_id') AS ga_session_id,
    (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'engagement_time_msec') AS engagement_time_msec,
    (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page_location
  FROM
    `your_bq_project_name.your_bq_ga4_dataset.events_*`
  WHERE
     _TABLE_SUFFIX BETWEEN FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 8 DAY))
    AND FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
    AND (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'engagement_time_msec') IS NOT NULL
),

-- Step 2: Aggregate GA4 data by session ID and page location
aggregated_ga4_data AS (
  SELECT
    page_location,
    SUM(engagement_time_msec/1000) / NULLIF(COUNT(DISTINCT ga_session_id), 0) AS avg_engagement_time_sec  -- msec converted to seconds
  FROM
    ga4_data
  GROUP BY
    page_location
),
-- Step 3: Filter GSC data for the last 7 days and select urls which had clicks
gsc_data AS (
  SELECT
    url,
    SUM(clicks) AS clicks,
    SUM(impressions) AS impressions,
    SUM(sum_position) / SUM(impressions) AS avg_position
  FROM
    `your_bq_project_name.searchconsole.searchdata_url_impression`
  WHERE
    data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 8 DAY) AND DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    and 
    clicks > 0
  GROUP BY
    url
)

-- Joining Google Search Console data with GA4 data by page_location and url 
SELECT
  gsc.url,
  gsc.clicks,
  gsc.impressions,
  gsc.avg_position,
  ga4.avg_engagement_time_sec
FROM
  gsc_data AS gsc
LEFT JOIN
  aggregated_ga4_data AS ga4
ON
  gsc.url = ga4.page_location
ORDER BY
  gsc.clicks DESC;

This pulls GSC data with engagement metrics from GA4.

Search Console data combined with GA4.

Please note that you might notice discrepancies between the numbers in the GA4 UI and the data queried from BigQuery tables.

This happens because GA4 focuses on “Active Users” and groups rare data points into an “(other)” category, while BigQuery shows all raw data. GA4 also uses modeled data for gaps when consent isn’t given, which BigQuery doesn’t include.

Additionally, GA4 may sample data for quicker reports, whereas BigQuery includes all data. These variations mean GA4 offers a quick overview, while BigQuery provides detailed analysis. A more detailed explanation of why this happens is available in this article.

You may try modifying the queries to include only active users to bring the results one step closer to the GA4 UI.

Alternatively, you can use Looker Studio to blend data, but it has limitations with very large datasets. BigQuery offers scalability by processing terabytes of data efficiently, making it ideal for large-scale SEO reports and detailed analyses.

Its advanced SQL capabilities allow complex queries for deeper insights that Looker Studio or other dashboarding tools cannot match.

Conclusion

Using ChatGPT’s coding abilities to compose BigQuery queries for your reporting needs elevates you and opens new horizons where you can combine multiple sources of data.

This demonstrates how ChatGPT can streamline complex data analysis tasks, enabling you to focus on strategic decision-making.

At the same time, these examples taught us that humans still need to oversee AI chatbots because they may hallucinate or produce wrong answers.

Featured Image: NicoElNino/Shutterstock

Google’s AI Overviews Go Viral, Draw Mainstream Media Scrutiny via @sejournal, @MattGSouthern

Google’s rollout of AI-generated overviews in US search results is taking a disastrous turn, with mainstream media outlets like The New York Times, BBC, and CNBC reporting on numerous inaccuracies and bizarre responses.

On social media, users are sharing endless examples of the feature’s nonsensical and sometimes dangerous output.

From recommending non-toxic glue on pizza to suggesting that eating rocks provides nutritional benefits, the blunders would be amusing if they weren’t so alarming.

Mainstream Media Coverage

As reported by The New York Times, Google’s AI overviews struggle with basic facts, claiming that Barack Obama was the first Muslim president of the United States and stating that Andrew Jackson graduated from college in 2005.

These errors undermine trust in Google’s search engine, which more than two billion people rely on for authoritative information worldwide.

Manual Removal & System Refinements

As reported by The Verge, Google is now scrambling to manually remove the bizarre AI-generated responses and improve its systems.

A Google spokesperson confirmed that the company is taking “swift action” to remove problematic responses and using the examples to refine its AI overview feature.

Google’s Rush To AI Integration

The flawed rollout of AI overviews isn’t an isolated incident for Google.

As CNBC notes in its report, Google has made several missteps in its rush to integrate AI into its products.

In February, Google was forced to pause its Gemini chatbot after it generated inaccurate images of historical figures and refused to depict white people in most instances.

Before that, the company’s Bard chatbot faced ridicule for sharing incorrect information about outer space, leading to a $100 billion drop in Google’s market value.

Despite these setbacks, industry experts cited by The New York Times suggest that Google has little choice but to continue advancing AI integration to remain competitive.

However, the challenges of taming large language models, which ingest false information and satirical posts, are now more apparent.

The Debate Over AI In Search

The controversy surrounding AI overviews adds fuel to the debate over the risks and limitations of AI.

While the technology holds potential, these missteps remind everyone that more testing is needed before unleashing it on the public.

The BBC notes that Google’s rivals face similar backlash over their attempts to cram more AI tools into their consumer-facing products.

The UK’s data watchdog is investigating Microsoft after it announced a feature that would take continuous screenshots of users’ online activity.

At the same time, actress Scarlett Johansson criticized OpenAI for using a voice likened to her own without permission.

What This Means For Websites & SEO Professionals

Mainstream media coverage of Google’s erroneous AI overviews brings the issue of declining search quality to public attention.

As the company works to address inaccuracies, the incident serves as a cautionary tale for the entire industry.

Important takeaway: Prioritize responsible use of AI technology to ensure the benefits outweigh its risks.

New Google Lens Ads Mimic AI Search Results via @sejournal, @martinibuster

A keynote at Google’s Marketing Live event showed new AI-powered visual search results featuring advertisements that engage users within the context of an AI-assisted search, blurring the line between AI-generated search results and advertisements.

Google Lens is a truly helpful app, but it becomes unconventional where it blurs the line between an assistant helping users and an assistant leading them to a shopping cart. This new way of engaging potential customers with AI is so far out there that the presenter doesn’t even call it advertising; he never uses the word.

Visual Search Traffic Opportunity?

Google’s Group Product Manager, Sylvanus Bent, begins the presentation with an overview of the next version of Google Lens visual search, which will be useful for surfacing information about products and helping people find where to buy them.

Sylvanus explained how it will be an opportunity for websites to receive traffic from this new way to search.

“…whether you’re snapping a photo with lens or circling to search something on your social feed, visual search unlocks new ways to explore whatever catches your eye, and we recently announced a newly redesigned results page for Visual search.

Soon, instead of just visual matches, you’ll see a wide range of results, from images to video, web links, and facts about the knowledge graph. It gets people the helpful information they need and creates new opportunities for sites to be discovered.”

It’s hard to say whether or not this will bring search traffic to websites and what the quality of that traffic will be. Will they stick around to read an article? Will they engage with a product review?

Visual Search Results

Sylvanus shares a hypothetical example of someone at an airport baggage claim who falls in like with someone else’s bag. He explains that all the person needs to do is snap a photo of the bag, and Google Lens will take them directly to shopping options.

Screenshot of Shopping ads at the top of a Google Lens visual search.

He explains:

“No words, no problem. Just open Lens, take a quick picture and immediately you’ll see options to purchase.

And for the first time, shopping ads will appear at the very top of the results on linked searches, where a business can offer what a consumer is looking for.

This will help them easily purchase something that catches their eye.”

These are image-heavy shopping ads at the top of the search results and as annoying as that may be it’s nowhere near the “next level” advertising that is coming to Google’s new version of visual search where Google presents a paid promotion within the context of an AI Assistant.

Interactive Search Shopping

Sylvanus next describes an AI-powered form of advertising that happens directly within search. But he doesn’t call it advertising; he doesn’t even use the word. He suggests this new form of AI search experience is more than an offer, saying that “it’s an experience.”

He’s right to not use the word advertisement because what he describes goes far beyond advertising and blurs the boundaries between search and advertising within the context of AI-powered suggestions, paid suggestions.

Sylvanus explains how this new form of shopping experience works:

“And next, imagine a world where every search ad is more than an offer. It’s an experience. It’s a new way for you to engage more directly with your customers. And we’re exploring search ads with AI powered recommendations across different verticals. So I want to show you an example that’s going live soon and you’ll see even more when we get to shopping.”

He uses the example of someone who needs to store their furniture for a few months and who turns to Google to find short term storage. What he describes is a query for local short term storage that turns into a “dynamic ad experience” that leads the searcher into throwing packing supplies into their shopping cart.

He narrated how it works:

“You search for short term storage and you see an ad for extra space storage. Now you can click into a new dynamic ad experience.

You can select and upload photos of the different rooms in your house, showing how much furniture you have, and then extra space storage with help from Google, AI generates a description of all your belongings for you to verify. You get a recommendation for the right size and type of storage unit and even how much packing supplies you need to get the job done. Then you just go to the website to complete the transaction.

And this is taking the definition of a helpful ad to the next level. It does everything but physically pick up your stuff and move it, and that is cool.”

Step 1: Search For Short Term Storage

The above screenshot shows an advertisement that when clicked takes the user to what looks like an AI-assisted search but is really an interactive advertisement.

Step 2: Upload Photos For “AI Assistance”

The above image is a screenshot of an advertisement presented in the context of AI-assisted search. Masking an advertisement within a different context is the same principle behind an advertorial, where an advertisement is hidden in the form of an article. The phrases “Let AI do the heavy lifting” and “AI-powered recommendations” create the context of AI search that masks the true context of an advertisement.

Step 3: Images Chosen For Uploading

The above screenshot shows how a user uploads an image to the AI-powered advertisement within the context of an AI-powered search app.

The Word “App” Masks That This Is An Ad

Screenshot of an interactive advertisement that identifies itself as an app.

Above is a screenshot of how a user uploads a photo to the AI-powered interactive advertisement within the context of a visual search engine, using the word “app” to further the illusion that the user is interacting with an app and not an advertisement.

Upload Process Masks The Advertising Context

Screenshot of interactive advertisement that uses the context of an AI Assistant to mask that this is an advertisement

The phrase “Generative AI is experimental” contributes to the illusion that this is an AI-assisted search.

Step 4: Upload Confirmation

In step 4, the “app” advertisement asks the user to confirm that the AI correctly identified the furniture that needs to be put into storage.

Step 5: AI “Recommendations”

The above screenshot shows “AI recommendations” that look like search results.

The Recommendations Are Ad Units

Those recommendations are actually ad units that, when clicked, take the user to the “Extra Space Storage” shopping website.

Step 6: Searcher Visits Advertiser Website

Blurring The Boundaries

What the Google keynote speaker describes is the integration of paid product suggestions into an AI-assisted search. This kind of advertising is so far out there that the Googler doesn’t even call it advertising, and rightfully so, because what it does is blur the line between AI-assisted search and advertising. At what point does a helpful AI search become just a platform for using AI to offer paid suggestions?

Watch The Keynote At The 32 Minute Mark

Featured Image by Shutterstock/Ljupco Smokovski

Google To Prioritize AI Overviews Ads Over Organic Search via @sejournal, @martinibuster

Speakers at Google’s Marketing Live event demonstrated how they will utilize user search queries and AI Overviews content to show interactive shopping ads that will push organic search results even lower, stating that Google is “focused on opening up new opportunities for your business.”

Google: We’re Not Building A Better Search Engine

The first speaker, Philipp Schindler, SVP & Chief Business Officer at Google, said out loud what Googlers normally don’t when he said that the purpose of search results is to show advertising.

He made the remark in the context of a new AI video tool that will help YouTube creators make more content.

At the 18:19 minute mark of the event, Schindler boasted:

“We’ve been collaborating with some really talented film makers, musicians and artists, and the results have been simply incredible. Soon we’ll bring video to shorts, opening up a whole new world of creative possibilities for you and your brands. Just imagine every creator with the power of AI in their pocket.

So what does all of this mean for you? More creators creating more quality content attracts more viewers, which means more reach, engagement and ROI for you. We’re not just building a better search engine or a better YouTube. We’re focused on opening up new opportunities for your business.”

Screenshot Of Google Marketing Event

The statement that Google is using AI Overviews and Search to build reach and ROI for advertisers is not the only one. The next two speakers made the same point.

Search And Shopping Ads In AI Overviews

The next speaker was Vidhya Srinivasan, VP/GM, Advertising at Google. She begins by describing how search experiences will drive traffic to websites, then quickly switches gears to show how interactive advertising will push organic search listings literally beyond the view of the users making the search queries.

At the 30 minute mark of the video, Srinivasan explained:

“AI overviews will appear in search results when they are particularly helpful beyond what search offers today. As we continue to test and evolve the search experience, we are going to stay super focused on sending valuable traffic to publishers and creators. But then, more avenues for user exploration leads to more choice and more choice leads to more opportunities for advertisers.

You may have noticed that we already show ads above and below AI overviews. These ads are matched to the user’s search query. We will now start testing Search and Shopping ads in AI overviews for users in the US.

What is also new with this is we are going to match these ads not just to the query context, but also to the information within the AI Overviews. And, as always, ads will be clearly labeled.”

1. AI Overviews – No Organic Listings

2. Scroll Down For Shopping Ads

She next described an example of clothes that wrinkle while traveling, and turning to Google Search to find ways to prevent the wrinkles. She showed a search for travel hacks and how organic search results are pushed beneath the AI Overviews feature and the new Search and Shopping ads, which contain product images and stand out far more than any organic results do.

She explained how the new AI Overviews shopping ads will be there to convert searchers:

“With the AI overview, I quickly found some common travel hacks that sounded promising. As I browsed the many options that showed up, I found a really nice fix, a wrinkle release spray that I’d never heard of before. So perfect. I want to try that.

Now, with this feature, I can just click on this ad right away, right there, and buy it.

So as you can see, we’re just making it easier and faster for consumers so that they can take action right away. So this is just one example of how we are using Gen AI. There are many more, and we’re going to start with more applications in search ads.”

3. Targeted Ads Based On AI Overviews

Google Search Is The Bait

Google search engineers are using the most advanced technology and data to create the most useful search results in Google’s history; search is arguably the best it’s ever been. But according to the people who are really in charge at Google, the purpose of Search is not “to organize the world’s information and make it universally accessible and useful” but to build more “reach, engagement and ROI” for advertisers. Sam Altman was right to call what Google is doing dystopian.

SEOs Were Social Engineered

Social engineering is the management of people’s behavior in order to get them to act a certain way. Google got a huge chunk of the web ecosystem to buy into concepts like Core Web Vitals and Experience, Expertise, Authoritativeness, and Trustworthiness in order to satisfy users in ways that, apparently, were never really intended to benefit publishers.

It’s not the fault of the Googlers who put their hearts into perfecting search. They do a good job. But it’s clear that Google’s mission is no longer to make information accessible and useful. In what can only feel like a dystopian horror, Google succeeded in social-engineering the search community and publishers to focus on creating helpful content so that those on the advertising side can use it to build more ROI for advertisers.

It’s not just SEOs and publishers that were used for the benefit of advertisers.

Watch the Google Marketing Live Keynote 2024

Featured Image by Shutterstock/pikselstock

7 Ways AI Took My Job [To The Next Level] via @sejournal, @CallRail

With AI-powered call attribution, you can gain valuable insights into which channels are driving the most conversions.

How Call Attribution Works

  • Step 1: Assign – Select unique call tracking numbers to assign to each campaign or listing.
  • Step 2: Track – Potential customers see your ad or listing and call the associated phone number.
  • Step 3: Forward – The calls ring directly into your main business phone, regardless of which number they use.
  • Step 4: Analyze – Because they used one of your tracking numbers, you instantly know which ad or campaign inspired them to call.
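
To make the mechanics concrete, here is a minimal sketch of the attribution logic in Python. The campaign names, phone numbers, and forwarding destination are hypothetical; real call tracking platforms implement the forwarding at the telephony layer.

```python
# Minimal sketch of call attribution. Campaign names, numbers, and the
# forwarding destination are hypothetical examples.

TRACKING_NUMBERS = {
    "+1-555-0101": "google_ads_spring_sale",   # Step 1: one number per campaign
    "+1-555-0102": "facebook_retargeting",
    "+1-555-0103": "local_directory_listing",
}

MAIN_BUSINESS_LINE = "+1-555-0100"

def handle_inbound_call(dialed_number: str, caller_id: str) -> str:
    """Attribute an inbound call to a campaign, then forward it."""
    # Step 4: the dialed tracking number reveals which campaign drove the call.
    campaign = TRACKING_NUMBERS.get(dialed_number, "untracked")
    print(f"Call from {caller_id} attributed to: {campaign}")
    # Step 3: every call rings through to the main business phone.
    return MAIN_BUSINESS_LINE

handle_inbound_call("+1-555-0102", "+1-555-0147")
```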

With AI-powered call tracking, gone are the days of wondering how your digital marketing efforts are tied to high-value inbound calls.

For agencies, this helps prove the real value of your services and extend the life of your client relationships.

2. AI Can Help You Save Time On Manually Reviewing Calls

Listening to and analyzing phone calls manually can be time-consuming and inefficient for agencies.

However, it’s an important part of understanding the customer experience and sales team performance.

With AI-powered call analysis tools, you get quality, keyword-tagged transcriptions with near-human-level accuracy.

Not only can this technology help you save over 50% of the time spent listening to phone calls, but it can also help you deliver actionable recommendations to clients and drive better results.

Conversation Intelligence, for instance, is trained on over 1.1M hours of voice data and enables real-time analysis for instantaneous results.

This advanced tool provides opportunities for you to improve your strategy through the following granular insights:

  • Spotting disparities in the industry-specific lingo your sales team uses, compared to the lingo your prospects are using to describe their business challenges and goals.
  • Identifying trends or gaps in your service offerings based on what your prospects are asking for.
  • Identifying frequently asked questions and other important topics to address through content marketing.
  • Setting goals for lead qualification — not just the quantity of leads generated for your business.
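
As a toy illustration of the keyword-spotting idea behind these insights (a sketch only, not how Conversation Intelligence actually works), counting tracked phrases across transcripts might look like this:

```python
from collections import Counter

# Hypothetical transcripts; a real tool would generate these from call audio.
transcripts = [
    "we need help ranking locally and our ads feel expensive",
    "our ads are expensive and we want more qualified leads",
    "looking for qualified leads for our storage business",
]

# Hypothetical phrases worth tracking, e.g. pain points prospects mention.
tracked_phrases = ["qualified leads", "expensive", "ranking locally"]

counts = Counter()
for text in transcripts:
    for phrase in tracked_phrases:
        counts[phrase] += text.count(phrase)

# Surface the most common themes across all calls.
for phrase, n in counts.most_common():
    print(f"{phrase}: mentioned {n} time(s)")
```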

Conversational AI is perfectly suited to summarizing the content of long conversations; however, the call summaries still require a human to read them and determine the main takeaways.

But if you work in a bustling small business, it’s unlikely you’d have the bandwidth for tasks such as call transcription, summaries, keyword spotting, or trend analysis.

Rather than displacing human labor, conversational AI is assisting businesses in taking on tasks that may have been overlooked and leveraging data that would otherwise remain untapped.

3. AI Can Help You Lower Cost Per Lead / Save Money On Tools & Ad Spend

Ever wonder why certain campaigns take off while others fall flat? It’s all in the data!

Even failed campaigns can offer invaluable insights into your client’s audience and messaging.

But if you can’t spot the underperformers quickly enough, you risk wasting your ad budget on ineffective tactics.

The quicker you can identify what’s working and what’s not, the quicker you can pivot and adjust your marketing strategy.

With AI-powered tools, agencies can access instant insights that enable them to reduce wasteful spending and improve overall campaign efficiency.

How To Deliver More Value With AI

  • Make a bigger impact in less time: AI-powered technology creates a force multiplier within your agency, allowing you to make more of an impact with the same level of inputs you’re already using.
  • Unlock actionable insights from call data: AI is revolutionizing the way companies leverage call data by enabling them to gain insights at scale. As a result, businesses can increase their ROI and deliver greater value to their clients by analyzing hundreds of calls efficiently.
  • Foster alignment with data-driven strategies: By analyzing customer conversations with AI, businesses can align their marketing strategy with data-driven recommendations, enhancing overall coherence. Additionally, the ability to create triggers based on specific phrases enables automated analysis and reporting, further streamlining the alignment process.
  • Drive effectiveness with rapid insights: Leveraging Conversation Intelligence enables agencies to deliver better insights faster, increase conversion rates, refine keyword strategies, and develop robust reporting capabilities.

With the right AI-powered tools, you can access the insights you need to ensure maximum ROI for your clients.

4. AI Can Help You Improve Overall Agency Efficiency

Are you spending too much valuable time on tasks that produce minimal results?

Many agencies find themselves bogged down by routine, administrative tasks that don’t contribute much to their bottom line.

But with AI automation, agencies can streamline their operations and redirect their energy towards more strategic endeavors.

From email scheduling and social media posting to data entry and report generation, AI can handle a wide array of tasks with precision and efficiency – giving you time to focus on high-impact activities that drive growth and deliver tangible results.

Ways Your Business Can Benefit From Automation

  1. Automatically transcribe your calls to boost close rates: See how your team is handling difficult objections and ensure that they’re delivering your business’s value proposition effectively.
  2. Score calls based on quality and opportunity: Take the time-consuming work out of scoring your calls and determine which campaigns drive the best calls to your business.
  3. Classify calls by your set criteria: Qualify, score, tag, or assign a value to the leads that meet your criteria, automatically.
  4. Automatically redact sensitive information: Protect your customers by removing billing or personal information. Keep your data safe and secure through complete HIPAA compliance (see the sketch after this list).
  5. Monitor your teams’ performance: Use Conversation Intelligence as a valuable sales training tool to ensure your team doesn’t miss any key messaging marks.
  6. Know your customers’ needs: Identify conversation trends in your phone calls and stay privy to evolving customer needs.
  7. Improve your digital marketing strategy: Use AI-powered insights to inform your digital marketing strategy and boost your online presence.
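
To illustrate item 4, here is a rough sketch of pattern-based redaction. The regexes are illustrative assumptions; production redaction tools detect sensitive data far more robustly, and this alone would not make a system HIPAA compliant.

```python
import re

# Illustrative patterns only; real redaction tools detect sensitive data far
# more robustly, and this alone would not make a system HIPAA compliant.
PATTERNS = {
    "CARD": re.compile(r"\b\d(?:[ -]?\d){12,15}\b"),   # 13-16 digit card-like runs
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),       # US SSN format
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(transcript: str) -> str:
    """Replace pattern matches with labeled placeholders."""
    for label, pattern in PATTERNS.items():
        transcript = pattern.sub(f"[REDACTED {label}]", transcript)
    return transcript

print(redact("My card is 4111 1111 1111 1111 and my cell is 555-867-5309."))
# My card is [REDACTED CARD] and my cell is [REDACTED PHONE].
```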

By automating mundane tasks, agencies can optimize workflows, increase productivity, and improve efficiency across the board.

Looking for 5 – 7? Download The Full Guide

The future belongs not to those who fear AI, but to those who embrace it.

By strategically combining human creativity with artificial intelligence, you can unlock capabilities that transcend what either could achieve alone.

Want to discover even more ways to level up your agency with AI?

Get the full guide here.

Google’s AI Vision Driven By Panic, Not User Needs: Former Product Manager via @sejournal, @MattGSouthern

A 16-year Google veteran is raising concerns about the company’s current focus on AI, labeling it a “panic reaction” driven by fear of falling behind competitors.

Scott Jenson, who left Google last month, took to LinkedIn to critique the tech giant’s AI projects as “poorly motivated and driven by this mindless panic that as long as it had ‘AI’ in it, it would be great.”

Veteran’s Criticism Of Google’s AI Focus

Jenson stated that Google’s vision of creating an AI assistant for its ecosystem is “pure catnip” fueled by the fear of letting someone else get there first.

He draws a parallel to the ill-fated Google+ product, which he calls a “similar hysterical reaction” to Facebook’s rise.

Jenson wrote:

“This exact thing happened 13 years ago with Google+ (I was there for that fiasco as well). That was a similar hysterical reaction but to Facebook.”

Lack Of User-Driven Motivation

Jenson argues that Google’s strategy lacks motivation driven by genuine user needs, a sentiment echoed by a recent Gizmodo article that described this year’s Google I/O developer conference as “the most boring ever.”

The article, which Jenson linked to in his post, criticized Google for failing to clarify how Gemini’s new AI technology would integrate into its existing products and enhance the user experience.

See Jenson’s full post below:

Can You Turn Off Google’s AI Overviews?

One prime example of Google’s AI overreach is the AI overviews feature, which generates summaries to directly answer search queries by ingesting information from across the web.

This controversial move has sparked legal battles, with publishers accusing Google of violating intellectual property rights and unfairly profiting from their content without permission.

Turning Off AI Overviews

While Google doesn’t provide an official setting to turn off AI overviews, a viral article from Tom’s Hardware suggests using browser extensions.

Alternatively, you can configure Chrome to go directly to web search results, bypassing the AI-generated overviews.

Here are the steps:

  • Open Chrome settings by clicking the three dots in the top-right corner and selecting “Settings” from the menu.
  • In the Settings window, click on the “Search Engine” tab on the left side.
  • Under the “Search Engine” section, click “Manage search engines and site search.”
  • Scroll down to the “Site search” area and click “Add” to create a new entry.

In the new entry, enter the following details:

  • Name: Google (Web)
  • Shortcut: www.google.com
  • URL: {google:baseURL}/search?udm=14&q=%s
  • Click “Add”
Screenshot from: chrome://settings/searchEngines, May 2024.

Lastly, click the three dots next to the new “Google (Web)” entry and select “Make default.”

Screenshot from: chrome://settings/searchEngines, May 2024.

After following these steps, Chrome will default to showing regular web search results instead of the AI overview summaries when you perform searches from the address bar.
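
For the curious, the udm=14 parameter is what requests the web-only results view. As a minimal sketch, the same kind of URL the custom Chrome entry generates can be built programmatically (the query string here is just an example):

```python
from urllib.parse import urlencode

def web_only_search_url(query: str) -> str:
    """Build a Google search URL using udm=14, the web-only results filter.

    This mirrors what the custom Chrome entry substitutes into the
    {google:baseURL}/search?udm=14&q=%s template above.
    """
    return "https://www.google.com/search?" + urlencode({"udm": 14, "q": query})

print(web_only_search_url("ways to prevent wrinkled clothes"))
# https://www.google.com/search?udm=14&q=ways+to+prevent+wrinkled+clothes
```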

Tensions Over Data Usage

The controversy surrounding AI overviews creates tension between tech companies and content creators over using online data for AI training.

Publishers argue that Google’s AI summaries could siphon website traffic, threatening independent creators’ revenue streams, which rely on search referrals.

The debate reflects the need for updated frameworks to balance innovation and fair compensation for content creators, maintaining a sustainable open internet ecosystem.


FAQ

What concerns has Scott Jenson raised about Google’s AI focus?

Scott Jenson, a former Google product manager, has expressed concerns that Google’s current AI focus is more of a “panic reaction” to stay ahead of competitors rather than addressing user needs. He critiques Google’s AI initiatives as poorly motivated and driven by a fear of letting others get ahead.

How does Scott Jenson compare Google’s AI strategy to past projects?

Jenson draws a parallel between Google’s current AI focus and the company’s response to Facebook years ago with Google+. He describes both as “hysterical reactions” driven by competition, which, in the case of Google+, resulted in a product that failed to meet its objectives.

Why are content creators concerned about Google’s AI overviews?

Content creators worry that Google’s AI overviews, which generate summaries by ingesting web content, could reduce site traffic. They argue that this practice is unfair as it uses their content without permission and impacts their revenue streams that rely on search referrals.

How can users turn off Google’s AI overviews in Chrome?

Although no official setting exists to disable AI overviews, users can work around them by adding a custom search engine entry in Chrome.

Here are the steps:

  • Open Chrome settings by clicking the three dots in the top-right corner and selecting “Settings” from the menu.
  • In the Settings window, click on the “Search Engine” tab on the left side.
  • Under the “Search Engine” section, click “Manage search engines and site search.”
  • Scroll down to the “Site search” area and click “Add” to create a new entry.

In the new entry, enter the following details:

    • Name: Google (Web)
    • Shortcut: www.google.com
    • URL: {google:baseURL}/search?udm=14&q=%s
    • Click “Add”

This will force Chrome to skip AI-generated overviews and show the classic list of web links.


Featured Image: Sira Anamwong/Shutterstock

Using AI Ethically In SEO via @sejournal, @wburton27

AI can help brands and marketers be more efficient and productive and get things done more quickly, but it is not perfect and does have some drawbacks.

With the rise and adoption of AI into SEO workflows, processes, and tools, SEO pros must take an ethical approach to artificial intelligence.

What exactly does an ethical approach to AI mean?

An ethical approach involves using AI technologies transparently, fairly, and responsibly while respecting user privacy and ensuring the accuracy and integrity of information.

We are all aware that AI output is imperfect and can be inaccurate, biased, or fluffy, which can cause many problems for agencies and marketers that rely on AI to create content.

With the March core update, sites that published AI content that was not edited, original, or helpful lost a substantial portion of their organic traffic.

Here are some ways we can use AI to be more ethical.

Be Transparent And Provide Disclosure

Do not use generative AI to create content for publishing. If you use generative AI in parts of your process, be fully transparent with the brands you work with about how you use AI in your SEO practices.

Maintain Accuracy And Integrity

If you’re going to use AI, you should take a human-led approach to writing long-form content. Humans should always do the content creation, but AI can be helpful for brainstorming, organizing, rewording, transcription, and reworking content. In each case, outputs must be checked for originality using Copyscape or the tool of your choice.
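
Copyscape searches the web at large; purely to illustrate the idea of an originality check, here is a rough similarity comparison using Python’s standard library. It is a sketch only, not a substitute for a real plagiarism checker.

```python
from difflib import SequenceMatcher

# Illustration only: compare AI output against one known reference text.
# A real originality check (e.g., Copyscape) searches the web at large.
ai_output = "The High Line is a public park built on a historic elevated rail line."
reference = "The High Line is a public park built on a historic freight rail line."

ratio = SequenceMatcher(None, ai_output, reference).ratio()
print(f"Similarity: {ratio:.0%}")

# A high ratio suggests the text may not be original and needs human review.
if ratio > 0.8:
    print("Flag for human review before publishing.")
```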

Additionally, the information must be trustworthy and accurate. With the helpful content update (HCU) being incorporated into the March core update, it’s more important than ever to focus on people-first content rather than content that is not helpful, useful, or satisfying to the end user’s intent.

Be Original And Useful

With Google focusing on a good user experience and people-first content, we should not rely on AI-generated content, given the inadequacies of training data and its lack of originality. AI could be great for compiling notes from people with first-hand experience and pulling them into a cohesive article, for example, but not for producing the list of facts itself, even with fact-checking.

Comply With Search Engine Guidelines

It’s imperative that we follow search engine guidelines and ethical standards.

AI should not be used to engage in practices like keyword stuffing, cloaking, or creating doorway pages. Instead, it should support the creation of high-quality and useful content.

Take a look at Google’s AI Principles.

Promote Positive Impact

Ethically using AI in SEO also means considering the broader impact on society. This entails promoting trustworthy, useful content that contributes positively to users’ knowledge and well-being.

Develop Safely & Respect Privacy

If you build your own tools and platforms with AI, ensure you have strong security protocols and practices to prevent causing any harm.

Always assess your technologies before launching them into the production environment to ensure they are safe and secure, and continue monitoring them after they are released to the general public.

LLMs are not secure. It may be necessary to get legal advice before implementing certain types of AI, like generative AI, in processes that include user/customer information. Updating a privacy policy may not be enough.

Never put proprietary or confidential information into a generative AI chatbot like ChatGPT. Most LLM providers save all user inputs, and that information could be used to generate responses for other users.

Respect Intellectual Property & Originality

One of the biggest issues with AI is intellectual property (IP). If I create some content using ChatGPT, who owns it?

We need to ensure that when AI recommends content, it is original and not taken from anywhere else. This can be problematic because some AI platforms don’t list the source of the information unless you explicitly ask them to do so.

ChatGPT can tell you where the content is coming from if you ask for sources in your prompt. For example, I asked ChatGPT to write me a 750-word blog post on the top things to do in NY and to list the sources, and it did.

Screenshot from ChatGPT, April 2024: a response listing top sightseeing spots in New York City with descriptions and hyperlinks.
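
If you work with the API rather than the chat interface, the same request might look like the sketch below, using the OpenAI Python SDK. The model name and prompt wording are examples, an API key is assumed in the environment, and, as noted later, asking for sources doesn’t guarantee they are accurate.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Example prompt that asks for sources up front; model choice and wording
# are assumptions, and the returned citations still need human verification.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": "Write a 750-word blog post on the top things to do in "
                   "New York, and list the sources you used at the end.",
    }],
)
print(response.choices[0].message.content)
```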

If you’re getting information from ChatGPT, you need to credit the source and ensure it isn’t copying other people’s content. Also, setting clear rules for using AI to make content can help avoid legal problems and ensure you’re being fair and honest.

I checked the content that I created in ChatGPT, and according to Copyscape, it is full of text similar to existing content on the web.

Screenshot from Copyscape, April 2024: search results for historic public parks in New York City, highlighting multiple links and brief summaries about parks like the High Line and Hamilton Park.

Note: Please keep in mind that asking LLMs to cite sources doesn’t guarantee you’re citing the right content or that the content is original. The best and safest way to avoid accidental plagiarism is for humans to do the research and write the content.

Google Is Not About Content That Is Artificial And Lacking In Originality

With the rapid growth of AI-based tools entering the market, and with AI being incorporated into many platforms and used in daily SEO tasks, it is extremely important for us to adhere to ethical AI principles to ensure that the use of AI in SEO supports a fair, equitable, and user-focused search ecosystem.

Google has always been about quality, original content that offers value to end users, not content that is fully artificial, offers no value, lacks trust, or is thin, duplicative, and unoriginal.

To compete in today’s competitive, ever-changing SERPs, focusing on improving E-E-A-T is more important than ever because it is a quality signal that shows Google and end users that you’re the subject matter expert and authority in your niche.

It’s highly recommended to have thought leaders and experts in your niche create your content and show their expertise on your site.

Additionally, it’s important to focus on user experience and ensure that your site loads quickly, is easy to navigate, and helps users find exactly what they came to your site for.



Featured Image: aniqpixel/Shutterstock