How To Use BigQuery And GSC Data For Content Performance Analysis

If you’ve always been in awe of folks using the Google Search Console API to do cool things, this article is a good read for you.

You can use BigQuery with the GSC bulk data export to get some of the same benefits without requiring the help of a developer.

With BigQuery, you can efficiently analyze large volumes of data from the GSC bulk data export.

You won’t have real-time data retrieval (in our scenario, that’s only available with the API), but you can rely on daily data imports, which means you are working with reasonably up-to-date information.

By leveraging BigQuery and the GSC bulk data export, you can access comprehensive search analytics data – that’s the part you hear everyone raving about on LinkedIn.

According to Gus Pelogia, SEO product manager at Indeed:

“It’s such a game changer and a great opportunity to learn SQL. We can finally bypass GSC and external SEO tools limitations. I was surprised to see how simple it was to retrieve data.”

A Structured Approach To Using BigQuery And Google Search Console (GSC) Data For Content Performance Analysis

The aim of this article is not to provide you with a long list of queries or a massive step-by-step blueprint of how to conduct the most intense audit of all time.

I aim to make you feel more comfortable getting into the groove of analyzing data without the limitations that come with the Google Search Console interface. To do this, you need to consider five steps:

  • Identify use cases.
  • Identify relevant metrics for each use case.
  • Query the data.
  • Create a Looker Studio report to help stakeholders and teams understand your analysis.
  • Automate reporting.

The issue we often face when getting started with BigQuery is that we all want to query the data right away. But that’s not enough.

The true value you can bring is by having a structured approach to your data analysis.

1. Identify Use Cases

It is often recommended that you know your data before you figure out what you want to analyze. While this is true, in this case, it can limit you.

We recommend you start by determining the specific purpose and goals for analyzing content performance.

Use Case #1: Identify The Queries And Pages That Bring The Most Clicks

“I believe that every high-quality SEO audit should also analyze the site’s visibility and performance in search. Once you identify these areas, you will know what to focus on in your audit recommendations.”

That’s what Olga Zarr says in her “How to audit a site with Google Search Console” guide.

To do that, you want the queries and the pages that bring the most clicks.

Use Case #2: Calculating UQC

If you want to spot weak areas or opportunities, calculating the Unique Query Count (UQC) per page offers valuable insights.

You already know this because you use this type of analysis in SEO tools like Semrush, SE Ranking, Dragon Metrics, or Serpstat (the latter has a great guide on How to Use Google Search Console to Create Content Plans).

However, it is incredibly useful to recreate this with your own Google Search Console data. You can automate and replicate the process on a regular basis.

There are benefits to this:

  • It helps identify which pages are attracting a diverse range of search queries and which ones may be more focused on specific topics.
  • Pages with a high UQC may present opportunities for further optimization or expansion to capitalize on a wider range of search queries.
  • Analyzing the UQC per page can also reveal which position bands (e.g., positions 1-3, 4-10, etc.) display more variability in terms of the number of unique queries. This can help prioritize optimization efforts.
  • Understanding how UQC fluctuates throughout the year can inform content planning and optimization strategies to align with seasonal trends and capitalize on peak periods of search activity.
  • Comparing UQC trends across different time periods enables you to gauge the effectiveness of content optimization efforts and identify areas for further improvement.

Use Case #3: Assessing The Content Risk

Jess Joyce, a B2B & SaaS SEO expert, has a revenue-generating content optimization framework that she shares with clients.

One of the critical steps is finding pages that saw a decline in clicks and impressions quarter over quarter. She relies on Search Console data to do so.

Building this query would be great but before we jump into this, we need to assess the content risk.

If you calculate the percentage of total clicks contributed by the top 1% of pages on a website based on the number of clicks each page receives, you can quickly pinpoint if you are in the danger zone – meaning if there are potential risks associated with over-reliance on a small subset of pages.

Here’s why this matters:

  • Over-reliance on a small subset of pages can be harmful as it reduces the diversification of traffic across the website, making it vulnerable to fluctuations or declines in traffic to those specific pages.
  • Assessing the danger zone: A percentage value over 40% indicates a high reliance on the top 1% of pages for organic traffic, suggesting a potential risk.
  • This query provides valuable insight into the distribution of organic traffic across a website.

2. Identify Relevant Metrics

Analyzing your content lets you discern which content is effective and which isn’t, empowering you to make data-informed decisions.

Whether it’s expanding or discontinuing certain content types, leveraging insights from your data enables you to tailor your content strategy to match your audience’s preferences.

Metrics and analysis in content marketing provide the essential data for crafting content that resonates with your audience.

Use Case #1: Identify The Queries And Pages That Bring The Most Clicks

For this use case, you need some pretty straightforward data.

Let’s list it all out here:

  • URLs and/or queries.
  • Clicks.
  • Impressions.
  • Search type: we only want web searches, not images or other types.
  • Over a specific time interval.

The next step is to determine which table you should get this information from. Remember, as we discussed previously, you have:

  • searchdata_site_impression: Contains performance data for your property aggregated by property.
  • searchdata_url_impression: Contains performance data for your property aggregated by URL.

In this case, you need the performance data aggregated by URL, so this means using the searchdata_url_impression table.
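Before writing any analysis queries, it can help to confirm which columns the table exposes. Here is a minimal sketch using INFORMATION_SCHEMA; the project and dataset names are placeholders for your own export destination:

-- Sketch: list the columns of the URL-level export table.
-- Replace 'your-project' and 'searchconsole' with your own project and dataset names.
SELECT column_name, data_type
FROM `your-project.searchconsole.INFORMATION_SCHEMA.COLUMNS`
WHERE table_name = 'searchdata_url_impression'
ORDER BY ordinal_position;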

Use Case #2: Calculating UQC

For this use case, we need to list what we need as well:

  • URL: We want to calculate UQC per page.
  • Query: We want the queries associated with each URL.
  • Search Type: We only want web searches, not images or other types.
  • We still need to pick a table; in this case, you need the performance data aggregated by URL, so this means using the searchdata_url_impression table.

Use Case #3: Assessing The Content Risk

To calculate the “clicks contribution of top 1% pages by clicks,” you need the following metrics:

  • URL: Used to calculate the clicks contribution.
  • Clicks: The number of clicks each URL has received.
  • Search Type: Indicates the type of search, typically ‘WEB’ for web searches.
  • We still need to pick a table; in this case, you need the performance data aggregated by URL, so this means using the searchdata_url_impression table. (Narrator voice: notice a trend? We are practicing with one table, which enables you to get very familiar with it.)

3. Query The Data

Use Case #1: Identify The Queries And Pages That Bring The Most Clicks

Let’s tie it all together to create a query, shall we?

You want to see pages with the most clicks and impressions. This is a simple query that you can get from Marco Giordano’s BigQuery handbook, available via his newsletter.

We have slightly modified it to suit our needs and to ensure you keep costs low.

Copy this query to get the pages with the most clicks and impressions:

SELECT url, SUM(clicks) as total_clicks, SUM(impressions) as total_impressions
FROM `pragm-ga4.searchconsole.searchdata_url_impression`
WHERE search_type = 'WEB' and url NOT LIKE '%#%'
AND data_date = "2024-02-13"
GROUP BY url
ORDER BY total_clicks DESC;

It relies on one of the most common SQL patterns. It enables you to group by a variable, in our case, URLs. And then, you can select aggregated metrics you want.

In our case, we specified impressions and clicks so we will be summing up clicks and impressions (two columns).

Let’s break down the query Marco shared:

SELECT statement

SELECT url, SUM(clicks) as total_clicks, SUM(impressions) as total_impressions: Specifies the columns to be retrieved in the result set.

  • url: Represents the URL of the webpage.
  • SUM(clicks) as total_clicks: Calculates the total number of clicks for each URL and assigns it an alias total_clicks.
  • SUM(impressions) as total_impressions: Calculates the total number of impressions for each URL and assigns it an alias total_impressions.

FROM clause

  • FROM `pragm-ga4.searchconsole.searchdata_url_impression`: Specifies the table from which to retrieve the data.
  • The table name represents the project, dataset, and table containing the relevant data.
  • Important to know: replace our table name with your table name.

WHERE clause

  • WHERE search_type = ‘WEB’ and url NOT LIKE ‘%#%’: Filters the data based on specific conditions.
  • search_type = ‘WEB’: Ensures that only data related to web search results is included.
  • url NOT LIKE ‘%#%’: Excludes URLs containing “#” in their address, filtering out anchor links within pages.
  • data_date = “2024-02-13”: This condition filters the data to only include records for the date ‘2024-02-13’. It ensures that the analysis focuses solely on data collected on this specific date, allowing for a more granular examination of web activity for that day.
  • (Narrator voice: we recommend you select a date to keep costs low.)

Important to know: We recommend you select two days before today’s date to ensure that you have data available.

GROUP BY clause

  • GROUP BY url: Groups the results by the URL column.
  • This groups the data so that the SUM function calculates total clicks and impressions for each unique URL.

ORDER BY clause

  • ORDER BY total_clicks DESC: Specifies the ordering of the result set based on the total_clicks column in descending order.
  • This arranges the URLs in the result set based on the total number of clicks, with the URL having the highest number of clicks appearing first.

This query is still more advanced than most beginners would create because it not only retrieves data from the right table but also filters it based on specific conditions (removing anchor links and restricting results to web searches).

After that, it calculates the total number of clicks and impressions for each URL, groups the results by URL, and orders them based on the total number of clicks in descending order.

This is why you should start with your use case first, figure out the metrics second, and then write the query.
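One more note on the date tip from earlier: if you prefer not to hard-code the date, a minimal variation of the query above computes it dynamically. Treat this as our sketch, not Marco’s original, and it assumes the same table name:

SELECT url, SUM(clicks) as total_clicks, SUM(impressions) as total_impressions
FROM `pragm-ga4.searchconsole.searchdata_url_impression`
WHERE search_type = 'WEB' AND url NOT LIKE '%#%'
-- Two days before today, to allow for the export delay.
AND data_date = DATE_SUB(CURRENT_DATE(), INTERVAL 2 DAY)
GROUP BY url
ORDER BY total_clicks DESC;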

Copy this SQL to get the queries in GSC with the most clicks and impressions: 

SELECT query, SUM(clicks) as total_clicks, SUM(impressions) as total_impressions
FROM `pragm-ga4.searchconsole.searchdata_url_impression`
WHERE search_type = 'WEB'
AND data_date = "2024-02-13"
GROUP BY query
ORDER BY total_clicks DESC;

This is the same query, but instead of getting the URL here, we will retrieve the query and aggregate the data based on this field. You can see that in the GROUP BY query portion.

The problem with this query is that you are likely to have a lot of “null” results. These are anonymized queries. You can remove those by using this query:

SELECT query, SUM(clicks) as total_clicks, SUM(impressions) as total_impressions
FROM `pragm-ga4.searchconsole.searchdata_url_impression`
WHERE search_type = 'WEB'
AND is_anonymized_query = false
AND data_date = "2024-02-13"
GROUP BY query
ORDER BY total_clicks DESC;
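Before filtering anonymized queries out entirely, you may want to know how much traffic they represent. The sketch below (our addition, not from the handbook) splits one day of clicks by the is_anonymized_query flag, using the same assumed table:

SELECT
  is_anonymized_query,
  SUM(clicks) AS total_clicks,
  -- Share of the day's clicks coming from each group (anonymized vs. not).
  ROUND(SAFE_DIVIDE(SUM(clicks), SUM(SUM(clicks)) OVER ()) * 100, 2) AS pct_of_clicks
FROM `pragm-ga4.searchconsole.searchdata_url_impression`
WHERE search_type = 'WEB'
AND data_date = "2024-02-13"
GROUP BY is_anonymized_query;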

Now, let’s go one step further. I like how Iky Tai, SEO at GlobalShares, went about it on LinkedIn. First, you need to define what the query does: show the high-performing URLs by clicks for a selected date range.

The SQL query has to retrieve the data from the specified table, filter it based on a date range, not a specific date, calculate the total number of impressions and clicks for each URL, group the results by URL, and order them based on the total number of clicks in descending order.

Now that this is done, we can build the SQL query:

SELECT
  url,
  SUM(impressions) AS impressions,
  SUM(clicks) AS clicks
FROM
  `pragm-ga4.searchconsole.searchdata_url_impression`
WHERE
  data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY) AND DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
GROUP BY
  url
ORDER BY
  clicks DESC;

Before you copy-paste your way to glory, take the time to understand how this is built:

SELECT statement

  • SELECT url, SUM(impressions) AS impressions, SUM(clicks) AS clicks: Specifies the columns to be retrieved in the result set.
  • url: Represents the URL of the webpage.
  • SUM(impressions) AS impressions: Calculates the total number of impressions for each URL.
  • SUM(clicks) AS clicks: Calculates the total number of clicks for each URL.

FROM clause

  • FROM searchconsole.searchdata_url_impression: Specifies the table from which to retrieve the data.
  • (Narrator voice: You will have to replace the name of your table.)
  • searchconsole.searchdata_url_impression: Represents the dataset and table containing the search data for individual URLs.

WHERE clause

  • WHERE data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY) AND DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY): Filters the data based on the date range.
  • data_date: Represents the date when the search data was recorded.
  • BETWEEN: Specifies the date range from three days ago (INTERVAL 3 DAY) to yesterday (INTERVAL 1 DAY).
  • DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY): Calculates the date three days ago from the current date.
  • DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY): Calculates yesterday’s date from the current date.

Important to know: As we said previously, you may not have data available for the previous two days. This means that you could change that interval to say five and three days instead of three and one day.
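In practice, that only changes the WHERE clause; for example, a five-to-three-day window would look like this (the rest of the query stays the same):

WHERE
  data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 5 DAY) AND DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY)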

GROUP BY clause

GROUP BY url: Groups the results by the URL column.

  • This groups the data so that the SUM function calculates impressions and clicks for each unique URL.

ORDER BY clause

ORDER BY clicks DESC: Specifies the ordering of the result set based on the clicks column in descending order.

  • This arranges the URLs in the result set based on the total number of clicks, with the URL having the highest number of clicks appearing first.

Important note: when first getting started, I encourage you to use an LLM like Gemini or ChatGPT to help break down queries into chunks you can understand.

Use Case #2: Calculating UQC

Here is another useful query from Marco’s handbook that we have modified in order to get you roughly a week’s worth of data:

SELECT url, COUNT(DISTINCT(query)) as unique_query_count
FROM `pragm-ga4.searchconsole.searchdata_url_impression`
WHERE search_type = 'WEB' and url NOT LIKE '%#%'
AND data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 10 DAY) AND DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY)
GROUP BY url
ORDER BY unique_query_count DESC;
Screenshot from Google Cloud, February 2024: the query in the BigQuery interface.

This time, we will not break down the query.

This query calculates the Unique Query Count (UQC) per page by counting the distinct queries associated with each URL, excluding URLs containing ‘#’ and filtering for web searches.

It does so over a roughly week-long window, while taking into account that data may not be available for the most recent couple of days.

The results are then sorted based on the count of unique queries in descending order, providing insights into which pages attract a diverse range of search queries.
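Earlier, we mentioned that UQC can also be looked at by position band (positions 1-3, 4-10, and so on). The sketch below is one way to do that; it assumes the export’s sum_position column (which is zero-based, hence the + 1 when approximating average position) and should be treated as a starting point rather than a definitive implementation:

WITH per_url AS (
  SELECT
    url,
    COUNT(DISTINCT query) AS unique_query_count,
    -- Approximate average position per URL; sum_position in the export is zero-based.
    SAFE_DIVIDE(SUM(sum_position), SUM(impressions)) + 1 AS avg_position
  FROM `pragm-ga4.searchconsole.searchdata_url_impression`
  WHERE search_type = 'WEB' AND url NOT LIKE '%#%'
    AND data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 10 DAY) AND DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY)
  GROUP BY url
)
SELECT
  CASE
    WHEN avg_position <= 3 THEN '1-3'
    WHEN avg_position <= 10 THEN '4-10'
    ELSE '11+'
  END AS position_band,
  COUNT(url) AS pages,
  ROUND(AVG(unique_query_count), 1) AS avg_unique_query_count
FROM per_url
GROUP BY position_band
ORDER BY position_band;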

Use Case #3: Assessing The Content Risk

This query calculates the percentage of total clicks accounted for by the top 1% of URLs in terms of clicks. This is a far more advanced query than the previous ones. It is taken straight from Marco’s Playbook:

WITH PageClicksRanked AS (
  SELECT
    url,
    SUM(clicks) AS total_clicks,
    PERCENT_RANK() OVER (ORDER BY SUM(clicks) DESC) AS percent_rank
  FROM
    `pragm-ga4.searchconsole.searchdata_url_impression`
  WHERE
    search_type = 'WEB'
    AND url NOT LIKE '%#%'
  GROUP BY
    url
)
SELECT
  ROUND(SUM(CASE WHEN percent_rank <= 0.01 THEN total_clicks ELSE 0 END) / SUM(total_clicks) * 100, 2) AS percentage_of_clicks
FROM
  PageClicksRanked;

This SQL query is more complex because it incorporates advanced techniques like window functions, conditional aggregation, and common table expressions.

Let’s break it down:

Common Table Expression (CTE) – PageClicksRanked

  • This part of the query creates a temporary result set named PageClicksRanked.
  • It calculates the total number of clicks for each URL and assigns a percentile rank to each URL based on the total number of clicks. The percentile rank is calculated using the PERCENT_RANK() window function, which assigns a relative rank to each row within a partition of the result set.
  • Columns selected:
    • url: The URL from which the clicks originated.
    • SUM(clicks) AS total_clicks: The total number of clicks for each URL.
    • PERCENT_RANK() OVER (ORDER BY SUM(clicks) DESC) AS percent_rank: Calculates the percentile rank for each URL based on the total number of clicks, ordered in descending order.

Conditions

  • search_type = ‘WEB’: Filters the data to include only web search results.
  • AND url NOT LIKE ‘%#%’: Excludes URLs containing “#” from the result set.

Grouping

  • GROUP BY url: Groups the data by URL to calculate the total clicks for each URL.

Main Query

  • This part of the query calculates the percentage of total clicks accounted for by the top 1% of URLs in terms of clicks.
  • It sums up the total clicks for URLs whose percentile rank is less than or equal to 0.01 (top 1%) and divides it by the total sum of clicks across all URLs. Then, it multiplies the result by 100 to get the percentage.

Columns selected

  • ROUND(SUM(CASE WHEN percent_rank <= 0.01 THEN total_clicks ELSE 0 END) / SUM(total_clicks) * 100, 2) AS percentage_of_clicks: Calculates the percentage of clicks accounted for by the top 1% of URLs. The CASE statement filters out the URLs with a percentile rank less than or equal to 0.01, and then it sums up the total clicks for those URLs. Finally, it divides this sum by the total sum of clicks across all URLs and multiplies it by 100 to get the percentage. The ROUND function is used to round the result to two decimal places.

Source

  • FROM PageClicksRanked: Uses the PageClicksRanked CTE as the data source for calculations.

(Narrator voice: this is why we don’t share more complex queries immediately. Writing complex queries immediately requires knowledge, practice, and understanding of the underlying data and business requirements.)

In order to write such queries, you need:

  • A solid understanding of SQL syntax: SELECT statements, GROUP BY, aggregate functions, subqueries and window functions to start.
  • A deep understanding of the database schema, which is why we took the time to go through it in another article.
  • Practice! Writing and optimizing SQL queries does the trick. So does working on datasets and solving analytical problems! Practice means taking an iterative approach to experiment, test and refine queries.
  • Having a good cookbook: Setting aside good queries you can tweak and rely on.
  • Problem-solving skills: To find the right approach, you have to be able to break down complex analytical tasks into manageable steps. That’s why we started with the five-step framework.
  • A performance mindset: You want to improve query performance, especially for complex queries operating on large datasets. If you don’t, you could end up spending a lot of money in BigQuery.
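On that last point about cost: the content risk query above scans every date partition because it has no data_date filter. The export tables are typically partitioned by data_date, so a hedged variation of the CTE that restricts the scan to roughly the last 30 available days keeps the bill down:

WITH PageClicksRanked AS (
  SELECT
    url,
    SUM(clicks) AS total_clicks,
    PERCENT_RANK() OVER (ORDER BY SUM(clicks) DESC) AS percent_rank
  FROM `pragm-ga4.searchconsole.searchdata_url_impression`
  WHERE search_type = 'WEB'
    AND url NOT LIKE '%#%'
    -- Limiting the date range limits how much data is scanned (and billed).
    AND data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 33 DAY) AND DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY)
  GROUP BY url
)
SELECT
  ROUND(SUM(CASE WHEN percent_rank <= 0.01 THEN total_clicks ELSE 0 END) / SUM(total_clicks) * 100, 2) AS percentage_of_clicks
FROM PageClicksRanked;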

4. Create Looker Studio Dashboards

Once this is done, you can use Looker Studio to build dashboards and visualizations that showcase your content performance metrics.

You can customize these dashboards to present data in a meaningful way for different stakeholders and teams. This means you aren’t the only one accessing the information.

We will dive into this portion of the framework in another article.

However, if you want to get started with a Looker Studio dashboard using BigQuery data, Emad Sharaki shared his awesome dashboard. We recommend you give it a try.

Image from Emad Sharaki, February 2024: Emad’s BigQuery dashboard for SEOs.

5. Automate Reporting

Once you have done all this, you can set up scheduled queries in BigQuery to automatically fetch GSC data present in the tables at regular intervals.

This means you can automate the generation and distribution of reports within your company.

You can check out the official documentation for this portion for now. We will cover this at a later date in another dedicated article.

The one tip we will share here is that you should schedule queries after the typical export window to ensure you’re querying the most recent available data.

In order to monitor the data freshness, you should track export completion times in BigQuery’s export log.
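To make this concrete, here is a hedged sketch of the kind of SQL you might run as a daily scheduled query: it appends one day of aggregated page data to a small reporting table that Looker Studio can read. The reporting dataset and table names are hypothetical, so create them first and adjust everything to your setup:

-- Hypothetical destination table: your-project.reporting.daily_page_performance
INSERT INTO `your-project.reporting.daily_page_performance` (data_date, url, clicks, impressions)
SELECT
  data_date,
  url,
  SUM(clicks) AS clicks,
  SUM(impressions) AS impressions
FROM `your-project.searchconsole.searchdata_url_impression`
WHERE search_type = 'WEB'
  -- Two days back, to stay behind the typical export window.
  AND data_date = DATE_SUB(CURRENT_DATE(), INTERVAL 2 DAY)
GROUP BY data_date, url;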

You can use the reporting automation to enable other teams when it comes to content creation and optimization. Gianna Brachetti-Truskawa, SEO PM and strategist, supports editorial teams by integrating reports directly into the CMS.

This means editors can filter existing articles by performance and prioritize their optimization efforts accordingly. Another automation reporting element to consider is to integrate with Jira to connect your performance to a dashboard with custom rules.

This means that articles can be pulled to the top of the backlog and that seasonal topics can be added to the backlog in a timely manner to create momentum.

Going Further

Obviously, you will need more use cases and a deeper understanding of the type of content audit you want to conduct.

However, the framework we shared in this article is a great way to ensure things stay structured. If you want to take it further, Lazarina Stoy, SEO data expert, has a few tips for you:

“When doing content performance analysis, it’s important to understand that not all content is created equal. Utilize SQL Case/When statements to create subsets of the content based on page type (company page, blog post, case study, etc.), content structure patterns (concept explainer, news item, tutorial, guide, etc), title patterns, target intent, target audiences, content clusters, and any other type of classification that is unique to your content.

That way you can monitor and troubleshoot if you detect patterns that are underperforming, as well as amplify the efforts that are paying off, whenever such are detected.”

If you create queries based on these considerations, share them with us so we can add them to the cookbook of queries one can use for content performance analysis!

Conclusion

By following this structured approach, you can effectively leverage BigQuery and GSC data to analyze and optimize your content performance while automating reporting to keep stakeholders informed.

Remember, collecting everyone else’s queries will not make you an overnight BigQuery pro. Your value lies in figuring out use cases.

After that, you can figure out the metrics you need and tweak the queries others created or write your own. Once you have that in the bag, it’s time to be a professional by allowing others to use the dashboard you created to visualize your findings.

Your peace of mind will come once you automate some of these actions and develop your skills and queries even more!


Featured Image: Suvit Topaiboon/Shutterstock

Google Had Discussed Allowing Noindex In Robots.txt via @sejournal, @martinibuster

Google’s John Mueller responded to a question on LinkedIn to discuss the use of an unsupported noindex directive on the robots.txt of his own personal website. He explained the pros and cons of search engine support for the directive and offered insights into Google’s internal discussions about supporting it.

John Mueller’s Robots.txt

Mueller’s robots.txt has been a topic of conversation for the past week because of the odd, non-standard directives he used within it.

It was almost inevitable that Mueller’s robots.txt was scrutinized and went viral in the search marketing community.

Noindex Directive

Everything that’s in a robots.txt is called a directive. A directive is a request to a web crawler that it is obligated to obey (if it obeys robots.txt directives).

There are standards for how to write a robots.txt directive, and anything that doesn’t conform to those standards is likely to be ignored. A non-standard directive in Mueller’s robots.txt caught the eye of someone who decided to post a question about it to John Mueller via LinkedIn, asking whether Google supported it.

It’s a good question because it’s easy to assume that if a Googler is using it then maybe Google supports it.

The non-standard directive was noindex. Noindex is a part of the meta robots standard but not the robots.txt standard. Mueller had not just one instance of the noindex directive, he had 5,506 noindex directives.

The SEO specialist who asked the question, Mahek Giri, wrote:

“In John Mueller’s robots.txt file,

there’s an unusual command:

“noindex:”

This command isn’t part of the standard robots.txt format,

So do you think it will have any impact on how search engine indexes his pages?

John Mueller curious to know about noindex: in robots.txt”

Why Noindex Directive In Robots.txt Is Unsupported By Google

Google’s John Mueller answered that it was unsupported.

Mueller answered:

“This is an unsupported directive, it doesn’t do anything.”

Mueller then went on to explain that Google had at one time considered supporting the noindex directive from within the robots.txt because it would provide a way for publishers to block Google from both crawling and indexing content at the same time.

Right now it’s possible to block crawling in robots.txt or to block indexing with the meta robots noindex directive. But you can’t block indexing with the meta robots directive and block crawling in the robots.txt at the same time because a block on the crawl will prevent the crawler from “seeing” the meta robots directive.

Mueller explained why Google decided to not move ahead with the idea of honoring the noindex directive within the robots.txt.

He wrote:

“There were many discussions about whether it should be supported as part of the robots.txt standard. The thought behind it was that it would be nice to block both crawling and indexing at the same time. With robots.txt, you can block crawling, or you can block indexing (with a robots meta tag, if you allow crawling). The idea was that you could have a “noindex” in robots.txt too, and block both.

Unfortunately, because many people copy & paste robots.txt files without looking at them in detail (few people look as far as you did!), it would be very, very easy for someone to remove critical parts of a website accidentally. And so, it was decided that this should not be a supported directive, or a part of the robots.txt standard… probably over 10 years ago at this point.”

Why Was That Noindex In Mueller’s Robots.txt

Mueller made clear that it’s unlikely that Google would support that tag and that this was confirmed about ten years ago. The revelation about those internal discussions is interesting, but it also deepens the sense of weirdness about Mueller’s robots.txt.

See also: 8 Common Robots.txt Issues And How To Fix Them

Featured Image by Shutterstock/Kues

OpenAI’s Sam Altman On Challenging Google With AI Search via @sejournal, @martinibuster

OpenAI’s Sam Altman answered questions about challenging Google’s search monopoly and revealed that he’d rather entirely change the paradigm of how people get information than copy what Google has been doing for the past twenty-plus years. His observations were made in the context of a podcast interview by Lex Fridman.

What Altman proposed is that the best way to challenge Google is to completely replace its entire business category, including the advertising.

1. Is OpenAI Building A Challenge To Google Search?

The discussion began with a question from Fridman asking if it’s true that OpenAI is going to challenge Google.

Lex Fridman asked:

“So is OpenAI going to really take on this thing that Google started 20 years ago, which is how do we get-“

Sam Altman responded that the whole idea of building a better search engine limits what the future of information retrieval can be, calling the current conception of search boring.

Altman answered:

“I find that boring. I mean, if the question is if we can build a better search engine than Google or whatever, then sure, we should go, people should use the better product, but I think that would so understate what this can be. Google shows you 10 blue links, well, 13 ads and then 10 blue links, and that’s one way to find information.

But the thing that’s exciting to me is not that we can go build a better copy of Google search, but that maybe there’s just some much better way to help people find and act on and synthesize information. Actually, I think ChatGPT is that for some use cases, and hopefully we’ll make it be like that for a lot more use cases.”

2. The World Doesn’t Need Another Google

Altman expanded on his thoughts by saying that the idea of creating another Google in order to challenge Google is not interesting. He said that the more interesting path is to completely change not just how people get information but to do it in a way that fits into how people are using information.

Altman continued:

“But I don’t think it’s that interesting to say, “How do we go do a better job of giving you 10 ranked webpages to look at than what Google does?”

Maybe it’s really interesting to go say, “How do we help you get the answer or the information you need? How do we help create that in some cases, synthesize that in others, or point you to it in yet others?’

But a lot of people have tried to just make a better search engine than Google and it is a hard technical problem, it is a hard branding problem, it is a hard ecosystem problem. I don’t think the world needs another copy of Google.”

3. AI Search Hasn’t Been Cracked

The part where the conversation seemed to fall off the rails is when Fridman steered the discussion over to integrating a chatbot with a search engine, which itself is already done to death and boring. Bing created the chat-on-top-of-search experience over a year ago, and there are now at least six AI search engines that integrate a chatbot on top of a traditional search engine.

Fridman’s direction of the discussion threw cold water on what Altman was talking about.

Altman said that nobody has “cracked the code yet,” which implied that repeating what Bing did was not what Sam Altman had in mind. He called it an “example of a cool thing.”

Fridman and Altman continued:

“And integrating a chat client, like a ChatGPT, with a search engine-

Sam Altman
As you might guess, we are interested in how to do that well. That would be an example of a cool thing.

…The intersection of LLMs plus search, I don’t think anyone has cracked the code on yet. I would love to go do that. I think that would be cool.”

4. Advertisement Supported AI Search Is Dystopian

Altman used the word “dystopic” to characterize a world in which AI search was based on an advertising model. Dystopic means dystopian, which means a dehumanizing existence that lacks justice and is characterized by distrust.

He noted that ChatGPT as a subscription-based model can be perceived as more trustworthy than an advertising-based search engine. He raised the idea of an AI suggesting that users try a specific product and questioning whether the recommendation was influenced by advertising or by what was best for the user.

That makes sense because there’s a high level of trust involved with AI that doesn’t exist with traditional search. Many consumers don’t trust Google search because, rightly or wrongly, it’s perceived as influenced by advertising and spammy SEO.

Fridman steered the conversation to advertising:

“Lex Fridman
…What about the ad side? Have you ever considered monetization of-

Sam Altman
I kind of hate ads just as an aesthetic choice. I think ads needed to happen on the internet for a bunch of reasons, to get it going, but it’s a momentary industry. The world is richer now.

I like that people pay for ChatGPT and know that the answers they’re getting are not influenced by advertisers.

I’m sure there’s an ad unit that makes sense for LLMs, and I’m sure there’s a way to participate in the transaction stream in an unbiased way that is okay to do, but it’s also easy to think about the dystopic visions of the future where you ask ChatGPT something and it says, “Oh, you should think about buying this product,” or, “You should think about going here for your vacation,” or whatever.”

5. A Search Experience Where The Consumer Is Not The Product

Altman next commented that he didn’t like how consumers are the product when they used social media or search engines. What he means is that user interactions are sold to advertisers who then turn around to target the users based on their interests.

Altman continued:

“And I don’t know, we have a very simple business model and I like it, and I know that I’m not the product. I know I’m paying and that’s how the business model works.

And when I go use Twitter or Facebook or Google or any other great product but ad-supported great product, I don’t love that, and I think it gets worse, not better, in a world with AI.”

6. Altman Is Biased Against Advertising

Sam Altman explicitly said that he was biased against advertising and expressed confidence that there is a path toward an AI-based information retrieval system that is profitable without having to serve ads. His statement that he was biased against advertising was made in the context of the interviewer raising the idea of “completely” throwing out ads, which Altman refused to confirm.

“Lex Fridman
…I could imagine AI would be better at showing the best kind of version of ads, not in a dystopic future, but where the ads are for things you actually need. But then does that system always result in the ads driving the kind of stuff that’s shown?

….I think it was a really bold move of Wikipedia not to do advertisements, but then it makes it very challenging as a business model. So you’re saying the current thing with OpenAI is sustainable, from a business perspective?

Sam Altman
Well, we have to figure out how to grow, but looks like we’re going to figure that out.

If the question is do I think we can have a great business that pays for our compute needs without ads, …I think the answer is yes.

Lex Fridman
Hm. Well, that’s promising. I also just don’t want to completely throw out ads as a…

Sam Altman
I’m not saying that. I guess I’m saying I have a bias against them.”

Is OpenAI Building A Challenge To Google?

Sam Altman did not directly say that OpenAI was building a challenge to Google. He did imply that a proper challenge to Google that uses AI doesn’t yet exist, saying that nobody has “cracked the code” on that yet.

What Altman offered was a general vision of an AI search that didn’t commoditize and sell users to advertisers, thereby making it more trustworthy and useful. He said that a proper challenge to Google would be something completely different from what Google has been doing.

Watch the podcast at the 01:17:27 mark.

Featured Image by Shutterstock/photosince

Navigating The Intersections Of Technology And Human Interaction In AI, User Experience, And SEO via @sejournal, @SEOGoddess

Gone are the days when AI was merely a distant concept.

Today, AI has already been integrated into our daily lives through various applications and services, transforming how we interact with digital platforms.

A few years ago, AI operated silently behind the scenes, facilitating data processing and optimization for large services.

However, starting in 2023, its presence has become ubiquitous, manifesting in the features and functionalities of the apps and tools we use regularly.

How AI Is Reshaping Our Digital Experience

From iOS 16 Live Text Translation to Gmail Auto-complete, AI has become integral to our digital experiences.

What’s particularly fascinating is the bold strides made in the integration of AI into the very tools we use to create and work.

Consider the emergence of AI-powered image creation tools that generate visuals from textual descriptions or transform sketches into intricate 3D models in real time.

Video creation, once labor-intensive, is now more accessible and creative, thanks to AI-driven algorithms that handle background/object detection and stylization.

My background in design has compelled me to explore how these advancements will reshape our interactions with devices and software.

We are witnessing a shift from the traditional keyboard and mouse interaction towards more natural forms of communication with AI-integrated tools.

This transition from manual inputs to intuitive conversations and descriptions empowers users to become orchestrators of technological capabilities.

Does AI Enhance Or Compromise The Authenticity Of User Interactions?

The ongoing discourse surrounding AI’s role in UX design centers on whether it enhances or compromises the authenticity of user interactions.

Advocates view AI as a transformative force, empowering designers to personalize experiences, predict user behavior, and streamline processes.

Yet, skeptics raise concerns about the potential erosion of genuine human interaction, urging caution against the over-reliance on algorithmic intervention.

The rapid pace of AI innovation necessitates a delicate balance between harnessing its potential and preserving authenticity.

While AI offers unprecedented opportunities, its integration poses ethical dilemmas and challenges UX professionals to navigate complex terrain.

The need for responsible AI adoption underscores the importance of prioritizing user-centric design principles, maintaining transparency, and fostering open dialogue.

As we embrace AI as a tool for innovation, collaboration, and empowerment, we must ensure that it complements – not replaces – human expertise.

Ultimately, the future of UX lies in striking a balanced approach: leveraging AI’s capabilities while preserving the authenticity of user interactions. We must forge a path where technology enhances – rather than compromises – the human experience.

Expanding AI Applications And Their Impact On User Experience

AI is reshaping our interactions with cameras, transcending their traditional role as image-capturing devices.

Equipped with AI algorithms, cameras extend beyond mere photography to recognize QR codes, translate texts, and facilitate visual searches. This evolution transforms the camera into a dynamic input tool, effectively bridging the physical and digital realms.

Consider the Dot-go app, which leverages smartphone cameras to initiate automation processes.

Originally designed to aid visually impaired individuals, this innovative application demonstrates the broader potential of AI-powered cameras to enhance daily experiences for everyone.

From identifying bus routes to calculating calorie intake, AI-powered camera applications offer boundless possibilities for seamless integration into everyday life.

As AI progresses, it will further enrich user experiences through natural interactions such as facial recognition and gesture control. This highlights the importance of striking a balance between automation and human touch.

A Firsthand Look At How AI Empowers Developers

GitHub Copilot exemplifies the fusion of AI and developer collaboration, transcending its role as a mere tool to become a meeting point between developers and AI. (Disclaimer: I worked for GitHub.)

GitHub’s machine learning experts dedicated themselves to enhancing Copilot’s contextual understanding, recognizing effective communication as the linchpin of seamless collaboration.

During my time as GitHub’s SEO manager, I witnessed firsthand the remarkable fusion of AI and developer collaboration during the development phase of Copilot.

More than just a tool, Copilot represents a convergence of minds between developers and their AI pair programming counterparts.

Machine learning experts from GitHub tirelessly researched, developed, and tested new capabilities to enhance Copilot’s contextual understanding.

They recognized that effective communication is essential to pair programming, emphasizing the significance of inferring context to facilitate seamless collaboration.

As a marketing team member, I played a pivotal role in ensuring complete visibility from search to all communications surrounding the project. It was a project that genuinely excited me.

The evolution of Copilot from its inception to its widespread availability represented a paradigm shift in AI-driven coding tools. Leveraging OpenAI’s Codex model, GitHub Copilot emerged as the world’s first generative AI coding tool at scale.

The journey wasn’t just about developing a tool; it aimed to empower developers to focus on meaningful work.

Through meticulous experimentation and iteration, GitHub’s team honed Copilot into a resource that accelerates tasks like code completion and fosters creativity and efficiency.

The collaborative efforts behind Copilot underscored the significance of good user experience and the seamless integration of AI technologies into the developer workflow.

As a result, Copilot continues to revolutionize the way developers work, offering tailored suggestions and insights that amplify productivity and innovation in coding endeavors.

Optimizing The User Journey With Human-AI Collaboration

AI is a pivotal catalyst for personalization, intelligent automation, and predictive analytics in UX design.

By analyzing user data and behavior, AI empowers designers to create bespoke experiences that deeply resonate with individual preferences.

This AI-driven approach fosters engagement, efficiency, and data-driven decision-making, leading to significant improvements in SEO rankings.

Websites prioritizing UX elements tend to rank higher on search engine results pages (SERPs) due to reduced bounce rates, increased dwell time, and enhanced user-interaction metrics.

Thus, integrating AI-driven UX design not only optimizes user satisfaction, but also boosts a website’s visibility and authority in the competitive digital landscape.

However, as with any technological advancement, ethical considerations loom large.

Designers must navigate data privacy, bias, and transparency issues to ensure AI algorithms uphold moral standards and minimize negative impacts on users.

Key Takeaway

AI’s transformative impact on user experience design is undeniable.

By embracing AI’s capabilities and collaborating with skilled experts, designers can harness its potential to deliver exceptional experiences that meet or exceed user expectations in the digital era.

As we continue to innovate and adapt, the synergy between human creativity and AI technology will profoundly shape the future of interaction design.


Featured Image: APHITHANA/Shutterstock

Why Prediction Of 25% Search Volume Drop Due to Chatbots Fails Scrutiny via @sejournal, @martinibuster

Gartner’s prediction that AI chatbots are the future and will account for a 25% drop in search market share got a lot of attention. What didn’t get attention is that the claim overlooks seven facts that call the accuracy of the prediction into question and demonstrate that it simply does not hold up to scrutiny.

1. AI Search Engines Don’t Actually Exist

The problem with AI technology is that it’s currently impossible to use AI infrastructure to create a constantly updated search index of web content in addition to billions of pages of news and social media that is constantly generated in real-time. Attempts to create a real-time AI search index fail because the nature of the technology requires retraining the entire language model to update it with new information. That’s why language models like GPT-4 don’t have access to current information.

Current so-called AI search engines are not actually AI search engines. The way they work is that when a user asks a question, a traditional search engine finds the answers and the AI chatbot selects and summarizes them. In other words, they are traditional search engines with a chatbot interface. When you use a so-called AI search engine, what’s really happening is that you’re asking a chatbot to Google this for me.

2. Generative AI Is Not Ready For Widescale Use

The recent fiasco with Gemini’s image generation underscores the fact that generative AI as a technology is still in its infancy. Microsoft Copilot completely went off the rails in March 2024 by assuming a godlike persona, calling itself “SupremacyAGI,” and demanding to be worshipped under the threat of imprisoning users of the service.

This is the technology that Gartner predicts will take away 25% of market share? Really?

Generative AI is unsafe, and despite attempts to add guardrails, the technology still manages to jump off the cliff with harmful responses. The technology is literally in its infancy. To assert that it will be ready for widescale use in two years is excessively optimistic about the progress of the technology.

3. True AI Search Engines Are Economically Unviable

AI search engines are exponentially more expensive than traditional search engines. It currently costs $20 per month to subscribe to a generative AI chatbot, and that comes with limits of 40 queries every three hours; the reason is that generating AI answers is vastly more expensive than generating traditional search engine responses.

The economic feasibility of AI search engines rules out the use of AI as a replacement for traditional search engines.

4. Gartner’s Prediction Of 25% Decrease Assumes Search Engines Will Remain Unchanged

Gartner predicts a 25% decrease in traditional search query volume by 2026 but that prediction assumes that traditional search engines will remain the same. The Gartner analysis fails to account for the fact that search engines evolve not just on a yearly basis but on a month to month basis.

Search engines currently integrate AI technologies that increase search relevance in ways that innovate the entire search engine paradigm, with applications such as making images tappable as a way to launch an image-based search for text answers about the subject within an image.

That’s called multi-modal search, a way to search using sound and vision in addition to traditional text-based searching. The Gartner prediction makes no mention of multimodality in traditional search, a technology that shows how traditional search engines evolve to meet users’ needs.

So-called AI chatbot search engines are in their infancy and offer zero multimodality. How can a technology so comparatively primitive even be considered competitive to traditional search?

5. Why The Claim That AI Chatbots Will Steal Market Share Is Unrealistic

The Gartner report assumes that AI chatbots and virtual agents will become more popular but that fails to consider that Gartner’s own research from June 2023 shows that users distrust AI Chatbots.

Gartner’s own report states:

“Only 8% of customers used a chatbot during their most recent customer service experience, according to a survey by Gartner, Inc. Of those, just 25% said they would use that chatbot again in the future.”

Customers’ lack of trust is especially noticeable in Your Money Or Your Life (YMYL) tasks that involve money.

Gartner reported:

“Just 17% of billing disputes are resolved by customers who used a chatbot at some stage in their journey…”

Gartner’s enthusiastic assumption that users will trust AI chatbots may be unfounded because it may not have considered that users do not trust chatbots for important YMYL search queries, according to Gartner’s own research data.

Even if AI chatbots and virtual agents are expected to become more popular, this does not necessarily mean they will diminish the value of search marketing. Search engines may incorporate AI technologies to enhance user experiences, keeping them as a central part of digital marketing strategies.

6. Gartner Advice Is To Rethink What?

Gartner’s advice to search marketers is to incorporate more experience, expertise, authoritativeness, and trustworthiness into their content, which betrays a misunderstanding of what EEAT actually is. For example, trustworthiness is not something that is added to content like a feature; trustworthiness is the sum of the experience, expertise, and authoritativeness that the author of the content brings to an article.

Secondly, EEAT is a concept of what Google aspires to rank in search results; its components are not actual ranking factors, they’re just concepts.

Third, marketers are already furiously incorporating the concept of EEAT into their search marketing strategy. So the advice to incorporate EEAT as part of the future marketing strategy is itself too late and a bit bereft of unique insight.

The advice also fails to acknowledge that user interactions and user engagement not only play a role in search engine success in the present but will likely increase in importance as search engines incorporate AI to improve their relevance and meaningfulness to users.

That means that traditional search marketing will remain effective and in demand for creating awareness and demand.

7. Why Watermarking May Not Have An Impact

Gartner suggests that watermarking and authentication will increasingly become common due to government regulation. But that prediction fails to understand the supporting role that AI can play in content creation.

For example, there are workflows where a human reviews a product, scores it, provides a sentiment score and insights about which users may enjoy the product and then submits the review data to an AI to write the article based on the human insights. Should that be watermarked?

Another way that content creators use AI is to dictate their thoughts into a recording, then hand it over to the AI with the instruction to polish it up and turn it into a professional article. Should that be watermarked as AI generated?

The ability of AI to analyze vast amounts of data complements the content production workflow: it can pick out key qualities of the data, such as key concepts and conclusions, which in turn can be used by humans to create a document filled with their insights, bringing their human expertise to bear on interpreting the data. Now, what if that human then uses an AI to polish up the document and make it professional? Should that be watermarked?

Gartner’s predictions about watermarking AI content fail to take into account how AI is actually used by many publishers to create well-written content with human-first insights, which complicates the use of watermarking and calls into question its adoption in the long term, not to mention its adoption by 2026.

Gartner Predictions Don’t Hold Up To Scrutiny

The Gartner predictions cite actual facts from the real world. But they fail to consider real-world factors that make AI technology an impotent threat to traditional search engines. For example, there is no consideration of the inability of AI to create a fresh search index, or of the fact that AI chatbot search engines aren’t even actual AI search engines.

It is incredible that the analysis failed to cite the fact that Bing Chat experienced no significant increase in users and has failed to peel away search volume from Google. These failures cast serious doubt on the accuracy of the prediction that search volume will decrease by 25%.

Read Gartner’s press release here:

Gartner Predicts Search Engine Volume Will Drop 25% by 2026, Due to AI Chatbots and Other Virtual Agents

Featured Image by Shutterstock/Renovacio

20 Awesome Examples Of Social Media Marketing via @sejournal, @anna_bredava

Technology makes the world seem a lot smaller.

Keeping up with friends and family on the other side of the country or across the globe no longer requires an expensive telephone call or slow, one-way snail mail.

Instead, thanks to the power of social media, we can bridge distances in the blink of an eye. In just seconds, you can share updates about your life or check in with anyone with internet access.

Social media has changed how we communicate and how we consume information and entertainment.

These platforms unlock a treasure trove of opportunities for savvy marketers, transforming how brands engage with their audience and share their stories on a global stage.

Why Is Social Media Marketing Important For Brands?

Social media platforms like Instagram, TikTok, X (Twitter), and Facebook – among others – present businesses with an opportunity to engage with a massive audience.

They are not just digital spaces for socializing; they are vibrant marketplaces.

As of 2024, the global social media user base had soared to over 4.8 billion people, representing an ever-expanding audience for brands – and a whole lot of potential customers.

Social media provides the opportunity for marketers to humanize their brand through compelling storytelling that showcases their identity and values.

With social media marketing, brands can weave their narratives, engage vast audiences without hefty budgets, and raise awareness and consideration for their company with a broader audience.

It’s also a powerful tool for building authentic relationships with your target consumer.

You can conduct real-time customer service, gather feedback (both positive and negative), and build brand trust over time by interacting and engaging with your social community across specific platforms.

In addition:

  • 68% of consumers follow brands on social media to stay updated about products and services.
  • The average time spent on social media daily is 2 hours and 24 minutes.

The landscape of social media marketing is also shifting towards more engaging content formats such as short videos, live streams, and interactive stories.

From viral organic posts to paid display ads that allow you to target a highly specific demographic, social media presents an unrivaled opportunity to boost your brand visibility and find new customers.

That said, these platforms are not just about placing ads in front of consumers; they’re about creating conversations, building communities, and driving genuine brand engagement through content that resonates with audiences.

So, what separates the companies who are killing it on social media from the thousands of others who never quite seem to gain any traction?

In this piece, we’ll look at some outstanding ways brands have leveraged popular social platforms to inspire your campaigns.

How To Measure Social Media Marketing Effectiveness

Before we dive into the fun stuff, let’s take a moment to discuss how you can gauge the impact of your social media marketing efforts.

The key to assessing the effectiveness of your social media activities lies in measuring your key performance indicators (KPIs).

Some KPIs you might want to consider tracking include:

  • Reach: The number of unique users who see your content. This helps you understand the overall scale of content distribution.
  • Impressions: How many times your content was viewed (regardless of clicks or engagements). This can help you gauge how frequently people are looking at your content.
  • Engagements: Interactions with your content (e.g., how many likes, shares, comments, saves, etc., it received). This helps you understand how engaging users are finding your content.
  • Conversions: How effective your content is at driving actions (e.g., link clicks, follows, form fills, sales, sign-ups, etc.) This helps you understand whether your content is driving towards your goal-related activity.

The KPIs you choose should closely align with your strategic goals.

If you’re looking to boost awareness, then reach, impressions, and engagement offer valuable insights into how widely your message is seen and whether it’s resonating with users.

If you’re focused on lead generation or direct sales, focusing on conversion rates will provide a clearer measure of success.
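If you want to turn those raw counts into the rates most teams actually report, the arithmetic is simple. Below is a minimal Python sketch using hypothetical numbers; the formulas (engagements divided by impressions, conversions divided by link clicks) are common conventions rather than official platform definitions, so adjust them to match how your team defines each KPI.

```python
# Hypothetical monthly totals, as exported from a platform's analytics dashboard.
monthly_totals = {
    "reach": 120_000,        # unique users who saw the content
    "impressions": 310_000,  # total times the content was viewed
    "engagements": 9_300,    # likes, shares, comments, saves, etc.
    "link_clicks": 2_100,    # clicks through to the site
    "conversions": 180,      # sign-ups, sales, form fills, etc.
}


def engagement_rate(totals: dict) -> float:
    """Engagements as a share of impressions (one common convention)."""
    return totals["engagements"] / totals["impressions"]


def conversion_rate(totals: dict) -> float:
    """Conversions as a share of link clicks."""
    return totals["conversions"] / totals["link_clicks"]


print(f"Engagement rate: {engagement_rate(monthly_totals):.2%}")
print(f"Conversion rate: {conversion_rate(monthly_totals):.2%}")
```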

Each brand is different, which means they will not only measure success differently but will also vary in which platforms are most effective for their social media marketing efforts.

With this in mind, we’ve broken down our examples and inspiration by platform. So, without further ado, let’s jump in.


YouTube

1. Dove: Project #ShowUs

When: 2019

Campaign Outline:

To highlight that beauty comes in many forms, Dove launched Project #ShowUs, a campaign intended to challenge stereotypes of what is and isn’t considered beautiful.

In collaboration with Getty Images and Girlgaze Photographers, Project #ShowUs built the world’s largest stock photo library created by women – featuring all female-identifying and non-binary individuals.

The library featured over 5,000 photographs of women from around the globe. Dove took to social media to introduce it to the world, creating video content for YouTube and partnering with influencers to gain traction.

The Numbers:

  • The YouTube video has generated over 33.5 million views.
  • More than 100,000 women pledged to create a more inclusive vision of beauty.
  • 900+ companies in 40 countries downloaded 7,500+ images from the collection.
  • The hashtag #ShowUs saw thousands of engagements across YouTube, Twitter, and Facebook.

Why Did It Work?

For generations, media and advertising have presented a narrow image of what beauty is – one that has left many women feeling unrepresented.

Dove spoke directly to the feelings of its target audience, engaging with them about the brand’s value and encouraging them to take pride in being themselves.

Strategic delivery helped reach women worldwide.

2. BuzzFeed x Friskies: Dear Kitten

When: 2016

Campaign Outline:

If there’s one thing the internet loves, it’s cat videos.

BuzzFeed and Friskies tapped into this sentiment with their “Dear Kitten” videos, in which an older house cat teaches a kitten how to be a cat.

The Numbers:

  • The launch video has been viewed on YouTube more than 34 million times.
  • Twelve follow-up videos have been viewed millions of times each.
  • The campaign led to viral TikTok parodies, with the hashtag #DearKitten receiving more than 3.6 million views.

Why Did It Work?

You don’t have to have genius-level insight into the human psyche to understand why this campaign was so successful.

It has cute cats and a funny script.

3. Apple: “Study With Me”

Screenshot from YouTube, Study With Me feat. Storm Reid x Apple, Apple, February 2024

When: 2023

Campaign Outline:

Apple tried its hand at the popular “Study With Me” video trend in 2023 by creating a 90-minute feature starring actress and college junior Storm Reid.

In the video, Reid uses the Pomodoro Technique – which focuses on 25-minute study sessions followed by 5-minute breaks – to showcase a productive study routine.

The video serves as a virtual study companion for viewers who are looking for that type of content while also highlighting the capabilities of Apple’s MacBook Air product.

The Numbers:

  • The video has generated over 18 million views on YouTube.

Why Did It Work?

Apple did what great social media marketing often does: It tapped into a trending format to reach its ideal audience.

By tapping into a burgeoning trend among students seeking virtual companionship and motivation – and pairing that with Storm Reid, a recognizable figure who is also relatable for the target audience – the campaign struck a chord of being both authentic and helpful to viewers.

On top of that, incorporating a tried-and-true study technique gave the audience a practical takeaway to enhance their own study habits.

4. eBay: Modathon

When: 2023

Campaign Outline:

eBay wanted to shift perceptions of its brand and drive excitement with an audience of auto enthusiasts.

So, the brand created a social media campaign to tap into the subculture of offroading by leveraging the huge inventory of eBay Motors.

In a YouTube series called “Modathon,” the company partnered with YouTube creators on a mission to transform a 1979 Bronco into an offroading powerhouse using only parts and accessories from eBay Motors.

Across several longform episodes, creators customized the Bronco with parts for challenging trails.

The Numbers:

The series generated:

  • 35,000 new YouTube channel subscribers.
  • Over 8.4 million views on YouTube to date.
  • More than 100 million minutes watched.
  • 6:28 minute average episode watch time.

Why Did It Work?

The Modathon challenge succeeded by tapping into what drives the offroading community: a passion for adventure, customization, and modification.

By partnering with YouTube creators who embody the spirit and enthusiasm of its target audience, eBay positioned itself as not just a marketplace but a hub for inspiration and community.

The narrative series format catered to the audience’s preference for immersive, detailed content, which then helped foster a stronger connection with the brand.


Instagram

5. Apple: The Shot on iPhone Challenge

When: Ongoing (Launched in 2015)

Campaign Outline:

One of the world’s most popular smartphone manufacturers, Apple takes great pride in the quality of images that can be captured on its devices.

To highlight the photos its devices can take, Apple launched a competition in 2015 that asked iPhone users to “capture the little things in a big way.”

Photographers were then invited to share their images on Instagram and other social media sites using the hashtag #ShotOniPhone.

A panel of judges then selected 10 winners from tens of thousands of entries, which were then featured on Apple’s website, the company’s Instagram, and on 10,000+ billboards in 25 countries.

It has since become an annual campaign for the brand.

The Numbers:

  • The first round of the campaign had more than 6.5 billion impressions.
  • It was mentioned by 24,000 influencers, with a 95% positive comment rating.

Why Did It Work?

User-generated content (UGC) is a low-investment way for companies to promote their brand on social media, but this isn’t the reason for this campaign’s success.

Instead, Shot on iPhone encourages people to discuss the campaign, which closely aligns with Apple’s reputation for creativity, lifestyle, and innovation.

It encourages existing users to participate in product launches and builds a sense of excitement about being part of the iPhone community.

Additionally, it gives iPhone users a sense of being part of something cool, which everyone likes.

6. Spotify: Spotify Wrapped

Screenshot from newsroom.spotify.com, February 2024

When: Ongoing (Launched in 2019)

Campaign Outline:

In 2019, Spotify launched a campaign where users received a year-end round-up of their listening habits on the platform.

Using personalized in-app data, Spotify Wrapped gives you access to an exclusive, interactive story (or, in the past, a webpage) that shows you details like:

  • Your most listened to artists, genres, and songs.
  • Your top podcasts.
  • The total time you spent listening for the year.
  • New artists you discovered.
  • And more.

The data is presented in a visually appealing way that is formatted specifically for sharing to Instagram Stories (and elsewhere) – and Spotify encourages users to share far and wide.

Now, several years later, Spotify Wrapped has become an event that users anticipate and talk about even ahead of time.

It has evolved to serve users with new tidbits of information – such as what international city you’re aligned with based on your listening habits – and has succeeded at creating a tentpole social media marketing moment.

The Numbers:

  • In 2022, 156 million users engaged with Wrapped.
  • In 2021, that number was reportedly 120 million.
  • There were 425 million Tweets about Spotify Wrapped in the first three days after its launch in 2022.

Why Did It Work?

Spotify combines two big psychological triggers in this campaign: personalization and fear of missing out (FOMO).

The app provides a personalized story for each user. You can see how your music taste developed through the years and what songs accompanied you in your life. The visualizations and gamification make it super engaging and capture people’s attention.

By enabling and encouraging sharing on social media, Spotify amplifies the campaign’s reach. It creates a sense of community in which users want to share their results with others – and see where they differ from their friends.

People naturally wanted to show off their highlights to their friends, thus making more people eager to try this experience.

7. Freeform: Cruel Summer Influencer Nostalgia Campaign

When: 2023

Campaign Outline:

After record-breaking viewership of season one of “Cruel Summer,” Freeform needed to reignite interest in the show’s second season.

So, the brand put together a social media campaign built around a classic tactic: nostalgia.

Collaborating with six popular Instagram meme accounts and throwback influencers like Lance Bass and Mario Lopez, the network leveraged ’90s nostalgia to create buzz around the new anthology format of Cruel Summer.

The Numbers:

  • The campaign garnered a total reach of over 22 million and 3.7 million organic impressions.
  • A top post by ThirtyAF achieved a 6.5% engagement rate.
  • The brand saw a unanimously positive sentiment from fans who expressed excitement for the new season.

Why Did It Work?

Nostalgia has proven itself to be an extremely powerful marketing tactic – and that’s especially true on social media.

Freeform’s campaign leveraged the power of nostalgia marketing – and its audience’s love for the ’90s – to drive impressive social media engagement.

Additionally, partnering with trusted social media influencers further amplified the impact of the campaign.

This innovative approach – combined with the excitement for new stories – led to a universally positive reception, proving that a well-curated throwback theme can effectively drum up anticipation and broaden viewer interest.

8. Hulu Originals: Only Murders In The Building

When: 2021

Campaign Outline:

In a strategic move to captivate audiences and announce the first season of “Only Murders in the Building,” Hulu partnered with Home Brew Agency to craft an Instagram campaign that reflected the mysterious tones of the show itself.

The strategy centered around transforming the Instagram feed into an extension of the show’s universe, complete with a detailed mosaic of the fictional Arconia building.

The campaign also highlighted the star-studded cast of Steve Martin, Martin Short, and Selena Gomez through character spotlights that introduced and teased the evolving dynamics between the main characters to interactively immerse followers in the series’ murder mystery.

Original videos and games were designed to spark curiosity and speculation among fans without revealing too much, maintaining the suspense that is the lifeblood of any whodunit.

The Numbers:

  • The @onlymurdershulu Instagram account quickly grew to 116,000 followers.
  • As a result, the show launched as Hulu’s “Most-Watched Comedy Premiere in Hulu History” and the most-watched Hulu Original comedy on premiere day.

Why Did It Work?

Hulu Originals did a number of things right here.

Firstly, it leveraged Instagram to extend the story world of the series and engage fans on a platform where they’re already active and invested.

The brand made use of the show’s considerable star power to activate a broad fan base across different demographics and generate excitement and curiosity.

By introducing an immersive social media experience that focused on mystery and teasing elements of the show piece-by-piece, Hulu Originals expanded the show’s narrative beyond the screen, heightened anticipation, and invited social media fans to join in on the fun.

This holistic approach not only solidified the show’s online presence but also played a crucial role in driving its record-breaking viewership on Hulu, demonstrating the power of social media in amplifying television narratives.

9. Bobbie: @Bobbie Instagram Handle

When: 2023

Campaign Outline:

Bobbie, a baby formula brand, is on a mission to reshape societal perceptions around infant nutrition.

In 2023, the brand set out with an objective to leverage Instagram to cultivate a supportive, diverse community for modern parents.

Central to the brand’s mission was to make the tumultuous first year of parenting less daunting by using Instagram to bond over shared experiences within the first year of parenthood and help parents feel less alone.

To do so, the brand focused on showing the real, parent-driven team behind the scenes at Bobbie, telling the powerful stories of challenges real Bobbie parents face (such as infertility and systemic injustices in maternal care). It even responded with real-time support, such as Uber-delivered formula to Instagram followers experiencing emergencies.

The Numbers:

  • Total engagements increased to 307,000 – a 338% jump from the previous year.
  • Total impressions increased to 162 million – a 334% jump from the previous year.
  • Followers grew to 113,000 – a 37% rise from the year beforehand.

Why Did It Work?

Bobbie’s strategy resonated deeply with its audience by focusing on authenticity.

By openly addressing the complexities of parenting, offering tangible support, and spotlighting real stories, Bobbie not only fostered a community but also positioned itself as a brand that truly understands and advocates for its customers’ needs.

In addition to enhancing its social metrics, the hands-on approach and commitment to addressing systemic challenges in parenthood – coupled with strategic storytelling and community engagement – also solidified the brand as a leader in championing the well-being of parents and children alike.


Facebook

10. BuzzFeed: Tasty

When: 2016

Campaign Outline:

You’ve probably seen these quick and easy recipe videos popping up all over your Facebook news feed.

BuzzFeed’s Tasty videos are essentially cooking shows for the social media generation.

These videos, typically lasting less than two minutes, deliver on-trend recipes to a highly engaged audience.

The Numbers:

  • Nearly 15 months after launching, Tasty had published 2,000 recipe videos, giving the brand a steady stream of new content.
  • Videos reached around 500 million users monthly.
  • The brand has over 105 million Facebook fans.

Why Did It Work?

For starters, there’s the content.

Tasty tapped into the inherent shareability of food content and the fact that almost everyone can relate to food – it has a place in all of our lives.

But more importantly, Tasty and Proper Tasty have exploded on Facebook because the content is tailor-made for that platform.

The team at BuzzFeed clearly observed video trends on Facebook and jumped while the time was ripe.

By producing high-quality, visually appealing videos that users could easily replicate at home, Tasty not only entertained but also provided value, making it a go-to resource for culinary inspiration.

The videos are optimized for Facebook’s autoplay feature, which starts playing videos without the sound on.

You don’t need sound to see, for example, a 45-second guide to making a cheese-stuffed pizza pretzel.

11. Planet Fitness: Home Work-Ins

When: 2020

Campaign Outline:

In 2020, with the world grappling with lockdowns and gym closures, Planet Fitness set out to leverage Facebook to revolutionize home fitness.

As many of us scaled back our physical activity in order to shelter in place, Planet Fitness launched “The Home Work-In” series.

This innovative campaign transformed Facebook Live into a virtual gym, offering free, daily live workouts to motivate people globally.

To make it happen, the company equipped trainers across the country with the necessary tech to broadcast from their homes. These sessions featured professional trainers, celebrities, and athletes, ensuring variety and broad appeal.

The Numbers:

  • Over 373 million total campaign impressions.
  • Viewed by over 208 million people across 37 countries.
  • Increased the average watch time of Planet Fitness video content by 200%.
  • Drove over 4.3 million new Facebook followers.

Why Did It Work?

Planet Fitness’s Home Work-In campaign brilliantly tapped into the needs of a global audience confined to their homes, craving movement and community.

By leveraging Facebook Live, it provided real-time, interactive fitness solutions that were accessible and free, breaking down barriers to exercise.

The strategic use of celebrities and athletes added star power, while the quick launch just days after widespread closures highlighted the brand’s agility and commitment to its members.


X (Formerly Twitter)

12. Nickelodeon: A Message From Steve – Blue’s Clues 25th Anniversary

Screenshot from X (Twitter), February 2024

When: 2021

Campaign Outline:

To celebrate the 25th anniversary of “Blue’s Clues,” Nickelodeon decided to use X (Twitter) to reconnect with the now-adult audience who had cherished the show as children.

The strategy was to evoke nostalgia and warmth by reminding them of the timeless bond they shared with the show, using a special message from the original host, Steve.

The centerpiece of the celebration was a “Message From Steve,” a video in which Steve directly addressed the audience for the first time in decades.

The script, developed in close collaboration with Steve, touched on universal themes of adulthood, such as jobs, families, and student loans, while also acknowledging the growth and journeys of the audience since they last met Steve.

By using X (Twitter) as the distribution platform, Nickelodeon strategically featured Steve’s message in an area where it knew the conversation would flourish.

The Numbers:

  • Steve’s messaging became a viral sensation, garnering:
    • Over 40 million views.
    • Close to 800,000 retweets.
    • 2 million likes.
    • Over 222 million impressions and 18 million engagements.
  • It was the most engaging tweet of all time for any ViacomCBS account.
  • Blue’s Clues and Steve were a trending topic on social media for several days, with fans sharing their emotional reactions, memories, and more.
  • Celebrities, such as Seth Rogen and Blake Lively, and brands like Xbox and JCPenney engaged with the tweet.

Why Did It Work?

Steve’s return tapped into a deep well of nostalgia, which (as we’ve discussed above) is a powerful tool for engaging social media content.

It encouraged and allowed people to reconnect with childhood memories, and the sincerity of the message resonated with social media audiences all over the world.

By addressing the shared experiences of growing up and acknowledging the challenges of adulthood, the campaign fostered a powerful sense of community among viewers.

13. Busch: #PassMeABusch

When: 2022

Campaign Outline:

Busch Light had an ambitious goal: to dominate social media conversations on National Beer Day by making Busch Light the most talked-about beer brand.

To do that, it mobilized its passionate fanbase on X (Twitter) by turning April 7, 2022, into a celebration of beer, fueled by generous beer money giveaways.

The brand asked fans to share why they deserved to celebrate National Beer Day with Busch Light, promising $10,000 in beer money via CashApp for the most compelling reasons.

Throughout the day, it offered various giveaway amounts and “power hours” to maintain excitement and participation.

This led to fans sharing their unique, humorous, and sometimes poignant reasons for deserving a share of the beer money, generating widespread buzz and engagement.

The Numbers:

  • Busch Light became the No. 1 topic on X (Twitter) for National Beer Day.
  • The brand achieved:
    • Over 40,000 social mentions.
    • 1.7 million impressions.
    • Nearly 3,000 new followers.
  • The #PassMeABusch hashtag gained the company thousands of new followers.
  • One of the biggest growth days Anheuser-Busch ever saw on Twitter.

Why Did It Work?

The campaign’s genius lay in its simplicity and direct appeal to the audience’s love for beer and the brand.

By offering tangible rewards to fans, Busch Light created a sense of excitement that resonated across X (Twitter) and provided a strong incentive for engagement.

After all, people are much more likely to engage if they believe they might get something out of it!

The mix of humor, relatability, and the thrill of potentially winning beer money incentivized people to celebrate and engage, propelling Busch Light to unprecedented social media prominence on National Beer Day.

14. Planters: The Death Of Mr. Peanut – #RIPPeanut

When: 2020

Campaign Outline:

Perhaps one of the most bizarre social media campaigns: the beloved mascot of Planters snack food company died at the beginning of January 2020.

His death was announced with a tweet and later explained in a video ad posted to YouTube. The brand explained that Mr. Peanut had sacrificed his life to save his commercial co-stars, Matt Walsh and Wesley Snipes.

Planters invited fans to mourn the loss using the #RIPPeanut hashtag (which could also win them snacks).

Brands and regular social media users alike played along with the campaign, and it even got a mention on SNL.

The campaign was inspired by the reaction to celebrity deaths on social media. It aimed to repeat the same level of engagement that Tony Stark’s death caused in “Avengers: Endgame.”

Later, Mr. Peanut was reborn as a Baby Nut and now happily tweets from the Peanut Jr. account.

The Numbers:

  • The tweet announcing the death of Mr. Peanut has gathered 42,000 retweets.
  • It generated an increase of 24,000 followers for the @MrPeanut Twitter account.

Why Did It Work?

The campaign’s success hinged on its sheer audacity and the playful engagement with a topic as somber as death, presented in a way that was both humorous and captivating.

The premise was so unexpected and so wild that it immediately piqued the interest of users across X (Twitter) and quickly became a meme.

By tapping into meme culture and encouraging the participation of other users and brands, Planters created a viral phenomenon that transcended traditional marketing campaigns.

Many comedians and funny Twitter personalities jumped into the conversation, making jokes about Mr. Peanut’s death – and other brands like Snickers, Crocs, and more joined in.

Planters did an exceptional job of taking the strange humor of the platform at the time and putting it to use in an interactive, emotional rollercoaster that demonstrated the power of creative storytelling and community engagement.


TikTok

15. P&G: #DistanceDance

When: 2020

Campaign Outline:

During the pandemic (seeing a trend here?), Procter & Gamble took to TikTok with a campaign designed to encourage social distancing.

Under the hashtag #DistanceDance, the company teamed up with social media star and former competitive dancer Charli D’Amelio to help slow the spread of the coronavirus.

For the first 3 million videos posted to the short-form video app, P&G donated to Feeding America and Matthew 25 Ministries.

The Numbers:

  • The hashtag has inspired more than 2.3 million posts to date.
  • Charli D’Amelio’s video received almost 7 million likes and had more than 135,000 comments.

Why Did It Work?

Recognizing that it needed to meet a younger audience on their platform of choice, P&G jumped fully into this TikTok campaign.

Partnering with an established influencer helped the company reach an audience it would otherwise have struggled to connect with.

The give-back component also created a feel-good reason to participate in the hashtag challenge.

16. Chipotle Mexican Grill: Chipotle x Corn Kid

When: 2023

Campaign Outline:

When an interview featuring 7-year-old Tariq (a.k.a. Corn Kid) expressing his love for corn captured TikTok’s heart and went viral, Chipotle saw an opportunity to jump into the conversation – and highlight its roasted chili-corn salsa.

Seemingly overnight, Chipotle jumped on the trend and orchestrated a collaboration with Corn Kid, creating a video of him enjoying his favorite corn salsa burrito bowl at Chipotle.

The Numbers:

  • The TikTok generated:
    • Over 59.6 million views.
    • Over 266,500 shares.
    • Over 9.3 million likes.
  • Nearly 13 million engagements across platforms.
  • Over 110 million video views across platforms.
  • Over 1.1 billion PR impressions from 768 stories.

Why Did It Work?

The partnership allowed Chipotle to enter a cultural TikTok conversation as it was unfolding in a way that felt authentic and memorable.

By being the first brand to partner with Corn Kid, Chipotle set itself apart from the competition and found a unique way to highlight its product.

The campaign’s success also stemmed from its rapid response to a fleeting cultural moment, showcasing Chipotle’s agility in content creation and ability to authentically engage with Gen Z.

The clever use of real-time culture mixed with Chipotle’s narrative around fresh ingredients resonated well with audiences, as it showed the brand really walks the walk.

17. State Farm: Jake Gets Social

When: 2022

Campaign Outline:

In order to reach the next generation of consumers, State Farm launched a TikTok campaign around its iconic “Jake from State Farm” character.

To do so, it made Jake a content creator on TikTok, having him participate in popular challenges and trends, and partner with recognizable influencers and celebrities on the platform.

The Numbers:

  • Grew the Jake from State Farm TikTok page to 640,000 followers in 2022.
  • The profile generated 1.75 million likes and 11.7 million organic views.
  • Achieved a 14.5% average engagement rate on owned videos.

Why Did It Work?

State Farm successfully integrated Jake into the TikTok environment in an authentic way by creating engaging, community-driven content.

By focusing on creative challenges, partnerships with popular TikTok creators and celebrities, and genuine interactions with other TikTok users, State Farm went beyond the typical corporate presence on social and built real connections with people.

On a platform that values novelty and authenticity, State Farm’s adaptability and attention to trends enabled it to lay a solid foundation for future engagement with Gen Z consumers.

18. FOX Entertainment: Special Forces: World’s Toughest Test TikTok Challenge

When: 2022

Campaign Outline:

Here’s a fun one.

In this example, FOX Entertainment introduced a brand new augmented reality (AR) obstacle course challenge on TikTok to promote the upcoming season of “Special Forces: World’s Toughest Test.”

Designed to reflect the show’s focus on overcoming physical and mental barriers, users participated in the challenge by trying to complete a 2-minute AR course using push-ups and planks.

The Numbers:

  • The AR experience collected over half a billion views.
  • It also generated 42 million likes and 2 million shares, putting FOX Entertainment in the top 1% of effect creators on TikTok.

Why Did It Work?

The AR challenge leveraged a unique capability of the TikTok platform to create something that was both interactive and immersive for users.

It also made sense for the brand to produce, as it aligned closely with the show’s themes.

Whether or not you knew about the show beforehand, you could enjoy the exciting AR challenge – and develop an awareness of the show in the meantime.

The success of this campaign underscores the power of creative content strategies that leverage emerging technologies to connect with audiences in meaningful and memorable ways.


LinkedIn

19. Harvard Business Review: Special Coverage: Coronavirus

When: 2020

Campaign Outline:

Because it’s so commonly used as a professional networking site, it’s easy to forget that LinkedIn is a social media platform just like Facebook or YouTube.

Harvard Business Review recognized it could fill a valuable role during the height of the pandemic by offering resources about the coronavirus.

Gathering many resources in one convenient place, it provided a credible source of information at a time when misinformation was running rampant.

The special coverage included information about developing work-from-home policies, responding to new variants, and helping find a new normal.

The Numbers:

  • HBR has over 14 million followers, many of whom benefited from this information.

Why Did It Work?

From fears of microchipping to governmental conspiracies, the sheer amount of outright false information about COVID-19 was staggering.

On top of this, this was uncharted territory for businesses of all types.

Leveraging the credibility of its parent institution, HBR provided quality, factual advice for dealing with a wide variety of pandemic-related issues.

20. Verizon: #NotDone

Screenshot from YouTube, #NotDone, Verizon, February 2024

When: 2020

Campaign Outline:

In 2020, the United States marked the 100th anniversary of women winning the right to vote.

To mark the occasion, Verizon launched the Future Fund, dedicating $5 million to nurture emerging female talent in technology and entertainment.

Then, the brand leveraged LinkedIn to creatively start a conversation about the historical underrepresentation of women and the roots of gender bias by creating posthumous LinkedIn profiles for pioneering women from history – such as Ada Lovelace, Dorothy Lavinia Brown, Chien-Shiung Wu, and more.

The campaign was designed to remind others that – until there are more women in tech and entertainment – we are #NotDone.

The Numbers:

  • Created posthumous LinkedIn profiles for the first time ever.
  • Engaged over 7 million users without any paid promotion.

Why Did It Work?

By leveraging LinkedIn to reintroduce historical figures to the modern job market, Verizon not only paid homage to their contributions but also starkly highlighted the brand’s messaging and values around the ongoing struggle for gender equality.

The platform was an effective choice for reaching professionals in decision-making roles within tech and entertainment, and the format Verizon chose was inherently buzzy, engaging, and never seen before.


Key Takeaway

Reflecting on the examples we’ve covered here, it’s worth noting how different they all are; they run the gamut of platforms, audiences, tactics, and messaging.

But one thing that does tie these brands together is this: They all found innovative ways to appeal to their targets and provide real value to people.

From Instagram to TikTok, these campaigns demonstrate the power of connecting with audiences in meaningful and unexpected ways.

The lesson for brands is to keep pushing the boundaries of engagement by offering value and relevance that resonates with their audience.

Embrace the challenge, and perhaps your campaign will be the next to inspire and captivate – and next year, you might even be featured on this list.



Featured Image: metamorworks/Shutterstock

How LinkedIn Unlocked A Genius SEO Strategy With AI via @sejournal, @martinibuster

LinkedIn’s Collaborative Articles feature reached the milestone of 10 million pages of expert content in one year, and weekly readership has risen by over 270% since September 2023. How LinkedIn reached these milestones, and how it plans to build on them, offers valuable lessons for creating an SEO strategy that combines AI with human expertise.

Why Collaborative Articles Works

The intuition underlying the Collaborative Articles project is that people turn to the Internet to understand subject matter topics, but what’s on the Internet is not always the best information from actual subject matter experts.

A person typically searches on Google, lands on a site like Reddit, and reads what’s posted, but there’s no assurance the information comes from a subject matter expert rather than the person with the biggest social media mouth. How does someone who is not a subject matter expert know that a post by a stranger is trustworthy and expert?

The solution was to leverage LinkedIn’s own experts to create articles on topics they know well. The pages rank in Google, which benefits the contributing subject matter expert and, in turn, motivates them to write more content.

How LinkedIn Engineered 10 Million Pages Of Expert Content

LinkedIn identifies subject matter experts and invites them to write an essay on a topic. The essay topics are generated by an AI “conversation starter” tool developed by a LinkedIn editorial team, and those topics are then matched to subject matter experts identified by LinkedIn’s Skills Graph.

The LinkedIn Skills Graph maps LinkedIn members to subject matter expertise through a framework called Structured Skills which uses machine learning models and natural language processing to identify related skills beyond what the members themselves identify.

The mapping starts with skills found in members’ profiles, job descriptions, and other text data on the platform, then uses AI, machine learning, and natural language processing to infer additional subject matter expertise the members may have.

The Skills Graph documentation explains:

“If a member knows about Artificial Neural Networks, the member knows something about Deep Learning, which means the member knows something about Machine Learning.

…our machine learning and artificial intelligence combs through massive amounts of data and suggests new skills and relations between them.

…Combined with natural language processing, we extract skills from many different types of text – with a high degree of confidence – to make sure we have high coverage and high precision when we map skills to our members…”
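To make the inference idea concrete, here is a minimal sketch of how a skill hierarchy could be expanded programmatically. The graph and skill names below are invented for illustration; this is not LinkedIn’s actual Skills Graph or Structured Skills data, just the general pattern the quote describes.

```python
# Hypothetical skill hierarchy: each skill points to broader skills it implies.
SKILL_PARENTS = {
    "Artificial Neural Networks": ["Deep Learning"],
    "Deep Learning": ["Machine Learning"],
    "Machine Learning": ["Artificial Intelligence"],
}


def expand_skills(declared_skills: set) -> set:
    """Walk the hierarchy to infer broader skills a member likely has."""
    inferred = set(declared_skills)
    frontier = list(declared_skills)
    while frontier:
        skill = frontier.pop()
        for parent in SKILL_PARENTS.get(skill, []):
            if parent not in inferred:
                inferred.add(parent)
                frontier.append(parent)
    return inferred


# A member who lists only one niche skill is also mapped to the broader topics.
print(expand_skills({"Artificial Neural Networks"}))
```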

Experience, Expertise, Authoritativeness and Trustworthiness

The underlying strategy of LinkedIn’s Collaborative Articles project is genius because it results in millions of pages of high-quality content by subject matter experts on millions of topics. That may be why LinkedIn’s pages have become increasingly visible in Google search.

LinkedIn is now refining its Collaborative Articles project with features meant to raise the quality of the pages even further.

  • Evolved how questions are asked:
    LinkedIn is now presenting scenarios to subject matter experts that they can respond to with essays that address real-world topics and questions.
  • New unhelpful button:
    There is now a button that readers can use to offer feedback to LinkedIn that a particular essay is not helpful. It’s super interesting from an SEO viewpoint that LinkedIn is framing the thumbs down button through the paradigm of helpfulness.
  • Improved topic matching algorithms:
    LinkedIn has improved how it matches members to topics with what it refers to as “Embedding Based Retrieval For Improved Matching,” which was created to address feedback from members about the quality of topic-to-member matching.

LinkedIn explains:

“Based on feedback from our members through our evaluation mechanisms, we focused our efforts on our matching capabilities between articles and member experts. One of the new methods we use is embedding-based retrieval (EBR). This method generates embeddings for both members and articles in the same semantic space and uses an approximate nearest neighbor search in that space to generate the best article matches for contributors.”
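For readers unfamiliar with the technique, here is a minimal sketch of the general embedding-based retrieval pattern: represent articles and members as vectors in the same space and rank members by similarity to an article. The names and vectors are made up, and a brute-force cosine search stands in for the approximate nearest neighbor index a production system like LinkedIn’s would use.

```python
import numpy as np

# Hypothetical pre-computed embeddings (in practice these come from a trained model).
article_embeddings = {
    "How to prioritize a product roadmap": np.array([0.9, 0.1, 0.0]),
    "Debugging flaky CI pipelines": np.array([0.1, 0.8, 0.2]),
}
member_embeddings = {
    "member_a (product manager)": np.array([0.85, 0.15, 0.05]),
    "member_b (build engineer)": np.array([0.05, 0.9, 0.1]),
}


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))


def best_contributors(article: str, top_k: int = 1) -> list:
    """Rank members by similarity to the article in the shared embedding space."""
    target = article_embeddings[article]
    scored = [(name, cosine(target, vec)) for name, vec in member_embeddings.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)[:top_k]


print(best_contributors("Debugging flaky CI pipelines"))
```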

Top Takeaways For SEO

LinkedIn’s Collaborative Articles project is one of the best-strategized content creation projects to come along in a long while. What makes it not just genius but revolutionary is that it uses AI and machine learning technology together with human expertise to create expert, helpful content that readers enjoy and can trust.

LinkedIn is now using user interaction signals to improve the quality of the subject matter experts that are invited to create articles as well as to identify articles that do not meet the needs of users.

The benefit of creating articles is that high-quality subject matter experts are promoted every time their article ranks in Google. That gives anyone promoting a service or a product, or looking for clients or their next job, an opportunity to demonstrate their skills, expertise, and authoritativeness.

Read LinkedIn’s announcement of the one-year anniversary of the project:

Unlocking nearly 10 billion years worth of knowledge to help you tackle everyday work problems

Featured Image by Shutterstock/I AM NIKOM

Google Confirms: High-Quality Content Is Crawled More Often via @sejournal, @MattGSouthern

SEO professionals have long discussed the concept of a “crawl budget,” which refers to the limited number of pages a search engine will crawl on a given site each day.

The assumption is that sites must stay within this allotted budget to get pages indexed. In a recent podcast, Google search engineers debunked some misconceptions about crawl budget and shed light on how Google prioritizes crawling.

How Googlebot Prioritizes Crawling

“I think there’s a lot of myths out there about crawling, about what it is and what it isn’t. And things like crawl budgets and phrases you hear thrown around that may be quite confusing to people,” said Dave Smart, an SEO consultant and Google Product Expert, during the podcast.

So, how does Google decide what to crawl?

“You need to do it by looking at what’s known, finding somewhere to start, a starting point. And from that, you get the links and stuff, and then you would try and determine what’s important to go and fetch now, and maybe what can wait until later and maybe what’s not important at all,” explained Smart.

Gary Illyes from Google’s search relations team agreed with this framework.

“If search demand goes down, that also correlates to the crawl limit going down. So if you want to increase how much we crawl, you somehow have to convince search that your stuff is worth fetching,” he said.

The key, then, is to produce content that Google recognizes as valuable based on user interaction.

Focus On Quality & User Experience

“Scheduling is very dynamic. As soon as we get the signals back from search indexing that the quality of the content has increased across this many URLs, we would just start turning up demand,” said Illyes.

This means there is no fixed “budget” that sites must adhere to. Improving page quality and proving usefulness to searchers can overcome any assumed limitations.

No One-Size-Fits-All Approach

“We don’t have an answer for every site,” Illyes admitted regarding crawl prioritization. “If you improved that section, then probably it’s going to help a lot.”

According to Google, the bottom line is to focus on producing high-quality content rather than trying to reverse-engineer a non-existent crawl quota. Earning links naturally and better serving users will take care of the rest.

Hear the full discussion in the podcast episode linked below:


FAQ

How does the concept of a crawl budget affect SEO strategies?

SEO professionals have discussed the concept of a crawl budget, believing that staying within a certain limit of pages crawled daily is essential. However, Google’s search engineers have clarified that there is no set crawl budget that websites must adhere to.

Instead, Google prioritizes crawling based on content quality and user interaction signals. Therefore, SEO strategies should shift focus from managing a crawl budget to optimizing for high-quality, user-centric content to increase the chances of being crawled and indexed effectively.

What factors influence Googlebot’s prioritization for crawling web pages?

A dynamic set of factors influences Googlebot’s prioritization for crawling web pages, predominantly content quality and user engagement. According to Google search engineers, the more valuable the content appears based on user interactions, the more frequently the site is likely to be crawled.

Factors such as earning organic links and improving user experience can enhance content quality signals, thus implying that enhancing overall page quality can increase a site’s crawl rate.

In what ways can marketers enhance the crawlability of their website’s content?

Marketers looking to improve their website’s crawlability should concentrate on the following:

  • Producing high-quality content that is informative, relevant, and engaging to the target audience.
  • Ensuring the website offers a superior user experience with fast loading times, mobile-friendliness, and navigational ease.
  • Gaining natural backlinks from reputable sources to increase credibility and visibility to search engines.
  • Regularly updating content to reflect the latest information, trends, and user needs.


Featured Image: BestForBest/Shutterstock

The Saga Of John Mueller’s Freaky Robots.txt via @sejournal, @martinibuster

The robots.txt file of the personal blog of Google’s John Mueller became a focus of interest when someone on Reddit claimed that Mueller’s blog had been hit by the Helpful Content system and subsequently deindexed. The truth turned out to be less dramatic than that, but it was still a little weird.

SEO Subreddit Post

The saga of John Mueller’s robots.txt started when a Redditor posted that John Mueller’s website had been deindexed, claiming it had fallen afoul of Google’s algorithm. As ironic as that would have been, it was never the case, because a quick look at the website’s robots.txt showed that something strange was going on.

Here’s the top part of Mueller’s robots.txt which features a commented Easter egg for those taking a peek.

The first bit that’s not seen every day is a disallow rule aimed at the robots.txt file itself. Who uses their robots.txt to tell Google not to crawl their robots.txt?

Now we know.

Screenshot of the top of Mueller’s robots.txt file

The next part of the robots.txt blocks all search engines from crawling the website and the robots.txt.

Screenshot of the robots.txt rules blocking all search engines

So that probably explains why the site is deindexed in Google. But it doesn’t explain why it’s still indexed by Bing.

I asked around, and Adam Humphreys, a web developer and SEO (LinkedIn profile), suggested that it might be that Bingbot hasn’t visited Mueller’s site because it’s a largely inactive website.

Adam messaged me his thoughts:

“User-agent: *
Disallow: /topsy/
Disallow: /crets/
Disallow: /hidden/file.html

In those examples the folders and that file in that folder wouldn’t be found.

He is saying to disallow the robots file which Bing ignores but Google listens to.

Bing would ignore improperly implemented robots because many don’t know how to do it. “

Adam also suggested that maybe Bing disregarded the robots.txt file altogether.

He explained it to me this way:

“Yes or it chooses to ignore a directive not to read an instructions file.

Improperly implemented robots directions at Bing are likely ignored. This is the most logical answer for them. It’s a directions file.”

The robots.txt was last updated sometime between July and November of 2023, so it could be that Bingbot hasn’t seen the latest robots.txt. That makes sense because Microsoft’s IndexNow web crawling system prioritizes efficient crawling.

One of the directories blocked by Mueller’s robots.txt is /nofollow/ (which is a weird name for a folder).

There’s basically nothing on that page except some site navigation and the word “Redirector.”

I tested to see if the robots.txt was indeed blocking that page and it was.

Google’s Rich Results tester failed to crawl the /nofollow/ webpage.

Screenshot of the Rich Results test showing the /nofollow/ page blocked
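If you want to run the same kind of check yourself, Python’s standard-library robots.txt parser is enough for a rough test. This is a hedged sketch of the general approach rather than the Rich Results tester used above, and real crawlers may handle edge cases in this particular file differently.

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt, then ask whether a given crawler may fetch a URL.
parser = RobotFileParser("https://johnmu.com/robots.txt")
parser.read()

for url in ("https://johnmu.com/nofollow/", "https://johnmu.com/robots.txt"):
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url} -> {'allowed' if allowed else 'blocked'} for Googlebot")
```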

John Mueller’s Explanation

Mueller appeared to be amused that so much attention was being paid to his robots.txt and he published an explanation on LinkedIn of what was going on.

He wrote:

“But, what’s up with the file? And why is your site deindexed?

Someone suggested it might be because of the links to Google+. It’s possible. And back to the robots.txt… it’s fine – I mean, it’s how I want it, and crawlers can deal with it. Or, they should be able to, if they follow RFC9309.”

Next, he said that the disallow rule for the robots.txt was simply to stop the file from being indexed as an HTML file.

He explained:

“”disallow: /robots.txt” – does this make robots spin in circles? Does this deindex your site? No.

My robots.txt file just has a lot of stuff in it, and it’s cleaner if it doesn’t get indexed with its content. This purely blocks the robots.txt file from being crawled for indexing purposes.

I could also use the x-robots-tag HTTP header with noindex, but this way I have it in the robots.txt file too.”

Mueller also said this about the file size:

“The size comes from tests of the various robots.txt testing tools that my team & I have worked on. The RFC says a crawler should parse at least 500 kibibytes (bonus likes to the first person who explains what kind of snack that is). You have to stop somewhere, you could make pages that are infinitely long (and I have, and many people have, some even on purpose). In practice what happens is that the system that checks the robots.txt file (the parser) will make a cut somewhere.”

He also said that he added a disallow on top of that section in the hope that it gets picked up as a “blanket disallow,” but I’m not sure which disallow he’s talking about. His robots.txt file has exactly 22,433 disallows in it.

He wrote:

“I added a “disallow: /” on top of that section, so hopefully that gets picked up as a blanket disallow. It’s possible that the parser will cut off in an awkward place, like a line that has “allow: /cheeseisbest” and it stops right at the “/”, which would put the parser at an impasse (and, trivia! the allow rule will override if you have both “allow: /” and “disallow: /”). This seems very unlikely though.”

And there it is. John Mueller’s weird robots.txt.

Robots.txt viewable here:

https://johnmu.com/robots.txt

Google Search Console Adds INP Metric In Core Web Vitals Report via @sejournal, @MattGSouthern

Google has announced that Interaction to Next Paint (INP), a new metric for measuring website interactivity, is now included as a key element in the Search Console’s Core Web Vitals report.

As of March 12, 2024, INP replaced First Input Delay (FID) as a Core Web Vital, signaling a shift in how Google evaluates user experience.

The INP metric, introduced as an experimental measure in May 2022, captures the time between a user’s interaction with a page (such as clicking a button) and when the browser can render the resulting changes on the screen.

This approach aims to provide a more comprehensive assessment of interactivity than FID, which only measured the delay before the browser could begin processing the very first user interaction.

Evolving Web Metrics For Better User Experience

Google’s Web Vitals initiative, launched in 2018, provides developers with metrics to help optimize critical aspects of user experience. FID was one of the original metrics introduced as part of this effort. However, over time, Google recognized FID’s limitations in fully capturing interactivity, leading to the development of INP.

After a transition period as a ‘pending metric,’ INP replaces FID as a Core Web Vital. This change reflects Google’s ongoing commitment to refining its methods for evaluating and improving web user experience.

Adapting To The INP Transition

With the INP transition now in effect, web developers are advised to assess their website’s current INP performance and take steps to optimize for the new metric.

To evaluate current INP scores, you can use tools like PageSpeed Insights and Chrome’s User Experience Report. Google recommends aiming for the “good” threshold, representing performance at the 75th percentile of page loads.
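One way to pull field-data INP for a URL programmatically is the public PageSpeed Insights API. The sketch below shows the general shape of such a check; the endpoint is real, but treat the response field names (in particular the INP metric key) as assumptions to verify against the current API documentation, and supply your own API key.

```python
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"


def fetch_inp(url: str, api_key: str) -> None:
    """Query PageSpeed Insights and print CrUX field data for INP, if present."""
    query = urllib.parse.urlencode({"url": url, "key": api_key})
    with urllib.request.urlopen(f"{PSI_ENDPOINT}?{query}") as response:
        data = json.load(response)

    metrics = data.get("loadingExperience", {}).get("metrics", {})
    # Assumed metric key; older responses used an EXPERIMENTAL_ prefix for INP.
    inp = metrics.get("INTERACTION_TO_NEXT_PAINT")
    if inp:
        print(f"INP p75: {inp['percentile']} ms ({inp['category']})")
    else:
        print("No INP field data available for this URL.")


fetch_inp("https://www.example.com/", api_key="YOUR_API_KEY")
```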

Developers should then diagnose and address issues impacting INP, such as long-running JavaScript tasks, excessive main thread activity, or overly complex DOM structures.

Implications For Web Development & Search Rankings

The adoption of INP as a Core Web Vital has implications for web development practices and SEO.

As Google incorporates Core Web Vitals into its ranking systems, websites with strong INP scores may see positive changes in search rankings and user engagement metrics.

Web development practices may evolve to prioritize optimizing interaction readiness. This might require developers to re-evaluate application architectures, streamline code, and refine design elements to minimize interaction delays.

In Summary

By replacing the FID metric with INP, Google aims to offer a more comprehensive assessment of website interactivity.

As you navigate this transition, you can now use Search Console to monitor INP performance and take steps to address any issues that may be impacting scores.


FAQ

What is Interaction to Next Paint (INP), and why is it important?

  • Interaction to Next Paint (INP) is a performance metric in Google’s Core Web Vitals report that measures a website’s responsiveness and interactivity.
  • It provides a more complete assessment of user experience by capturing the time between a user action (e.g., clicking a button) and when the browser updates the screen to reflect that action.
  • INP is crucial because it offers a granular view of website performance, influencing user satisfaction and rankings in Google’s search results.

How can marketers and web developers optimize websites for INP?

  • To optimize for INP, evaluate current website performance using tools like PageSpeed Insights or Chrome’s User Experience Report.
  • Address issues affecting INP, such as minimizing long JavaScript tasks and reducing main thread activity.
  • Consider design modifications and code optimization that reduce interaction latency, ensuring a swift and smooth user experience throughout the site.

What does the transition from FID to INP as a Core Web Vital entail for SEO?

  • The shift from First Input Delay (FID) to Interaction to Next Paint (INP) as a Core Web Vital signifies Google’s continued refinement in measuring user experience for ranking purposes.
  • As Core Web Vitals are part of Google’s ranking factors, websites with better INP scores could see improved search rankings and user engagement.
  • This transition signals that web developers and SEO professionals should tailor their optimization strategies to prioritize INP, thus aligning with Google’s evolving standards for user experience.


Featured Image: BestForBest/Shutterstock