AI Overviews – How Will They Impact The Industry? We Ask Pedro Dias via @sejournal, @theshelleywalsh

Generative AI, SGE, and now AI Overviews have been hot topics since the launch of ChatGPT in November 2022, which gave generative AI an accessible interface for the wider market.

Since then, the SEO industry has been trying to figure out just how much search behavior will change and how much this will impact organic search traffic.

Will we see the catastrophic drops in clicks that are being estimated?

Google’s aim is to integrate Gen AI into search to provide better answers and, in its words:

Sometimes you want a quick answer, but you don’t have time to piece together all the information you need. Search will do the work for you with AI Overviews.

However, there has been much contention and discussion about this because, in practice, the results are somewhat unpredictable – with AI Overviews offering advice such as the health benefits of running with scissors, bathing with a toaster, and adding glue to pizza to make the cheese stick.

Google is still experimenting with AIO. A recent study from SE Ranking (June 19) showed the frequency of AIOs in SERPs has dropped from 64% to 8% of queries. Meanwhile, BrightEdge reports that Google went from showing AIOs on 84% of queries to 15%.

Google also keeps experimenting with how AIO results appear in SERPs, and the latest iteration features citations in the top carousel.

Gen AI is disrupting the industry faster than anything else in the 25-year history of SEO. Some of the main discussion points for SEO include: How much is AI plagiarizing content, and how much do we need to pivot our approach to SEO?

I spoke to ex-Googler Pedro Dias and asked him:

In Your Opinion, What Do You Think About AI Overviews, How Will They Impact The Industry, And Where Is This Going?

This is what Pedro had to say:

“As I have mentioned before, Google wants to be your personal assistant and not your friendly librarian.

This is an important distinction, and it’s the perspective from which to see Google moving forward. Instead of pointing us to the books, they will do the work for us.

If we continue to put out content that only requires a quick answer, this is where we will be disrupted. We need to focus on what people want beyond quick answers.

Google wants to be the personal assistant and caters for this by providing quick answers.

AI Overviews are no more than an evolution of instant answers.

If site owners want to target the quick answers, they should also put effort into more in-depth content that they can funnel readers to and that, ideally, is closed to Google.

By doing this, you can protect the content assets you build.

We need to focus more than ever on building our own communities with users aligned to our brands. And doing more than simply providing a ‘this will do’ snippet, or an instant answer.

Right now, it’s impossible to predict how AIO will develop and what the format will be. Google keeps changing how it is presenting the SERP results and playing with the format much like live beta testing.

But, AIO will trigger different search behaviors.

Before, in SEO, we had ten blue links and no instant answers. Users had to visit your website to get the answer, so a site could get considerable traffic for a basic question.

However, this type of traffic has little value, and these are not your customers – they are Google’s customers.

We need to understand how we can distinguish between instant answer traffic and users who want to consume our content. And this is the area where we should put our efforts.

Focus on building content for the people who don’t want the summary or the quick answer. Those who want to ‘read the book’ and consume the details to augment their knowledge.

In the same way that the web disrupted the music and publishing industries, we are about to go through another change, and we have to adapt to it. It’s a matter of when, not if.”

How Can You Leverage AIO And Google To Build A Content Community?

I asked Pedro:

“If we want to embrace this new approach, it will require thinking about how to gain users from a ‘take all the traffic you can get’ mentality to a selective one – leveraging Google to provide targeted traffic that you can absorb into your own community.

This will be a big change for some, so how can you leverage Google to achieve this?”

Pedro responded:

“Trying to figure out how much ‘discovery’ traffic Google will take away will be different for all verticals. For example, the legal and accountancy industries are built on consultants who understand, and are gatekeepers to, complex rules.

You can now ask AI to explain complex legislation on wider topics. But, if you have a specific scenario, you still need to visit a specialist who can deal with this for you.

AI can give you the wider information, but the expert is still needed for the detail.

As professionals in SEO, we can create content that covers broad concepts that AI can tap into. And then, for the specific scenarios and questions, this is where we can build out much more in-depth content.

This in-depth content is kept away from Google and AI and gated for your community or clients.

Every business will need to consider where to draw this line of what they give away for free and what they keep back as a premium.

AI creates more distance between those who know a little and the specialists who will be the sources of information.

The middle ground is about to disappear.

The professionals will remain because industries rely on the knowledge and the research these people do. And the rest will just be the rest.

Users will be divided into those who want a little information from AI and then the others who want specialist in-depth knowledge.

Being able to discern where you fit into this scenario and being able to create a strategy around this is how you can adapt.”

Fundamental Rules Never Change

I think we can expect more experimentation from Google before we begin to embrace AI in SERPs and SEO.

During a time of great flux, the best thing we can focus on is the fundamental rules that never change. And those fundamentals are all centered around how a brand builds a direct relationship with its users.

For SEO pros, it could be a challenging shift to move away from chasing high-volume keyword traffic. Instead, they will need to look at building user journeys and considering content touchpoints where relevant.

The old approach of gaining huge amounts of traffic by ranking for one high-volume keyword is becoming outdated. Moving forward, more effort will be needed to achieve far fewer clicks. However, those clicks should be far more relevant and beneficial.

Thank you to Pedro Dias for offering his opinion and being my guest on IMHO.

Featured Image: BestForBest/Shutterstock

Beyond Rankings: Important Metrics To Measure For SEO Success via @sejournal, @AdamHeitzman

SEO metrics such as rankings are great, but what matters most is SEO’s impact on business growth.

This means you can only understand SEO’s value if you track the right metrics and understand how they contribute to revenue.

Your metrics should focus on:

  • Audience quality: Are you attracting visitors who are likely to become customers?
  • Engagement and behavior: Are users finding the information they need, spending time on your site, and taking desired actions?
  • Conversions: Is your organic traffic translating into desired outcomes?
  • Brand impact: Is SEO influencing your brand’s reputation and visibility?

In this article, we’ve categorized important metrics to focus on at a high level.

User Engagement Metrics

Here are some user engagement metrics to track:

Bounce Rate

Bounce rate is the percentage of visitors who return to the SERP or exit the webpage (and your site) without interacting with another page on your website. A high bounce rate can indicate that visitors are not finding what they want on your site, causing them to exit quickly.

Why Is Bounce Rate Important In SEO?

Bounce rate helps you fix issues such as:

  • User Experience: A high bounce rate may indicate issues with your website’s content, design, or alignment with user intent. When you notice a high bounce rate, address these issues to improve user experience.
  • SEO Rankings: Search engines aim to provide users with the most relevant results. A high bounce rate may signal to Google that your site is not meeting user expectations, which can cost it visibility. This also affects conversions, since users didn’t engage with your page.

How To Analyze Bounce Rate

Check Google Analytics 4 for the number of single-page visits and divide it by the total number of visits. The result, expressed as a percentage, is your bounce rate.

For example, if your website received 500 visitors and 100 interacted with more than one page, then 400 visitors bounced. Your bounce rate would be 80% (400 single-page visits / 500 total visits × 100).
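The same arithmetic is easy to script if you pull these numbers programmatically. A minimal sketch in Python, using the hypothetical figures from the example above:

```python
def bounce_rate(single_page_visits: int, total_visits: int) -> float:
    """Return the bounce rate as a percentage of total visits."""
    if total_visits == 0:
        return 0.0
    return single_page_visits / total_visits * 100

# Figures from the example: 500 visitors, 100 of whom viewed more than one page.
total_visits = 500
multi_page_visits = 100
print(f"{bounce_rate(total_visits - multi_page_visits, total_visits):.0f}%")  # 80%
```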

Read more: 20 Proven Ways To Reduce Your Bounce Rate.

Dwell Time (Engaged Session Duration)

Engaged session duration measures the amount of time a user actively spends on your website during an engaged session. This metric indicates how long users interact with your content before leaving the site or becoming inactive.

For example, if a user searches for “best running shoes,” clicks on your link, spends three minutes reading your content, and continues to interact with other parts of your site, the engaged session duration is three minutes.

Why Is Engaged Session Duration Important In SEO?

  • It indicates content engagement and relevance: Longer engaged session durations show that users find your content valuable and are willing to spend time interacting with it.
  • It impacts rankings: High-engaged session durations signal to search engines that your page provides content that satisfies user intent, which can improve your rankings.
  • It helps gauge content effectiveness: If users spend more time on your site, it suggests your content is meeting their expectations and providing the information they need.

How To Analyze Dwell Time

  • Open GA4 and click on Reports in the left-hand menu.
  • Choose the Traffic acquisition: Session default channel group.
  • Click on the pencil icon at the top right corner and select Metrics.
  • In the bottom search box that says Add metric, type “average engagement time” and hit Apply.
Traffic acquisition: Session default channel group. Screenshot from GA4, June 2024

Engaged Sessions Per User

Engaged sessions per user is a metric that measures how frequently users interact meaningfully with your website.

In Google Analytics, an engaged session is defined by user activity that includes spending a certain amount of time on the site, viewing multiple pages, or completing specific actions like form submissions or purchases.

For example, if a user lands on your homepage, spends more than a minute exploring your content, clicks on a product page, and completes a form, this counts as an engaged session.

Why Is Engaged Sessions Per User Important To SEO?

  • It reflects user engagement and satisfaction: High engaged sessions per user indicate that visitors find your content valuable and are willing to interact with it in a meaningful way.
  • It impacts SEO positively: Search engines use engagement metrics as signals of content quality and relevance. High engagement suggests that your site meets user needs, which can boost your rankings.

How To Calculate Engaged Sessions Per User

Google Analytics provides this metric directly, but to calculate it manually, divide the total number of engaged sessions by the number of unique users.

For example, if your website had 50,000 engaged sessions and 20,000 unique users in a month, engaged sessions per user equals 50,000 divided by 20,000 (2.5).

This means, on average, each user had 2.5 engaged sessions during that month.
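If you prefer to compute this outside the GA4 interface, the formula is a one-liner. A minimal Python sketch with the figures from the example above:

```python
def engaged_sessions_per_user(engaged_sessions: int, unique_users: int) -> float:
    """Average number of engaged sessions per unique user."""
    if unique_users == 0:
        return 0.0
    return engaged_sessions / unique_users

print(engaged_sessions_per_user(50_000, 20_000))  # 2.5
```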

Read more: Essential GA4 Reports You Need To Measure Your SEO Campaigns.

Conversion Metrics

Here are helpful conversion metrics to track:

Organic Conversion Rate

The organic conversion rate is the percentage of visitors who find your website through organic search results and complete a desired action. This could be:

  • Making a purchase (usually on ecommerce sites).
  • Submitting a lead form (for businesses focused on lead gen).
  • Newsletter subscription (to build an email list).

Or any other goal that moves them further along the customer journey.

This metric shows how SEO drives valuable clicks that contribute to your business objectives.

How To Calculate The Organic Conversion Rate

  • Determine what constitutes a conversion for your business (e.g., form completion, sales, subscription).
  • Track the number of users who complete the desired action and the total number of organic visitors over a specific period.
  • Divide the number of conversions by the total number of organic visitors, then multiply by 100 to get a percentage.

As a formula, the organic conversion rate equals the number of conversions divided by the number of organic visitors, multiplied by 100.

This means if 500 out of 10,000 organic visitors complete the desired action, the conversion rate would be 5%.
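Here is the same calculation as a short Python sketch, using the figures above:

```python
def organic_conversion_rate(conversions: int, organic_visitors: int) -> float:
    """Percentage of organic visitors who completed the desired action."""
    if organic_visitors == 0:
        return 0.0
    return conversions / organic_visitors * 100

print(f"{organic_conversion_rate(500, 10_000):.1f}%")  # 5.0%
```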

Read more: What Is Conversion Rate & How Do You Calculate It?

Goal Completions

A goal completion is recorded whenever a user completes a specific action you’ve defined as valuable. These actions could be the same ones listed in the previous section.

Goal completions matter because they tell you whether your SEO is driving the right traffic and whether visitors are taking the actions you want them to take.

How To Track Goal Completions

  • Choose an analytics platform such as GA4, Adobe Analytics, Matomo, etc.
  • Define your goals and be specific (e.g., “purchase confirmation page viewed”).
  • Set up goal tracking.

For this article, we’ll use GA4, and tracking looks like this:

  • Go to the Admin section.
  • In the Property column, click on Events.
  • Click the “Create Event” button to set up a new event.
  • Name your event (e.g., “form_submission” or “purchase_completed”).
  • Define the conditions for your event. For example, if tracking a form submission, set parameters like event name equals “form_submit” or similar.
  • Click Create to save your new event.
  • Mark that event as a Key Event (conversion).
  • Then, monitor and analyze the reports to track goal completions.
Key events. Screenshot from GA4, June 2024

Ecommerce Transactions

In ecommerce, a conversion is the completion of a desired action that generates revenue.

The most apparent conversion is a purchase, but other valuable actions include adding items to a cart, creating an account, or subscribing to emails.

What Does Tracking Ecommerce Transactions Look Like?

  • A user searches for [best running shoes] on Google.
  • They click on your blog post, “Top 10 Running Shoes for 2024,” which ranks high in organic search results.
  • They read your review and click on the buy button link to a product page on your website.
  • They add the shoes to their cart and complete the purchase.

If you enable ecommerce tracking in GA4, it’ll track the entire customer journey (from product view to purchase).

UTM parameters will identify the blog post as the conversion source, your attribution model will assign credit to the post, and your CRM can link the purchase to the user’s profile for further analysis.

Follow this process to track ecommerce sales on Google Analytics 4.

Traffic Metrics

Here are some traffic metrics to prioritize:

Organic Traffic Volume

Organic traffic volume is the number of visitors arriving at your website through unpaid search results – organic clicks from search engine result pages (SERPs).

High organic traffic indicates that search engines consider your website relevant and authoritative for your target keywords.

This way, as long as you publish quality content, your website can attract and convert users without relying on paid advertising.

How To Measure Organic Traffic

Log into GA4 and go to Acquisition Reports. Navigate to Reports > Acquisition > Traffic Acquisition.

This report provides a detailed breakdown of your traffic sources, including organic search.

Organic Traffic Value

Organic traffic value goes beyond numbers to assess the actual worth of visitors your SEO efforts attract. It quantifies the potential revenue or business impact of your organic traffic.

Organic traffic value is ROI-focused; it answers the question, “What is the monetary value of the organic traffic we’re getting?”

The answer then informs decisions on how to allocate marketing resources.

How To Calculate Organic Traffic Value

You can either use the cost-per-click (CPC), conversion-based value, or the customer lifetime value (LTV) metrics:

  • The CPC method estimates the value of organic traffic by calculating how much you would have spent on paid advertising (PPC) to get the same number of clicks. It uses the average CPC for your target keywords.

If your website receives 1,000 organic clicks per month for a keyword with an average CPC of $2, the estimated organic traffic value would be $2,000.

  • The conversion-based value metric calculates the revenue generated from organic traffic by tracking conversions and assigning a value to each conversion. For example, if your website receives 1,000 organic visitors and 50 convert into customers with an average order value of $100, the organic traffic value would be $5,000.
  • Another method is the customer lifetime value (LTV). This method takes a long-term view by considering the total value a customer brings over their entire relationship with your business. It factors in repeat purchases, customer retention, and average order value.

For example, if your average customer from organic search makes three purchases per year with an average order value of $100 and remains a customer for two years, their LTV would be $600.
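All three valuation methods come down to simple arithmetic. A minimal Python sketch, using the figures from the examples above:

```python
def cpc_value(organic_clicks: int, avg_cpc: float) -> float:
    """Estimate what the same clicks would have cost as paid (PPC) traffic."""
    return organic_clicks * avg_cpc

def conversion_value(conversions: int, avg_order_value: float) -> float:
    """Revenue attributed to organic conversions."""
    return conversions * avg_order_value

def ltv_value(purchases_per_year: float, avg_order_value: float, years_retained: float) -> float:
    """Lifetime value of a customer acquired from organic search."""
    return purchases_per_year * avg_order_value * years_retained

print(cpc_value(1_000, 2.00))        # 2000.0
print(conversion_value(50, 100.00))  # 5000.0
print(ltv_value(3, 100.00, 2))       # 600.0
```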

Technical SEO Metrics

Technical SEO metrics provide insights into your website’s infrastructure to ensure search engines can access, crawl, and index your content. Here are some metrics to focus on:

Crawl Errors

Crawl errors occur when search engine bots (like Googlebot) encounter issues while crawling pages on your website.

These errors can prevent search engines from understanding your content, potentially leading to lower rankings and visibility in SERPs.

Types of Crawl Errors

  • 404 (Not Found): The requested page doesn’t exist. This could be due to a broken link, a deleted page, or a typo in the URL.
  • 5xx (Server Errors): The server encountered an error while processing the request. This could be due to a temporary outage, a misconfiguration, or a server overload.
  • Robots.txt Errors: The robots.txt file blocks search engine bots from accessing certain pages or sections of your website.

How To Identify Crawl Errors

Head to Google Search Console (GSC). Go to Index > Coverage to see a list of crawl errors and warnings. Click on each error for more details, including the affected URLs and the error type. Then, prioritize the most critical errors, such as 404 errors on essential pages.

You can also check your server logs for crawl errors that might not appear in GSC.

To fix 404 errors, try these processes (a quick audit script follows the list):

  • Restore the page if it was accidentally deleted.
  • Create a 301 redirect to the new URL if the page has been moved permanently.
  • Create a helpful custom 404 page that guides users back to relevant content.
  • Afterward, validate your fixes using the URL Inspection tool in GSC to test if the fixed page can be crawled and indexed correctly.
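As promised above, a quick audit script. This sketch assumes the third-party requests library and a hypothetical URL list; in practice, you would export the affected URLs from GSC or your server logs:

```python
import requests

# Hypothetical URLs to audit before and after applying fixes.
urls = [
    "https://www.example.com/",
    "https://www.example.com/old-page",
]

for url in urls:
    try:
        # HEAD is cheaper than GET; don't follow redirects so 301 fixes stay visible.
        resp = requests.head(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    if resp.status_code == 404:
        print(f"{url}: 404 - restore, redirect, or serve a helpful custom 404 page")
    elif 500 <= resp.status_code < 600:
        print(f"{url}: {resp.status_code} - check server configuration or load")
    elif resp.status_code in (301, 302):
        print(f"{url}: redirects to {resp.headers.get('Location')}")
    else:
        print(f"{url}: {resp.status_code}")
```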

Indexation Status

Indexation status refers to whether or not a specific webpage has been added to a search engine’s index.

When a page is indexed, it appears in search results when users search for relevant queries. In contrast, if a page is not indexed, it’s invisible to search engines and won’t be found by users.

How To Ensure Proper Indexing of Pages

  • Create high-quality, unique content and use relevant keywords to signal to search engines what your page is about.
  • Submit a sitemap to help search engines discover and crawl your pages.
  • Optimize internal linking to help search engine bots navigate your site and discover all your pages.
  • Check robots.txt to ensure the file is not blocking search engines from crawling and indexing critical pages.
  • Monitor indexation status by checking the Index > Coverage report in GSC to see which pages have been indexed and if there are any indexing errors.
Page tracking. Screenshot from GA4, June 2024

Site Speed

Site speed is the time a website’s content takes to load and become fully interactive for users. Think of it as the digital stopwatch that measures the responsiveness and efficiency of your website.

Why Is Site Speed Important for SEO?

  • User experience (UX): Studies have shown that users expect websites to load within a few seconds. Fast website speed keeps users engaged, encourages them to explore more pages, consume more content, and ultimately convert into customers or leads. It also enhances the mobile experience.
  • Search engine rankings: Search engines prioritize faster websites because they provide a better user experience, which can help your faster website outrank slower competitors.


Content Performance Metrics

These metrics explore how effective your content is:

Content Engagement

Content engagement measures users’ level of interaction and involvement with your web pages.

It goes beyond passive consumption and delves into how visitors actively engage with your content to indicate genuine interest and value.

How To Measure Content Engagement

  • In GA4, track metrics like average engagement time, sessions, and engagement rate to gauge how long users actively interact with your content. You can also implement event tracking to measure specific interactions (video views, downloads, form submissions, or clicks on internal links).
  • Use heatmaps and session recording tools like Hotjar or Crazy Egg to visualize how users interact with your pages. This will reveal where they click, scroll, and spend the most time.

Content Shares And Backlinks

Content shares, or social signals, are the number of times your content is shared across social media platforms.

Social shares indicate that your content is valuable and worthy of being shared and can amplify reach, build brand awareness, and attract backlinks.

Backlinks, on the other hand, are links from external websites that point to your web pages. Quality backlinks from other authoritative sites act as “votes of confidence” and signal to search engines that your content is trustworthy and authoritative.

High-quality backlinks can boost rankings, drive referral traffic from other websites, and increase your domain authority.

To track social shares, use the built-in analytics tools provided by social media platforms to track the number of shares, likes, comments, and overall engagement for your content. You can also use third-party tools like Hootsuite or Buffer.

To track backlinks, use tools like Ahrefs, Semrush, or Moz to see your total backlinks, referring domains, and link quality.


Local SEO Metrics

Local SEO ensures your business appears when users search for products or services in your geographic area. Let’s start with getting insights from Google Business.

Google Business Profile Insights

Google Business Profile (GBP) is a free tool for businesses to manage their online presence across Google, including Search and Maps.

GBP Insights provides valuable data on how customers find and interact with your business listing.

How To Track GBP Performance

Log in to your GBP account and click the Insights tab. Look for the section titled How customers search for your business.

You’ll see a breakdown of:

  • Direct searches: When customers search for your business name (branded searches).
  • Discovery searches: When customers search for a general category, product, or service that you offer (non-branded searches).
  • Maps searches: When customers find your business through Google Maps.
Image from Google Support, June 2024

In the same Insights tab, look for the section called Where customers view your business on Google. It will show whether customers find your listing more often in Search results or Maps.

Where customers view your business on Google. Image from Google Support, June 2024

Also, check for customer actions in the Insights tab. Here, you can track website visits, calls made directly from your listing, and direction requests to your location. This data reveals how customers engage with your business after finding your listing.

Other data to track include photo views and search queries.

Local Search Rankings

Local search rankings refer to your business’s position in the SERPs for queries with local intent.

These searches include location-specific keywords like “coffee shops near me” or “best dentist in Albany.”

Local search results often include a map pack (a group of three to four businesses displayed on a map) and organic listings.

How to Track Local SEO Success

  • Track local keyword rankings through tools like Semrush, Ahrefs, or Moz Local. Monitor your rankings for critical local keywords, as well as your map pack rankings and organic rankings.
  • Monitor GBP Insights to learn how customers find your business, which actions they take, and which search queries they use.
  • Analyze local traffic and conversions on GA4 to segment your traffic by location and track conversions (phone calls, direction requests, website visits, purchases) that originated from local searches.
  • Track online reviews and ratings.

Read more on how to rank for Local Pack SEO.

Customer Reviews And Ratings

Customer reviews and ratings provide valuable customer feedback about their experiences with your business, products, or services.

These reviews are often publicly accessible on Google, Yelp, Facebook, and industry-specific review sites.

Why Are Reviews Important For Local SEO?

  • They’re a ranking factor, as businesses with positive reviews are more likely to appear higher in local search results, including the map pack and organic listings. For instance, Google ranks your business higher if you have many reviews, a high frequency of new reviews, multiple review sources, and a strong overall star rating.
  • Star ratings (or positive reviews) displayed alongside your business listing in search results can increase CTR.
  • Positive reviews enhance customer trust and conversion, as customers now rely on online reviews when making purchasing decisions.

Competitor Analysis

Competitive Benchmarking

Competitive benchmarking in SEO involves identifying, analyzing, and comparing your website’s performance to that of your top competitors in the search engine results pages (SERPs).

This helps you uncover your strengths and weaknesses, discover opportunities, and make data-driven decisions.

Some competitor performance metrics to analyze include:

  • Their keywords, search volume, and keyword gaps.
  • Their high-performing content format.
  • Backlink analysis.
  • Technical SEO audit (site speed, mobile friendliness, crawlability, and indexability).

Read more: SEO Competitive Analysis: The Definitive Guide.

Conclusion

Rankings are great, but conversions pay the bills.

Conversions are important because they determine the efficacy of all your marketing efforts.

Tracking these metrics (and how they contribute to sales) will help you intensify marketing efforts on the strategies that work and allocate budgets effectively.

Featured Image: Deemerwha studio/Shutterstock

In-article screenshots taken by author

Google’s AI Overviews Coincide With Drop In Mobile Searches via @sejournal, @MattGSouthern

A new study by search industry expert Rand Fishkin has revealed that Google’s rollout of AI overviews in May led to a noticeable decrease in search volume, particularly on mobile devices.

The study, which analyzed millions of Google searches in the United States and European Union, sheds light on the unexpected consequences of AI integration.

AI Overviews Rollout & Reversal

In May 2024, Google rolled out AI overviews in the United States, which generate summaries for many search queries.

However, the feature was met with mixed reactions and was quickly dialed back by the end of the month.

In a blog post published on May 30, Google admitted to inaccurate or unhelpful AI overviews, particularly for unusual queries.

Google says it implemented over a dozen technical improvements to its systems in response.

A subsequent study by SE Ranking found the frequency of these summaries decreased, with only 8% of searches now triggering an AI Overview. However, when shown, these overviews are now longer and more detailed, averaging 25% more content.

SE Ranking also noted that after expansion, AI overviews typically link to fewer sources, usually around four.

Decline In Mobile Searches

Fishkin’s analysis reveals that the introduction of AI Overviews coincided with a marked decline in mobile searches in May.

While desktop searches saw a slight increase, the drop in mobile searches was significant, considering that mobile accounts for nearly two-thirds of all Google queries.

This finding suggests that users may have been less inclined to search on their mobile devices when confronted with AI-generated summaries.

Fishkin commented:

“The most visible changes in May were shared by both the EU and US, notably… Mobile searches fell a considerable amount (if anything spooked Google into rolling back this feature, I’d put my money on this being it).”

He adds:

“If I were running Google, that dip in mobile searches (remember, mobile accounts for almost 2/3rds of all Google queries) would scare the stock-price-worshiping-crap outta me.”

Impact On Overall Search Behavior

Despite the dip in mobile searches, the study found that search behavior remained relatively stable during the AI overviews rollout.

The number of clicks per search on mobile devices increased slightly, while desktop clicks per search remained flat.

This indicates that while some users may have been deterred from initiating searches, those who did engage with the AI Overviews still clicked on results at a similar or slightly higher rate than the previous months.

Implications For Google & The Search Industry

The study highlights the challenges Google faces in integrating AI-generated content into its search results.

Additionally, the research found other concerning trends in Google search behavior (a rough per-1,000 breakdown follows the list):

  • Low Click-through Rates: Only 360 out of every 1,000 Google searches in the US result in clicks to non-Google websites. The EU fares slightly better with 374 clicks per 1,000 searches.
  • Zero-click Searches Dominate: Nearly 60% of searches in both regions end without any clicks, classified as “zero-click searches.”
  • Google’s Self-referral Traffic: About 30% of clicks from US searches go to Google-owned properties, with a somewhat lower percentage in the EU.
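As noted above, here is a rough per-1,000 breakdown. This back-of-the-envelope Python sketch only rearranges the study’s reported rates; deriving total clicks from the open-web share is my assumption:

```python
searches = 1_000
zero_click_rate = 0.60         # ~60% of searches end without a click
open_web_clicks = 360          # clicks to non-Google sites per 1,000 US searches
google_share_of_clicks = 0.30  # ~30% of clicks go to Google-owned properties

zero_click_searches = searches * zero_click_rate
# If open-web clicks are the remaining ~70% of all clicks, total clicks are roughly:
total_clicks = open_web_clicks / (1 - google_share_of_clicks)
google_clicks = total_clicks - open_web_clicks

print(f"Zero-click searches: ~{zero_click_searches:.0f}")
print(f"Clicks to the open web: {open_web_clicks}")
print(f"Clicks to Google properties: ~{google_clicks:.0f}")
```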

Why SEJ Cares

This study underscores the need for adaptable SEO strategies.

As an industry, we may need to shift focus towards optimizing for zero-click searches and diversifying traffic sources beyond Google.

The findings also raise questions about the future of AI in search.

While major tech companies continue to invest in AI technologies, this study suggests that implementation may not always yield the expected results.


Featured Image: Marco Lazzarini/Shutterstock

Robots.txt Turns 30: Google Highlights Hidden Strengths via @sejournal, @MattGSouthern

In a recent LinkedIn post, Gary Illyes, Analyst at Google, highlights lesser-known aspects of the robots.txt file as it marks its 30th year.

The robots.txt file, a web crawling and indexing component, has been a mainstay of SEO practices since its inception.

Here’s one of the reasons why it remains useful.

Robust Error Handling

Illyes emphasized the file’s resilience to errors.

“robots.txt is virtually error free,” Illyes stated.

In his post, he explained that robots.txt parsers are designed to ignore most mistakes without compromising functionality.

This means the file will continue operating even if you accidentally include unrelated content or misspell directives.

He elaborated that parsers typically recognize and process key directives such as user-agent, allow, and disallow while overlooking unrecognized content.
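Python’s standard-library parser behaves the same way, which makes this tolerance easy to demonstrate. In the sketch below (the file content is made up), the misspelled directive and the stray line are silently ignored while User-agent and Disallow still take effect:

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
# Comments and unknown directives are ignored by the parser.
User-agent: *
Disallow: /private/
Disalow: /typo-is-ignored/
some unrelated content that is not a directive
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

print(parser.can_fetch("*", "https://example.com/private/page"))          # False
print(parser.can_fetch("*", "https://example.com/typo-is-ignored/page"))  # True: directive was misspelled
print(parser.can_fetch("*", "https://example.com/public/page"))           # True
```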

Unexpected Feature: Line Comments

Illyes pointed out the presence of line comments in robots.txt files, a feature he found puzzling given the file’s error-tolerant nature.

He invited the SEO community to speculate on the reasons behind this inclusion.

Responses To Illyes’ Post

The SEO community’s response to Illyes’ post provides additional context on the practical implications of robots.txt’s error tolerance and the use of line comments.

Andrew C., Founder of Optimisey, highlighted the utility of line comments for internal communication, stating:

“When working on websites you can see a line comment as a note from the Dev about what they want that ‘disallow’ line in the file to do.”

Screenshot from LinkedIn, July 2024.

Nima Jafari, an SEO Consultant, emphasized the value of comments in large-scale implementations.

He noted that for extensive robots.txt files, comments can “help developers and the SEO team by providing clues about other lines.”

Screenshot from LinkedIn, July 2024.

Providing historical context, Lyndon NA, a digital marketer, compared robots.txt to HTML specifications and browsers.

He suggested that the file’s error tolerance was likely an intentional design choice, stating:

“Robots.txt parsers were made lax so that content might still be accessed (imagine if G had to ditch a site, because someone borked 1 bit of robots.txt?).”

Screenshot from LinkedIn, July 2024.

Why SEJ Cares

Understanding the nuances of the robots.txt file can help you optimize sites better.

While the file’s error-tolerant nature is generally beneficial, it could potentially lead to overlooked issues if not managed carefully.

What To Do With This Information

  1. Review your robots.txt file: Ensure it contains only necessary directives and is free from potential errors or misconfigurations.
  2. Be cautious with spelling: While parsers may ignore misspellings, this could result in unintended crawling behaviors.
  3. Leverage line comments: Comments can be used to document your robots.txt file for future reference.

Featured Image: sutadism/Shutterstock

AIO Pullback: Google Shows 2/3 Fewer AIOs And More Citations via @sejournal, @Kevin_Indig

Things have gone quiet around AI Overviews. One month after my initial traffic impact analysis, I updated my data on AIOs. The results matter for anyone who aims for organic traffic from Google, as we’re seeing a shift in AIO structures.

Shortly after Google launched AI Overviews on May 14, I looked at 1,675 queries and found:

  • 8.9% fewer organic clicks when a domain is cited in AIOs than in regular results.
  • A strong relationship between a domain’s organic ranks and AIO citations.
  • Variations of referral traffic depending on user intent.

Since then:

  • Featured snippets and AIOs confuse users with slightly different answers.
  • Google has significantly pulled back AIOs across all industries.
  • AIOs cite more sources.


AIOs Dropped By Two-Thirds

A few days after Google launched AIOs in the US, users found misleading and borderline harmful answers.

In a post titled “About last week,” VP of Search Liz Reid addressed the issue, but also called out that many queries were phrased in a way that would likely return questionable answers.

The debate about LLM answers and questionable queries is not new. Yes, you might get a funny answer when you ask an LLM a funny question. Leading queries were used in the NY Times vs. OpenAI lawsuit and in the backlash against Perplexity; they are no different from leading questions that suggest their own answer.

After the PR backlash, Google dropped AIOs across almost every industry by an average of two-thirds (share of queries showing AIOs):

  • May 30: 0.6% on desktop, 0.9% on mobile.
  • June 28: 0.2% on desktop, 0.3% on mobile.

Industries with the largest drops (data from Semrush Sensor):

  • Health: -3.7% desktop, -1.3% mobile.
  • Science: -1% desktop, -2.6% mobile.
  • People & Society: -2% desktop, -3.9% mobile.
Bar chart showing percentage changes in web traffic across various categories. Image Credit: Kevin Indig

It seems that YMYL industries, such as health, science, animals, and law, were most affected. Some industries gained a small amount of AIOs, but not more than a negligible 0.2%.

  • Example: SEOmonitor clearly shows the pullback in visibility metrics for the jobs site monster.com.
Visibility trend for monster.com from May 1 to June 22, with the mid-June AIO pullback highlighted. Image Credit: Kevin Indig

For the 1,675 queries I analyzed, the number of AIOs dropped from 42% to 23% of queries (almost half). Interestingly, the domain was cited more often (31% vs. 25%, more shortly) and ranked more often in the top 10 spots (45% vs. 41%).

Changes in health AIOs before and after the Google PR backlash, comparing AIOs shown, citations, and domain ranks in the top 10 (5/23 vs. 6/29). Image Credit: Kevin Indig

Queries that stopped showing AIOs had, on average, lower search volume. However, I couldn’t detect a clear pattern across word count, user intent, or SERP features for queries that gained vs. lost AIOs. The effect applies broadly, meaning Google reduced AIOs for all types of queries.

Image Credit: Kevin Indig

AIOs Lean Heavily On No. 1 Web Result For Text Snippets

The before and after comparison allows us to learn more about the structure and behavior of AIOs.

For example, [hair growth products] and [best hair growth products] deliver almost identical AIOs (see screenshots below). The text is the same, but the product list and cited sources are slightly different. Google treats product searches as equal to “best” searches (makes sense).

SERPs for hair growth products (Image Credit: Kevin Indig)
SERPs for best hair growth products (AIO text is identical to screenshot above) Image Credit: Kevin Indig

The biggest difference is that the query for [hair growth products] shows no citation carousel on the side when you click the “show more” button (another example below).

On mobile, the carousel lives at the bottom of the AIO, which is not great for click-throughs. These subtle design differences likely make a big difference when it comes to clicks from AIOs since more prominently featured citations increase the likelihood of clicks.

Citations only expand when users click “show more” (Image Credit: Kevin Indig)

For transactional queries like [hair growth products], Google ranks products in the AIO in no apparent order.

I cross-referenced reviews, average ratings, price, organic product carousel and references in top-ranking articles – none indicate a relationship with the ranking in the AIO. It seems Google leans on its Shopping Graph to sort product lists.

To structure the AIO text, Google seems to pick more elements from the organic No. 1 result than others. For example, time.com ranks No. 1 for [best hair growth products]. Even though the citation in the AIO highlights a section about ingredients (purple in the screenshot below), the whole text closely mirrors the structure of the TIME article before it lists products.

The AIO mirrors the text on the No. 1 web result (time.com) (Image Credit: Kevin Indig)

AIOs use fragments of top web results because LLMs commonly use Retrieval Augmented Generation (RAG) to generate answers.

I wrote in How SEO might thrive under Bard and Prometheus:

Sridhar says that Neeva uses a technique called Retrieval Augmented Generation (RAG), a hybrid of classic information retrieval and machine learning. With RAG, you can train LLMs (Large Language Models) through documents and “remove” inaccurate results by setting constraints. In plain terms, you can show AI what you want with the ranking score for web pages. That seems to be the same or similar technique Bing uses to make sure Prometheus results are as accurate and relevant as possible.
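As a rough illustration of the RAG pattern (a toy sketch, not Google’s or Neeva’s actual pipeline): retrieve the highest-scoring documents for a query, then constrain the model’s answer to that retrieved context. The scoring function and corpus here are invented for demonstration:

```python
def score(query: str, doc: str) -> int:
    """Toy retrieval score: how many query terms appear in the document."""
    return sum(term in doc.lower() for term in set(query.lower().split()))

def build_rag_prompt(query: str, corpus: list[str], top_k: int = 2) -> str:
    """Retrieve the top-k documents and build a grounded prompt for an LLM."""
    ranked = sorted(corpus, key=lambda doc: score(query, doc), reverse=True)
    context = "\n".join(f"- {doc}" for doc in ranked[:top_k])
    return (
        "Answer using ONLY the sources below.\n"
        f"Sources:\n{context}\n"
        f"Question: {query}"
    )

corpus = [
    "Rosemary oil may support hair growth in androgenic alopecia.",
    "A 2015 study compared rosemary oil to minoxidil over six weeks.",
    "Running shoes should be replaced every 300-500 miles.",
]
print(build_rag_prompt("does rosemary oil help hair growth", corpus))
```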

The best example of Google mirroring the AIO after the No. 1 web result (in some cases) is the answer for [rosemary oil for hair growth]. The AIO pulls its text from MedicalNewsToday (No. 1) and restructures the answer.

Text in the AI Overview vs. a snippet from MedicalNewsToday (Image Credit: Kevin Indig)

AIOs And Featured Snippets Still Co-Exist

For more informational queries with a featured snippet, like [dht], [panic attack vs. anxiety attack], or [does creatine cause hair loss], Google closely mirrors the answer in the featured snippets and elaborates further.

High overlap between AIOs and featured snippets (Image Credit: Kevin Indig)

In some cases, the elaboration might confuse users. When searching for [which vitamin deficiency causes hair loss], users see a long list in the AIO and a single answer in the featured snippet. While not contradicting each other, the AIO answer makes the featured snippet seem less trustworthy.

Google search results on vitamin deficiency and hair loss: the AIO lists several vitamins, while the featured snippet highlights vitamin D. Image Credit: Kevin Indig

In my opinion, Google would be best off not showing a featured snippet when an AIO is present. However, that would be bad news for sites ranking in featured snippets.

AIOs Contain More Citations

One way Google seems to have increased the accuracy of AIOs after the PR backlash is by adding more citations. The average number of citations increased from 15 to 32 in the sample of 1,675 keywords I analyzed. I haven’t yet been able to confirm that more citations are used to compile the answer, but more outgoing links to webpages are a good signal for the open web because they increase the chance of getting click-throughs from AIOs.

Both Reddit and Wikipedia were cited more often after the PR backlash. I counted citations from those two domains because marketers pay a lot of attention to influencing the public discourse on Reddit, while Wikipedia has a reputation for having more gatekeepers.

Bar chart comparing citations from Reddit and Wikipedia in Google AI Overviews on two dates. Image Credit: Kevin Indig

Keep in mind that, at 0.8% and 1%, the share of citations from these domains is relatively low. It seems AIOs heavily diversify their citations. Only 23 keywords in the 1,675-keyword sample returned more than 10% of citations from Reddit after the PR backlash (28 for Wikipedia).
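Replicating this measurement on your own keyword sample is straightforward: extract the host from each AIO citation URL and count the shares. A minimal sketch (the citation URLs are hypothetical):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical citation URLs collected from AIOs across a keyword sample.
citations = [
    "https://www.reddit.com/r/haircare/comments/abc",
    "https://en.wikipedia.org/wiki/Minoxidil",
    "https://www.medicalnewstoday.com/articles/rosemary-oil",
    "https://www.reddit.com/r/tressless/comments/xyz",
]

domains = Counter(urlparse(url).netloc.removeprefix("www.") for url in citations)
total = sum(domains.values())
for domain, count in domains.most_common():
    print(f"{domain}: {count / total:.0%} of citations")
```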

Accountability

We can conclude that:

  1. Google shows 50-66% fewer AIOs, which reduces the risk of losing organic traffic – for now.
  2. There seem to be more opportunities to be cited in AIOs, but strong performance in classic web search still largely determines citations and referral clicks from AIOs.
  3. Featured snippets get fewer clicks when AIOs are present since they elaborate much more on the answer.

Google becomes more accountable as it crosses into publishing territory with AI Overviews. Verticals like health, science, and law continuously evolve as new evidence comes out. It will be interesting to see whether AIOs can factor in new evidence and opinions, and at what speed.

As The New York Times reported:

It’s not clear how, exactly, AI Overviews evaluate the strength of evidence, or whether it takes into account contradictory research findings, like those on whether coffee is good for you. “Science isn’t a bunch of static facts,” Dr. Yasmin said. She and other experts also questioned whether the tool would draw on older scientific findings that have since been disproved or don’t capture the latest understanding of an issue.

If AIOs adapt to new information, websites need to monitor AIOs and adapt content at an equal speed. The adaptation challenge alone will provide room for competitive advantages.


AI Overviews: About last week

Google Is Using A.I. to Answer Your Health Questions. Should You Trust It?


Featured Image: Paulo Bobita/Search Engine Journal

Google Discusses Core Topicality Systems via @sejournal, @martinibuster

Google’s latest Search Off the Record shared a wealth of insights on how Google Search actually works. Google’s John Mueller and Lizzi Sassman spoke with Elizabeth Tucker, Director, Product Management at Google, who shared insights into the many systems that work together to rank web pages, including a mention of a topicality system.

Google And Topicality

In everyday use, the word “topicality” means being relevant in the present moment. But when used in search, “topicality” is about matching the topic of a search query with the content on a web page. Machine learning models play a strong role in helping Google understand what users mean.

An example that Elizabeth Tucker mentions is BERT (Bidirectional Encoder Representations from Transformers), a language model that helps Google understand a word within the context of the words that come before and after it (it does more than that; this is a thumbnail explanation).
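You can see this context sensitivity directly with a public BERT checkpoint. A sketch assuming the Hugging Face transformers library is installed (illustrative only, not Google’s production system):

```python
from transformers import pipeline

# Masked-language modeling: BERT predicts the hidden word from the context
# on both sides of the mask, which is the "bidirectional" part of its name.
fill = pipeline("fill-mask", model="bert-base-uncased")

for sentence in [
    "She deposited the check at the [MASK].",
    "They had a picnic on the river [MASK].",
]:
    top = fill(sentence)[0]
    print(f"{sentence} -> {top['token_str']} (score {top['score']:.2f})")
```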

Elizabeth explains the importance of matching topically relevant content to a search query within the context of user satisfaction.

Googler Lizzi Sassman asked about user satisfaction and Tucker mentioned that there are many dimensions to search, with many systems, using as an example the importance of the concept of topical relevance.

Lizzi asked (at about the 4:20 minute mark):

“In terms of the satisfaction bit that you mentioned, are there more granular ways that we’re looking at? What does it mean to be satisfied when you come away from a search?”

Elizabeth answered:

“Absolutely, Lizzi. Inside Search Quality, we think about so many important dimensions of search. We have so many systems. Obviously we want to show content that’s topically relevant to your search. In the early days of Google Search, that was sometimes a challenge.

Our systems have gotten much better, but that is still sometimes, for especially really difficult searches, we can struggle with. People search in so many ways: Everything from, of course, typing in keywords, to speaking to Google and using normal everyday language. I’ve seen amazing searches. “Hey Google, who is that person who, years ago, did this thing, and I don’t remember what it was called.” You know, these long queries that are very vague. And it’s amazing now that we have systems that can even answer some of those.”

Takeaway:

An important takeaway from that exchange is that there are many systems working together, with topicality being just one of them. Many in the search marketing community tend to focus on the importance of one thing, like authority or helpfulness, but in reality there are many “dimensions” to search, and it’s counterproductive to reduce the factors that go into search to one, two, or three concepts.

Biases In Search

Google’s John Mueller asked Elizabeth about biases in search and whether that’s something Google thinks about. She answered that there are many kinds of biases that Google watches out for and tries to catch. Tucker explains the different kinds of search results that may be topically relevant (such as evergreen and fresh) and then explains how it’s a balance that Google focuses on getting right.

John asked (at the 05:24 minute mark):

“When you look at the data, I assume biases come up. Is that a topic that we think about as well?”

Elizabeth answered:

“Absolutely. There are all sorts of biases that we worry about when you’re looking for information. Are we disproportionately showing certain types of sites, are we showing more, I don’t know, encyclopedias and evergreen results or are we showing more fresh results with up-to-date information, are we showing results from large institutional sites, are we showing results from small blogs, are we showing results from social media platforms where we have everyday voices?

We want to make sure we have an appropriate mix that we can surface the best of the web in any shape or size, modest goals.”

Core Topicality Systems (And Many Others)

Elizabeth next reiterated that she works with many kinds of systems in search. This is something to keep in mind because the search community only knows about a few systems when in fact there are many, many more systems.

That means it’s important not to focus on just one, two, or three systems when trying to debug a ranking problem, but instead to keep an open mind that it might be something else entirely, not just helpfulness or E-E-A-T or some other single factor.

John Mueller asked whether Google Search responds by demoting a site when users complain about certain search results.

She speaks about multiple things, including that most of the systems she works on don’t have anything to do with demoting sites. I want to underline how she mentions that she works with many systems and many signals (not just the handful of signals that the search marketing community tends to focus on).

One of those systems she mentions is the core topicality systems. What does that mean? She explains that it’s about matching the topic of the search query. She says “core topicality systems,” so that probably means multiple systems and algorithms.

John asked (at the 11:20 minute mark):

“When people speak up loudly, is the initial step to do some kind of a demotion where you say “Well, this was clearly a bad site that we showed, therefore we should show less of it”? Or how do you balance the positive side of things that maybe we should show more of versus the content we should show less of?”

Elizabeth answered:

“Yeah, that’s a great question. So I work on many different systems. It’s a fun part of my job in Search Quality. We have many signals, many systems, that all need to work together to produce a great search result page.

Some of the systems are by their nature demotative, and webspam would be a great example of this. If we have a problem with, say, malicious download sites, that’s something we would probably want to fix by trying to find out which sites are behaving badly and try to make sure users don’t encounter those sites.

Most of the systems I work with, though, actually are trying to find the good. An example of this: I’ve worked with some of our core topicality systems, so systems that try to match the topic of the query.

This is not so hard if you have a keyword query, but language is difficult overall. We’ve had wonderful breakthroughs in natural language understanding in recent years with ML models, and so we want to leverage a lot of this technology to really make sure we understand people’s searches so that we can find content that matches that. This is a surprisingly hard problem.

And one of the interesting things we found in working on, what we might call, topicality, kind of a nerdy word, is that the better we’re able to do this, the more interesting and difficult searches people will do.”

How Google Is Focused On Topics In Search

Elizabeth returns to discussing topicality, this time referring to it as the “topicality space,” and notes how much effort Google has expended on getting it right. Of particular importance, she highlights how Google used to be very focused on keywords, with the clear implication that it is not as focused on them anymore, and explains the importance of topicality.

She discusses it at the 13:16 minute mark:

“So Google used to be very keyword focused. If you just put together some words with prepositions, we were likely to go wrong. Prepositions are very difficult or used to be for our systems. I mean, looking back at this, this is laughable, right?

But, in the old days, people would type in one, two, three keywords. When I started at Google, if a search had more than four words, we considered it long. I mean, nowadays I routinely see long searches that can be 10-20 words or more. When we have those longer searches, understanding what words are important becomes challenging.

For example, this was now years and years ago, maybe close to ten years ago, but we used to be challenged by searches that were questions. A classic example is “how tall is Barack Obama?” Because we wanted pages that would provide the answer, not just match the words how tall, right?

And, in fact, when our featured snippets first came about, it was motivated by this kind of problem. How can we match the answer, not just keyword match on the words in the question? Over the years, we’ve done a lot of work in, what we might call, the topicality space. This is a space that we continue to work in even now.”

The Importance Of Topics And Topicality

There is a lot to unpack in Tucker’s answer. When thinking about Google’s search ranking algorithms, it helps to consider the core topicality systems, which help Google understand search query topics and match them to web page content. This underlines the importance of thinking in terms of topics instead of focusing narrowly on ranking for keywords.

A common mistake I see among people who are struggling with ranking is that they are strongly focused on keywords. I’ve been encouraging an alternate approach for many years that stresses the importance of thinking in terms of topics. That’s a multidimensional way to think of SEO. Optimizing for keywords is one-dimensional. Optimizing for a topic is multidimensional and aligns with how Google Search ranks web pages, in that topicality is an important part of ranking.

Listen to the Search Off The Record podcast starting at about the 4:20 minute mark, and then fast forward to the 11:20 minute mark.

Featured Image by Shutterstock/dekazigzag

Top 10 Digital Marketing Trends For 2024 via @sejournal, @gregjarboe

It’s been a year of considerable disruptions in digital marketing so far.

Right now, the industry is dealing with the integration of generative AI and the impact this is going to have on user behaviour and how people search, alongside the relentless updates that Google keeps throwing at us.

SEO is changing, and the industry is trying to adapt whilst accepting the uncertainty.

But it’s not all catastrophic; there is a lot of opportunity ahead for those who can evolve to embrace the new.

To help marketers and brands thrive amidst uncertainty, I’ve outlined trends to focus on, guided by strategic insights and Yogi Berra’s timeless wisdom,

“Predictions are hard, especially about the future.” – Yogi Berra

Digital marketers can no doubt relate to Yogi’s sentiment, acknowledging the challenge of what lies ahead.

These, then, are the top 10 digital marketing trends for 2024:

1. Strategy: “If You Don’t Know Where You Are Going, You Might Wind Up Someplace Else.”

Why is “strategy” this year’s top trend instead of the latest technology?

Well, as Yogi once observed, “If you don’t know where you are going, you might wind up someplace else.”

According to Spencer Stuart’s 2024 CMO Tenure Study, the average tenure of chief marketing officers (CMOs) at Fortune 500 companies in 2023 was 4.2 years.

The study also found the average tenure of CMOs at B2B companies was 4.5 years. It was 4.0 years for CMOs at B2C companies. And it was just 3.1 years at the consumer-heavy top 100 advertisers.

So, developing a digital marketing strategy that will generate above-average results within a couple of years is the key to keeping your job as the CMO of a big brand.

And if you don’t master the art and science of creating a digital marketing strategy that generates business outcomes, then you won’t land one of the CMO jobs that turn over each year.

In other words, learning to use the latest technology is necessary, but it won’t get digital marketing leaders and executives where they want to go.

2. Generative AI: “Predictions Are Hard, Especially About The Future.”

Yogi also said, “Predictions are hard, especially about the future.” So, it’s tempting to ask generative AI tools to predict their own future.

OpenAI’s ChatGPT 3.5 says:

“AI and machine learning have been transforming digital marketing for years, but 2024 is poised to see these technologies become even more integral. From chatbots and predictive analytics to personalized content and ad targeting, AI will enhance customer experiences and drive efficiency. Brands leveraging AI can expect to see improved ROI and deeper customer insights.”

Google’s Gemini (formerly Bard) says:

“It ain’t science fiction anymore. Artificial intelligence (AI) is taking center stage, from crafting personalized chatbots that convert like crazy to optimizing campaigns with laser precision. Don’t fear the robot takeover, embrace it!”

And Anthropic’s Claude 3 says:

“Artificial intelligence writing assistants like Claude have been making waves, and in 2024 we’ll see these tools become ubiquitous in content marketing. They’ll help scale content creation while maintaining quality.”

But AI can’t see the big picture for your organization. It can’t empathize with people. And it can’t be creative like you. So, AI needs you in the driver’s seat to make it work effectively.

3. SEO: “It Ain’t Over Till It’s Over.”

Some pundits think SEO is dead. But as Yogi declared, “It ain’t over till it’s over.”

That’s because SEO pros have the remarkable ability to adapt to constant change or new information. Often, this means adjusting to the latest Google algorithm updates. But this also includes rethinking strategies based on the recent Google API “leak.”

Now, Rand Fishkin and Mike King were the first to report on the leaked documents. Although Google has officially acknowledged that these internal documents are authentic, it has also cautioned against jumping to conclusions based on the leaked files alone.

What should savvy SEO pros do?

Well, I’ve known Fishkin for more than 20 years. And he has the experience, expertise, authoritativeness, and trustworthiness (E-E-A-T) you’ve heard about.

So, I’m going to follow Fishkin’s recommendations, including:

  • Hire writers with established reputational authority that Google already associates with quality content.
  • Supplement link-building with public relations to increase branded search demand. (I’ll say more on this below.)
  • “Think about SEO as being more geographically specific than you think it is even for web search results.”
  • Move beyond parsing Google’s public statements and embrace experimentation and testing to uncover what produces results.

4. Link Building: “Always Go To Other People’s Funerals; Otherwise, They Won’t Go To Yours.”

I spotted this trend a long time ago, and I spoke about it at SES London 2009 in a session titled, “Beyond Linkbait: Getting Authoritative Mentions Online.”

Back then, I said link bait tactics can be effective “if you focus on the underlying quality as well as ingenuity needed to get other websites to link to you.”

I also provided a couple of case studies that showed British SEO professionals how to “approach journalists, bloggers, and other authoritative sources to enhance your company’s online reputation, whether or not you get links.”

But getting authoritative mentions without links didn’t translate. People on the other side of the pond thought I was saying something unintentionally funny like, “Always go to other people’s funerals; otherwise, they won’t go to yours.”

Hopefully, Fishkin’s recommendation will enable a lot more SEO pros to finally understand the underlying wisdom of supplementing link building with public relations.

As he clearly explained at MozCon, “If you get a whole bunch of links in one day and nothing else, guess what? You manipulated the link graph. If you’re really a big brand, people should be talking about you.”

5. Paid Media: “It’s Déjà Vu All Over Again.”

Everyone knows that Google, Meta, and other paid media platforms are adding AI to their advertising products faster than the speed of sound. So, this might be mistaken for background noise.

But I’ve spotted the signal in the noise. Today’s frenzy to provide AI solutions is remarkably like the frenzy to provide programmatic solutions a decade ago. As Yogi said, “It’s déjà vu all over again.”

This means that digital marketers – and their agencies – can quickly refresh their “programmatic” workflow and turn it into “AI” best practices.

For example, Google touted a five-step programmatic workflow five years ago.

It consisted of:

  • Organize audience insights.
  • Design compelling creative.
  • Execute with integrated technology.
  • Reach audiences across screens.
  • Measure the impact.

Why is today’s process of buying and selling digital media in an automated fashion so similar? Because AI is just fulfilling the early promise of programmatic to engage with consumers in the moments that matter most.

But there’s one significant difference between then and now.

As you’ll read below, it’s the improved ability to integrate your advertising platforms with your analytics platform to measure the impact of campaigns on brand awareness and lead generation.

6. Analytics: “You Can Observe A Lot By Watching.”

Performance marketers integrated their advertising platforms with their analytics platform more than a decade ago to measure the impact of their campaigns on “conversions.”

But brand marketers rarely focused on their analytics data because “brand awareness” was something they measured when consumers initially saw their display ads or watched their video ads.

A funny thing happened after Google Analytics 4 rolled out last summer: A “Business objectives” collection replaced the “Life cycle” collection of reports, and one business objective you can now track is “Raise brand awareness.”

For example, brand marketers can now use traffic acquisition, demographic details, user acquisition, as well as which pages and screens users visit to measure brand awareness in places that are less vulnerable to ad fraud.

Another business objective you can now track is “Generate leads.”

So, digital marketers can measure any user action that’s valuable to their organization (a minimal sketch of sending such an event follows this list), including:

  • Scrolling to 90% or more of their blog post.
  • Downloading a whitepaper.
  • Subscribing to their newsletter.
  • Playing at least 50% of a product video.
  • Completing a tutorial.
  • Submitting a registration form.

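For teams that want to wire these actions up, here is a minimal sketch that sends such an event server-side through GA4’s Measurement Protocol; the measurement ID, API secret, client ID, and event name are placeholders to replace with your own values:

```python
import json
import urllib.request

# Placeholder credentials: use your GA4 measurement ID and an API secret
# created in GA4 under Admin > Data Streams > Measurement Protocol.
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "your_api_secret"

def send_lead_event(client_id: str, event_name: str, params: dict) -> int:
    """Send one custom event to GA4 via the Measurement Protocol."""
    url = (
        "https://www.google-analytics.com/mp/collect"
        f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}"
    )
    payload = {
        "client_id": client_id,  # pseudonymous ID, e.g. from the _ga cookie
        "events": [{"name": event_name, "params": params}],
    }
    request = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # the endpoint returns 2xx even for bad events

# Example: record a whitepaper download as a lead-generation action.
send_lead_event("555.123", "file_download", {"file_name": "whitepaper.pdf"})
```

Client-side, the same events can be captured with gtag.js or Google Tag Manager; the Measurement Protocol is simply the server-to-server route.
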
And as Yogi noted, “You can observe a lot by watching.”

7. Content Marketing: “When You Come To A Fork In The Road, Take It.”

In the summer of 2020, the Content Marketing Institute and MarketingProfs fielded their annual survey and found that “Content marketers are resilient. Most have met the challenges of the pandemic head-on.”

In response to the pandemic, B2B and B2C marketers:

  • Increased time spent talking with customers.
  • Revisited their customer/buyer personas.
  • Re-examined the customer journey.
  • Changed their targeting/messaging strategy.
  • Changed their distribution strategy.
  • Adjusted their editorial calendar.
  • Put more resources toward social media/online communities.
  • Changed their website.
  • Changed their products/services.
  • Adjusted their key performance indicators (KPIs).
  • Changed their content marketing metrics (e.g., set up new analytics/dashboards).

In other words, many content marketers totally overhauled their process for creating a content marketing plan from stem to stern.

For some, 2020 was the year of quickly adapting their content marketing strategy. For others, it was the year to finally develop one.

According to BrightEdge, content marketers are now “preparing for a Searchquake,” a tectonic shift in the content marketing landscape triggered by Google’s Search Generative Experience (SGE).

But content marketers now know exactly what to do. As Yogi directed, “When you come to a fork in the road, take it.”

8. Video Creation: “If You Can’t Imitate Him, Don’t Copy Him.”

I teach an online class at the New Media Academy in Dubai on “Influencer Marketing and AI.” This may seem like an odd combination of topics, but they’re related to another class I teach on “Engaging Audiences through Content.”

I tell my students that creating great content is hard. That’s why marketers start using influencers or AI to create video content that their audience will find valuable and engaging. Then, they learn that there’s more to learn.

For example, AI can create realistic and imaginative scenes from text instructions. But AI can’t be creative like humans. So, the heart of every great video is still innovative, surprising, human-led creativity.

I show them “Air Head” – OpenAI Sora’s first short film, created by shy kids, a Toronto-based production company.

Then, I ask them to apply what they have learned by using Synthesia, Runway, or invideo AI to generate a short video for their capstone project.

Invariably, they report that AI video generators can create realistic and imaginative scenes from text instructions but aren’t creative like shy kids.

Or, as Yogi put it, “If you can’t imitate him, don’t copy him.”

9. Influencer Marketing: “Nobody Goes There Anymore. It’s Too Crowded.”

The Influencer Marketing Hub says, “Most marketers believe finding and selecting the best, most relevant influencers to be the most difficult part of influencer marketing.”

That’s ironic because HypeAuditor offers an influencer discovery platform that enables marketers to search through a database of 137.5 million influencers on Instagram, YouTube, TikTok, X (formerly Twitter), and Twitch.

It also enables marketers to apply filters to discover the perfect partners for their brand.

This apparent contradiction reminds me of Yogi’s comment, “Nobody goes there anymore. It’s too crowded.”

But it also indicates that most marketers are looking at influencer identification through the wrong end of the telescope. What should they do instead?

Well, I show the students in my “Influencer Marketing and AI” class how to use SparkToro to get a free report on the audience that searches for “Dubai.”

Infographic showing monthly searches and demographics for “Dubai.” Image from SparkToro, June 2024

SparkToro estimates that 446,000 to 654,000 people search for “Dubai” monthly. And it uncovers the websites they visit, the keywords they search for, and their gender demographics.

Screenshot of a list showing accounts related to Dubai and their affinity scores. Image from SparkToro, June 2024

SparkToro also identifies the sources of influence for this audience, including high-affinity accounts and hidden gems, so marketers can invest in the right ones.

10. Social Media: “The Future Ain’t What It Used To Be.”

I’m a big believer in “the rule of three.”

So, I wasn’t startled when I received an email from Jennifer Radke inviting me to attend “an exciting webinar focused on a high-level look into using ChatGPT for social media!”

But I was shocked when Katie Delahaye Paine shared a link to new research by Asana’s Work Innovation Lab and Meltwater, which found that “only 28% of marketing professionals have received training on how to use AI tools effectively.”

I was also horrified when I read a column by Mark Ritson in MarketingWeek that argued, “AI’s strength is automating high-volume, short-term marketing activity, which means social media could become a cesspool of synthetic content.”

Hey, I was having lunch with Chris Shipley in 2004 when she coined the term “social media.” So, I remember when social media still had a promising future.

But, as Yogi once declared, “The future ain’t what it used to be.”

So, social media marketers have three options:

  • They can get upskilled to use AI tools more effectively.
  • They can get reskilled to identify the right influencers.
  • They can update their resumes and look for new jobs.

Picking Digital Marketing Trends Is Like Playing Moneyball

Some skeptics may question this counter-intuitive lineup of the top 10 digital marketing trends for 2024. Some of my selections seem to throw out conventional wisdom.

I recently watched the movie Moneyball (2011) for a second time. I was reminded that the Oakland Athletics baseball team’s general manager, Billy Beane (Brad Pitt), and assistant general manager, Peter Brand (Jonah Hill), used sabermetrics to analyze players.

This produced an epiphany: Picking digital marketing trends is like playing Moneyball. If you want to win against competitors with bigger budgets, then you need to find strategic insights, critical data, tactical advice, and digital marketing trends that conventional wisdom has overlooked.

And where did I come up with the whimsical idea of matching each trend with one of Yogi’s memorable quotes? Was it inspiration or hallucination?

I recently watched the documentary It Ain’t Over (2022) for the first time. It’s about New York Yankee Hall of Fame catcher Yogi Berra. And it supported Yogi’s claim, “I really didn’t say everything I said.”

But sportswriters kept attributing these Yogi-isms to the catcher because these “distilled bits of wisdom … like good country songs … get to the truth in a hurry,” as Allan Barra, the author of a book on Yogi, has explained.

And that strategic insight produced this year’s update – by a human – as opposed to last year’s top 10 digital marketing trends by ChatGPT.

Featured Image: SuPatMaN/Shutterstock

Google’s E-E-A-T & The Myth Of The Perfect Ranking Signal via @sejournal, @MattGSouthern

Few concepts have generated as much buzz and speculation in SEO as E-E-A-T.

Short for Experience, Expertise, Authoritativeness, and Trustworthiness, this framework has been a cornerstone of Google’s Search Quality Evaluator Guidelines for years.

But despite its prominence, clarity is still lacking about how E-E-A-T relates to Google’s ranking algorithms.

In a recent episode of Google’s Search Off The Record podcast, Search Director & Product Manager Elizabeth Tucker addressed this complex topic.

Her comments offer insights into how Google evaluates and ranks content.

No Perfect Match

One key takeaway from Tucker’s discussion of E-E-A-T is that no single ranking signal perfectly aligns with all four elements.

Tucker explained:

“There is no E-E-A-T ranking signal. But this really is for people to remember it’s a shorthand, something that should always be a consideration, although, you know, different types of results arguably need different levels of E-E-A-T.”

This means that while Google’s algorithms do consider factors like expertise, authoritativeness, and trustworthiness when ranking content, there isn’t a one-to-one correspondence between E-E-A-T and any specific signal.

The PageRank Connection

However, Tucker did offer an example of how one classic Google ranking signal – PageRank – aligns with at least one aspect of E-E-A-T.

Tucker said:

“PageRank, one of our classic Google ranking signals, probably is sort of along the lines of authoritativeness. I don’t know that it really matches up necessarily with some of those other letters in there.”

For those unfamiliar, PageRank is an algorithm that measures the importance and authority of a webpage based on the quantity and quality of links pointing to it.

In other words, a page with many high-quality inbound links is seen as more authoritative than one with fewer or lower-quality links.
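
To make that concrete, here is a toy sketch of the classic PageRank computation (power iteration with the 0.85 damping factor from the original paper); the four-page link graph is invented purely for illustration:

```python
# A toy link graph: each page maps to the pages it links out to.
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about", "guide"],
    "guide": ["blog"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Classic power-iteration PageRank over a dict-based link graph."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1 / n for page in pages}  # start from a uniform distribution
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / n for page in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # each outlink passes equal weight
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Pages that attract links from pages that are themselves important accumulate higher scores, which is why a link from an authoritative site counts for more than one from an obscure page.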

Tucker’s comments suggest that while PageRank may be a good proxy for authoritativeness, it doesn’t necessarily capture the other elements of E-E-A-T, like expertise or trustworthiness.

Why SEJ Cares

While it’s clear that E-E-A-T matters, Tucker’s comments underscore that it’s not a silver bullet for ranking well.

Instead of chasing after a mythical “E-E-A-T score,” websites should create content that demonstrates their expertise and builds user trust.

This means investing in factors like:

  • Accurate, up-to-date information
  • Clear sourcing and attribution
  • Author expertise and credentials
  • User-friendly design and navigation
  • Secure, accessible web infrastructure

By prioritizing these elements, websites can send strong signals to users and search engines about the quality and reliability of their content.

The E-E-A-T Evolution

It’s worth noting that E-E-A-T isn’t a static concept.

Tucker explained in the podcast that Google’s understanding of search quality has evolved over the years, and the Search Quality Evaluator Guidelines have grown and changed along with it.

Today, E-E-A-T is just one of the factors that Google considers when evaluating and ranking content.

However, the underlying principles – expertise, authoritativeness, and trustworthiness – will likely remain key pillars of search quality for the foreseeable future.

Listen to the full podcast episode below:


Featured Image: salarko/Shutterstock

Google Warns Of Soft 404 Errors And Their Impact On SEO via @sejournal, @MattGSouthern

In a recent LinkedIn post, Google Analyst Gary Illyes raised awareness about two issues plaguing web crawlers: soft 404s and other “crypto” errors.

These seemingly innocuous mistakes can negatively affect SEO efforts.

Understanding Soft 404s

Soft 404 errors occur when a web server returns a standard “200 OK” HTTP status code for pages that don’t exist or contain error messages. This misleads web crawlers, causing them to waste resources on non-existent or unhelpful content.

Illyes likened the experience to visiting a coffee shop where every item is unavailable despite being listed on the menu. While this scenario might be frustrating for human customers, it poses a more serious problem for web crawlers.

As Illyes explains:

“Crawlers use the status codes to interpret whether a fetch was successful, even if the contents of the page is basically just an error message. They might happily go back to the same page again and again wasting your resources, and if there are many such pages, exponentially more resources.”

The Hidden Costs Of Soft Errors

The consequences of soft 404 errors extend beyond the inefficient use of crawler resources.

According to Illyes, these pages are unlikely to appear in search results because they are filtered out during indexing.

To combat this issue, Illyes advises serving the appropriate HTTP status code when the server or client encounters an error.

This allows crawlers to understand the situation and allocate their resources more effectively.

Illyes also cautioned against rate-limiting crawlers with messages like “TOO MANY REQUESTS SLOW DOWN,” as crawlers cannot interpret such text-based instructions.
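
To illustrate the fix, here is a minimal sketch of serving proper status codes; Flask and the route names are my own choices for illustration, not anything Google prescribes:

```python
from flask import Flask, jsonify

app = Flask(__name__)
PRODUCTS = {"mug": "Ceramic mug", "lamp": "Desk lamp"}  # toy catalog

def too_many_requests() -> bool:
    return False  # placeholder: plug in your real rate limiter here

@app.route("/products/<slug>")
def product(slug):
    item = PRODUCTS.get(slug)
    if item is None:
        # Return a real 404 rather than rendering a "not found" page
        # with a 200 status - the latter is exactly a soft 404.
        return jsonify(error="Product not found"), 404
    return jsonify(name=item)

@app.route("/search")
def search():
    if too_many_requests():
        # Signal overload with the 429 status code; crawlers cannot parse
        # a plain-text "TOO MANY REQUESTS SLOW DOWN" served with a 200.
        return jsonify(error="Too many requests"), 429
    return jsonify(results=[])

if __name__ == "__main__":
    app.run()
```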

Why SEJ Cares

Soft 404 errors can impact a website’s crawlability and indexing.

By addressing these issues, crawlers can focus on fetching and indexing pages with valuable content, potentially improving the site’s visibility in search results.

Eliminating soft 404 errors can also lead to more efficient use of server resources, as crawlers won’t waste bandwidth repeatedly visiting error pages.

How This Can Help You

To identify and resolve soft 404 errors on your website, consider the following steps (a small detection sketch follows the list):

  1. Regularly monitor your website’s crawl reports and logs to identify pages returning HTTP 200 status codes despite containing error messages.
  2. Implement proper error handling on your server to ensure that error pages are served with the appropriate HTTP status codes (e.g., 404 for not found, 410 for permanently removed).
  3. Use tools like Google Search Console to monitor your site’s coverage and identify any pages flagged as soft 404 errors.
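
For step 1, a small detection sketch might look like the following; the error-page phrases and the example URL are placeholders to tune for your own site’s templates:

```python
import requests

# Phrases that commonly appear on error pages; adjust for your templates.
ERROR_HINTS = ("page not found", "no longer available", "out of stock")

def find_soft_404s(urls):
    """Flag URLs that return HTTP 200 but whose body looks like an error page."""
    suspects = []
    for url in urls:
        response = requests.get(url, timeout=10)
        body = response.text.lower()
        if response.status_code == 200 and any(hint in body for hint in ERROR_HINTS):
            suspects.append(url)
    return suspects

# Hypothetical URL list: feed it from your sitemap or server logs.
print(find_soft_404s(["https://example.com/discontinued-product"]))
```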

Proactively addressing soft 404 errors can improve your website’s crawlability, indexing, and SEO.


Featured Image: Julia Tim/Shutterstock

Google’s Search Dilemma: The Battle With ‘Not’ & Prepositions via @sejournal, @MattGSouthern

While Google has made strides in understanding user intent, Director & Product Manager Elizabeth Tucker says certain kinds of queries remain challenging.

In a recent episode of Google’s Search Off The Record podcast, Tucker discussed some lingering pain points in the company’s efforts to match users with the information they seek.

Among the top offenders were searches containing the word “not” and queries involving prepositions, Tucker reveals:

“Prepositions, in general, are another hard one. And one of the really big, exciting breakthroughs was the BERT paper and transformer-based machine learning models when we started to be able to get some of these complicated linguistic issues right in searches.”

BERT, or Bidirectional Encoder Representations from Transformers, is a neural network-based technique for natural language processing that Google began leveraging in search in 2019.

The technology is designed to understand the nuances and context of words in searches rather than treating queries as a bag of individual terms.
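
To see what that contextual understanding means in practice, here is a small sketch using the open-source bert-base-uncased model via Hugging Face’s transformers library (Google’s production systems are not public, so this is only illustrative):

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden_states = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden_states[tokens.index(word)]

# The vector for "bank" differs by context - unlike a bag-of-words model,
# which treats every occurrence of a term identically.
river = word_vector("he sat on the bank of the river", "bank")
money = word_vector("she deposited cash at the bank", "bank")
similarity = torch.cosine_similarity(river, money, dim=0)
print(f"cosine similarity across contexts: {similarity:.2f}")  # well below 1.0
```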

‘Not’ There Yet

Despite the promise of BERT and similar advancements, Tucker acknowledged that Google’s ability to parse complex queries is still a work in progress.

Searches with the word “not” remain a thorn in the search engine’s side, Tucker explains:

“It’s really hard to know when ‘not’ means that you don’t want the word there or when it has a different kind of semantic meaning.”

For example, Google’s algorithms could interpret a search like “shoes not made in China” in multiple ways.

Does the user want shoes made in countries other than China, or are they looking for information on why some shoe brands have moved their manufacturing out of China?

This ambiguity poses a challenge for websites trying to rank for such queries. If Google can’t match the searcher’s intent with the content on a page, it may struggle to surface the most relevant results.

The Preposition Problem

Another area where Google’s algorithms can stumble is prepositions, which show the relationship between words in a sentence.

Queries like “restaurants with outdoor seating” or “hotels near the beach” rely on prepositions to convey key information about the user’s needs.

For SEO professionals, this means that optimizing for queries with prepositions may require some extra finesse.

It’s not enough to include the right keywords on a page; the content needs to be structured to communicate the relationships between those keywords.

The Long Tail Challenge

The difficulties Google faces with complex queries are particularly relevant to long-tail searches—those highly specific, often multi-word phrases that make up a significant portion of all search traffic.

Long-tail keywords are often seen as a golden opportunity for SEO, as they tend to have lower competition and can signal a high level of user intent.

However, if Google can’t understand these complex queries, it may be harder for websites to rank for them, even with well-optimized content.

The Road Ahead

Tucker noted that Google is actively improving its handling of these linguistically challenging queries, but a complete solution may still be a way off.

Tucker said:

“I would not say this is a solved problem. We’re still working on it.”

In the meantime, users may need to rephrase their searches or try different query formulations to find the information they’re looking for – a frustrating reality in an age when many have come to expect Google to understand their needs intuitively.

Why SEJ Cares

While BERT and similar advancements have helped Google understand user intent, the search giant’s struggles with “not” queries and prepositions remind us that there’s still plenty of room for improvement.

As Google continues to invest in natural language processing and other AI-driven technologies, it remains to be seen how long these stumbling blocks will hold back the search experience.

What It Means For SEO

So, what can SEO professionals and website owners do in light of this information? Here are a few things to keep in mind:

  1. Focus on clarity and specificity in your content. The more you can communicate the relationships between key concepts and phrases, the easier it will be for Google to understand and rank your pages.
  2. Use structured data and other technical SEO best practices to help search engines parse your content more effectively (a minimal markup sketch follows this list).
  3. Monitor your search traffic and rankings for complex queries, and be prepared to adjust your strategy if you see drops or inconsistencies.
  4. Keep an eye on Google’s efforts to improve its natural language understanding, and be ready to adapt as new algorithms and technologies emerge.
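
As a concrete instance of point 2, here is a minimal sketch that emits schema.org FAQPage markup as JSON-LD; the question and answer text are placeholders:

```python
import json

# Placeholder Q&A content: substitute your page's real questions.
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "Do you stock shoes not made in China?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Yes, we carry brands manufactured in Vietnam and Portugal.",
            },
        }
    ],
}

# Embed the output in your page inside a
# <script type="application/ld+json"> ... </script> tag.
print(json.dumps(faq_markup, indent=2))
```

Markup like this spells out the question-answer relationship explicitly, rather than leaving the algorithm to infer it from prose.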

Listen to the full podcast episode below: