260k Search Results Analyzed: Here’s How Google Evaluates Your Content [Data Study] via @sejournal, @ericvanbuskirk

The most recent Helpful Content Update (HCU) concluded with the Google March core update, which finished rolling out on April 19, 2024. The updates integrated the helpful content system into the core algorithm.

To investigate changes in Google’s ranking of webpages, data scientists at WLDM and ClickStream partnered with Surfer SEO, which pulled data based on our keyword lists.

Implications Of The March Update And Google’s Goals

Google is prioritizing content that offers exceptional value to humans, not machines.

Logically, the update should prioritize topic authority: Creators should demonstrate thorough experience, expertise, authoritativeness, and trustworthiness (E-E-A-T) on a given website page to assist users.

Your Money or Your Life (YMYL) pages should also be prioritized by HCU. When our health or money is at risk, we rely on accurate information.

Google’s Search Liaison, Danny Sullivan, confirmed that HCU works on a page level, not just sitewide.

Google says:

“This [HCU] update involves refining some of our core ranking systems to help us better understand if webpages are unhelpful, have a poor user experience, or feel like they were created for search engines instead of people. This could include sites created primarily to match very specific search queries.

We believe these updates will reduce the amount of low-quality content on Search and send more traffic to helpful and high-quality sites.”

Google also released the March 2024 spam update, finalized on March 20.

SEO Industry Impact

The update significantly affected many websites, causing search rankings to fluctuate and even reverse course during the update. Some SEO professionals have called it a “seismic shift” in the SEO industry.

Frustratingly, over the past few weeks, Google undermined the guidelines and algorithms central to the HCU system by releasing AI search results that include dangerous and incorrect health-related information.

SERP volatility continues to date; adjustments to the March update appear to be ongoing.

Background

Methodology

In December 2023, we analyzed the top 30 results on Google SERPs for 12,300 keywords. In April 2024, we expanded the study: starting from a pool of 428,436 keywords, we analyzed the top 30 search results for 8,460 of them, covering 253,800 final SERP results.

Our 2023 keyword set was more limited, providing a baseline for an expanded study. This allowed us to understand Google’s ranking signal changes after March and some of the “rank tremors” that occurred in early April.

We appended “how to use” to the front of keywords to create information-intent keywords for both data sets. JungleScout provided access to a database of ecommerce keywords grouped and siloed using NLP. Our study focused on specific product niches.

Correlation And Measurements

We used the Spearman correlation to measure the strength and direction of associations between ranked variables.

In SEO ranking studies, a .05 correlation is considered significant. With hundreds of ranking signals, each one impacts the ranking only slightly.

Our Focus Is On-Page Ranking Factors 

Our study primarily analyzes on-page ranking signals. By chance, our 2024 study was scheduled for April, coinciding with the end of Google’s most significant ranking changes in over eight years. Data studies require extensive planning, including setting aside people and computing resources.

Our key metric for the study was comprehensive content coverage, which means thorough or holistic writing about the primary topic or keyword on a page. Each keyword was matched to text on the pages of the 30 top URLs in the SERP. We had highly precise measurements for scoring natural language processing-related topics used on pages.

Another key study goal was understanding how webpages covering health-sensitive topics differ from non-health pages. Would pages not falling into the now-infamous YMYL category be less sensitive to some ranking factors?

Since Google is looking for excellent user experience, data was pulled on each webpage’s speed and Core Web Vitals in real time to see if Google treats them as a key component of the user experience.

Content Score As A Predictor

It’s not surprising that Surfer SEO’s proprietary “Content Score” was the best single predictor of high ranking among the on-page factors we examined, both in 2023 (a correlation of .18) and in 2024 (.21).

The score is an amalgamation of many ranking factors. Clearly, the scoring system captures helpful content that’s meaningful for users. The small correlation change between the two periods suggests the March update did not change many key on-page signals.

The Content Score consists of many factors, including:

  1. Usage of relevant words and phrases.
  2. Title and your H1.
  3. Headers and paragraph structure.
  4. Content length.
  5. Image occurrences.
  6. Hidden content (i.e., alt text of the images).
  7. Main and partial keywords – not only how often but where exactly those are used.

… and many more good SEO practices.
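Surfer SEO’s exact formula is proprietary, so the sketch below is only a hypothetical illustration of amalgamating weighted on-page factors into a single score; the factor names and weights are our invented assumptions, not Surfer’s.

```python
# Hypothetical sketch of a composite "content score": several
# pre-normalized on-page signals combined with invented weights.
# Surfer SEO's real formula is proprietary and not reproduced here.

def content_score(page: dict) -> float:
    """Combine on-page signals (each expected in 0.0-1.0) into a 0-100 score."""
    weights = {
        "term_coverage": 0.35,     # share of relevant words/phrases used
        "title_h1_match": 0.15,    # keyword present in title and H1
        "structure": 0.15,         # headers and paragraph structure
        "length_ratio": 0.20,      # word count vs. top-ranking average
        "images": 0.10,            # image occurrences (with alt text)
        "keyword_placement": 0.05, # where main/partial keywords appear
    }
    # Clamp each signal to [0, 1] before weighting so bad inputs can't
    # push the score outside the 0-100 range.
    score = sum(w * min(max(page.get(k, 0.0), 0.0), 1.0)
                for k, w in weights.items())
    return round(score * 100, 1)

example = {"term_coverage": 0.8, "title_h1_match": 1.0, "structure": 0.7,
           "length_ratio": 0.9, "images": 0.5, "keyword_placement": 1.0}
print(content_score(example))
```

The point is not the particular weights but the shape of the metric: many weak signals folded into one number, which is why the composite correlates with ranking better than any single factor does.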

More About Correlations And Measurements In The Study

Niches were chosen because we wanted domains with multiple URLs to appear in our study. It was important to get many niche and “specialty” oriented sites, as is the case for most non-mega sites.

Most data studies overlook the story told by a group of URLs from one domain: their keyword lists are so randomized that mega websites account for the vast majority of URLs in the results.

The narrow topics also meant fewer keywords with extreme ranking competition. Many ranking studies use a preponderance of keywords with over 40,000 monthly searches, but most SEO professionals don’t work for websites that can rank in the top 10 for those. This study is biased toward less competitive keywords, and we didn’t look at Google keyword search volume – just the volume on Amazon.

Our keywords had more than 10 monthly searches on Amazon (via JungleScout). However, after appending “how to use” to the front of a keyword, its Google search volume was often under 10 per month.

The “dangerous, prohibited, banned” group was excluded from most comparisons of health vs. non-health. Many of these were very esoteric topics or Amazon needed six to 10 words to describe them.

Most SEO professionals don’t work for the top 50 largest websites. Instead, we want results that help the vast majority of SEO pros.

Here’s How We Generated Different Keyword Types

For example, we appended “buy” to the product keyword “adobe professional” in one instance and “how to use” in another.

Product: adobe professional
Category: software
Search Intent: informational
Appended Phrase: “how to use”
Resulting Keyword: “how to use adobe professional”
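The appending step described above can be sketched in a few lines; the function name and the exact modifier strings are ours, inferred from the examples given in the study.

```python
# Sketch of the keyword-generation step: prepend/append intent
# modifiers to a seed product keyword to create informational and
# buyer-intent variants.

def make_intent_keywords(product: str) -> dict:
    return {
        "informational": f"how to use {product}",   # appended to the front
        "buy_front": f"buy {product}",              # buyer intent, front
        "buy_back": f"{product} for sale",          # buyer intent, back
    }

print(make_intent_keywords("adobe professional"))
```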

We examined data using the Spearman rank-order correlation formula. Spearman calculates the correlation between two variables, measured from -1 to 1. A coefficient of 1 or -1 means a perfect monotonic relationship between the two variables.

The Spearman correlation is used instead of Pearson because of the nature of Google search results; they are ranked by importance in decreasing order.

Spearman’s correlation compares the ranks of two datasets, which fits our goal better than Pearson’s. We used an absolute value of .05 as our threshold for a meaningful correlation.

When we show a correlation of .08, it suggests a ranking signal roughly twice as strong as one measuring .04. We treated values above .05 as a positive correlation and values between -.05 and .05 as no meaningful correlation. A negative correlation below -.05 means that as the metric increases, the page’s ranking worsens.
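A minimal, dependency-free version of Spearman’s rho (using the rank-difference formula, which is valid when there are no tied ranks) shows the computation, and why a metric that improves as rank improves produces a negative coefficient against SERP position (where position 1 is best). The example data is invented for illustration.

```python
# Spearman's rho from rank differences, assuming no tied values:
# rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))

def spearman(x, y):
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# SERP position (1 = best) vs. a hypothetical on-page metric that is
# higher for better-ranking pages:
positions = [1, 2, 3, 4, 5]
metric = [90, 80, 85, 60, 55]
print(round(spearman(positions, metric), 2))
```

Because lower position numbers mean better rankings, a beneficial on-page metric shows up as a negative coefficient here, which is why the study reports signals like topic coverage with negative correlations.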

Many of the domains in the study are from outlier or niche topics or are small because little time and money is spent on them. That is, first and foremost, why they don’t rank well.

That is also why we must look for “controls” that might show that two domains have the same amount of time, web development/design superiority, and money invested in them, but they are, for example, health vs. non-health topics.

Correlation is not causation. We did want to understand how we could “control” some large factors to better pinpoint the effect of results. This was done with graph visualizations.

Google uses potentially thousands of factors, so isolating independent variables is very difficult. Correlations have been used in science for centuries, where variables can’t be totally controlled. They are accepted science, and to say otherwise is a fool’s errand.

Keyword Categories And Classifications

Our keywords were search terms related to products.

Using narrow niches lets us cluster topics that are very much not YMYL vs. those that are.

Image from author, June 2024

For example, CBD and vape keywords are banned from Google Ads, so they are very good for our health-related keyword set. The FDA and others consider muscle building and weight loss two of the riskiest (read: dangerous) health-related categories on Amazon.

We chose the other non-health categories because they were near-poster children of innocuous niches.

The “dangerous, prohibited, banned” keywords come from products that are manually removed from Amazon’s Seller Central page list.

Each category fits into one of three classifications (the X-axis here shows the number of keywords).

Image from author, June 2024

Detailed Findings And Actionable Insights

Importance Of Topic Authority And Semantic SEO

The largest on-page ranking factor is the use of topics related to the searched keyword phrase (our measure of topic authority and semantic SEO).

For “missing common keywords and phrases,” we found a correlation of -.11 in December 2023, which strengthened to -.13 in April 2024. These numbers are calculated by examining the relationship between the metric and a site’s Google ranking.

A stronger negative correlation, like -.13, signifies that omitting these keywords decreases a site’s ranking more sharply.

2024 YMYL vs. Safe Content – Not (Image from author, June 2024)

Surfer SEO’s algorithm typically reveals 10-100 words and phrases that should be included to cover the topic comprehensively.
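A simple way to approximate this “missing common words and phrases” check is to scan a page’s body text for each related term. The term list below is invented for illustration; in the study, the lists came from Surfer SEO’s algorithm.

```python
# Sketch of a topic-coverage check: given a list of related terms,
# return the ones the page body does not mention. Terms are matched
# case-insensitively on word boundaries.

import re

def missing_terms(body: str, related_terms: list[str]) -> list[str]:
    text = body.lower()
    return [t for t in related_terms
            if not re.search(r"\b" + re.escape(t.lower()) + r"\b", text)]

body = "This guide covers bird feeder placement and squirrel-proof designs."
terms = ["bird feeder", "placement", "suet", "hummingbird"]
print(missing_terms(body, terms))  # terms the page still needs to cover
```

A real coverage metric would also weight terms by importance and count occurrences, but even this crude gap list mirrors what the “body missing common words” signal measures.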

That factor is so strong that it is more important than the monthly traffic volume of the domain a webpage sits on (for example, articles on Amazon.com rank higher than those published on small websites).

A domain’s traffic is a measure of authority (and, perhaps, trust to some extent). Domain Rating and Domain Authority, metrics calculated by Ahrefs and Moz respectively, are other ways to measure a website’s ability to rank highly in the SERP. However, they rely much more on links, an off-page ranking factor.

This is a novel finding. We’ve never seen any large Google ranking study demonstrate such high importance for topical authority, nor one using such precise on-page data examining the text of thousands of search result pages.

If you’re not paying attention to natural language processing, a.k.a. topic modeling or semantic SEO, you’re more than a decade late: the Hummingbird algorithm launched in 2013. Six years later, Hummingbird’s sub-algorithm, BERT, appeared.

BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based language model developed by Google that learns bidirectional, context-aware representations of words. It’s particularly important in helping Google understand the meaning of users’ queries.

Health-Related Vs. Non-Health Pages

We found that Google’s algorithms increase their sensitivity to on-page factors when returning results about health-sensitive topics. To rank highly in Google, YMYL pages need more comprehensive topic coverage. Since the March update, this has become more important than in December.

Image from author, June 2024

Generally, YMYL search results prioritize content from government sites, established financial companies, research hospitals, and very large news organizations. Sites like Forbes, NIH, and official government pages often rank highly in these areas to ensure users receive reliable and accurate information.

More About The Massive March Update And YMYL

Websites in YMYL started getting slews of attention and traction in the SEO community in 2018 when Google rolled out the “Medic Update.” Health and finance categories have seen a rollercoaster ride in the SERPs over the years since then.

One way of understanding the changes is that Google tries to be more cautious in ranking pages related to personal health and finances. This might be especially true when topics lack broad consensus, are controversial, or have an outsized impact on personal health and finance choices.

Most SEO pros agree that there is no YMYL ranking factor per se. Instead, websites in these sectors have E-E-A-T signals that are examined with far higher demands and expectations.

When we look at on-page ranking signals, many other factors interfere with what we are trying to measure. For example, in link studies, SEO pros would love to isolate how different types of anchor texts perform. Unless you own over 500 websites, you don’t have enough control over what affects minor differences among anchor text variables.

Nevertheless, we find differences in correlations between health vs. non-health ranking signals in both of our studies.

The “banned, hazardous, prohibited” pages were even more sensitive to on-page optimization than the non-health-related group.

Since the Content Score we used amalgamates many factors, it is especially good at showing the differences. Isolating for a small factor like “body missing/having common words” (topic coverage) is too weak a signal in itself to show a pronounced difference between two types of content pages.

The number of domain-ranked keywords and the website’s (domain’s) estimated monthly traffic affect how a page ranks – a lot.

These measure domain authority. Google doesn’t use its own results (organic search traffic) as a ranking factor, but it’s one of the most useful stats for understanding how successful a site is with organic search.

Most SEO pros evaluate via scores like DA (Moz) or DR (Ahrefs), which weigh link profiles much more heavily than actual traffic driven via organic search.

Ranked keywords and estimated traffic are critical ways to find E-E-A-T for a domain. They show the website’s success but not the page’s. Looking at these external ranking factors on a page level would give more insights, but it is important to remember that this study focuses on on-page factors.

Ranked keywords had a strong relationship, with correlations of .11 for 2023 and .09 for 2024. For traffic estimations, we saw .12 (2023) and .11 (2024).

Having a page on a larger website predicts higher rankings. One of the first things SEO pros learn is to avoid going after parent topics and competitive keywords where authority sites dominate the SERPs.

Five years ago, when most SEO practitioners weren’t paying attention to topic coverage, the best way to create keyword maps or plans was using the “if they can rank, we can rank” technique.

This strategy is still important when used alongside topic modeling, as it relies heavily on being certain that competitor sites analyzed have similar authority and trust.

Website Speed And High-Ranking Pages

Google created a lot of hoopla when it announced:

“Page experience signals [will] be included in Google Search ranking. These signals measure how users perceive the experience of interacting with a webpage and contribute to our ongoing work to ensure people get the most helpful and enjoyable experiences from the web…the page experience signals in ranking will roll out in May 2021.”

We looked at four site speed factors. These are:

  • HTML size (in bytes).
  • Time to first byte (TTFB).
  • Load time (in milliseconds).
  • Page size (in kilobytes).

In our 2023 study, we did not find a correlation with the page speed measurements. That was surprising. Many website owners placed too much emphasis on them last year. The highest correlation was just .03 for both time to first byte and HTML file size.

However, we saw a significant jump since the March update. This matches squarely with Google’s statement that user experience is its priority for Helpful Content. Time to first byte is the most important factor, as it was five years ago. HTML file size was the second speed factor that mattered most.

April 2024 speed correlations (Image from author, June 2024): bar chart showing correlations between speed factors and health-related content; most factors are near zero, except page sizes, which correlate negatively. Data source: Surfer SEO study, May 2024.

In 2016, I oversaw the first study to show Google measures page speed factors other than time to first byte. Since then, others have also found even bigger effects on higher ranking by having fast sites in other areas like “Time to First Paint” or “Time to First Interactive.” However, that was before 2023.

Informational Vs. Buy Intent Content

Different search intents require different approaches.

Content must be better optimized for informational searches compared to buyer intent searches.

We created two groups for user intent query types. This is another test we’ve not seen done with a big data set.

Image from author, June 2024

For buyer intent, “for sale” was appended to the end of search terms and “buy” to the front of other terms. This was implemented randomly on half of all keywords in the study. The other half had “how to use” appended to the beginning.

Since there are so many impacts on rank, these differences – if there even are any – get a bit lost. We did see a small difference where informational pages, which tend to have more comprehensive topic coverage, are slightly more sensitive when they are missing related keywords.

Our hypothesis was that ecommerce pages are not expected to be as holistic in word coverage: they draw authority from user reviews and unique images not found elsewhere. An informational page has less beyond its writing to prove its authoritativeness and trustworthiness, so the writing itself is more critical.

Prior to the March update, we saw a more pronounced difference.

Image from author, June 2024

Google knows users don’t want to see too much text on an ecommerce page. If they are ready to buy, they’ve typically done some due diligence on what to buy and have completed most of their customer journey.

Ecommerce sites use more complex frameworks, and Google can tell much about buyer user experience with technical SEO page factors that are less important on informational pages.

In addition, for sites with more than a handful of products, category pages tend to have the more thorough content that users and Google look for before diving deeper.

Challenges And Considerations

Google is under intense scrutiny because of its AI search results that give incorrect, dangerous answers to health questions. Google lowered the number of YMYL responses that trigger AI results, but it has left a double standard in place: websites appearing in Search must have content from personal experience, expertise, etc. Yet Google’s AI overviews come from scraping content to generate answers via large language models known to make mistakes (hallucinations).

There was outrage over answers to uncommon searches that produced ridiculous results for health-related questions (for example, suggesting users use glue with their pizza). In our view, the bigger issue is that AI results don’t use the same tough standards the search giant expects of website owners.

For example, a search for “stem cells cerebral palsy” in late May produced an AI overview that sources an “obscure clinic” as its supposed expert.

Screenshot from search for [stem cells cerebral palsy], June 2024

Potential For Over-Optimization

An interesting consideration posed by HCU is whether having too many of the same entities and topics as the existing top results for the same topic is considered “creating for search engines.”

There’s no way to answer that with a correlation study, but Google likely looks for subtle clues of overoptimization. Its use of machine learning suggests it examines pages for such clues, including related topics.

Keyword “stuffing” stopped being a valid SEO tactic. Perhaps “topic stuffing” might someday become a no-no. We didn’t measure that, but if having fewer related words and phrases hurts ranking, it seems this is not an issue now.

Recommendations Based On Findings

Enhance Topic Coverage And Comprehensive Content

To achieve high rankings, ensure your content is thorough and covers topics extensively. This is often referred to as “semantic SEO.”

By focusing on related topics, you can create content that addresses the primary subject and covers related subtopics, making it more valuable to readers and search engines alike.

Actionable Tips:

  • Research Related Topics: Use tools like SurferSEO.com, Frase.io, AnswerThePublic.com, Ahrefs.com, or Google’s Keyword Planner to identify related topics that complement your main content. Look for questions people are asking about your main topic and address those within your content.
  • Create Detailed Content Outlines: Develop comprehensive outlines for your articles, including primary and secondary topics. This ensures your content covers the subject matter in depth and addresses related subtopics.
  • Use Topic Clusters: Consider organizing your content into clusters, where a central “pillar” page covers the main topic broadly and links to “cluster” pages that dive deeper into related subtopics. This helps search engines understand the breadth and depth of your content.
  • Incorporate User Intent: Understand the different intents behind search queries related to your topic (informational, navigational, transactional) and create content that satisfies these intents. This could include how-to guides, detailed explanations, product reviews, and more.
  • Update Regularly: Keep your content fresh by regularly updating it with new information, trends, and insights. This shows search engines that your content is current and relevant.

Meet Higher Standards Of E-E-A-T For Health-Related Content

If your website covers health- or finance-related topics, it’s crucial to meet the high standards of experience, expertise, authoritativeness, and trustworthiness (E-E-A-T). This ensures your content is reliable and credible, which is essential for user trust and search engine rankings.

Actionable Tips:

  • Collaborate with qualified healthcare professionals to create and review your content.
  • Include clear author bios that highlight their credentials and expertise in the field.
  • Cite reputable sources and provide references to studies or official guidelines.
  • Regularly review and update your health content to ensure it remains accurate and current.
  • Build links and ensure you’re getting brand mentions off-site. Our study didn’t focus on this, but it’s critical.

Improve Website Speed And User Experience

Website speed and user experience are increasingly important for SEO. To enhance load times and overall user satisfaction, focus on improving the “time to first byte” (TTFB) and minimizing the HTML file size of your pages.

Actionable Tips:

  • Optimize your server response time to improve TTFB. This might involve upgrading your hosting plan or optimizing your server settings.
  • Minimize page size by compressing images, reducing unnecessary code, and leveraging browser caching.
  • Use tools like Google PageSpeed Insights to identify and fix performance issues.
  • Ensure your website is mobile-friendly, as most traffic comes from mobile devices.
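The speed tips above can be turned into a simple audit that flags pages exceeding chosen budgets. The thresholds below are our assumptions, not values from the study (Google’s commonly cited guidance is a TTFB under roughly 800 ms).

```python
# Illustrative sketch: flag pages whose time-to-first-byte or HTML
# size exceeds a budget. Thresholds are assumed, not from the study.

TTFB_BUDGET_MS = 800   # assumed budget, based on common guidance
HTML_BUDGET_KB = 100   # assumed budget for raw HTML size

def audit(pages: list[dict]) -> list[str]:
    """Return URLs of pages that blow either speed budget."""
    return [p["url"] for p in pages
            if p["ttfb_ms"] > TTFB_BUDGET_MS or p["html_kb"] > HTML_BUDGET_KB]

pages = [
    {"url": "/fast", "ttfb_ms": 120, "html_kb": 40},
    {"url": "/slow", "ttfb_ms": 1500, "html_kb": 220},
]
print(audit(pages))  # pages exceeding either budget
```

In practice, the timing inputs would come from field data or a tool like PageSpeed Insights rather than being hand-entered.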

Future Research

We tried to compare the top 15% of large websites to the lower 85% to see if they benefited more from the March update. There was no meaningful change.

However, slews of small publishers spoke up about the update’s outsized impact on them. We wish we had more time to examine this area. It’s important to understand how Google dramatically changed the landscape of Search.

Further studies are needed to understand the impact of semantic SEO and user intent on rankings. Google is looking at this as a site-wide signal, so the SEO community can learn a lot from a study that looks at entity and topic coverage site-wide.

Other site-wide studies with big data sets are also absent from SEO research. Can we measure site architecture across 1,000 websites to find other best practices that Google rewards?

Additional Notes And Footnotes

Editor’s Note: Search Engine Journal, ClickStream, and WLDM are not affiliated with Surfer SEO and did not receive compensation from it for this study.

All Metrics Measured And Analyzed In Our Study

Metric Description
For Domain Estimated Traffic Surfer SEO’s estimation based on search volumes, ranked keywords, and positions.
For Domain Referring Domains Number of unique domains linking to a domain, a bit outdated.
URL Domain Partial Keywords Number of partial keywords in the domain name.
Title Exact Keywords Number of exact keywords in the title.
Body Words Word count.
Body Partial Keywords Number of partial keywords in the body (exact keywords variations, a word matches if it starts with the same three letters).
Links Unique Internal How many links are on the page pointing to the same domain (internal outgoing links).
Links Unique External How many links are on the page pointing to other domains (external outgoing links).
Page Speed HTML Size (B) HTML size in bytes.
Page Speed Load Time (ms) Load time in milliseconds.
Page Speed Total Page Size (KB) Page size in kilobytes.
Structured Data Total Structured Data Types How many schema markup types are embedded on the page, e.g., local business, organization = 2.
Images Number of Elements Number of images.
Images Number of Elements Outside Links Toggle Off Number of images, including clickable images like banners or ads.
Body Number of Words in Hidden Elements Number of words hidden (e.g., display none).
Above the Fold Words Number of words visible within the first 700 pixels.
Above the Fold Exact Keywords Number of exact keywords visible within the first 700 pixels.
Above the Fold Partial Keywords Number of partial keywords visible within the first 700 pixels.
Body Exact Keywords Number of exact keywords used in the body.
Meta Description Exact Keywords Number of exact keywords used in the meta description.
URL Path Exact Keywords Number of exact keywords within the URL.
URL Domain Exact Keywords Number of exact keywords within the domain name.
URL Path Partial Keywords Number of partial keywords within the URL.

Featured Image: 7rainbow/Shutterstock

Study: Google Favors Ecommerce Sites & User-Generated Content via @sejournal, @MattGSouthern

A recent study by the digital marketing agency Amsive documented a notable change in Google’s search results rankings over the last year.

The study found that Google is surfacing more ecommerce websites and sites featuring user-generated content while reducing the visibility of product review and affiliate marketing sites.

Here’s a look at the findings and the implications for online businesses if the shifts continue.

Ecommerce Dominance In Search Results

The study found a marked increase in ecommerce sites appearing in top search positions for many commercial queries.

Keywords that previously returned results from product reviews and affiliate sites now predominantly feature online retailers.

For example:

  • “Bird feeders”: Ecommerce stores now hold all 10 top positions, replacing several product review sites from the previous year.
  • “Laptops”: The top 10 results now consist exclusively of ecommerce websites, with some appearing multiple times.
  • “Towel warmer”: Ecommerce giants like Amazon and Walmart have multiple listings, completely replacing affiliate websites in the top results.

Rise Of User-Generated Content

Alongside ecommerce sites, user-generated content (UGC) platforms have seen a significant boost in search visibility.

Reddit, Quora, and YouTube now frequently appear in top positions for various queries where they were previously absent or ranked lower.

This trend is particularly noticeable for longer queries like “toys for 2-year-old boys,” where UGC sites are more visible.

Impact On Product Review & Affiliate Sites

The shift in search rankings introduces challenges for product review and affiliate websites, as they’re now less visible for many commercial queries.

While Google hasn’t explicitly stated that product review content is considered “unhelpful,” the data suggests that recent updates have disproportionately affected these pages.

Implications For Digital Marketing Strategies

Due to these changes, product review and affiliate sites may need to reconsider their strategies to maintain visibility and traffic.

Lily Ray and Silvia Gituto, the study’s authors, suggest diversifying traffic sources through:

  • Increased focus on digital media and PR.
  • Enhanced social media engagement.
  • Creation of video content for platforms like YouTube Shorts and TikTok.
  • Development of podcast content.
  • Active participation in relevant online forums.

What This Means For Websites

For ecommerce sites, this is an opportunity to gain more visibility and traffic.

They could take advantage of this shift by getting more customer reviews and user-generated content on their sites.

Product review and affiliate sites may need to change strategies.

Promoting themselves on social media, making videos, starting podcasts, and engaging in online forums could help compensate for lost Google search traffic.

Adapting to these changes, especially around user-generated content, will likely be needed for continued success.


Featured Image: hanss/Shutterstock

Google Dials Back AI Overviews In Search Results, Study Finds via @sejournal, @MattGSouthern

According to new research, Google’s AI-generated overviews have undergone significant adjustments since the initial rollout.

The study from SE Ranking analyzed 100,000 keywords and found Google has greatly reduced the frequency of AI overviews.

However, when they appear, they’re more detailed than they were previously.

The study digs into which topics and industries are more likely to get an AI overview. It also looks at how the AI snippets interact with other search features like featured snippets and ads.

Here’s an overview of the findings and what they mean for your SEO efforts.

Declining Frequency Of AI Overviews

In contrast to pre-rollout figures, 8% of the examined searches now trigger an AI Overview.

This represents a 52% drop compared to January levels.

Yevheniia Khromova, the study’s author, believes this means Google is taking a more measured approach, stating:

“The sharp decrease in AI Overview presence likely reflects Google’s efforts to boost the accuracy and trustworthiness of AI-generated answers.”

Longer AI Overviews

Although the frequency of AI overviews has decreased, the ones that do appear provide more detailed information.

The average length of the text has grown by nearly 25% to around 4,342 characters.

In another notable change, AI overviews now link to fewer sources on average – usually just four links after expanding the snippet.

However, 84% still include at least one domain from that query’s top 10 organic search results.

Niche Dynamics & Ranking Factors

The chances of getting an AI overview vary across different industries.

Searches related to relationships, food and beverages, and technology were most likely to trigger AI overviews.

Sensitive areas like healthcare, legal, and news had a low rate of showing AI summaries, less than 1%.

Longer search queries (ten or more words) were more likely to generate an AI overview, at a 19% rate, indicating that AI summaries are more useful for complex information needs.

Search terms with lower search volumes and lower cost-per-click were more likely to display AI summaries.

Other Characteristics Of AI Overviews

The research reveals that 45% of AI overviews appear alongside featured snippets, often sourced from the same domains.

Around 87% of AI overviews now coexist with ads, compared to 73% previously, a statistic that could increase competition for advertising space.

What Does This Mean?

SE Ranking’s research on AI overviews has several implications:

  1. Reduced Risk Of Traffic Losses: Fewer searches trigger AI Overviews that directly answer queries, so organic listings are less likely to be pushed down or lose traffic.
  2. Most Impacted Niches: AI overviews appear more in relationships, food, and technology niches. Publishers in these sectors should pay closer attention to Google’s AI overview strategy.
  3. Long-form & In-Depth Content Essential: As AI snippets become longer, companies may need to create more comprehensive content beyond what the overviews cover.

Looking Ahead

While the number of AI overviews has decreased recently, we can’t assume this trend will continue.

AI overviews will undoubtedly continue to transform over time.

It’s crucial to monitor developments closely, try different methods of dealing with them, and adjust game plans as needed.


Featured Image: DIA TV/Shutterstock

Google Launches June 2024 Spam Update via @sejournal, @MattGSouthern

Google has announced the rollout of the June 2024 spam update, which aims to further improve search results by targeting websites that violate Google’s spam policies.

According to a statement, the update, which began on June 20, is expected to take up to one week to roll out fully.

Background On Google’s Spam Updates & Policies

Google regularly updates its systems to reduce low-quality and spammy content from its search results.

Spam updates target websites that break Google’s rules, such as:

  • Automatically generating content solely to improve search rankings.
  • Buying or selling links to manipulate rankings.
  • Having thin, duplicated, or poor-quality content.
  • Tricking users with hidden redirects or other deceptive techniques.

Google’s last spam update was released in March.

Despite the March update impacting many spammy websites, some AI-generated content still managed to rank well in search results.

Analysis by Search Engine Journal’s Roger Montti notes that some AI spam sites ranked for over 217,000 queries, with more than 14,900 ranking in the top 10 search results.

The sites employed tactics such as rapid content churn, AI-generated images, and templated article structures, exploiting a loophole that allowed new content to receive an initial ranking boost.

Potential Impact On Search Results

The June spam update will likely refine Google’s spam detection capabilities further.

However, past experiences have shown that closing loopholes can inadvertently impact legitimate websites.

As with any significant update, the June spam update may result in fluctuations in search rankings for some websites.

Websites that engage in practices that violate Google’s spam policies or rely heavily on AI-generated content may see a decline in their search visibility.

On the other hand, some websites may benefit from the update, as they will face less competition from spammy websites in search results.

Looking Ahead

Google says the June 2024 spam update may take up to one week to roll out fully.

Once the rollout is complete, Google will post an update on its Search Status Dashboard, and you can assess the update’s impact on your search rankings.


Featured Image: Danishch/Shutterstock

Reddit Traffic Up 39%: Is Google Prioritizing Opinions Over Expertise? via @sejournal, @MattGSouthern

Reddit’s website traffic has grown 39% compared to the previous year, according to data from Similarweb.

This growth seems fueled by Reddit’s increased visibility in Google search results.

Why is Reddit growing so fast, and what does this mean for businesses and SEO professionals?

Here’s our take on it.

Why Is Reddit Growing?

Several factors, including Google prioritizing “helpful content” from discussion forums in a recent algorithm update, have likely contributed to Reddit’s improved search rankings and visibility.

A report from Business Insider indicates that more people are now finding Reddit through Google searches than by directly visiting the reddit.com website.

Mordy Oberstein, Wix’s Head of SEO, shared recent data showing a consistent increase in the share of Reddit sources appearing in Google’s Discussion and Forums SERP feature.

Lily Ray, Senior Director of SEO and Head of Organic Research at Amsive Digital, tweeted about Reddit’s increased visibility in Google search results.

She noted that Reddit appeared in "Discussions and Forums" for various medical queries in recent weeks, but no longer does today.

Ray also observed that the number of Discussion and Forum features with multiple Reddit URLs has decreased slightly over the past months.

Google’s $60 Million Deal with Reddit

Google recently signed a $60 million deal to license Reddit data for AI products.

The timing of the deal and Reddit’s search growth raise questions.

Google has denied a direct connection between the deal and Reddit’s search visibility, but the coincidence is notable.

Implications For Marketers & SEO Professionals

Reddit’s newfound dominance in Google search results presents business challenges and opportunities.

Challenges

Roger Montti, a staff writer for Search Engine Journal, raises concerns about the expertise and trustworthiness of Reddit content:

In the article, “Let’s Be Real: Reddit In Google Search Lacks Credibility,” Montti states:

“Opinions shared on Reddit by people who lack expertise and are sharing opinions in anonymity qualify as dubious. Yet Google is not only favoring Reddit in the search results, it is also paying millions of dollars for access to content that is lacking in expertise, experience, authoritativeness and trustworthiness.”

This is challenging because it means your expert-written content could get outranked by the opinions of anonymous Reddit users.

Opportunities

Search Engine Journal founder Brent Csutoras offers a more optimistic view, believing marketers should lean into Reddit’s newfound prominence.

In the article, “Why Every Marketer Should Be On Reddit,” Csutoras states:

“If your brand has something meaningful to say and is interested in truly connecting with your audience, then yes, you should be on Reddit.”

However, Reddit’s community-driven nature requires a delicate approach, Csutoras adds:

“Reddit communities can be highly negative toward self-serving promotion. But if you put in the effort and solve people’s needs and problems, Reddit has the potential to be a high-performance channel.”

Why SEJ Cares

SEO professionals and marketers should be mindful that expert-written resources could be outranked by Reddit threads that reflect personal opinions rather than authoritative information.

However, by providing genuine value and respecting Reddit’s community guidelines, businesses may be able to leverage the platform’s prominence for increased visibility and audience engagement.


Featured Image: rafapress/Shutterstock

Is Google Crawling Your Site A Lot? That Could Be A Bad Sign via @sejournal, @MattGSouthern

According to a recent LinkedIn post by Gary Illyes, Analyst at Google, you should be cautious if Google starts aggressively crawling your website.

While an uptick in crawling can be a good sign, Illyes says it may indicate underlying issues.

Illyes cautions:

“Don’t get happy prematurely when search engines unexpectedly start to crawl like crazy from your site.”

He says there are two common problems to watch out for: infinite spaces and website hacks.

Infinite Spaces Could Cause Crawling Spike

An issue Illyes highlighted is sites with “infinite spaces”—areas like calendar modules or endlessly filterable product listings that can generate unlimited potential URLs.

If a site is crawled a lot already, crawlers may get extra excited about infinite spaces.

Illyes explains:

“If your site generally has pages that search users find helpful, crawlers will get excited about these infinite spaces for a time.”

He recommends using the robots.txt file to block crawlers from accessing infinite spaces.
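Illyes doesn’t give a specific snippet, but blocking an infinite space via robots.txt can be sketched like this (the /calendar/ path and filter parameter are hypothetical examples; Googlebot supports the * wildcard in Disallow rules):

```text
User-agent: *
# Block the calendar module's endless date pages
Disallow: /calendar/
# Block endlessly filterable product-listing URLs
Disallow: /*?filter=
```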

Hacked Sites Can Trigger Crawling Frenzy

Another troubling cause of a crawling spike is a security breach where hackers inject spam onto a reputable site.

Crawlers may initially interpret this as new content to index before realizing it’s malicious.

Illyes states:

“If a no-good-doer somehow managed to get access…they might flood your site with, well, crap… crawlers will get excited about these new pages for a time and happily crawl them.”

Remain Skeptical Of Crawling Spikes

Rather than assuming a crawling spike is positive, Illyes suggests treating it as a potential issue until the root cause is identified.

He states:

“Treat unexpected sharp increases in crawling as a symptom…until you can prove otherwise. Or, you know, maybe I’m just a hardline pessimist.”

Fixing Hacked Sites: Help From Google

For hacked sites, Illyes pointed to a Google help page that includes a video with further assistance.

Here are the key points.

Tips From Google’s Video

Google’s video outlines the steps in the recovery process.

1. Identify The Vulnerability

The first crucial step is finding how the hacker gained access. Tools like Google Search Console (formerly Webmaster Tools) can assist in detecting issues.

2. Fix The Vulnerability

Once the security hole is identified, it must be closed to prevent any future unauthorized access. This could involve updating software, changing passwords, etc.

3. Clean The Hacked Content

Check the entire site’s content and code to remove any spam, malware, defaced pages, or other injections by the hacker. Security plugins like Wordfence can assist in this process.

4. Harden Security

Beyond fixing the specific vulnerability, take additional measures to harden the site’s security. This could include enabling firewalls, limiting user permissions, and more frequent software updates.

5. Request A Review

Once the vulnerability is patched and the hacked content is removed, you can request that Google review the site and remove any security warnings or blocklists once it's verified as clean.

The video notes that the review process is faster for malware issues (days) than for spam issues (weeks), because Google must inspect spam cleanup efforts more thoroughly.

Additional Tips From Google’s John Mueller

Google’s John Mueller has previously offered specific advice on recovering from the SEO impact of hacked pages:

  1. Use the URL removal tool to deindex the hacked pages quickly.
  2. Focus on improving the overall site quality beyond removing hacked content.
  3. Lingering impacts may persist for months until the site recovers Google’s trust.

Why SEJ Cares

Website security is crucial for all businesses, as hacked content can impact trust and search engine rankings.

Google’s Gary Illyes pointed out that sudden spikes in crawling activity could indicate security breaches or technical issues that need immediate attention.


Featured Image: Stacey Newman/Shutterstock

Google’s Unconventional Advice On Fixing Broken Backlinks via @sejournal, @martinibuster

Google’s Gary Illyes recently answered the question of whether one should spend time fixing backlinks with wrong URLs that are pointing to a website, known as broken backlinks. The answer is interesting because it suggests a way of considering this issue in a completely unorthodox manner.

Google: Should Broken Backlinks Be Fixed?

During a recent Google SEO Office Hours podcast, a question was asked about fixing broken backlinks:

“Should I fix all broken backlinks to my site to improve overall SEO?”

Google’s Gary Illyes answered:

“You should fix the broken backlinks that you think would be helpful for your users. You can’t possibly fix all the links, especially once your site grew to the size of a mammoth. Or brontosaurus.”

Unconventional Advice

Assessing broken backlinks for those that are the most helpful for “users” is an unconventional way to decide whether to fix them. The conventional SEO practice is to fix a broken backlink to ensure that a site receives the maximum available link equity. His advice runs counter to standard SEO practice, but it shouldn’t be dismissed out of hand, because there may be something useful in it.

Keep an open mind and be open to different ways of considering solutions. What I like about his approach is that it’s a shortcut for determining whether a backlink is useful. For example, if the link points to a product that is no longer sold or supported in any way, a 404 response is the best thing to show to search crawlers and to users. So there is some validity to his way of looking at it.

Why Broken Backlinks Should Be Fixed

Fixing these kinds of backlinks is not a big deal; it’s one of the easier SEO chores, and it’s a quick win.

While any benefit is hard to measure, it’s nonetheless worth doing it for site visitors who might follow the wrong URL to the webpage that they’re looking for.

Check Backlinks After A Link Building Campaign

Checking backlinks is also important after a link building campaign, even months after asking for a link, because site owners sometimes add links weeks or months later and may add the wrong URL. It happens; I know from experience.

Broken Backlinks That Do & Don’t Matter

The kinds of broken backlinks that usually (but not always) matter are the ones that show up as 404 errors on your server logs or in the Google Search Console.

There are two kinds of broken backlinks that matter:

  1. A backlink that’s broken because the linked page no longer exists or the URL changed.
  2. The URL of the backlink is misspelled.

Then there are backlinks that matter less and the reasons for that are:

  • Because the broken backlink is from a low quality website that doesn’t send any traffic
  • The link is to an outdated webpage that doesn’t matter and should return a 404 response
  • It’s just a random link created by an AI chatbot, spambot, or a spam web page.

How To Identify Broken Backlinks

Identifying any kind of broken backlink is (arguably) best done by reviewing 404 errors generated by visits to pages that no longer exist or to misspelled URLs. If the link matters, there will be web traffic from the broken backlink to a 404 page.

You might not be able to see where that link is coming from, although it may be possible to search for the broken URL and possibly find it.

The server log may show the IP address and user agent of the visitor who followed the broken link, and from there a site owner can judge whether it’s a spam or hacker bot, a search engine bot, or an actual user. The Redirection and Wordfence WordPress plugins can be helpful for site owners who don’t have access to server logs.

A site owner may find a SaaS backlink tool useful for finding broken links. But many sites, particularly those that have been around a while, have a lot of backlinks, and a tool may not be the right solution: it’s a lot of work to find a link that doesn’t even send traffic. If a broken link sends traffic, you’ll know, because it will show up as a 404 error response.
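As a rough sketch of the log-review approach, the script below scans access-log lines in the common combined format and counts 404 hits that arrived via an external referrer; the log format and sample URLs are assumptions for illustration:

```python
import re
from collections import Counter

# Combined Log Format: ip - - [date] "GET /path HTTP/1.1" status size "referer" "user-agent"
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)"'
)

def broken_backlink_hits(log_lines):
    """Count 404 hits that arrived via an external referrer (likely broken backlinks)."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "404" and m.group("referer") not in ("-", ""):
            counts[(m.group("path"), m.group("referer"))] += 1
    return counts

sample = [
    '1.2.3.4 - - [01/Jun/2024:10:00:00 +0000] "GET /old-page HTTP/1.1" 404 512 "https://example.org/post" "Mozilla/5.0"',
    '1.2.3.5 - - [01/Jun/2024:10:01:00 +0000] "GET /home HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]
print(broken_backlink_hits(sample))
```

A 404 with a real referrer tells you both the broken URL and which page is sending the traffic, which is exactly the judgment call described above.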

Fixing Broken Backlinks

Fixing links that no longer exist can be done by recreating the resource or by redirecting requests for the missing web page to a web page that is substantially similar.

Fixing a link to a misspelled URL is easily done by redirecting the misspelled URL to the correct URL.

Another way to fix it is to contact the site that’s linking to the wrong URL but there are three things to consider before doing that.

1. The site owner may decide that they don’t want to link to the site and remove the link altogether.

2. The site owner may decide to add a no-follow link attribute to the corrected URL.

3. There are other sites that may have copied the web page and/or the link and are thus also linking to the wrong URL.

Simply adding a redirect from the misspelled URL to the correct URL fixes the problem without any risk that the backlink is going to be removed or nofollowed.
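In practice, the redirect can be a one-liner in the server configuration; the URLs below are hypothetical (Apache shown, with an nginx equivalent in the comment):

```text
# Apache (.htaccess): permanently redirect the misspelled URL to the correct one
Redirect 301 /widgets-catalgo /widgets-catalog

# nginx equivalent:
# location = /widgets-catalgo { return 301 /widgets-catalog; }
```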

Broken Backlinks & Link Reclamation

Many site owners stumble on broken backlinks when investigating 404 errors. Some call fixing them “link reclamation,” but that’s just another name for the same work.

Regardless, fixing these kinds of inbound links is one of the few SEO quick wins that can actually benefit a site owner. It can be part of a site audit, especially when limited to finding opportunities in 404 error responses, because those links are either getting crawled or being used by potential site visitors.

Listen to the podcast at the 5:32 minute mark for the answer on fixing broken backlinks:

Featured Image by Shutterstock/Roman Samborskyi

How To Think About SEO, Content & PR Measurement (Indicated In The Google Leak) via @sejournal, @_kevinrowe

Google’s recent leak highlighted engagement as part of the ranking system, alluding to the importance of influencing audience behavior to drive SEO-specific metrics, like ranking or organic visibility.

That said, I’ve been using simple variations of these measures for a while to evaluate the impact of integrated PR and SEO campaigns. I don’t think the idea of measuring search behavior is new, but the Google leaks shed some light on its importance.

For sustainable growth in organic visibility and rankings, SEO strategies need to pivot to include measures that reflect how strongly owned assets, marketing assets, and messaging influence an audience’s search behavior.

Google’s broader objective to rank content that is genuinely helpful to specific audience segments is an important context for considering this shift.

So, SEO pros should evaluate website performance based on engagement-driven metrics like asset NPS, idea adoption rate, and time to activation, which will be important for directly and indirectly maximizing organic search visibility.

Why Measure Influence

The recent Google leaks highlight the growing importance of audience engagement measures in ranking pages.

This highlights the importance of integrating SEO, content creation, and PR, where influencing audience behavior becomes a key measure.

I see it like this:

  • Google emphasizes engagement: The Google leaks suggest that Google places a lot of weight on user engagement measures such as click data, repeat visitors, site traffic, or related click data. Despite being incomplete and likely outdated information, it is one of many examples of Google using user engagement in some way.
  • AI integration into the algorithm: With AI being integrated more heavily into Google’s ranking systems, AI could interpret and use this user engagement data to influence ranking.
  • Brand search: Site traffic from brand search is an indicator of audience engagement and can influence organic visibility.

But to drive audience engagement, we have to think beyond simple SEO activities like link building, creating keyword-focused content, or technical SEO.

The future of search marketing is designing scenarios that influence an audience’s search behavior.

Ideal Search Behavior Scenario

The audience’s journey is more complex today than ever because they use many different sources to learn about their problems, the solutions, and the opportunities they create. However, this scenario simplifies how to think about your search strategy.

Scenario: You create an asset, you get PR coverage, and the audience searches the asset in Google (maybe they don’t find it based on keywords, then search your brand name). Then, they keep returning to your site for new assets or resources to solve their problems or create an opportunity (the original one as a resource or for your offering).

Simple Search Behavior Scenario Statement:

I need to create a content asset about [a problem or opportunity], to get coverage about [an aspect of the asset] that the audience will respond to because [audience interest], which will drive my audience to search for [category or terms you own], and they will take action immediately or return to the site later because [it solves a problem or creates an opportunity].

You’ll have to modify this based on your specific website event goals, but the statement’s essence will guide you in the right direction.

This direction will allow you to focus on the much more significant but more difficult-to-impact measures below.

I have a foundation in product management and marketing, so I adopted these measures from product marketing concepts since they directly relate to audience actions.

Measure 1: Asset NPS

How likely is your audience to promote your content assets or ideas?

NPS (Net Promoter Score) is used to gauge an audience’s loyalty and satisfaction using a single survey question: “How likely are you to recommend our content to a friend or colleague?”

Respondents can provide a rating from 0-10.

  • Promoters (9-10): Loyal and enthusiastic audience who keep talking about and referring your content or ideas to others.
  • Passives (7-8): Satisfied with content but not an overly enthusiastic audience who will listen to a competitor’s point of view.
  • Detractors (0-6): Unhappy audience that speaks negatively about your content.
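The standard NPS arithmetic (percentage of promoters minus percentage of detractors) can be sketched from the buckets above; the survey ratings in the example are made up:

```python
def nps(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# 10 hypothetical responses: 4 promoters, 4 passives, 2 detractors
print(nps([10, 9, 9, 10, 8, 7, 7, 8, 5, 3]))  # 20
```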

A high NPS indicates strong audience engagement and can indirectly influence organic visibility.

Typically, you’d have to survey an audience to gather the data. Use Google Forms, SurveyMonkey, or any survey tool with a rating scale to collect responses.

Pro tip: Survey the audience on your site, the following you have on social media, or the email list you’re building as a result of the audience submitting contact info on the site or even through a newsletter.

Measure 2: Idea Adoption Rate

Does your audience adopt your ideas?

The adoption rate of an idea refers to the percentage of the audience segment that starts using the idea after you launch the asset.

This is a key measure to understand if your audience is accepting a particular idea, providing insights into engagement and market fit. This could directly influence engagement signals that can influence ranking.

Here’s How To Calculate

Metrics

  • Audience segment size: How many people are in your audience segment?
  • Audience usage size: Number of people who use the ideas in your content.
Formula: Adoption rate = (audience usage size/audience segment size) X 100%
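A worked example of the formula above, with hypothetical audience numbers:

```python
def adoption_rate(audience_usage_size, audience_segment_size):
    """Adoption rate = (audience usage size / audience segment size) x 100%."""
    return 100 * audience_usage_size / audience_segment_size

# Hypothetical: 1,200 people applying the idea out of a 40,000-person segment
print(adoption_rate(1200, 40000))  # 3.0 (percent)
```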

You can collect this data in a lot of different ways, but shares alone are not a great metric since I don’t believe they reflect actual influence.

Find discussions or actions taken as a result of your ideas or content.

  • Is your audience discussing your ideas on LinkedIn, Twitter (X), or relevant social?
  • Are newsletters talking about your ideas or the essence of your ideas?
  • Are your process steps being discussed?
  • Do people share videos using your product or ideas?

Pro tip: I see some creators concerned about people “stealing” their original idea. I don’t think this is a bad thing. It’s a signal of adoption: the idea solves a significant problem or captures an opportunity.

Measure 3: Time To Activation

How long does it take your audience to take action on your site?

Time to activation measures how long it takes for your audience to take action by searching a topic or taking action on your site after engaging with your messaging.

These can include brand searches, search keywords you own, document downloads, contacting for a quote, or requesting a demo.

This measure can show how well your content is being adopted or if the messaging aligns with your audience’s journey. Shorter activation times suggest strong alignment with audience needs and higher content efficacy.

How To Measure

  • Identify an activation point (e.g., events you want the audience to trigger) or goals on the site.
  • Estimate how many people read or engaged with your content.
  • Measure how many people took action around specific events on the site.
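The steps above can be sketched as a small calculation: given per-user timestamps for first engagement and activation (the event exports and dates here are hypothetical), take the median gap.

```python
from datetime import datetime
from statistics import median

def time_to_activation_days(engagements, activations):
    """Median days between first engagement and activation, matched by user id.
    engagements/activations: {user_id: ISO date string} from your analytics export."""
    deltas = []
    for user, start in engagements.items():
        if user in activations:
            t0 = datetime.fromisoformat(start)
            t1 = datetime.fromisoformat(activations[user])
            deltas.append((t1 - t0).days)
    return median(deltas)

engaged = {"a": "2024-01-01", "b": "2024-01-05", "c": "2024-02-01"}
activated = {"a": "2024-01-31", "b": "2024-02-14"}
print(time_to_activation_days(engaged, activated))  # median of [30, 40] -> 35.0
```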

Pro tip: Some marketers will say you shouldn’t measure your program because attribution modeling doesn’t work or SEO takes time. However, time to activation highlights the importance of evaluating the actions on the site that the campaign should drive. Design campaigns for time to activation of less than 3 months for each event, 6 months for large goals, and 12 months for larger business impacts like creating a new market category.

As you activate your audience, brand search will likely have an impact, as your audience will likely search Google for more information on your topic.

Measure 4: Brand Search Volume

Does the audience search for your brand in search engines?

Brand search volume refers to the number of times users search for a specific term you branded or own in search engines.

You can measure this in Google Search Console, searching for your brand name or a term you own.

Pro tip: Brand keywords are reported in Google Analytics under the general search engine (e.g. Google) with non-brand keywords. Look for short-term spikes or sustainable trends in Google Search Console, segmenting it in any way possible (e.g. page, query, date, brand modified term) to find the impact. Design your strategy with the idea of being able to measure brand search impact.

Impact On Your Strategy

Integrating SEO and PR strategies to influence audience behavior and engagement is important for maximizing organic visibility and search rankings.

Google’s recent leaks emphasize the importance of audience engagement, highlighting the need to integrate content creation, SEO, and PR to drive meaningful interactions.

Measures such as asset NPS, idea adoption rate, and time to activation provide valuable insights into audience loyalty, idea adoption, and action times.

These measures seem important for directly and indirectly influencing search engine rankings, and they are critical for audience engagement.

These engagement-driven measures will help ensure you don’t have to keep chasing Google’s evolving algorithms and that content genuinely resonates with your audience segment.

Start designing integrated PR and SEO strategies.



Featured Image: Yurii_Yarema/Shutterstock

Google Answers Question About Toxic Link Sabotage via @sejournal, @martinibuster

Google’s Gary Illyes answered a question about how to notify Google that someone is poisoning their backlink profile with “toxic links” which is a problem that many people have been talking about for at least fifteen years.

Question About Alerting Google To Toxic Links

Gary narrated the question:

“Someone’s asking, how to alert Google of sabotage via toxic links?”

And this is Gary’s answer:

“I know what I would do: I’d ignore those links.

Generally Google is really, REALLY good at ignoring links that are irrelevant to the site they’re pointing at. If you feel like it, you can always disavow those ‘toxic’ links, or file a spam report.”

Disavow Links If You Feel Like It

Gary linked to Google’s explainer about disavowing links where it’s explained that the disavow tool is for a site owner to tell Google about links that they are responsible for in some way, like paid links or some other link scheme.

This is what it advises:

“If you have a manual action against your site for unnatural links to your site, or if you think you’re about to get such a manual action (because of paid links or other link schemes that violate our quality guidelines), you should try to remove the links from the other site to your site. If you can’t remove those links yourself, or get them removed, then you should disavow the URLs of the questionable pages or domains that link to your website.”

Google suggests that a link disavow is only necessary when two conditions are met:

  1. “You have a considerable number of spammy, artificial, or low-quality links pointing to your site,
    AND
  2. The links have caused a manual action, or likely will cause a manual action, on your site.”

Both of the above conditions must be met for a valid use of the link disavow tool.
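For reference, the disavow file Google accepts is a plain text file with one URL or domain per line; the domains below are hypothetical:

```text
# Comment lines are ignored
# Disavow an entire domain:
domain:spammy-links.example
# Disavow a single page:
https://low-quality.example/paid-links.html
```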

Origin Of The Phrase Toxic Links

As Google became better at penalizing sites for low quality links and paid links, some in the highly competitive gambling industry started creating low quality links to sabotage their competitors. The practice was called negative SEO.

The phrase “toxic link” was unheard of until after the Penguin link updates of 2012, which required penalized sites to remove all the paid and low-quality links they had created and then disavow the rest. An industry grew up around disavowing links, and it was that industry that invented the phrase “toxic links” for use in its marketing.

Confirmation That Google Is Able To Ignore Links

I have shared this anecdote before and I’ll share it here again. Someone I knew contacted me and said that their site lost rankings from negative SEO links. I took a look and their site had a ton of really nasty looking links. So out of curiosity (and because I knew that the site was this person’s main income), I emailed someone at Google Mountain View headquarters about it. That person checked it and replied that the site didn’t lose rankings because of the links. They lost rankings because of a Panda update related content issue.

That was around 2012, and it showed me how good Google was at ignoring links. If Google was that good at ignoring really bad links back then, it’s probably better at it now, twelve years later, with the SpamBrain AI.

Listen to the question and answer at the 8:22 minute mark:

Featured Image by Shutterstock/New Africa

Google On Traffic Diversity As A Ranking Factor via @sejournal, @martinibuster

Google’s SearchLiaison tweeted encouragement to diversify traffic sources, being clear about the reason he was recommending it. Days later, someone followed up to ask if traffic diversity is a ranking factor, prompting SearchLiaison to reiterate that it is not.

What Was Said

The question of whether traffic diversity is a ranking factor arose from a previous tweet in a discussion about whether a site owner should focus on off-site promotion.

Here’s the question from the original discussion that was tweeted:

“Can you please tell me if I’m doing right by focusing on my site and content – writing new articles to be found through search – or if I should be focusing on some off-site effort related to building a readership? It’s frustrating to see traffic go down the more effort I put in.”

SearchLiaison split the question into component parts and answered each one. When it came to the part about off-site promotion, SearchLiaison (who is Danny Sullivan), shared from his decades of experience as a journalist and publisher covering technology and search marketing.

I’m going to break down his answer so that it’s clearer what he meant.

This is the part from the tweet that talks about off-site activities:

“As to the off-site effort question, I think from what I know from before I worked at Google Search, as well as my time being part of the search ranking team, is that one of the ways to be successful with Google Search is to think beyond it.”

What he is saying here is simple: don’t limit your thinking about what to do with your site to making it appeal to Google.

He next explains that sites that rank tend to be sites that are created to appeal to people.

SearchLiaison continued:

“Great sites with content that people like receive traffic in many ways. People go to them directly. They come via email referrals. They arrive via links from other sites. They get social media mentions.”

What he’s saying there is that you’ll know you’re appealing to people if they are discussing your site on social media, referring others to it, and if other sites are citing it with links.

Other signs that a site is doing well are when people engage in the comments section, send emails asking follow-up questions, send emails of thanks, and share anecdotes of their success or satisfaction with a product or advice.

Consider this: at one point, fast fashion site Shein didn’t rank for its chosen keyword phrases (I know because I checked out of curiosity). But at the time it was virally popular and generating huge sales by gamifying site interaction and engagement, which propelled it to become a global brand. A similar strategy propelled Zappos when it pioneered no-questions-asked returns and cheerful customer service.

SearchLiaison continued:

“It just means you’re likely building a normal site in the sense that it’s not just intended for Google but instead for people. And that’s what our ranking systems are trying to reward, good content made for people.”

SearchLiaison explicitly said that building a site with diversified traffic is not a ranking factor.

He added this caveat to his tweet:

“This doesn’t mean you should get a bunch of social mentions, or a bunch of email mentions because these will somehow magically rank you better in Google (they don’t, from how I know things).”

Despite The Caveat…

A journalist tweeted this:

“Earlier this week, @searchliaison told people to diversify their traffic. Naturally, people started questioning whether that meant diversity of traffic was a ranking factor.

So, I asked @iPullRank what he thought.”

SearchLiaison answered by pointing out that he never said it was a ranking factor, and linked to his original tweet quoted above.

He tweeted:

“I mean that’s not exactly what I myself said, but rather than repeat all that I’ll just add the link to what I did say:”

The journalist responded:

“I would say this is calling for publishers to diversify their traffic since you’re saying the great sites do it. It’s the right advice to give.”

And SearchLiaison answered:

“It’s the part of “does it matter for rankings” that I was making clear wasn’t what I myself said. Yes, I think that’s a generally good thing, but it’s not the only thing or the magic thing.”

Not Everything Is About Ranking Factors

There is a longstanding practice by some SEOs of parsing everything Google publishes for clues to how its algorithm works. This happened with the Search Quality Raters guidelines. Google is unintentionally complicit because its general policy is not to confirm whether or not something is a ranking factor.

This habit of searching for “ranking factors” leads to misinformation. It takes more acuity to read research papers and patents to gain a general understanding of how information retrieval works, but trying to understand something is harder than skimming a PDF for ranking factors.

The worst approach to understanding search is to invent hypotheses about how Google works and then pore through a document to confirm those guesses, falling into the confirmation-bias trap.

In the end, it may be more helpful to back off from exclusively optimizing for Google and focus at least as much on optimizing for people (which includes optimizing for traffic). I know it works because I’ve been doing it for years.

Featured Image by Shutterstock/Asier Romero