Google Ads Now Being Mixed In With Organic Results via @sejournal, @brodieseo

Google has an incentive to encourage users to click its sponsored ads – but this should not be to the detriment of user experience.

This aspect of Search seems to have gone awry in recent years, with Google engaging in activities that negatively impacted users.

Historically, search engine users have been accustomed to ads being placed either at the top or the bottom of a SERP, with the page itself either being purely organic results or having the organic results placed in between the ads. Search features are often mixed in, too.

This has now changed.

A change was recently added to Google’s documentation, stating that:

“Top ads may show below the top organic results on certain queries.”

The update details how placement for top ads is dynamic and may change.

In this article, we explore this change and its impact on users and organic search results.

Timeline Of Changes

Leading up to the change, Google had been testing mixing sponsored ads within organic listings in various capacities over a 10-month period.

Here is a timeline of the changes leading up to the official launch.

June 17th, 2023: Initial Testing

This was the first time the test appeared in Google’s search results, only showing on mobile devices at the time. During this initial testing period, the ads showed for very few users, with a more discreet presentation that appeared only on mobile and was easily mistaken for an organic listing.

October 23rd, 2023: Heavier Testing

This testing period was the first time the broader SEO community started to notice ad labels appearing within organic listings, visible across both mobile and desktop.

This testing period was more prolonged in the lead-up to launch.

March 28th, 2024: Launch

On this date, Google’s Ads Liaison announced that the change would be a permanent one, with a new definition being added to the “top ads” documentation. From this date, users could expect ads to be mixed in with organic results as an official change, beyond limited testing.

Different Types Of Placements

Now that Google has been mixing sponsored ads within organic results for almost two months, we’re able to gain a better understanding of the extent of the change and how the sponsored ads are appearing.

Based on my research, there are two common situations where Google is presenting ads within organic listings.

Mixed With Organic Results

The standard approach involves a simple ad placement within the top organic results.

Based on my experience, it is common for one or two ads to be placed together in this situation. It is rare to see the full maximum of four ads in a row.

An example of this can be found below:

Screenshot from search for [seo expert melbourne], Google, May 2024

In this example, the sponsored ad technically appears in position #2 on the page. Normally, the ad would have appeared above my page, but in this instance, it is below.

For the Semrush page, visibility on the SERP would be unchanged if the ad were above it, but my page gains an advantage in terms of ranking visibility.

Directly Below Featured Snippets

What seems to be the most common way ads are mixed in with organic listings is by placing them directly below a featured snippet.

In cases like this, it is common for a full set of four ads to appear below the featured snippet. In this example, two ads are appearing.

Two sponsored ads appearing in the SERPs. Screenshot from search for [bankruptcy], Google, May 2024

In the past (and this can still happen now), ads were always placed directly above the featured snippet.

This could have been perceived as a poor user experience, considering featured snippets tend to show when an answer to a query can be explained with a short description from the page.

What Are Google’s Intentions?

Each of the situations explained in the previous section could be interpreted differently.

The first situation (ads mixed within organic results) makes Google’s intentions pretty clear: to encourage more clicks on ads and desensitize users to ads appearing at the top, with users mistaking ads for organic listings.

In contrast, the second situation with featured snippets could be perceived differently. While ads continue to appear in the viewport on desktop, the answer to the user’s query is prominently displayed at the top of search results without ads getting in the way.

I can’t see this being a bad thing for users or SEO, as Google is making the organic listing more visible across these instances.

In general, I’m aware of Google’s need to prioritize ad revenue with changes to ad placement. While there are certainly arguments to be made from both angles with this change, my perception is that the outcome is fairly neutral from both sides.

Ads mixed in with organic results are still exceptionally rare, but featured snippet placements are a more common use case, and there are some clear upsides to this.

How To Analyze With Semrush

Screenshot from Semrush, May 2024

While Semrush does have an Advertising Research tool that shows you the position of your ads across various queries, I found that the data wasn’t being collected in a way that allows you to compare ad position relative to organic listings.

As an alternative, I found the best approach for analysis to be through using “Ads top” as a SERP feature filter through Organic Research to locate instances where ads were being mixed with organic listings.

Here’s where this filter is located:

This filter doesn’t allow you to filter by URLs for a specific domain; instead, it shows instances where “top ads” appear as a SERP feature across the Semrush index.

Using this method, I’m able to review historical top ad inclusions since the launch in March and conclude that ads being mixed in with organic results is still exceptionally rare.

Final Thoughts

Overall, based on how Google currently operates, I’m not particularly concerned about this ad placement change from Google.

While the change is an official one based on the update to Google’s documentation, it still operates more like a test, where ads are continuing to appear in normal positions in the vast majority of instances.

Based on my research, I believe the change should be perceived as neutral for Google users and SEO. If you see ads being mixed with organic listings in the wild, keep your wits about you.

I’ll be keeping an eye on this change to make sure Google’s ad placements don’t get too carried away.

Google’s ad testing has more recently reverted to using the “ad” label instead of “sponsored” on mobile, which was the previous treatment until recent years.

We can certainly expect these types of tests to continue into the future; there’s never a boring day in our industry.


Featured Image: BestForBest/Shutterstock 

Why Using A Log Analyzer Is A Must For Big Websites

This post was sponsored by JetOctopus. The opinions expressed in this article are the sponsor’s own.

If you manage a large website with over 10,000 pages, you can likely appreciate the unique SEO challenges that come with such scale.

Sure, the traditional tools and tactics — keyword optimization, link building, etc. — are important to establish a strong foundation and maintain basic SEO hygiene.

However, they may not fully address the technical complexities of site visibility for search bots and the dynamic needs of a large enterprise website.

This is where log analyzers become crucial. An SEO log analyzer monitors and analyzes server access logs to give you real insights into how search engines interact with your website. It allows you to take strategic action that satisfies both search crawlers and users, leading to stronger returns on your efforts.

In this post, you’ll learn what a log analyzer is and how it can enable your enterprise SEO strategy to achieve sustained success. But first, let’s take a quick look at what makes SEO tricky for big websites with thousands of pages.

The Unique SEO Challenges For Large Websites

Managing SEO for a website with over 10,000 pages isn’t just a step up in scale; it’s a whole different ball game.

Relying on traditional SEO tactics limits your site’s potential for organic growth. You can have the best titles and content on your pages, but if Googlebot can’t crawl them effectively, those pages will be ignored and may never rank.

Image created by JetOctopus, May 2024

For big websites, the sheer volume of content and pages makes it difficult to ensure every (important) page is optimized for visibility to Googlebot. Then, the added complexity of an elaborate site architecture often leads to significant crawl budget issues. This means Googlebot is missing crucial pages during its crawls.

Image created by JetOctopus, May 2024

Furthermore, big websites are more vulnerable to technical glitches — such as unexpected tweaks in the code from the dev team — that can impact SEO. This often exacerbates other issues like slow page speeds due to heavy content, broken links in bulk, or redundant pages that compete for the same keywords (keyword cannibalization).

All in all, these issues that come with size necessitate a more robust approach to SEO, one that can adapt to the dynamic nature of big websites and ensure that every optimization effort is more meaningful toward the ultimate goal of improving visibility and driving traffic.

This strategic shift is where the power of an SEO log analyzer becomes evident, providing granular insights that help prioritize high-impact actions. The primary action is to understand Googlebot as if it were your website’s main user: until Googlebot accesses your important pages, they won’t rank and drive traffic.

What Is An SEO Log Analyzer?

An SEO log analyzer is essentially a tool that processes and analyzes the data generated by web servers every time a page is requested. It tracks how search engine crawlers interact with a website, providing crucial insights into what happens behind the scenes. A log analyzer can identify which pages are crawled, how often, and whether any crawl issues occur, such as Googlebot being unable to access important pages.

By analyzing these server logs, log analyzers help SEO teams understand how a website is actually seen by search engines. This enables them to make precise adjustments to enhance site performance, boost crawl efficiency, and ultimately improve SERP visibility.

Put simply, a deep dive into the logs data helps discover opportunities and pinpoint issues that might otherwise go unnoticed in large websites.
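
To make this concrete, here is a minimal sketch (not JetOctopus code) of the kind of processing an SEO log analyzer performs: reading raw server access logs, keeping only Googlebot requests, and counting which URLs are crawled and which status codes the bot receives. The file name, log format, and regular expression are assumptions for illustration.

```python
import re
from collections import Counter

# Assumed "combined" access log format, e.g.:
# 66.249.66.1 - - [22/May/2024:10:15:32 +0000] "GET /category/widgets HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

crawled_paths = Counter()   # how often Googlebot requested each URL path
status_codes = Counter()    # which status codes were served to Googlebot

with open("access.log") as log_file:  # hypothetical log file location
    for line in log_file:
        match = LOG_LINE.match(line)
        if not match:
            continue
        if "Googlebot" not in match["agent"]:
            continue  # keep only search-bot hits (real tools also verify the bot's IP)
        crawled_paths[match["path"]] += 1
        status_codes[match["status"]] += 1

print("Most crawled paths:", crawled_paths.most_common(10))
print("Status codes served to Googlebot:", dict(status_codes))
```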

But why exactly should you focus your efforts on treating Googlebot as your most important visitor?

Why is crawl budget a big deal?

Let’s look into this.

Optimizing Crawl Budget For Maximum SEO Impact

Crawl budget refers to the number of pages a search engine bot — like Googlebot — will crawl on your site within a given timeframe. Once a site’s budget is used up, the bot will stop crawling and move on to other websites.

Crawl budgets vary for every website, and your site’s budget is determined by Google based on a range of factors such as the site’s size, performance, frequency of updates, and links. When you focus on optimizing these factors strategically, you can increase your crawl budget and speed up ranking for new website pages and content.

As you’d expect, making the most of this budget ensures that your most important pages are frequently visited and indexed by Googlebot. This typically translates into better rankings (provided your content and user experience are solid).

And here’s where a log analyzer tool makes itself particularly useful by providing detailed insights into how crawlers interact with your site. As mentioned earlier, it allows you to see which pages are being crawled and how often, helping identify and resolve inefficiencies such as low-value or irrelevant pages that are wasting valuable crawl resources.

An advanced log analyzer like JetOctopus offers a complete view of all the stages from crawling and indexation to getting organic clicks. Its SEO Funnel covers all the main stages, from your website being visited by Googlebot to being ranked in the top 10 and bringing in organic traffic.

Image created by JetOctopus, May 2024

As you can see above, the tabular view shows how many pages are open to indexation versus those closed from indexation. Understanding this ratio is crucial because if commercially important pages are closed from indexation, they will not appear in subsequent funnel stages.

The next stage examines the number of pages crawled by Googlebot. “Green pages” represent pages that are crawled and within the site structure, while “gray pages” are visited by Googlebot but sit outside the structure (possibly orphan pages or pages accidentally excluded from the structure), indicating potential crawl budget waste. Hence, it’s vital to analyze this part of your crawl budget for optimization.

The later stages include analyzing what percentage of pages are ranked in Google SERPs, how many of these rankings are in the top 10 or top three, and, finally, the number of pages receiving organic clicks.

Overall, the SEO funnel gives you concrete numbers, with links to lists of URLs for further analysis, such as indexable vs. non-indexable pages and how crawl budget waste is occurring. It is an excellent starting point for crawl budget analysis, allowing you to visualize the big picture and get insights for an impactful optimization plan that drives tangible SEO growth.

Put simply, by prioritizing high-value pages — ensuring they are free from errors and easily accessible to search bots — you can greatly improve your site’s visibility and ranking.

Using an SEO log analyzer, you can understand exactly what should be optimized on pages that are being ignored by crawlers, work on them, and thus attract Googlebot visits. A log analyzer also helps optimize other crucial aspects of your website:

Image created by JetOctopus, May 2024
  • Detailed Analysis of Bot Behavior: Log analyzers allow you to dissect how search bots interact with your site by examining factors like the depth of their crawl, the number of internal links on a page, and the word count per page. This detailed analysis provides you with the exact to-do items for optimizing your site’s SEO performance.
  • Improves Internal Linking and Technical Performance: Log analyzers provide detailed insights into the structure and health of your site. They help identify underperforming pages and optimize internal link placement, ensuring smoother navigation for users and crawlers. They also facilitate the fine-tuning of content to better meet SEO standards, while highlighting technical issues that may affect site speed and accessibility.
  • Aids in Troubleshooting JavaScript and Indexation Challenges: Big websites, especially eCommerce, often rely heavily on JavaScript for dynamic content. In the case of JS websites, the crawling process is lengthy. A log analyzer can track how well search engine bots are able to render and index JavaScript-dependent content, underlining potential pitfalls in real-time. It also identifies pages that are not being indexed as intended, allowing for timely corrections to ensure all relevant content can rank.
  • Helps Optimize Distance from Index (DFI): The concept of Distance from Index (DFI) refers to the number of clicks required to reach any given page from the home page. A lower DFI is generally better for SEO as it means important content is easier to find, both by users and search engine crawlers. Log analyzers help map out the navigational structure of your site, suggesting changes that can reduce DFI and improve the overall accessibility of key content and product pages.
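
To illustrate the last point, DFI can be computed directly from a site’s internal link graph with a breadth-first search. Below is a minimal sketch, assuming you already have a crawl export mapping each page to the pages it links to; the graph and URLs are hypothetical.

```python
from collections import deque

def distance_from_index(internal_links, home="/"):
    """Return the number of clicks needed to reach each page from the home page."""
    dfi = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in internal_links.get(page, []):
            if target not in dfi:  # first visit = shortest path in an unweighted graph
                dfi[target] = dfi[page] + 1
                queue.append(target)
    return dfi

# Hypothetical internal link graph: page -> pages it links to
internal_links = {
    "/": ["/category/shoes", "/blog"],
    "/category/shoes": ["/product/red-sneaker", "/product/blue-sneaker"],
    "/blog": ["/blog/how-to-choose-running-shoes"],
}

print(distance_from_index(internal_links))
# {'/': 0, '/category/shoes': 1, '/blog': 1, '/product/red-sneaker': 2, ...}
```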

In addition, the historical log data offered by a log analyzer can be invaluable. It helps make your SEO performance not only understandable but also predictable. Analyzing past interactions allows you to spot trends, anticipate future hiccups, and plan more effective SEO strategies.

With JetOctopus, you benefit from no volume limits on logs, enabling comprehensive analysis without the fear of missing out on crucial data. This approach is fundamental in continually refining your strategy and securing your site’s top spot in the fast-evolving landscape of search.

Real-World Wins Using Log Analyzer

Big websites in various industries have leveraged log analyzers to attain and maintain top spots on Google for profitable keywords, which has significantly contributed to their business growth.

For example, Skroutz, Greece’s biggest marketplace website with over 1 million sessions daily, set up a real-time crawl and log analyzer tool that helped them know things like:

  • Does Googlebot crawl pages that have more than two filters activated?
  • How extensively does Googlebot crawl a particularly popular category?
  • What are the main URL parameters that Googlebot crawls?
  • Does Googlebot visit pages with filters like “Size,” which are typically marked as nofollow?

The ability to see real-time visualization tables and historical log data spanning over ten months of Googlebot crawls enabled Skroutz to find crawling loopholes and decrease its index size, thus optimizing its crawl budget.

Eventually, they also saw a reduced time for new URLs to be indexed and ranked — instead of taking 2-3 months to index and rank new URLs, the indexing and ranking phase took only a few days.

This strategic approach to technical SEO using log files has helped Skroutz cement its position as one of the top 1,000 websites globally according to SimilarWeb, and the fourth most visited website in Greece (after Google, Facebook, and YouTube), with over 70% of its traffic coming from organic search.

Image created by JetOctopus, May 2024

Another case in point is DOM.RIA, Ukraine’s popular real estate and rental listing website, which doubled Googlebot visits by optimizing its website’s crawl efficiency. As its site structure is huge and elaborate, it needed to optimize crawl efficiency for Googlebot to ensure the freshness and relevance of the content appearing in Google.

Initially, they implemented a new sitemap to improve the indexing of deeper directories. Despite these efforts, Googlebot visits remained low.

By using JetOctopus to analyze their log files, DOM.RIA identified and addressed issues with their internal linking and DFI. They then created mini-sitemaps for poorly scanned directories (such as for a city, including URLs for streets, districts, metro, etc.) while assigning meta tags with links to pages that Googlebot often visits. This strategic change resulted in a more than twofold increase in Googlebot activity on these crucial pages within two weeks.

Image created by JetOctopus, May 2024

Getting Started With An SEO Log Analyzer

Now that you know what a log analyzer is and what it can do for big websites, let’s take a quick look at the steps involved in log analysis.

Here is an overview of using an SEO log analyzer like JetOctopus for your website:

  • Integrate Your Logs: Begin by integrating your server logs with a log analysis tool. This step is crucial for capturing all data related to site visits, which includes every request made to the server.
  • Identify Key Issues: Use the log analyzer to uncover significant issues such as server errors (5xx), slow load times, and other anomalies that could be affecting user experience and site performance. This step involves filtering and sorting through large volumes of data to focus on high-impact problems.
  • Fix the Issues: Once problems are identified, prioritize and address these issues to improve site reliability and performance. This might involve fixing broken links, optimizing slow-loading pages, and correcting server errors.
  • Combine with Crawl Analysis: Merge log analysis data with crawl data. This integration allows for a deeper dive into crawl budget analysis and optimization. Analyze how search engines crawl your site and adjust your SEO strategy to ensure that your most valuable pages receive adequate attention from search bots.

And that’s how you can ensure that search engines are efficiently indexing your most important content.
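
As a small illustration of step 2 (“Identify Key Issues”), the sketch below filters already-parsed log data for 5xx errors and slow responses served to Googlebot. The CSV file and its column names (path, status, response_time_ms, user_agent) are assumptions; a dedicated tool like JetOctopus does this at far larger scale.

```python
import pandas as pd

# Hypothetical CSV export of parsed log lines
logs = pd.read_csv("parsed_logs.csv")  # columns: path, status, response_time_ms, user_agent

bot_hits = logs[logs["user_agent"].str.contains("Googlebot", na=False)]

# Surface the high-impact problems first: server errors and slow pages
server_errors = bot_hits[bot_hits["status"] >= 500]
slow_pages = bot_hits[bot_hits["response_time_ms"] > 2000]

print("URLs returning 5xx to Googlebot:")
print(server_errors["path"].value_counts().head(20))

print("Slowest URLs crawled by Googlebot (avg ms):")
print(
    slow_pages.groupby("path")["response_time_ms"]
    .mean()
    .sort_values(ascending=False)
    .head(20)
)
```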

Conclusion

As you can see, the strategic use of log analyzers is more than just a technical necessity for large-scale websites. Optimizing your site’s crawl efficiency with a log analyzer can immensely impact your SERP visibility.

For CMOs managing large-scale websites, embracing a log analyzer and crawler toolkit like JetOctopus is like getting an extra tech SEO analyst that bridges the gap between SEO data integration and organic traffic growth.


Image Credits

Featured Image: Image by JetOctopus. Used with permission.

Google Data Leak Clarification via @sejournal, @martinibuster

Over the United States holiday weekend, some posts were shared about an alleged leak of Google ranking-related data. The first posts about the leak focused on “confirming” beliefs long held by Rand Fishkin, but not much attention was paid to the context of the information and what it really means.

Context Matters: Document AI Warehouse

The leaked document is related to a public Google Cloud platform called Document AI Warehouse, which is used for analyzing, organizing, searching, and storing data. The public documentation is titled “Document AI Warehouse overview.” A post on Facebook shares that the “leaked” data is the “internal version” of the publicly visible Document AI Warehouse documentation. That’s the context of this data.

Screenshot: Document AI Warehouse


@DavidGQuaid tweeted:

“I think its clear its an external facing API for building a document warehouse as the name suggests”

That seems to throw cold water on the idea that the “leaked” data represents internal Google Search information.

As far as we know at this time, the “leaked data” shares a similarity to what’s in the public Document AI Warehouse page.

Leak Of Internal Search Data?

The original post on SparkToro does not say that the data originates from Google Search. It says that the person who sent the data to Rand Fishkin is the one who made that claim.

One of the things I admire about Rand Fishkin is that he is meticulously precise in his writing, especially when it comes to caveats. Rand precisely notes that it’s the person who provided the data who makes the claim that the data originates from Google Search. There is no proof, only a claim.

He writes:

“I received an email from a person claiming to have access to a massive leak of API documentation from inside Google’s Search division.”

Fishkin himself does not affirm that the data was confirmed by ex-Googlers to have originated from Google Search. He writes that the person who emailed the data made that claim.

“The email further claimed that these leaked documents were confirmed as authentic by ex-Google employees, and that those ex-employees and others had shared additional, private information about Google’s search operations.”

Fishkin writes about a subsequent video meeting where the leaker revealed that his contact with ex-Googlers was in the context of meeting them at a search industry event. Again, we’ll have to take the leaker’s word for it about the ex-Googlers and that what they said came after carefully reviewing the data and not as an informal comment.

Fishkin writes that he contacted three ex-Googlers about it. What’s notable is that those ex-Googlers did not explicitly confirm that the data is internal to Google Search. They only confirmed that the data looks like it resembles internal Google information, not that it originated from Google Search.

Fishkin writes what the ex-Googlers told him:

  • “I didn’t have access to this code when I worked there. But this certainly looks legit.”
  • “It has all the hallmarks of an internal Google API.”
  • “It’s a Java-based API. And someone spent a lot of time adhering to Google’s own internal standards for documentation and naming.”
  • “I’d need more time to be sure, but this matches internal documentation I’m familiar with.”
  • “Nothing I saw in a brief review suggests this is anything but legit.”

Saying something originates from Google Search and saying that it originates from Google are two different things.

Keep An Open Mind

It’s important to keep an open mind about the data because there is a lot about it that is unconfirmed. For example, it is not known if this is an internal Search Team document. Because of that, it is probably not a good idea to take anything from this data as actionable SEO advice.

Also, it’s not advisable to analyze the data to specifically confirm long-held beliefs. That’s how one becomes ensnared in Confirmation Bias.

A definition of Confirmation Bias:

“Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one’s prior beliefs or values.”

Confirmation bias can lead a person to deny things that are empirically true. For example, there is the decades-old idea that Google automatically keeps a new site from ranking, a theory called the Sandbox. People report every day that their new sites and new pages rank in the top ten of Google search almost immediately.

But if you are a hardened believer in the Sandbox then actual observable experience like that will be waved away, no matter how many people observe the opposite experience.

Brenda Malone, Freelance Senior SEO Technical Strategist and Web Developer (LinkedIn profile), messaged me about claims about the Sandbox:

“I personally know, from actual experience, that the Sandbox theory is wrong. I just indexed in two days a personal blog with two posts. There is no way a little two post site should have been indexed according to the Sandbox theory.”

The takeaway here is that if the documentation turns out to originate from Google Search, the incorrect way to analyze the data is to go hunting for confirmation of long-held beliefs.

What Is The Google Data Leak About?

There are five things to consider about the leaked data:

  1. The context of the leaked information is unknown. Is it Google Search related? Is it for other purposes?
  2. The purpose of the data. Was the information used for actual search results? Or was it used for data management or manipulation internally?
  3. Ex-Googlers did not confirm that the data is specific to Google Search. They only confirmed that it appears to come from Google.
  4. Keep an open mind. If you go hunting for vindication of long-held beliefs, guess what? You will find them, everywhere. This is called confirmation bias.
  5. Evidence suggests that data is related to an external-facing API for building a document warehouse.

What Others Say About “Leaked” Documents

Ryan Jones, someone who not only has deep SEO experience but also a formidable understanding of computer science, shared some reasonable observations about the so-called data leak.

Ryan tweeted:

“We don’t know if this is for production or for testing. My guess is it’s mostly for testing potential changes.

We don’t know what’s used for web or for other verticals. Some things might only be used for a Google home or news etc.

We don’t know what’s an input to a ML algo and what’s used to train against. My guess is clicks aren’t a direct input but used to train a model how to predict clickability. (Outside of trending boosts)

I’m also guessing that some of these fields only apply to training data sets and not all sites.

Am I saying Google didn’t lie? Not at all. But let’s examine this leak objectionably and not with any preconceived bias.”

@DavidGQuaid tweeted:

“We also don’t know if this is for Google search or Google cloud document retrieval

APIs seem pick & choose – that’s not how I expect the algorithm to be run – what if an engineer wants to skip all those quality checks – this looks like I want to build a content warehouse app for my enterprise knowledge base”

Is The “Leaked” Data Related To Google Search?

At this point in time there is no hard evidence that this “leaked” data is actually from Google Search. There is an overwhelming amount of ambiguity about what the purpose of the data is. Notable is that there are hints that this data is just “an external facing API for building a document warehouse as the name suggests” and not related in any way to how websites are ranked in Google Search.

The conclusion that this data did not originate from Google Search is not definitive at this time but it’s the direction that the wind of evidence appears to be blowing.

Featured Image by Shutterstock/Jaaak

Google Search Leak: Conflicting Signals, Unanswered Questions via @sejournal, @MattGSouthern

An apparent leak of Google Search API documentation has sparked intense debate within the SEO community, with some claiming it proves Google’s dishonesty and others urging caution in interpreting the information.

As the industry grapples with the allegations, a balanced examination of Google’s statements and the perspectives of SEO experts is crucial to understanding the whole picture.

Leaked Documents Vs. Google’s Public Statements

Over the years, Google has consistently maintained that specific ranking signals, such as click data and user engagement metrics, aren’t used directly in its search algorithms.

In public statements and interviews, Google representatives have emphasized the importance of relevance, quality, and user experience while denying the use of specific metrics like click-through rates or bounce rates as ranking-related factors.

However, the leaked API documentation appears to contradict these statements.

It contains references to features like “goodClicks,” “badClicks,” “lastLongestClicks,” impressions, and unicorn clicks, tied to systems called Navboost and Glue, which Google VP Pandu Nayak confirmed in DOJ testimony are parts of Google’s ranking systems.

The documentation also alleges that Google calculates several metrics using Chrome browser data on individual pages and entire domains, suggesting the full clickstream of Chrome users is being leveraged to influence search rankings.

This contradicts past Google statements that Chrome data isn’t used for organic searches.

The Leak’s Origins & Authenticity

Erfan Azimi, CEO of digital marketing agency EA Eagle Digital, alleges he obtained the documents and shared them with Rand Fishkin and Mike King.

Azimi claims to have spoken with ex-Google Search employees who confirmed the authenticity of the information but declined to go on record due to the situation’s sensitivity.

While the leak’s origins remain somewhat ambiguous, several ex-Googlers who reviewed the documents have stated they appear legitimate.

Fishkin states:

“A critical next step in the process was verifying the authenticity of the API Content Warehouse documents. So, I reached out to some ex-Googler friends, shared the leaked docs, and asked for their thoughts.”

Three ex-Googlers responded, with one stating, “It has all the hallmarks of an internal Google API.”

However, without direct confirmation from Google, the authenticity of the leaked information is still debatable. Google has not yet publicly commented on the leak.

It’s important to note that, according to Fishkin’s article, none of the ex-Googlers confirmed that the leaked data was from Google Search. Only that it appears to have originated from within Google.

Industry Perspectives & Analysis

Many in the SEO community have long suspected that Google’s public statements don’t tell the whole story. The leaked API documentation has only fueled these suspicions.

Fishkin and King argue that if the information is accurate, it could have significant implications for SEO strategies and website search optimization.

Key takeaways from their analysis include:

  • Navboost and the use of clicks, CTR, long vs. short clicks, and user data from Chrome appear to be among Google’s most powerful ranking signals.
  • Google employs safelists for sensitive topics like COVID-19, elections, and travel to control what sites appear.
  • Google uses Quality Rater feedback and ratings in its ranking systems, not just as a training set.
  • Click data influences how Google weights links for ranking purposes.
  • Classic ranking factors like PageRank and anchor text are losing influence compared to more user-centric signals.
  • Building a brand and generating search demand is more critical than ever for SEO success.

However, just because something is mentioned in API documentation doesn’t mean it’s being used to rank search results.

Other industry experts urge caution when interpreting the leaked documents.

They point out that Google may use the information for testing purposes or apply it only to specific search verticals rather than use it as active ranking signals.

There are also open questions about how much weight these signals carry compared to other ranking factors. The leak doesn’t provide the full context or algorithm details.

Unanswered Questions & Future Implications

As the SEO community continues to analyze the leaked documents, many questions still need to be answered.

Without official confirmation from Google, the authenticity and context of the information are still a matter of debate.

Key open questions include:

  • How much of this documented data is actively used to rank search results?
  • What is the relative weighting and importance of these signals compared to other ranking factors?
  • How have Google’s systems and use of this data evolved?
  • Will Google change its public messaging and be more transparent about using behavioral data?

As the debate surrounding the leak continues, it’s wise to approach the information with a balanced, objective mindset.

Unquestioningly accepting the leak as gospel truth or completely dismissing it are both shortsighted reactions. The reality likely lies somewhere in between.

Potential Implications For SEO Strategies and Website Optimization

It would be highly inadvisable to act on information shared from this supposed ‘leak’ without confirming whether it’s an actual Google search document.

Further, even if the content originates from search, the information is a year old and could have changed. Any insights derived from the leaked documentation should not be considered actionable now.

With that in mind, while the full implications remain unknown, here’s what we can glean from the leaked information.

1. Emphasis On User Engagement Metrics

If click data and user engagement metrics are direct ranking factors, as the leaked documents suggest, it could place greater emphasis on optimizing for these metrics.

This means crafting compelling titles and meta descriptions to increase click-through rates, ensuring fast page loads and intuitive navigation to reduce bounces, and strategically linking to keep users engaged on your site.

Driving traffic through other channels like social media and email can also help generate positive engagement signals.

However, it’s important to note that optimizing for user engagement shouldn’t come at the expense of creating reader-focused content. Gaming engagement metrics is unlikely to be a sustainable, long-term strategy.

Google has consistently emphasized the importance of quality and relevance in its public statements, and based on the leaked information, this will likely remain a key focus. Engagement optimization should support and enhance quality content, not replace it.

2. Potential Changes To Link-Building Strategies

The leaked documents contain information about how Google treats different types of links and their impact on search rankings.

This includes details about the use of anchor text, the classification of links into different quality tiers based on traffic to the linking page, and the potential for links to be ignored or demoted based on various spam factors.

If this information is accurate, it could influence how SEO professionals approach link building and the types of links they prioritize.

Links that drive real click-throughs may carry more weight than links on rarely visited pages.

The fundamentals of good link building still apply—create link-worthy content, build genuine relationships, and seek natural, editorially placed links that drive qualified referral traffic.

The leaked information doesn’t change this core approach but offers some additional nuance to be aware of.

3. Increased Focus On Brand Building and Driving Search Demand

The leaked documents suggest that Google uses brand-related signals and offline popularity as ranking factors. This could include metrics like brand mentions, searches for the brand name, and overall brand authority.

As a result, SEO strategies may emphasize building brand awareness and authority through both online and offline channels.

Tactics could include:

  • Securing brand mentions and links from authoritative media sources.
  • Investing in traditional PR, advertising, and sponsorships to increase brand awareness.
  • Encouraging branded searches through other marketing channels.
  • Optimizing for higher search volumes for your brand vs. unbranded keywords.
  • Building engaged social media communities around your brand.
  • Establishing thought leadership through original research, data, and industry contributions.

The idea is to make your brand synonymous with your niche and build an audience that seeks you out directly. The more people search for and engage with your brand, the stronger those brand signals may become in Google’s systems.

4. Adaptation To Vertical-Specific Ranking Factors

Some leaked information suggests that Google may use different ranking factors or algorithms for specific search verticals, such as news, local search, travel, or e-commerce.

If this is the case, SEO strategies may need to adapt to each vertical’s unique ranking signals and user intents.

For example, local search optimization may focus more heavily on factors like Google My Business listings, local reviews, and location-specific content.

Travel SEO could emphasize collecting reviews, optimizing images, and directly providing booking/pricing information on your site.

News SEO requires focusing on timely, newsworthy content and optimized article structure.

While the core principles of search optimization still apply, understanding your particular vertical’s nuances, based on the leaked information and real-world testing, can give you a competitive advantage.

The leaks suggest a vertical-specific approach to SEO could give you an advantage.

Conclusion

The Google API documentation leak has created a vigorous discussion about Google’s ranking systems.

As the SEO community continues to analyze and debate the leaked information, it’s important to remember a few key things:

  1. The information isn’t fully verified and lacks context. Drawing definitive conclusions at this stage is premature.
  2. Google’s ranking algorithms are complex and constantly evolving. Even if entirely accurate, this leak only represents a snapshot in time.
  3. The fundamentals of good SEO – creating high-quality, relevant, user-centric content and promoting it effectively – still apply regardless of the specific ranking factors at play.
  4. Real-world testing and results should always precede theorizing based on incomplete information.

What To Do Next

As an SEO professional, the best course of action is to stay informed about the leak.

Because details about the document remain unknown, it’s not a good idea to consider any takeaways actionable.

Most importantly, remember that chasing algorithms is a losing battle.

The only winning strategy in SEO is to make your website the best result for your message and audience. That’s Google’s endgame, and that’s where your focus should be, regardless of what any particular leaked document suggests.

The Traffic Impact Of AI Overviews via @sejournal, @Kevin_Indig


Two weeks after Google rolled out AI Overviews (AIOs), we can analyze early data to gauge the impact on organic traffic.

I wanted to understand the impact and implications of AIOs, especially after many reports of misinformation and sometimes harmful recommendations from Google’s AI answers.

I found that AIOs can hurt organic traffic, especially when queries indicate that users want quick answers. However, there might be a chance that some AIOs deliver more traffic to cited sites.

When AIOs Show Up

I used ZipTie to crawl the search results for 1,675 queries in the health vertical on May 22 to understand how, when, and maybe why Google shows AIOs.

I mixed ZipTie’s data with Search Console and Ahrefs to understand the implications for organic traffic, the influence of backlinks, and estimated domain traffic.

42% Of Queries Show AIOs

AIOs showed up 42% of the time (704/1,675 queries), which is much more than the 16% found in ecommerce by ZipTie or the 15% found by SEO Clarity.

The higher rate makes sense since Google announced to show AIOs for complex queries, which are more likely to occur in health than ecommerce.

I found a weak relationship with the number of words, indicating that longer queries are more likely to trigger AIOs. Yet, I’m surprised to find so many AIOs in a sensitive space like health, where wrong information is so much more risky.

Bar graph. Image Credit: Kevin Indig

There seems to be no relationship with keyword difficulty or search volume, even though the domain I have GSC access to is more likely to be cited for low-difficulty queries.

AIOs are more likely to show up alongside People Also Ask (PAA), Featured Snippets (FS), and Discussions & Forums modules (D&F), which makes sense since those SERP features indicate informational searches.

Knowledge Panels and Top Ads showed no correlation, but also showed up less often than other SERP features.

Bar graph displaying the correlation between AIOs and SERP features. Image Credit: Kevin Indig

Though correlations are weak, the data indicates that AIOs are more likely to appear for queries related to questions and comparisons.

Bar chart showing the correlation between AIOs and query syntax. Image Credit: Kevin Indig

Who Shows Up In AIOs

The 704 AIOs in the dataset cited 4,493 sites, which is an average of 6.3 links per AIO.

I found a very strong correlation (0.917) between sites that show up in the top 10 organic results and sites cited in AIOs for the 1,675 keywords.

Surprisingly, Reddit (No. 92 on the list) and Quora (No. 17) barely contributed to citations.

Table showing various domains, the number of times each domain is cited, and their rankings in the top 10 organic search results. Image Credit: Kevin Indig

I found no strong relationships between domains that get a lot of AIO citations and their organic traffic, backlinks (referring domains), or ranking keywords.

Sites that rank well for keywords are more likely to be cited in AIOs, but it’s still unclear when Google decides to cite a site that doesn’t rank well for a keyword.

Traffic Impact Of AIOs

The most important question is how AIOs impact organic traffic of cited and uncited URLs.

To get to the bottom of things, I compared organic clicks from Search Console for a domain across 1,675 non-branded keywords (US) in the week of May 7 with the week of May 14.

After excluding low-traffic keywords, 560 AIOs showed up for 1,344 keywords, of which the target domain was cited 171 times with 461 different URLs.

To make sure rank changes don’t influence the results, I excluded keywords with a rank change greater than 0.9 or lower than -0.9, after which 52/521 URLs remained.

I found a strong negative correlation of -0.61 between cited URLs and traffic change, indicating that AIO citations send fewer clicks to cited URLs. In this case, the domain received -8.9% fewer clicks when cited in AIOs.
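
For readers who want to replicate this kind of comparison, here is a minimal sketch, assuming a hypothetical CSV that joins Search Console clicks for the two weeks with the AIO citation data and rank changes; the file and column names are assumptions, not the author’s actual dataset.

```python
import pandas as pd

# Hypothetical export: one row per keyword/URL with clicks for both weeks,
# the rank change between the weeks, and whether the URL was cited in an AIO (0/1).
df = pd.read_csv("aio_keywords.csv")

df["traffic_change"] = (
    df["clicks_week_may14"] - df["clicks_week_may7"]
) / df["clicks_week_may7"]

# Exclude keywords where a ranking shift could explain the click change
stable = df[df["rank_change"].between(-0.9, 0.9)]

# Pearson correlation between AIO citation and relative traffic change
print(stable["cited_in_aio"].corr(stable["traffic_change"]))
```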

Scatter plot showing the correlation between AI citations and traffic impact. Image Credit: Kevin Indig

However, results can vary by user intent. Most URLs that lost clicks, most likely due to AIOs, targeted questions like “how to get viagra without prescription.”

AIOs seem comparable to Featured Snippets, which can send more or fewer clicks based on whether the keyword is shallow or complex.

Also, note that a big chunk of the traffic losses was caused by a few URLs. Correlating the data without outlier URLs actually resulted in a slightly positive correlation of 0.1, indicating that there might be a chance that some AIOs send more traffic to URLs based on whether users want more information.

When AIOs show up, and a site isn’t cited, I found an average loss of -2.8% organic traffic, indicating that users might still click organic results. But to really make sure, we have to measure the impact on organic clicks for the same keywords. Take this result with a grain of salt.

Caveats

  • AIOs can change, and it’s not clear how often. There does not seem to be a normal change rate over time, but it seems Google has pulled back a lot of AIOs. I wasn’t able to recreate most AIOs five days after the initial crawl. Google might have pulled back due to the numerous reports of misleading and sometimes harmful answers.
  • 1,675 queries are a good start, but we need 100 times as many to make more robust statements.
  • We also need a lot more traffic data than n=1.

Canary In The Coal Mine

In aggregate, strong SEO performance seems to be the best way to appear in AI Overviews, even though Google will also cite URLs that don’t perform well. We still don’t know enough about the content features that make a page more likely to be cited in AIOs.

Broadly speaking, AI overviews have a bigger impact than “just” Google Search. They’re the first AI feature that significantly changes the cash cow of one of the most valuable companies in the world.

AI has been undergoing a massive hype cycle since the launch of ChatGPT in November 2022. Lately, we’ve been asking ourselves more and more how big the actual incremental value of AI is.

AIOs are the canary in the coal mine. They could prove the value of AI in products or even pop the AI bubble we’re in.


X (Twitter) of Bartosz Góralewicz

AI Overviews: Measuring the Impact on SEO


Featured Image: Paulo Bobita/Search Engine Journal

New WordPress Plugin Solves Site Navigation Problem via @sejournal, @martinibuster

Joost de Valk, the creator of Yoast SEO plugin, has created a new (and free) plugin for solving a site architecture problem that can silently diminish a website’s ability to rank.

Site Architecture

Site architecture is an important SEO factor because a well-organized website with clear navigation helps users quickly get to the content and products they’re looking for. Along the way it also helps Google find the most important pages and rank them.

The normal and common sense way to organize a website is by topic categories. While some newbie-SEOs believe that organizing a site by topic is an SEO strategy, it’s really just plain old common sense. Organizing a site by topic categories organizes a site in a way that makes it easy to drill-down and find specific things.

Tags: Contextual Site Navigation

Another way to organize a website is through contextual navigation. Contextual navigation is a way to offer a site visitor links to more webpages that are relevant to the webpage and to their interests in the moment. The way to provide a contextual link is through the concept of Tags. Tags are strongly relevant links to content that site visitors may find interesting.

For example, if someone is on a webpage about a new song by a pop star, they may in that moment be interested in reading more articles about that singer. A publisher can create a tag that links to a page collecting every article about that specific pop singer. Ordinarily, it doesn’t make sense to create an entire category for hundreds of musical artists because that would defeat the purpose of a hierarchical site navigation (which is to make it easy to find content).

Tags solve the problem of making it easy to navigate to more content that one site visitor is specifically interested in at that moment. It’s contextually relevant navigation.

Too Many Good Things Isn’t Always Good

Creating a long-range plan for organizing a website can be undone by time as a website grows and trends wane. An artist that was trending several years ago may have dropped out of favor (as they often do) and people lose interest. But those tags remain, linking to content that isn’t important anymore, defeating the purpose of internal site navigation, which is to link to the most important content.

Joost de Valk researched a (very small) sample of WordPress sites and discovered that about two-thirds of the websites contained overlapping tags: multiple tags linking to the same content while also generating thin content pages, which are webpages with little value.

A blog post sharing his findings noted:

“Tags are not used correctly in WordPress. Approximately two-thirds of WordPress websites using tags are using (way) too many tags. This has significant consequences for a site’s chances in the search engines – especially if the site is large. WordPress websites use too many tags, often forget to display them on their site, and the tag pages do not contain any unique content.”

The sample size was small and a reasonable argument can be made that his findings aren’t representative of most WordPress sites. But the fact remains that websites can be burdened by overlapping and outdated tags.

Here are the three main tag navigation problems that Joost identified:

1. Too Many Tags
He found that some publishers add a tag to an article expecting to add more articles to that tag when those articles are written, which in many cases doesn’t happen, resulting in tags that link to just a few articles, sometimes only one.

2. Some Themes Are Missing The Tag Functionality
The next issue happens when websites upgrade to a new theme (or a new version of a theme) that doesn’t have the tag functionality. This creates orphaned tag pages: pages that site visitors can’t reach because the links to those tag pages are missing. But because those pages still exist, search engines will find them through the autogenerated XML sitemaps.

3. Tag Pages Can Become Thin Content
The third issue is that many publishers don’t take the time to add meaningful content to tag pages; they’re just pages of links with article excerpts that are also reproduced on category pages.

Use Fewer Tags

This is where Joost de Valk’s new WordPress plugin comes in handy. It automatically removes tags that aren’t linking to enough pages, which helps normalize internal linking. The new plugin is called the Fewer Tags WordPress Plugin. There’s a free version and a paid Pro version.

The free version of the plugin works automatically to remove all tag pages that contain fewer than ten posts, a threshold that can be adjusted to remove pages with five posts or fewer.

The added functionality of the Pro version allows greater control over tag management, so a publisher can merge tag pages, automatically create redirects, or send a 404 Page Not Found server response.

Here is the list of benefits for the Pro version:

  • “Merge & delete unneeded tag pages quickly & easily.
  • Creates redirects for removed tag pages on the fly, in your SEO plugin of choice.
  • Includes an online course in which Joost explains what you should do!
  • Fix a site’s tag issues long-term!
  • Uninstall the plugin when you’re done!”

Where To Download Fewer Tags Plugin

The free version of the plugin can be downloaded here:

Fewer Tags Free By Joost de Valk

Read more about the Pro version here.

Featured Image by Shutterstock/Simple Line

Google Is Now Indexing EPUB Files via @sejournal, @martinibuster

Google announced that it is now indexing .epub documents, a format commonly used for publishing books for e-readers. Google is already showing EPUB books in the search index.

EPUB File Format

EPUB is an XML-based ebook publishing format based on a standard developed by the International Digital Publishing Forum, which merged with the World Wide Web Consortium (W3C) in 2016. The goal of the merger was to bring together electronic book publishing with the Internet so that they would mutually enrich each other.

Google Indexing EPUB Content

The intent of merging e-publishing with the Internet aligns with Google’s decision to index (and at some point presumably rank) EPUB content. The only surprise should be that it took eight years to do so. The changelog notes that the EPUB file format was added to Google’s documentation of indexable file types and offers no other details.

Google’s official changelog offers a matter-of-fact notation:

“Adding epub to indexable file types

What: Added EPUB to the list of indexable file types.

Why: Google Search now supports epub.”

Does Google Rank EPUB Content?

I did a site: search for EPUB content and noted the title of a research paper about eating contaminated fish in Lake Ontario (“Consumption of Contaminated Lake Fish and Reproduction”) that was hosted on the journals.lww.com domain.

I next searched for that document in regular search using the exact-match keyword phrase and a variation of the keyword phrase (“Consumption of Contaminated Fish in Lake Ontario”). Google didn’t surface the EPUB document, but it did surface the webpage that contained the download link to the EPUB document.

Screenshot Of EPUB Download Page

Google’s official indexable file type documentation only notes that the listed file types are indexable. At this time, it’s fair to say that Google isn’t ranking EPUB documents, but it will surface them with a filetype:epub search.

Read Google’s official documentation:

File types indexable by Google

Featured Image by Shutterstock/Simple Line

Google’s AI Overviews Shake Up Ecommerce Search Visibility via @sejournal, @MattGSouthern

An analysis of 25,000 ecommerce queries by Bartosz Góralewicz, founder of Onely, reveals the impact of Google’s AI overviews on search visibility for online retailers.

The study found that 16% of eCommerce queries now return an AI overview in search results, accounting for 13% of total search volume in this sector.

Notably, 80% of the sources listed in these AI overviews do not rank organically for the original query.

“Ranking #1-3 gives you only an 8% chance of being a source in AI overviews,” Góralewicz stated.

Shift Toward “Accelerated” Product Experiences

International SEO consultant Aleyda Solis analyzed the disconnect between traditional organic ranking and inclusion in AI overviews.

According to Solis, for product-related queries, Google is prioritizing an “accelerated” approach over summarizing currently ranking pages.

She commented on Góralewicz’s findings, stating:

“… rather than providing high level summaries of what’s already ranked organically below, what Google does with e-commerce is “accelerate” the experience by already showcasing what the user would get next.”

Solis explains that for queries where Google previously ranked category pages, reviews, and buying guides, it’s now bypassing this level of results with AI overviews.

Assessing AI Overview Traffic Impact

To help retailers evaluate their exposure, Solis has shared a spreadsheet that analyzes the potential traffic impact of AI overviews.

As Góralewicz notes, this could be an initial rollout, speculating that “Google will expand AI overviews for high-cost queries when enabling ads” based on data showing they are currently excluded for high cost-per-click keywords.

An in-depth report across ecommerce and publishing is expected soon from Góralewicz and Onely, with additional insights into this search trend.

Why SEJ Cares

AI overviews represent a shift in how search visibility is achieved for ecommerce websites.

With most overviews currently pulling product data from non-ranking sources, the traditional connection between organic rankings and search traffic is being disrupted.

Retailers may need to adapt their SEO strategies for this new search environment.

How This Can Benefit You

While unsettling for established brands, AI overviews create new opportunities for retailers to gain visibility without competing for the most commercially valuable keywords.

Ecommerce sites can potentially circumvent traditional ranking barriers by optimizing product data and detail pages for Google’s “accelerated” product displays.

The detailed assessment framework provided by Solis enables merchants to audit their exposure and prioritize optimization needs accordingly.


FAQ

What are the key findings from the analysis of AI overviews & ecommerce queries?

Góralewicz’s analysis of 25,000 ecommerce queries found:

  • 16% of ecommerce queries now return an AI overview in the search results.
  • 80% of the sources listed in these AI overviews do not rank organically for the original query.
  • Ranking in positions #1-3 provides only an 8% chance of being a source in AI overviews.

These insights reveal significant shifts in how ecommerce sites need to approach search visibility.

Why are AI overviews pulling product data from non-ranking sources, and what does this mean for retailers?

Google’s AI overviews prioritize “accelerated” experiences over summarizing currently ranked pages for product-related queries.

This shift focuses on directly showcasing what users are looking for rather than pointing them to traditional organic results.

For retailers, this means:

  • A need to optimize product pages beyond traditional SEO practices, catering to the data requirements of AI overviews.
  • Opportunities to gain visibility without necessarily holding top organic rankings.
  • Potential to bypass traditional ranking barriers by focusing on enhanced product data integration.

Retailers must adapt quickly to remain competitive in this evolving search environment.

What practical steps can retailers take to evaluate and improve their search visibility in light of AI overview disruptions?

Retailers can take several practical steps to evaluate and improve their search visibility:

  • Utilize the spreadsheet provided by Aleyda Solis to assess the potential traffic impact of AI overviews.
  • Optimize product and detail pages to align with the data and presentation style preferred by AI overviews.
  • Continuously monitor changes and updates to AI overviews, adapting strategies based on new data and trends.

These steps can help retailers navigate the impact of AI overviews and maintain or improve their search visibility.


Featured Image: Marco Lazzarini/Shutterstock

Google Rolls Out Search Profile Feature For Reviewers via @sejournal, @MattGSouthern

Google is rolling out a new social profile feature that will allow you to view, manage, and share written reviews across Google’s various platforms.

This feature was announced via an email to people who have contributed reviews to Google Search.

In the email notification, Google states that the primary purpose of this feature is to “make your reviews more helpful to others.”

Please note that this is separate from users’ Google Maps review history, which is already public.

Centralized Review Management

Google’s search reviews profile, accessible at profile.google.com, is a centralized hub where you can see all the reviews you’ve previously contributed, including reviews for TV shows, movies, and other content.

This new feature provides a more seamless experience for viewing, updating, and deleting past reviews.

Private Initially, Public Soon

Currently, these profiles are visible only to the individual users themselves.

Starting June 24th, other Google users will be able to view your profile and written reviews by tapping your name or picture on any published review.

Privacy Considerations

By allowing users to access and explore each other’s review histories, Google is making the review ecosystem within its platforms more transparent.

While the profile will make your written reviews publicly accessible, Google has assured that personal details from individual Google Accounts, such as birthdays, won’t be displayed.

If you prefer not to have a public profile, you’ll have the option to delete it.

Why SEJ Cares

This centralized profile could be a helpful way to evaluate the credibility and consistency of reviewers, potentially influencing purchasing decisions.

Conversely, creators may need to adapt their review management strategies to account for the potential impact of individual reviewers.

As the June 24th rollout date approaches, expect to see this new feature integrated into the search experience.

How This Can Benefit You

If you actively contribute reviews on Google’s platforms, this increased visibility may enhance your influence and result in greater recognition within your area of expertise.

For creators, the ability to investigate reviewer profiles could help identify and address potentially misleading or fraudulent reviews, fostering a more trustworthy review ecosystem.

On the other hand, it may necessitate a more proactive approach to monitoring and responding to critical reviews, as they will now be more easily accessible to potential customers.


FAQ

What is the search reviews profile feature introduced by Google?

Google introduced a new type of social profile that allows users to view, manage, and share their written reviews across various platforms.

This feature aims to make users’ reviews more helpful by centralizing them in one hub, making it easier to view, update, or delete past reviews. Initially private, these profiles will become visible to other users starting June 24th.

How will individual reviewer profiles impact online marketers?

This feature adds a layer of transparency to the review ecosystem. Online marketers might use these profiles to assess the credibility and consistency of reviewers, which can inform their strategies for managing customer feedback.

For reviewers, increased visibility can enhance their reputations, potentially influencing purchasing decisions and improving their authority in specific niches.

What are the key benefits of the new Google profiles for active review contributors?

Active review contributors stand to benefit from increased visibility and recognition. Their reviews will be easily accessible, enhancing their influence as trusted reviewers.

This can be particularly advantageous for users whose reviews focus on specific domains, as it may lead to greater acknowledgment and trust from the community.


Featured Image: BestForBest/Shutterstock

Google Confirms: No Algorithmic Actions For Site Reputation Abuse Yet via @sejournal, @MattGSouthern

Google’s Search Liaison, Danny Sullivan, has confirmed that the search engine hasn’t launched algorithmic actions targeting site reputation abuse.

This clarification addresses speculation within the SEO community that recent traffic drops are related to Google’s previously announced policy update.

Sullivan Says No Update Rolled Out

Lily Ray, an SEO professional, shared a screenshot on Twitter showing a significant drop in traffic for the website Groupon starting on May 6.

Ray suggested this was evidence that Google had begun rolling out algorithmic penalties for sites violating the company’s site reputation abuse policy.

However, Sullivan quickly stepped in, stating:

“We have not gone live with algorithmic actions on site reputation abuse. I well imagine when we do, we’ll be very clear about that. Publishers seeing changes and thinking it’s this — it’s not — results change all the time for all types of reasons.”

Sullivan added that when the actions are rolled out, they will only impact specific content, not entire websites.

This is an important distinction, as it suggests that even if some of a site’s pages are penalized, the rest of the domain can continue to rank normally.

Background On Google’s Site Reputation Abuse Policy

Earlier this year, Google announced a new policy to combat what it calls “site reputation abuse.”

This refers to situations where third-party content is published on authoritative domains with little oversight or involvement from the host site.

Examples include sponsored posts, advertorials, and partner content that is loosely related to or unrelated to a site’s primary purpose.

Under the new policy, Google is taking manual action against offending pages and plans to incorporate algorithmic detection.

What This Means For Publishers & SEOs

While Google hasn’t launched any algorithmic updates related to site reputation abuse, the manual actions have publishers on high alert.

Those who rely heavily on sponsored content or partner posts to drive traffic should audit their sites and remove any potential policy violations.

Sullivan’s confirmation that algorithmic changes haven’t occurred may provide temporary relief.

His statements also serve as a reminder that significant ranking fluctuations can happen at any time due to various factors, not just specific policy rollouts.


FAQ

Will Google’s future algorithmic actions impact entire websites or specific content?

When Google eventually rolls out algorithmic actions for site reputation abuse, these actions will target specific content rather than the entire website.

This means that if certain pages are found to be in violation, only those pages will be affected, allowing other parts of the site to continue ranking normally.

What should publishers and SEOs do in light of Google’s site reputation abuse policy?

Publishers and SEO professionals should audit their sites to identify and remove any content that may violate Google’s site reputation abuse policy.

This includes sponsored posts and partner content that doesn’t align with the site’s primary purpose. Taking these steps can mitigate the risk of manual penalties from Google.

What is the context of the recent traffic drops seen in the SEO community?

Google claims the recent drops for coupon sites aren’t linked to any algorithmic actions for site reputation abuse. Traffic fluctuations can occur for various reasons and aren’t always linked to a specific algorithm update.


Featured Image: sockagphoto/Shutterstock