Google vs Microsoft Bing: A Detailed Comparison Of Two Search Engines via @sejournal, @wburton27

Between Google and Bing, which search engine should you focus on? Should you focus on both or prioritize one over the other?

Google is still the world’s most popular search engine and the dominant app store player, but things are changing quickly in an AI-driven world.

With the rise of artificial intelligence, and with both Bing and Google incorporating AI – i.e., Microsoft Copilot powered by OpenAI’s GPT-4, Bing Chat, and Google Gemini – into their algorithms and search engine results pages (SERPs), things are changing fast.

Let’s explore.

Google Vs. Microsoft Bing Market Share

One of the first distinctions between Microsoft Bing and Google is market share. According to Statcounter, in the US:

  • Google fell to 86.58%, down from 86.94% in March and 88.88% YoY.
  • Microsoft Bing grew to 8.24%, up from 8.04% in March and up from 6.43% YoY.
  • Yahoo grew to 2.59%, up from 2.48% in March and up from 2.33% YoY.

That’s a pretty big deal: Bing is growing while Google’s share is shrinking.

Globally

Google had a 91.05% search market share in June 2024, according to Statcounter’s revised data, which is down from 91.38% in March and 92.82% YoY. Google’s highest search market share during the past 12 months, globally, was 93.11% last May.

While that may make it tempting to focus on Google alone, Microsoft Bing provides good conversions and has a user base that shouldn’t be ignored. Bing’s usage has grown because of the AI-powered feature Bing Chat, which has attracted new users.

Bing is also used by digital assistants such as Alexa and Cortana.

Bing has around 100 million daily active users, which is a number you can’t ignore. It’s particularly important to optimize for Bing if you’re targeting an American audience. In fact, 28.3% of online queries in the U.S. are powered by Microsoft properties when you factor in Yahoo and voice searches.

Some have wondered over the years whether Bing is an acronym for “Because It’s Not Google.” I’m not sure how true that is, but the name was introduced in 2009, when Microsoft rebranded its predecessor, Live Search.

Another fun tidbit is that Ahrefs recently did a study on the Top 100 Bing searches globally, and the #1 query searched was [Google].

Comparing Google Vs. Microsoft Bing’s Functionality

From a search functionality perspective, the two search engines are similar, but Google offers more core features:

| Feature | Google | Microsoft Bing |
| --- | --- | --- |
| Text Search | Yes | Yes |
| Video Search | Yes | Yes |
| Image Search | Yes | Yes |
| Maps | Yes | Yes |
| News | Yes | Yes |
| Shopping | Yes | Yes |
| Books | Yes | No |
| Flights | Yes | No |
| Finance | Yes | No |
| Scholarly Literature | Yes | No |

Comparing AI Functionality

| Feature | Google | Bing |
| --- | --- | --- |
| AI Accuracy | Prone to errors | More accurate since it is based on OpenAI’s GPT-4 |
| Integration | Google Workspace | Microsoft 365 apps (Word, PowerPoint, Excel, etc.) |
| Image Generation | Handles complex image prompts better than Copilot | Allows users to use existing images as prompts for modifications, a feature not in Gemini |
| Knowledge Base | Accesses up-to-date info and has access to the web | Copilot may lag due to potentially outdated databases |
| Summaries | Provides concise summaries for content within Google’s ecosystem, i.e., YouTube videos or emails | Good at summarizing meetings and writing emails, etc. |
| Context Window | Significantly larger context window of 2 million tokens (or up to 10 million for researchers), allowing it to process much more information at once | Microsoft Copilot (using GPT-4) has a context window of up to 100,000 tokens |
| AI in Results | Yes (AI Overviews) | Yes |
| Focus | Research | Business and customer service applications |
| Pricing | Similar | Similar |

How Google & Microsoft Bing Differ In Size Of Index And Crawling

Google says:

“The Google Search index contains hundreds of billions of webpages and is well over 100,000,000 gigabytes in size.”

Even so, not even Google can crawl the entire web. That is just not going to happen.

This is why using structured data is so important, especially now with AI overviews. It provides a data feed about your content so Google can understand it better, which can help you qualify for rich results and get more clicks and impressions.

Microsoft Bing hasn’t released similar figures. However, one search engine index size estimate puts the Microsoft Bing index at somewhere between 8 and 14 billion web pages.

The two engines have shared a little about their approaches to web indexing.

Microsoft Bing says:

“Bingbot uses an algorithm to determine which sites to crawl, how often, and how many pages to fetch from each site. The goal is to minimize bingbot crawl footprint on your web sites while ensuring that the freshest content is available.”

Around the same time the above statement was made, John Mueller from Google said:

“I think the hard part here is that we don’t crawl URLs with the same frequency all the time. So, some URLs we will crawl daily. Some URLs maybe weekly.

Other URLs every couple of months, maybe even every once half year or so. So, this is something that we try to find the right balance for so that we don’t overload your server.”

Google has a mobile-first index, while Microsoft Bing takes a different stance and does not have plans to apply a mobile-first indexing policy.

Instead, Microsoft Bing maintains a single index that is optimized for both desktop and mobile, so it is important to make sure your site experience is optimized, loads quickly, and gives users what they need.

Google has evolved into more than just a search engine with products like Gmail, Maps, Chrome OS, Android OS, YouTube, and more.

Microsoft Bing also offers email via Outlook, as well as other services like Office Online or OneDrive.

Unlike Google, however, Microsoft does not have its own mobile operating system; Bing instead runs on Windows devices and on iOS on Apple devices.

Now, let’s take a look at where Bing is on par with Google – or superior.

Differences In User Interface & Tools

Google has a clean, simple interface that many people find easy to use, but for some queries, AI overviews are shown.

Screenshot from search for [bitcoin], Google, July 2024

So does Microsoft Bing, though Bing is a little bit more visual.

Screenshot from search for [bitcoin], Microsoft Bing, July 2024

Both search engines display useful information about related searches, images, companies, and news and do a great job of informing users of everything they need to know about a given topic.

SEO professionals love the tools and data these search engines provide.

Thankfully, both Google and Microsoft Bing have decent keyword research tools that offer insights into performance:

Keyword research tools: Screenshot from author, July 2024

One area where I think Google falls behind is the data it provides in Google Search Console. If you want to learn how to use it, check out How to Use Google Search Console for SEO: A Complete Guide.

One of the cool feature sets in Microsoft Bing is the ability to import data from Google Search Console:

Another Microsoft Bing feature that I think beats Google is that it provides SEO Reports.

Screenshot from Bing Webmaster Tools, July 2024

According to Bing, these reports contain common page-level recommendations based on SEO best practices to improve your rankings.

The reports are automatically generated biweekly and provide tips as to what to work on or investigate.

See A Complete Guide to Bing Webmaster Tools to learn more.

Microsoft Bing May Excel In Image Search Over Google

When it comes to image search, Microsoft Bing may have a leg up on Google by providing higher-quality images.

Screenshot from search for [donuts], Microsoft Bing, July 2024

I like the filtering features in its image search, too, because you can turn titles off and search by image size, color, or type.

Test out Bing Visual Image Search, which allows you to do more with images. Check out its library of specialized skills to help you shop, identify landmarks and animals, or just have fun.

Then, see How Bing’s Image & Video Algorithm Works to learn more.

Screenshot from search for [donuts], Google, July 2024

Google has more images available for viewing than Microsoft Bing. Make the most of it with the tips in A Guide to Google’s Advanced Image Search.

However, Microsoft Bing provides more detailed information about the image users are searching for.

How Microsoft Bing & Google Handle Video Search

Microsoft Bing provides a much more visual video search results page, including a grid view of large thumbnails.

Google’s video results are more standard, featuring a vertical list of small thumbnails.

As you can see from the screenshot of a Bitcoin search below, they include different filters like length, price, etc., which is a great user experience.

I did not get this experience with Google video search.

This is one area where Microsoft Bing outperforms Google.

Screenshot from search for [bitcoin], Microsoft Bing, July 2024
Screenshot from search for [bitcoin], Google, July 2024

Map Listings On Both Search Engines Matter For Local SEO

Both engines have similar functionality for maps, including map listings and local listings in the search engine results pages (SERPs).

Make sure you claim all your listings in both Microsoft Bing and Google and optimize your profile with business information, photos, proper categories, social information, etc.

Accurate name, address, and phone number (NAP) information is key. Google focuses on a user’s immediate vicinity by default, providing highly localized search results, while Bing offers a broader view of the wider area in local searches, which can be beneficial for some businesses.

See A Complete Guide to Google Maps Marketing.

Optimizing For Google Search Vs. Microsoft Bing

Google is primarily concerned with E-E-A-T: Experience, Expertise, Authoritativeness, and Trustworthiness. Providing users with high-quality, useful, and helpful content that is factual, original, and offers value, along with a site that delivers a good user experience, will help you rank.

Backlinks are also still important.

Microsoft Bing has always been focused on on-page optimization. It emphasizes exact-match keywords in domain names and URLs, gives weight to social signals and official business listings, and favors older and established domains.

Unlike Google, Microsoft Bing states in its webmaster guidelines that it incorporates social signals into its algorithm. That means you should also focus on Twitter and Facebook – including building good quality content on your site and social platforms – if you want to rank highly in Microsoft Bing.

Content is extremely important for both search engines. Always focus on high-quality content that satisfies the user’s intent and informational needs. By creating useful and relevant content, users will naturally love it and link to it.

Speed, mobile-friendliness, and proper technical infrastructure matter for both engines.


Google Is Pushing Organic Results Further And Further Down The Page

As time goes on, Google continues to push organic results down the page, resulting in more revenue from paid search ads and fewer clicks from organic search. That is why a blended strategy is important to win in today’s SERPs.

Here is a comparison between a search in Google and a search in Bing. As you can see, Bing does not have as many ads as Google, and organic listings are more prominent on the page than they are on Google.

Screenshot from search for [project management software], Microsoft Bing, July 2024
Screenshot from search for [project management software], Google, July 2024

Google Search Vs. Microsoft Bing: The Verdict

Both Microsoft Bing and Google satisfy the informational needs of millions of people every day.

While Google remains the dominant player in the battle between Bing and Google, they both offer opportunities for your brand to reach new users and put you in front of millions of qualified customers who are looking for information, products, and services.

Bing offers unique advantages and opportunities, particularly in visual search, social signals, and certain niche markets.

Bing holds a smaller market share but has a growing user base.

Since optimizing for both Bing and Google is similar, with some key differences, I recommend optimizing for both. This can enhance overall visibility and reach, especially in a world where Google is pushing organic listings further and further down the page.



Featured Image: Overearth/Shutterstock

Monopoly: A Ruling Against Google Could Benefit The Open Web via @sejournal, @Kevin_Indig
Image Credit: Lyna ™


Four years after the DOJ lawsuit against Google began, Judge Amit Mehta ruled that Google illegally monopolized the online search and search text advertising markets. The most successful startup in history is officially an illegal monopoly.

Google’s search engine market share (Image Credit: Kevin Indig)

The ruling itself is big, but the fat question in the room is what consequences follow and whether there is an impact on SEO.

I can’t look into the future, but I can run through scenarios. There is a good chance it will affect SEO and the open web.

Before we dive in, remember:

  1. I’m not a lawyer or legal expert.
  2. I solely rely on documents and insights from the court case for my opinion.
  3. When I refer to “the document”, I mean Judge Mehta’s opinion memorandum.

Scenarios

Scenario planning is the art and science of envisioning multiple futures.

Step one is framing the key question: What might the remedies (consequences) of the lawsuit against Google be, and what potential consequences could result for SEO?

Step two is identifying the driving forces affecting the key question:

  • Legal:
    • Judge Mehta concludes that Google is an illegal search monopoly, not an advertising monopoly. This is important.
    • The defining precedent lawsuit against Microsoft in the ’90s didn’t lead to a break-up of the company but to the opening of APIs, the sharing of key information, and a change in business practices.
  • Economic:
    • Google faces competition in advertising from Amazon, TikTok and Meta.
    • Google has superior market share in search, browsers, mobile OS and other markets.
    • Exclusivity and revenue share agreements between Google, Apple, Samsung, Mozilla and other partners delivered massive traffic to Google and profits to partners.
  • Technological:
    • Apple agreed not to innovate in search, spotlight and device search in return for revenue share.
    • Large Language Models are in the process of changing how search works and the dynamics between searchers, search engines and content providers.
  • Social: Younger generations use TikTok to search and social networks to get news and other information.
  • Political:
    • The sentiment of “big tech” has turned largely negative.
    • After almost two decades of no anti-competitive action against tech companies, the Google lawsuit could start a wave of tech regulation.

Step three is defining scenarios based on the key question and driving forces. I see 3 possible scenarios:

Scenario 1: Google must end its exclusivity deals immediately. Apple needs to let users choose a default search engine when setting up their devices. Google could get hefty fines for every year they keep the contract with Apple going.

Scenario 2: Google gets broken up. Alphabet must spin off assets that prevent it from gaining and holding more power in search and keep other players from entering the market.

  • YouTube is the 2nd largest search engine (Google is the largest text search engine, according to the judge). Running both at the same time creates too much power for one company to own.
  • Chrome and Android – maybe Gmail – need to be divested because they habituate users to choose Google and provide critical data about user behavior. A good example of this habituation “damage” is Neeva, which failed because it couldn’t convince users to change their habit of using Google, according to founder Sridhar Ramaswamy.
  • Alphabet can keep Maps because there is competition from Apple.

Scenario 3: Google must share data like click behavior with the open market so everyone can train search engines on it.

Scenarios two and three are messy and could potentially harm consumers (privacy). Scenario 1 is the most likely to happen. To me, the argument “If Google is the best search engine, why does it need to pay to be the default on devices?” checks out.

Polygamy

Let’s look at the consequences for Google, Apple, and the web under the lens of scenario 1: Apple needs to end its monogamous relationship with Google and let users choose which search engine they want as default when setting up their phones.

1/ Consequence For Google

Apple’s impact on Google Search is massive. The court documents reveal that 28% of Google searches (US) come from Safari and make up 56% of search volume. Consider that Apple sees 10 billion searches per week across all of its devices, with 8 billion happening on Safari and 2 billion from Siri and Spotlight.

Google receives only “7.6% of all queries on Apple devices through user-downloaded Chrome” and “10% of its searches on Apple devices through the Google Search App (GSA).” Google would take a big hit without the exclusive agreement with Apple.

Google searches for “best search engine” vs. “google alternative” (Image Credit: Kevin Indig)

If Apple lets users choose a search engine, 30% of searches from iOS and 70% from MacOS could go to non-Google search engines: “In 2020, Google estimated that if it lost the Safari default placement, it would claw back more search volume on desktop than on mobile.” Apparently, users are less inclined to change their default search engine on mobile devices.

Google would take a big hit but survive because its brand is so strong that even worse search results wouldn’t scare users away. From the document:

In 2020, Google conducted a quality degradation study, which showed that it would not lose search revenue if it were to significantly reduce the quality of its search product. Just as the power to raise price “when it is desired to do so” is proof of monopoly power, so too is the ability to degrade product quality without concern of losing consumers […]. The fact that Google makes product changes without concern that its users might go elsewhere is something only a firm with monopoly power could do.

Most of you had some feelings about this test when I brought it up on Twitter.

2/ Consequence For Apple

Apple wouldn’t be able to make another exclusive deal. I doubt that the court would forbid only Google to make distribution agreements.

Even if Apple could partner with someone else, they don’t want to: Eddy Cue, Apple’s senior vice president of Services, said publicly in court, “There’s no price that Microsoft could ever offer” to replace Google. “They offered to give us Bing for free. They could give us the whole company.” Woof.

But Apple’s bottom line would certainly take a hit. In the short term, Apple would miss about $20 billion from Google, which makes up 11.5% of its $173 billion in profits (trailing 12 months as of Q1 ‘24). Even if Apple built its own search engine, the losses would still amount to over $12 billion during the first five years:

Internal Apple assessment from 2018, which concluded that, even assuming that Apple would retain 80% of queries should it launch a GSE, it would lose over $12 billion in revenue during the first five years following a potential separation from Google.

Mind you, it’s not only Apple’s bottom line that would take a hit; Google’s other distribution partners would suffer, too. Mozilla, for example, gets over 80% of its revenue from Google. Without the revenue share, it’s likely the company wouldn’t survive. Bing should buy Mozilla to keep the company alive and slightly balance Google’s power with Chrome.

3/ Consequence For The web

The web could be the big winner from a separation of Google’s distribution agreements. More traffic to other search engines could result in a broader distribution of web traffic. Here is my thought process:

  1. Search is a zero-sum game that follows Zipf’s law in click distribution: the first result gets a lot more clicks than the second, which gets more than the third and so on.
  2. In theory, you can get near-infinite reach on social networks because they customize the feed for audiences. On Google, the feed is not customized, meaning there are only so many results for a keyword.
  3. If more users chose other search engines on Apple devices, those non-Google search engines would get more traffic, which they could pass on to the web.
  4. Assuming not every search engine would rank the same site at the top (otherwise, what’s the point?), the amount of traffic available to websites would expand because there are now more search results, across several search engines, that websites could get traffic from, as the short sketch below illustrates.
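
To make points 1 and 4 concrete, here is a toy sketch in Python. It assumes a simple 1/rank click curve as a stand-in for Zipf’s law; the search volumes and the 30% of queries moving to a second engine are illustrative assumptions, not real data.

```
# Toy illustration (not real CTR data) of why a second search engine could
# expand the traffic available to websites, assuming clicks follow a
# Zipf-like distribution across ranking positions.

def zipf_clicks(total_searches, positions=10):
    """Distribute clicks across positions proportional to 1/rank (Zipf-like)."""
    weights = [1 / rank for rank in range(1, positions + 1)]
    total_weight = sum(weights)
    return [total_searches * w / total_weight for w in weights]

google_only = zipf_clicks(1_000_000)                     # all searches on one engine
split = [zipf_clicks(700_000), zipf_clicks(300_000)]     # 30% of searches move elsewhere

print(f"Clicks for the #1 result, single engine: {google_only[0]:,.0f}")
print(f"Clicks for the two #1 results, two engines: {split[0][0] + split[1][0]:,.0f}")
# The total is the same (zero-sum), but if the second engine ranks a
# *different* site first, two sites now earn a "#1 result" share of clicks
# instead of one.
```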

The big question is, “How many users would choose search engines that are not Google if given a choice?” Google estimated in 2020 that it would lose $28.2 – $32.7 billion in net revenue (~$30 billion to keep the math simple) and over double that in gross revenue from losing 30% of iOS searches and 70% of MacOS searches.

Net revenue is the amount of money from selling goods or services minus discounts, returns, or deductions. Since we don’t have Google’s net revenue figure, we have to use total revenue as a ceiling, because net revenue has to be lower than total revenue.

In 2020, Google’s total revenue was $182.5 billion, meaning ~$30 billion would be 16.5% of total revenue. The actual share is likely higher.

Other search engines would likely catch some of Google’s lost revenue. A study by DuckDuckGo from 2019 found that the mobile market share of non-Google search engines would increase by 300%-800% if users could choose a default.

The next logical question is “Who would get the search traffic Google loses?” Bing and DuckDuckGo are the obvious ones, but what about Perplexity and OpenAI? As I wrote in Search GPT:

OpenAI might bet on regulators breaking up Google’s exclusive search engine deal with Apple and hope to become part of a search engine choice set on Apple devices.

At the time of writing, I thought the likelihood of OpenAI intentionally launching Search GPT to catch some of the Apple traffic is small. I don’t think that anymore.

If OpenAI got just 10% of the $30 billion in revenue Google would lose, it could make up over half of the $5 billion in annual expenses it runs on now. And all that without having to build much more functionality. Good timing.

According to Judge Mehta, ChatGPT is not considered a search engine: “AI cannot replace the fundamental building blocks of search, including web crawling, indexing, and ranking.”

I don’t agree, for what it’s worth. Most LLMs ground answers in search results. From What Google I/O 2023 reveals about the future of SEO:

Most search engines use a tech called Retrieval Augmented Generation, which cross-references AI answers from LLMs (large language models) with classic search results to decrease hallucination.
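
For readers unfamiliar with the technique, here is a minimal sketch of how retrieval augmented generation works. The `search()` and `llm()` helpers are hypothetical placeholders, not any vendor’s actual API; the point is only the shape of the flow: retrieve classic search results, then ask the model to answer using them.

```
# Minimal RAG sketch. `search()` and `llm()` are hypothetical stand-ins for a
# classic search backend and a language model call.

def search(query: str) -> list[str]:
    # Placeholder for a classic search backend (crawling, indexing, ranking).
    return ["snippet about " + query]

def llm(prompt: str) -> str:
    # Placeholder for a large language model call.
    return "answer grounded in: " + prompt[:80]

def answer_with_rag(query: str) -> str:
    snippets = search(query)                      # retrieve classic results
    context = "\n".join(snippets)
    prompt = (f"Answer the question using only the sources below.\n"
              f"Sources:\n{context}\n\nQuestion: {query}")
    return llm(prompt)                            # generate a grounded answer

print(answer_with_rag("what is the google antitrust ruling about?"))
```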

2nd-Order Effects

I want to take my scenarios one step further to uncover 2nd-order effects:

First, would only Apple be forced to let users choose a default search engine when setting up their device, or could Android be required to as well? Mobile operating systems could be seen as a market bottleneck to search traffic.

A blanket ruling for all mobile OSs could mean that Google has to let users choose and potentially lose some of the advantages of owning Android.

Second, if Google were forced to cut all distribution agreements, it would have ~$25b to spend. What would it do with the money? Would it simply compensate for the ~$30 billion it would lose by taking a massive hit in Apple search traffic?

Third, if Apple wasn’t contractually obligated to not innovate in Search across Spotlight, Safari, and Siri, would it build its own search engine?

It might be better off building what comes after search and/or charge to use LLMs. The court documents reveal that Apple estimated a cost of at least $6 billion per year to build a general search engine.

State Of SEO Report: Top Insights For 2025 Success via @sejournal, @Juxtacognition

What opportunities are other SEO professionals taking advantage of? Did other SEO professionals struggle with the same things you did this year?

Our fourth annual State of SEO Report is packed with valuable insights, including the most pressing challenges, emerging trends, and actionable strategies SEO practitioners like you have faced over the last year and what they see on the horizon.

Find out how top search teams are tackling challenges. Download the full report today.

Top Challenges In SEO: From Content To Algorithm Changes

In 2023, 13.8% of SEO pros said content creation was their top challenge. In 2024, however, 22.2% of all SEO practitioners surveyed (up from 8.6% in 2023) said algorithm changes have become the primary concern.

In fact, 30.2% of those we asked pointed to core and general algorithm updates as the main source of traffic instability over the last 12 months. This finding is in stark contrast to 2023, when 55.9% of SEO pros felt algorithm updates helped their efforts at least a little.

Why?

Simply put, creating the most helpful and expert content no longer guarantees a top spot in the SERPs.

To complicate matters, Google’s algorithms are constantly evolving, making it crucial to adapt and stay updated.

Budget Constraints: A Major Barrier To Success

Our survey revealed that budget limitations (cited by 19.4%) are the number one barrier to SEO success and the primary reason clients leave (according to 41.0% of SEO professionals surveyed).

With everyone feeling the financial squeeze, how can you gain an edge?

  • Forget gaming the SERPs. Focus on creating content that genuinely serves your ideal customer.
  • Collaborate with your marketing team to distribute this content on platforms where your audience is most active. Remember, Google’s rules may change, but the need for high-quality, valuable content that genuinely serves a need remains constant.
  • Prove your return on investment (ROI). Track customer journeys and identify where you are gaining conversions. If you’re not seeing success, make a plan and create a proposal to improve your strategies.

Learn how to overcome budget barriers with even more insights in the full report.

Key Insights From The State Of SEO Survey

SEO Industry Changes:

  • AI is predicted to drive the most significant changes in the SEO industry, according to 29.0% of those we surveyed.
  • 16.6% believe Google updates will continue to be a major factor.

Performance Disruptions:

  • 36.3% of State of SEO respondents believe generative AI in search platforms and AI-generated content are major disruptors going forward.

Essential SEO Metrics: Adapting To Fluctuations

As you explore the data in the report, you’ll find that keyword rankings (tracked by 20.0% of State of SEO 2025 respondents) and organic pageviews (11.7%) are the top tracked SEO metrics.

However, when these metrics fluctuate due to uncontrollable factors, it’s essential to build business value into your tracking.

Focus on the quality of your traffic and prioritize efforts that bring in high-quality users.

Skills In Demand: Navigating A Changing SEO Landscape

The most challenging skills to find in SEO professionals are technical SEO (18.9%) and data analysis (14.8%).

Meanwhile, 18.2% of respondents indicated that the most desired skills in candidates are soft skills and 15.7% said the ability to build and execute SEO strategies.

Want to grow as an SEO professional?

Develop rare and desirable skills.

SEO is increasingly integrated with other marketing disciplines, so cultivating exemplary collaborative skills and learning the languages of other fields will make you highly valuable.

Other Important Findings

  • 69.8% of SEO professionals found SERP competition increased over the last 12 months.
  • Only 13.2% of respondents felt zero-click searches will cause significant shifts in the SEO industry.
  • 50.0% of SEO professionals reported client turnover remained steady throughout 2024.

The State of SEO 2025 Report is your go-to resource for understanding and mastering the current SEO landscape.

Download your copy today to gain a deeper understanding of the challenges, opportunities, and insights that will shape SEO in the coming year.

Stay informed, stay ahead, and make 2025 your best year in SEO yet!

Google’s AI Overviews Ditch Reddit, Embrace YouTube [Study] via @sejournal, @MattGSouthern

A new study by SEO software company SE Ranking has analyzed the sources and links used in Google’s AI-generated search overviews.

The research, which examined over 100,000 keywords across 20 niches, offers insights into how these AI-powered snippets are constructed and what types of sources they prioritize.

Key Findings

Length & Sources

The study found that 7.47% of searches triggered AI overviews, a slight decrease from previous research.

The average length of these overviews has decreased by approximately 40%, now averaging 2,633 characters.

According to the data, the most frequently linked websites in AI overviews were:

  1. YouTube.com (1,346 links)
  2. LinkedIn.com (1,091 links)
  3. Healthline.com (1,091 links)

Government & Education

The research indicates that government and educational institutions are prominently featured in AI-generated answers.

Approximately 19.71% of AI overviews included links to .gov websites, while 26.61% referenced .edu domains.

Media Representation

Major media outlets appeared frequently in the AI overviews.

Forbes led with 804 links from 723 AI-generated answers, followed by Business Insider with 148 links from 139 overviews.

HTTPS Dominance

The study reported that 99.75% of links in AI overviews use the HTTPS protocol, with only 0.25% using HTTP.

Niche-Specific Trends

The research revealed variations in AI overviews across niches:

  • The Relationships niche dominated, with 40.64% of keywords in this category triggering AI overviews.
  • Food and Beverage maintained its second-place position, with 23.58% of keywords triggering overviews.
  • Notably, the Fashion and Beauty, Pets, and Ecommerce and Retail niches saw significant declines in AI overview appearances compared to previous studies.

Link Patterns

The study found that AI overviews often incorporate links from top-ranking organic search results:

  • 93.67% of AI overviews linked to at least one domain from the top 10 organic search results.
  • 56.50% of all detected links in AI overviews matched search results from the top 1-100, with most (73.01%) linking to the top 1-10 search results.

International Content

The research noted trends regarding international content:

  • 9.85% of keywords triggering AI overviews included links to .in (Indian) domains.
  • This was prevalent in certain niches, with Sports and Exercise leading at 36.83% of keywords in that category linking to .in sites.

Reddit & Quora Absent

Despite these platforms ‘ popularity as information sources, the study found no instances of Reddit or Quora being linked in the analyzed AI overviews. This marks a change from previous studies where these sites were more frequently referenced.

Methodology

The research was conducted using Google Chrome on an Ubuntu PC, with sessions based in New York and all personalization features disabled.

The data was collected on July 11, 2024, providing a snapshot of AI overview behavior.

SE Ranking has indicated that they plan to continue this research, acknowledging the need for ongoing analysis to understand evolving trends.

What Does This Mean?

These findings have several implications for SEO professionals and publishers:

  1. Google’s AI favors trusted sources. Keep building your site’s credibility.
  2. AI overviews are getting shorter. Focus on clear, concise content.
  3. HTTPS is a must. Secure your site if you haven’t already.
  4. Diversify your sources. Mix in .edu and .gov backlinks where relevant.
  5. AI behavior varies across industries. Adapt your strategy accordingly.
  6. Think globally. You might be competing with international sites more than before.

Remember, this is just a snapshot. Google’s AI overviews are changing fast. Monitor these trends and be ready to pivot your SEO strategy as needed.

The full report on SE Ranking’s website provides a detailed breakdown of the findings, including niche-specific data.


Featured Image: DIA TV / Shutterstock.com

Maximize Your Organic Traffic for Enterprise Ecommerce Sites via @sejournal, @hethr_campbell

In the enterprise ecommerce space, staying ahead of the competition on Google can be challenging. With so much at stake, it’s key to ensure that your site is performing at its best and capturing as much market share as possible. But how can you make sure your ecommerce platform is fully optimized to reach its potential in organic search?

On August 21st, we invite you to join us for an in-depth webinar where we’ll explore the strategies that can help you make the most of your existing site. Whether you’re looking to resolve technical challenges or implement scalable solutions that are proven to drive results, this session will provide the practical insights you need.

Why Attend This Webinar?

Wayland Myers, with his 18 years of experience working with major brands like Expedia and Staples, will lead the discussion. Save your spot to learn about the common issues that often prevent large ecommerce sites from reaching their full potential in organic search, issues that, if left unaddressed, can significantly limit your site’s ability to attract and convert visitors.

Wayland will dive into actionable solutions that can help overcome these challenges. You’ll learn about proven strategies that can be applied at scale, ensuring that your site is not only optimized for performance but also prepared to handle the complexities of enterprise-level ecommerce. 

What Will You Learn?

From technical fixes to advanced tactics like AI-enhanced programmatic content creation and internal linking, this session will cover the approaches that have been proven to work in real-world scenarios.

This webinar will also highlight the importance of careful implementation. Making changes to an enterprise ecommerce site requires a thoughtful approach to avoid potential pitfalls. Wayland will share his insights on what to watch out for during the process, ensuring that your efforts lead to positive outcomes without unintended consequences.

Key Takeaways:

  • Identifying and resolving issues that hinder your site’s organic growth.
  • Implementing solutions that enhance search performance at scale.
  • Learning from successful strategies used by industry leaders.

Live Q&A: Get Your Questions Answered

After the presentation, there will be a LIVE Q&A session where you can bring your specific questions. Whether you’re dealing with technical challenges or looking to fine-tune your current strategy, this is your chance to get expert advice tailored to your needs.

If you’re focused on improving your ecommerce site’s performance and capturing a larger share of the market on Google, this webinar is an opportunity you won’t want to miss.

Can’t make it to the live session? No worries. By registering, you’ll receive a recording of the webinar to watch at your convenience.

Take this chance to learn from an industry expert and ensure your ecommerce site is fully optimized for success.

13 Steps To Boost Your Site’s Crawlability And Indexability via @sejournal, @MattGSouthern

One of the most important elements of search engine optimization, often overlooked, is how easily search engines can discover and understand your website.

This process, known as crawling and indexing, is fundamental to your site’s visibility in search results. Without being crawled, your pages cannot be indexed, and if they are not indexed, they won’t rank or display in SERPs.

In this article, we’ll explore 13 practical steps to improve your website’s crawlability and indexability. By implementing these strategies, you can help search engines like Google better navigate and catalog your site, potentially boosting your search rankings and online visibility.

Whether you’re new to SEO or looking to refine your existing strategy, these tips will help ensure that your website is as search-engine-friendly as possible.

Let’s dive in and discover how to make your site more accessible to search engine bots.

1. Improve Page Loading Speed

Page loading speed is crucial to user experience and search engine crawlability. To improve your page speed, consider the following (a quick response-time check is sketched after this list):

  • Upgrade your hosting plan or server to ensure optimal performance.
  • Minify CSS, JavaScript, and HTML files to reduce their size and improve loading times.
  • Optimize images by compressing them and using appropriate formats (e.g., JPEG for photographs, PNG for transparent graphics).
  • Leverage browser caching to store frequently accessed resources locally on users’ devices.
  • Reduce the number of redirects and eliminate any unnecessary ones.
  • Remove any unnecessary third-party scripts or plugins.
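
As a quick first check, the sketch below (assuming the third-party `requests` library and placeholder URLs) measures server response time for a handful of pages. It is only a rough proxy; full page-load analysis needs a browser-based tool such as Lighthouse or PageSpeed Insights.

```
# Rough server response-time check. `Response.elapsed` measures time until
# the response headers arrive, not full page rendering.
import requests

urls = [
    "https://www.example.com/",            # replace with your own URLs
    "https://www.example.com/products/",
]

for url in urls:
    response = requests.get(url, timeout=10)
    seconds = response.elapsed.total_seconds()
    flag = "SLOW" if seconds > 0.8 else "ok"
    print(f"{flag:>4}  {seconds:.2f}s  {url}")
```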

2. Measure & Optimize Core Web Vitals

In addition to general page speed optimizations, focus on improving your Core Web Vitals scores. Core Web Vitals are specific factors that Google considers essential in a webpage’s user experience.

These include Largest Contentful Paint (LCP), Interaction to Next Paint (INP), and Cumulative Layout Shift (CLS).

To identify issues related to Core Web Vitals, use tools like Google Search Console’s Core Web Vitals report, Google PageSpeed Insights, or Lighthouse. These tools provide detailed insights into your page’s performance and offer suggestions for improvement.

Some ways to optimize for Core Web Vitals include:

  • Minimize main thread work by reducing JavaScript execution time.
  • Avoid significant layout shifts by setting explicit width and height attributes for media elements and preloading fonts.
  • Improve server response times by optimizing your server, routing users to nearby CDN locations, or caching content.

By focusing on both general page speed optimizations and Core Web Vitals improvements, you can create a faster, more user-friendly experience that search engine crawlers can easily navigate and index.
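
As an illustration of pulling Core Web Vitals programmatically, here is a hedged sketch against Google’s public PageSpeed Insights API (v5). The endpoint is documented by Google; the exact metric key names in the response are assumptions based on the API’s “loadingExperience” field data and may differ, so print whatever the API actually returns.

```
# Hedged sketch: fetch Core Web Vitals field data from the PageSpeed Insights
# API (v5). Metric key names in the response are treated as unknown.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def core_web_vitals(url: str) -> dict:
    response = requests.get(
        PSI_ENDPOINT, params={"url": url, "strategy": "mobile"}, timeout=60
    )
    response.raise_for_status()
    metrics = response.json().get("loadingExperience", {}).get("metrics", {})
    # Return the 75th-percentile value for whatever metrics the API reports.
    return {name: data.get("percentile") for name, data in metrics.items()}

print(core_web_vitals("https://www.example.com/"))
```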

3. Optimize Crawl Budget

Crawl budget refers to the number of pages Google will crawl on your site within a given timeframe. This budget is determined by factors such as your site’s size, health, and popularity.

If your site has many pages, it’s necessary to ensure that Google crawls and indexes the most important ones. Here are some ways to optimize for crawl budget:

  • Using a clear hierarchy, ensure your site’s structure is clean and easy to navigate.
  • Identify and eliminate any duplicate content, as this can waste crawl budget on redundant pages.
  • Use the robots.txt file to block Google from crawling unimportant pages, such as staging environments or admin pages.
  • Implement canonicalization to consolidate signals from multiple versions of a page (e.g., with and without query parameters) into a single canonical URL.
  • Monitor your site’s crawl stats in Google Search Console to identify any unusual spikes or drops in crawl activity, which may indicate issues with your site’s health or structure (a simple log-analysis sketch follows this list).
  • Regularly update and resubmit your XML sitemap to ensure Google has an up-to-date list of your site’s pages.
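
One practical way to see where crawl budget is actually going is to count Googlebot requests per URL in your server logs. The sketch below assumes a combined-format access log named `access.log`; adjust the parsing to match your server’s log format.

```
# Count which URL paths Googlebot requests most often in an access log.
from collections import Counter
import re

# Assumes combined log format: request in quotes, user agent as the last quoted field.
LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[^"]*".*"(?P<agent>[^"]*)"')

crawl_counts = Counter()
with open("access.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        match = LOG_LINE.search(line)
        if match and "Googlebot" in match.group("agent"):
            crawl_counts[match.group("path")] += 1

for path, hits in crawl_counts.most_common(20):
    print(f"{hits:6d}  {path}")
```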

4. Strengthen Internal Link Structure

A good site structure and internal linking are foundational elements of a successful SEO strategy. A disorganized website is difficult for search engines to crawl, which makes internal linking one of the most important things a website can do.

But don’t just take our word for it. Here’s what Google’s search advocate, John Mueller, had to say about it:

“Internal linking is super critical for SEO. I think it’s one of the biggest things that you can do on a website to kind of guide Google and guide visitors to the pages that you think are important.”

If your internal linking is poor, you also risk orphaned pages, meaning pages that no other part of your website links to. Because nothing points to these pages, search engines can only find them through your sitemap.
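
To hunt for orphaned pages, you can compare the URLs in your XML sitemap against the set of URLs your internal links actually point to. In the sketch below, `crawl_internal_links()` is a hypothetical placeholder for your crawler or a link export from a tool like Screaming Frog, and the sitemap URL is an example.

```
# Find sitemap URLs that no internal link points to (orphan candidates).
import xml.etree.ElementTree as ET
import requests

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url: str) -> set[str]:
    xml = requests.get(sitemap_url, timeout=30).text
    root = ET.fromstring(xml)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS) if loc.text}

def crawl_internal_links() -> set[str]:
    # Placeholder: return every URL that at least one internal link points to.
    return set()

orphans = sitemap_urls("https://www.example.com/sitemap.xml") - crawl_internal_links()
for url in sorted(orphans):
    print("Orphan page (no internal links point here):", url)
```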

To eliminate this problem and others caused by poor structure, create a logical internal structure for your site.

Your homepage should link to subpages supported by pages further down the pyramid. These subpages should then have contextual links that feel natural.

Another thing to keep an eye on is broken links, including those with typos in the URL. These lead to the dreaded 404 error. In other words, page not found.

The problem is that broken links are not helping but harming your crawlability.

Double-check your URLs, particularly if you’ve recently undergone a site migration, bulk delete, or structure change. And make sure you’re not linking to old or deleted URLs.

Other best practices for internal linking include using anchor text instead of linked images, and adding a “reasonable number” of links on a page (there are different ratios of what is reasonable for different niches, but adding too many links can be seen as a negative signal).

Oh yeah, and ensure you’re using follow links for internal links.

5. Submit Your Sitemap To Google

Given enough time, and assuming you haven’t told it not to, Google will crawl your site. And that’s great, but it’s not helping your search ranking while you wait.

If you recently made changes to your content and want Google to know about them immediately, you should submit a sitemap to Google Search Console.

A sitemap is another file that lives in your root directory. It serves as a roadmap for search engines with direct links to every page on your site.

This benefits indexability because it allows Google to learn about multiple pages simultaneously. A crawler may have to follow five internal links to discover a deep page, but by submitting an XML sitemap, it can find all of your pages with a single visit to your sitemap file.

Submitting your sitemap to Google is particularly useful if you have a deep website, frequently add new pages or content, or your site does not have good internal linking.
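
Most CMSs and SEO plugins generate sitemaps for you, but for a custom site, the sketch below shows a minimal XML sitemap built by hand. The URLs and lastmod dates are placeholders.

```
# Build a minimal XML sitemap for a list of (url, lastmod) pairs.
from xml.sax.saxutils import escape

pages = [
    ("https://www.example.com/", "2024-07-01"),
    ("https://www.example.com/blog/seo-basics/", "2024-07-15"),
]

entries = "\n".join(
    f"  <url><loc>{escape(url)}</loc><lastmod>{lastmod}</lastmod></url>"
    for url, lastmod in pages
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```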

6. Update Robots.txt Files

You’ll want to have a robots.txt file for your website. It’s a plain text file in your website’s root directory that tells search engines how you would like them to crawl your site. Its primary use is to manage bot traffic and keep your site from being overloaded with requests.

Where this comes in handy in terms of crawlability is limiting which pages Google crawls and indexes. For example, you probably don’t want pages like directories, shopping carts, and tags in Google’s index.

Of course, this helpful text file can also negatively impact your crawlability. It’s well worth looking at your robots.txt file (or having an expert do it if you’re not confident in your abilities) to see if you’re inadvertently blocking crawler access to your pages.

Some common mistakes in robots.txt files include:

  • Robots.txt is not in the root directory.
  • Poor use of wildcards.
  • Noindex in robots.txt.
  • Blocked scripts, stylesheets, and images.
  • No sitemap URL.

For an in-depth examination of each of these issues – and tips for resolving them, read this article.
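
A simple sanity check is to test your most important URLs against your live robots.txt with Python’s built-in parser, as sketched below. The URLs and user agent string are placeholders.

```
# Verify that important URLs are not accidentally blocked by robots.txt.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()

important_urls = [
    "https://www.example.com/",
    "https://www.example.com/products/widget/",
]

for url in important_urls:
    if robots.can_fetch("Googlebot", url):
        print("Crawlable:", url)
    else:
        print("Blocked by robots.txt:", url)
```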

7. Check Your Canonicalization

A canonical tag indicates to Google which page is the main page to give authority to when you have two or more pages that are similar, or even duplicates. However, this is only a hint, not a directive, and it is not always applied.

Canonicals can be a helpful way to tell Google to index the pages you want while skipping duplicates and outdated versions.

But this opens the door for rogue canonical tags. These refer to older versions of a page that no longer exist, leading to search engines indexing the wrong pages and leaving your preferred pages invisible.

To eliminate this problem, use a URL inspection tool to scan for rogue tags and remove them.
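
The sketch below shows one way to spot rogue canonicals: extract the rel="canonical" URL from a page and confirm that it still returns a 200 status. The regex assumes the rel attribute appears before href, and the page URL is a placeholder; a proper HTML parser or crawler is more robust.

```
# Extract a page's canonical URL and check that it still resolves.
import re
import requests

# Assumes rel comes before href in the <link> tag; adjust for your markup.
CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']', re.I
)

def check_canonical(page_url: str) -> None:
    html = requests.get(page_url, timeout=30).text
    match = CANONICAL_RE.search(html)
    if not match:
        print("No canonical tag:", page_url)
        return
    canonical = match.group(1)
    status = requests.get(canonical, timeout=30).status_code
    print(f"{page_url} -> canonical {canonical} returns HTTP {status}")

check_canonical("https://www.example.com/some-page/")
```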

If your website is geared towards international traffic, i.e., if you direct users in different countries to different canonical pages, you need to have canonical tags for each language. This ensures your pages are indexed in each language your site uses.

8. Perform A Site Audit

Now that you’ve performed all these other steps, there’s still one final thing you need to do to ensure your site is optimized for crawling and indexing: a site audit.

That starts with checking the percentage of pages Google has indexed for your site.

Check Your Indexability Rate

Your indexability rate is the number of pages in Google’s index divided by the number of pages on your website.

You can find out how many pages are in the Google index from Google Search Console’s Indexing report by going to the “Pages” tab, and you can check the total number of pages on the website from the CMS admin panel.

There’s a good chance your site will have some pages you don’t want indexed, so this number likely won’t be 100%. However, if the indexability rate is below 90%, you have issues that need investigation.

You can get your no-indexed URLs from Search Console and run an audit for them. This could help you understand what is causing the issue.

Another helpful site auditing tool included in Google Search Console is the URL Inspection Tool. This allows you to see what Google spiders see, which you can then compare to actual webpages to understand what Google is unable to render.

Audit (And Request Indexing) Newly Published Pages

Any time you publish new pages to your website or update your most important pages, you should ensure they’re being indexed. Go into Google Search Console and use the inspection tool to make sure they’re all showing up. If not, request indexing on the page and see if this takes effect – usually within a few hours to a day.

If you’re still having issues, an audit can also give you insight into which other parts of your SEO strategy are falling short, so it’s a double win. Scale your audit process with tools like:

9. Check For Duplicate Content

Duplicate content is another reason bots can get hung up while crawling your site. Basically, your coding structure has confused it, and it doesn’t know which version to index. This could be caused by things like session IDs, redundant content elements, and pagination issues.

Sometimes, this will trigger an alert in Google Search Console, telling you Google is encountering more URLs than it thinks it should. If you haven’t received one, check your crawl results for duplicate or missing tags or URLs with extra characters that could be creating extra work for bots.
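
A rough way to catch exact duplicates, such as the same page served with and without a session ID, is to hash a normalized copy of each page’s HTML, as in the sketch below. The URLs are placeholders, and near-duplicate detection requires more sophisticated techniques than this.

```
# Flag URLs that serve byte-for-byte identical (whitespace-normalized) HTML.
import hashlib
import requests

urls = [
    "https://www.example.com/page",
    "https://www.example.com/page?sessionid=123",   # same content, extra parameter
]

seen: dict[str, str] = {}
for url in urls:
    html = requests.get(url, timeout=30).text
    fingerprint = hashlib.sha256(" ".join(html.split()).encode("utf-8")).hexdigest()
    if fingerprint in seen:
        print(f"Duplicate content: {url} matches {seen[fingerprint]}")
    else:
        seen[fingerprint] = url
```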

Correct these issues by fixing tags, removing pages, or adjusting Google’s access.

10. Eliminate Redirect Chains And Internal Redirects

As websites evolve, redirects are a natural byproduct, directing visitors from one page to a newer or more relevant one. But while they’re common on most sites, if you’re mishandling them, you could inadvertently sabotage your indexing.

You can make several mistakes when creating redirects, but one of the most common is redirect chains. These occur when there’s more than one redirect between the link clicked on and the destination. Google doesn’t consider this a positive signal.

In more extreme cases, you may initiate a redirect loop, in which a page redirects to another page, directs to another page, and so on, until it eventually links back to the first page. In other words, you’ve created a never-ending loop that goes nowhere.

Check your site’s redirects using Screaming Frog, Redirect-Checker.org, or a similar tool.
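
Alongside those tools, a quick way to inspect a single URL’s redirect behavior is the `requests` library’s redirect history, as sketched below with a placeholder URL.

```
# Report redirect hops for a URL; more than one hop means a chain worth collapsing.
import requests

def report_redirects(url: str) -> None:
    response = requests.get(url, allow_redirects=True, timeout=30)
    hops = [r.url for r in response.history] + [response.url]
    if len(response.history) > 1:
        print("Redirect chain:", " -> ".join(hops))
    elif response.history:
        print("Single redirect:", " -> ".join(hops))
    else:
        print("No redirect:", url)

report_redirects("https://www.example.com/old-page")
```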

11. Fix Broken Links

Similarly, broken links can wreak havoc on your site’s crawlability. You should regularly check your site to ensure you don’t have broken links, as this will hurt your SEO results and frustrate human users.

There are a number of ways you can find broken links on your site, including manually evaluating every link on your site (header, footer, navigation, in-text, etc.), or you can use Google Search Console, Analytics, or Screaming Frog to find 404 errors.
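
For a small-scale check, the sketch below extracts the links from one page with the standard-library HTML parser and flags any that return a 4xx or 5xx status. The page URL is a placeholder; at scale you would use a dedicated crawler instead.

```
# Check one page's outgoing links for 4xx/5xx responses.
from html.parser import HTMLParser
from urllib.parse import urljoin
import requests

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = "https://www.example.com/"
parser = LinkCollector()
parser.feed(requests.get(page, timeout=30).text)

for href in parser.links:
    url = urljoin(page, href)
    status = requests.head(url, allow_redirects=True, timeout=15).status_code
    if status >= 400:
        print(f"Broken link ({status}): {url}")
```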

Once you’ve found broken links, you have three options for fixing them: redirecting them (see the section above for caveats), updating them, or removing them.

12. IndexNow

IndexNow is a protocol that allows websites to proactively inform search engines about content changes, ensuring faster indexing of new, updated, or removed content. By strategically using IndexNow, you can boost your site’s crawlability and indexability.

However, using IndexNow judiciously and only for meaningful content updates that substantially enhance your website’s value is crucial. Examples of significant changes include:

  • For ecommerce sites: Product availability changes, new product launches, and pricing updates.
  • For news websites: Publishing new articles, issuing corrections, and removing outdated content.
  • For dynamic websites, this includes updating financial data at critical intervals, changing sports scores and statistics, and modifying auction statuses.

Avoid overusing IndexNow by submitting duplicate URLs too frequently within a short timeframe, as this can negatively impact trust and rankings. Also, ensure that your content is fully live on your website before notifying IndexNow.

If possible, integrate IndexNow with your content management system (CMS) for seamless updates. If you’re manually handling IndexNow notifications, follow best practices and notify search engines of both new/updated content and removed content.

By incorporating IndexNow into your content update strategy, you can ensure that search engines have the most current version of your site’s content, improving crawlability, indexability, and, ultimately, your search visibility.
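
For reference, here is a hedged sketch of a manual IndexNow submission. The endpoint, JSON field names, and the key file hosted at `keyLocation` follow the public IndexNow documentation as I understand it, but verify the current details at indexnow.org before relying on them; the host, key, and URLs are placeholders.

```
# Hedged sketch of a bulk IndexNow submission; confirm details at indexnow.org.
import requests

payload = {
    "host": "www.example.com",
    "key": "your-indexnow-key",                                   # placeholder
    "keyLocation": "https://www.example.com/your-indexnow-key.txt",
    "urlList": [
        "https://www.example.com/new-product/",
        "https://www.example.com/updated-pricing/",
    ],
}

response = requests.post("https://api.indexnow.org/indexnow", json=payload, timeout=30)
print(response.status_code)   # a 200/202 generally indicates the submission was accepted
```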

13. Implement Structured Data To Enhance Content Understanding

Structured data is a standardized format for providing information about a page and classifying its content.

By adding structured data to your website, you can help search engines better understand and contextualize your content, improving your chances of appearing in rich results and enhancing your visibility in search.

There are several types of structured data, including:

  • Schema.org: A collaborative effort by Google, Bing, Yandex, and Yahoo! to create a unified vocabulary for structured data markup.
  • JSON-LD: A JavaScript-based format for encoding structured data that can be embedded in a web page’s head or body (see the sketch after this list).
  • Microdata: An HTML specification used to nest structured data within HTML content.
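
Here is a small sketch that builds a JSON-LD block for an article using schema.org’s Article type. The property values are placeholders; validate the output with the Rich Results Test before shipping it.

```
# Generate a JSON-LD <script> block for an Article; values are placeholders.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "13 Steps To Boost Your Site's Crawlability And Indexability",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2024-07-01",
}

script_tag = (
    '<script type="application/ld+json">\n'
    + json.dumps(article, indent=2)
    + "\n</script>"
)
print(script_tag)   # paste into the page's head or inject via your CMS/tag manager
```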

To implement structured data on your site, follow these steps:

  • Identify the type of content on your page (e.g., article, product, event) and select the appropriate schema.
  • Mark up your content using the schema’s vocabulary, ensuring that you include all required properties and follow the recommended format.
  • Test your structured data using tools like Google’s Rich Results Test or Schema.org’s Validator to ensure it’s correctly implemented and free of errors.
  • Monitor your structured data performance using Google Search Console’s Rich Results report. This report shows which rich results your site is eligible for and any issues with your implementation.

Some common types of content that can benefit from structured data include:

  • Articles and blog posts.
  • Products and reviews.
  • Events and ticketing information.
  • Recipes and cooking instructions.
  • Person and organization profiles.

By implementing structured data, you can provide search engines with more context about your content, making it easier for them to understand and index your pages accurately.

This can improve search results visibility, mainly through rich results like featured snippets, carousels, and knowledge panels.

Wrapping Up

By following these 13 steps, you can make it easier for search engines to discover, understand, and index your content.

Remember, this process isn’t a one-time task. Regularly check your site’s performance, fix any issues that arise, and stay up-to-date with search engine guidelines.

With consistent effort, you’ll create a more search-engine-friendly website with a better chance of ranking well in search results.

Don’t be discouraged if you find areas that need improvement. Every step to enhance your site’s crawlability and indexability is a step towards better search performance.

Start with the basics, like improving page speed and optimizing your site structure, and gradually work your way through more advanced techniques.

By making your website more accessible to search engines, you’re not just improving your chances of ranking higher – you’re also creating a better experience for your human visitors.

So roll up your sleeves, implement these tips, and watch as your website becomes more visible and valuable in the digital landscape.



Featured Image: BestForBest/Shutterstock

Google’s “Branded Search” Patent For Ranking Search Results via @sejournal, @martinibuster

Back in 2012 Google applied for a patent called “Ranking Search Results” that shows how Google can use branded search queries as a ranking factor. The patent is about using branded search queries and navigational queries as ranking factors, plus a count of independent links. Although this patent is from 2012, it’s possible that it may still play a role in ranking.

The patent was misunderstood by the search marketing community in 2012 and the knowledge contained in it was lost.

What Is The Ranking Search Results Patent About? TL/DR

The patent is explicitly about an invention for ranking search results; that’s why it is called “Ranking Search Results.” The patent describes an algorithm that uses two ranking factors to re-rank web pages:

Sorting Factor 1: By number of independent inbound links
This is a count of links that are independent from the site being ranked.

Sorting Factor 2: By number of branded search queries & navigational search queries.
The branded and navigational search queries are called “reference queries” and also are referred to as implied links.

The counts of both factors are used to modify the rankings of the web pages.
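
To illustrate the idea (and only the idea: the patent does not publish a formula, and the arithmetic below is my own invented example), here is a sketch of re-ranking pages using a modification factor derived from independent link counts and reference-query counts.

```
# Illustrative re-ranking sketch; the weights and data are invented, not Google's.
pages = [
    {"url": "example.com/a", "score": 0.82, "independent_links": 120, "reference_queries": 900},
    {"url": "example.com/b", "score": 0.85, "independent_links": 15,  "reference_queries": 40},
]

def modification_factor(page: dict) -> float:
    # Hypothetical combination: the patent describes using both counts,
    # not this exact arithmetic.
    return 1 + 0.0005 * page["independent_links"] + 0.0001 * page["reference_queries"]

reranked = sorted(pages, key=lambda p: p["score"] * modification_factor(p), reverse=True)
for page in reranked:
    print(page["url"], round(page["score"] * modification_factor(page), 3))
```

In this toy example, the page with the lower initial score ends up ranking first because it has far more independent links and reference queries, which is the kind of adjustment the patent describes.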

Why The Patent Was Misunderstood TL/DR

First, I want to say that in 2012, I didn’t understand how to read patents. I was more interested in research papers and left the patent reading to others. When I say that everyone in the search marketing community misunderstood the patent, I include myself in that group.

The “Ranking Search Results” patent was published in 2012, one year after the release of a content quality update called the Panda Update. The Panda update was named after one of the engineers who worked on it, Navneet Panda. Navneet Panda came up with questions that third party quality raters used to rate web pages. Those ratings were used as a test to see if changes to the algorithm were successful at removing “content farm” content.

Navneet Panda is also a co-author of the “Ranking search results” patent. SEOs saw his name on the patent and immediately assumed that this was the Panda patent.

The reason why that assumption is wrong is because the Panda update is an algorithm that uses a “classifier” to classify web pages by content quality. The “Ranking Search Results” patent is about ranking search results, period. The Ranking Search Results patent is not about content quality nor does it feature a content quality classifier.

Nothing in the “Ranking Search Results” patent relates in any way with the Panda update.

Why This Patent Is Not The Panda Update

In 2009 Google released the Caffeine Update which enabled Google to quickly index fresh content but inadvertently created a loophole that allowed content farms to rank millions of web pages on rarely searched topics.

In an interview with Wired, former Google search engineer Matt Cutts described the content farms like this:

“It was like, “What’s the bare minimum that I can do that’s not spam?” It sort of fell between our respective groups. And then we decided, okay, we’ve got to come together and figure out how to address this.”

Google subsequently responded with the Panda Update, named after a search engineer who worked on the algorithm which was specifically designed to filter out content farm content. Google used third party site quality raters to rate websites and the feedback was used to create a new definition of content quality that was used against content farm content.

Matt Cutts described the process:

“There was an engineer who came up with a rigorous set of questions, everything from. “Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?” Questions along those lines.

…we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons…”

In simple terms, a classifier is an algorithm within a system that categorizes data. In the context of the Panda Update, the classifier categorizes web pages by content quality.

What’s apparent when reading the “Ranking Search Results” patent is that it’s clearly not about content quality; it’s about ranking search results.

Meaning Of Express Links And Implied Links

The “Ranking Search Results” patent uses two kinds of links to modify ranked search results:

  1. Implied links
  2. Express links

Implied links:
The patent uses branded search queries and navigational queries to calculate a ranking score as if those queries were links, calling them implied links. The implied links are used to create a factor for modifying the rankings of web pages that are relevant (responsive) to search queries.

Express links:
The patent also uses independent inbound links to the web page as a part of another calculation to come up with a factor for modifying web pages that are responsive to a search query.

Both kinds of links (implied links and independent express links) are used as factors to modify the rankings of a group of web pages.

Understanding what the patent is about is straightforward because the beginning of the patent explains it in relatively plain English.

This section of the patent uses the following jargon:

  • A resource is a web page or website.
  • A target (target resource) is what is being linked to or referred to.
  • A “source resource” is a resource that makes a citation to the “target resource.”
  • The word “group” means the group of web pages that are relevant to a search query and are being ranked.

The patent talks about “express links,” which are just regular links. It also describes “implied links,” which are references to a web page (called a “target resource”) that appear within search queries.

I’m going to add bullet points to the original sentences so that they are easier to understand.

Okay, so this is the first important part:

“Links for the group can include express links, implied links, or both.

An express link, e.g., a hyperlink, is a link that is included in a source resource that a user can follow to navigate to a target resource.

An implied link is a reference to a target resource, e.g., a citation to the target resource, which is included in a source resource but is not an express link to the target resource. Thus, a resource in the group can be the target of an implied link without a user being able to navigate to the resource by following the implied link.”

The second important part uses the same jargon to define what implied links are:

  • A resource is a web page or website.
  • The site being linked to or referred to is called a “target resource.”
  • A “group of resources” means a group of web pages.

This is how the patent explains implied links:

“A query can be classified as referring to a particular resource if the query includes a term that is recognized by the system as referring to the particular resource.

For example, a term that refers to a resource may be all of or a portion of a resource identifier, e.g., the URL, for the resource.

For example, the term “example.com” may be a term that is recognized as referring to the home page of that domain, e.g., the resource whose URL is “http://www.example.com”.

Thus, search queries including the term “example.com” can be classified as referring to that home page.

As another example, if the system has data indicating that the terms “example sf” and “esf” are commonly used by users to refer to the resource whose URL is “http://www.sf.example.com,” queries that contain the terms “example sf” or “esf”, e.g., the queries “example sf news” and “esf restaurant reviews,” can be counted as reference queries for the group that includes the resource whose URL is “http://www.sf.example.com.” “

The above explanation defines “reference queries” as the terms that people use to refer to a specific website. So, for example (my example), if people search using “Walmart” together with the keyword “air conditioner,” then the query [“Walmart” air conditioner] is counted as a “reference query” for Walmart.com; it’s counted as a citation and an implied link.
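To make the idea concrete, here is a minimal Python sketch of how a system might count reference queries for a target resource. The brand-term mapping, query log, and matching rule are my own illustration, not details from the patent.

```python
# A minimal, hypothetical sketch of counting "reference queries" for a target
# resource. The term lists, queries, and matching rule are illustrative only.
REFERENCE_TERMS = {
    "http://www.example.com": {"example.com"},
    "http://www.sf.example.com": {"example sf", "esf"},
}

def count_reference_queries(queries, target_url):
    """Count queries containing a term recognized as referring to target_url."""
    terms = REFERENCE_TERMS.get(target_url, set())
    count = 0
    for query in queries:
        normalized = query.lower()
        # Naive substring match; a real system would use a query classifier.
        if any(term in normalized for term in terms):
            count += 1  # treated as an implied link (a citation) to the target
    return count

query_log = ["example sf news", "esf restaurant reviews", "weather today"]
print(count_reference_queries(query_log, "http://www.sf.example.com"))  # prints 2
```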

The Patent Is Not About “Brand Mentions” On Web Pages

Some SEOs believe that a mention of a brand on a web page is counted by Google as if it’s a link. They have misinterpreted this patent to support the belief that an “implied link” is a brand mention on a web page.

As you can see, the patent does not describe the use of “brand mentions” on web pages. It’s crystal clear that the meaning of “implied links” within the context of this patent is about references to brands within search queries, not on a web page.

It also discusses doing the same thing with navigational queries:

“In addition or in the alternative, a query can be categorized as referring to a particular resource when the query has been determined to be a navigational query to the particular resource. From the user point of view, a navigational query is a query that is submitted in order to get to a single, particular web site or web page of a particular entity. The system can determine whether a query is navigational to a resource by accessing data that identifies queries that are classified as navigational to each of a number of resources.”

The takeaway, then, is that the patent describes the use of “reference queries” (branded/navigational search queries) as a factor similar to links, and that’s why they’re called implied links.

Modification Factor

The algorithm generates a “modification factor” which re-ranks (modifies) a group of web pages that are relevant to a search query, based on the “reference queries” (which are branded search queries) and on a count of independent inbound links.

This is how the modification (or ranking) is done:

  1. A count is made of inbound links, using only “independent” links (links that are not controlled by the site being linked to).
  2. A count is made of the reference queries (branded search queries), which are given ranking power like a link.

Reminder: “resources” is a reference to web pages and websites.

Here is how the patent explains the part about the ranking:

“The system generates a modification factor for the group of resources from the count of independent links and the count of reference queries… For example, the modification factor can be a ratio of the number of independent links for the group to the number of reference queries for the group.”

What the patent is doing is filtering links so that only links not associated with the website are counted, and also counting how many branded search queries are made for a web page or website, then using both counts as a ranking (modification) factor.
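As a rough illustration of the ratio described in the quoted passage above, here is a minimal Python sketch. The numbers, field names, and the final step of applying the factor to a relevance score are my own assumptions; the patent leaves the exact implementation open.

```python
# A rough, hypothetical sketch of the "modification factor" idea: the ratio of
# independent inbound links to reference (branded/navigational) queries.
# All numbers and the final scoring step are assumptions for illustration.

pages = [
    {"url": "http://www.example.com/a", "relevance": 0.82,
     "independent_links": 120, "reference_queries": 300},
    {"url": "http://www.example.com/b", "relevance": 0.85,
     "independent_links": 40, "reference_queries": 20},
]

def modification_factor(page):
    """Ratio of independent links to reference queries, per the patent's example."""
    if page["reference_queries"] == 0:
        return 0.0
    return page["independent_links"] / page["reference_queries"]

# Hypothetical application: scale each page's initial relevance score by its
# factor. How the factor boosts or demotes a page is an implementation detail
# the patent leaves open; multiplying here is purely illustrative.
for page in pages:
    page["modified_score"] = page["relevance"] * modification_factor(page)

for page in sorted(pages, key=lambda p: p["modified_score"], reverse=True):
    print(page["url"], round(page["modified_score"], 3))
```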

In retrospect, it was a mistake for some in the SEO industry to use this patent as “proof” of their idea that brand mentions on websites are a ranking factor.

It’s clear that “implied links” are not brand mentions on web pages used as a ranking factor; rather, they are brand mentions (and URLs and domains) in search queries that can be used as ranking factors.

Why This Patent Is Important

This patent describes a way to use branded search queries as a signal of popularity and relevance for ranking web pages. It’s a good signal because it’s the users themselves saying that a specific website is relevant for specific search queries. It’s a signal that’s hard to manipulate, which may make it a clean, non-spam signal.

We don’t know if Google uses what’s described in the patent. But it’s easy to understand why it could still be a relevant signal today.

Read The Patent Within The Entire Context

Patents use specific language, and it’s easy to misinterpret the words or overlook their meaning by focusing on specific sentences. The biggest mistake I see SEOs make is removing one or two sentences from their context and then using them to claim that Google is doing something or other. This is how SEO misinformation begins.

Read my article about How To Read Google Patents to understand how to read them and avoid misinterpreting them. Even if you don’t read patents, knowing the information is helpful because it’ll make it easier to spot misinformation about patents, which there is a lot of right now.

I limited this article to communicating what the “Ranking Search Results” patent is about and what the most important points are. There are many granular details about different implementations that I don’t cover because they’re not necessary for understanding the overall patent.

If you want the granular details, I strongly encourage first reading my article about how to read patents before reading the patent.

Read the patent here:

Ranking search results

AI Agnostic Optimization: Content For Topical Authority And Citations

The search and AI ecosystem is full of promise, options, and new ways for literally every type of marketer to evolve and grow.

Yes, there is lots of complexity, but there is also commonality: the need for marketers to focus on topical approaches to content creation, build their brand authority for AI citations, and become more predictive in their approach to how consumers interact online.

The introduction of Google AI Overviews and of new AI-first platforms like Perplexity AI is making the way consumers find answers far more complex.

The advancement of LLMs such as Claude and Google Gemini is also revolutionizing content outputs in visual and video formats. Just recently, Bing introduced its generative search experience (GSE), and OpenAI announced SearchGPT.

One thing they all have in common is that they are all fighting for the best authoritative sources for information and citations.

Wikipedia citation. Screenshot from author, July 2024

Today, I will mainly use Google AI Overviews in Search as an example, as they currently offer the richest insights and best practices applicable to future and upcoming engines.

AI And Search Citations, Authority, And Your Brand

Being the cited source is quickly becoming the new form of ranking.

As AI looks to cite trustworthy and relevant content, brands need to be the source. While every engine has a different approach, the reality is that success relies on sources and quality.

These engines answer questions in many ways, and citations are common across the board. They look at authoritative sources to see whether a source answers the question, and then assess whether it’s quotable.

  • Google wants quotable content that is above the fold, not buried. It also likes the question to be directly answered.
  • Perplexity, which has steadily increased traffic referrals (31% in June), focuses on academic and research citations but has had issues with attribution and sources.
  • Bing GSE is engineering its search results to satisfy users in a way that encourages discovery on the websites that originate the content.
  • ChatGPT/search does not need direct answers; it will digest them and express them in its own language. At first glance, it mainly cites and links to sources developed with input from major publishers like the Atlantic and NewsCorp.

So, as marketers, the simplest way to start is to focus on the commonalities and best practices that prepare you for what is ahead. Then, you can pivot and adapt as we learn more about how citations are shown and treated as each AI engine evolves.

For example, Google AI Overviews is beginning to cite more authoritative review publications to help users shop, while user-generated content (UGC) and reviews from Reddit and Quora have dropped to near zero in AI Overviews.

  • Reddit citations: 85.71% decrease.
  • Quora citations: 99.69% decrease.

User-generated reviews may not be designed for a broader audience and can lack the objectivity that a publication would provide. BrightEdge Generative Parser™ has recently found:

  • 49% increase in presence from PC Mag.
  • 39% increase in presence from Forbes.
Google AIO. Screenshot from BrightEdge, July 2024

Sites like Forbes are becoming key players in AI overviews. As well as thought leadership and instructive information, their comparative product reviews define where a product shines and where it falls short against competitors.

Here are three things that marketers can master now to stay ahead in AI and search.

1. Ensure AI Engines Find You: Become The Cited Source

Start by identifying core – and broader, see later in the article – topics that are relevant to your audience and aligned with your business objectives. These topics should serve as the foundation for a thematic content strategy.

Schema+: Diversify And Mark Up Your Content As Much As Possible

The importance of diverse content formats cannot be overstated. To adapt to answer engine models, content must be comprehensive and encompass multiple modalities, including text, video, infographics, and interactive elements.

This approach ensures that content caters to diverse user preferences and provides information in formats that are most accessible and engaging.

Core technical SEO approaches like Schema Markup are essential for content marketers aiming to enhance their visibility and relevance in search results, as they help search engines better understand the content.

This improves the likelihood of content being featured as a direct answer and enhances its overall discoverability.

  • Provide AI engines with hints on who you are.
  • Ensure your teams look at things like Schema markup so AI engines can see your content.
  • Small markup details like these can tell AI models how to use your content.
  • This ensures that you are more frequently cited as the source on topics where you already have the right to win (a minimal markup sketch follows this list).
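To illustrate the kind of markup hint this refers to, here is a minimal Python sketch that emits a schema.org Organization snippet in JSON-LD. The organization name, URLs, and profiles are hypothetical placeholders, and Organization is just one of many schema types you might use.

```python
import json

# A minimal, hypothetical Organization snippet (schema.org JSON-LD) that gives
# engines "hints on who you are." All names and URLs are placeholders.
org_markup = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Brand",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    "sameAs": [
        "https://www.linkedin.com/company/example-brand",
        "https://www.youtube.com/@examplebrand",
    ],
}

# Embed the output in the page's HTML so crawlers and AI engines can parse it.
print('<script type="application/ld+json">')
print(json.dumps(org_markup, indent=2))
print("</script>")
```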

Develop content clusters around these core topics, covering different aspects, subtopics, and related themes. Each piece of content within a cluster should complement and support others, creating a cohesive narrative for multiple users.

Discovery, Engagement, Results. Image from BrightEdge, July 2024

2. Anticipate Customers’ Next Questions: Focus On The Follow-Up

Build Thematic Content & Focus On Content Clustering

AI-powered search engines like AI Overviews (as explained in The Ultimate Guide to AI Overviews, free, ungated, and updated monthly by my company, BrightEdge) are redefining the criteria for visibility by prioritizing thematically connected content.

This applies even where the content doesn’t rank highly in traditional search results, making intelligent content clustering and thematic coherence essential.

Adopting a strategic approach to thematic content and content clustering means that instead of creating isolated pieces of content, you focus on developing interconnected content clusters that comprehensively explore various aspects of a topic.

  • AI search aims to do more than display a list of products for the keywords.
  • They want to anticipate the follow-up questions that the demographic will likely have: how, what, where, and more.
  • AI models will cite trusted sources to generate these answers before the user even thinks about asking them.
  • Marketers need to create content for all these types of follow-ups in different formats.

Ensure that content within the same cluster is interlinked using relevant anchor text. This helps search engines understand the thematic relationship between different pieces of content and strengthens your website’s authority on the topic.

Understanding what triggers an AI Overview will become essential.

For example, in June, there was a 20% increase in “What is” queries showing an AI Overview. For brand-specific queries, there was a 20% decrease.

This could show that Google uses AI for more complex, knowledge-intensive topics while playing it safe with brand queries.

However, expect this to change monthly, as SEJ reports and shares more below.

3. Prove Your Expertise: Become The Authority In Your Field

Baking User and Topical Intent Into Every Piece of Content

Traditional SEO focuses on keyword rankings and visibility, but AI-driven search engines prioritize delivering precise, relevant answers based on user queries. This shift means simply ranking highly is no longer enough; you must ensure your content aligns closely with users’ needs and topics of interest.

AI-powered search engines like ChatGPT, Google’s SGE, Perplexity, and now SearchGPT are designed to comprehend the context and nuance behind a user’s query. They aim to provide direct answers and anticipate follow-up questions, creating a more dynamic and personalized search experience.

*A Note On Serving Multiple Intents*

AI-powered search results are evolving to coexist with traditional search. Google is experimenting with blending conventional and AI-enhanced search results. Take, for example, a search for [outdoor lighting solutions].

The traditional search component assumes the user intends to purchase such products and ranks relevant ecommerce sites accordingly. This serves users who know exactly what they’re looking for and need quick access to buy options.

Multiple Intent Types. Image from BrightEdge, July 2024

In contrast, the AI-generated overview caters to users seeking a broader understanding of outdoor lighting. It might provide a conversational explanation covering various aspects, such as:

  • Key considerations when choosing outdoor lights.
  • Various types of outdoor lighting and their characteristics.
  • Available power options for outdoor illumination.
  • Understanding brightness levels and their significance.
  • Best practices for installation and placement.
  • Tips for maintaining outdoor lighting systems.

Anticipating and addressing related queries helps build the site’s credibility and improves the chances of being featured in AI-generated answers.

Since AI-first engines, LLMs, and traditional search engines are designed to recognize and prioritize unique, high-quality content over generic or duplicated material, this increases the chances that your content will surface in response to user queries.

  • Prove your expertise and make it easy for AI models to trust what you say.
  • AI engines need to see that your content is validated by other experts, as well as by user-generated content and reviews.
  • Ensure your content reaches expert influencers and connects to related sources and websites.
  • Gain as much third-party validation as possible that your content is trustworthy.
  • Ensure your content workflows consider both traditional ranking factors and AI citations, as they rely on some shared and some separate signals.

Video And YouTube

We are now seeing, with pros and cons, YouTube videos cited in AI Overviews in ways that benefit marketers at the top of the funnel.

If YouTube were not part of Google, it would be the sixth biggest digital platform in the USA. It commands a lot of reach!

Cited Sources for AIO. Image from BrightEdge, July 2024

As you can see above, this offers new advantages to marketers targeting early-stage prospects. Visual content can effectively showcase specific offerings and provide tangible reviews, potentially swaying purchasing choices such as buying a washing machine.

They are being shown to help simplify complex topics for users. For example, abstract technological concepts like “blockchain fundamentals” often become clearer through visual demonstrations, accelerating audience understanding.

Ensure that when you identify high-potential topical themes, you pair them with AI’s video citation preferences. Video is on an explosive growth trajectory, so start building and getting creative, both as part of your broader marketing strategy and for maximum AI Overview visibility.

This helps offer multiple reference possibilities. A single piece of video content could be cited numerous times, expanding your topical reach, which I mentioned earlier.

Key Takeaways

In an era where AI-driven search and AI-first answer engines and assistants reshape how markets operate, marketers, SEO pros, content creators, and brand marketers must adapt their strategies to optimize for AI answers and multiple types of search engines.

Below are a few end notes and outliers for your consideration:

  • The core basics of SEO and classic search still matter.
  • AI Overviews are reduced in size to give more concise answers.
  • AI answers more complex questions, but more common questions and queries are often better served in universal or classic formats – balance will be essential.
  • Monitor new engines with a regular cadence; many are so new that it will take time for an informed, data-led opinion to form.
  • Going forward, different types of consumers will use engines for various use cases, and each engine will cite some common sources and other specific ones, like news, academics, and publishers. Let’s see how it develops; it is something I am looking into myself now.
  • Always remember that everything varies depending on your vertical and type of business. Experimentation is still very heavy everywhere, including at Google!
  • With new entrants emerging and news and live experiments every day, expect change.
  • What happens in one month can differ from another while engines find equilibrium.

Essential best practices such as focusing on user intent, leveraging structured data markup, and embracing multimedia content aren’t going anywhere. Classic search is here to stay; many skills are transferable to AI.

The future lies in a balance of classic online marketing, adapting to AI, and uncovering new AI engines’ nuances as they grow and establish more of a foothold. It is an exciting time, and I think exercising a little patience will help us all prevail.

As for SearchGPT, I believe its evolution does not diminish SEO; on the contrary, it makes it even more critical!

For now, monitor and use time-based data as your compass, and don’t react to opinions without some substance behind them.

Featured Image from author

Best SEO for Dropshipping

Dropshipping is the entry point for many new ecommerce retail ventures, but selling essentially the same product as hundreds of other online stores makes search engine optimization challenging.

With a payment card, a logo, and a few clicks, entrepreneurs can quickly launch an online store by combining an ecommerce platform such as Shopify, BigCommerce, or WooCommerce with dropshipping apps such as Dsers, Spocket, or SaleHoo.

Advertising

The products for a dropshipping-enabled ecommerce shop could come from multiple sources, including AliExpress. Such shops typically work on thin margins, effectively practicing retail arbitrage.

The relatively small margins can complicate advertising since just about any dip in ad performance can eliminate profits or worse.

To be sure, there are ways to market an ecommerce dropshipping business successfully. While ads are the best option for most new stores, traffic from organic search listings and direct visitors is essential for long-term success. This fact brings us back to dropshipping’s inherent problem: competition.

Almost without exception, whatever product a store chooses to source from a dropshipping service will be available on dozens, if not hundreds, of similar online shops, all vying for prominent search engine rankings.

Hundreds of online stores offer this Star Trek t-shirt from AliExpress.

SEO

SEO is typically iterative — no single procedure guarantees a top ranking on Google.

One SEO strategy for dropshipping is to build a content site that sells products — a content-then-commerce approach. Optimize for articles, videos, podcasts, and related, and then promote the dropshipped products within that editorial content.

Here are five content-then-commerce SEO tactics.

Identify content keyphrases

Classic keyword research is the best place to start for dropshipping SEO. But focus here on content phrases, not transactional or product.

In my research, every dropshipping shop selling the “Live Long and Prosper” licensed adult t-shirt from AliExpress is looking for a long-tail keyphrase. Avoid the crowd and seek keywords for the content.

Content marketing

A content-then-commerce strategy requires creating and distributing articles to target search engines and engage readers.

The articles should be clear and engaging, with proper HTML headers and tags. Repurposed articles make quality social media posts, generating what SEO practitioners call “social signals.” The number of followers, likes, and reposts a shop has on X or Instagram could inform search engines about the business and impact rankings.

Finally, some content marketing efforts, such as customer surveys, could be newsworthy.

Classic link building

Acquiring backlinks for a dropshipping store is perhaps the most difficult and valuable SEO tactic. It is vital for building credibility with search engines, but it demands hard work.

Compelling, original content will likely attract links organically. Otherwise, link building could include writing guest posts or contacting other sites to request links.

Media relations

The aim of media relations is to get links from large news sites.

SEO practitioners were excited to learn that Google’s index of links to global websites resides on three tiers of (massive) computer servers: random access memory, solid-state drives, and hard disk drives. These storage types differ in cost and speed.

The assumption is that Google considers links on the fastest tier (random access memory, or RAM) more valuable. Popular news sites typically reside on that tier and are therefore the best backlinks.

Structured data markup

Structured data markup from the Schema.org vocabulary, JSON-LD, or similar serves at least two purposes.

First, this uniform, structured info tells search engines what a page is about. Structured data could distinguish a site selling “Live Long & Prosper” t-shirts from one offering health tips for prospering over a long life.

Most SEO pros believe that structured markup increases the likelihood that a page will obtain a rich snippet or an AI-generated citation.
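For illustration, here is a minimal Python sketch, with placeholder values, that emits the kind of schema.org Product markup in JSON-LD that would distinguish a t-shirt product page from a health-advice article. The brand, price, and rating figures are hypothetical.

```python
import json

# A minimal, hypothetical Product snippet (schema.org JSON-LD) signaling that
# the page sells a t-shirt rather than offering health advice. Brand, price,
# and rating values are placeholders.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Live Long & Prosper Adult T-Shirt",
    "brand": {"@type": "Brand", "name": "Example Apparel"},
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "38",
    },
}

# Embed the output in the product page's HTML for search engines to parse.
print('<script type="application/ld+json">')
print(json.dumps(product_markup, indent=2))
print("</script>")
```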

The Basics

This list of SEO tactics for dropshipping could have been much longer. Instead, I’ve focused on content and assumed that ecommerce platforms would provide technical components such as HTML tags, site speed, mobile usability, and more.