How To Measure Topical Authority [In 2025] via @sejournal, @Kevin_Indig

Today’s Memo is an updated version of my previous guides on topical authority, one that takes the Google leaks, documents revealed in Google lawsuits, my recent UX study of AIOs, and the latest shifts in the search landscape into account.

Image Credit: Lyna ™

I think this is one of those concepts that flies under the radar in the AI and search conversation, but it’s actually important.

I’ll cover:

  • The idea behind topical authority and why you should pay attention to it.
  • How to measure topical authority.
  • What internal Google documents and leaks say about topical authority.
  • How Google and LLMs could understand topical authority.
  • What concrete levers you should pull to build topical authority.

I would argue that, along with brand authority, topical authority matters more now than ever.

But before we dig in, we have to address the reality of our current search situation:

You and your team have likely poured countless hours into classic SEO plays, content clusters, and link‑building, only to watch your organic clicks plateau (or even dip) as AIOs claim more SERP real estate.

Heck, countless sites have been losing organic traffic since late 2023 due to meager topical authority.

Meanwhile, stakeholders crave confidence that your AI era playbook is working.

Topical authority is a critical concept for both the old and new SEO era.

In fact, a recent Graphite study found that pages with high topical authority gain traffic 57% faster than those with low authority – proof that “covering your bases” can still pay dividends in speedy visibility gains. And the study showed that topical authority can increase the percentage of pages that get visibility in the first three weeks.1

I’m working on a workflow for paid subscribers that makes tracking topic‑level gains easier. The anticipated launch date of the workflow is in June. Upgrade to paid so you don’t miss it.

I used to dismiss topical authority as an SEO ghost concept. You know, one of those buzz‑terms people use to justify link‑building or content‑depth plays.

But, as I realized back in 2022, I was wrong: It’s far from a ghost.

In fact, internal document leaks and public signals from Google show that topical relevance, i.e., how completely a site covers related entities and questions, is a real and important ranking factor.

And in today’s era of AIOs and LLM‑powered snippets, brand authority (a close cousin of topical authority) can be the difference between earning the click or being buried beneath an AI summary.

How Is The SEO Community Defining Topical Authority Post-AIOs?

The idea behind topical authority is that by covering all aspects of a topic (well), sites get a ranking boost because Google sees them as an authority in the topic space.

On the other end of the spectrum would be sites that only touch the surface of a topic.

Here’s how the SEO community has defined topical authority over time:

Topical Authority is a way of balancing the PageRank for finding more authoritative sources with the information on the sources.

Topical authority can be described as “depth of expertise.” It’s achieved by consistently writing original high-quality, comprehensive content that covers the topic.

Topical authority is a perceived authority over a niche or broad idea set, as opposed to authority over a singular idea or term.

Topical authority is one of the ways Google measures “quality” as a ranking factor – along with page authority and domain authority.

Based on that, here’s how I see topical authority (a.k.a. topical relevance) showing up in SERPs today. It includes:

  • Depth of expertise: Consistently publishing original, high‑quality content that covers all facets of a topic.
  • Entity coverage: Matching your content’s scope against Google’s own understanding of entity relationships – i.e., how well you hit the concepts Google expects for a given topic.
  • Backlink and mention signals: Earning links and web mentions from other trusted sources that reinforce your authority within that topic space. Think quality mentions over quantity here.
  • Final answers: How often your site provides the final answer (think completes the user journey) for searchers with a specific problem in a specific topic.

Semantic proximity matters, too. It’s not just about the volume of topic coverage, but about meaningfully addressing subtopics and related questions across your topics – think token overlap or topic‑model similarity between your pages and “ideal” topic coverage.
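As a toy illustration (not how Google actually scores pages), the token-overlap idea can be sketched as a Jaccard similarity between a page and a hand-built “ideal coverage” corpus for the topic:

```python
def tokenize(text):
    """Lowercase bag-of-words tokenizer (deliberately naive)."""
    return set(text.lower().split())

def topic_similarity(page_text, ideal_coverage_text):
    """Jaccard overlap between a page and an 'ideal' topic corpus.
    A crude proxy for semantic proximity: 0.0 = no overlap, 1.0 = identical."""
    page, ideal = tokenize(page_text), tokenize(ideal_coverage_text)
    union = page | ideal
    return len(page & ideal) / len(union) if union else 0.0
```

In practice you would swap the naive tokenizer for embeddings or a topic model, but the comparison logic — your pages versus an ideal coverage set — stays the same.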

And information gain comes into play here also: What new, non-consensus information are you adding to the targeted topic?

Our SEO team brings the concept of topical authority to me as an argument to invest more resources in content, backlinking, and digital PR, but they can’t really back up the concept.

I’ve read a ton of articles about topical authority and have had more conversations about it than I can count. This is how I make sense of the idea:

  1. Google rewards sites that cover a topic in-depth.
  2. It does so by comparing how well the site covers relevant entities with Google’s own understanding of entity relationships.
  3. Google matches its own understanding with other factors like the site’s backlink profile and mentions on the web, user behavior, and brand combination searches (brand + generic keyword).

However, here’s the proof that it’s not a ghost concept and that it does matter for earning organic visibility:

  • Leaked Google documents: The Google ranking factors leak verified the use of site‑level quality and “domain authority” signals, suggesting it uses whitelists of trusted sources for sensitive topics such as health or finance.
  • News topic authority signals: Google’s May 2023 Search Central post on “Understanding News Topic Authority” describes how it gauges a publication’s expertise across specialized verticals like finance, politics, and health.2

To better surface relevant, expert, and knowledgeable content in Google Search and News, Google developed a system called topic authority that helps determine which expert sources are helpful to someone’s newsy query in certain specialized topic areas, such as health, politics, or finance.2

  • Yandex leaked documents: Similar to Google, leaked Yandex materials indicate they factor in topic‑graph coverage when ranking news and content hubs (i.e., how many semantically related subtopics a site authoritatively addresses).
  • Google documents revealed in lawsuits: As reported by Danny Goodwin over at Search Engine Land, the trial exhibits released in the Department of Justice’s legal proceedings against Google contain additional verification of the existence and importance of “topicality.” Key components include the ABC signals: Anchors (A), links from a source page to a target page; Body (B), terms in the document; and Clicks (C), how long a user stayed on a linked page before returning to the SERP.

Together, the guidance from Google and leak confirmations make it very clear: Topical authority matters … even if sometimes it goes by a different name.

It isn’t just SEO folklore; it’s a (kind of) measurable signal of how comprehensively and credibly your site covers a topic, which is more important than ever in an AIO-saturated SERP.

Even though 15% of daily Google searches are new, websites cannot get more traffic than there are searches. That means the traffic from keywords within a topic is also limited by the number of searches.

In plain words:

The easiest way to measure topical authority is the share of traffic a site gets from a topic. I call this Topic Share, similar to market share or share of voice.

This is a very practical approach because it factors in the following:

  1. Rank, driven by backlinks, content depth/quality, and user experience.
  2. Search volume and how competitive a keyword is.
  3. The fact that URLs can rank for many keywords.
  4. SERP Features and snippet optimization.

To calculate Topic Share, you basically calculate how much traffic you or your competitors get from keywords within a topic.

For example, you can do this in Ahrefs:

  1. Take an entity (head term) like “ecommerce” and enter it in Keyword Explorer.
  2. Go to Matching Terms and filter for Volume ≥ 10.
  3. Export all keywords and upload them again in Keyword Explorer.
  4. Go to traffic share by domains.
  5. Traffic Share = Topic Share = “Topical Authority.”
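The same arithmetic is easy to reproduce outside Ahrefs. This minimal sketch (assuming you’ve exported keyword-level traffic estimates as domain/traffic pairs — the column names and numbers below are placeholders) aggregates each domain’s share of total topic traffic:

```python
from collections import defaultdict

def topic_share(rows):
    """rows: iterable of (domain, estimated_traffic) pairs from a keyword export.
    Returns each domain's share of total topic traffic as a fraction (0-1)."""
    totals = defaultdict(float)
    for domain, traffic in rows:
        totals[domain] += traffic
    grand_total = sum(totals.values()) or 1.0  # avoid division by zero
    return {domain: t / grand_total for domain, t in totals.items()}
```

Feed it the exported keyword rows and the output maps directly to Topic Share percentages.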

The easiest way to find an entity is by looking at whether Google shows a Knowledge Panel for it in the search results or not.

Next month, paid subscribers will get my topical authority workflow. Don’t miss out. Upgrade here.

In theory, the ~29,000 keywords exported for “ecommerce” reflect 100% Topic Share: If a domain magically ranked No. 1 for all of them, it would have 100% Topic Share, which is practically impossible.

As a result, we need to use Topic Share comparatively, meaning in comparison with other sites.

For “ecommerce,” I calculated Topic Share based on the top 3,000 keywords by search volume. Shopify is leading with 11% Topic Share, closely followed by Bigcommerce with 10% and Nerdwallet with 3%.

Image Credit: Kevin Indig

Here’s another example with a smaller topic.

“Spend analysis” had 142 keywords in Ahrefs when I first used this example. Following the same process, jaggaer.com has the highest Topic Share with 15%, followed by Sievo with 13% and Tipalti with 7%.

Image Credit: Kevin Indig

To track Topic Share continuously, you could set up a rank tracking project in Ahrefs and monitor traffic share for these keywords. However, for large topics, this might not be cost-efficient.

And if you wanted to do this for multiple topics, you would quickly get into the 100,000s of keywords to track.

The best solution I see is running this analysis once a month and tracking changes manually. (It’s not efficient but practical.)
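If you go the manual route, a tiny helper that diffs two monthly snapshots keeps the tracking honest. This sketch assumes each snapshot is a simple domain-to-share mapping (the domains below are placeholders):

```python
def topic_share_delta(current, previous):
    """Compare two monthly Topic Share snapshots ({domain: share}).
    Positive values = gained share; negative = lost share since last month."""
    domains = set(current) | set(previous)
    return {d: current.get(d, 0.0) - previous.get(d, 0.0) for d in domains}
```

Run it after each monthly export and you get a ready-made gains/losses table for stakeholder reports.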

Example: “Contract Lifecycle Management”

Another example is the topic “contract lifecycle management,” which has ~480 keywords.

Icertis and Contractworks are leading the topic, followed by Gartner, Docusign, Salesforce, and Ironclad.

Image Credit: Kevin Indig

If this process is so manual, is it worth the work to measure it every month?

In some cases, yes. If you need to demonstrate to your stakeholders in a practical way whether or not resources and investment into building topical authority are working, then you should measure it.

And what if you need to prove to stakeholders that you need to invest in topic X instead of topic Y for quicker SEO gains?

By scoring how well you currently cover each subtopic, you can identify the core topics Google already considers you an authority in. Putting resources into those specific topics will likely move the needle most and could deliver the quickest SEO ROI.

If you’re in a major growth push into a new topic area (based on a new service, product feature, etc.), it’s valuable to track and measure topical authority to understand how you’re progressing in Topic Share, based on who your competitors are, and what it takes to develop topical authority in your niche.

But if you commit to monitoring it over time, you can also correlate your topic share to your tracking for AIO and LLM visibility.

Find out what topics overlap and why. Discover what topics Google finds you an authority in, while LLMs don’t.

How To Build Topical Authority

1. Content Breadth & Depth

Essentially: How many pages and target queries/subtopics does your site cover within a topic (quantity), and how good are they (quality)?

This is your content library’s comprehensiveness and utility. Thoroughly explore every facet of your target topic: definitions, use cases, common questions, and related subtopics.

Comprehensive, well‑structured content shows both users and search engines that you’re the go‑to resource on your target topics – one that actually adds to the overall topical conversation rather than only skimming the surface.

Use entity‑based tactics or AI‑powered similarity scores to ensure you’re covering the concepts and questions Google associates with your topic.

2. Smart Internal Linking

Internal links are signals for the relationship between articles about a topic.

Optimizing the anchor text, context, and number of internal links sends stronger signals to Google and helps users find what they’re looking for.

3. Topically Relevant Backlinks And Mentions

Backlinks provide another confidence layer for Google that your content is good and relevant for a specific topic.

Aim for backlinks and mentions from trusted sites in adjacent categories.

Getting mentioned or linked in the Wall Street Journal’s retail section (www.wsj.com/business/retail) is more valuable for Shopify than Salesforce, for example.

4. Prune Content

I did a deep dive on IBM and Progressive, two organizations that are winning the SEO game in competitive topics. Both sites went through massive pruning efforts to improve domain authority.

And in SEOzempic, I showcased where DoorDash actually lost organic traffic by multiplying pages. Topical authority is all about hyperfocusing on the topics that are most relevant to your business, not having the most pages.

All of these businesses saw their organic traffic roar after pruning topically irrelevant content – in some cases, even high-quality content that just wasn’t a good fit for the domain (like Progressive’s agent pages).

Retrieval-augmented generation (RAG) – the grounding mechanism behind OpenAI’s, Google’s, Meta’s and others’ LLMs – explicitly ranks external documents for authority before passing them to the model to ground its answer.

Their technical notes stress pulling “current and authoritative sources” to reduce hallucination.

Source: https://aws.amazon.com/what-is/retrieval-augmented-generation/
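To make the mechanism concrete, here’s a heavily simplified sketch of authority-weighted retrieval. The relevance and authority scores are hypothetical placeholders, not any vendor’s actual signals:

```python
def select_context(docs, k=3):
    """docs: list of dicts with 'text', 'relevance', 'authority' (0-1 scores).
    Rank by combined score so only the most authoritative, relevant
    documents are passed to the model as grounding context."""
    ranked = sorted(docs, key=lambda d: d["relevance"] * d["authority"], reverse=True)
    return [d["text"] for d in ranked[:k]]

def build_prompt(question, docs):
    """Assemble a grounded prompt from the top-ranked sources."""
    context = "\n".join(select_context(docs))
    return f"Answer using only these sources:\n{context}\n\nQuestion: {question}"
```

The point of the sketch: a highly relevant page from a low-authority source loses to a slightly less relevant page from a trusted one before the model ever sees either.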

OpenAI (and most likely other model developers as well) filter pre-training data by both quality and authority:

At the pre-training stage, we filtered our dataset mix for GPT-4 … and removed these documents from the pre-training set.3

ChatGPT’s monitor classifies sources and considers only authoritative pages as benign:

Benign behavior is defined as ‘Any authoritative resource a diligent human might consult.’4

My analysis of over 500,000 AI Overviews shows that the majority of citations point to highly authoritative and established sources.

But it’s not just AIOs. ChatGPT and other LLMs also reward comprehensive content: The top 10% of most visible content matches the ideal profile of high authority.

Paid subscribers: I’m releasing a topical authority workflow for you soon (anticipated next month!). Not a paid subscriber yet? Don’t miss this! Upgrade here.

Topical Authority Predictions For The Future Of SEO

As we’ve seen in the example of HubSpot and other sites, straying too far from your core topics is a serious SEO risk.

More context: https://surferseo.com/blog/hubspot-traffic-drop/

I call this “overclustering”: stretching into tangential topics and subtopics unrelated to your core offerings, which can dilute your brand. The next core update could cut a significant chunk of your traffic.

However, major authoritative brands will continue to dominate – possibly even for niche queries – despite not being an authority on every topic, thanks to entrenched domain-level trust. The most prominent examples are Forbes and LinkedIn.

A hidden opportunity exists in AI Overview citations, which sometimes surface smaller sites with strong topical authority on a very specific subtopic or a piece of content with a unique perspective, making it crucial to maintain deep coverage in your niche to get “picked” by AIO algorithms.

Human signals rebound: As AI content saturates the web, Google may place renewed emphasis on behavioral metrics (CTR, dwell time, return visits) to distinguish genuinely authoritative sources from AI‑built noise.

From the usability study I published last week, we know that humans prefer answers from other humans as a way to balance AI answers.

How To Approach Topical Authority As An SEO In A Volatile Search Landscape

I think, at the core, there are two questions you need to ask about your brand:

1. Credibility: Are we “credible” enough to target this topic? Do we already have enough depth, expertise, and context?

2. Growth: What’s our roadmap for expanding that authority over time, especially as AI‑generated content and LLM snippets flood the SERP?

As a new way of searching takes over (and as AI continues to flood the web with consensus content), search engines will lean harder on authentic, useful, and authoritative sources.

True topical authority isn’t about checking boxes. It’s about earning the perception of being an authority in a space from humans and algorithms alike.

Boost your skills with Growth Memo’s weekly expert insights. Subscribe for free!


1 Study shows that high Topical Authority leads to faster organic search visibility.

2 Understanding news topic authority

3 GPT-4 System Card

4 Introducing OpenAI o3 and o4-mini 


Featured Image: Paulo Bobita/Search Engine Journal

Stop Focusing On Google, It’s Time to Focus On Being Visible via @sejournal, @wburton27

Remember when “Just Google it” was the solution to all your search needs? Unfortunately, those days are changing fast.

While Google remains the king of search, the ground beneath its feet is shifting as brands, marketers, and end users are noticing that there are some new sheriffs in town.

The search ecosystem that puts all your eggs in the Google basket might not be a wise move anymore.

Today’s search landscape isn’t just about algorithm updates and being visible on Google. It’s about recognizing that your audience exists across multiple touchpoints: from traditional search engines like Bing and Google to AI chatbots, from social platforms to specialized marketplaces like Amazon.

The businesses that thrive won’t be the ones waiting to see what happens; they will be the pioneers already establishing a strong presence across this expanding universe of search.

Google Still Dominates, But It Is Being Challenged

Google is facing some competition.

StatCounter shows Google’s global search share dropped below 90% and remained there throughout the last quarter of 2024, marking the first such decline in nearly a decade.

This shift coincides with significant legal headwinds.

In 2025, Google faces multiple antitrust challenges, with a judge recently finding that Google has a monopoly in search and has acted to maintain it.

These legal troubles might cause Google to change its business practices and may have an impact on its market dominance, allowing other social and AI platforms to capture more of Google’s market share.

This does not mean that Google is going down; it just signifies that Google is no longer the only game in town, and therefore relying on Google only could be increasingly risky.

For example, if you’re an ecommerce retailer that generates 60-80% of your traffic from Google and your site’s visibility temporarily drops during a core update because of AI-generated content, you would be in big trouble.

If your marketing strategy does not have any alternative traffic sources, your revenue could potentially decrease by 40% or more in a matter of weeks.

Meanwhile, if your competitors have diversified their digital presence across multiple platforms, including AI shopping assistants and social commerce channels, they might experience only minor fluctuations in their traffic and sales.

It’s An Omnichannel World

Your audience does not think in terms of platforms; they think in terms of their needs.

For example, a user might ask ChatGPT for information on sustainable materials, browse Instagram for some home design inspiration, check Amazon for product comparisons, and then Google specific brands before making a purchase.

This changing customer journey means that businesses must be acutely aware of where their traffic originates and how much traffic comes from various sources.

The days of relying on and checking only your Google Analytics for Google traffic are over.

In order to succeed, you must have a holistic view of your visibility across the entire digital landscape.

For example, my friend Claudia has an outdated kitchen and is looking to get a new one after 20 years of living in her home with her family.

Here is what Claudia’s journey looked like in this new ecosystem:

  • Claudia started out by going to ChatGPT and typing in “best kitchen design brands,” and found some information mentioning several designer brands.
  • Since the intent behind kitchen design is image-based, Claudia then searches on Pinterest for visual inspiration, and saves some images from the designer brands that she found in ChatGPT.
  • Claudia then looks to Reddit to gather feedback about specific brands and learn from others’ experiences.
  • She checks YouTube for installation tutorials but decides she needs a professional.
  • Claudia then Googles local contractors with high ratings and reviews, contacts one of them, and gets a quote.
Screenshot from ChatGPT, April 2025

Now, if you’re a business that is only focused on Google, guess what? You would not win Claudia and other clients, because you would miss multiple touchpoints in their user journey as she searched across different channels and platforms. You must have content that reinforces your brand at every stage.

Don’t Fall Behind

The time is now to adopt an omnichannel strategy, stay ahead of trends, experiment with different platforms, and maintain a strong performance on established channels like Google so you won’t be left behind.

Imagine if the following scenarios were to occur; what would happen to your business?

  • A loss of 30% of your traffic overnight.
  • You don’t know where your customers spend time before they make purchase decisions.
  • You’re not visible on ChatGPT, Bing, YouTube, Reddit, etc.

One of the brands I consulted with in the financial industry noticed that questions about retirement planning were being asked on AI platforms as well as on Google.

We created a comprehensive, citation-rich content strategy that got them mentioned in some major financial publications.

When users searched for retirement planning in ChatGPT, their brand was mentioned as a source, which drove leads and conversions.

Screenshot from ChatGPT, April 2025

AI Works Differently Than Traditional Search

AI chatbots like ChatGPT don’t work like Google’s algorithm. They don’t rank websites; instead, they gather information and identify authoritative sources.

If you want to be visible in ChatGPT, then you need to change your approach.

  • Being a recognized name in your industry increases your chances of being mentioned.
  • Being featured across multiple platforms strengthens your authority and increases your visibility in AI chatbots.
  • Getting referenced by other respected sources helps build trust.
  • Have clear, conversational, and structured content that AI chatbots can reference and find.
  • Be active in social communities like Reddit.
  • Build trust and credibility through positive reviews and ratings.

ChatGPT and other AI chatbots and platforms look more broadly at the digital ecosystem and get information from Quora, Reddit, social media, forum conversations, and reviews.

AI chatbots also understand long-tail queries in a more nuanced way than traditional search.

It’s All About Balance

People are not running for the hills and abandoning Google.

Google remains the king and is likely to retain its market share leadership for the foreseeable future, but things are changing.

To succeed today, you must implement an omnichannel SEO strategy, maintain a strong Google presence, and be where your audience is.


Wrapping Up

Search engines like Google will continue to evolve, alongside ChatGPT, social platforms, and other search technologies that are expected to emerge in the coming years.

But, the days of relying only on Google as your primary digital marketing channel are behind us.

Brands that are discoverable, credible, and helpful will be successful, wherever their audience seeks information.

Brands that win in 2025 won’t be asking “How do we rank better on Google?” but rather “How do we ensure we are visible on every channel and have content that resonates and answers all our customers’ questions?”

This shift in perspective, from platform-centric to audience-centric, is the true key to sustainable digital success.



Featured Image: 3rdtimeluckystudio/Shutterstock

30-Year SEO Pro Shows How To Adapt To Google’s Zero-Click Search via @sejournal, @martinibuster

Search marketer Michael Bonfils recently discussed how AI is disrupting search marketing and shared insights into what he feels is an appropriate response to one of the most difficult search environments he’s seen in his thirty years of experience.

Michael Bonfils (LinkedIn profile) has worked in digital marketing since virtually the dawn of it all, well before Google even existed. He’s a leading international digital marketer with experience across every aspect of digital marketing, from on-page SEO to digital advertising. Michael joined Gianluca Fiorelli (LinkedIn profile) on the Advanced Web Ranking podcast and shared his insights on the challenges AI is bringing to digital marketing and novel ideas for how to navigate them.

Brutal Environment For Digital Marketing

Gianluca mentioned there’s a perception gap with AI where on one side are marketers who are heralding the end of SEO and PPC and on the other side are the “AI bros” who cheerlead that everything is going to become even better, with better leads from ChatGPT, etc.

Michael shook his head and said:

“It’s neither going to be a disaster and it’s neither going to be an AI paradise.”

Gianluca asked him what trends he’s seeing. Michael responded that click volume has gone down since the introduction of AI. He said that in other periods when volume was down, click-through rates went up, as during the pandemic. But that’s not happening now: Click-through rates are down, volume is down, but cost per clicks are at historic highs.

Michael observed,

“But now, …the level we’re at now is the worst time since 2019 during the pandemic and prior to that it was never that bad.

…If you want throw the CPC factor in, the CPC’s are historically higher than they have been for years. So now we’ve got this perfect problem, click through rates down, volume down, CPC’s up. What does that mean? ROI is getting hit and clients are leaning on organic to try to make up for whatever shortfall there is and they can’t find it, they can’t find the traffic.

So to answer your question, …now that we’re going into Europe with AI overviews, are they impacting things? One hundred percent. And they’ll continue to change.”

Later on they discussed how a lot of what Google is doing is reactionary, a response to external pressures from companies like Perplexity AI and OpenAI, and the search industry is caught in the middle of it.

AI Overviews Leads To Loss Of Strategic Data

Michael Bonfils discussed how AI Overviews lead to zero-click behavior, and while most SEOs stop right there, he pointed out that this situation affects the data available to marketers and, as a consequence, impacts content strategy.

Use IndexNow For AI Search And Shopping SEO via @sejournal, @martinibuster

Microsoft Bing published an announcement stating that the IndexNow search crawling technology is a powerful way for ecommerce companies to surface the latest and most accurate shopping-related information in AI Search and search engine shopping features.

Generative Search Requires Timely Shopping Information

Ecommerce sites typically depend on merchant feeds, search engine crawling and updates to Schema.org structured data to communicate what’s for sale, new products, retired products, changes to prices, availability and other important features. Each of those methods can be a point of failure due to slow crawling by search engines and inconsistent updating which can delay the correct information from surfacing in AI search and shopping features.

IndexNow solves that problem. Content platforms like Wix, Duda, Shopify, and WooCommerce support IndexNow, a Microsoft technology that speeds up the indexing of new or updated content. Pairing IndexNow with Schema.org ensures fast indexing so that the correct information surfaces in AI Search and shopping features.
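A minimal submission sketch, per the IndexNow protocol (you host your verification key as a text file at https://&lt;host&gt;/&lt;key&gt;.txt; the host, key, and URLs below are placeholders):

```python
import json
import urllib.request

def build_indexnow_payload(host, key, urls):
    """JSON body for the IndexNow endpoint: the site host, your
    verification key, and the list of changed URLs."""
    return {"host": host, "key": key, "urlList": urls}

def submit_to_indexnow(host, key, urls):
    """POST changed URLs to IndexNow so Bing and other participating
    engines re-crawl them quickly. Returns the HTTP status (200/202 = accepted)."""
    data = json.dumps(build_indexnow_payload(host, key, urls)).encode("utf-8")
    req = urllib.request.Request(
        "https://api.indexnow.org/indexnow",
        data=data,
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Call submit_to_indexnow whenever a product’s price, availability, or description changes, so engines don’t have to wait for a scheduled crawl.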

IndexNow recommends the following Schema.org Product Type properties:

  • “title (name in JSON-LD)
  • description
  • price (list/retail price)
  • link (product landing page URL)
  • image link (image in JSON-LD)
  • shipping (especially important for Germany and Austria)
  • id (a unique identifier for the product)
  • brand
  • gtin
  • mpn
  • datePublished
  • dateModified
  • Optional fields to further enhance context and classification:
      • category (helps group products for search and shopping platforms)
      • seller (recommended for marketplaces or resellers)
      • itemCondition (e.g., NewCondition, UsedCondition)”
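For reference, here’s what those properties can look like as Product structured data, built as JSON-LD from Python. All the values are hypothetical examples, not a guaranteed-complete markup:

```python
import json

# Hypothetical product, mapped to Schema.org Product properties
# (the feed's "title" becomes "name", "link" becomes the offer URL).
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Shoe",
    "description": "Lightweight trail running shoe.",
    "image": "https://www.example.com/img/shoe.jpg",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "gtin": "00012345678905",
    "mpn": "TRAIL-100",
    "offers": {
        "@type": "Offer",
        "url": "https://www.example.com/products/trail-shoe",
        "price": "89.99",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
        "itemCondition": "https://schema.org/NewCondition",
    },
}

print(json.dumps(product_jsonld, indent=2))
```

Embedding this in a &lt;script type="application/ld+json"&gt; tag on the product page gives crawlers and IndexNow-triggered recrawls a machine-readable version of the listing.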

Read more at Microsoft Bing’s Blog:

IndexNow Enables Faster and More Reliable Updates for Shopping and Ads

Featured Image by Shutterstock/Paper piper

From Search To Discovery: Why SEO Must Evolve Beyond The SERP via @sejournal, @alexmoss

The search landscape is undergoing its biggest shift in a generation.

If you’ve been in SEO long enough to remember the glory days of the all-organic search engine results pages (SERP), you’ll know how much of this real estate has been gradually taken over by paid ads, other first-party products, and rich snippets.

Now, the most aggressive transition of all: AI Overviews (as well as search-based large language model platforms).

At BrightonSEO last month, I explored how this evolution is forcing us to rethink what SEO means and why discoverability, not just ranking, is the new north star.

The “Dawn” Of The Zero-Click Isn’t Just Over – It’s Now Assumed

We’ve been reading about the rise of zero-click searches for some time now, but this “takeover” has been much more noticeable over the past 12 months.

I recently searched [how to teach my child to tell the time], and after scrolling through a parade of paid product ads, Google-owned assets, and the AI Overview summaries, I scrolled a good three pages down the SERP.

Google and other search and discovery platforms want to keep users in their ecosystems. For SEO pros, this means traditional metrics such as click-through rate (CTR) are becoming less valuable by the day.

From Answer Engines To Assistant Engines

LLMs have changed not just the way a result is displayed to the user, but also the traditional, browser-born search flow, turning it into a multi-step flow that the native SERP simply cannot support in the same way.

The research process is collapsing into a single, seamless exchange.

Traditional flow vs. multi-step flow. Image used with permission from Alain Schlesser, May 2025

But as technology accelerates, our own curiosity and research skills are at risk of declining or disappearing completely.

Assistant engines and wider LLMs are the new gatekeepers between our content and the person discovering that content – our potential “new audience.”

They parse, consume, understand, and then synthesize content – and that synthesis is the deciding factor in what they mention to the people and agents they interact with.

Structured data is still crucial, as context, transparency, and sentiment matter more than ever.

Personal LLM agent flow diagram by Alain Schlesser, used with permission, May 2025

Challenges Are Different, But Also The Same

As SEOs, our challenges with this new behavior affect the way we do – and report on – our jobs.

In reality, many are just old headaches in shiny new wrappers:

  • Attribution is a mess: With AI Overviews and LLMs synthesizing content, it’s harder than ever to see where your traffic comes from – or if you’re getting any at all. Some tools do monitor this, but it’s still early days and no standard has emerged. Even Google has said it has no plans to add AIO insights within Search Console.
  • Traffic is fragmenting (again): We saw this with social media platforms at the beginning, where discovery happened outside the organic SERPs. Discovery is now happening everywhere, all at once. With attribution also harder to ascertain, this is a bigger challenge today.
  • Budgets are under scrutiny from fear, uncertainty, and doubt (FUD): The native SERP is changing so much that some stakeholders may assume there’s less (or no) value in doing SEO anymore (untrue!).

The Shift Of Success Metrics

The days of our current, vanity-led success metrics are coming to an end.

Similar to how our challenges are the same but different, this also applies to how we redefine success metrics:

Old Hat → New Hat

  • Content → Context + sentiment
  • Keywords → Intent
  • Brand → Brand + sentiment
  • Rankings → Mentions
  • Links from external sources → Citations across various channels
  • SERP monopoly → Share of voice
  • E-E-A-T → Still E-E-A-T
  • Structured data → Entities, knowledge graph & vector embeds
  • Answering → Assisting

What Can You Do About It?

Information can be aggregated, but personality can’t. This is why it’s still our responsibility to help “assist the assistant” to consider and include you as part of that aggregated information and synthesized answer.

  • Stick to the fundamentals: Never neglect SEO 101.
  • Manage third-party perspective: Third-party perspectives are increasingly important, so ensure they are maintained and managed well to preserve positive brand sentiment.
  • Embrace structured data: Even if some say it’s becoming less crucial for LLMs to understand entities, structured data is being used right now inside major LLMs to produce structured output within responses, giving them an established and standardized way to understand your content.
  • Educate stakeholders: Shift the conversation from rankings and clicks to discoverability and brand presence. The branded unlinked mention suddenly has more value than “acquiring X followed, non-branded anchor text links per month.”
  • Experiment with your content: Try new ways to produce and market your content beyond the traditional word. Here, video is useful not only for humans but also for LLMs, which are now “watching” and understanding videos to inform their responses.
  • Create helpful, unique content: To add to the above, don’t produce for the sake of production.
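
As an example of the structured-data point above, a minimal JSON-LD snippet for an organization might look like the sketch below. The name and URLs are placeholders, not taken from the article:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Brand",
  "url": "https://www.example.com",
  "sameAs": [
    "https://www.linkedin.com/company/example-brand"
  ]
}
```

Embedding a block like this in a `<script type="application/ld+json">` tag gives both classic search engines and LLM-based systems a standardized, machine-readable statement of who the entity is.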

LLMs.txt: The Potential To Be The New Standard

Keep an eye on emerging standards proposals such as llms.txt, one way some sites are adapting how LLMs ingest their content, beyond the traditional approaches offered by robots.txt and XML sitemaps.

While some are skeptical about this standard, I believe it is still worth implementing now, even if its true benefits only become clear in the future.

There is virtually no risk in implementing something that doesn’t take much time or resources to produce, so long as you do so with a white hat approach.
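
For reference, the llms.txt proposal is a plain markdown file served at the site root (`/llms.txt`): an H1 title, a blockquote summary, and H2 sections containing link lists. A hypothetical sketch, with all names and paths invented, might look like:

```markdown
# Example Brand

> A short, plain-language summary of what the site offers and who it serves.

## Docs

- [Getting started](https://www.example.com/docs/start.md): Setup guide
- [FAQ](https://www.example.com/docs/faq.md): Common questions

## Optional

- [Changelog](https://www.example.com/changelog.md)
```

Because it is just a static text file, producing and serving one costs almost nothing, which is the low-risk argument made above.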

Conclusion: Embrace Discoverability And New Metrics

SEO isn’t dead. It’s expanding, but at a rate we haven’t experienced before.

Discoverability is the new go-to success metric, but it’s not without flaws, especially as the way we search continues to change.

This is no longer about “ranking well.” This is now about being understood, surfaced, trusted, and discovered across every platform and assistant that matters.

Embrace and adapt to the changes, as they’re going to continue for some time.

Featured Image: PeopleImages.com – Yuri A/Shutterstock

Googler’s Deposition Offers View Of Google’s Ranking Systems via @sejournal, @martinibuster

A Google engineer’s redacted testimony, published online by the U.S. Justice Department, offers a look inside Google’s ranking systems: it gives an idea of Google’s quality scores and introduces a mysterious popularity signal that uses Chrome data.

The document offers a high level and very general view of ranking signals, providing a sense of what the algorithms do but not the specifics.

Hand-Crafted Signals

For example, it begins with a section about the “hand crafting” of signals, which describes the general process of taking data from quality raters, clicks, and so on, and applying mathematical and statistical formulas to generate a ranking score from three kinds of signals. “Hand-crafted” means scaled algorithms that are tuned by search engineers; it doesn’t mean that websites are manually ranked.

Google’s ABC Signals

The DOJ document lists three kinds of signals that are referred to as ABC Signals and correspond to the following:

  • A – Anchors (pages linking to the target pages).
  • B – Body (search query terms in the document).
  • C – Clicks (user dwell time before returning to the SERP).

The statement about the ABC signals is a generalization of one part of the ranking process. Ranking search results is far more complex and involves hundreds if not thousands of additional algorithms at every step of the ranking process, from indexing, link analysis, anti-spam processes, personalization, re-ranking, and other processes. For example, Liz Reid has discussed Core Topicality Systems as part of the ranking algorithm and Martin Splitt has discussed annotations as a part of understanding web pages.

This is what the document says about the ABC signals:

“ABC signals are the key components of topicality (or a base score), which is Google’s determination of how the document is relevant to the query.

T* (Topicality) effectively combines (at least) these three signals in a relatively hand-crafted way. Google uses to judge the relevance of the document based on the query terms.”
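
The “relatively hand-crafted” combination described in the quote could be imagined as something like the sketch below. This is purely illustrative: the linear form, the weights, and the normalization are invented for demonstration and are not Google’s actual formula.

```python
# Illustrative only: a hand-crafted combination of A/B/C signals into a
# topicality base score. Weights and the linear form are invented.

def topicality(anchors: float, body: float, clicks: float,
               w_a: float = 0.3, w_b: float = 0.5, w_c: float = 0.2) -> float:
    """Combine normalized A/B/C signals (each in [0, 1]) into a base score."""
    for signal in (anchors, body, clicks):
        assert 0.0 <= signal <= 1.0, "signals assumed pre-normalized"
    return w_a * anchors + w_b * body + w_c * clicks

score = topicality(anchors=0.8, body=0.9, clicks=0.6)
```

The appeal of a hand-crafted form like this, per the deposition, is transparency: if rankings break, an engineer can inspect each weighted term and see which signal moved.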

The document offers an idea of the complexity of ranking web pages:

“Ranking development (especially topicality) involves solving many complex mathematical problems. For topicality, there might be a team of engineers working continuously on these hard problems within a given project.

The reason why the vast majority of signals are hand-crafted is that if anything breaks Google knows what to fix. Google wants their signals to be fully transparent so they can trouble-shoot them and improve upon them.”

The document compares their hand-crafted approach to Microsoft’s automated approach, saying that when something breaks at Bing it’s far more difficult to troubleshoot than it is with Google’s approach.

Interplay Between Page Quality And Relevance

An interesting point revealed by the search engineer is that page quality is independent of the query. If a page is determined to be high quality and trustworthy, it is regarded as trustworthy across all related queries; that is what is meant by the word static: the score is not dynamically recalculated for each query. However, there are relevance-related signals in the query that are used to calculate the final rankings, which shows how relevance plays a decisive role in determining what gets ranked.

This is what they said:

“Quality
Generally static across multiple queries and not connected to a specific query.

However, in some cases Quality signal incorporates information from the query in addition to the static signal. For example, a site may have high quality but general information so a query interpreted as seeking very narrow/technical information may be used to direct to a quality site that is more technical.

Q* (page quality (i.e., the notion of trustworthiness)) is incredibly important. If competitors see the logs, then they have a notion of “authority” for a given site.

Quality score is hugely important even today. Page quality is something people complain about the most…”

AI Gives Cause For Complaints Against Google

The engineer states that people complain about quality, and that AI makes the situation worse.

He says about page quality:

“Nowadays, people still complain about the quality and AI makes it worse.

This was and continues to be a lot of work but could be easily reverse engineered because Q is largely static and largely related to the site rather than the query.”

eDeepRank – A Way To Understand LLM Rankings

The Googler lists other ranking signals, including one called eDeepRank, an LLM-based system that uses BERT, a language model.

He explains:

“eDeepRank is an LLM system that uses BERT, transformers. Essentially, eDeepRank tries to take LLM-based signals and decompose them into components to make them more transparent. “

That part about decomposing LLM signals into components seems to be a reference to making LLM-based ranking signals more transparent, so that search engineers can understand why the LLM is ranking something.

PageRank Linked To Distance Ranking Algorithms

PageRank is Google’s original ranking innovation, and it has since been updated. I wrote about this kind of algorithm six years ago. Link-distance algorithms start with a seed set of authoritative websites in a given topic and calculate the distance from those seeds to other websites in the same topic. Sites closer to the seed set are likelier to be authoritative and trustworthy, while sites further from their respective seeds are determined to be less trustworthy.

This is what the Googler said about PageRank:

“PageRank. This is a single signal relating to distance from a known good source, and it is used as an input to the Quality score.”

Read about this kind of link ranking algorithm: Link Distance Ranking Algorithms
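
A minimal sketch of the link-distance idea described above, assuming a toy link graph, a hand-picked seed set, and an invented decay factor (the real systems are far more elaborate):

```python
from collections import deque

# Illustrative sketch: score decays with the shortest link distance from a
# trusted seed set. The graph, seeds, and decay factor are all invented.
def distance_scores(graph, seeds, decay=0.5):
    """graph: {site: [sites it links to]}; returns {site: score in (0, 1]}."""
    dist = {seed: 0 for seed in seeds}
    queue = deque(seeds)
    while queue:  # breadth-first search from the seed set
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return {site: decay ** d for site, d in dist.items()}

graph = {
    "seed.example": ["a.example"],
    "a.example": ["b.example"],
}
scores = distance_scores(graph, seeds=["seed.example"])
# seed.example scores 1.0, a.example 0.5, b.example 0.25; sites not
# reachable from any seed get no score at all.
```

The key property matches the description in the article: trust flows outward from the seed set, and each additional link hop away reduces the score.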

Cryptic Chrome-Based Popularity Signal

There is another signal whose name is redacted that’s related to popularity.

Here’s the cryptic description:

“[redacted] (popularity) signal that uses Chrome data.”

A plausible claim can be made that this confirms that the Chrome API leak is about actual ranking factors. However, many SEOs, myself included, believe that those APIs are developer-facing tools used by Chrome to show performance metrics like Core Web Vitals within the Chrome Dev Tools interface.

I suspect that this is a reference to a popularity signal that we might not know about.

The Google engineer does refer to another leak of documents that names certain “components of Google’s ranking system,” but says those documents don’t contain enough information to reverse engineer the algorithm.

They explain:

“There was a leak of Google documents which named certain components of Google’s ranking system, but the documents don’t go into specifics of the curves and thresholds.

For example
The documents alone do not give you enough details to figure it out, but the data likely does.”

Takeaway

The newly released document summarizes a U.S. Justice Department deposition of a Google engineer that offers a general outline of parts of Google’s search ranking systems. It discusses hand-crafted signal design, the role of static page quality scores, and a mysterious popularity signal derived from Chrome data.

It provides a rare look into how signals like topicality, trustworthiness, click behavior, and LLM-based transparency are engineered and offers a different perspective on how Google ranks websites.

Featured Image by Shutterstock/fran_kie

How Referral Traffic Undermines Long-Term Brand Growth via @sejournal, @martinibuster

Mordy Oberstein, a search marketing professional whom I hold in high esteem, recently shared the provocative idea that referral traffic is not a brand’s friend and that every brand, as it matures, should wean itself from it. Referrals from other websites are generally considered a sign of a high-performing business, but relying on them is not a long-term strategy because it depends on sources that cannot be controlled.

Referral Traffic Is Necessary But…

Mordy Oberstein (LinkedIn profile), formerly of Wix, asserted in a Facebook post that relying on a traffic source, whether that’s another website or a search engine, offers a degree of vulnerability to maintaining steady traffic and performance.

He broke it down as a two-fold weakness:

  • Relying on the other site to keep featuring your brand.
  • Relying on Google to keep ranking that other site, which in turn sends visitors to your brand.

The flow of traffic can stop at either of those two points, which is a hidden weakness that can affect the long-term sustainability of healthy traffic and sales.

Mordy explained:

“It’s a double vulnerability…

1) Relying on being featured by the website (the traffic source)
2) Relying on Google to give that website …traffic (the channel)

There are two levels of exposure & vulnerability.

As your brand matures, you want to own your own narrative.

More referral traffic is not your friend. It’s why, as a brand matures, it should wean off of it.

Full disclosure, this is my opinion. I am sure a lot of people will disagree.”

Becoming A Destination

I’ve always favored promoting a site in a way that helps it become synonymous with a given topic, because that’s how to make it a default destination and encourage the kinds of signals that Google interprets as authoritative. I’ve done things like creating hats with logos to give away, annual product giveaways, and other promotional activities, both online and offline. While my competition was doing SEO busy work, I created fans. Promoting a site is basically just getting it in front of people, both online and offline.

Brand Authority Is An Excuse, Not A Goal

Some SEOs believe in a concept called Brand Authority, which is a misleading explanation for why a website ranks. The term Brand Authority is not about branding, and it’s not about authoritativeness, either. It’s just an excuse for why a site is top-ranked.

The phrase Brand Authority has its roots in PageRank. Big-brand websites used to have a PageRank of 9 out of 10, or even 10/10, which enabled them to rank for virtually any keywords they wanted. A link from one of those sites practically guaranteed a top-ten ranking. But around 2004 or so, Google ended the outsized influence of PageRank because it resulted in less relevant results; that was about the time Google started using Navboost, a ranking signal that essentially measures how people feel about a site, which is what PageRank does, too.

This insight, that Google uses signals about how people feel about a site, is important because the feelings people have for a business are what being a brand is all about.

Marty Neumeier, a thought leader on how to promote companies (author of The Brand Gap) explained what being a brand is all about:

“Instead of creating the brand first, the company creates customers (through products and social media), the customers build the brand (through purchases and advocacy), and the customer-built brand sustains the company (through “tribal” loyalty). This model takes into account a profound and counterintuitive truth: a brand is not owned by the company, but by the customers who draw meaning from it. Your brand isn’t what you say it is. It’s what they say it is.”

Neumeier also explains how brand is about customer feelings:

“The best brands are vivid. They create clear mental pictures and powerful feelings in the minds and hearts of customers. They’re brought to life through their touchpoints, the places where customers experience them, from the first exposure to a brand’s name, to buying the product, to eventually making it part of who they are.”

That “tribal loyalty” is the kind of thing Google tries to measure. So when Danny Sullivan talks about differentiating your site to make it like a brand, he is not referring to so-called “brand authority.” He is talking about doing the kinds of things that influence people to feel positive about a site.

Getting Back To Mordy Oberstein

It seems to me that what he’s saying is that referral traffic is a stepping stone, a means to an end. It’s not the goal; the goal is becoming a destination.

On the other side of that process, I think it’s important to maintain relevance with potential site visitors and customers, especially today with the rapid pace of innovation, generational change, new inventions, and new product models. Relevance to people has been a Google ranking signal for a long time, beginning with PageRank, then with additional signals like Navboost.

The factor the SEO industry has largely missed is getting people to think positive thoughts about your site and your business, enough to share with other people.

Mordy’s insight about traffic is beautiful and elegant.

Read Mordy’s entire post on Facebook.

Featured Image by Shutterstock/Yunus Praditya

Google Clarifies: AI Overview Links Share Single Position In Search Console via @sejournal, @MattGSouthern

Google’s John Mueller has clarified that all links within AI Overviews (AIOs) share a single position in Google Search Console.

SEO consultant Gianluca Fiorelli asked Mueller how Search Console tracks position data for URLs in Google’s AI-generated answer boxes.

Mueller referenced Google’s official help docs, explaining:

“Basically an AIO counts as a block, so it’s all one position. It can be first position, if the block is shown first, but I don’t know if AIO is always shown first.”

This indicates that every website linked in an AI Overview receives the same position value in Search Console reports.

This occurs regardless of where the link appears in the overview panel, whether immediately visible or hidden until a user expands the box.

What Google’s Documentation Says

Google’s Search Console Help docs explain how AI Overview metrics work:

  • Position: “An AI Overview occupies a single position in search results, and all links in the AI Overview are assigned that same position.”
  • Clicks: “Clicking a link to an external page in the AI Overview counts as a click.”
  • Impressions: “Standard impression rules apply. To be counted as an impression, the link must be scrolled or expanded into view.”

The docs also note:

“Search Console doesn’t include data from experiments in Search Labs, as these experiments are still in active development.”

The Missing Data Behind Google’s Click Claims

This discussion highlights an ongoing debate in the SEO community regarding the performance of links in AI Overviews.

Lily Ray, Vice President of SEO Strategy & Research at Amsive, recently pointed out Google’s year-old claim that websites receive more clicks when featured in AI Overviews, stating:

“I would love to see a single GSC report that confirms this statement, because every study so far has shown the opposite.”

Ray’s statement reflects the concerns of many SEO professionals, as Google has not provided data to support its claims.

Looking Ahead

While we now understand how position metrics are recorded, the question remains: Do AI Overview placements drive more or less traffic than traditional search listings?

Google claims one thing, but many people report different experiences.

Since all AIO links share the same position, it’s difficult to determine which specific placements perform better.

This debate highlights the need for more precise data about how AIOs affect website traffic compared to regular search results.


Featured Image: Roman Samborskyi/Shutterstock

The Rise Of Privacy-First Search Engines via @sejournal, @TaylorDanRW

Google has long held a firm grip on the search engine landscape, but that dominant veneer is starting to show cracks.

In recent months, regulatory scrutiny, public mistrust, and rising anxiety around AI have pushed digital privacy into the spotlight.

Millions of users are now evaluating their relationship with “big tech” and actively seeking alternatives, prioritizing trust and anonymity.

What was once relegated to being a niche concern is now a broader user shift, with privacy-first search engines gaining momentum across various demographics.

The Privacy Shift

Recent stats clearly show that people are becoming more privacy-aware and want greater control.

Norton reports that 85% of users globally want tighter reins on their data.

In the U.S., over 87% of voters back restrictions on the sale of personal data without consent, while 86% support limits on what companies can collect in the first place.

That awareness is turning into action.

A 2024 study found that 51% of users between 18 and 24 actively take steps to protect their digital footprint. That shift is showing up in how people search, in both platform choices and behavior.

DuckDuckGo, Brave, And The “Privacy Engine” Movement

DuckDuckGo is at the forefront of this change. Since its launch in 2008, it’s grown into a major player with over 100 million daily searches.

Brave Search, integrated into the privacy-focused Brave browser, is also gaining ground. Built on an index from its own crawler and a number of “crowd-sourced” sources, it is committed to ad-free, unbiased results.

Brave reflects the demand for tools that serve users rather than advertisers.

These platforms highlight a growing appetite for search options among a growing user base that rejects surveillance and upholds user agency.

The Rise Of New Privacy Engines

Awareness around data tracking has driven more users to seek out search engines that don’t rely on surveillance-based business models.

Traditional engines like Google and Bing have come under fire for harvesting user data to fuel targeted advertising.

In contrast, privacy-first search engines are gaining traction by rejecting tracking, behavioral profiling, and data retention, offering users more control and transparency over how their search activity is handled.

While DuckDuckGo is the front-runner when it comes to privacy-focused search engines, there are a number of players in this category. To better understand them, I reached out to their teams to dig deeper than the information just found online.

Swisscows

Image from author, May 2025

One rising contender is Swisscows, a Switzerland-based engine that recently marked its 10-year milestone.

It’s more than a search engine; it’s a whole ecosystem with encrypted messaging, secure cloud storage, VPN services, and an AI-powered summary tool focused on keeping user data private.

With roughly 25 million searches per month and a user base spanning Switzerland, the U.S., and Germany, Swisscows stands out for filtering out adult and violent content, making it popular among educators and families.

Its results come from its own index and Brave, chosen for their privacy-first approach.

“We don’t personalize or profile users,” the team told me. “That means more neutral, manipulation-free search results.”

Swisscows is also investing in semantic search and AI, aiming not to build chatbots but to improve information discovery and trend insights, hinting at a more ethical path for AI in search.

Startpage

Another major player is Startpage, which operates out of the Netherlands. The company has also rolled out a private browsing app, handling billions of searches yearly.

Startpage also doesn’t engage in user profiling. That means no tracking, no cookies by default, and no storing of IP addresses.

Users get results sourced from Google and Bing, but do not have the data collection that typically comes with them.

“People are simply done with being watched,” said the Startpage team. “As AI becomes more embedded in search, the demand for privacy is only increasing. Trust depends on clear policies and a commitment to not compromise user rights.”

Mojeek

Then there’s Mojeek, an independent engine with its own indexing and server infrastructure.

Unlike privacy-conscious tools that piggyback off bigger indexes, Mojeek runs its stack out of one of the UK’s most sustainable data centers.

By 2022, its index had hit 6 billion pages, a sizable feat for a standalone engine.

Mojeek doesn’t store search histories, use cookies, or track users. It delivers the same results to everyone, providing a transparent alternative to mainstream engines’ personalization-heavy approaches.

It’s also the default choice on several privacy-oriented browsers, like Privacy Browser, and is integrated into Pale Moon, SerenityOS, and Kagi Search.

What’s Fueling The Shift?

This movement isn’t just about escaping ads or dodging trackers; it’s about reclaiming control.

AI-driven tools like ChatGPT, Google’s AI Overviews, and Bing AI are reshaping search by relying more on user data than ever.

As AI becomes more integrated into search engines, privacy becomes a central point of differentiation.

At the same time, regulatory pressure is intensifying. Governments are pushing back on unchecked data use, from the GDPR and the Digital Services Act in Europe to the proposed American Privacy Rights Act.

By the end of 2024, modern data protection laws were expected to cover three-quarters of the global population, reflecting a worldwide demand for stricter safeguards.

Optimizing For Privacy Search Engines

To optimize for privacy-first search engines like Swisscows and Startpage, marketers need to rethink their strategies.

Standard SEO tactics that depend heavily on tracking user behavior don’t hold up well when personalization is limited.

Instead, the focus shifts to a deeper understanding of the audience, what questions they’re asking, how they phrase them, and the intent behind their searches.

Creating content that directly answers real user needs, keeping the site structure intuitive, and using language that clearly reflects search intent have become the central focus.

Without behavioral tracking, insight must come from sources like on-site search data, user reviews, forum conversations, and direct feedback.
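
One of those tracking-free sources, on-site search data, can be mined with nothing more than your own query logs. A minimal sketch (the log lines below are invented for illustration):

```python
from collections import Counter

# Hypothetical on-site search log: one raw query per entry. In practice
# you would read these from your own site-search analytics export.
site_search_log = [
    "private search engine for kids",
    "delete search history",
    "private search engine for kids",
    "block trackers in browser",
]

# Count query frequency to surface the intents users express most often.
top_queries = Counter(site_search_log).most_common(2)
for query, count in top_queries:
    print(f"{count}x {query}")
```

Even this crude frequency count reveals what visitors are asking for in their own words, which is exactly the intent signal that behavioral tracking used to provide.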

In this space, winning in SEO means less about gaming the system and more about delivering practical, trustworthy information in a straightforward way.

The Future Of Search Is Changing

Traditional search engines are increasingly wrapped up in advertising and AI, while privacy-first options are emerging as both safer and more ethical alternatives.

Whether it’s Swisscows with its commitment to content integrity or Startpage delivering Google-quality results without the tracking, these platforms represent a new direction shaped by more informed, privacy-conscious users.

Featured Image: Thapana_Studio/Shutterstock

Local SEO: How To Make More Customers Click, Choose & Walk Through Your Doors [Webinar] via @sejournal, @hethr_campbell

How do you turn local searches into real foot traffic?

If your business relies on being found locally, clicks alone aren’t enough. You need future customers to choose you and show up.

Whether you’re managing search visibility, local listings, or digital customer experience, this session will help you turn more searches into measurable visits and offline conversions.

Join us for “Local SEO: How To Make More Customers Click, Choose & Walk Through Your Doors” on Wednesday, May 28 at 2 PM ET. In this session, we’ll explore real consumer behavior and how it shapes your local SEO strategy.

Why This Webinar Is Worth Your Time:

Based on consumer research from over 2,000 individuals across the UK, US, France, and Germany, this session will give you a clear picture of what makes people take action.

In this session, you’ll learn: 

✅ What gets consumers to choose one business over another.
✅ Actionable tips to optimize local SEO strategies across Google, Apple, voice search, AI tools & more.
✅ How to improve visibility, clarity, and trust across every location you manage.
✅ Digital signals that matter most to consumers.

Presented by Krystal Taing (VP) and Paul Modaley (Content Marketing Manager) at Uberall, this event is built for businesses that want to capture more high-intent traffic and convert it into real-world outcomes across any industry.

What Makes This Session Different:

You won’t hear guesses or theories. 

You’ll walk away with real data and proven strategies based on how people search, decide, and shop in your area.

Let’s help you drive results for your local and multi-location brick-and-mortar businesses.

Can’t make it live? Sign up anyway, and we’ll send the full recording to your inbox.