Google Says You Don’t Need AEO Or GEO To Rank In AI Overviews via @sejournal, @martinibuster

Google’s Gary Illyes confirmed that AI Search does not require specialized optimization, saying that “AI SEO” is not necessary and that standard SEO is all that is needed for both AI Overviews and AI Mode.

AI Search Is Everywhere

Standard search, in the way it used to be with link algorithms playing a strong role, no longer exists. AI is embedded within every step of the organic search results, from crawling to indexing and ranking. AI has been a part of Google Search for ten years, beginning with RankBrain and expanding from there.

Google’s Gary Illyes made it clear that AI is embedded within every step of today’s search ranking process.

Kenichi Suzuki (LinkedIn Profile) posted a detailed summary of what Illyes discussed, covering four main points:

  1. AI Search Optimization = SEO: AI Search features use the same infrastructure as traditional search
  2. Google’s focus is on content quality and is agnostic as to how it was created
  3. AI is deeply embedded into every stage of search
  4. Generative AI has unique features to ensure reliability

There’s No Need For AEO Or GEO

The SEO community has tried to wrap their minds around AI search, with some insisting that ranking in AI search requires an approach to optimization so distinct from SEO that it warrants its own acronym. Other SEOs, including an SEO rockstar, have insisted that optimizing for AI search is fundamentally the same as standard search. I’m not saying that one group of SEOs is right and another is wrong. The SEO community collectively discussing a topic and reaching different conclusions is one of the few things that doesn’t change in search marketing.

According to Google, ranking in AI Overviews and AI Mode requires only standard SEO practices.

Suzuki shared why AI search doesn’t require different optimization strategies:

“Their core message is that new AI-powered features like AI Overviews and AI Mode are built upon the same fundamental processes as traditional search. They utilize the same crawler (Googlebot), the same core index, and are influenced by the same ranking systems.

They repeatedly emphasized this with the phrase “same as above” to signal that a separate, distinct strategy for “AI SEO” is unnecessary. The foundation of creating high-quality, helpful content remains the primary focus.”

Content Quality Is Not About How It’s Created

The second point that Google made was that their systems are tuned to identify content quality and that identifying whether the content was created by a human or AI is not part of that quality assessment.

Gary Illyes is quoted as saying:

“We are not trying to differentiate based on origin.”

According to Kenichi, the objective is to:

“…identify and reward high-quality, helpful, and reliable content, regardless of whether it was created by a human or with the assistance of AI.”

AI Is Embedded Within Every Stage Of Search

The third point that Google emphasized is that AI plays a role at every stage of search: crawling, indexing, and ranking.

Regarding the ranking part, Suzuki wrote:

“RankBrain helps interpret novel queries, while the Multitask Unified Model (MUM) understands information across various formats (text, images, video) and 75 different languages.”

Unique Processes Of Generative AI Features

The fourth point that Google emphasized is that AI Overviews does two things at the ranking stage that standard search does not:

  1. Query Fan-Out
    Generates multiple related queries in order to provide deeper answers to the original query.
  2. Grounding
    Checks the generated answers against online sources to make sure that they are factually accurate.

Suzuki explains:

“It then uses a process called “grounding” to check the generated text against the information in its search index, a crucial step designed to verify facts and reduce the risk of AI ‘hallucinations.’”
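
To make these two concepts concrete, here is a toy sketch of how fan-out and grounding fit together. It is purely illustrative: Google has not published its implementation, and every name and data structure below is invented for the example.

```python
# Toy illustration of query fan-out and grounding; not Google's implementation.

INDEX = {  # stand-in for a search index: URL -> page text
    "https://example.com/espresso": "Espresso is brewed at roughly 9 bars of pressure.",
    "https://example.com/grind": "A fine grind is recommended for espresso machines.",
}

def fan_out(query: str) -> list[str]:
    """Expand one query into several related sub-queries (hardcoded here)."""
    return [query, query + " pressure", query + " grind size"]

def search(query: str) -> list[str]:
    """Naive retrieval: return pages sharing any word with the query."""
    words = set(query.lower().split())
    return [url for url, text in INDEX.items() if words & set(text.lower().split())]

def grounded(claim: str, sources: list[str]) -> bool:
    """Naive grounding check: keep a claim only if an indexed source supports it."""
    return any(claim.lower() in INDEX[url].lower() for url in sources)

sources = sorted({url for q in fan_out("espresso") for url in search(q)})
draft_claims = [
    "espresso is brewed at roughly 9 bars of pressure",  # supported by the index
    "espresso was invented in 1850",                     # unsupported: gets dropped
]
print([claim for claim in draft_claims if grounded(claim, sources)])
```

The real systems use large models for query expansion and verification, but the shape is the same: widen the net of queries first, then keep only answer content that a retrieved source can back up.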

Takeaways:

AI SEO vs. Traditional SEO

  • Google explicitly states that specialized “AI SEO” is not necessary.
  • Standard SEO practices remain sufficient to rank in AI-driven search experiences.

Integration of AI in Google Search

  • AI technology is deeply embedded across every stage of Google’s organic search: crawling, indexing, and ranking.
  • Technologies like RankBrain and the Multitask Unified Model (MUM) are foundational to Google’s current search ranking system.

Google’s Emphasis on Content Quality

  • Content quality assessment by Google is neutral regarding whether humans or AI produce the content.
  • The primary goal remains identifying high-quality, helpful, and reliable content.

Generative AI-Specific Techniques

  • Google’s AI Overviews employ specialized processes like “query fan-out” to answer queries thoroughly.
  • A technique called “grounding” is used to ensure factual accuracy by cross-checking generated content against indexed information.

Google clarified that there’s no need for AEO/GEO for Google AI Overviews and AI Mode. Standard search engine optimization is all that’s needed to rank across both standard and AI-based search. Content quality remains an important part of Google’s algorithms, and they made a point to emphasize that they don’t check whether content is created by a human or AI.

Featured Image by Shutterstock/Luis Molinero

Google: AI Overviews Drive 10% More Queries, Per Q2 Earnings via @sejournal, @MattGSouthern

New data from Google’s Q2 2025 earnings call suggests that AI features in Search are driving higher engagement.

Google reported that AI Overviews contribute to more than 10% additional queries for the types of searches where they appear.

With AI Overviews now reaching 2 billion monthly users, this is a notable shift from the early speculation that AI would reduce the need to search.

AI Features Linked to Higher Query Volume

Google reported $54.2 billion in Search revenue for Q2, marking a 12% increase year-over-year.

CEO Sundar Pichai noted that both overall and commercial query volumes are up compared to the same period last year.

Pichai said during the earnings call:

“We are also seeing that our AI features cause users to search more as they learn that Search can meet more of their needs. That’s especially true for younger users.”

He added:

“We see AI powering an expansion in how people are searching for and accessing information, unlocking completely new kinds of questions you can ask Google.”

This is the first quarter where Google has quantified how AI Overviews impact behavior, rather than just reporting usage growth.

More Visual, Conversational Search Activity

Google highlighted continued growth in visual and multi-modal search, especially among younger demographics. The company pointed to increased use of Lens and Circle to Search, often in combination with AI Overviews.

AI Mode, Google’s conversational interface, now has over 100 million monthly active users across the U.S. and India. The company plans to expand its capabilities with features like Deep Search and personalized results.

Language Model Activity Is Accelerating

In a stat that received little attention, Google disclosed it now processes more than 980 trillion tokens per month across its products. That figure has doubled since May.

Pichai stated:

“At I/O in May, we announced that we processed 480 trillion monthly tokens across our surfaces. Since then we have doubled that number.”

The rise in token volume shows how quickly AI is being used across Google products like Search, Workspace, and Cloud.

Enterprise AI Spending Continues to Climb

Google Cloud posted $13.6 billion in revenue for the quarter, up 32% year-over-year.

Adoption of AI tools is a major driver:

  • Over 85,000 enterprises are now building with Gemini
  • Deal volume is increasing, with as many billion-dollar contracts signed in the first half of 2025 as in all of last year
  • Gemini usage has grown 35 times compared to a year ago

To support growth across AI and Cloud, Alphabet raised its projected capital expenditures for 2025 to $85 billion.

What You Should Know as a Search Marketer

Google’s data challenges the idea that AI-generated answers are replacing search. Instead, features like AI Overviews appear to prompt follow-up queries and enable new types of searches.

Here are a few areas to watch:

  • Complex queries may become more common as users gain confidence in AI
  • Multi-modal search is growing, especially on mobile
  • Visibility in AI Overviews is increasingly important for content strategies
  • Traditional keyword targeting may need to adapt to conversational phrasing

Looking Ahead

With Google now attributing a 10% increase in queries to AI Overviews, the way people interact with search is shifting.

For marketers, that shift isn’t theoretical; it’s already in progress. Search behavior is leaning toward more complex, visual, and conversational inputs. If your strategy still assumes a static SERP, it may already be out of date.

Keep an eye on how these AI experiences roll out beyond the U.S., and watch how query patterns change in the months ahead.


Featured Image: bluestork/shutterstock

Google Search Central APAC 2025: Everything From Day 1 via @sejournal, @TaylorDanRW

Search Central Live Deep Dive Asia Pacific 2025 brings together SEOs from across the region for three days of insight, networking, and practical advice.

Held at the Carlton Hotel Bangkok Sukhumvit, the event features an impressive speaker lineup alongside structured networking breaks.

Attendees have the chance to meet familiar faces, connect with global SEO leaders, and share ideas on the latest trends shaping our industry.

The conference is split over three days, with each day covering a key part of Google’s processes: crawling, indexing, and serving.

Some of the practical tips that emerged from day one:

  1. Keep building human‑focused content. Google’s models favor natural, expert writing above all.
  2. Optimize for multiple modalities. Make sure images have descriptive alt text, videos have transcripts, and voice search is supported by conversational language.
  3. Monitor crawl budget. Fix 5XX errors promptly and streamline your site’s structure to guide Googlebot efficiently.
  4. Use Search Console recommendations. Non‑expert site owners can benefit from the guided suggestions feature to improve usability and performance.
  5. Stay flexible. Long‑held traffic trends may shift as AI features grow. Past success does not equal future success.

A Pivotal Moment For Search

Mike Jittivanich, director of marketing for South East Asia and South Asia Frontier, set the tone in his keynote by declaring that we’ve reached a pivotal moment in search. He identified three forces at work:

  1. AI innovation that rivals past major shifts such as mobile and social media.
  2. Evolving user consumption patterns, as people expect faster, more conversational ways to find information.
  3. Changing habits of younger generations, who interact with search differently from their parents.

This trio of drivers underlines that past success no longer guarantees future success in search.

As Liz Reid, VP of Search at Google, has put it, “Search is never a solved problem.”

Image from author, July 2025

New formats, from AI Overviews to multimodal queries, must be woven alongside traditional blue links in a way that keeps pace with user expectations.

Gen Z: The Fastest‑Growing Search Demographic

One of the most eye-opening statistics came from a session on generational trends: Gen Z (aged 18-24) is the fastest-growing group of searchers.

Image from author, July 2025

Lens usage alone grew 65% year‑on‑year, with over 100 billion Lens searches so far in 2025. Remarkably, 1 in 5 searches via Lens now carries commercial intent.

Younger users are also more likely to initiate searches in non-traditional ways.

Roughly 10% of their journeys begin with Circle to Search or other AI‑powered experiences, rather than typing into a search box. For SEOs, this means optimizing for image and voice queries is no longer optional.

Why Human‑Centered Content Wins

Across several talks, speakers emphasized that Google’s machine‑learning ranking algorithms learn from content created by humans for humans.

These models understand natural language patterns and reward authentic, informative writing.

In contrast, AI‑generated text occupies its own space in the index, and Google’s ranking systems are not trained on that portion. Gary Illyes explained:

“Our algorithms train on the highest‑quality content in the index, which is clearly human‑created.”

For your site, the takeaway is clear: Keep focusing on well‑researched, engaging content.

SEO fundamentals, like clear structure, relevant keywords, and solid internal linking, remain vital.

There is no separate checklist for AI features. If you’re doing traditional SEO well, you’ll naturally be eligible for AI Overviews and AI Mode features.

AI In Crawling And Indexing

Two sessions shed light on how AI is touching the crawling and indexing process:

  • AI Crawl Impact: Sites are seeing increased crawl rates as Googlebot adapts to new AI‑powered features. However, a higher crawl rate does not automatically boost ranking.
  • Status Codes and Crawl Budget: Only server errors (5XX) consume crawl budget; 1XX and 4XX codes do not affect it, though 4XX can influence scheduling and prioritization.

Cherry Prommawin explained that crawl budget is the product of crawl rate limit (how fast Googlebot can crawl) and crawl demand (how much it wants to crawl).

If your site has broken links or slow responses, it may slow down the overall crawling process.
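
Expressed as a back-of-the-napkin calculation, the relationship Prommawin described looks something like the sketch below. The numbers and units are made up for illustration; Google does not publish a crawl budget formula.

```python
# Illustrative only: Google does not publish a crawl budget formula.
crawl_rate_limit = 5.0   # fetches/second the site can comfortably serve
crawl_demand = 0.6       # 0..1: how much Google wants to (re)crawl the site

print(crawl_rate_limit * crawl_demand)  # effective rate: 3.0 fetches/second

# Slow responses or 5XX errors push the rate limit down, shrinking the budget:
degraded_rate_limit = 2.0
print(degraded_rate_limit * crawl_demand)  # 1.2 fetches/second
```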

Google Search Is Evolving In Two Ways

Google Search is evolving along two main focus points: the types of queries users can pose and the range of answers Google can deliver.

The Questions Users Can Ask

Queries are becoming longer and more conversational. Searches of five or more words are growing at 1.5X the rate of shorter queries.

Beyond text, users now routinely turn to voice, images, and Circle to Search; for Gen Z, about 10% of journeys start with these AI-powered entry points.

The Results Google Can Provide

AI Overviews can generate balanced summaries when there’s no single “right” answer, while AI Mode offers end‑to‑end generative experiences for shopping, meal planning, and multi‑modal queries.

Google is bringing DeepMind’s reasoning models into Search to power these richer, more nuanced results, blending text, images, and action‑oriented guidance in a single interface.

Image from author, July 2025

LLMs.txt & Robots.txt

Gary Illyes and Amir Taboul discussed Google’s stance on robots.txt and the IETF working group’s proposed LLMs.txt standard.

Much like the meta keywords tag of old, LLMs.txt is not a Google initiative, is not seen as beneficial, and is not something Google is looking to adopt.

Google’s view is that robots.txt remains the primary voluntary standard for controlling crawlers. If you choose to block AI‑specific bots, you can do so in robots.txt, but know that not all AI crawlers will obey it.
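
For example, a robots.txt that blocks two AI-related crawlers while leaving regular Search crawling untouched might look like this (user-agent tokens vary by vendor, so check each crawler’s documentation before relying on them):

```
# Block Google's AI training crawler (does not affect Search indexing)
User-agent: Google-Extended
Disallow: /

# Block OpenAI's crawler
User-agent: GPTBot
Disallow: /

# Everything else may crawl normally
User-agent: *
Allow: /
```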

AI Features As Extensions Of Search

AI Mode and AI Overviews rely on the exact same crawling, indexing, and serving infrastructure as traditional Search.

Googlebot handles both blue‑link results and AI features, while other crawlers in the same system feed Gemini and large language models (LLMs).

Image from author, July 2025

Every page still undergoes HTML parsing, rendering, and deduplication, and is run through statistical models, such as BERT, for understanding and spam detection when it’s time to serve results. The same query‑interpretation pipelines and ranking signals, such as RankBrain, MUM, and other ML models, order information for both classic blue links and AI‑powered answers.

AI Mode and AI Overviews are simply new front-end features built on the familiar Search foundations that SEOs have been optimizing for all along.

Making The Most Of Google Search Console

Finally, Daniel Waisberg led a session on effectively utilizing Search Console in this new era.

Waisberg described Search Console as the bridge between Google’s infrastructure (crawling, indexing, serving) and your site. Key points that came from these sessions included:

  • Data latency: Finalized data in Search Console is typically two days old, based on the Pacific time zone. Partial and near-final data sit behind the scenes and may differ by up to 1%.
  • Feature lifecycle: New enhancements progress from user need to available data, then through design and development, to testing and launch.
  • Recommendations feature: This tool is aimed at users who are not data experts, suggesting actionable improvements without overwhelming them.

By understanding how Search Console presents data, you can diagnose crawl issues, track performance, and identify opportunities for AI-driven features.

That’s it for day one. Watch for our coverage of day two at Google Search Central Live tomorrow, with more Google insights to come.

Featured Image: Dan Taylor/SALT.agency

Google Shares SEO Guidance For State-Specific Product Pricing via @sejournal, @MattGSouthern

In a recent SEO Office Hours video, Google addressed whether businesses can show different product prices to users in different U.S. states, and what that means for search visibility.

The key point: Google only indexes one version of a product page, even if users in different locations see different prices.

Google Search Advocate John Mueller stated in the video:

“Google will only see one version of your page. It won’t crawl the page from different locations within the U.S., so we wouldn’t necessarily recognize that there are different prices there.”

How Google Handles Location-Based Pricing

Google confirmed it doesn’t have a mechanism for indexing multiple prices for the same product based on a U.S. state.

However, you can reflect regional cost differences by using the shipping and tax fields in structured data.

Mueller continued:

“Usually the price difference is based on what it actually costs to ship this product to a different state. So with those two fields, maybe you could do that.”

For example, you might show a base price on the page, while adjusting the final cost through shipping or tax settings depending on the buyer’s location.
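
As a rough sketch of what Mueller is describing, a Product markup with region-specific shipping rates might look like the JSON-LD below (product name, prices, and regions are hypothetical):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "USD",
    "shippingDetails": [
      {
        "@type": "OfferShippingDetails",
        "shippingRate": { "@type": "MonetaryAmount", "value": "5.99", "currency": "USD" },
        "shippingDestination": {
          "@type": "DefinedRegion",
          "addressCountry": "US",
          "addressRegion": ["CA"]
        }
      },
      {
        "@type": "OfferShippingDetails",
        "shippingRate": { "@type": "MonetaryAmount", "value": "9.99", "currency": "USD" },
        "shippingDestination": {
          "@type": "DefinedRegion",
          "addressCountry": "US",
          "addressRegion": ["NY", "NJ"]
        }
      }
    ]
  }
}
```

The indexed price stays the same for everyone; only the shipping component varies by destination region.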

When Different Products Make More Sense

If you need Google to recognize distinct prices for the same item depending on state-specific factors, Google recommends treating them as separate products entirely.

Mueller added:

“You would essentially want to make different products in your structured data and on your website. For example, one product for California specifically, maybe it’s made with regards to specific regulations in California.”

In other words, rather than dynamically changing prices for one listing, consider listing two separate products with different pricing and unique product identifiers.
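
Following that advice, the structured data might then contain two distinct products rather than one product with two prices, along the lines of this hypothetical sketch (names, SKUs, and prices are invented for illustration):

```json
[
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget (California compliant)",
    "sku": "WIDGET-CA",
    "offers": { "@type": "Offer", "price": "54.99", "priceCurrency": "USD" }
  },
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "sku": "WIDGET-STD",
    "offers": { "@type": "Offer", "price": "49.99", "priceCurrency": "USD" }
  }
]
```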

Key Takeaway

Google’s infrastructure currently doesn’t support state-specific price indexing for a single product listing.

Instead, businesses will need to adapt within the existing framework. That means using structured data fields for shipping and tax, or publishing distinct listings for state variants when necessary.

Hear Mueller’s full response in the video below:

Do We Need A Separate Framework For GEO/AEO? Google Says Probably Not via @sejournal, @TaylorDanRW

At Google Search Central Live Deep Dive Asia Pacific 2025, Cherry Prommawin and Gary Illyes led a session on how AI fits into Search.

They asked whether we need separate frameworks for Generative Engine Optimization (GEO) and Answer Engine Optimization (AEO).

Their insights suggest that GEO and AEO do not require wholly new disciplines.

Photo taken by author, Search Central Live Deep Dive Asia Pacific, July 2025

AI Features Are Just Features

Cherry Prommawin explained that AI Mode, AI Overviews, Circle to Search, and Lens behave like featured snippets or knowledge panels.

These features draw on the same ranking signals and data sources as traditional Search.

They all run on Google’s core indexing and ranking engine without requiring a standalone platform. Adding an AI component is simply a matter of introducing extra interpretation layers.

Gary Illyes emphasized that both AI-driven tools and classic Search services share a single, unified infrastructure. This underlying infrastructure handles indexing, ranking, and serving for all result types.

AI Mode and AI Overviews are just features of Search, and built on the same Search infrastructure.

Deploying new AI capabilities means integrating additional models into the same system. Circle to Search and Lens simply add their query-understanding modules on top.

Crawling

All the AI Overviews and AI Mode features rely on the same crawler that powers Googlebot. This crawler visits pages, follows links, and gathers fresh content.

Gemini is treated as a separate system within Google’s crawler ecosystem and uses its own bots to feed data into its models.

Indexing

In AI Search, the core indexing process mirrors the methods used for traditional search. Pages that have been crawled are analyzed and organized into the index, then statistical models and BERT are applied to refine that data.

These statistical models have been in use for more than 20 years and were first created to support the “did you mean” feature and help catch spam.

BERT adds a deeper understanding of natural language to the mix.

Photo taken by author, Search Central Live Deep Dive Asia Pacific, July 2025

Serving

Once the index is built, the system must interpret each user query. It looks for stop words, identifies key terms, and breaks the query into meaningful parts.

The ranking phase then orders hundreds of potential results based on various signals. Different formats, such as text, images, and video, carry different weightings.

RankBrain applies machine learning to adjust those signals while MUM brings a multimodal, multitask approach to understanding complex queries and matching them with the best possible answers.

What This Means: Use The Same Principles From SEO

Given the tight integration of AI features with standard Search, creating distinct GEO or AEO programs may duplicate existing efforts.

As SEOs, we should be able to apply existing optimization practices to both AI Search and “traditional” Search products. Focusing on how AI enhancements fit into current workflows lets teams leverage their expertise.

Spreading resources to build separate frameworks could pull attention away from higher-impact tasks.

Cherry Prommawin and Gary Illyes concluded their session by reinforcing that AI is another feature in the Search product.

SEO professionals can continue to refine their strategies using the same principles that guide traditional search engine optimization.


Featured Image taken by author

Pew Research Confirms Google AI Overviews Is Eroding Web Ecosystem via @sejournal, @martinibuster

Pew Research Center tracked real web browsing behavior and confirmed what many publishers and SEOs have claimed: AI Overviews does not send traffic back to websites. The results show that the damage caused by AI summaries to the web ecosystem is as bad as or worse than is commonly understood.

Methodology

The Pew Research study tracked over 900 adults who consented to installing an online browsing tracker to record their browsing behavior in the month of March 2025. The dataset contains 68,879 unique Google search queries, and a total of 12,593 queries triggered an AI summary.

Confirmed: Google AI Search Is Eroding Referral Traffic

The tracked user data confirms publisher complaints about a drop in referral traffic caused by AI search results. Google users who encounter an AI search result are less likely to click on a link and visit a website than users who see only a standard search result.

Only 8% of users who encountered an AI summary clicked a link (in the AI summary or the standard search results) to visit a website. Users who only saw a standard search result clicked through to a website 15% of the time, nearly twice as often as users who viewed an AI summary.

Users rarely click a link within an AI summary. Only 1% of users clicked an AI summary link and visited a website.

AI Summaries Cause Less Web Engagement

In a recent interview, Google’s CEO Sundar Pichai pushed back on the notion that AI summaries have a negative impact on the web ecosystem. He argued that more content is being created on the web than at any other time, and offered that as proof that the web ecosystem is thriving:

“So, generally there are more web pages… I think people are producing a lot of content, and I see consumers consuming a lot of content. We see it in our products.”

Pichai also insisted that people are consuming content across multiple forms of content (video, images, text) and that publishers today should be presenting content within more than just one format.

However, contrary to what Google’s CEO said, AI is not encouraging users to consume more content, it’s having the opposite effect. The Pew research data shows that AI summaries cause users to engage less with web content.

According to the research findings:

Users End Their Browsing Session

“Google users are more likely to end their browsing session entirely after visiting a search page with an AI summary than on pages without a summary.

This happened on 26% of pages with an AI summary, compared with 16% of pages with only traditional search results.”

Users Refrain From Clicking On Traditional Search Links

It also says that users tended to not click on a traditional search result when faced with an AI summary:

“Users who encountered an AI summary clicked on a traditional search result link in 8% of all visits. Those who did not encounter an AI summary clicked on a search result nearly twice as often (15% of visits).”

Only 1% Click Citation Links In AI Summaries

Users who see an AI summary overwhelmingly do not click the citations to the websites that the AI summary links to.

The report shows:

“Google users who encountered an AI summary also rarely clicked on a link in the summary itself. This occurred in just 1% of all visits to pages with such a summary.”

This confirms what publishers and SEOs have been saying to Google over and over again: Google AI Overviews robs publishers of referral traffic. Rob is a strong word, but given that Google uses web content to “synthesize” an answer to a search query that does not result in a referral click, “rob” is the word that inevitably comes to mind for a publisher or SEO who worked hard to create that content.

Another startling fact shared in the research is that nearly two-thirds (about 66%) of users either browsed somewhere else on Google or left Google entirely without clicking a link to visit the web ecosystem.

The report explains:

“…the largest share of Google searches in our study resulted in the user either browsing elsewhere on Google or leaving the site entirely without clicking a link in the search results. Around two-thirds of all searches resulted in one of these actions.”

Wikipedia, YouTube And Reddit Dominate Google Searches

Google has been holding publisher events and Search Central Live events all around the world to listen to publisher feedback and to promise that Google will work harder to surface a greater variety of content. I know that the Googlers at these events are not lying, but those promises of surfacing more high-quality content are subverted by the grim facts presented in the Pew research of actual users.

One of the biggest complaints is that Reddit and Wikipedia dominate the search results. The research validates publisher and SEO concerns because it shows that not only are Reddit and Wikipedia the most commonly cited websites, but Google’s own YouTube ranks among the top three most cited web destinations.

The report explains:

“The most frequently cited sources in both Google AI summaries and standard search results are Wikipedia, YouTube and Reddit. These three sites are the most commonly linked sources in AI summaries and standard search results alike.

Collectively, they accounted for 15% of the sources that were listed in the AI summaries we examined. They made up a similar share (17%) of the sources listed in standard search results.”

The report also shows:

  • “Wikipedia links are somewhat more common in AI summaries than in standard search pages”
  • “YouTube links are somewhat more common in standard search results than in AI summaries.”

These Are The Facts

Pew Research’s study of over 68,000 search queries from the browsing habits of over 900 adults reveals that Google’s AI summaries sharply reduce clicks to websites, with just 8% of users clicking any link and only 1% engaging with citations in AI answers.

Users encountering AI summaries are more likely to end their sessions or stay within Google’s ecosystem rather than visiting independent websites. This confirms publisher and SEO concerns that AI-driven search erodes web traffic and concentrates attention on a few dominant platforms like Wikipedia, Reddit, and YouTube.

These are the facts. They show that SEOs and publishers are right that AI Overviews is siphoning traffic out of the web ecosystem.

Featured Image by Shutterstock/Asier Romero

AI Search is Here: Make Sure Your Brand Stands Out In The New Era Of SEO [Webinar] via @sejournal, @lorenbaker

Wish you could control what AI says about your brand?

You’re not alone. 

As generative search becomes the default for tools like ChatGPT, Gemini, and Claude, fewer people are clicking through to traditional search results. If your content isn’t part of their training data or grounding sources, it’s effectively invisible.

And that means one thing: you’re no longer just optimizing for humans or search engines. You’re optimizing for machines that summarize the internet.

Introducing Generative Engine Optimization (GEO)

In this tactical webinar, we’ll break down what it takes to get your brand cited, linked, and quoted in AI-generated content, intentionally.

You’ll discover:

  • How to show up in AI search results.
  • Ways to increase your AIO (AI Overview) brand presence.
  • Proven SEO & GEO workflows you can copy today.

Learn How To Influence LLMs

This isn’t theory. We’ll walk through the specific strategies SEOs and marketers are using right now to shape what language models say, and don’t say, about their brands.

Expect insights on:

  • How foundational training data is gathered (and how you might influence it).
  • The role of search and retrieval-augmented generation (RAG) in real-time LLM responses.
  • What makes content “quotable” to machines, and what gets ignored.

Stay Visible As AI Search Becomes The Default

AI search isn’t coming. It’s here. And it’s rewriting how visibility works.

In this session, you’ll learn:

  • Why traditional SEO tactics still matter (especially for citation).
  • How query fanout and grounding shape which documents LLMs pull from.
  • Which formats and language structures improve your chances of being cited.

This is for SEOs, content strategists, and marketing leads who want to stay relevant as AI redefines the playing field.

Why This Webinar Is A Must-Attend

Whether you’re refining your search strategy or trying to future-proof your brand visibility, this session offers high-ROI insights you can apply immediately.

✅ Actionable examples

✅ Real-world GEO workflows

✅ Early looks at emerging standards like MCP, A2A, and llms.txt

📍 Designed for experienced marketers ready to lead change.

Reserve Your Spot Or Get The Recording

🛑 Can’t make it live? No problem. Register anyway, and we’ll send you the full recording so you don’t miss a thing.

Sites Have Little Control in Google SERPs

Over the years, Google has limited how websites can control their appearance in search results.

Here’s what sites cannot control in Google search.

Sitelinks

For some searches, especially involving brand names, Google shows links below the listing title. These are called sitelinks. Unfortunately, Google’s algorithm often displays sitelinks that are irrelevant or unimportant to the site’s business.

Owners have no control over these URLs. The only methods to remove a sitelink are to delete the page or add the noindex meta tag, but both would also remove the page from all Google searches.

Here are sitelinks for a “Practical Ecommerce” query:

Screenshot of Google sitelinks for Practical Ecommerce

Websites have little control over sitelinks, such as this example for Practical Ecommerce.

Listing title

The listing title is the most prominent section of a search snippet and largely influences the number of clicks. Google used to display only a page’s title tag for the listing.

A few years ago, however, Google began displaying titles based on search queries, for relevance. The result is often fewer clicks.

There’s no way to stop Google from rewriting a page title. In my experience, using a page’s HTML title as its H1 heading increases the likelihood that Google will use it, since it aligns the listing title with what searchers see on the subsequent page.

Google now decides SERP listing titles based on the query, such as “how to build a website.”

Listing description

A page’s HTML meta description summarizes its content. Google has long considered meta descriptions as hints rather than directives. It displays meta descriptions only if relevant to the query.

Websites can influence listing descriptions, which appear below the title, by including summary paragraphs, conclusions, and short answers on a page. Depending on the query, Google could display part of those sections in a description.

Otherwise, sites have no control over the SERP snippet’s description.

A listing may or may not use the page’s HTML meta description.

AI Overviews

Google’s AI Overviews are artificial intelligence-generated answers on top of search results.

AI Overviews typically satisfy searchers’ needs, thereby eliminating the need to click. Hence many site owners prefer Google not to use their content in AI Overviews. I know of no way to block Google from using a site’s content in AI Overviews while still indexing it for conventional SERPs.

The Google-Extended directive in a site’s robots.txt file blocks Gemini but not AI Overviews. A nosnippet meta tag will likely block AI Overviews, as well as all SERP snippet descriptions.

AI Overviews typically satisfy searchers’ needs, thereby eliminating the need to click. This example is for the query “how to build a website.”
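
For reference, the snippet opt-outs mentioned above are set in the page’s HTML. Keep the tradeoff in mind: blocking snippets also blocks the descriptions shown under normal blue links.

```html
<!-- Page-level: no snippets at all (likely also keeps the page out of AI Overviews) -->
<meta name="robots" content="nosnippet">

<!-- Passage-level alternative: exclude only marked sections from snippets -->
<p data-nosnippet>This paragraph will not be used in snippets.</p>
```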

Featured snippets

Featured snippets used to appear at the top of Google SERPs to provide quick answers to a query. They now appear in the middle of SERP pages, if at all, given the rise of AI Overviews.

Featured snippets typically decrease the number of clicks to a linked URL. Websites have no control over appearing in a featured snippet or its content.

A nosnippet meta tag instructs search engines not to display a page in a featured snippet, but it also removes descriptions from the page’s non-featured listing.

A well-structured page — short FAQs, HTML headings, concise summaries — can influence the contents of a featured snippet, but there’s no guarantee.

In short, Google is reducing websites’ control over SERPs as it prioritizes what searchers seek. Sites can influence their SERP appearance by focusing on concise content, well-structured pages, and appropriate headings.

Operationalizing Your Topic-First SEO Strategy via @sejournal, @Kevin_Indig

Last week, I walked through the shift from keyword-first to topic-first SEO – and why that mindset change matters more than ever for long-term visibility in both search and large language models (LLMs).

This week, we’re getting tactical. Because understanding the shift is one thing, operationalizing it across your team is another.

In this issue, Amanda and I are breaking down:

  • How to build and use a topic map and matrix (with a map template for premium readers).
  • Why a deep understanding of your audience is crucial to true topical depth.
  • Guidance for internal + external linking by topic (with tool recommendations).
  • For premium readers: Practical advice on measuring SEO performance by topic.

If you’re trying to build durable organic visibility and authority for your brand – and not just chase hacks for AI overviews – this is your blueprint.

Image Credit: Kevin Indig


How To Operationalize A Topic-First SEO Strategy

Last week, we covered how you need to shift from keywords to topics (if you haven’t already).

But what if you’re not quite sure how to operationalize this approach across your team?

Let’s talk about how to do that.

To earn lasting visibility – and not short-term visibility bought by hacky LLM visibility tricks – your brand needs to signal to search engines and LLMs that it’s an authority in topics related to your offerings for the intended audience you serve.

You’ll do this by:

  1. Building a map of your parent topics.
  2. Using audience research and personas as lenses to create content through.
  3. Expanding with subtopics and “zero-volume” content creation, because fringe content adds depth.
  4. Optimizing both your internal and external links with a topic-first approach.

Build A Map Of Your Parent Topics

First up, you need to build your topic map.

(You know, if you don’t already have an old doc or spreadsheet out there collecting dust, buried in your Google Drive, with your core topic pillars and subtopics already stored.)

This is the first step in building a thorough persona-based SEO topic matrix.

A topic matrix is a strategic framework that compiles your brand’s key topics, subtopics, and content formats needed to comprehensively cover a subject area for search visibility.

It helps align content with user intent, target personas, and search visibility opportunities, creating a roadmap for developing topical authority and minimizing keyword cannibalization.

If you haven’t built one before, this is going to look different from keyword lists of the past, and it might be organized like this:

Image Credit: Kevin Indig
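
As a rough textual sketch of the same idea, one parent-topic slice of a matrix might be captured like this (field names and values are invented for illustration, not a standard format):

```yaml
parent_topic: Resource planning
subtopics:
  - name: Capacity forecasting
    personas: [Ops manager, Agency founder]
    funnel_stage: consideration
    formats: [guide, template, FAQ]
    priority_queries: ["capacity planning template", "forecast team workload"]
    existing_url: /blog/capacity-planning   # null = content gap
  - name: Freelancer scheduling
    personas: [Agency founder]
    funnel_stage: decision
    formats: [comparison, case study]
    priority_queries: ["schedule freelance contractors"]
    existing_url: null
```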

Amanda interjecting here: Even if you have built one before, stick with us. We’ve got a visual for you below that will help communicate to stakeholders how/why a topic-first approach matters to earning visibility and authority for your brand’s core offerings. Plus, premium subscribers get the ready-to-go template.

Later, once your topic matrix is complete, you’ll use your keyword universe to select priority keywords to pair with your overall topic and individual pages.

Instead of living in keyword lists, you’ll live in a topic map, prioritizing meeting the needs of separate personas or ideal customer profiles (ICPs) in your target audience, and later pairing search queries that best help the people you serve find you.

To start building a list of your parent topics, you need to:

  • Outline the exact topics your brand needs to own. This is where you start. (And many of you reading this already have this locked in.)
  • Inventory your existing content: What topics do you cover already? What topics do you actually need to cover? Where are the gaps? Which ones convert the best?
  • Make sure you log all your core offerings (i.e., features, services, core products) as topics or subtopics themselves.

These are the “buckets” under which all other content should logically live (regardless of the persona, funnel stage, or search intent you’re optimizing for).

Think of them as your brand’s semantic backbone, so to speak … these are the foundational topics that every page ultimately ladders up to.

Here’s how to determine them:

1. Start with your offerings.

  • What services do you provide?
  • What features or products do you sell?
  • What problems do you solve?

2. Group offerings into themes.

  • Which of those offerings can be grouped under a broader topic?
  • What high-level conversations do your users consistently return to?

3. Refine for relevance.

  • You’re aiming for topics broad enough to support many subtopics, but specific enough to reflect your unique authority in your area of expertise.

Let’s look at an example of a fictional DTC brand that also offers some B2B services: Kind Habitat. (Needs a better name, but let’s move on. 😆)

Kind Habitat offers eco-friendly home furnishings and sustainable materials via a small ecommerce store as well as residential and commercial interior design services.

Let’s say its target audience includes homeowners, renters, residential and commercial property managers, as well as both residential builders and designers that focus on sustainability and eco-friendly values.

With that in mind, its ecommerce products and design services could all be mapped to five simplified but distinct core topics:

  • Sustainable interior design.
  • Eco-friendly building materials.
  • Zero-waste living.
  • Sustainable furniture shopping.
  • Green home upgrades.

Every piece of content they create should tie back to one or more of these core topics, and that ensures the site builds deep, durable authority in its niche.

(And keep in mind, this is a simplified example here. You might have up to 10 parent topics … or more, depending on the breadth of your offerings or expertise areas.)

Next up, you’re going to work to expand your topic map, starting with audience research.

Use Audience Research And Personas

Here’s where those personas your brand invested so heavily in come into play. You’ll need to map out (1) who you’re solving problems for and (2) how their queries change based on unique persona, intent, audience type, or industry sector.

But how do you know if you’ve identified the right people (personas) and their queries?

You can spend tens of thousands investing in deep buyer persona market research.

But if your resources are limited, talk to your sales team. Talk to your customer care team. And (gasp) talk to your customers and/or leads who didn’t buy from you.

And if you’re just starting out and don’t have sales or customer teams in place, have your founder dig into their email inbox, LinkedIn DMs, etc., and mine for information.

As SparkToro’s Amanda Natividad states in “How to Turn Audience Research Into Content Ideas” (a great read, btw):

Questions are content gold. Each question represents an information gap you can fill with valuable content. [1]

Then, your job is to take the collected information gaps and fold them into your overall topic matrix.

Keep in mind, though, when optimizing for your core topics, you’ll also need to target different intents across the topic and the funnel via different perspectives, pain points, and viewpoints (a.k.a. “ranch style SEO”).

Here’s an exciting bonus to investing in this approach: Persona-aligned content that offers deep topic coverage and unique perspectives can bring natural information gain to the overall topical conversation.

I (Kevin) opened up this discussion on topics vs. keywords over on LinkedIn, and I have to say, Tommy Walker gives an excellent example of how he thinks about this topic expansion in the thread:

Screenshot from LinkedIn, July 2025 (Image Credit: Kevin Indig)

Your topics can be expanded exponentially in many directions, based on the people you’re creating content for and the problems they have:

People:

  • Core audiences.
  • Crafted personas.
  • Multiple sectors (if applicable to your product or service).

Problems:

  • Core problem/needs your brand solves for each audience.
  • Unique problems experienced by each persona that your brand solves.
  • Core problems unique to multiple sectors (and in the language of those sectors).

Let’s circle back to our fictional example with Kind Habitat, that sustainable interior design firm with a quickly-made-up name and a mini ecommerce store.

Here’s what the “people and problems” they’d optimize their core topics for would look like:

People:

  • Core audiences: Homeowners, renters, property managers, builders, designers.
  • Crafted personas:
    • Homeowner: Stan, 45, high-income earner, second-time homeowner in suburban area, looking to renovate sustainably.
    • Renter: Nicole, 31, mid-income earner, long-term rent-controlled apartment in a big city with values of sustainability, who is researching sustainable home decor and design.
    • Property Manager: Quinn, 25, mid-income earner, entry-level property manager for small local firm that values zero-waste construction and sustainable renovations.
    • Builder: JP, 57, high-income earner, owns sustainable building firm, seeking zero-waste, low-toxin approach to new builds and prioritizing energy-efficient design in luxury homes.
    • Designer: Sydney, 29, mid-income earner, junior to mid-level associate at a commercial interior design firm seeking both products and plans for sustainable furnishings and design.
  • Multiple sectors (if applicable to your product or service): Residential real estate, property managers for multi-family housing, real estate portfolios, or commercial real estate, sustainable building firms, individual homeowners, and renters interested in sustainable design.

Keep in mind, you could fan out your audience even further with three to five individual audience personas under each audience type.

And once your audience data is finally ready to go, you’d then expand into the problems faced by each audience, persona, and sector across each targeted topic.

Once you have your core topics covered (and have addressed your core features, offerings, services, audience pain points, and organic audience questions, etc.), you’d expand even further into content that offers unique perspectives, hot takes, and even digs into current events related to your industry or product/services.

That’s … a lot of content.

Using Amanda’s topic map visual, here’s what it could look like … for just one parent topic.

You could just keep going. For-ev-er.

(But your content doesn’t have to. If you establish your brand as an authority by publishing content with depth of coverage and information gain baked in, you can accomplish a lot with a tight, well-developed library of pages.)

Here’s what I’d recommend if you have the team members or freelancers on hand:

  • Assign specific team members or freelancers to cover core topics. Essentially, you’d have trained writer-SMEs for each major topic you’d like to target across your strategy. That way, content can be produced more accurately … and faster.
  • Divvy up work based on personas. If you have multiple audience types, like the Kind Habitat example, assign production to your team based on different personas/audiences, so your content producers can hone in on the needs of – and the way they speak to – each persona.
  • Use AI to scale topic coverage while tailoring to persona type. A tool like AirOps can help you build out workflows based on specific topics and specific personas; that way, you’re creating iterations of core pieces of work geared toward the specific needs, pain points, and problems of each industry sector, persona, etc.
  • When refreshing older content to combat content decay, refresh by topics. Don’t just refresh one page that has experienced a decline. Work on keeping content decay in check by refreshing subtopics/clusters as a whole whenever possible. Assign one producer/individual contributor to work on the cluster of related pages.

Expand With Subtopics, Because Fringe Content Adds Depth

Once you’ve mapped your audience and their problems across your core topics, you need to expand your coverage with subtopics, especially the ones that live on the edges and directly speak to your target ICPs. This is the kind of content that rarely shows up in a traditional keyword list, although you can definitely map specific keywords and intents to these pages in order to adjacently optimize for organic visibility.

However, you won’t always have a clear “search volume” number for this type of content.

Sometimes this content is going to be messy. Sometimes it’s going to be weird.

You need to thoroughly know your core audience and understand their most pressing needs and questions that you can solve for. (Even the fringe ones.)

But this “fringe content” is what makes your site actually helpful, authoritative, and hard to replicate.

Think of it this way: The best organic search strategies don’t just optimize for the top 10 questions on a topic – they anticipate the next 100.

They dig into the side doors, caveats, gotchas, exceptions, industry language quirks, and debates.

You must go beyond building clusters and instead build context for your brand within your targeted topic.

Here’s where to look when expanding with meaningful subtopics:

  1. Sales calls with leads, customer care questions, and actual customer interviews: There’s a gold mine here, and every brand has it. (Yes, even yours.) Use it to your advantage. I recommend tools like Gong/Chorus + Humata AI to help.
  2. Reddit + Quora discussions: Look for questions that no one has great concrete answers to or resources/solutions for. Use a tool like Gummy Search to streamline this research.
  3. Context that will build out your topic environment: You’re not just building a tidy cluster with “best X tools,” “top tools for Y,” and “X vs Y.” Ask: What misconceptions need to be cleared up? What advanced tips only experts talk about when they talk shop? Lean on your internal SMEs, or invest in paying SMEs hourly, getting connected to them via platforms like JustAnswer.
  4. Wikipedia table of contents and footnotes: While this might initially sound like strange guidance, if you truly feel you’ve covered your core topics for all your ICPs from multiple perspectives and for all their common pain points, this approach can help you branch out into connected subtopics. Caveat: Of course, don’t invest in covering subtopics that don’t matter to your ICPs … or angles they already understand thoroughly. (This research is very manual. If you have a workaround you’d suggest, send it my way.)
  5. People Also Ask questions in the SERP: Keep these in mind: They still exist for a reason. Use your standard SEO tools like Semrush, Ahrefs, etc., to explore these within your topic.

Optimize Internal And External Links With A Topic-First Approach

So, with topic-first optimization at the center, should you be organizing your internal links by topic instead of just navigation structure or blog recency?

Um, yes – definitely. And if you weren’t doing that already, the time to start is now.

Topic-based internal linking is one of the most powerful (and underutilized) ways to reinforce topical authority.

Most content teams default to one of two internal linking strategies:

  1. Navigation-based linking: whatever shows up in your menu or footer.
  2. Date-based linking: linking to “recent posts” regardless of topic relevance.

The problem? These methods serve the convenience of the content management system (CMS), not the reader or search engine.

A topic-first internal linking strategy intentionally:

  • Connects all relevant pages under a single topic or persona target.
  • Links related subtopics together to increase crawl depth and surface additional value.
  • Boosts orphaned or underperforming assets with contextually relevant links.

You can simplify this task with an SEO tool like Clearscope, Surfer, or Ahrefs.

For example, tools like these surface internal linking opportunities within the pages you’re monitoring within the tool. The feature then gives you clear related anchor text on where to add the URLs specifically.

The manual part? Having your content producers or SEO analysts determine if the tool’s suggested page is in the right topic cluster to warrant an anchor link. (But you can also set up topic clusters/content segments within tools like Clearscope that can help guide your producers.)

Used with permission from 4aGoodCause, a top monthly giving platform for nonprofits.
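
If you want to sanity-check those suggestions at scale, a small script can flag same-cluster pages that don’t yet link to each other. This is a minimal sketch assuming you maintain a URL-to-topic mapping and a crawler export of existing internal links (all data below is invented):

```python
from collections import defaultdict
from itertools import permutations

# Hypothetical inputs: your own topic map and a crawl export of internal links.
page_topics = {
    "/blog/low-voc-paints": "sustainable-design",
    "/blog/eco-flooring": "sustainable-design",
    "/blog/zero-waste-kitchen": "zero-waste-living",
}
existing_links = {("/blog/low-voc-paints", "/blog/eco-flooring")}

# Group pages by topic cluster.
clusters = defaultdict(list)
for url, topic in page_topics.items():
    clusters[topic].append(url)

# Flag same-cluster page pairs that are not yet linked.
for topic, pages in clusters.items():
    for source, target in permutations(pages, 2):
        if (source, target) not in existing_links:
            print(f"[{topic}] consider linking {source} -> {target}")
```

A human still decides whether each suggested link deserves an anchor, which is the manual step described above.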

But you should be employing a topic-based backlink strategy, too.

You don’t just want backlinks. You want links that have authority in your target topics and/or with your audience.

For instance, our example from earlier, Kind Habitat, doesn’t need low-quality backlinks from around the globe to build topical authority in the sustainable interior design niche.

This brand needs to invest in backlinks that include:

  • High-authority sites in similar topics, like ThisOldHouse.com, MarthaStewart.com, Houzz.com, and HomeAdvisor.com.
  • Local and regional publications for this brand’s service areas.
  • Manufacturers of sustainable, low-toxin home building products and materials.
  • Professional associations for interior designers, builders, and property managers who value sustainable and green design.

Here’s the payoff of taking a topic-first approach: Once you shift your strategy to cover core topics deeply – across the right audience segments and intent layers – you unlock a Topical Authority Flywheel.

Here’s how it works:

Better coverage → Better engagement and organic links → Better visibility across more queries.

Image Credit: Kevin Indig

When your site deeply addresses a topic, you not only become more useful to your audience but also more visible to search engines and LLMs.

You build the kind of brand context that LLMs surface and that Google’s evolving AI-driven results reward.

And yes, it’s measurable.

Track your performance by topic, not just by page or keyword.

If you’ve mapped and organized your content well, you can group related URLs and monitor how the topic as a whole performs (see the sketch after this list):

  • Watch how refreshed or expanded topic clusters improve in average rank, CTR, and conversions over time.
  • Look for early signals of lift within the first 10-30 days after refreshing or publishing a comprehensive set of content on a given topic.
  • Monitor link velocity. Strong topic clusters reap rewards.
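
Here’s a minimal sketch of that grouping, assuming a Search Console performance export with page, clicks, impressions, and position columns (the file name and URL patterns are hypothetical):

```python
import pandas as pd

df = pd.read_csv("gsc_export.csv")  # hypothetical Search Console export

def topic_for(page: str) -> str:
    """Map URL path patterns to topics; adjust to your own site structure."""
    if "/sustainable-design/" in page:
        return "sustainable interior design"
    if "/zero-waste/" in page:
        return "zero-waste living"
    return "other"

df["topic"] = df["page"].map(topic_for)

by_topic = df.groupby("topic").agg(
    clicks=("clicks", "sum"),
    impressions=("impressions", "sum"),
    avg_position=("position", "mean"),
)
by_topic["ctr"] = by_topic["clicks"] / by_topic["impressions"]
print(by_topic.sort_values("clicks", ascending=False))
```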

Operationalizing a topic-first approach isn’t just about traffic.

It’s about building a defensible edge in search/LLM visibility by doing the thing many brands still are missing out on: going deep, not wide.


Featured Image: Paulo Bobita/Search Engine Journal

Google Says It Could Make Sense To Use Noindex Header With LLMS.txt via @sejournal, @martinibuster

Google’s John Mueller answered a question about llms.txt and duplicate content, saying it wouldn’t make sense for an llms.txt file to be viewed as duplicate content, though he added that it could make sense to take steps to prevent it from being indexed.

LLMs.txt

Llms.txt is a proposal to create a new content format standard that large language models can use to retrieve the main content of a web page without having to deal with other non-content data, such as advertising, navigation, and anything else that is not the main content. It offers web publishers the ability to provide a curated, Markdown-formatted version of the most important content. The llms.txt file sits at the root level of a website (example.com/llms.txt).
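
As a rough illustration, a minimal llms.txt following the shape of the proposal might look like this (the site, URLs, and descriptions are hypothetical):

```markdown
# Example Store

> Example Store sells eco-friendly home furnishings. The links below point
> language models at clean, Markdown versions of our key pages.

## Products

- [Sofas](https://example.com/sofas.md): Sustainable sofa range
- [Lighting](https://example.com/lighting.md): Low-energy lighting options

## Policies

- [Shipping](https://example.com/shipping.md): Rates, regions, and returns
```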

Contrary to some claims made about llms.txt, it is not in any way similar in purpose to robots.txt. The purpose of robots.txt is to control robot behavior, while the purpose of llms.txt is to provide content to large language models.

Will Google View Llms.txt As Duplicate Content?

Someone on Bluesky asked if llms.txt could be seen by Google as duplicate content, which is a good question. It could happen that someone outside of the website might link to the llms.txt and that Google might begin surfacing that content instead of or in addition to the HTML content.

This is the question asked:

“Will Google view LLMs.txt files as duplicate content? It seems stiff necked to do so, given that they know that it isn’t, and what it is really for.

Should I add a “noindex” header for llms.txt for Googlebot?”

Google’s John Mueller answered:

“It would only be duplicate content if the content were the same as a HTML page, which wouldn’t make sense (assuming the file itself were useful).

That said, using noindex for it could make sense, as sites might link to it and it could otherwise become indexed, which would be weird for users.”

Noindex For Llms.txt

Using a noindex header for the llms.txt file is a good idea because it will prevent the content from entering Google’s index. Blocking the file with robots.txt would be counterproductive: it would only stop Google from crawling the file, which would prevent Googlebot from ever seeing the noindex.
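
Because llms.txt is not an HTML page, the noindex has to be sent as an HTTP header rather than a meta tag. Here’s what that might look like at the server level (the Apache example assumes mod_headers is enabled):

```
# Apache (.htaccess)
<Files "llms.txt">
  Header set X-Robots-Tag "noindex"
</Files>

# Nginx equivalent
location = /llms.txt {
  add_header X-Robots-Tag "noindex";
}
```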

Featured Image by Shutterstock/Krakenimages.com