WordPress Takes Steps To Integrate AI via @sejournal, @martinibuster

WordPress announced the formation of an AI Team that will focus on coordinating the development and integration of AI within the WordPress core. The team is to function similarly to the Performance Team, focusing on developing canonical plugins that users can install to test new functionality before a decision is made about whether or how to integrate it into the WordPress core itself.

The team's goals are to create a strategic focus, enable rapid testing and deployment, and provide a centralized location for collaborating on ideas and projects.

The team will include two Google employees, Felix Arntz and Pascal Birchler. Arntz is a Senior Software Engineer at Google who contributes to the WordPress core and to other WordPress plugins and has worked as a lead for the Performance Team.

Pascal Birchler, a Developer Relations Engineer and WordPress core committer, recently led a project to integrate the Model Context Protocol (MCP) with WordPress via WP-CLI.

The WordPress announcement called it an important step:

“This is an exciting and important step in WordPress’s evolution. I look forward to seeing what we’ll create together and in the open.”

WordPress First Steps On Path Blazed By Competitors

The formation of an AI team is long overdue, as even the new open source Drupal CMS, designed to provide an easy-to-use interface for marketers and creators, has AI-powered features built in. Proprietary CMS provider Wix and shopping platform Shopify have both already integrated AI into their users' workflows.

Read the official WordPress announcement:

Announcing the Formation of the WordPress AI Team

Featured Image by Shutterstock/Hananeko_Studio

WordPress Unpauses Development But Has It Run Out Of Time? via @sejournal, @martinibuster

Automattic announced that it is reversing its four-month pause in WordPress development and will return to focusing on the WordPress core, Gutenberg, and other projects. The pause in contributions came at a critical moment, as competitors had outpaced WordPress in ease of use and technological innovation.

Did WordPress Need A Four-Month Pause?

Automattic's return to normal levels of contribution was initially contingent on WP Engine withdrawing its lawsuit against Automattic and Mullenweg, with the announcement stating:

“We’re excited to return to active contributions to WordPress core, Gutenberg, Playground, Openverse, and WordPress.org when the legal attacks have stopped.”

WP Engine and Automattic are still locked in litigation, so what changed?

Automattic suggests that it has reconsidered its place as the future of content management:

“After pausing our contributions to regroup, rethink, and plan strategically, we’re ready to press play again and return fully to the WordPress project.

…We’ve learned a lot from this pause that we can bring back to the project, including a greater awareness of the many ways WordPress is used and how we can shape the future of the web alongside so many passionate contributors. We’re committed to helping it grow and thrive…”

Automattic’s announcement suggests that they realized moving forward with WordPress is important despite continued litigation.

But did Automattic really need a four-month pause to come to that realization?

Where Did The WordPress Money Go?

And it's not as if Automattic was hurting for money to throw at WordPress. Salesforce Ventures invested $300 million in Automattic in 2019, and an elated Mullenweg wrote that this would enable the company to almost double the pace of innovation for WordPress.com, its enterprise offering WordPress VIP, WooCommerce, and Jetpack, and to increase resources for WordPress.org and Gutenberg.

Mullenweg wrote:

“For Automattic, the funding will allow us to accelerate our roadmap (perhaps by double) and scale up our existing products—including WordPress.com, WordPress VIP, WooCommerce, Jetpack, and (in a few days when it closes) Tumblr. It will also allow us to increase investing our time and energy into the future of the open source WordPress and Gutenberg.”

In the years immediately following the $300 million investment, annual updates to WooCommerce increased by between 47.62% and 80.95%, and slightly more in 2024. Jetpack continued at an average release schedule of seven updates per year, although it shot up to 22 updates in 2024. The enterprise-level WordPress VIP premium service may have also benefited (changelog here).

Updates to the WordPress Core remained fairly unchanged according to the official release announcements and the pace of Gutenberg releases also followed a steady pace, with no significant increases.

List of number of WordPress release announcements per year:

  • 2019 – 29 announcements
  • 2020 – 28 announcements
  • 2021 – 26 announcements
  • 2022 – 27 announcements
  • 2023 – 26 announcements
  • 2024 – 30 announcements
  • 2025 – 9 announcements

All the millions of dollars invested in Automattic, along with any other income earned, had no apparent effect on the pace of innovation in the WordPress core.

Survival Of The Fittest CMS

A positive development from Automattic's pause to rethink is the announcement of a new AI group, modeled after its Performance group. The new team is tasked with coordinating AI initiatives within WordPress core development. Like the Performance group, the new AI group was formed after competitors had already outpaced WordPress, so the platform is once again late in adapting to user needs and the fast pace of technology.

Matt Mullenweg struggled to answer where WordPress would be in five years when asked at the February 2025 WordCamp Asia event. He asked someone from Automattic to join him on stage to answer the question, but that other person also couldn’t answer because there was, in fact, no plan or idea other than the short-term roadmap focused on the immediate future.

Mullenweg explained the lack of a long-term vision as a strategic decision to remain adaptable to the fast pace of technology:

“Outside of Gutenberg, we haven’t had a roadmap that goes six months or a year, or a couple versions, because the world changes in ways you can’t predict.

But being responsive is, I think, really is how organisms survive.

You know, Darwin, said it’s not the fittest of the species that survives. It’s the one that’s most adaptable to change. I think that’s true for software as well.”

That’s a somewhat surprising statement, given that WordPress has a history of being years late to prioritizing website performance and AI integration. Divi, Elementor, Beaver Builder, and other WordPress editing environments had already cracked the code on democratizing web design in 2017 with block-based, point-and-click editors when WordPress began their effort to develop their own block-based editor.

Eight years later, Gutenberg is so difficult for many users that the official Classic Editor plugin has over ten million installations, and advanced web developers prefer other, more advanced web builders.

Takeaways:

  • Automattic’s Strategic Reversal
    Automattic reversed its pause on WordPress contributions despite unresolved litigation with WP Engine, perhaps signaling a change in internal priorities or external pressures.
  • Delayed Response to AI Trends
    A new AI group has been formed within WordPress core development, but this move comes years after competitors embraced AI—suggesting a reactive rather than proactive strategy.
  • Lack of Long-Term Vision
    WordPress leadership admits to having no roadmap beyond the short term, framing adaptability as a strength even as the platform lags in addressing user needs and keeping up with technological trends.
  • Minimal Impact from Major Investments
    Despite receiving hundreds of millions in funding, core WordPress and Gutenberg development showed no significant acceleration, raising questions about where investment actually went.
  • Usability and Competitive Lag
    Gutenberg arguably struggles with usability, as shown by the popularity of the Classic Editor plugin and user preference for third-party builders.
  • WordPress at a Competitive Disadvantage
    WordPress now finds itself needing to catch up in a CMS market that has evolved rapidly in both ease of use and innovation.

The bottom line is that the pace of development for the WordPress core and Gutenberg remained steady after the 2019 investment. Despite the millions of dollars Automattic received from companies like Newfold Digital, plus sponsored contributions and volunteer contributions from individuals, development and innovation maintained the same follow-the-competitors-from-behind pace.

Automattic’s return to WordPress core development inadvertently calls attention to how far the platform has fallen behind competitors like Wix in usability and innovation, despite major investments and years of community support. For users and developers, this means that WordPress must now work to regain trust by proving it can adapt quickly and deliver the tools that modern site developers, businesses, and content creators actually need.

Automattic has a legitimate dispute with WP Engine, but the way it was approached became a major distraction that resulted in an arguably unnecessary four-month pause to WordPress development. The platform might have been in danger of losing relevance if not for the work of third-party innovators, and it still arguably lags behind competitors.

Google Lens Integration For YouTube Shorts: Search Within Videos via @sejournal, @MattGSouthern

Google has integrated Lens into YouTube Shorts.

Now, you can search for items you see in videos directly from your phone.

How The New Feature Works

Here’s how to use Google Lens in YouTube Shorts:

  • Pause any Short by tapping the screen.
  • Select “Lens” from the top menu.
  • Circle, highlight, or tap anything you want to search.

You can identify objects, translate text, or learn about locations. The results appear right above the video. When you’re finished, just swipe down to continue watching.

Here’s an example of the interface:

Screenshot from: YouTube.com/CreatorInsider, May 2025.

The feature works with products, plants, animals, landmarks, and text. You can even translate captions in real-time. Some searches include AI Overviews that provide more detailed information about what you’re looking for.

Google shared an example in its announcement:

“If you’re watching a short filmed in a location that you want to visit, you can select a landmark to identify it and learn more about the destination’s culture and history.”

See a demonstration in the video below:

Important Limitations

There are some key restrictions. Google Lens won’t work on Shorts with YouTube Shopping affiliate tags or paid product promotions.

The support docs are clear:

“Tagging a product via YouTube Shopping will disable the lens search.”

Search results only show organic content, meaning no ads will appear when you use Lens. Google also states that it doesn’t use facial recognition technology, although the system may display results for famous people when relevant.

The feature is only compatible with mobile devices (iOS and Android). Google says the beta is “starting to roll out to all viewers this week,” though it hasn’t shared specific dates for different regions.

What This Means For Marketers

This update presents several opportunities for content creators and marketers:

  • Visual elements in your Shorts can now boost engagement.
  • Travel and hospitality businesses receive free visibility when their locations feature in videos.
  • Educational creators can benefit as viewers explore the topics presented in their content.

The ban on affiliate content poses a challenge. Creators who rely on YouTube Shopping must carefully consider their monetization strategies. They will need to find a balance between discoverable content and their revenue goals.

Looking Ahead

Google Lens in YouTube Shorts signals a shift in how people interact with video content. You can now search within videos, not just for them.

For marketers, this means visual elements matter more than ever. The objects, locations, and text in your videos are now searchable entry points.

The exclusion of monetized content also sets up an interesting dynamic. Creators must choose between affiliate revenue and visibility in visual search.

Start planning your Shorts with searchable moments in mind. Your viewers are about to become visual searchers.

Google: Database Speed Beats Page Count For Crawl Budget via @sejournal, @MattGSouthern

Google has confirmed that most websites still don’t need to worry about crawl budget unless they have over one million pages. However, there’s a twist.

Google Search Relations team member Gary Illyes revealed on a recent podcast that how quickly your database operates matters more than the number of pages you have.

This update comes five years after Google shared similar guidance on crawl budgets. Despite significant changes in web technology, Google’s advice remains unchanged.

The Million-Page Rule Stays The Same

During the Search Off the Record podcast, Illyes maintained Google’s long-held position when co-host Martin Splitt inquired about crawl budget thresholds.

Illyes stated:

“I would say 1 million is okay probably.”

This implies that sites with fewer than a million pages can stop worrying about their crawl budget.

What’s surprising is that this number has remained unchanged since 2020. The web has grown significantly, with an increase in JavaScript, dynamic content, and more complex websites. Yet, Google’s threshold has remained the same.

Your Database Speed Is What Matters

Here’s the big news: Illyes revealed that slow databases hinder crawling more than having a large number of pages.

Illyes explained:

“If you are making expensive database calls, that’s going to cost the server a lot.”

A site with 500,000 pages but slow database queries might face more crawl issues than a site with 2 million fast-loading static pages.

What does this mean? You need to evaluate your database performance, not just count the number of pages. Sites with dynamic content, complex queries, or real-time data must prioritize speed and performance.
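One common mitigation for expensive database calls is caching the result so that repeat requests, including crawler hits on popular URLs, never touch the database. The sketch below is a minimal illustration, not WordPress- or Google-specific advice; `get_category_page` and its simulated delay are hypothetical stand-ins for a real database-backed page.

```python
import functools
import time

@functools.lru_cache(maxsize=1024)
def get_category_page(category_id: int) -> str:
    # Hypothetical stand-in for an expensive database query;
    # a real implementation would run the query here.
    time.sleep(0.05)  # simulate a slow query
    return f"<html>category {category_id}</html>"

# First call pays the full query cost; repeat calls for the
# same page are served from memory.
start = time.perf_counter()
get_category_page(42)
cold = time.perf_counter() - start

start = time.perf_counter()
get_category_page(42)
warm = time.perf_counter() - start
print(f"cold: {cold:.3f}s, warm: {warm:.4f}s")
```

In practice the same idea shows up as object caching, full-page caching, or a CDN in front of the origin; the point is that the crawl load a server can absorb depends on how cheap each response is, not on how many pages exist.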

The Real Resource Hog: Indexing, Not Crawling

Illyes shared a sentiment that contradicts what many SEOs believe.

He said:

“It’s not crawling that is eating up the resources, it’s indexing and potentially serving or what you are doing with the data when you are processing that data.”

Consider what this means. If crawling doesn’t consume many resources, then blocking Googlebot may not be helpful. Instead, focus on making your content easier for Google to process after it has been crawled.

How We Got Here

The podcast provided some context about scale. In 1994, the World Wide Web Worm indexed only 110,000 pages, while WebCrawler indexed 2 million. Illyes called these numbers “cute” compared to today.

This helps explain why the one-million-page mark has remained unchanged. What once seemed huge in the early web is now just a medium-sized site. Google’s systems have expanded to manage this without altering the threshold.

Why The Threshold Remains Stable

Google has been striving to reduce its crawling footprint. Illyes revealed why that’s a challenge.

He explained:

“You saved seven bytes from each request that you make and then this new product will add back eight.”

This push-and-pull between efficiency improvements and new features helps explain why the crawl budget threshold remains consistent. While Google’s infrastructure evolves, the basic math regarding when crawl budget matters stays unchanged.

What You Should Do Now

Based on these insights, here’s what you should focus on:

Sites Under 1 Million Pages:
Continue with your current strategy. Prioritize excellent content and user experience. Crawl budget isn’t a concern for you.

Larger Sites:
Enhance database efficiency as your new priority. Review:

  • Query execution time
  • Caching effectiveness
  • Speed of dynamic content generation

All Sites:
Redirect focus from crawl prevention to indexing optimization. Since crawling isn’t the resource issue, assist Google in processing your content more efficiently.

Key Technical Checks:

  • Database query performance
  • Server response times
  • Content delivery optimization
  • Proper caching implementation

Looking Ahead

Google’s consistent crawl budget guidance demonstrates that some SEO fundamentals are indeed fundamental. Most sites don’t need to worry about it.

However, the insight regarding database efficiency shifts the conversation for larger sites. It’s not just about the number of pages you have; it’s about how efficiently you serve them.

For SEO professionals, this means incorporating database performance into your technical SEO audits. For developers, it underscores the significance of query optimization and caching strategies.

Five years from now, the million-page threshold might still exist. But sites that optimize their database performance today will be prepared for whatever comes next.

Listen to the full podcast episode below:


Featured Image: Novikov Aleksey/Shutterstock

Google’s Gary Illyes Warns AI Agents Will Create Web Congestion via @sejournal, @MattGSouthern

A Google engineer has warned that AI agents and automated bots will soon flood the internet with traffic.

Gary Illyes, who works on Google’s Search Relations team, said “everyone and my grandmother is launching a crawler” during a recent podcast.

The warning comes from Google’s latest Search Off the Record podcast episode.

AI Agents Will Strain Websites

During his conversation with fellow Search Relations team member Martin Splitt, Illyes warned that AI agents and “AI shenanigans” will be significant sources of new web traffic.

Illyes said:

“The web is getting congested… It’s not something that the web cannot handle… the web is designed to be able to handle all that traffic even if it’s automatic.”

This surge occurs as businesses deploy AI tools for content creation, competitor research, market analysis, and data gathering. Each tool requires crawling websites to function, and with the rapid growth of AI adoption, this traffic is expected to increase.

How Google’s Crawler System Works

The podcast provides a detailed discussion of Google’s crawling setup. Rather than employing different crawlers for each product, Google has developed one unified system.

Google Search, AdSense, Gmail, and other products utilize the same crawler infrastructure. Each one identifies itself with a different user agent name, but all adhere to the same protocols for robots.txt and server health.

Illyes explained:

“You can fetch with it from the internet but you have to specify your own user agent string.”

This unified approach ensures that all Google crawlers adhere to the same protocols and scale back when websites encounter difficulties.

The Real Resource Hog? It’s Not Crawling

Illyes challenged conventional SEO wisdom with a potentially controversial claim: crawling doesn’t consume significant resources.

Illyes stated:

“It’s not crawling that is eating up the resources, it’s indexing and potentially serving or what you are doing with the data.”

He even joked he would “get yelled at on the internet” for saying this.

This perspective suggests that fetching pages uses minimal resources compared to processing and storing the data. For those concerned about crawl budget, this could change optimization priorities.

From Thousands to Trillions: The Web’s Growth

The Googlers provided historical context. In 1994, the World Wide Web Worm search engine indexed only 110,000 pages, whereas WebCrawler managed to index 2 million. Today, individual websites can exceed millions of pages.

This rapid growth necessitated technological evolution. Crawlers progressed from basic HTTP 1.1 protocols to modern HTTP/2 for faster connections, with HTTP/3 support on the horizon.

Google’s Efficiency Battle

Google spent last year trying to reduce its crawling footprint, acknowledging the burden on site owners. However, new challenges continue to arise.

Illyes explained the dilemma:

“You saved seven bytes from each request that you make and then this new product will add back eight.”

Every efficiency gain is offset by new AI products requiring more data. This is a cycle that shows no signs of stopping.

What Website Owners Should Do

The upcoming traffic surge necessitates action in several areas:

  • Infrastructure: Current hosting may not support the expected load. Assess server capacity, CDN options, and response times before the influx occurs.
  • Access Control: Review robots.txt rules to control which AI crawlers can access your site. Block unnecessary bots while allowing legitimate ones to function properly.
  • Database Performance: Illyes specifically pointed out “expensive database calls” as problematic. Optimize queries and implement caching to alleviate server strain.
  • Monitoring: Differentiate between legitimate crawlers, AI agents, and malicious bots through thorough log analysis and performance tracking.
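The monitoring point above can start with something as simple as counting requests per user-agent category in your access logs. The sketch below is illustrative only: the sample log lines, the `KNOWN_CRAWLERS` table, and the category labels are all assumptions to adapt to your own logs and the bots you actually see.

```python
import re
from collections import Counter

# Hypothetical user-agent substrings; extend with the bots you see in practice.
KNOWN_CRAWLERS = {
    "Googlebot": "search engine",
    "Bingbot": "search engine",
    "GPTBot": "AI crawler",
    "CCBot": "AI crawler",  # Common Crawl
}

def classify(user_agent: str) -> str:
    for needle, label in KNOWN_CRAWLERS.items():
        if needle in user_agent:
            return label
    return "other"

def summarize(log_lines):
    """Count requests per crawler category from combined-format log lines."""
    counts = Counter()
    for line in log_lines:
        # The combined log format puts the user agent in the last quoted field.
        match = re.search(r'"([^"]*)"\s*$', line)
        if match:
            counts[classify(match.group(1))] += 1
    return counts

sample = [
    '1.2.3.4 - - [01/Jul/2025] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '5.6.7.8 - - [01/Jul/2025] "GET /about HTTP/1.1" 200 512 "-" "GPTBot/1.0"',
]
print(summarize(sample))
```

A real deployment would also verify claimed crawler identities (for example, via reverse DNS for Googlebot), since user-agent strings are trivially spoofed.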

The Path Forward

Illyes pointed to Common Crawl as a potential model, which crawls once and shares data publicly, reducing redundant traffic. Similar collaborative solutions may emerge as the web adapts.

While Illyes expressed confidence in the web’s ability to manage increased traffic, the message is clear: AI agents are arriving in massive numbers.

Websites that strengthen their infrastructure now will be better equipped to weather the storm. Those who wait may find themselves overwhelmed when the full force of the wave hits.

Listen to the full podcast episode below:


Featured Image: Collagery/Shutterstock

Google’s Query Fan-Out Patent: Thematic Search via @sejournal, @martinibuster

A patent that Google filed in December 2024 presents a close match to the Query Fan-Out technique that Google’s AI Mode uses. The patent, called Thematic Search, offers an idea of how AI Mode answers are generated and suggests new ways to think about content strategy.

The patent describes a system that organizes related search results to a search query into categories, what it calls themes, and provides a short summary for each theme so that users can understand the answers to their questions without having to click a link to all of the different sites.

The patent describes a system for deep research, for questions that are broad or complex. What’s new about the invention is how it automatically identifies themes from the traditional search results and uses an AI to generate an informative summary for each one using both the content and context from within those results.

Thematic Search Engine

Themes are a concept that goes back to the early days of search engines, which is why this patent caught my eye a few months ago and caused me to bookmark it.

Here's the TL;DR of what it does:

  • The patent references its use within the context of a large language model and a summary generator.
  • It also references a thematic search engine that receives a search query and then passes that along to a search engine.
  • The thematic search engine takes the search engine results and organizes them into themes.
  • The patent describes a system that interfaces with a traditional search engine and uses a large language model for generating summaries of thematically grouped search results.
  • The patent describes that a single query can result in multiple queries that are based on "sub-themes."

Comparison Of Query Fan-Out And Thematic Search

The system described in the patent mirrors what Google's documentation says about the Query Fan-Out technique.

Here’s what the patent says about generating additional queries based on sub-themes:

“In some examples, in response to the search query 142-2 being generated, the thematic search engine 120 may generate thematic data 138-2 from at least a portion of the search results 118-2. For example, the thematic search engine 120 may obtain the search results 118-2 and may generate narrower themes 130 (e.g., sub-themes) (e.g., “neighborhood A”, “neighborhood B”, “neighborhood C”) from the responsive documents 126 of the search results 118-2. The search results page 160 may display the sub-themes of theme 130a and/or the thematic search results 119 for the search query 142-2. The process may continue, where selection of a sub-theme of theme 130a may cause the thematic search engine 120 to obtain another set of search results 118 from the search engine 104 and may generate narrower themes 130 (e.g., sub-sub-themes of theme 130a) from the search results 118 and so forth.”

Here’s what Google’s documentation says about the Query Fan-Out Technique:

“It uses a “query fan-out” technique, issuing multiple related searches concurrently across subtopics and multiple data sources and then brings those results together to provide an easy-to-understand response. This approach helps you access more breadth and depth of information than a traditional search on Google.”

The system described in the patent resembles what Google’s documentation says about the Query Fan-Out technique, particularly in how it explores subtopics by generating new queries based on themes.
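As a rough illustration of the fan-out idea, a client might issue the original query plus one query per sub-theme concurrently and then merge the answers. Everything below (the `search` stub, the theme labels) is hypothetical; it sketches the concurrency pattern described in the documentation, not Google's actual implementation.

```python
import asyncio

async def search(query: str) -> list[str]:
    # Hypothetical stand-in for a call to a search backend;
    # here it just fabricates result identifiers.
    await asyncio.sleep(0.01)
    return [f"result for '{query}' #{i}" for i in range(2)]

async def fan_out(query: str, sub_themes: list[str]) -> dict[str, list[str]]:
    # Issue the original query plus one query per sub-theme concurrently,
    # then collect the answers keyed by query.
    queries = [query] + [f"{query} {theme}" for theme in sub_themes]
    results = await asyncio.gather(*(search(q) for q in queries))
    return dict(zip(queries, results))

merged = asyncio.run(fan_out("moving to Denver",
                             ["neighborhoods", "cost of living"]))
for q, hits in merged.items():
    print(q, "->", len(hits), "results")
```

The concurrent `gather` call is what lets a single user query "access more breadth and depth of information" than one search would, at the cost of multiplying backend traffic.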

Summary Generator

The summary generator is a component of the thematic search system. It’s designed to generate textual summaries for each theme generated from search results.

This is how it works:

  • The summary generator is sometimes implemented as a large language model trained to create original text.
  • The summary generator uses one or more passages from search results grouped under a particular theme.
  • It may also use contextual information from titles, metadata, surrounding related passages to improve summary quality.
  • The summary generator can be triggered when a user submits a search query or when the thematic search engine is initialized.

The patent doesn’t define what ‘initialization’ of the thematic search engine means, maybe because it’s taken for granted that it means the thematic search engine starts up in anticipation of handling a query.

Query Results Are Clustered By Theme Instead Of Traditional Ranking

The traditional search results, in some examples shared in the patent, are replaced by grouped themes and generated summaries. Thematic search changes what content is shown and linked to users. A typical query that a publisher or SEO is optimizing for may now be only the starting point of a user's information journey. The thematic search results lead a user down a path of discovering sub-themes of the original query, and the site that ultimately wins the click might not be the one that ranks number one for the initial search query but rather another web page that is relevant to an adjacent query.

The patent describes multiple ways that the thematic search engine can work (I added bullet points to make it easier to understand):

  • “The themes are displayed on a search results page, and, in some examples, the search results (or a portion thereof) are arranged (e.g., organized, sorted) according to the plurality of themes. Displaying a theme may include displaying the phrase of the theme.
  • In some examples, the thematic search engine may rank the themes based on prominence and/or relevance to the search query.
  • The search results page may organize the search results (or a portion thereof) according to the themes (e.g., under the theme of ‘cost of living”, identifying those search results that relate to the theme of ‘cost of living”).
  • The themes and/or search results organized by theme by the thematic search engine may be rendered in the search results page according to a variety of different ways, e.g., lists, user interface (UI) cards or objects, horizontal carousel, vertical carousel, etc.
  • The search results organized by theme may be referred to as thematic search results. In some examples, the themes and/or search results organized by theme are displayed in the search results page along with the search results (e.g., normal search results) from the search engine.
  • In some examples, the themes and/or theme-organized search results are displayed in a portion of the search results page that is separate from the search results obtained by the search engine.”

Content From Multiple Sources Are Combined

The AI-generated summaries are created from multiple websites and grouped under a theme. This makes link attribution, visibility, and traffic difficult to predict.

In the following citation from the patent, the reference to “unstructured data” means content that’s on a web page.

According to the patent:

“For example, the thematic search engine may generate themes from unstructured data by analyzing the content of the responsive documents themselves and may thematically organize the search results according to the themes.

….In response to a search query (“moving to Denver”), a search engine may obtain search results (e.g., responsive documents) responsive to that search query.

The thematic search engine may select a set of responsive documents (e.g., top X number of search results) from the search results obtained by the search engine, and generate a plurality of themes (e.g., “neighborhoods”, “cost of living”, “things to do”, “pros and cons”, etc.) from the content of the responsive documents.

A theme may include a phrase, generated by a language model, that describes a theme included in the responsive documents. In some examples, the thematic search engine may map semantic keywords from each responsive document (e.g., from the search results) and connect the semantic keywords to similar semantic keywords from other responsive documents to generate themes.”
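The keyword-connecting step described in that last passage can be illustrated with a toy grouping pass: documents that share a keyword are clustered under that keyword as a theme. This is a deliberately simplified sketch; the patent's system uses a language model and semantic matching rather than the literal string overlap shown here, and all URLs and keywords are invented.

```python
from collections import defaultdict

# Toy inputs: (source URL, extracted keywords). In the patent's terms,
# the keywords would come from analyzing each responsive document's content.
docs = [
    ("site-a.example/denver-homes", {"neighborhoods", "housing"}),
    ("site-b.example/denver-budget", {"cost of living", "housing"}),
    ("site-c.example/denver-guide", {"neighborhoods", "things to do"}),
]

def group_by_theme(documents):
    """Cluster documents under each keyword they share with other documents."""
    themes = defaultdict(list)
    for url, keywords in documents:
        for keyword in sorted(keywords):
            themes[keyword].append(url)
    # Keep only keywords shared by more than one document as themes.
    return {k: v for k, v in themes.items() if len(v) > 1}

print(group_by_theme(docs))
```

Even this crude version shows the structural shift: the output is organized by theme, with each theme pointing to multiple source URLs, rather than a single ranked list of pages.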

Content From Source Pages Are Linked

The documentation states that the thematic search engine links to the URLs of the source pages. It also states that the thematic search result could include the web page's title or other metadata. But the part that's important for SEOs and publishers is the part about attribution: links.

“…a thematic search result 119 may include a title 146 of the responsive document 126, a passage 145 from the responsive document 126, and a source 144 of the responsive document. The source 144 may be a resource locator (e.g., uniform resource location (URL)) of the responsive document 126.

The passage 145 may be a description (e.g., a snippet obtained from the metadata or content of the responsive document 126). In some examples, the passage 145 includes a portion of the responsive document 126 that mentions the respective theme 130. In some examples, the passage 145 included in the thematic search result 119 is associated with a summary description 166 generated by the language model 128 and included in a cluster group 172.”

User Interaction Influences Presentation

As previously mentioned, the thematic search engine does not return a ranked list of documents for a search query. It returns a collection of information across themes related to the initial query. User interaction with those AI-generated summaries influences which sites receive traffic.

Automatically generated sub-themes can present alternative paths on the user’s information journey that begins with the initial search query.

Summarization Uses Publisher Metadata

The summary generator uses document titles, metadata, and surrounding textual content, which means that well-structured content may influence how summaries are constructed.

The following is what the patent says, I added bullet points to make it easier to understand:

  • “The summary generator 164 may receive a passage 145 as an input and outputs a summary description 166 for the inputted passage 145.
  • In some examples, the summary generator 164 receives a passage 145 and contextual information as inputs and outputs a summary description 166 for the passage 145.
  • In some examples, the contextual information may include the title of the responsive document 126 and/or metadata associated with the responsive document 126.
  • In some examples, the contextual information may include one or more neighboring passages 145 (e.g., adjacent passages).
  • In some examples, the contextual information may include a summary description 166 for one or more neighboring passages 145 (e.g., adjacent passages).
  • In some examples, the contextual information may include all the other passages 145 on the same responsive document 126. For example, the summary generator may receive a passage 145 and the other passages 145 (e.g., all other passages 145) on the same responsive document 126 (and, in some examples, other contextual information) as inputs and may output a summary description 166 for the passage 145.”
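The bullet points above describe a summarizer that accepts a passage plus optional context: the document title, metadata, and neighboring passages or their summaries. A minimal sketch of that interface, with a placeholder where the patent's language model would sit (the names `PassageContext` and `summarize` are mine, not from the patent):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PassageContext:
    # Contextual inputs the patent lists; all optional.
    title: Optional[str] = None
    metadata: dict = field(default_factory=dict)
    neighboring_passages: List[str] = field(default_factory=list)
    neighboring_summaries: List[str] = field(default_factory=list)

def summarize(passage: str, context: Optional[PassageContext] = None) -> str:
    """Stand-in for the patent's "summary generator 164".

    A real system would prompt a language model with the passage and its
    context; here we only truncate, to show the data flow, not the model.
    """
    prompt_parts = [passage]
    if context:
        if context.title:
            prompt_parts.append(f"Title: {context.title}")
        prompt_parts.extend(context.neighboring_passages)
    combined = " ".join(prompt_parts)
    return combined[:80]  # placeholder "summary"

ctx = PassageContext(title="Thematic Search Explained")
print(summarize("Thematic search groups results by theme.", ctx))
```

The takeaway is that a page's title and surrounding passages feed into the summary for any single passage, which is why clean titles and coherent adjacent content may matter.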

Thematic Search: Implications For Content & SEO

There are two ways an AI Mode session can end for a publisher:

  1. Since users may get their answers from theme summaries or dropdowns, zero-click behavior is likely to increase, reducing traffic from traditional links.
  2. Or, it could be that the web page that provides the end of the user’s information journey for a given query is the one that receives the click.

I think this means we need to rethink the paradigm of ranking for keywords. Instead, consider what question a web page answers, then identify the follow-up questions related to that initial query. Either answer those follow-ups on the same page or create another page that answers what may be the end of the information journey for a given search query.

You can read the patent here:

Thematic Search (PDF)

Read Google’s Documentation Of AI Mode (PDF)

Google Fixes AI Mode Traffic Attribution Bug via @sejournal, @MattGSouthern

Google has fixed a bug that caused AI Mode search traffic to be reported as “direct traffic” instead of “organic traffic” in Google Analytics.

The problem started last week, when Google began adding a rel=”noopener noreferrer” attribute to links in its AI Mode search results. The attribute caused Google Analytics to misattribute the traffic those links sent to websites, recording it as direct rather than as coming from Google Search.

Reports from Aleyda Solis, Founder at Orainti, and others in the SEO community confirm the issue is resolved.

Discovery of the Attribution Problem

Maga Sikora, an SEO director specializing in AI search, first identified the issue. She warned other marketers:

“Traffic from Google’s AI Mode is being tagged as direct in GA — not organic, as Google adds a rel=’noopener noreferrer’ to those links. Keep this in mind when reviewing your reports.”

The noreferrer value is typically used for security and privacy purposes. In this case, however, it blocked Google Analytics from identifying the actual source of the traffic.
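The mechanics are straightforward: rel=”noreferrer” tells the browser to omit the Referer header, and without a referrer or campaign parameters, an analytics tool has little choice but to bucket the session as direct. A toy version of that fallback logic (a simplified illustration, not Google Analytics' actual attribution algorithm):

```python
from typing import Optional

def classify_session(referrer: Optional[str], landing_url: str = "") -> str:
    """Toy channel attribution: no referrer and no campaign
    parameters means the session falls back to 'direct'."""
    if referrer and "google." in referrer:
        return "organic"
    if "utm_source=" in landing_url or "gclid=" in landing_url:
        return "paid/campaign"
    return "direct"

# Normal organic click: the browser sends the Google referrer.
print(classify_session("https://www.google.com/"))  # organic
# Link with rel="noreferrer": the header is suppressed entirely.
print(classify_session(None))                       # direct
```

This is why the bug silently inflated “direct” traffic rather than producing an obvious error.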

Google Acknowledges the Bug

John Mueller, Search Advocate at Google, quickly responded. He suggested it was a mistake on Google’s end, stating:

“My assumption is that this will be fixed; it looks like a bug on our side.”

Mueller also explained that Search Console doesn’t currently display AI Mode data, but it will be available soon.

He added:

“We’re updating the documentation to reflect this will be showing soon as part of the AI Mode rollout.”

Rapid Resolution & Current Status

Google fixed the problem within days.

Solis confirmed the fix:

“I don’t see the ‘noreferrer’ in Google’s AI Mode links anymore.”

She’s now seeing AI Mode data in her analytics and is verifying that traffic is correctly labeled as “organic” instead of “direct.”

Impact on SEO Reporting

The bug may have affected your traffic data for several days. If your site received AI Mode traffic during this period, some of your “direct” traffic may have been organic search traffic.

This misclassification could have:

  • Skewed conversion tracking
  • Affected budget decisions
  • Made SEO performance look worse than it was
  • Hidden the true impact of AI Mode on your site

What To Do Now

Here’s your action plan:

  1. Audit recent traffic data – Check for unusual spikes in direct traffic from the past week
  2. Document the issue – Note the affected dates for future reference
  3. Adjust reporting – Consider adding notes to client reports about the temporary bug
  4. Prepare for AI Mode tracking – Start planning how to measure this new traffic source

Google’s prompt response shows it understands the importance of accurate data for marketers.


Featured Image: Tada Images/Shutterstock

Is SEO Still Relevant In The AI Era? New Research Says Yes via @sejournal, @MattGSouthern

New research analyzing 25,000 user searches found that websites ranked #1 on Google appear in AI search answers 25% of the time.

This data demonstrates that traditional SEO remains relevant, despite claims that AI has rendered it obsolete.

Tomasz Rudzki, co-founder of ZipTie, studied real searches across ChatGPT, Perplexity, and Google’s AI Overviews. His findings challenge the widespread belief that AI makes traditional SEO pointless.

Top Rankings Translate To AI Visibility

The data shows a clear pattern: if you rank #1 on Google, you have a 1-in-4 chance of appearing in AI search results. Lower rankings result in lower chances.

Rudzki stated:

“The higher you rank in Google’s top 10, the more likely you are to appear in AI search results across platforms. This isn’t speculation – it’s based on real queries from real users.”

The pattern holds across all major AI search platforms, suggesting that they all rely on traditional rankings when selecting sources.

How AI Search Engines Select Sources

The study detailed how AI search operates, using information from Google’s antitrust trial. The process involves three main steps:

Step 1: Pre-selection
AI systems identify the best documents for each query, favoring pages with higher Google rankings.

Step 2: Content Extraction
The AI extracts relevant information from these top-ranking pages, prioritizing content that directly answers the user’s question.

Step 3: AI Synthesis
The AI synthesizes this information into one clear answer, utilizing Google’s Gemini model for this step.
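The three steps can be sketched as a pipeline. This is a simplified illustration of the process the study describes, not Google's implementation; the function names and toy index are mine.

```python
from typing import Dict, List

def preselect(query: str, index: List[Dict]) -> List[Dict]:
    # Step 1: favor documents with better (lower) Google rank.
    return sorted((d for d in index if query in d["text"]),
                  key=lambda d: d["rank"])[:3]

def extract(docs: List[Dict]) -> List[str]:
    # Step 2: pull the relevant content from the top-ranking pages.
    return [d["text"] for d in docs]

def synthesize(passages: List[str]) -> str:
    # Step 3: a real system hands these to an LLM (e.g., Gemini);
    # here we simply join them.
    return " ".join(passages)

index = [
    {"rank": 1,  "text": "what is seo: seo means search engine optimization"},
    {"rank": 5,  "text": "what is seo: a beginner guide"},
    {"rank": 12, "text": "an unrelated page"},
]
answer = synthesize(extract(preselect("what is seo", index)))
print(answer)
```

The sketch makes the dependency explicit: whatever fails pre-selection in Step 1 can never appear in the synthesized answer.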

Google’s internal documents from the trial confirmed a critical fact: using top-ranking content enhances the accuracy of AI responses, which explains why traditional rankings continue to be so significant.

The Query Fan-Out Effect Explained

Sometimes AI answers cite sources that don’t rank in the top 10. The research identified two reasons why:

Reason 1: Personalization

Search results differ by user. A page might rank high for one user but not for another.

Reason 2: Query Fan-Out

This is the more significant factor. According to Google’s documentation:

“Both AI Overviews and AI Mode may use a ‘query fan-out’ technique — issuing multiple related searches across subtopics and data sources — to develop a response.”

Here’s what that means in simple terms:

When you search for “SEO vs SEM,” the AI discreetly runs multiple searches:

  • “What is SEO?”
  • “SEO explained”
  • “What is PPC?”
  • Plus several other related searches

Pages that perform well for these additional searches can appear in results even if they don’t rank for your primary search.
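In code terms, fan-out means the candidate pool is the union of results across several synthetic sub-queries, which is exactly how a page can surface without ranking for the original query. A hedged sketch (the templates and rankings are illustrative; the real expansion is model-driven):

```python
from typing import Dict, List, Set

def fan_out(query: str) -> List[str]:
    # Illustrative sub-query templates only.
    templates = ["what is {q}", "{q} explained", "{q} comparison"]
    return [query] + [t.format(q=query) for t in templates]

def gather_sources(query: str, rankings: Dict[str, List[str]]) -> Set[str]:
    """Union of top results across the original query and its fan-out."""
    sources: Set[str] = set()
    for q in fan_out(query):
        sources.update(rankings.get(q, []))
    return sources

rankings = {
    "seo vs sem": ["siteA.com"],
    "what is seo vs sem": ["siteB.com"],  # ranks only for a sub-query
}
print(gather_sources("seo vs sem", rankings))
```

Here siteB.com reaches the candidate pool despite never ranking for the user's actual query.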

The research shows we need to think differently about content.

Traditional SEO focused on creating the “best page.” This meant comprehensive guides covering everything about a topic.

AI search wants the “best answer.” This means specific, focused responses to exact questions.

The analysis notes:

“When someone asks specifically about iPhone 15 battery life, you may rank top 1 in Google, but AI doesn’t care about it if you don’t provide a precise, relevant answer to that exact question.”

Marketers need to shift from keyword optimization to answering real questions.

Practical Implications For Digital Marketers

Here’s what marketers should do based on these findings:

  • Continue your SEO efforts: Top 10 rankings directly impact AI visibility. Do not abandon your SEO strategies.
  • Restructure your content: Divide lengthy guides into sections that address specific questions.
  • Target related searches: Optimize for various versions of your main keywords.
  • Write clearly: AI systems favor straightforward answers over content loaded with keywords.
  • Track everything: Monitor your visibility in both traditional and AI search results.

Industry Impact and Future Considerations

This research comes at the perfect time. AI search is growing rapidly. Understanding how it connects to traditional rankings gives you an edge.

Consider this: Only 25% of #1-ranked content appears in AI results. That means 75% is missing out. This suggests an opportunity for marketers who adapt.

Rudzki concludes:

“Instead of asking ‘How do I rank higher?’ start asking ‘How do I better serve users who have specific questions?’ That mindset shift is the key to thriving in the AI search era.”

For an industry experiencing rapid adoption of AI, these findings provide a strong foundation for informed strategic decisions. Instead of abandoning SEO practices, the evidence suggests building on what already works.


Featured Image: Tada Images/Shutterstock

Google’s CEO Says AI Overviews Website Referrals Are Increasing via @sejournal, @martinibuster

Google’s Sundar Pichai said in an interview that AI Overviews sends more traffic to a wider set of websites, insisting that Google cares about the web ecosystem. He said he expects AI Mode to continue sending more traffic to websites, a claim the interviewer challenged.

AI Agents Remove Customer Relationship Opportunities

There is a revolutionary change coming soon to ecommerce, in which AI agents research and make purchase decisions on behalf of consumers. The interviewer noted that some merchants have expressed concern that this will erode their ability to upsell or develop a customer relationship.

A customer relationship includes things like getting customers to subscribe to an email list or receive text messages about sales, offering a coupon for a future purchase, or getting them to come back and leave product reviews: all the ways a human consumer interacts with a brand that an AI agent does not.

Sundar Pichai responded that AI agents present a good user experience. He compared an AI agent sitting between a customer and a merchant to a credit card company that sits between the two: a cost a merchant is willing to pay to increase business.

Pichai explained:

“I can literally see, envision 20 different ways this could work. Consumers could pay a subscription for agents, and their agents could rev share back. So you know, so that that is the CIO use case you’re talking about. That’s possible. We can’t rule that out. I don’t think we should underestimate, people may actually see more value participating in it.

I think this is, you know, it’s tough to predict, but I do think over time like you know like if you’re removing friction and improving user experience, it’s tough to bet against those in the long run, right? And so I think, in general if you’re lowering friction for it, you know, and and people are enjoying using it, somebody’s going to want to participate in it and grow their business.

And like would brands want to be in retailers? Why don’t they sell directly today? Why don’t they sell directly today? Why won’t they do that? Because retailers provide value in the middle.

Why do merchants take credit cards? There are many parts like and you find equilibrium because merchants take credit cards because they see more business as part of taking credit cards than not, right. And which justifies the increased cost of taking credit cards and may not be the perfect analogy. But I think there are all these kinds of effects going around.”

Pichai Claims That Web Ecosystem Is Growing

The interviewer turned to the web ecosystem, calling attention to the “downstream” effect of AI Search and AI search agents on information providers and other sites on the web.

Pichai began his answer the same way he did in another interview on this question: he deflected the question about web content by talking about video content.

He also pushed back on the claim that Google is killing the web ecosystem, citing that the number of web pages in Google’s index has grown by 45% over the past two years and claiming that the growth is not from AI-generated content.

He said:

“I do think people are consuming a lot more information and the web is one specific format. So we should talk about the web, but zooming back out, …there are new platforms like YouTube and others too. So I think people are just consuming a lot more information, right? So it feels like like an expansionary moment. I think there are more creators. People are putting out more content, you know, and so people are generally doing a lot more. Maybe people have a little extra time in their hands. And so it’s a combination of all that.

On the web, look things have been interesting and you know we’ve had these conversations for a while, you know, obviously in 2015 there was this famous, the web is dead. You know, I always have it somewhere around, you know, which I look at it once in a while. Predictions, it’s existed for a while.

I think web is evolving pretty profoundly. When we crawl, when we look at the number of pages available to us, that number has gone up by 45% in the last two years alone. So that’s a staggering thing to think about.”

The interviewer challenged Pichai’s claim by asking if Google is detecting whether that increase in web pages is because they’re AI generated.

Pichai appeared caught off guard by the question and, after some hesitation, responded that Google has many techniques for understanding the quality of web pages, including whether a page was machine-generated.

He doubled down on his statement that the web ecosystem is growing, drifted off-topic, and then returned to the question.

He continued:

“That doesn’t explain the trend we are seeing. So, generally there are more web pages. At an underlying level, so I think that’s an interesting phenomenon. I think everybody as a creator, like you do at The Verge, I think today if you’re doing stuff you have to do it in a cross-platform, cross-format way. So I think things are becoming more dynamic cross-format.

I think another thing people are underestimating with AI is AI will make it zero-friction to move from one format to another, because our models are multi-modal.

So I think this notion, the static moment of, you produce content by format, whereas I think machines can help translate it from, almost like different languages and they can go seamlessly between. I think it’s one of the incredible opportunities to be unlocked.

I think people are producing a lot of content, and I see consumers consuming a lot of content. We see it in our products. Others are seeing it too. So that’s probably how I would answer at the highest level.”

Related: The Data Behind Google’s AI Overviews: What Sundar Pichai Won’t Tell You

Search Traffic and Referral Patterns

The interviewer asked Pichai what his response is to people who say that AI Overviews is crushing their business.

Pichai answered:

“AI mode is going to have sources and you know, we’re very committed as a direction, as a product direction, part of why people come to Google is to experience that breadth of the web and and go in the direction they want to, right?

So I view us as giving more context. Yes, there are certain questions which may get answers, but overall that’s the pattern we see today. And if anything over the last year, it’s clear to us the breadth of where we are sending people to is increasing. And, so I expect that to be true with AI Mode as well.”

The interviewer immediately responded by noting that if everything Pichai said was true, people would be less angry with him.

Pichai dismissed the question, saying:

“You’re always going to have areas where people are robustly debating value exchanges, etc. … No one sends traffic to the web the way we do.”

See also: Google’s AI Overviews Slammed By News Publishers

Oh, Really?

What do you think? Are Google’s AI features prioritizing sending traffic to websites?

Watch the Sundar Pichai interview here:

Featured Image: Screenshot from video

Google Claims AI Overviews Monetize At Same Rate As Traditional Search via @sejournal, @MattGSouthern

Google claims that search results with AI Overviews generate the same amount of advertising revenue as traditional search results.

This claim was made during Google Marketing Live when the company revealed plans to expand AI Overview ads to desktop users and more English-speaking markets.

If true, this could reshape how marketers perceive Google’s AI-powered future. However, the claim raises questions about how Google measures success and what it means for your campaigns.

Marketers need to understand what lies behind these claims and what they indicate for the future of search advertising.

AI Overviews Reaches Massive Scale

Google launched AI Overviews on mobile in the US last year. Since then, the company has quickly expanded the feature worldwide. It now processes AI-generated responses for users in more than 200 countries.

Shashi Thakur, Google’s VP/GM of Advertising, stated during the press session:

“We started rolling out AI overviews in search on US mobile last year. At this point, we are reaching a billion and a half users using it every month.”

Thakur oversees advertising across Google’s search products. This includes Google.com, Discover, Image Search, Lens, and Maps. He noted that users are happy with the feature.

The expansion shows Google’s confidence in both user adoption and commercial success. The company announced the desktop expansion that morning at the event, representing the latest phase of their rapid global rollout.

Thakur explained the growth impact:

“The consequence of us building AI overviews is that people are seeing growth. People are asking more of those questions… So we are seeing growth. So people are asking more questions. Many of those questions are even commercial. So we are seeing a growth even in commercial.”

Google’s Broader Vision For Search Evolution

Google’s approach to AI Overviews reflects a fundamental shift in how the company thinks about search capabilities. Thakur outlined this vision:

“At its core, we think about search as expanding the kinds of curiosities you can express. Humans have innumerable number of curiosities. There’s only a fraction of those that gets expressed to search. The more we advance the technology, the more we advance the product, users can bring more of their curiosities to search.”

This philosophy drives Google’s push toward AI-powered responses that can handle more complex and nuanced queries than traditional keyword-based searches.

How Google Measures AI Overview Monetization

Google’s revenue claims are based on controlled experiments. The company compares identical search queries with and without AI Overviews. They use standard A/B testing methods.

This means showing the AI feature to some users while holding it back from others. Then they measure the revenue difference.

Thakur explained to reporters:

“When we say AI overviews monetizes at the same rate, if you had taken the exact same set of queries and not shown AI overviews, it would have monetized at some rate. This continues to monetize at the same rate.”

The testing focuses on overall business value and revenue. It doesn’t examine individual metrics, such as click-through rates. Google emphasized this represents performance across many queries, not individual searches.

For advertisers, this suggests AI Overviews don’t hurt existing search advertising effectiveness. However, the long-term effects of changing user behavior patterns remain unclear.
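The experiment design Thakur describes is a standard holdback test: show AI Overviews to one randomly assigned group of users, withhold the feature from another, and compare revenue per query on the same query mix. A minimal sketch of that comparison, using made-up numbers rather than any Google data:

```python
from statistics import mean

def revenue_per_query(sessions) -> float:
    """Average ad revenue per query for one experiment arm."""
    return mean(s["revenue"] for s in sessions)

# Hypothetical per-query ad revenue for the same set of queries.
control   = [{"revenue": r} for r in (0.10, 0.00, 0.25, 0.05)]  # no AI Overview
treatment = [{"revenue": r} for r in (0.12, 0.00, 0.23, 0.05)]  # AI Overview shown

delta = revenue_per_query(treatment) - revenue_per_query(control)
print(f"revenue/query delta: {delta:+.3f}")
```

"Monetizes at the same rate" simply means this delta is near zero in aggregate; it says nothing about any individual query or metric such as click-through rate.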

Shashi Thakur speaks to press at Google Marketing Live.
Photo: Matt G. Southern/Search Engine Journal.

Strategic Approach To AI Overview Advertising

Google states that ads within AI Overviews adhere to the same quality guidelines as traditional search ads. The company requires that ads be of high quality and fit well with the user experience. All ads must be marked as sponsored content.

Advertisers have three placement options for AI Overview ads: above the AI response, below the response, or integrated within the AI answer itself. This gives marketers flexibility in how they appear alongside AI-generated content.

The complexity of modern user behavior drives Google’s advertising strategy. Thakur noted:

“I think the main thing to take away from those conversations is user journeys are complicated. And users get inspiration to get into their commercial journeys at innumerable points in their journeys.”

The integration focuses on identifying commercial intent within complex queries through what Google refers to as “faceted” searches. These are complex questions that contain multiple sub-questions, some of which have commercial intent.

Thakur gave an example of a user asking about airline rules for traveling with pets. That person might then need pet carriers or travel accessories, creating natural opportunities for advertising. The AI system can identify these layered commercial needs within a single complex query.

Google uses various classifiers to identify commercial intent, including shopping queries, travel queries, and insurance queries. This automated classification system helps match ads to relevant user needs.
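A toy version of that kind of classification, using keyword rules purely for illustration; Google's actual classifiers are trained models, and these rules and category names are mine:

```python
# Illustrative keyword rules standing in for trained intent classifiers.
INTENT_RULES = {
    "shopping":  ["buy", "price", "carrier", "best"],
    "travel":    ["flight", "airline", "hotel"],
    "insurance": ["insurance", "coverage", "premium"],
}

def commercial_intents(query: str) -> list:
    """Return every commercial category the query appears to touch."""
    q = query.lower()
    return [intent for intent, kws in INTENT_RULES.items()
            if any(kw in q for kw in kws)]

# A "faceted" query carrying more than one commercial sub-intent:
print(commercial_intents("airline rules for flying with pets, best pet carrier"))
```

Note how a single faceted query triggers multiple categories, which is the layered commercial need Thakur's pet-travel example describes.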

Thakur stated:

“Ads need to be high quality, and they need to be cohesive with the experience. Ads of this nature extend how good the answer is for certain users.”

Google reports positive user feedback about ads shown with AI Overviews. This suggests the integration doesn’t significantly hurt user satisfaction.

This user acceptance seems crucial to Google’s strategy. The company plans to expand AI Overview advertising to more platforms and markets.


Implications For Digital Marketers

The revenue parity claim addresses advertiser concerns about AI’s impact on search advertising effectiveness.

Thakur acknowledged the fundamental question marketers are asking:

“So now, the question we often get from our advertisers, and it’s a natural question, which is, this is great. Search is evolving in lots of exciting directions. How do we participate? And how do we connect with our customers in the context of this evolving experience?”

Thakur noted that over 80% of Google advertisers already use some form of AI-driven advertising technology. This suggests the industry is ready for more AI integration.

However, the shift toward AI-powered search responses may require advertisers to adapt their strategies. Users are asking increasingly complex, longer queries. Traditional keyword targeting may not be effective in addressing these.

Google’s solution involves increased automation through tools like the newly announced “AI Max for search” feature. Early beta testing of AI Max has shown promising results, with advertisers experiencing an average 27% increase in conversions while maintaining similar return on investment (ROI) targets.

Thakur explained the motivation behind AI Max:

“So the motivation for this, essentially, was this changing user behavior. That’s number one. As we heard from our advertisers, we got the feedback very clearly that transparency and control of the form, they were already used on search campaigns. That continues to be super important in addition to the automation.”

The tool maintains the transparency and control features that advertisers expect from traditional search campaigns, including keyword performance reporting and campaign controls. This addresses concerns about losing visibility when embracing automation.

The company’s emphasis on automation reflects a challenge. It’s hard to match ads to sophisticated, conversational queries that can contain multiple commercial intents.

Manual keyword strategies may become less effective over time. This is especially true as search behavior evolves toward natural language interactions.

AI Mode Expansion Creates New Opportunities

Beyond AI Overviews, Google is testing ads within its new AI Mode, which enables fully conversational search experiences. Early data indicates that users in AI mode ask questions that are up to twice as long as regular search queries.

These longer, more conversational queries create additional opportunities for identifying commercial intent within complex questions. The extended query length often means users are providing more context about their needs, potentially making ad targeting more precise.

Google is applying lessons learned from AI Overviews to ensure ads in AI mode maintain the same quality and user experience standards.

Looking Ahead

Thakur emphasized that Google’s approach remains focused on delivering a high-quality user experience while providing business value to advertisers.

The actual test of Google’s revenue claims will come as AI Overviews mature. User behavior patterns need time to solidify.

As Google continues expanding AI Overview advertising globally, digital marketers face a balancing act. They must embrace new automated tools while maintaining the control and transparency that drive successful campaign performance.


Featured Image: Mijansk786/Shutterstock