Google Gemini Upgrades: New AI Capabilities Announced At I/O via @sejournal, @MattGSouthern

Google has announced updates to its Gemini AI platform at Google I/O, introducing features that could transform how search and marketing professionals analyze data and interact with digital tools.

The new capabilities focus on enhanced reasoning, improved interface interactions, and more efficient workflows.

Gemini 2.5 Models Get Performance Upgrades

Google highlights that Gemini 2.5 Pro leads the WebDev Arena leaderboard with an Elo score of 1420. It ranks first in all categories on the LMArena leaderboard, which measures human preferences for AI models.

The model features a one-million-token context window for processing large content inputs, effectively supporting both long text analysis and video understanding.

Meanwhile, Gemini 2.5 Flash has been updated to enhance performance in reasoning, multimodality, code, and long context processing.

Google reports that it now uses 20-30% fewer tokens than previous versions. The updated Flash model is currently available in the Gemini app and will be generally available for production in Google AI Studio and Vertex AI in early June.

Gemini Live: New Camera and Screen Sharing Capabilities

The expanded Gemini Live feature is a significant addition to the Gemini ecosystem, now available on Android and iOS devices.

Google reports that Gemini Live conversations are, on average, five times longer than text-based interactions.

The updated version includes:

  • Camera and screen sharing capabilities, allowing users to point their phones at objects for real-time visual help.
  • Integration with Google Maps, Calendar, Tasks, and Keep (coming in the next few weeks).
  • The ability to create calendar events directly from conversations.

These features enable marketers to demonstrate products, troubleshoot issues, and plan campaigns through natural conversations with AI assistance.

Deep Think: Enhanced Reasoning for Complex Problems

The experimental “Deep Think” mode for Gemini 2.5 Pro uses research techniques that enable the model to consider multiple solutions before responding.

Google is making Deep Think available to trusted testers through the Gemini API to gather feedback prior to a wider release.

New Developer Tools for Marketing Applications

Enhancements to the developer experience include:

  • Thought Summaries: Both 2.5 Pro and Flash will now provide structured summaries of their reasoning process in the Gemini API and Vertex AI.
  • Thinking Budgets: This feature is expanding to 2.5 Pro, enabling developers to manage how many tokens the model spends on thinking before it responds, which affects both cost and performance (see the sketch after this list).
  • MCP Support: The introduction of native support for the Model Context Protocol in the Gemini API allows for integration with open-source tools.
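
For developers who want to try these controls, here is a minimal sketch using Google's google-genai Python SDK. The model name, the 1,024-token budget, and the `thinking_budget`/`include_thoughts` fields reflect the SDK's documented thinking options at the time of writing, but treat them as assumptions and verify against the current API reference.

```python
# Sketch: cap "thinking" tokens and request thought summaries via the
# google-genai SDK. Field names are assumptions; check the API reference.
from google import genai
from google.genai import types

client = genai.Client(api_key="YOUR_API_KEY")  # placeholder key

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="Summarize the main technical SEO issues in this crawl log: ...",
    config=types.GenerateContentConfig(
        thinking_config=types.ThinkingConfig(
            thinking_budget=1024,   # cap tokens spent reasoning before answering
            include_thoughts=True,  # return a structured summary of that reasoning
        )
    ),
)

# Thought-summary parts are flagged separately from the answer itself.
for part in response.candidates[0].content.parts:
    label = "THOUGHT SUMMARY" if getattr(part, "thought", False) else "ANSWER"
    print(f"[{label}] {part.text}")
```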

Google shared screenshots of thought summaries and thinking budgets in the Gemini interface (Image Credit: Google).

Gemini in Chrome & New Subscription Plans

Gemini is being integrated into Chrome, rolling out to Google AI subscribers in the U.S. This feature allows users to ask questions about content while browsing websites.

Google shared a screenshot of this capability in Chrome (Image Credit: Google).

Google also announced two subscription plans: Google AI Pro and Google AI Ultra.

The Ultra plan costs $249.99/month (with 50% off the first three months for new users) and provides access to Google’s advanced models with higher usage limits and early access to experimental AI features.

Looking Ahead

These updates to Gemini signify notable advancements in AI that marketers can integrate into their analytical workflows.

As these features roll out in the coming months, SEO and marketing teams can assess how these tools fit with their current strategies and technical requirements.

The incorporation of AI into Chrome and the upgraded conversational abilities indicate ongoing evolution in how consumers engage with digital content, a trend that search and marketing professionals must monitor closely.

Google Expands AI Features in Search: What You Need to Know via @sejournal, @MattGSouthern

At its annual I/O developer conference, Google announced upgrades to its AI-powered Search tools, making features like AI Mode and AI Overviews available to more people.

These updates, which Search Engine Journal received an advance look at during a preview event, show Google’s commitment to creating interactive search experiences.

Here’s what’s changing and what it means for digital marketers.

AI Overviews: Improved Accuracy, Global Reach

AI Overviews, launched last year, are now available in over 200 countries and more than 40 languages.

Google reports that this feature is transforming how people utilize Search, with a 10% increase in search activity for queries displaying AI Overviews in major markets like the U.S. and India.

At the news preview, Liz Reid, Google’s VP and Head of Search, addressed concerns regarding AI accuracy.

She acknowledged that there have been “edge cases” where AI Overviews provided incorrect or even harmful information. Reid explained that these issues were taken seriously, corrections were made, and continuous AI training has led to improved results over time.

Expect Google to continue enhancing how AI ensures accuracy and reliability.

AI Mode: Now Available to More Users

AI Mode is now rolling out to all users in the U.S. without the need to sign up for Search Labs.

Previously, only testers could try AI Mode. Now, anyone in the U.S. will see a new tab for AI Mode in Search and in the Google app search bar.

How AI Mode Works

AI Mode uses a “query fan-out” system that breaks big questions into smaller parts and runs many searches at once.

Users can also ask follow-up questions and get links to helpful sites within the search results.
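
As a rough mental model of that fan-out pattern (a toy sketch, not Google's implementation), the code below splits a broad question into narrower sub-queries, runs them concurrently, and merges the results. The `search_web()` function is a hypothetical stand-in for a retrieval backend, and the sub-queries are hard-coded where an LLM would normally generate them.

```python
# Toy illustration of a "query fan-out": split a broad question into
# sub-queries, run them concurrently, and merge the results.
# search_web() is a hypothetical placeholder, not a real API.
import asyncio

async def search_web(query: str) -> list[str]:
    await asyncio.sleep(0.1)          # simulate network latency
    return [f"result for: {query}"]   # placeholder results

def fan_out(question: str) -> list[str]:
    # In a real system an LLM would generate these; here they are hard-coded.
    return [f"{question} overview", f"{question} pricing", f"{question} reviews"]

async def answer(question: str) -> list[str]:
    sub_queries = fan_out(question)
    result_lists = await asyncio.gather(*(search_web(q) for q in sub_queries))
    return [result for results in result_lists for result in results]

print(asyncio.run(answer("best trail running shoes")))
```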

Google is using AI Mode and AI Overviews as testing grounds for new features, like the improved Gemini 2.5 AI model. User feedback will help shape what becomes part of the main Search experience.

New Tools: Deep Search, Live Visual Search, and AI-Powered Agents

Deep Search: Research Made Easy

Deep Search in AI Mode helps users dig deeper. It can run hundreds of searches at once and build expert-level, fully cited reports in minutes.


Live Visual Search With Project Astra

Google is updating how users can search visually. With Search Live, you can use your phone’s camera to talk with Search about what you see.

For example, point your camera at something, ask a question, and get quick answers and links. This feature can boost local searches, visual shopping, and on-the-go learning.


AI Agents: Getting Tasks Done for You

Google is adding agentic features, which are AI tools capable of managing multi-step tasks.

Initially, AI Mode will assist users in purchasing event tickets, reserving restaurant tables, and scheduling appointments. The AI evaluates hundreds of options and completes forms, but users always finalize the purchase.

Partners such as Ticketmaster, StubHub, Resy, and Vagaro are already onboard.


Smarter Shopping: Try On Clothes and Buy With Confidence

AI Mode is enhancing the shopping experience. The new tools use Gemini and Google’s Shopping Graph and include:

  • Personalized Visuals: Product panels show items based on your style and needs.
  • Virtual Try-On: Upload a photo to see how clothing looks on you, powered by Google’s fashion AI.
  • Agentic Checkout: Track prices, get sale alerts, and let Google’s AI buy for you via Google Pay when the price drops.
  • Custom Charts: For sports and finance, AI Mode can build charts and graphs using live data.

Personalization and Privacy Controls

Soon, AI Mode will offer more personalized results by using your past searches and, if you opt in, data from other Google apps like Gmail.

For example, if you’re planning a trip, AI Mode can suggest restaurants or events based on your bookings and interests. Google says you’ll always know when your personal info is used and can manage your privacy settings anytime.

Google’s View: Search Use Cases Are Growing

CEO Sundar Pichai addressed how AI is reshaping search during the preview event.

He described the current transformation as “far from a zero sum moment,” noting that the use cases for Search are “dramatically expanding.”

Pichai highlighted increasing user excitement and conveyed optimism, stating that “all of this will keep getting better” as AI capabilities mature.

Looking Ahead

Google’s latest announcements signal a continued push toward AI as the core of the search experience.

With AI Mode rolling out in the U.S. and global expansion of AI Overviews, marketers should proactively adapt their strategies to meet the evolving expectations of both users and Google’s algorithms.

Use IndexNow For AI Search And Shopping SEO via @sejournal, @martinibuster

Microsoft Bing published an announcement stating that the IndexNow search crawling technology is a powerful way for ecommerce companies to surface the latest and most accurate shopping-related information in AI Search and search engine shopping features.

Generative Search Requires Timely Shopping Information

Ecommerce sites typically depend on merchant feeds, search engine crawling, and Schema.org structured data updates to communicate what’s for sale: new products, retired products, price changes, availability, and other important details. Each of those methods can be a point of failure; slow crawling by search engines and inconsistent updating can delay the correct information from surfacing in AI search and shopping features.

IndexNow solves that problem. Content platforms like Wix, Duda, Shopify, and WooCommerce support IndexNow, a Microsoft technology that enables fast indexing of new or updated content. Pairing IndexNow with Schema.org helps ensure that the correct information surfaces quickly in AI Search and shopping features.

IndexNow recommends the following Schema.org Product Type properties:

  • “title (name in JSON-LD)
  • description
  • price (list/retail price)
  • link (product landing page URL)
  • image link (image in JSON-LD)
  • shipping (especially important for Germany and Austria)
  • id (a unique identifier for the product)
  • brand
  • gtin
  • mpn
  • datePublished
  • dateModified
  • Optional fields to further enhance context and classification:
  • category (helps group products for search and shopping platforms)
  • seller (recommended for marketplaces or resellers)
  • itemCondition (e.g., NewCondition, UsedCondition)”
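
To make the pairing concrete, here is a minimal sketch: a Product JSON-LD object using several of the recommended properties, followed by an IndexNow ping for the updated product URL. The domain, key, and product values are hypothetical placeholders; the request shape follows the published IndexNow protocol (a JSON POST containing host, key, and urlList).

```python
# Sketch: publish Product structured data, then ping IndexNow so the
# updated page is recrawled quickly. Domain, key, and product values
# are hypothetical placeholders.
import json
import requests

product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Shoe",
    "description": "Lightweight trail running shoe.",
    "image": "https://www.example-shop.com/img/trail-shoe.jpg",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "gtin": "00012345678905",
    "mpn": "TS-100",
    "offers": {
        "@type": "Offer",
        "url": "https://www.example-shop.com/products/trail-shoe",
        "price": "89.99",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
    "dateModified": "2025-05-21",
}
# This object would be embedded in the product page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(product_jsonld, indent=2))

# IndexNow ping: a JSON POST listing the changed URLs.
payload = {
    "host": "www.example-shop.com",
    "key": "your-indexnow-key",  # the key file must also be hosted on the site
    "urlList": ["https://www.example-shop.com/products/trail-shoe"],
}
resp = requests.post("https://api.indexnow.org/indexnow", json=payload, timeout=10)
print(resp.status_code)  # 200 or 202 means the submission was accepted
```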

Read more at Microsoft Bing’s Blog:

IndexNow Enables Faster and More Reliable Updates for Shopping and Ads

Featured Image by Shutterstock/Paper piper

Does Google’s AI Overviews Violate Its Own Spam Policies? via @sejournal, @martinibuster

Search marketers assert that Google’s new long-form AI Overviews answers have become the very thing Google’s documentation advises publishers against: scraped content lacking originality or added value, at the expense of content creators who are seeing declining traffic.

Why put the effort into writing great content if it’s going to be rewritten into a complete answer that removes the incentive to click the cited source?

Rewriting Content And Plagiarism

Google previously showed Featured Snippets, which were excerpts from published content that users could click on to read the rest of the article. Google’s AI Overviews (AIO) expands on that by presenting entire article-length answers to a user’s question, sometimes anticipating follow-up questions and answering those, too.

And it’s not an AI providing answers. It’s an AI repurposing published content. When a student does the same thing, repurposing an existing essay without adding unique insight or analysis, it’s called plagiarism.

The thing about AI is that it is incapable of unique insight or analysis, so there is zero value-add in Google’s AIO, which in an academic setting would be called plagiarism.

Example Of Rewritten Content

Lily Ray recently published an article on LinkedIn drawing attention to a spam problem in Google’s AIO. Her article explains how SEOs discovered how to inject answers into AIO, taking advantage of the lack of fact checking.

Lily subsequently checked on Google, presumably to see if her article was ranking, and discovered that Google had rewritten her entire article and was providing an answer that was almost as long as her original.

She tweeted:

“It re-wrote everything I wrote in a post that’s basically as long as my original post “

Did Google Rewrite Entire Article?

One technique that search engines and LLMs may use to analyze content is to determine what questions the content answers. The content can then be annotated according to the answers it provides, making it easier to match a query to a web page.

I used ChatGPT to analyze both Lily’s content and AIO’s answer. The number of questions answered by the two documents was nearly identical: Lily’s article answered thirteen questions, while AIO answered twelve.
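
A rough sketch of that comparison approach (my own illustration, not the exact prompts used): ask an LLM to list the questions each document answers, then measure the overlap. The `ask_llm()` function below is a hypothetical placeholder for whichever model API you prefer.

```python
# Sketch: compare two documents by the questions they answer.
# ask_llm() is a hypothetical placeholder; plug in a real LLM client.
def ask_llm(prompt: str) -> list[str]:
    raise NotImplementedError("Replace with a call to your preferred LLM API.")

def questions_answered(document_text: str) -> set[str]:
    prompt = (
        "List, one per line, the distinct questions this document answers:\n\n"
        + document_text
    )
    return {line.strip().lower() for line in ask_llm(prompt) if line.strip()}

def shared_questions(doc_a: str, doc_b: str) -> set[str]:
    # Exact-match overlap only; the two documents phrase similar questions
    # differently, so a real comparison also needs fuzzy or semantic matching.
    return questions_answered(doc_a) & questions_answered(doc_b)
```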

Both articles answered five similar questions:

  • Spam Problem In AI Overviews
    AIO: Is there a spam problem affecting Google AI Overviews?
    Lily Ray: What types of problems have been observed in Google’s AI Overviews?
  • Manipulation And Exploitation of AI Overviews
    AIO: How are spammers manipulating AI Overviews to promote low-quality content?
    Lily Ray: What new forms of SEO spam have emerged in response to AI Overviews?
  • Accuracy And Hallucination Concerns
    AIO: Can AI Overviews generate inaccurate or contradictory information?
    Lily Ray: Does Google currently fact-check or validate the sources used in AI Overviews?
  • Concern About AIO In The SEO Community
    AIO: What concerns do SEO professionals have about the impact of AI Overviews?
    Lily Ray: Why is the ability to manipulate AI Overviews so concerning?
  • Deviation From Principles of E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness)
    AIO: What kind of content is Google prioritizing in response to these issues?
    Lily Ray: How does the quality of information in AI Overviews compare to Google’s traditional emphasis on E-E-A-T and trustworthy content?

Plagiarizing More Than One Document

Google’s AIO system is designed to answer follow-up and related questions, “synthesizing” answers from more than one original source and that’s the case with this specific answer.

Whereas Lily’s content argues that Google isn’t doing enough, AIO rewrote the content from another document to say that Google is taking action to prevent spam. Google’s AIO differs from Lily’s original by answering five additional questions with answers that are derived from another web page.

This gives the appearance that Google’s AIO answer for this specific query is “synthesizing” or “plagiarizing” from two documents to answer Lily Ray’s search query, “spam in ai overview google.”

Takeaways

  • Google’s AI Overviews repurposes web content to create long-form content that lacks originality or added value.
  • Google’s AIO answers mirror the content they summarize, copying its structure and ideas and answering essentially the same questions.
  • Google’s AIO arguably deviates from Google’s own quality standards, using rewritten content in a manner that mirrors Google’s own definitions of spam.
  • Google’s AIO features apparent plagiarism of multiple sources.

The quality and trustworthiness of AIO responses may not reach the levels set by Google’s principles of Experience, Expertise, Authoritativeness, and Trustworthiness, because AI lacks experience and there is apparently no mechanism for fact-checking.

The fact that Google’s AIO system provides essay-length answers arguably removes any incentive for users to click through to the original source and may help explain why many in the search and publisher communities are seeing less traffic. The perception of AIO traffic is so bad that one search marketer quipped on X that ranking #1 on Google is the new place to hide a body, because nobody would ever find it there.

Google could be said to plagiarize content because AIO answers are rewrites of published articles that lack unique analysis or added value, placing AIO firmly within most people’s definition of a scraper spammer.

Featured Image by Shutterstock/Luis Molinero

WordPress Scraper Plugin Compromised By Security Vulnerability via @sejournal, @martinibuster

A WordPress plugin that automatically posts content scraped from other websites has been discovered to contain a critical vulnerability that allows anyone to upload malicious files to affected websites. The severity of the vulnerability is rated at 9.8 on a scale of 1-10.

Crawlomatic Multisite Scraper Post Generator Plugin for WordPress

The Crawlomatic WordPress plugin is sold via the Envato CodeCanyon store for $59 per license. It enables users to crawl forums, weather statistics, articles from RSS feeds, and directly scrape the content from other websites and then automatically publish the content on the user’s website.

The plugin’s Envato CodeCanyon web page features a banner that notes that the author of the plugin has been recognized for having met “WordPress quality standards” and displays a badge indicating that it is “Envato WP Requirements Compliant,” an indication that it meets Envato’s “security, quality, performance and coding standards in WordPress plugins and themes.”

The plugin’s directory page explains that it can crawl and scrape virtually any website, including JavaScript-based sites, promising that it can turn a user’s website into a “money making machine.”

Unauthenticated Arbitrary File Upload

The Crawlomatic WordPress plugin is missing a file type validation check in all versions up to and including 2.6.8.1.

According to a warning posted on Wordfence:

“The Crawlomatic Multipage Scraper Post Generator plugin for WordPress is vulnerable to arbitrary file uploads due to missing file type validation in the crawlomatic_generate_featured_image() function in all versions up to, and including, 2.6.8.1. This makes it possible for unauthenticated attackers to upload arbitrary files on the affected site’s server which may make remote code execution possible.”

Wordfence recommends that users of the plugin update to version 2.6.8.2 or later.

Read more at Wordfence:

Crawlomatic Multipage Scraper Post Generator <= 2.6.8.1 – Unauthenticated Arbitrary File Upload

Featured Image by Shutterstock/nakaridore

Googler’s Deposition Offers View Of Google’s Ranking Systems via @sejournal, @martinibuster

A Google engineer’s redacted testimony, published online by the U.S. Justice Department, offers a look inside Google’s ranking systems, giving an idea of how Google’s quality scores work and introducing a mysterious popularity signal that uses Chrome data.

The document offers a high level and very general view of ranking signals, providing a sense of what the algorithms do but not the specifics.

Hand-Crafted Signals

For example, it begins with a section about the “hand crafting” of signals, which describes the general process of taking data from quality raters, clicks, and so on, then applying mathematical and statistical formulas to generate a ranking score from three kinds of signals. Hand-crafted here means algorithms tuned at scale by search engineers; it doesn’t mean that Google manually ranks websites.

Google’s ABC Signals

The DOJ document lists three kinds of signals that are referred to as ABC Signals and correspond to the following:

  • A – Anchors (pages linking to the target pages),
  • B – Body (search query terms in the document),
  • C – Clicks (user dwell time before returning to the SERP)

The statement about the ABC signals is a generalization of one part of the ranking process. Ranking search results is far more complex, involving hundreds if not thousands of additional algorithms at every step: indexing, link analysis, anti-spam processes, personalization, re-ranking, and more. For example, Liz Reid has discussed Core Topicality Systems as part of the ranking algorithm, and Martin Splitt has discussed annotations as part of understanding web pages.

This is what the document says about the ABC signals:

“ABC signals are the key components of topicality (or a base score), which is Google’s determination of how the document is relevant to the query.

T* (Topicality) effectively combines (at least) these three signals in a relatively hand-crafted way [that] Google uses to judge the relevance of the document based on the query terms.”
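
To illustrate the spirit of a hand-crafted combination (a toy example of my own, not Google's actual formula), a base topicality score could be a weighted blend of anchor, body, and click components whose weights engineers can inspect and tune:

```python
# Toy illustration of hand-crafting a base "topicality" score from
# A (anchors), B (body), and C (clicks) components. The weights and the
# formula are invented for illustration; Google's systems are far more complex.
def topicality(anchor_score: float, body_score: float, click_score: float,
               w_a: float = 0.3, w_b: float = 0.5, w_c: float = 0.2) -> float:
    # Each component is assumed to be pre-normalized to the range [0, 1].
    return round(w_a * anchor_score + w_b * body_score + w_c * click_score, 3)

# Because the weights are explicit, an engineer can see exactly which
# component moved a score -- the kind of transparency the deposition describes.
print(topicality(anchor_score=0.8, body_score=0.6, click_score=0.4))  # 0.62
```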

The document offers an idea of the complexity of ranking web pages:

“Ranking development (especially topicality) involves solving many complex mathematical problems. For topicality, there might be a team of engineers working continuously on these hard problems within a given project.

The reason why the vast majority of signals are hand-crafted is that if anything breaks Google knows what to fix. Google wants their signals to be fully transparent so they can trouble-shoot them and improve upon them.”

The document compares their hand-crafted approach to Microsoft’s automated approach, saying that when something breaks at Bing it’s far more difficult to troubleshoot than it is with Google’s approach.

Interplay Between Page Quality And Relevance

An interesting point revealed by the search engineer is that page quality is largely independent of the query. If a page is determined to be high quality and trustworthy, it’s regarded as trustworthy across all related queries; that is what is meant by the word static: the score is not dynamically recalculated for each query. However, relevance-related signals derived from the query are still used to calculate the final rankings, which shows how relevance plays a decisive role in determining what gets ranked.

This is what they said:

“Quality
Generally static across multiple queries and not connected to a specific query.

However, in some cases Quality signal incorporates information from the query in addition to the static signal. For example, a site may have high quality but general information so a query interpreted as seeking very narrow/technical information may be used to direct to a quality site that is more technical.

Q* (page quality (i.e., the notion of trustworthiness)) is incredibly important. If competitors see the logs, then they have a notion of “authority” for a given site.

Quality score is hugely important even today. Page quality is something people complain about the most…”

AI Gives Cause For Complaints Against Google

The engineer states that people still complain about quality and adds that AI is making the situation worse.

He says about page quality:

“Nowadays, people still complain about the quality and AI makes it worse.

This was and continues to be a lot of work but could be easily reverse engineered because Q is largely static and largely related to the site rather than the query.”

eDeepRank – A Way To Understand LLM Rankings

The Googler lists other ranking signals, including one called eDeepRank, an LLM-based system that uses BERT, a language model.

He explains:

“eDeepRank is an LLM system that uses BERT, transformers. Essentially, eDeepRank tries to take LLM-based signals and decompose them into components to make them more transparent. “

That part about decomposing LLM signals into components seems to be a reference to making LLM-based ranking signals more transparent so that search engineers can understand why the LLM is ranking something.

PageRank Linked To Distance Ranking Algorithms

PageRank is Google’s original ranking innovation, and it has since been updated. I wrote about this kind of algorithm six years ago. Link distance algorithms calculate the distance from authoritative websites for a given topic (called seed sites) to other websites in the same topic. These algorithms start with a seed set of authoritative sites in a given topic; sites that are further away from their respective seed sites are determined to be less trustworthy, while sites closer to the seed set are likelier to be more authoritative and trustworthy.
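
A rough sketch of the idea (my own simplification, not Google's code): start from a seed set of trusted sites, run a breadth-first search over the link graph, and treat link distance from the nearest seed as an inverse trust signal. The graph and the distance-to-trust mapping below are invented for illustration.

```python
# Toy link-distance sketch: shortest hop count from a seed set of trusted
# sites, used as an inverse trust signal. Graph and scoring are invented.
from collections import deque

link_graph = {                          # site -> sites it links to
    "seed-authority.com": ["nicheblog.com", "vendor.com"],
    "nicheblog.com": ["forum.com"],
    "vendor.com": ["forum.com", "spamfarm.biz"],
    "forum.com": [],
    "spamfarm.biz": [],
}
seeds = {"seed-authority.com"}

def link_distances(graph, seed_sites):
    distances = {site: 0 for site in seed_sites}
    queue = deque(seed_sites)
    while queue:
        site = queue.popleft()
        for target in graph.get(site, []):
            if target not in distances:         # first visit = shortest path
                distances[target] = distances[site] + 1
                queue.append(target)
    return distances

for site, hops in sorted(link_distances(link_graph, seeds).items(), key=lambda x: x[1]):
    trust = 1 / (1 + hops)                      # closer to a seed -> higher trust
    print(f"{site}: distance={hops}, trust={trust:.2f}")
```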

This is what the Googler said about PageRank:

“PageRank. This is a single signal relating to distance from a known good source, and it is used as an input to the Quality score.”

Read about this kind of link ranking algorithm: Link Distance Ranking Algorithms

Cryptic Chrome-Based Popularity Signal

There is another signal whose name is redacted that’s related to popularity.

Here’s the cryptic description:

“[redacted] (popularity) signal that uses Chrome data.”

A plausible claim can be made that this confirms that the Chrome API leak is about actual ranking factors. However, many SEOs, myself included, believe that those APIs are developer-facing tools used by Chrome to show performance metrics like Core Web Vitals within the Chrome Dev Tools interface.

I suspect that this is a reference to a popularity signal that we might not know about.

The Google engineer does refer to another leak of documents that references actual “components of Google’s ranking system,” but notes that they don’t contain enough information to reverse engineer the algorithm.

They explain:

“There was a leak of Google documents which named certain components of Google’s ranking system, but the documents don’t go into specifics of the curves and thresholds.

For example
The documents alone do not give you enough details to figure it out, but the data likely does.”

Takeaway

The newly released document summarizes a U.S. Justice Department deposition of a Google engineer that offers a general outline of parts of Google’s search ranking systems. It discusses hand-crafted signal design, the role of static page quality scores, and a mysterious popularity signal derived from Chrome data.

It provides a rare look into how signals like topicality, trustworthiness, click behavior, and LLM-based transparency are engineered and offers a different perspective on how Google ranks websites.

Featured Image by Shutterstock/fran_kie

HTTP Status Codes Google Cares About (And Those It Ignores) via @sejournal, @MattGSouthern

Google’s Search Relations team recently shared insights about how the search engine handles HTTP status codes during a “Search Off the Record” podcast.

Gary Illyes and Martin Splitt from Google discussed several status code categories commonly misunderstood by SEO professionals.

How Google Views Certain HTTP Status Codes

While the podcast didn’t cover every HTTP status code (obviously, 200 OK remains fundamental), it focused on categories that often cause confusion among SEO practitioners.

Splitt emphasized during the discussion:

“These status codes are actually important for site owners and SEOs because they tell a story about what happened when a particular request came in.”

The podcast revealed several notable points about how Google processes specific status code categories.

The 1xx Codes: Completely Ignored

Google’s crawlers ignore all status codes in the 1xx range, including newer features like “early hints” (HTTP 103).

Illyes explained:

“We are just going to pass through [1xx status codes] anyway without even noticing that something was in the 100 range. We just notice the next non-100 status code instead.”

This means implementing early hints might help user experience, but won’t directly benefit your SEO.

Redirects: Simpler Than Many SEOs Believe

While SEO professionals often debate which redirect type to use (301, 302, 307, 308), Google’s approach focuses mainly on whether redirects are permanent or temporary.

Illyes stated:

“For Google search specifically, it’s just like ‘yeah, it was a redirection.’ We kind of care about in canonicalization whether something was temporary or permanent, but otherwise we just [see] it was a redirection.”

This doesn’t mean redirect implementation is unimportant, but it suggests the permanent vs. temporary distinction is more critical than the specific code number.
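
One practical way to audit that distinction is to follow a redirect chain and classify each hop as permanent (301, 308) or temporary (302, 303, 307). A minimal sketch using Python's requests library; the URL is a placeholder:

```python
# Sketch: classify each hop in a redirect chain as permanent or temporary.
# The URL is a placeholder; requests records intermediate hops in .history.
import requests

PERMANENT = {301, 308}
TEMPORARY = {302, 303, 307}

response = requests.get("https://www.example.com/old-page",
                        allow_redirects=True, timeout=10)

for hop in response.history:
    kind = ("permanent" if hop.status_code in PERMANENT
            else "temporary" if hop.status_code in TEMPORARY
            else "other")
    print(f"{hop.status_code} ({kind}): {hop.url} -> {hop.headers.get('Location')}")

print(f"Final: {response.status_code} {response.url}")
```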

Client Error Codes: Standard Processing

The 4xx range of status codes functions largely as expected.

Google appropriately processes standard codes like 404 (not found) and 410 (gone), which remain essential for proper crawl management.

The team humorously mentioned status code 418 (“I’m a teapot”), an April Fools’ joke in the standards, which has no SEO impact.

Network Errors in Search Console: Looking Deeper

Many mysterious network errors in Search Console originate from deeper technical layers below HTTP.

Illyes explained:

“Every now and then you would get these weird messages in Search Console that like there was something with the network… and that can actually happen in these layers that we are talking about.”

When you see network-related crawl errors, you may need to investigate lower-level protocols like TCP, UDP, or DNS.
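
A quick way to rule out the layers below HTTP is to test DNS resolution and raw TCP connectivity separately from the HTTP request itself. A minimal sketch using Python's standard library; the hostname is a placeholder:

```python
# Sketch: check DNS and TCP separately from HTTP, useful when Search Console
# reports vague "network" errors. The hostname is a placeholder.
import socket

host, port = "www.example.com", 443

try:
    addresses = {info[4][0] for info in socket.getaddrinfo(host, port)}
    print(f"DNS OK: {host} -> {sorted(addresses)}")
except socket.gaierror as err:
    raise SystemExit(f"DNS failure (the problem is below HTTP): {err}")

try:
    with socket.create_connection((host, port), timeout=5):
        print(f"TCP OK: connected to {host}:{port}")
except OSError as err:
    print(f"TCP failure (firewall, routing, or server issue): {err}")
```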

What Wasn’t Discussed But Still Matters

The podcast didn’t cover many status codes that definitely matter to Google, including:

  • 200 OK (the standard successful response)
  • 500-level server errors (which can affect crawling and indexing)
  • 429 Too Many Requests (rate limiting)
  • Various other specialized codes

Practical Takeaways

While this wasn’t a comprehensive guide to HTTP status codes, the discussion revealed several practical insights:

  • For redirects, focus primarily on the permanent vs. temporary distinction
  • Don’t invest resources in optimizing 1xx responses specifically for Google
  • When troubleshooting network errors, look beyond HTTP to deeper protocol layers
  • Continue to implement standard status codes correctly, including those not specifically discussed

As web technology evolves with HTTP/3 and QUIC, understanding how Google processes these signals can help you build more effective technical SEO strategies without overcomplicating implementation.


Featured Image: Roman Samborskyi/Shutterstock

How Referral Traffic Undermines Long-Term Brand Growth via @sejournal, @martinibuster

Mordy Oberstein, a search marketing professional whom I hold in high esteem, recently shared the provocative idea that referral traffic is not a brand’s friend and that every brand, as it matures, should wean itself from it. Referrals from other websites are generally considered a sign of a high-performing business, but they are not a long-term strategy because they depend on sources that cannot be controlled.

Referral Traffic Is Necessary But…

Mordy Oberstein (LinkedIn profile), formerly of Wix, asserted in a Facebook post that relying on a traffic source, whether that’s another website or a search engine, offers a degree of vulnerability to maintaining steady traffic and performance.

He broke it down as a two-fold weakness:

  • Relying on the other site to keep featuring your brand.
  • Relying on Google to keep ranking that other site, which in turn sends visitors to your brand.

The flow of traffic can stop at either of those two points, which is a hidden weakness that can affect the long-term sustainability of healthy traffic and sales.

Mordy explained:

“It’s a double vulnerability…

1) Relying on being featured by the website (the traffic source)
2) Relying on Google to give that website …traffic (the channel)

There are two levels of exposure & vulnerability.

As your brand matures, you want to own your own narrative.

More referral traffic is not your friend. It’s why, as a brand matures, it should wean off of it.

Full disclosure, this is my opinion. I am sure a lot of people will disagree.”

Becoming A Destination

I’ve always favored promoting a site in a way that helps it become synonymous with a given topic because that’s how to make it a default destination and encourage the kinds of signals that Google interprets as authoritative. I’ve done things like creating logo hats to give away, running annual product giveaways, and other promotional activities, both online and offline. While my competition was doing SEO busy work, I created fans. Promoting a site is basically just getting it in front of people, both online and offline.

Brand Authority Is An Excuse, Not A Goal

Some SEOs believe in a concept called Brand Authority, which is a misleading explanation for why a website ranks. The term Brand Authority is not about Branding, and it’s not about Authoritativeness, either. It’s just an excuse for why a site is top-ranked.

The phrase Brand Authority has its roots in PageRank. Big brand websites used to have a PageRank of 9 out of 10, or even a 10/10, which enabled them to rank for virtually any keyword they wanted. A link from one of those sites practically guaranteed a top ten ranking. But Google ended the outsized influence of PageRank because it resulted in less relevant results. That was around 2004-ish, about the time Google started using Navboost, a ranking signal that essentially measures how people feel about a site, which is what PageRank does, too.

This insight, that Google uses signals about how people feel about a site, is important because the feelings people have for a business are what being a brand is all about.

Marty Neumeier, a thought leader on how to promote companies (author of The Brand Gap), explained what being a brand is all about:

“Instead of creating the brand first, the company creates customers (through products and social media), the customers build the brand (through purchases and advocacy), and the customer-built brand sustains the company (through “tribal” loyalty). This model takes into account a profound and counterintuitive truth: a brand is not owned by the company, but by the customers who draw meaning from it. Your brand isn’t what you say it is. It’s what they say it is.”

Neumeier also explains how brand is about customer feelings:

“The best brands are vivid. They create clear mental pictures and powerful feelings in the minds and hearts of customers. They’re brought to life through their touchpoints, the places where customers experience them, from the first exposure to a brand’s name, to buying the product, to eventually making it part of who they are.”

That “tribal loyalty” is the kind of thing Google tries to measure. So when Danny Sullivan talks about differentiating your site to make it like a brand, he is not referring to so-called “brand authority.” He is talking about doing the kinds of things that influence people to feel positive about a site.

Getting Back To Mordy Oberstein

It seems to me that what he’s saying is that referral traffic is a stepping stone toward becoming a destination, a means to an end. It’s not the goal; it’s a step toward the goal of becoming a destination.

On the other side of that process, I think it’s important to maintain relevance with potential site visitors and customers, especially today with the rapid pace of innovation, generational change, new inventions, and new product models. Relevance to people has been a Google ranking signal for a long time, beginning with PageRank, then with additional signals like Navboost.

The factor the SEO industry has largely missed is getting people to think positive thoughts about your site and your business, enough to share with other people.

Mordy’s insight about traffic is beautiful and elegant.

Read Mordy’s entire post on Facebook.

Featured Image by Shutterstock/Yunus Praditya

Google Clarifies: AI Overview Links Share Single Position In Search Console via @sejournal, @MattGSouthern

Google’s John Mueller has clarified that all links within AI Overviews (AIOs) share a single position in Google Search Console.

SEO consultant Gianluca Fiorelli asked Mueller how Search Console tracks position data for URLs in Google’s AI-generated answer boxes.

Mueller referenced Google’s official help docs, explaining:

“Basically an AIO counts as a block, so it’s all one position. It can be first position, if the block is shown first, but I don’t know if AIO is always shown first.”

This indicates that every website linked in an AI Overview receives the same position value in Search Console reports.

This occurs regardless of where the link appears in the overview panel, whether immediately visible or hidden until a user expands the box.

What Google’s Documentation Says

Google’s Search Console Help docs explain how AI Overview metrics work:

  • Position: “An AI Overview occupies a single position in search results, and all links in the AI Overview are assigned that same position.”
  • Clicks: “Clicking a link to an external page in the AI Overview counts as a click.”
  • Impressions: “Standard impression rules apply. To be counted as an impression, the link must be scrolled or expanded into view.”

The docs also note:

“Search Console doesn’t include data from experiments in Search Labs, as these experiments are still in active development.”

The Missing Data Behind Google’s Click Claims

This discussion highlights an ongoing debate in the SEO community regarding the performance of links in AI Overviews.

Lily Ray, Vice President of SEO Strategy & Research at Amsive, recently pointed out Google’s year-old claim that websites receive more clicks when featured in AI Overviews, stating:

“I would love to see a single GSC report that confirms this statement, because every study so far has shown the opposite.”

Ray’s statement reflects the concerns of many SEO professionals, as Google has not provided data to support its claims.

Looking Ahead

While we now understand how position metrics are recorded, the question remains: Do AI Overview placements drive more or less traffic than traditional search listings?

Google claims one thing, but many people report different experiences.

Since all AIO links share the same position, it’s difficult to determine which specific placements perform better.

This debate highlights the need for more precise data about how AIOs affect website traffic compared to regular search results.


Featured Image: Roman Samborskyi/Shutterstock