Google Isn’t Going Anywhere: Ahrefs Ambassador On LLM Inclusion & Why Relationships Still Win via @sejournal, @theshelleywalsh

There’s a dividing line in the industry between those who think optimizing for AI is separate from SEO and those who think LLM discovery is just SEO. But this is an unproductive argument because, whichever side you take, LLM inclusion is now part of SEO discovery.

So, let’s just focus on how the search journey works now and where you can find real business value.

To discuss LLM inclusion, I invited Patrick Stox onto the latest edition of IMHO. As a product advisor, technical SEO, and brand ambassador at Ahrefs, Patrick has plenty of data to work with and insight into what’s actually working for LLM inclusion right now.

In the face of the AI takeover, Patrick’s take is that Google isn’t going anywhere, and he still thinks human relationships are critical.

You can watch the full interview with Patrick on IMHO below.

Google Isn’t Going Anywhere

With the industry obsessing over ChatGPT, AI Overviews, and AI Mode, it’s easy to assume that traditional search really is dead. However, Patrick was quick to say, “I’m not betting against Google.”

“Google is still everything for most people … Most of the people that are using [LLMs] are tech forward, but the majority of folks are still just Googling things.”

Recent Ahrefs data estimated that Google drives about 40% of all traffic to websites, with LLM referrals still a fraction by comparison. Although Google’s share of traffic may be down a couple of percentage points this year, it still dominates.

After experimenting with ChatGPT and Claude when they first launched, Patrick found himself returning to Google’s AI Mode and Gemini, and thinks others will do the same. “Even I just went back to Google,” he admitted. “I think we’re going to see more of that as they improve their systems.”

Google continues releasing competitive AI innovations, and Patrick predicts these will pull many users back into Google’s ecosystem.

“I’m not betting against Google,” he says. “They’ve got more data than anyone, and they’re still on the bleeding edge.”

The Attribution Problem: LLMs Might Drive Conversions, But We Can’t Prove It

Even though sites are seeing growing referrals from LLMs, attributing real business value to that traffic is a challenge right now. We can talk about brand awareness, but the C-suite is only interested in business value.

Patrick agreed that while you can count mentions and citations in AI answers, that doesn’t easily translate into board-level reporting.

“You can measure how often you’re mentioned versus competitors … but going back to a business, I can’t report on that stuff. It’s all secondary, tertiary metrics.”

For Patrick, revenue and revenue-adjacent metrics still matter. That said, Ahrefs has had some signals from AI search traffic.

“We did track the signups. When I first looked at this data back in July, all the traffic from AI search was half a percent of our total traffic. But at the time, it was 12.1% of our total conversions,” he explained.

This has now dropped below 10%, while the traffic share has grown slightly.
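Taken at face value, those July figures imply a striking conversion-rate multiplier. Here is a quick back-of-the-envelope sketch, using only the two numbers quoted above:

```python
# Figures quoted above (Ahrefs, July snapshot).
ai_traffic_share = 0.005      # AI search: 0.5% of total traffic
ai_conversion_share = 0.121   # AI search: 12.1% of total conversions

# If a channel supplies 0.5% of visits but 12.1% of conversions,
# its visitors convert at roughly this multiple of the site average:
multiplier = ai_conversion_share / ai_traffic_share
print(f"AI-search visitors convert at roughly {multiplier:.0f}x the site average")
```

That works out to roughly 24x the site-average conversion rate, which is why even a small traffic share from AI search can matter to the bottom line.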

Two Strategies That Are Working For LLM Inclusion

I asked if Ahrefs is actively investing in LLM inclusion, and Patrick said they are trying a number of different things. In his view, the two fundamental approaches that determine LLM visibility are repetition and differentiation.

“Whatever the internet says, that’s kind of what’s being returned in these systems.”

Repetition means ensuring consistent messaging across multiple websites. LLMs synthesize what “the internet says,” so if you want to be recognized for something, that narrative needs to exist broadly. For Ahrefs, this has meant actively spreading the message that they have evolved beyond just SEO tools into a comprehensive digital marketing platform.

Differentiation through original data works alongside the repetition to stand out. Ahrefs has invested heavily in unique data studies throughout the year, including non-English language research. “This data is being heavily cited, heavily returned in these systems because there’s nothing else out there like it,” Patrick explained.

A more surprising tactic that is also working right now is listicles.

“I hate to say it, but listicles … they work right now. I don’t think it’s future-proof at all, but at the same time, I don’t want to just not be there.”

Agentic AI And The Threat Of Closed Systems

I then asked about agentic AI and whether Patrick has concerns about these systems becoming closed.

As LLM agents begin booking travel, making purchases, or accessing APIs directly, they will most likely rely on a small set of big-brand partners.

“ChatGPT isn’t going to make deals with unknown companies,” Stox says. “If they book flights, they’ll use major providers. If they use a dictionary, they’ll pick one dictionary.”

This would be the real threat to smaller businesses. “If an agent decides ‘we only check out through Amazon,’ a lot of stores lose sales overnight,” Patrick warns. There is no guaranteed defense; the only strategy available right now is to grow your brand and footprint.

“What was the thing they used to say for Google? Make them embarrassed to not have you included.”

Beyond LLM Optimization: Channels That Still Matter

Patrick emphasized a point that’s possibly been forgotten in the AI hype: “It’s not ChatGPT that’s the second largest search engine, it’s still YouTube by far.”

YouTube has been a hugely successful referral platform for Ahrefs, and the company has invested heavily in video. Patrick recommends both long-form and short-form video for brand discovery.

Community participation on platforms such as Reddit, Slack, and Discord also offers substantial value, but only when companies genuinely participate rather than spam.

While many brands have tried to brute-force Reddit with spam, Patrick says there can be huge value in genuine participation, especially when employees are allowed to represent the company authentically.

“You have literally a paid workforce of advocates who work for your company. Let them go out and talk to people … answer questions, basically advertise for you. They want to do it already. So let them.”

If You Started A Product Today, Where Would You Bet?

As a final question, I asked Patrick where he’d invest if launching a startup today; he did not hesitate to say relationships.

“If I launched a startup, the first thing I’d invest in is relationships. That’s still the most powerful channel … I think if I did do something like that, I’d probably grow it pretty fast. More from my connections than anything else,” he said.

After relationships, he’d focus on YouTube, website content creation, and telling friends about the product. In other words, “just normal marketing.”

“We’ve gone through this tech revolution, and now we’re realizing everything still comes back to direct connections with people.”

And that may be the most important insight of all. In an era of AI-driven discovery, the brands that win are the ones that remain unmistakably human.


Thank you to Patrick Stox for offering his insights and being my guest on IMHO.



Featured Image: Shelley Walsh/Search Engine Journal

How Google’s Web Guide Helps SEO

Google’s Web Guide is an experiment launched in July 2025 that uses AI to organize a user’s search results. To try the feature, enable it in Google Labs.

Unlike a fan-out (which guesses what additional information would be helpful to a searcher), Web Guide analyzes the content of top-ranking pages and groups them by topic.

AI then summarizes each category, providing an overview of the pages.

Perhaps unintentionally, Web Guide is handy for search engine optimization because it reveals how Google understands keywords.

Targeted queries

Organic search results order web pages by ranking signals. Yet searchers cannot easily discern the pages’ content type or topics without visiting each one. Web Guide provides a summary, thus implying how Google interprets a query.

For example, Web Guide groups the search results for “how to build a website” by the following topics:

  • “Comprehensive guides to building a website”
  • “Building websites with no-code builders”
  • “Creating websites with Google Sites”
  • “Website building with Squarespace”
  • “Building websites with Wix”
  • “Building websites with Canva”
  • “Website development with HTML, CSS, and JavaScript”
  • “Learning web development: courses & tutorials”
  • “Choosing website builders”
  • “Community advice on website building (Reddit threads and forums)”
  • “Understanding domain names and hosting”
  • “Web design principles and best practices”

Creators looking to search-optimize an article or course on website building can use the list for topics to include.

Web Guide can also identify competitors. For example, searching “waterproof sneakers” in Web Guide generates a section of the best-known brands:


Web Guide can identify competitors, as shown in this example for “waterproof sneakers.”

It also reveals alternative keywords to target, such as “water resistant” and “water shoes”:

Water-Resistant and Water Shoes

Some sneakers offer water resistance or are designed as full water shoes, with specific technologies like HDry® membrane providing complete waterproofing and breathability, while others prioritize quick drying.

Brand search

Searching for a brand name in Web Guide provides insight into what Google knows about the company and the URLs that impact its understanding. For example, searching “home chef” in Web Guide generates a separate section for the prices of that service. AI summarizes each ranking page.

Web Guide results also help brands ensure off-site consistency and identify which user-generated content to monitor. For example, brands that change pricing can use Web Guide to find a list of URLs to update.


Searching for “home chef” in Web Guide returns a section on pages that address the service’s prices.

Competitors

Queries in Web Guide reveal its preference among competitors. Take “Home Chef” and “Green Chef,” for example. Searching “home chef vs green chef” reveals Web Guide’s AI prefers the latter:

Green Chef typically comes out ahead due to its organic ingredients, health-conscious dietary plans, and sustainability efforts, whereas Home Chef offers greater affordability, customization, and convenience with quick-prep meals.

The URLs listed below the initial summary are also AI-summarized, offering a list of publications and authors to contact for clarifications or enhancements.


Queries in Web Guide reveal Google’s preference for top competitors, such as this comparison of “Home Chef” and “Green Chef.”

Web Guide may or may not graduate to public search results; many such Google Labs experiments never do. While aimed at consumers, it implicitly helps search optimizers by revealing how Google’s AI interprets a query or understands a brand.

Google CTR Trends In Q3: Branded Clicks Fan Out, Longer Queries Hold via @sejournal, @MattGSouthern

Advanced Web Ranking released its Q3 Google organic clickthrough report, tracking CTR changes by ranking position across query types and industries.

The company compared July through September against April through June. The dataset is international, so the patterns reflect broad search behavior rather than a single region.

Here’s what stands out in this quarter’s report.

Branded Desktop Searches Shift Clicks Down-Page

The clearest movement this quarter shows up in branded queries on desktop.

For searches containing a brand or business name, position 1 lost 1.52 percentage points of CTR. Positions 2 through 6 gained a combined 8.71 points.

Unbranded queries were mostly unchanged, so this shift appears specific to how people navigate brand SERPs on desktop.

Commercial & Location Queries Lose Top CTR

When AWR sorted results by intent, commercial and location searches posted the clearest top-position declines.

Commercial queries, defined as searches including terms like “buy” or “price,” saw positions 1 and 2 on desktop drop a combined 4.20 points. Position 1 accounted for most of that loss at 3.01 points.

Location searches also weakened at the top. Position 1 fell 2.52 points on desktop and 2.13 points on mobile.

AWR doesn’t attribute cause, but these are the SERPs where rich results and other modules can crowd the page.

The takeaway is that top organic placements in commercial and local contexts captured a smaller share of clicks in Q3 than they did in Q2.

Longer Queries Hold Steady

Query length shows another split that matters for forecasting traffic.

On desktop, position-1 CTR fell for shorter multi-word searches. Two-word queries dropped 1.22 points and three-word queries dropped 1.24 points at the top spot.

AWR notes that 4+ word queries were the only group with steady CTR this quarter.

On mobile, the movement went the other way for the shortest queries. One-word searches gained 1.52 points at position 1.

The takeaway here is that short, generic desktop searches remain the most volatile category of CTR performance, while longer searches looked more stable in Q3.

Industry Winners And Losers

AWR tracked CTR shifts across 18 verticals and tied those changes to demand trends.

The report highlighted several large moves:

  • Arts & Entertainment had the steepest single-position decline, with position 1 on desktop down 5.13 points.
  • Travel showed the strongest gain, with position 2 on desktop up 2.46 points.
  • Shopping saw a redistribution near the top. Position 1 on desktop fell 2.10 points, while positions 2 and 3 gained a combined 2.83 points.

The takeaway is that CTR isn’t shifting evenly across verticals. Some categories are seeing a top-spot squeeze, while others are seeing clicks spread across more of the upper results.

Why This Matters For You

Q3 adds another data point for explaining CTR changes when rankings stay flat.

For branded desktop searches, position 1 is still dominant, but it’s no longer absorbing as much of the clickshare as last quarter.

If you track brand terms, it’s worth watching whether traffic is distributing across multiple listings on those SERPs.

And if your traffic depends on short, high-volume desktop queries, this report suggests those segments are still the most exposed to quarter-over-quarter click shifts. Longer searches were the only length group that held steady at the top in Q3.

Looking Ahead

AWR’s report reflects an international dataset and doesn’t isolate a single driver behind the CTR movement. Still, the direction in Q3 is clear in a few places.

Branded desktop clicks are spreading beyond position 1, and commercial and local SERPs continue to pressure the top organic slot.


Featured Image: Roman Samborskyi/Shutterstock

SEO Pulse: Gemini 3 Arrives & Adobe Buys Semrush via @sejournal, @MattGSouthern

Welcome to this week’s Pulse: the updates affect how AI surfaces content, how you track brand demand, and where core SEO tools sit in the wider marketing stack.

Google launched Gemini 3 directly into AI Mode in Search, Adobe announced a $1.9 billion acquisition of Semrush, and Google shipped two reporting updates in Search Console: custom annotations and a branded queries filter.

Here’s what matters for you and your work.

Google Brings Gemini 3 To AI Mode On Launch Day

Google released Gemini 3 Pro and integrated it into AI Mode in Search on day one. This is the first time a new Gemini model has shipped to Search at launch.

Gemini 3 Pro is available now in AI Mode for Google AI Pro and Ultra subscribers in the U.S. by choosing “Thinking” from the model dropdown. Google plans to expand access to all U.S. users soon, with higher usage limits for paid subscribers.

Key Facts: Gemini 3 is live in AI Mode, the Gemini app, AI Studio, Vertex AI, and Google’s Antigravity platform. It brings new generative UI layouts and a more aggressive query fan-out system, with automatic model selection coming soon to route complex questions to Gemini 3.

Why SEOs Should Pay Attention

Gemini 3 pushes AI Mode further away from static answer boxes toward dynamic, tool-based responses. Instead of plain text, Google can decide when to surface calculators, simulations, or comparison tables based on your query, which changes how often people need to click through, even when your content underpins the answer.

Mordy Oberstein, Founder at Unify Marketing, connected Gemini 3’s capabilities to Google’s broader strategy in a LinkedIn post:

Gemini 3 offering a more diverse display “take” on the topic is where this is headed. I think if you combine this with what Liz Reid (Google’s Head of Search) said in a recent WSJ interview, the future of AI Mode is full-on SERP integration: multi-media text output + original source firsthand knowledge exploration.

His point frames Gemini 3 less as a model upgrade and more as another step toward AI Mode becoming the default SERP experience.

Read our full coverage: Google Brings Gemini 3 To Search’s AI Mode

Search Console Adds Custom Annotations To Performance Reports

Google launched custom annotations in Search Console performance reports. The feature lets you add contextual notes directly to traffic charts, marking specific dates with explanations for site changes or external events.

You can right-click any date on a performance chart, select “Add annotation,” and write a note up to 120 characters explaining what happened.

Key Facts: All annotations are visible to everyone with access to a property, each property can store up to 200 annotations, and entries older than 500 days are automatically deleted.
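To make those limits concrete, here is a hypothetical client-side sketch that mirrors the documented constraints (120-character notes, 200 annotations per property, auto-deletion after 500 days). The function name and data structure are illustrative assumptions, not part of any real Search Console API:

```python
from datetime import date, timedelta

# Documented Search Console limits (see Key Facts above).
MAX_NOTE_LEN = 120
MAX_PER_PROPERTY = 200
RETENTION_DAYS = 500

def prune_and_validate(annotations, new_note, today=None):
    """Hypothetical helper: annotations is a list of (date, note)
    tuples for one property; returns the list with expired entries
    dropped and the new note appended if it passes the checks."""
    today = today or date.today()
    cutoff = today - timedelta(days=RETENTION_DAYS)
    # Entries older than 500 days are automatically deleted.
    kept = [(d, n) for d, n in annotations if d >= cutoff]
    if len(new_note) > MAX_NOTE_LEN:
        raise ValueError(f"note exceeds {MAX_NOTE_LEN} characters")
    if len(kept) >= MAX_PER_PROPERTY:
        raise ValueError(f"property already holds {MAX_PER_PROPERTY} annotations")
    kept.append((today, new_note))
    return kept
```

The point of the sketch is simply that annotation history is bounded: teams relying on notes for long-term audits should export anything older than roughly 16 months before it expires.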

Why SEOs Should Pay Attention

Keeping track of when you shipped a change has always been awkward in Search Console. You make a template update, fix a technical issue, or publish a new section, then come back weeks later and have to reconstruct the timeline from Jira tickets or chat logs.

Custom annotations move that context into the chart itself so you can see change points alongside traffic shifts.

Brodie Clark, Independent SEO Consultant, highlighted the timing in a LinkedIn post:

Overall, I think this is a great move for GSC. Especially after changes like we’ve seen with the disabling of &num=100, which messed with our impressions and average position data massively. These annotations appear directly on your chart, providing a clear visual reference point for your data (just make sure they’re useful – because everyone who can access the property can see them).

For teams, that shared view makes it easier to understand why traffic changed without chasing down who did what and when.

Read the announcement: Custom annotations in Search Console

Adobe Acquires Semrush In $1.9 Billion Cash Deal

Adobe and Semrush announced a definitive agreement for Adobe to acquire Semrush in an all-cash transaction valued at approximately $1.9 billion.

Adobe will pay $12.00 per share, a premium of around 77% over Semrush’s prior closing price. Semrush shares climbed more than 70% after the announcement.

Key Facts: Both boards have approved the deal, closing is targeted for the first half of 2026 pending regulatory and shareholder approval, and Semrush will join Adobe’s Digital Experience business alongside Adobe Experience Manager and Adobe Analytics.

Why SEOs Should Pay Attention

Core SEO and visibility tooling continues to consolidate into large enterprise suites. Semrush has already moved toward monitoring brand presence across AI assistants as well as traditional search, which fits with Adobe’s focus on cross-channel experience and analytics.

Eli Schwartz, Author of “Product-Led SEO,” outlined the deal’s strategic implications on LinkedIn:

Adobe + Semrush means three things: SEO is still a very valuable channel, yet it was undervalued by Wall Street, which is why Adobe paid a premium on its market cap. The value isn’t in seeing the visibility – the value is seeing what happens after the visibility. Search visibility + analytics is going to make a potent tool. The cross-sell and upsell opportunities between these businesses are going to be massive.

If you rely on Semrush, you may see product and pricing shift toward deeper integration with Adobe’s stack, which could benefit teams already standardizing on Adobe while changing the equation for everyone else.

Read our full coverage: Adobe To Acquire Semrush In $1.9 Billion Cash Deal

Google Search Console Adds Branded Queries Filter

Google introduced a branded queries filter in the Search Console Performance report that automatically separates branded and non-branded search traffic.

The filter appears under “Filter by query” and works across all search types, including web, image, video, and news. A new card in the Insights report shows the breakdown of clicks for branded versus non-branded queries.

Key Facts: Google uses an AI-driven system to classify branded queries, including misspellings, variations, and brand-related products or services. The filter is only available for top-level properties with enough volume and is rolling out gradually over the coming weeks.

Why SEOs Should Pay Attention

Separating branded and non-branded traffic makes it easier to see whether your SEO work is expanding reach or amplifying existing demand.

Non-branded queries are your discovery channel, while branded queries reflect how often people look you up by name. With this filter, you can benchmark both segments before and after big initiatives and understand whether growth is coming from new audiences, increased brand demand, or a mix of the two.

Mags Sikora, SEO Director at Strategy for AI-Led SERPs, pointed out the technical detail in a LinkedIn post:

Crucially, this isn’t regex-based. Google is using an AI-driven system that recognises your brand across languages, catches typos and variations, and can even classify queries that don’t explicitly mention the brand but refer to a unique product or service you offer.

She added that Google acknowledges some queries may be misclassified due to the dynamic, contextual nature of brand detection, and that the filter only changes reporting, not rankings.

Read the announcement: Branded queries filter in Search Console

Theme Of The Week: Making AI Search Legible

Each story this week is about making AI-powered search easier to see and explain.

Gemini 3 pushes more queries into dynamic AI layouts, while custom annotations and the branded queries filter give you better ways to document changes and separate brand demand from discovery. Adobe’s Semrush deal continues the trend toward rolling SEO visibility into broader analytics stacks.

Taken together, this week is less about “new features” and more about storytelling: where your brand shows up in AI experiences, how that visibility changes over time, and how you translate those patterns into metrics your stakeholders actually care about.



Featured Image: pui_bunny/Shutterstock

The Role Of Brand Authority And E-E-A-T In The AI Search Era via @sejournal, @DuaneForrester

AI-generated answers are spreading across search. Google and Bing are each presenting synthesized responses alongside regular results. These answers are not replacing traditional SERPs yet, but they are taking up attention. As they improve, they influence what people see first and what they trust most. The question is no longer whether they will change search, but how much of your brand’s visibility they will absorb as they expand. And as usage of ChatGPT, Claude, Perplexity, and other platforms continues to grow, we’re going to start to see user habits shift. Which means we’ll see more engagement with synthesized answers, with no traditional SERPs in sight at all.

Being ranked is no longer enough. When machines decide which brands to cite or quote, the deciding factor is trust. The brands that become part of AI-generated answers are those seen as authoritative and credible. That is where E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) takes on greater importance.

Image Credit: Duane Forrester

Understanding E-E-A-T

Yup, we are about to re-walk well-traveled territory in this section, much of which you may already know. But here’s the rub … this is still news to some folks, and many who claim to know it still get the execution wrong, so please bear with me through this section if you are already crushing it with E-E-A-T.

E-E-A-T is not a single ranking factor. It is a framework used by Google’s search evaluators to judge how credible, useful, and accurate a page appears. You can read the full guidelines here: https://services.google.com/fh/files/misc/hsw-sqrg.pdf.

Experience refers to first-hand involvement. It is the signal that you have actually done or tested what you are writing about. Expertise is the skill or background that ensures accuracy. Authoritativeness reflects recognition from others: citations, backlinks, and mentions that confirm your credibility. Trustworthiness is the foundation. It is built through transparency, consistency, and honesty. In Google’s guidelines, trust is described as the single most important quality of a high-value page. The other three factors exist to reinforce it.

These same principles are now emerging in AI systems. Models trained to generate answers rely on reliable, verifiable information. A system cannot “feel” trust, but it can measure it through repetition and context. The more your brand appears in credible environments, the stronger your statistical trust signal becomes.

It’s worth also noting that E-E-A-T is not a Holy Grail. It’s not the silver bullet, a magic concept, or a single-point savior for sites struggling with poor UX, weak content, troubled histories, and so on. It’s a part of the whole landscape of work you need to do to enjoy success, but I’m calling it out here because this whole article is really about trust and its importance to LLM-based answers.

How AI Answers Are Changing Discovery

Search results still look familiar, but discovery no longer begins and ends with a search box. AI-generated answers now appear in Gemini, Perplexity, Bing Copilot, ChatGPT, and Claude, each shaping what people learn before they ever visit a website. These systems don’t replace traditional results, but they compete for the same attention. They answer quickly, carry conversational authority, and often satisfy curiosity before a click happens.

For SEOs, this creates two overlapping visibility systems. The first is still the structured web: ranking pages through links, metadata, and relevance. The second is the interpretive layer of AI retrieval and synthesis. Instead of evaluating pages in order, these systems evaluate meaning. They identify fragments of content, score them for reliability, and rewrite them into new narratives. Visibility no longer depends only on ranking high; it depends on being known, cited, and semantically retrievable.

Each major platform handles this differently.

  • Gemini and Bing Copilot remain closest to classic search, combining web results with AI-generated summaries. They still reference source domains and show linked citations, giving SEOs some feedback on what’s being surfaced.
  • Perplexity acts as a bridge between web and conversation. It routinely cites the domains it draws from, often favoring pages with structured data, clear headings, and current publication dates.
  • ChatGPT and Claude represent a different kind of discovery altogether. Inside these environments, users often never see the open web. Answers are drawn from model knowledge, premium connectors, or browsing results, sometimes citing, sometimes not. Yet they still shape awareness and trust. When a consumer asks for “the best CRM for small business,” and your brand appears in that response, the exposure influences perception even if it happens outside Google’s ecosystem.

That’s the part most marketers miss: Visibility now extends beyond what typical analytics can track. People are discovering, comparing, and deciding inside AI tools that don’t register as traffic sources. A mention in ChatGPT or Claude may not show up in referral logs, but it builds brand familiarity that can resurface later as a direct visit or branded search.

This creates a new discovery pathway. A user might start with an AI conversation, remember a brand name that sounded credible, and later search for it manually. Or they might see it mentioned again inside Gemini’s summaries and click then. In both cases, awareness grows without a single traceable referral.

The measurement gap is real. Current analytics tools are built for link-based behavior, not conversational exposure. Yet the signals are visible if you know where to look. Rising branded search volume, increased direct traffic, and mentions across AI surfaces are early indicators of AI-driven visibility. Several emerging platforms now monitor brand appearance inside ChatGPT, Claude, Gemini, and Perplexity responses, offering the first glimpses of how brands perform in this new layer.

In practice, this means SEO strategy now extends beyond ranking factors into retrieval factors. Crawlable, optimized content remains essential, but it also needs to be citation-ready. That means concise, fact-driven writing, updated sources, and schema markup that defines your authors, organization, and entities clearly enough for both crawlers and AI parsers to verify.

Traditional SEO remains your discoverability engine. AI citation has become your credibility engine. One ensures you can be found; the other ensures you can be trusted and reused. When both operate together, your brand moves from being searchable to being referable, and that’s where discovery now happens.

Expanding Challenges To Brands

This shift introduces new risks that can quietly undermine visibility.

  • Zero-click exposure is the first. Your insights might appear inside an AI answer without attribution if your brand identity is unclear or your phrasing too generic. This isn’t really “new” to SEOs who have long had to deal with typical zero-click answer boxes in SERPs, but this expands that footprint noticeably.
  • Entity confusion is another. If your structured data or naming conventions are inconsistent, AI systems can mix your brand with similar ones.
  • Reputation bleed happens when old or inaccurate content about your brand lingers on third-party sites. AI engines scrape that information and may present it as fact.
  • Finally, trust dilution is an issue. The flood of AI-generated content is making it harder for systems to separate credible human work from synthetic filler. In response, they will likely narrow the pool of trusted domains.

These risks are not yet widespread, but the direction is obvious. Brands that delay strengthening trust signals will feel it later.

How To Build Trust And Authority

Building authority today means creating signals that both people and machines can verify. This is what content moating looks like in practice: establishing proof of expertise that’s difficult to fake or copy. It starts with clear ownership. Every piece of content should identify who created it and why that person is qualified to speak on the topic. Readers and algorithms alike look for visible credentials, experience, and professional context. When authorship is transparent, credibility becomes traceable.

Freshness signals care. Outdated information, dead links, or references to old data quietly undermine trust. Keeping content current shows ongoing involvement in your subject and helps both users and search systems recognize that your knowledge is active, not archived.

Structure supports this effort. Schema markup for articles, authors, and organizations gives machines a way to verify what they’re seeing. It clarifies relationships: who wrote the piece, what company they represent, and how it fits into a larger body of work. Without it, even well-written content can get lost in the noise.
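
To make that relationship graph concrete, here is a minimal sketch of Article markup in JSON-LD, generated from Python for readability. Every name and URL is a hypothetical placeholder, not a real entity or a prescription from this article:

```python
import json

# A minimal sketch of Article schema markup connecting a piece of content
# to its author and publishing organization. All names and URLs below are
# hypothetical placeholders.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How Structured Data Supports AI Citation",
    "dateModified": "2025-11-01",
    "author": {
        "@type": "Person",
        "name": "Jane Example",      # who wrote the piece
        "jobTitle": "Head of SEO",   # visible credentials
        "worksFor": {"@id": "https://example.com/#org"},
    },
    "publisher": {
        "@type": "Organization",
        "@id": "https://example.com/#org",  # stable entity ID reused site-wide
        "name": "Example Co",
    },
}

# Embed the markup as a JSON-LD block in the page's <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(article_markup, indent=2)
    + "\n</script>"
)
print(snippet)
```

Note the reused `@id`: the author's `worksFor` points at the same identifier as the publisher, which is what lets parsers resolve "who wrote this and for which organization" without guessing.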

External validation deepens the signal. When reputable outlets cite or reference your work, it strengthens your perceived authority. Media mentions, partnerships, and collaborations all act as third-party endorsements that reinforce your brand’s credibility. They tell both people and AI systems that others already trust what you have to say.

Then there’s the moat that no algorithm can replicate: original insight. Proprietary data, firsthand experience, and in-depth case studies show real expertise. These are the assets that set your content apart from AI-generated summaries because they contain knowledge that isn’t available elsewhere on the web.

Finally, consistency ties it all together. The version of your brand that appears on your website, LinkedIn profile, YouTube channel, and review sites should all align. Inconsistent bios, mismatched tone, or outdated information create friction that weakens perceived trust. Authority is cumulative. It grows when every signal points in the same direction.

The Coming Wave Of Verification

In the near future, trust will not just be a guideline. It will become a measurable inclusion standard. Major AI platforms are developing what are often called universal verifiers, systems that check the accuracy and reliability of content before it is included in an answer. These tools will aim to confirm that cited information is factually correct and that the source has a history of accuracy.

When this arrives, the brands that already display strong trust cues will pass verification more easily. Those without structured data, transparent authorship, or verifiable sourcing will struggle to appear. What HTTPS did for security, these systems may soon do for credibility.

This will also redefine technical SEO. It will not be enough for your site to be fast and crawlable. It will need to be verifiable. That means clear author data, factual sourcing, and strong entity ties that confirm ownership.

How To Measure Progress

New forms of visibility require new measurement. Traditional metrics like traffic, backlinks, and keyword rankings still matter, but they no longer tell the full story.

  • Track whether your brand appears in AI-generated answers. Test your visibility directly in chatbots, answer engines, and the emerging monitoring tools built for this.
  • Monitor branded search volume over time; it reflects whether your exposure in AI summaries is driving awareness.
  • Audit your structured data and author markup regularly. Consistency is what keeps you trusted.
  • Track external mentions and citations in high-trust environments. Authority builds where consistency meets recognition.
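
There is no standard metric yet for AI answer visibility, but the first bullet can be approximated by sampling answers to your key category prompts and computing a simple mention share. A minimal illustrative sketch, where the brand name and sampled answers are invented:

```python
import re

def mention_share(answers, brand):
    """Fraction of sampled AI answers that mention the brand at all,
    using a case-insensitive whole-word match."""
    pattern = re.compile(rf"\b{re.escape(brand)}\b", re.IGNORECASE)
    hits = sum(1 for text in answers if pattern.search(text))
    return hits / len(answers) if answers else 0.0

# Hypothetical answers collected by asking several assistants
# the same set of category questions.
sampled_answers = [
    "Top options include Acme Analytics and two open-source tools.",
    "Most reviewers recommend starting with a free tier.",
    "Acme Analytics is frequently cited for its reporting features.",
    "There are several vendors in this space.",
]

print(mention_share(sampled_answers, "Acme Analytics"))  # 2 of 4 answers -> 0.5
```

Tracked over time and against competitors, even a crude share like this gives you a baseline trend where no official analytics exist.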

What Matters Most

E-E-A-T was once a quality checklist. Now it is a visibility strategy. Search systems and AI models are moving toward the same destination – finding reliable information faster.

Experience proves you have done the work. Expertise ensures you can explain it accurately. Authoritativeness confirms others trust you. Trustworthiness ties it all together. And if you believe your own interpretation and approach to E-E-A-T is good enough, look at your current search rankings. They can act as an early warning. If you consistently fail to rank well for key terms, that could be a clue that AI systems will judge your content as “less than” compared with competing pieces. That is by no means a direct mapping, but if you consistently struggle to clear the trust gates of traditional search, you are unlikely to get a pass from AI systems as they ramp up their focus on trust.

The brands that live these principles will be the ones cited, quoted, and remembered. In a world of AI-generated answers, your reputation becomes your ranking signal. Build it deliberately. Make it visible. Keep it consistent.

That is how you stay trusted when the answers start writing themselves.

This post was originally published on Duane Forrester Decodes.


Featured Image: Viktoriia_M/Shutterstock

SEO Community Reacts To Adobe’s Semrush Acquisition via @sejournal, @martinibuster

The SEO community is excited by the Semrush Adobe acquisition. The consensus is that it’s a milestone in the continuing evolution of SEO in the age of generative AI. Adobe’s purchase comes at a time of AI-driven uncertainty and may be a sign of the importance of data for helping businesses and marketers who are still trying to find a new way forward.

Cyrus Shepard tweeted that he believes the Semrush sale creates an opportunity for Ahrefs, arguing that Adobe’s scale and emphasis on the enterprise market will let Ahrefs move fast to respond to the rapidly changing needs of the marketing industry.

He tweeted:

“Adobe’s marketing tools lean towards ENTERPRISE (AEM, Adobe Analytics). If Adobe leans this way with Semrush, it may be a less attractive solution to smaller operators.

With this acquisition, @ahrefs remains the only large, independent SEO tool suite on the market. Ahrefs is able to move fast and innovate – I suspect this creates an opportunity for Ahrefs – not a problem.”

Shepard is right: some of Adobe’s products (like Adobe Analytics) do lean toward enterprise users, but there’s a significant small and medium-sized business user base for design-related tools, with pricing in the $99/month range that makes them relatively affordable. Nevertheless, that’s a significant cost compared to the roughly $600 Adobe used to charge for standalone versions for Windows and Mac.

I agree that Ahrefs is quite likely the best positioned tool to serve the needs of the SMB end of the SEO industry should Semrush increase focus on the enterprise market. But there are also smaller tools like SERPrecon that are tightly focused on helping businesses deliver results and may benefit from the vacuum left by Semrush.

Validates SEO Platforms

Seth Besmertnik, CEO of the enterprise SEO platform Conductor, sees the acquisition as validating SEO platforms, a fair observation considering how much cash Adobe paid for Semrush.

Besmertnik wrote:

“I’m feeling a lot this morning. HUGE news today. Adobe will be acquiring Semrush…our partner, competitor, and an ally in the broader SEO and AEO/GEO world for over a decade.

For a long time, big tech ignored SEO. It drove half of the internet’s traffic, yet somehow never cleared the bar as something to own. I always believed the day would come when major platforms took this category seriously. Today is that day.”


Besmertnik also made the point that the industry is entering a transitional phase where platforms that are adapted to AI will be the leaders of tomorrow.

He added:

“This next era won’t be led by legacy architectures. It will be led by platforms that built their foundations for AI…and by companies engineered for the data-first, enterprise-grade world that’s now taking shape.”

Validates SEO

Duane Forrester, formerly of Bing, shared the insight that the acquisition shows how important SEO is, especially as the industry is evolving to meet the challenges of AI search.

Duane shared:

“It’s an exciting moment! We’re starting to see some consolidation and this represents huge recognition of how important the work of SEOs is. From traditional SEO through optimizing for AI platforms, the work is important. Clearly Adobe is thinking this way on behalf of their clientele, which means great things ahead.”

Online Reactions Were Mostly Positive

There were a few negative comments published in response to Adobe’s announcement on X (formerly Twitter), where some used the post to vent about pricing and other grudges, but many others from the SEO community offered congratulations to Semrush.

What It All Means

As multiple people have said, the sale of Semrush is a landmark moment for SEO and for SEO platforms because it puts a dollar figure on the importance of digital marketing at a time when the search marketing industry is struggling to reach consensus on how SEO should evolve to meet the many changes introduced by AI Search.

Many Questions Remain Unanswered

What Will Adobe Actually Do With Semrush’s Product?

Will Semrush remain a standalone product, will it be offered in multiple versions for enterprise users and SMBs, or will it be folded into one of Adobe’s cloud offerings?

Pricing

A common concern is about pricing and whether the cost of Semrush will go up. Is it possible that the price could actually come down?

Semrush Is A Good Fit For Adobe

Adobe started as a software company focused on graphic design products, but by the turn of the millennium it began acquiring companies directly related to digital marketing and web design, increasingly focusing on the enterprise market. Data is useful for planning content and also for better understanding what’s going on at search engines and at AI-based search and chat. Semrush is a good fit for Adobe.

Featured Image by Shutterstock/Sunil prajapati

Google’s Old Search Era Is Over – Here’s What 2026 SEO Will Really Look Like

For years, Google’s predictable, and at times too easily gamed, ecosystem created an illusion that SEO success came from creating any and all content and checking boxes rather than understanding users.

During the era of massive top‑of‑funnel traffic and generously ranked low‑quality content, many marketers mistook timing and loopholes for talent, and most still don’t realize it. Google unintentionally fueled this overconfidence by rewarding keyword stuffing, shallow articles, and formulaic playbooks that had little to do with real expertise.

Those days are gone. Today, AI-slop in the SERPs, fragmented discovery across social and generative AI chatbots, and the rise of agentic systems have exposed just how fragile those old SEO tactics really were.

SEO isn’t dying; it’s finally maturing.

And the marketers who win from this point forward are the ones who:

  • Understand audience behavior.
  • Build trust.
  • Earn authoritative attention across platforms, formats, and AI-powered environments.

That’s why we created SEO Trends 2026, our most comprehensive annual analysis yet.

It captures where discovery is shifting, how search behavior is changing, and what’s actually working for top SEOs right now.

And, it’s based on first-hand insights from some of the most respected operators in the industry.

Inside this year’s edition, you’ll learn:

  • How to protect your visibility in an AI-first discovery landscape.
  • Which platforms and content types are emerging as new engines of trust.
  • Why brand experience now influences rankings as much as on-page content.
  • The single most important strategic shift SEOs must make for 2026.

Key Finding #1: SEO Is Splintering Into New Discovery Paths

Discovery has fractured far beyond the ten blue links. Users now bounce between TikTok, Reddit, YouTube, ChatGPT, Gemini, and AI assistants before ever reaching a website.

Gen Z alone starts 1 in 10 searches with Google Lens, and 20% of those carry commercial intent. 

Traditional TOFU content has lost ground as AI systems increasingly summarize it.

Why it matters for SEO: Visibility now requires showing up consistently across multiple platforms, not just search.

Learn how to start reallocating your content and platform strategy to match this shift. Download the SEO Trends 2026 ebook for the tactical playbook.

Key Finding #2: Content AI Can’t Replicate Is Driving Results

Top SEOs reported that the content performing best in 2026 is the kind AI can’t easily imitate: opinionated commentary, first-hand experience, data-rich insights, and multimedia storytelling.

Shelley Walsh highlights that video interviews and experience-based formats “gain visibility across social, SERPs, and LLMs” precisely because they contain a human perspective.

SEO Opportunity: SEOs must invest in formats that feel unmistakably human. It’s not enough to publish “helpful content.” You need content that’s un-cannibalizable.

Download the ebook to explore SEO-first content trends that are gaining visibility in 2026.

Key Finding #3: AI Is Now A Competitive Necessity And A Threat

AI assistants and chatbots are quickly becoming the default discovery channel for millions of users.

LLMs now absorb the informational queries that once fueled website traffic, and they evaluate brands based on third-party mentions, sentiment, and authority signals across platforms.

Yet at the same time, these systems introduce new risk:

  • Truncated SERPs.
  • Hallucinations.
  • Opaque ranking logic.

As Katie Morton notes, Google is incentivized to keep users on its properties, often at the expense of search quality.

Why it matters for SEO in 2026: If you aren’t shaping how AI systems interpret your brand, they’ll pull from someone else’s narrative.

Get direction from the industry’s top SEO experts in SEO Trends 2026.

Key Finding #4 & SEO Predictions For 2026

Download the full ebook to access the complete set of 2026 predictions.

Search is changing faster than ever, but the through-line is clear: SEO is becoming a holistic, multi-platform marketing discipline.

User journeys now weave through AI agents, social feeds, community forums, image results, chat interfaces, and, only sometimes, traditional SERPs. Brands need to meet users wherever they seek information, and ensure that every touchpoint reinforces clarity, authority, and trust.

The most successful teams in 2026 will:

  • Invest deeply in audience understanding.
  • Create content that satisfies human expectations, not algorithmic myths.
  • Build owned communities to reduce platform dependence.
  • Monitor how AI systems surface, summarize, and cite their content.
  • Prioritize conversion and loyalty over traffic alone.

If you want to future-proof your search strategy and strengthen your brand’s presence across every discovery engine, download SEO Trends 2026 today. It’s the clearest roadmap we’ve ever published for navigating the AI search era with confidence.

Get the full ebook now and start building your 2026 strategy with data, not guesswork.

SEO Trends 2026


Featured Image: CHIEW/Shutterstock

Digital Equity Is Brand Equity: Don’t Lose Search Visibility In a Merger via @sejournal, @billhunt

Most mergers and acquisitions (M&A) fail to account for the digital infrastructure and visibility of the acquired brands. While executives obsess over legal, financial, and branding integration, they overlook the most visible and valuable touchpoint: the website. This digital neglect often leads to steep drops in search visibility, broken customer journeys, and millions in lost revenue.

This article breaks down the Digital Dilution Effect, a compounding loss of equity, visibility, and performance when digital is mismanaged during M&A, and offers a recovery playbook for executives looking to preserve and grow digital value.

I’ve seen the negative impact firsthand, working with multinationals that acquire dozens of companies each year. It’s the same drill over and over. I remember being in a meeting where the SVP was screaming at the former CEO of an acquired company for not delivering.

The CEO shot back:

“You destroyed everything. We used to get 90% of our leads from organic search. Now our 1,000-page site is gone, replaced by six fluff pages buried in your corporate site with no marketing or ad support.”

That moment became the catalyst for a project I’d been lobbying for: integrating digital migration planning into the M&A process to prevent what I now call the Digital Dilution Effect, the systematic erosion of online visibility and value post-acquisition.

What Is The Digital Dilution Effect?

Digital Dilution is the measurable loss of traffic, brand equity, and revenue that occurs when websites are merged, redirected, or rebranded without a coordinated SEO, content, and infrastructure strategy.

It’s the digital version of goodwill impairment, but worse:

  • The audience knows something’s broken.
  • The platforms (Google, Bing, ChatGPT) lose trust in your content.
  • Your visibility gets reassigned to a competitor or the generative AI black hole.

Why it matters:

In a world where discovery and decision-making are increasingly digital, failing to maintain your brand’s digital presence during an M&A can wipe out the very value you paid for.

The Most Common Causes

  1. Visibility Loss From Domain Consolidation. Rebranding a target company without preserving its search footprint is the fastest way to disappear from customer queries. Redirects are often misconfigured, delayed, or deprioritized.
  2. Visibility Loss From Content Consolidation. As in the experience above, the acquired companies’ digital assets are consolidated from hundreds or thousands into a few “product pages” on the acquirer’s website, losing all the equity they had gained.
  3. Mismatched Infrastructure & CMS Conflicts. Many acquired sites run on different platforms. Migrating to a “standard” content management system (CMS) without considering indexation, internal linking, and site structure almost always leads to crawl chaos.
  4. Conflicting Geo Targeting & Hreflang Implementation. For global firms, improper hreflang consolidation or mismatched country/language logic can result in pages being served to the wrong markets or not at all.
  5. Content Cannibalization. When duplicate or overlapping content isn’t rationalized, search engines are forced to choose which version to index, often selecting neither.
  6. Analytics & Conversion Tracking Breakage. If tracking is not unified across merged properties, you’re flying blind – unable to measure loss, retention, or recovery efforts.
  7. Delay Between Brand Announcement And Web Update. There’s often a months-long gap between press releases and full web updates. During this window, confused users and crawlers both disengage.

Case In Point: A Costly Oversight

A global manufacturing firm acquired a smaller European competitor in a $200 million deal. The acquired brand had strong organic rankings across multiple languages and had become the default source in Google’s AI snippets for specific technical questions.

However:

  • The SEO team wasn’t consulted until eight weeks after the post-acquisition rebrand launched.
  • All top-performing content was redirected to a single press release page.
  • Traffic dropped 94% within 30 days.
  • The AI systems removed the content from summaries, and competitors replaced it.

The cost?

Over $4.5 million in lost monthly inbound lead value, plus the erosion of the technical authority they had spent years building.

The Real Cost Of Misalignment

During M&A, you’ll hear executives ask:

“How quickly can we realize synergies?”
“What’s the roadmap for operational integration?”

But rarely:

“What’s our plan for preserving digital visibility and brand equity?”

That absence is costly.

  • Marketing loses traction with no ability to retarget or convert.
  • Sales loses the inbound pipeline that powered growth.
  • Product teams struggle to communicate value.
  • Investors see a drop in performance that contradicts synergy projections.

And because SEO and digital visibility aren’t line items in the M&A model, the root cause is often missed.

Why It Keeps Happening

M&A teams are built for compliance and speed.

  • Legal teams want minimal liability.
  • IT wants platform standardization.
  • Marketing wants the new brand live, fast.

But no one is assigned to protect digital equity. The SEO team, if they’re even consulted, often gets overruled or brought in too late.

And in global M&As, the fragmentation is even worse:

  • Regionally controlled sites follow different standards.
  • Language variants conflict with the new global strategy.
  • Schema and structured data get stripped in the migration.

All of this results in a loss of discoverability – and with it, business momentum.

A Digital Recovery Playbook

To avoid – or reverse – digital dilution, here’s what leaders must do:

1. Audit Digital Visibility Before The Deal Closes

Understand which pages drive traffic, leads, and brand authority. This becomes your digital equity ledger.

2. Create A Visibility Preservation Plan

Build a redirect map, structured data strategy, and hreflang alignment plan before you migrate anything.
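
At its simplest, a redirect map is a one-to-one table from legacy URLs to their new homes, sanity-checked before launch. A minimal sketch of that check, with hypothetical URLs; a real plan would also verify 301 status codes and hreflang pairs in staging:

```python
from urllib.parse import urlparse

# Hypothetical legacy -> new URL pairs for a post-acquisition migration.
redirect_map = {
    "https://acquired.example/products/widget-a": "https://parent.example/widgets/widget-a",
    "https://acquired.example/blog/widget-guide": "https://parent.example/resources/widget-guide",
    "https://acquired.example/about": "https://parent.example/company/acquired-brand",
}

def validate_redirect_map(mapping):
    """Flag common migration mistakes: redirect chains (a target that is
    itself a source), malformed targets, and many-to-one collapses that
    destroy page-level equity."""
    issues = []
    targets = list(mapping.values())
    for src, dst in mapping.items():
        if dst in mapping:  # chained redirect: src -> dst -> somewhere else
            issues.append(f"chain: {src} -> {dst} -> {mapping[dst]}")
        if not urlparse(dst).scheme.startswith("http"):
            issues.append(f"bad target: {dst}")
    for dst in set(targets):
        if targets.count(dst) > 3:  # threshold is arbitrary and illustrative
            issues.append(f"collapse: {targets.count(dst)} pages -> {dst}")
    return issues

print(validate_redirect_map(redirect_map))  # [] for this clean map
```

The "collapse" check is the programmatic version of the horror story above: hundreds of pages all redirected to one press release page is exactly the pattern it flags.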

3. Assign A Digital Integration Lead

Give them real authority – someone who understands SEO, analytics, infrastructure, and cross-functional coordination.

4. Involve SEO In The Deal Room

Just as you review legal liabilities and brand risks, assess the visibility and platform risks with equal rigor.

5. Use The New Brand Launch As A Visibility Catalyst

Turn your rebrand into a content and media boost, not a silent flicker. Leverage schema, press coverage, and AI-optimized structured content.

6. Monitor And Course Correct

Expect a short-term dip, but monitor indexed pages, impressions, and citations weekly. Course correct aggressively.

Final Thought: Treat Digital Equity Like Brand Equity

In the analog world, a brand’s equity resides in customer trust, product perception, and reputation. In the digital world, that equity is increasingly stored in search visibility, content authority, and structured presence across AI and web ecosystems.

You wouldn’t toss out brand recognition in a logo redesign. Don’t toss out digital visibility in an M&A.

If the acquired company’s website is responsible for 60% of inbound leads, killing it without a plan is self-sabotage. If their blog is quoted in Google SGE or ChatGPT, removing it erases your relevance in future answers.

The CMO, CTO, and CSO must work together – from day zero of due diligence – not just to integrate operations but to preserve digital dominance.

Because if your brand can’t be found, it can’t be chosen. And if your new site becomes invisible, that “strategic acquisition” just became a liability.

M&A success isn’t just about alignment on paper; it’s about continuity in search, AI, and user experience. Protect that, and you protect your investment.


Featured Image: Anton Vierietin/Shutterstock

Selling AI Search Strategies To Leadership Is About Risk via @sejournal, @Kevin_Indig

Boost your skills with Growth Memo’s weekly expert insights. Subscribe for free!

AI search visibility isn’t “too risky” for executives to buy into. Selling AI search strategies to leadership is about how you frame risk.

Image Credit: Kevin Indig

A Deloitte survey of 2,700+ leaders reveals that getting buy-in for an AI search strategy isn’t about innovation, but risk.

SEO teams keep failing to sell AI search strategies for one reason: They’re pitching deterministic ROI in a probabilistic environment.

The old way: Rankings → traffic → revenue. But that event chain doesn’t exist in AI systems.

LLMs don’t rank. They synthesize. And Google’s AI Overviews and AI Mode don’t “send traffic.” They answer.

Yet most teams still walk into a leadership meeting with a deck built on a decaying model. Then, executives say no – not because AI search “doesn’t work,” but because the pitch asks them to fund an outcome nobody can guarantee.

In AI search, you cannot sell certainty. You can only sell controlled learning.

1. You Can’t Sell AI Search With A Deterministic ROI Model

Everyone keeps asking the wrong question: “How do I prove my AI search strategy will work so leadership will fund it?” You can’t; there’s no traffic chain you can model. Randomness is baked directly into the outputs.

You’re forcing leadership to evaluate your AI search strategy with a framework that’s already decaying. Confusion about AI search vs. traditional SEO metrics and forecasting is blocking you from buy-in. When SEO teams try to sell an AI search strategy to leadership, they often encounter several structural problems:

  1. Lack of clear attribution and ROI: Where you see opportunity, leadership sees vague outcomes and deprioritizes investment. Traffic and conversions from AI Overviews, ChatGPT, or Perplexity are hard to track.
  2. Misalignment with core business metrics: It’s harder to tie results to revenue, CAC, or pipeline – especially in B2B.
  3. AI search feels too experimental: Early investments feel like bets, not strategy. Leadership may see this as a distraction from “real” SEO or growth work.
  4. No owned surfaces to leverage: Many brands aren’t mentioned in AI answers at all. SEO teams are selling a strategy that has no current baseline.
  5. Confusion between SEO and AI search strategy: Leadership doesn’t understand the distinction between optimizing for classic Google Search vs. LLMs vs. AI Overviews. Clear differentiation is needed to secure a new budget and attention.
  6. Lack of content or technical readiness: The site lacks the structured content, brand authority, or documentation to appear in AI-generated results.

2. Pitch AI Search Strategy As Risk Mitigation, Not Opportunity

Executives don’t buy performance in ambiguous environments. They buy decision quality. And the decision they need you to make is simple: Should your brand invest in AI-driven discovery before competitors lock in the advantage – or not?

Image Credit: Kevin Indig

AI search is still an ambiguous environment. That’s why your winning strategy pitch should be structured for fast, disciplined learning with pre-set kill criteria instead of forecasting traffic → revenue. Traditionally, SEO teams pitch outcomes (traffic, conversions), but leadership needs to buy learning infrastructure (testing systems, measurement frameworks, kill criteria) for AI search.

Leadership thinks you’re asking for “more SEO budget” when you’re actually asking them to buy an option on a new distribution channel.

Everyone treats the pitch as “convince them it will work” when it should be “convince them the cost of not knowing is higher than the cost of finding out.” Executives don’t need certainty about impact – they need certainty that you’ll produce a decision with their money.

Making stakes crystal clear:

Your Point of View + Consequences = Stakes. Leaders need to know what happens if they don’t act.

Image Credit: Kevin Indig

The cost of passing on an AI search strategy can be simple and brutal:

  1. Competitors who invest early in AI search visibility will build entity authority and brand presence.
  2. Organic traffic will stagnate and drop over time while cost-per-click rises.
  3. AI Overviews and AI Mode outputs will replace queries your brand used to win in Google.
  4. Your influence on the next discovery channel will be decided without you.

AI search strategy builds brand authority, third-party mentions, entity relationships, content depth, pattern recognition, and trust signals in LLMs. These signals compound. They also freeze into the training data of future models.

If you aren’t shaping that footprint now, the model will rely on whatever scraps already exist based on whatever your competitors are feeding it.

3. Sell Controlled Experiments – Small, Reversible, And Time-Boxed

You’re asking for resources to discover the truth before the market makes the decision for you. This approach collapses resistance because it removes the fear of sunk cost and turns ambiguity into manageable, reversible steps.

A winning AI search strategy proposal sounds like:

  • “We’ll run x tests over 12 months.”
  • “Budget: ≤0.3% of marketing spend.”
  • “Three-stage gates with Go/No-Go decisions.”
  • “Scenario ranges instead of false-precision forecasts.”
  • “We stop if leading indicators don’t move by Q3.”

45% of executives rely more on instinct than facts. Balance your data with a compelling narrative – focus on outcomes and stakes, not technical details.

I covered how to build a pitch deck and strategic narrative in how to explain the value of SEO to executives; here, focus on selling learning as a deliverable in the current AI search landscape.

When you present to leaders, they focus on only three things: money (revenue, profit, cost), market (market share, time-to-market), and exposure (retention, risk). Structure every pitch around these.

The SCQA framework (Minto Pyramid) guides you:

  • Situation: Set the context.
  • Complication: Explain the problem.
  • Question: What should we do?
  • Answer: Your recommendation.

This is the McKinsey approach – and executives expect it.


Featured Image: Paulo Bobita/Search Engine Journal

The Knowns And Unknowns Of Structured Data Attribution via @sejournal, @marthavanberkel

As marketers, we love a great funnel. It provides clarity on how our strategies are working. We have conversion rates and can track the customer journey from discovery through conversion. But in today’s AI-first world, our funnel has gone dark.

We can’t yet fully measure visibility in AI experiences like ChatGPT or Perplexity. While emerging tools offer partial insights, their data isn’t comprehensive or consistently reliable. Traditional metrics like impressions and clicks still don’t tell the whole story in these spaces, leaving marketers facing a new kind of measurement gap.

To help bring clarity, let’s look at what we know and don’t know about measuring the value of structured data (also known as schema markup). By understanding both sides, we can focus on what’s measurable and controllable today, and where the opportunities lie as AI changes how customers discover and engage with our brands.

Why Most ‘AI Visibility’ Data Isn’t Real

AI has created a hunger for metrics. Marketers, desperate to quantify what’s happening at the top of the funnel, are turning to a wave of new tools. Many of these platforms are creating novel measurements, such as “brand authority on AI platforms,” that aren’t grounded in representative data.

For example, some tools are trying to measure “AI prompts” by treating short keyword phrases as if they were equivalent to consumer queries in ChatGPT or Perplexity. But this approach is misleading. Consumers are writing longer, context-rich prompts that go far beyond what keyword-based metrics suggest. These prompts are nuanced, conversational, and highly personalized – nothing like traditional long-tail queries.

These synthetic metrics offer false comfort. They distract from what’s actually measurable and controllable. The fact is, ChatGPT, Perplexity, and even Google’s AI Overviews aren’t providing us with clear and comprehensive visibility data.

So, what can we measure that truly impacts visibility? Structured data.

What Is AI Search Visibility?

Before diving into metrics, it’s worth defining “AI search visibility.” In traditional SEO, visibility meant appearing on page one of search results or earning clicks. In an AI-driven world, visibility means being understood, trusted, and referenced by both search engines and AI systems. Structured data plays a role in this evolution. It helps define, connect, and clarify your brand’s digital entities so that search engines and AI systems can understand them.

The Knowns: What We Can Measure With Confidence For Structured Data

Let’s talk about what is known and measurable today with regard to structured data.

Increased Click-Through Rates From Rich Results

Data from our quarterly business reviews shows that when structured data qualifies a page for a rich result, enterprise brands consistently see an increase in click-through rates. Google currently supports more than 30 types of rich results, which continue to appear in organic search.

For example, from our internal data, in Q3 2025, one enterprise brand in the home appliances industry saw click-through rates on product pages increase by 300% when a rich result was awarded. Rich results continue to provide both visibility and conversion gains from organic search.

Example of a product rich result on Google’s search engine results page (Screenshot by author, November 2025)
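As a concrete illustration, the kind of markup that can qualify a product page for a rich result is a Schema.org Product snippet embedded as JSON-LD. The sketch below builds one in Python; the product, prices, and ratings are invented placeholders, not data from the brand mentioned above.

```python
import json

# Hypothetical example: a minimal Product snippet of the kind that can
# qualify a page for a product rich result. All values are placeholders.
product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "ExampleBrand FrostFree Refrigerator",  # hypothetical product
    "image": "https://www.example.com/images/fridge.jpg",
    "description": "A 500L frost-free refrigerator.",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "offers": {
        "@type": "Offer",
        "price": "899.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "213",
    },
}

# Serialize to JSON-LD for embedding in a <script type="application/ld+json"> tag.
json_ld = json.dumps(product_schema, indent=2)
```

The offer and rating properties are what typically surface in the rich result itself (price, availability, review stars), which is why they tend to drive the click-through gains described above.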

Increased Non-Branded Clicks From Robust Entity Linking

It’s important to distinguish between basic schema markup and robust schema markup with entity linking that results in a knowledge graph. Schema markup describes what’s on a page. Entity linking connects those things to other well-defined entities across your site and the web, creating relationships that define meaning and context.

An entity is a unique and distinguishable thing or concept, such as a person, product, or service. Entity linking defines how those entities relate to one another, either through external authoritative sources like Wikidata and Google’s knowledge graph or your own internal content knowledge graph.

For example, imagine a page about a physician. The schema markup would describe the physician. Robust, semantic markup would also connect to Wikidata and Google’s knowledge graph to define their specialty, while linking to the hospital and medical services they provide.
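The physician example can be sketched in JSON-LD. In this hedged sketch, the Wikidata IDs and URLs are placeholders: the point is the pattern of `sameAs` links to external knowledge graphs and internal `@id` references that connect entities across the site.

```python
import json

# Hypothetical sketch of entity linking for the physician example above.
# The Wikidata ID and all URLs are placeholders, not real identifiers.
physician = {
    "@context": "https://schema.org",
    "@type": "Physician",
    "@id": "https://www.example-hospital.org/doctors/jane-doe#physician",
    "name": "Dr. Jane Doe",
    # sameAs disambiguates this entity against external authoritative sources.
    "sameAs": ["https://www.wikidata.org/wiki/Q00000000"],
    # The specialty is itself a defined entity, not just a text string.
    "medicalSpecialty": {
        "@type": "MedicalSpecialty",
        "name": "Cardiology",
    },
    # An internal @id reference connects this page to the hospital entity
    # defined elsewhere on the same site, forming a content knowledge graph.
    "hospitalAffiliation": {
        "@id": "https://www.example-hospital.org/#organization"
    },
}

json_ld = json.dumps(physician, indent=2)
```

Basic markup would stop at `@type` and `name`; the `sameAs`, `medicalSpecialty`, and `hospitalAffiliation` relationships are what turn page-level descriptions into a connected graph.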

Image from author, November 2025

AIO Visibility

Traditional SEO metrics can’t yet measure AI experiences directly, but some platforms can identify instances when a brand is mentioned in an AI Overview (AIO) result.

Research from a BrightEdge report found that adopting entity-based SEO practices supports stronger AI visibility. The report noted:

“AI prioritizes content from known, trusted entities. Stop optimizing for fragmented keywords and start building comprehensive topic authority. Our data shows that authoritative content is three times more likely to be cited in AI responses than narrowly focused pages.”

The Unknowns: What We Can’t Yet Measure

While we can measure the impact of entities in schema markup through existing SEO metrics, we don’t yet have direct visibility into how these elements influence large language model (LLM) performance.

How LLMs Are Using Schema Markup

Visibility starts with understanding – and understanding starts with structured data.

Evidence for this is growing. In Microsoft’s Oct. 8, 2025, blog post, “Optimizing Your Content for Inclusion in AI Search Answers (Microsoft Advertising),” Krishna Madhaven, Principal Product Manager for Microsoft Bing, wrote:

“For marketers, the challenge is making sure their content is easy to understand and structured in a way that AI systems can use.”

He added:

“Schema is a type of code that helps search engines and AI systems understand your content.”

Similarly, Google’s article, “Top ways to ensure your content performs well in Google’s AI experiences on Search,” reinforces that “structured data is useful for sharing information about your content in a machine-readable way.”

Why are Google and Microsoft both emphasizing structured data? One reason may be cost and efficiency. Structured data helps build knowledge graphs, which serve as the foundation for more accurate, explainable, and trustworthy AI. Research has shown that knowledge graphs can reduce hallucinations and improve performance in LLMs:

While schema markup itself isn’t typically ingested directly to train LLMs, the retrieval phase in retrieval-augmented generation (RAG) systems plays a crucial role in how LLMs respond to queries. In recent work, Microsoft’s GraphRAG system generates a knowledge graph (via entity and relation extraction) from textual data and leverages that graph in its retrieval pipeline. In their experiments, GraphRAG often outperforms a baseline RAG approach, especially for tasks requiring multi-hop reasoning or grounding across disparate entities.
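To make the multi-hop idea concrete, here is an illustrative sketch (not Microsoft's actual GraphRAG implementation): instead of matching isolated passages, retrieval walks entity relationships outward from a starting entity and collects connected facts as grounding context. The graph below is invented for the example.

```python
from collections import deque

# A tiny knowledge graph: entity -> list of (relation, entity) edges.
# All entities and relations are invented for illustration.
graph = {
    "Dr. Jane Doe": [("specializes_in", "Cardiology"), ("works_at", "Example Hospital")],
    "Example Hospital": [("offers", "Cardiac Imaging")],
    "Cardiology": [("treats", "Heart Disease")],
}

def multi_hop_context(start, max_hops=2):
    """Collect facts reachable within max_hops of the starting entity."""
    facts, frontier, seen = [], deque([(start, 0)]), {start}
    while frontier:
        entity, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for relation, target in graph.get(entity, []):
            facts.append(f"{entity} {relation} {target}")
            if target not in seen:
                seen.add(target)
                frontier.append((target, depth + 1))
    return facts

# The collected facts could then be passed to an LLM as retrieval context.
context = multi_hop_context("Dr. Jane Doe")
```

A flat passage search for “Dr. Jane Doe” would likely miss “Example Hospital offers Cardiac Imaging”; the graph traversal surfaces it because the entities are linked, which is the grounding advantage the GraphRAG experiments point to.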

This helps explain why companies like Google and Microsoft are encouraging enterprise brands to invest in structured data – it’s the connective tissue that helps AI systems retrieve accurate, contextual information.

Beyond Page-Level SEO: Building Knowledge Graphs

There’s an important distinction between optimizing a single page for SEO and building a knowledge graph that connects your entire enterprise’s content. In a recent interview, Robby Stein, VP of Product at Google, noted that AI queries can involve dozens of subqueries behind the scenes (a technique known as query fan-out). This suggests a level of complexity that demands a more holistic approach.

To succeed in this environment, brands must move beyond optimizing pages and instead build knowledge graphs, or rather, a data layer that represents the full context of their business.

The Semantic Web Vision, Realized

What’s really exciting is that the vision for the semantic web is here. As Tim Berners-Lee, Ora Lassila, and James Hendler wrote in “The Semantic Web” (Scientific American, 2001):

“The Semantic Web will enable machines to comprehend semantic documents and data, and enable software agents roaming from page to page to execute sophisticated tasks for users.”

We’re seeing this unfold today, with transactions and queries happening directly within AI systems like ChatGPT. Microsoft is already preparing for the next stage, often called the “agentic web.” In November 2024, RV Guha – creator of Schema.org and now at Microsoft – announced an open project called NLWeb. The goal of NLWeb is to be “the fastest and easiest way to effectively turn your website into an AI app, allowing users to query the contents of the site by directly using natural language, just like with an AI assistant or Copilot.”

In a recent conversation I had with Guha, he shared that NLWeb’s vision is to be the endpoint for agents to interact with websites. NLWeb will use structured data to do this:

“NLWeb leverages semi-structured formats like Schema.org…to create natural language interfaces usable by both humans and AI agents.”
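The “semi-structured formats” Guha refers to are the same JSON-LD blocks discussed throughout this article. As a hedged sketch of the first step such a system performs, the standard-library code below extracts the Schema.org JSON-LD from a page’s HTML; the example page and its markup are made up.

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collects the contents of <script type="application/ld+json"> tags."""
    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self.items = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld and data.strip():
            self.items.append(json.loads(data))

# A made-up example page carrying one Schema.org entity.
html = """
<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "FAQPage", "name": "Example FAQ"}
</script>
</head><body>Example page</body></html>
"""

parser = JsonLdExtractor()
parser.feed(html)
# parser.items now holds the page's machine-readable entities.
```

Once extracted, these entities are what a natural-language layer can query directly, which is why sites that already maintain rich Schema.org markup are well positioned for the agentic web.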

Turning The Dark Funnel Into An Intelligent One

Just as we lack real metrics for measuring brand performance in ChatGPT and Perplexity, we also don’t yet have full metrics for schema markup’s role in AI visibility. But we do have clear, consistent signals from Google and Microsoft that their AI experiences do, in part, use structured data to understand content.

The future of marketing belongs to brands that are both understood and trusted by machines. Structured data is one factor towards making that happen.

Featured Image: Roman Samborskyi/Shutterstock