Ex-Googler: Google Sees Publisher Traffic As A Necessary Evil via @sejournal, @martinibuster

Google says it values the open web, and a current Googler confirmed in a private conversation at the recent Search Central Live in New York that the company, including CEO Sundar Pichai, cares about the web ecosystem. But that message is contradicted by an ex-Googler, who said Google internally regards sending traffic to publishers as “a necessary evil.”

Constant Evolution Of Google Search

Elizabeth Reid, VP of Search, is profiled in Bloomberg as the person responsible for major changes to Google Search beginning in 2021, particularly AI Overviews. She previously worked on Google Maps and is the Googler who revealed the existence of core topicality systems at Google.

Her statements about search show how it’s changing and give an idea of how publishers and SEOs should realign their perspectives. The main takeaway is that technology enables users to interact with information in different ways, and search has to evolve to keep up with them. In her view, what’s happening is not a top-down approach in which Google imposes changes on users, but rather Google responding to users.

Her approach to search was said to be informed by her experience at Google Maps, where Sergey Brin pushed the team to release Maps before they felt ready. The lesson was that shipping early helped them understand what users really wanted faster than if they had waited.

According to Bloomberg:

“Reid refers to her approach as a “constant evolution” rather than a complete overhaul. Her team is still struggling to define the purpose of Google Search in this new era, according to interviews with 21 current and former search executives and employees…”

AI And Traditional Google Search

Google Search lost 20% of its search engineers, who went over to focus on rolling out generative AI, so perhaps it’s not surprising that Reid believes the search bar will lose prominence. According to the report:

“Reid predicts that the traditional Google search bar will become less prominent over time. Voice queries will continue to rise, she says, and Google is planning for expanded use of visual search, too.”

But she also said that the search bar isn’t going away:

“The search bar isn’t going away anytime soon, Reid says, but the company is moving toward a future in which Google is always hovering in the background. ‘The world will just expand,’ she says. ‘It’s as if you can ask Google as easily as you could ask a friend, only the friend is all-knowing, right?’”

Sending Traffic To Publishers Is A Necessary Evil

The article offers seemingly contradictory statements about how Google sees its relationship with the web ecosystem. An unnamed former Googler is quoted as saying that “giving” traffic to publishers is a necessary evil.

“Giving traffic to publisher sites is kind of a necessary evil. The main thing they’re trying to do is get people to consume Google services,” the former executive says. “So there’s a natural tendency to want to have people stay on Google pages, but it does diminish the sort of deal between the publishers and Google itself.”

What Current Googlers Say

At the Google Search Central Live event in New York City, I had the opportunity to have a private conversation with a Googler about Google CEO Sundar Pichai’s inability to articulate what Google does to support the web ecosystem. The Googler told me that they’ve heard Sundar Pichai express a profound recognition of Google’s relationship with publishers and said that it’s something he reflects on seriously.

That statement by the Googler was echoed in the article by comments from Liz Reid and Sundar Pichai:

“Reid says that Google cares deeply about publishers and that AI Overviews is a jumping-off point for users to conduct further research on the open web. Pichai, for his part, stresses the need to send ‘high-quality’ traffic to websites, instead of making users click around on sites that may not be relevant to them.

‘We are in the phase of making sure through this moment that we are improving the product, but in a way that prioritizes sending traffic to the ecosystem,’ he says, adding, ‘That’s been the most important goal.’”

Takeaways

  • Google is reshaping Search based on user behavior, not top-down mandates. But the fact that OpenAI’s ChatGPT pushed Google into rolling out its answer shows that forces other than user behavior are in play as well.
  • The traditional search bar is becoming less central, replaced by voice (largely on mobile devices) and visual search (also mobile). Google is multimodal, meaning it operates across multiple modes, such as audio and visual. Publishers should think hard about how that affects their business and how to make their own content multimodal, so that it is already there as users evolve and Google evolves to meet them.
  • AI Overviews and possibly the Gemini Personal AI Assistant could signal a shift toward Google acting as an ambient presence, not a destination.
  • Google’s relationship with publishers has never been more strained. The disconnect between public-facing statements and those of anonymous ex-Googlers sends a signal that Google needs to be more out front about its relationship with publishers. For example, Google’s Search Central videos used to be interactive sessions with publishers, then gradually dried up into scripted questions and answers, and now they’re gone entirely. Although I believe what the Googler told me about Pichai’s regard for publishers because I know them to be truthful, the appearance that Google’s search relations team has retreated behind closed doors sends a louder signal.
  • Google leadership emphasizes commitment to sending “high-quality traffic” to websites. But SEOs and publishers are freaking out that traffic is lower and the sentiment may be that Google should consider a little more give and a lot less take.

Hat tip to Glenn Gabe for calling attention to this article.

Featured Image by Shutterstock/photoschmidt

The Problem with Optimizing for GenAI

Optimizing for visibility in generative AI platforms seems easy enough. Enter a sitemap URL into Bing Webmaster Tools and Search Console, and, voilà, Microsoft Copilot, Google Gemini, and even ChatGPT access and reference the content.
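
As a practical aside, if you want to script that step rather than use the web UI, here is a minimal sketch in Python that submits a sitemap through the Search Console API; the property URL, sitemap path, and service-account file are hypothetical placeholders, and it assumes the service account has been granted access to the property.

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    # Hypothetical property and sitemap URLs; replace with your own.
    SITE_URL = "https://www.example.com/"
    SITEMAP_URL = "https://www.example.com/sitemap.xml"

    # A service account with access to the Search Console property is assumed.
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",
        scopes=["https://www.googleapis.com/auth/webmasters"],
    )
    service = build("searchconsole", "v1", credentials=creds)

    # Submit the sitemap so Google can discover the URLs it lists.
    service.sitemaps().submit(siteUrl=SITE_URL, feedpath=SITEMAP_URL).execute()
    print(f"Submitted {SITEMAP_URL} for {SITE_URL}")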

But Clint Butler reminds us it’s not that simple. He says the problem is knowing which platform to optimize, as the tactics differ. And rarely do those platforms mention the source, much less link to it. Google, incredibly, uses one source to power AI Overviews but links to others in the citations.

Clint is a decorated, 20-year military veteran who now runs Digitalteer, a prominent search-engine-optimization agency. I spoke with him last month on the state of AI search, ecommerce optimization tactics, schema markup, and more.

Our entire audio dialog is embedded below. The transcript is edited for clarity and length.

Eric Schwartzman: What’s working now for search engine optimization?

Clint Butler: The elephant in the room is artificial intelligence, large language models, and the types of content you can make with them — text, video, audio — and how business owners can leverage that within their marketing, including search.

You and I are old school. We’ve written content by hand. Now we can put our knowledge into the models and get a nice content base. It doesn’t take us as long to push out articles because we can edit, review, and publish them more efficiently.

Schwartzman: Is it more about using AI to generate content, or should we optimize our existing content to appear in LLMs such as ChatGPT, Claude, or Gemini?

Butler: The problem with that is knowing which model to optimize. If you’re going for the Microsoft version, admission is simple. Add your stuff to Bing Webmaster Tools, and your site is in Copilot.

As far as I can tell, Google uses its top-ranking organic pages to generate AI Overviews. But it then links to other pages for citations!

It’s worth asking whether achieving a top organic ranking is worth it. Google should fix that citation practice.

Schwartzman: As for LLMs, we’ve got OpenAI, Gemini from Google, and Llama from Facebook. Presumably content would enter all of the training sets if it hit LinkedIn, Bing Webmaster Tools, and the web for Google, as well as Facebook and X.

Butler: That’s another misnomer. Let’s say you have an ecommerce site selling scented candles. You publish a lot of content about scented candles.

Just because the AI bots crawl you doesn’t mean you will be in the data sets or properly cited. Gemini is the most prominent AI to get in front of people because it powers AI Overviews.

And, again, just because Google uses your content doesn’t mean you get a link. The rules vary depending on the LLM. Bing and Copilot are a bit better at citing the source. You would more likely get a click to your scented candles out of Copilot than Google’s Gemini.

Schwartzman: Are you saying SEO becomes less important? Should merchants look at other channels for traffic?

Butler: The SEO priority for ecommerce merchants is Google Merchant Center. That’s what Google uses to populate many search snippets and product carousels where AI Overviews typically don’t appear.

Ecommerce merchants trying to generate traffic and awareness through informational marketing may have a problem, although the latest data that I’ve seen from AccuRanker and Sistrix suggests it’s not insurmountable if you’re in the top three listings. The click-through rate for those placements is only down from 26% to 24% with AI Overviews.

So long as you’re in the top three, you’re okay.

Schwartzman: You work with ecommerce clients. What are their common SEO mistakes?

Butler: It depends on the platform, but product names and category optimization are the two big ones. Say you have a scented Halloween candle and call it “Eric’s Freaky Friday Halloween Candle” versus simply “Jasmine Candle.”

People search for jasmine-scented candles but not for your fancy name. Use that fancy name in your product description, but not in the product name.

Schwartzman: Is there a balance? Say I’m searching “jasmine candle,” and the SERP choices are “Jasmine Candle” and “Jasmine Candle: Eric’s Freaky Friday Jasmine Candle.” I’m likely going with the one that isn’t so vanilla.

Butler: That’s true. But the problem is that Google will likely truncate after “Jasmine Candle.” Searchers won’t see the “Freaky Friday” part. Experiment instead with inserting sizes or maybe even colors.

Schwartzman: What is a healthy click-through rate for a query such as “scented candles”?

Butler: Search Console suggests 2% to 5% is good, but keep in mind where that data comes from. It’s not 100% accurate. In my experience, take what you see in Search Console with a grain of salt. If you want 100% accurate data, run a Google Ads campaign.

Search Console is useful for basic decisions, however.

Schwartzman: You’re an expert on Schema.org structured data. You offer a course on how to write schema. You’ve even ranked blank pages just with schema. But it’s not practical for ecommerce merchants with large catalogs to write advanced schema for each item. What should they do?

Butler: Much of the advanced product variant data on Google search results come from Merchant Feed, not from schema inserted by the seller. So the first step for merchants is setting up Google Merchant Feed.

Beyond that, merchants can use schema selectively for rich snippets. Say a seller has 100 products, and 10 generate most of the revenue. Implement a nice product schema on just those 10.

There are helpful tools, too. Shopify users can leverage its Google & YouTube tool. Fill in the fields — pricing, shipping, categories, imagery — and the tool will populate Google Merchant Feed, which, again, drives SERP carousels.
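
To make that selective approach concrete, here is a minimal sketch in Python that assembles a schema.org Product block for one hypothetical top seller and prints the JSON-LD snippet to place on that product page; all product details and URLs are made up for illustration.

    import json

    # Hypothetical details for one top-selling product; swap in real values.
    product = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Jasmine Candle",
        "description": "Hand-poured jasmine-scented soy candle.",
        "image": "https://www.example.com/images/jasmine-candle.jpg",
        "sku": "CANDLE-JASMINE-8OZ",
        "brand": {"@type": "Brand", "name": "Example Candle Co."},
        "offers": {
            "@type": "Offer",
            "url": "https://www.example.com/products/jasmine-candle",
            "price": "24.00",
            "priceCurrency": "USD",
            "availability": "https://schema.org/InStock",
        },
    }

    # Print the JSON-LD snippet to place in the product page's HTML.
    print('<script type="application/ld+json">')
    print(json.dumps(product, indent=2))
    print("</script>")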

Schwartzman: How can readers get in touch?

Butler: My agency is Digitalteer.com. I’m on LinkedIn and X.

Checking In: Where Are We With The Google Lawsuits? via @sejournal, @Kevin_Indig

We’ve been focused on the impact of AI on Search and how to still make gains in this competitive and volatile SEO world.

It’s easy to forget that two big lawsuits are deciding potential remedies against Google soon, which could affect organic traffic and the search landscape.

I know most of you don’t spend your free time reading up on antitrust law – I certainly don’t.

But the truth is, the rulings in these Google cases could impact your website’s traffic – which is a big deal for any business trying to grow online.

On one hand, a weakened Google could open the door for AI chatbots and other new players to shake up the landscape.

On the other, a strengthened Google would solidify its position as the gatekeeper of customer acquisition.

Image Credit: Lyna ™


Context

There are many lawsuits against Google, some of which come with the territory of being one of the biggest companies in the world.

However, two prominent cases stand out because they have the power to transform the company:

1. “Search Monopoly Lawsuit: United States v. Google LLC (2020)”

  1. The claim: Google unlawfully maintains monopolies in the search and search advertising markets by using exclusive agreements (e.g., with Apple) and paying device manufacturers to make Google the default search engine.
  2. Demanded remedies: Google should divest Chrome and Android (or remove mandatory Google services from Android), terminate exclusive agreements, add choice screens, and share data with competitors. Important: the judgment would last 10 years from its effective date, with the potential for early termination under certain circumstances.1
  3. Expected judgment: August 2025.

2. “Digital Advertising Lawsuit: United States v. Google LLC (2023)”

  1. The claim: Google has unlawfully monopolized key digital advertising technologies and markets, including ad exchanges and publisher ad servers. Google engages in exclusionary practices that stifle competition, such as acquiring competitors, manipulating auctions, and restricting publishers from using rival technology platforms.
  2. Demanded remedies: Google should sell Google Ad Exchange and Ad Publisher Server, allow advertisers and publishers to pick other services, and make auctions more transparent.
  3. Expected judgment: Early to mid-2025.

Each case would significantly change Google’s position if the remedies (consequences) come into effect. The cases are already attracting other lawsuits.

As I mentioned in my article, the Chegg lawsuit might have been strategically filed to build on top of these two DoJ lawsuits.

But the outcome of both cases is uncertain for two reasons: Trump and AI.

1. Administration

The Trump administration has cut the already dull teeth of many antitrust government bodies and runs a mafia/kleptocracy, which introduces a significant wildcard into both lawsuits.

  • Even though the first Google lawsuit started in Trump’s first term, Google understands that there is a chance the Trump administration will stop the DoJ lawsuit or weaken the remedies, and it will do everything in its power to lobby for that outcome. Google tried to persuade Trump with a one-million-dollar gift from Sundar Pichai and complied with a Trump executive order to remove DEI programs and hiring goals for federal contractors.
  • Trump has spoken out against a break-up at an event in Chicago in October: “If you do that, are you going to destroy the company? What you can do without breaking it up is make sure it’s more fair.”2
  • Trump appointed Republican Andrew Ferguson as the new chairman of the Federal Trade Commission (FTC), making it more likely to steer the outcome in his favor. However, in March, the DoJ reaffirmed its position on divesting Chrome despite pulling back its ask for Google to divest its investments in AI (e.g., Anthropic).3

2. Competition From AI

AI hasn’t just threatened Google Search but also leveled the playing field.

Many products, from Meta AI to ChatGPT & Co to Copilot, can answer questions now and are basically search engines, which means Google’s monopoly position could be questioned.

  • Judge Mehta, who rules in the Search Monopoly case, addressed this point: “AI cannot replace the fundamental building blocks of search, including web crawling, indexing, and ranking.”
  • However, he could change his mind based on the rapid growth of many AI chatbots and the fact that so many new ones pop up left and right.

Conclusion: A Small Chance For The Open Web

Google is doing well: Search revenue has grown to almost $200 billion in 2024, up from $175 billion the year before.

Search ads still make up over 50% of Alphabet’s revenue, YouTube is stable, and Cloud offsets the dropping Network revenue.

Alphabet still makes over 50% of revenue from Search ads (Image Credit: Kevin Indig)

  • Is Google doing well because people use it more or because advertisers have no alternative? Probably both: SparkToro found that Google searches grew by 20% YoY in 2024. At the same time, “42% said Google and search engines are becoming less useful.” Maybe the explanation of the paradox is that more searches result from users not finding what they want.
  • Google is still the biggest source of traffic by a big margin, even if AI ends up meaning 20-30% less referral traffic.

The lawsuits matter because, if the remedies go through, they could accelerate or decelerate traffic from Google.

How the search ecosystem would change if the remedies went through as proposed:

  • Google would lose a significant data advantage from Chrome, and competitors would benefit big time from Google’s data. I reported how Ecosia and QWANT are building their own search index to become independent of Google and Bing. Enforced data sharing would support this process and inspire competitors like DuckDuckGo.
  • Google would encounter a big hit in mobile traffic from Apple devices, similar to how it already sees a weakened market share in the EU (see the above article as well). Device manufacturers could pre-install different search engines, adding to the pain.
  • Google might make search better to compete harder, benefitting users. But it could also push Google to be even more aggressive about AI and send even less traffic to websites.

In my last post on the Search Monopoly case, I concluded that one scenario is most likely to happen:

Google must end its exclusivity deals immediately. Apple needs to let users choose a default search engine when setting up their devices. Google could get hefty fines for every year they keep the contract with Apple going.

Realistically, that still seems to be the most likely outcome. But for the web economy, it would be best if the judges in both lawsuits ruled against Google.


1 Source

2 Trump expected to shift course on antitrust, stop Google breakup

3 Trump’s Justice Department still wants to break up Google


Featured Image: Paulo Bobita/Search Engine Journal

Brand Visibility Is the New SEO

Brand visibility is becoming critical for AI search because consumers increasingly use ChatGPT, Gemini, Claude, and others for product and company recommendations.

As early as July 2023, a Capgemini study of consumers in 13 advanced economies (PDF) found that “generative AI tools such as ChatGPT are becoming the new go-to for 70% of consumers when it comes to seeking product or service recommendations, replacing traditional methods such as search.”

In other words, consumers’ use of genAI to discover brands and products is skyrocketing. Yet those platforms must be aware of your business or products to recommend them.

That’s where brand search comes into play. Search engines drive brand discovery for genAI platforms and research for humans.

In August 2024, SparkToro cited studies revealing that 99% of genAI users continued to use traditional search engines for details, such as specs, prices, and reviews. Hence consumers likely discover brands and products on genAI and go to Google, Bing, and others to learn more.

This trend will surely grow. The number and type of brand searches are now essential search metrics.

So what can we track and how?

Brand search volume

The easiest way to monitor brand visibility is via search engine data, tracking the number of queries on your company’s name and products.

There are a few options.

Search Console

Search Console’s Performance section reports brand queries. Click Search results > Add filter > Query. Then enter your brand name for the number of impressions and clicks from searchers.

Search Console reports impressions and clicks on any term, including a brand name.
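
If you prefer to pull the same numbers programmatically, here is a minimal sketch using the Search Console API in Python; the property URL, brand term, dates, and service-account file are hypothetical, and it assumes the account has read access to the property.

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    # Hypothetical property, brand term, and date range; replace with your own.
    SITE_URL = "https://www.example.com/"
    BRAND = "example brand"

    # A service account with read access to the property is assumed.
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    service = build("searchconsole", "v1", credentials=creds)

    request = {
        "startDate": "2025-02-01",
        "endDate": "2025-02-28",
        "dimensions": ["query"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "query",
                "operator": "contains",
                "expression": BRAND,
            }]
        }],
        "rowLimit": 250,
    }
    response = service.searchanalytics().query(siteUrl=SITE_URL, body=request).execute()

    # Total impressions and clicks across all queries containing the brand name.
    rows = response.get("rows", [])
    print("brand queries:", len(rows))
    print("impressions:", sum(r["impressions"] for r in rows))
    print("clicks:", sum(r["clicks"] for r in rows))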

Google Trends

Unlike Search Console, Google Trends tracks competitors’ brands, too. Type your brand name or a competitor’s for trends.

Google Trends tracks any term — yours and a competitor’s. This example shows “ahrefs.”

Google Trends will also report rising related queries that include a brand name. Glimpse, a freemium Chrome extension, overlays Google Trends data on important search metrics such as volume and long-tail queries containing your brand name and competitors’.

Google Trends reports rising queries that include a brand name, such as “ahrefs.”
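
There is no official Google Trends API, but pytrends, a widely used unofficial Python client, can pull interest-over-time and related-query data for scripting; the terms below are only examples, and Google may rate-limit these requests.

    from pytrends.request import TrendReq

    # Compare a brand against a competitor (example terms only).
    pytrends = TrendReq(hl="en-US", tz=0)
    pytrends.build_payload(["ahrefs", "semrush"], timeframe="today 12-m")

    # Weekly relative interest (0-100) for each term.
    interest = pytrends.interest_over_time()
    print(interest.tail())

    # Rising and top related queries that include the brand name.
    related = pytrends.related_queries()
    print(related["ahrefs"]["rising"])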

Semrush

Semrush includes search volume for any keyword (versus a group of keywords in Google Ads reports) over time.

Use the “Keyword overview” tool in Semrush’s dashboard and select months and years in the dropdown menu.

Select months and years in Semrush’s “Keyword Overview” tool.

Brand search impressions

Fewer search engine queries now result in clicks. Thus the number of brand impressions — searchers who view your organic brand listings — is all-important.

Use Search Console’s “Query” filter to track those impressions.

Track brand-name impressions in Search Console.

Google’s March Core Update: Early Observations From Initial Rollout via @sejournal, @MattGSouthern

Google’s March 2025 Core Update, announced on March 13th and expected to complete its rollout this week, is creating turbulence in search results according to multiple industry tracking tools.

Data from Local SEO Guide and SISTRIX indicate this may be a highly impactful update.

“Most Volatile” SERPs in 12 Months

According to tracking data from Local SEO Guide, which monitors 100,000 home services keywords, the week of March 10th showed the highest SERP volatility observed in over a year. This aligns with Google’s official announcement of the March Core Update on March 13th.

SISTRIX data confirms these findings, with its Google Update Radar showing movement beginning March 16th across both the UK and US markets. The company monitors one million SERPs daily to track the update’s impact.

Winners & Losers

Local SEO Guide identified several clear winners and losers in their tracking data. Sites gaining the most visibility include:

  • ThisOldHouse.com
  • Reddit.com
  • Yelp.com
  • HomeDepot.com
  • Quora.com

Conversely, sites experiencing the most significant drops in visibility include:

  • DIYChatroom.com
  • GarageJournal.com
  • Bluettipower.com
  • Everfence.com
  • MrHandyMan.com

SISTRIX’s analysis revealed additional impacted domains in the UK market, with significant losses for quora.com (-15.76%), vocabulary.com (-10.93%), and expedia.co.uk (-20.60%). Government sites weren’t spared either, with hmrc.gov.uk showing a dramatic 52.60% visibility decrease.

Retail Sector Impact

The retail sector has seen interesting shifts. SISTRIX data shows that notonthehighstreet.com experienced a 56.28% visibility increase in UK searches, while uniqlo.com saw a 76.12% gain.

On the negative side, several retailers lost ground, with zara.com dropping 24.00%, amazon.com declining 13.84%, and diy.com falling 7.75% in visibility.

Key Trends Emerging

Andrew Shotland, CEO of Local SEO Guide, identified several potential patterns in this update:

1. Forum Content Devaluation

Two forums, DIYChatroom and GarageJournal, saw visibility drops despite having experienced a 1,000%+ increase over the past year.

Shotland notes this may not be a direct demotion, but Google is elevating sites like Reddit alongside features like Discussions and Forums widgets and Popular Products grids.

2. Fight Against AI-Generated Content

Sites like Bluettipower.com, which appears to have created thousands of data-driven pages likely using AI, have seen visibility declines. Other sites with “kitchen-sink, made-it-for-SEO” content are similarly affected.

3. Cross-Sector Impact

Unlike some updates targeting specific niches, this core update affects sites across various sectors, including retail, government, forums, and content publishers.

What’s Next

Google has provided little information about the improvements to its search algorithm in this core update. The full effects may not be clear until the rollout is complete.

Google’s March Core algorithm update is still rolling out. Search Engine Journal will monitor changes and offer updates as more information becomes available. Please continue sending in your reports.


Featured Image: eamesBot/Shutterstock

Google Provides Timeline To Improve Publishers’ Search Visibility via @sejournal, @MattGSouthern

Google has publicly committed to December 31 as a deadline for improving how independent publishers appear in search results.

This timeline emerged during an exchange on X between Danny Sullivan, Google’s Search Liaison, and several concerned publishers.

A Turning Point for Independent Publishers?

The exchange began with Jonathan Jones sharing notes from a discussion where Google addressed concerns about independent content creators.

According to Jones’ post, Sullivan acknowledged Google’s need to “reward sites better” and expressed interest in helping “smaller independent sites to succeed.”

What made this conversation notable was publisher Nate Hake’s push for accountability, which resulted in Google providing a deadline, something the company typically avoids when discussing ranking improvements.

“Can we take that to mean ‘December 31, 2025’ (if not before)?” Hake asked directly.

“Yes,” responded Google’s Search Liaison, adding the caveat that “this doesn’t mean all sites will go back up to wherever they were if they are down from a previous peak.”

Long-Standing Frustrations Come to a Head

The exchange highlighted the tension between Google and independent publishers, who have seen their search visibility decline in recent years.

“Honestly, everything you are saying sounds exactly like what you said when we visited Google HQ in October,” Hake wrote. “Same words, same inaction.”

Hake then detailed what he claims Google has done since October: “reduced independent publisher visibility even more” while continuing “to preference Reddit, Quora, and the 16 VC-backed media companies.”

Others joined the conversation, expressing similar frustrations with Google’s communication style. Mordy Oberstein characterized Google’s guidance as “ethereal” and “anything but concrete and consistent,” noting that publishers need more precise models of “what good sites look like.”

Google’s Response: Gradual Improvements, Not a Single Update

In response to these criticisms, Sullivan explained that improvements would be incremental rather than delivered in one major update:

“There’s no specific date because there’s no one specific thing that the teams are working on to improve. There are multiple things, because search has multiple things that are involved in ranking.”

He added:

“There have been some changes already launched with that goal. Some sites may have benefited from them; others might not, but that’s also because the sites themselves are all different.”

Sullivan acknowledged the need for better guidance, stating:

“I’d like to see us do a better job with guidance and documentation focused on content issues to add to our existing stuff that’s primarily about technical issues.”

Why This Matters

Many publishers have reported traffic declines following recent Google updates, with some claiming visibility has dropped despite maintaining high-quality content.

As Google’s March Core Update continues to roll out, publishers are anxious to see if it will resolve their ranking issues.

Some websites might notice changes with this update. However, we can expect improvements for more publishers by December.

Sullivan’s commitment is a small but notable victory for those who have pushed for greater transparency and accountability from Google.

Google Search Central Live NYC: Insights On SEO For AI Overviews via @sejournal, @martinibuster

Danny Sullivan, Google’s Search Liaison, shared insights about AI Overviews, explaining how predictive summaries, grounding links, and the query fan-out technique work together to shape AI-generated search results.

Optimizing For AIO

Danny Sullivan shared insights into how AI Overviews are generated, helping explain why Google may link to websites that don’t match the typical search results. While the links can differ, he emphasized that the fundamentals of search optimization remain unchanged.

This is what Danny Sullivan said, based on my notes:

“The core fundamental things haven’t really changed. If you’re doing things that are making you successful on search, those sorts of things should transfer into some of the things that you see in the generative AI kind of summaries.”

Google Explains Why AIO Results Are Different

One of the main takeaways from this part of Danny’s presentation was his explanation of why Google AIO search results are different. It is the clearest explanation to date of why AIO search results differ from the organic results, and every SEO and publisher needs to know it.

He introduced two concepts to become familiar with in order to better understand AIO search results.

  1. Predictive Summaries
  2. Grounding Links

Predictive Summaries

Danny solved the mystery of why AIO search results show content and links that differ from what the organic search results show, a difference that makes it harder to understand how to optimize for that kind of AI search result.

He shared that the reason for that kind of AIO is something called predictive summaries. Predictive summaries answer a search query but also try to predict related variations of what a user will want to see next. This sounds a lot like Google’s Information Gain patent, which is about predicting the next question a searcher may ask after reading the answer to their current question. The Information Gain patent is strictly within the context of AI Search and AI Assistants.

Here is what he said, according to my notes:

“One thing I think that people find really confusing sometimes is that they’ll do a query and especially you’ll see …these are the top 10 results, but I don’t see them in the AIO, what’s going on?

And it’s like, yeah, the query in the search box is the same query, but the model that’s going out there to try to understand what to show is kind of an overview, going beyond just the top 10 results. It’s understanding a lot of results and it’s understanding a lot of variations that you might kind of get and so that it’s coming back and it’s trying to provide its predictive summary of what the query is related to.”

Grounding Links

Sullivan also revealed that “grounding links” are another reason why AIO search results are different from the regular organic search results. An AIO search result is a summary of a topic that includes facts about multiple subtopics. The purpose of grounding is to anchor the entire summary to verifiable information from the web ecosystem.

In the context of AIO, grounding is the process of confirming the factual authenticity of the AI summaries so that a searcher can click to read about any subtopic discussed in the answer summary provided by AIO. This is the second reason why the links in AIO show a variety not normally seen in the organic search results.

One way to look at this is that the links are more contextual than the regular ten blue links of the organic search results. These contextual links are also referred to as qualified clicks or qualified links: links that are hyper-specific and generally more relevant than organic search results.

Danny appears to say that the grounding links are created from searches that are related to the initial search query but are not the same. For example, if you want to explain how a conventional automobile runs, you need information about the powertrain, which is made up of a gas combustion engine, a transmission, the axles, and so on. Answering a complex question requires grounding from a wide array of information sources.

According to my notes, this is how Danny Sullivan explained it:

“And then on top of that, it’s then also trying to bring in the grounding links. And those grounding links, because it kind of comes from a broader set aren’t just going to match. The queries are going to be different and the overall set is going to be different.

Which is why it’s a great opportunity for diversity and whatever our query thing is that we say, but that’s why you can see different things that are showing there.”

Don’t Mess Up Your Rankings

Sullivan cautioned against trying to rank for both the organic results and the different parts of the AIO summaries, saying that it’s likely to “mess things up” because “it doesn’t really work like that.”

Query Fan-Out Technique

Danny Sullivan also touched on the topic of AI Mode, saying that right now it’s not really something to optimize for because it’s still in Google Labs and is very likely to change, or become something different, if it ever gets out of Labs.

But he did say that AI Mode uses something called a query fan-out technique.

He said:

“…one of the things they talk about is like ‘we use an advanced query fan out technique with multiple related queries in it…’ And it’s basically that what I said before.

You issued a query. You try to understand the variations and things that are related. which by the way is not that much different to how search works at the moment even when you didn’t have the AI elements to it. Because when you would issue a query now we try to understand synonyms, we try to understand the meaning of the entire query. If it’s a sentence, we try to match it in all sorts of different ways …because sometimes it just brings you better results.”
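
To make the fan-out idea concrete, here is a toy sketch in Python. It is not Google’s implementation: it simply expands a query into related variations, retrieves results for each, and pools the deduplicated sources so a summary could be grounded in a broader set of pages than the top 10 for the original query alone. The expand and retrieve functions are hypothetical stand-ins.

    # A toy illustration of the fan-out idea (not Google's implementation):
    # expand the user's query into related variations, retrieve results for
    # each, and pool the sources so the final summary can be grounded in a
    # broader set of pages than the top 10 for the original query alone.
    from typing import Callable

    def fan_out(query: str,
                expand: Callable[[str], list[str]],
                retrieve: Callable[[str], list[dict]],
                per_query: int = 5) -> list[dict]:
        """Collect candidate grounding sources for a query and its variations."""
        candidates: dict[str, dict] = {}
        for q in [query] + expand(query):
            for result in retrieve(q)[:per_query]:
                # Deduplicate by URL, keeping the first (highest-ranked) hit.
                candidates.setdefault(result["url"], {**result, "matched_query": q})
        return list(candidates.values())

    # Hypothetical stand-ins for a query-expansion model and a search backend.
    def expand(query: str) -> list[str]:
        return [f"how does a {query} work", f"{query} parts explained", f"{query} maintenance"]

    def retrieve(query: str) -> list[dict]:
        slug = query.replace(" ", "-")
        return [{"url": f"https://example.com/{slug}/{i}", "title": query} for i in range(5)]

    sources = fan_out("conventional automobile powertrain", expand, retrieve)
    print(len(sources), "candidate grounding sources")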

Takeaways:

Google Search Liaison, aka Danny Sullivan, encouraged the use of core SEO fundamentals, saying that they are still relevant for ranking. Danny explained why the links in AI Overviews can sometimes differ significantly from those in the organic search results, introducing three concepts that help make sense of AIO search results.

Three concepts related to AIO search results to understand:

  1. Predictive Summaries
  2. Grounding Links
  3. Query Fan-Out Technique

Google Business Profile Suspensions Rise, But Appeals Are Delayed

Google Business Profile (GBP) suspensions have steadily risen since January, and as they increase, appeal resolution times have grown significantly – from about five days to nearly five weeks.

As a Platinum Product Expert in the Google Business Profile Support Forums, I help small businesses navigate Google’s GBP platform.

For many of these SMBs, Google is a primary lead driver, and when their listing goes south due to a suspension, life can get very hard.

Image from author, March 2025

What We Know

We noticed in February that forum complaints about suspensions reached their highest level since last August.

Users typically try support first, but when frustrated, a subset of them find their way to the forums to see what their next steps are.

The forums provide a canary-in-the-coal-mine function allowing outside observers to understand problems that Google Business Profiles are currently experiencing.

Based on the current daily average posting rate, we estimate that March’s suspension-related posts will surpass February’s total. (Image from author, March 2025)

The weekly influx of these posts has not slowed. In fact, it’s accelerating.

Posts peak on either Mondays or Tuesdays, when business owners return to the office to deal with their suspensions and appeals.

You can see in the following chart that these daily high points and weekly totals continue to increase.

Image from author, March 2025

Why It’s Happening

Many others who manage local profiles report an uptick in suspensions. The exact reason for this increase remains unclear.

As usual, Google has provided no explanation, despite overwhelming demand on both support channels and the forum.

When Google updates the algo, perhaps to increase trust in its listings, suspensions seem to be triggered – even when the user makes only minor changes to the profile.

Unfortunately, we do not yet know which attributes Google is finding unacceptable.

Google staffs its support for the “typical” level of suspensions. Appeals of those suspensions are handled by humans, and a suspension increase can cause the staff to get further and further behind.

These delays contribute to some of the “noise” we are seeing in the forum. At this point, the appeal process is taking somewhere on the order of 4 weeks or more, not the ~5 days noted by Google.

Image from author, March 2025

Bulk & API Accounts Also Impacted

GBP Bulk and API accounts – where a single corporate account can add new locations or change information in bulk, with minimal additional verification – have been impacted as well.

Several bulk and API account managers report that individual listings within bulk and API accounts now require manual re-verification even after minor edits, creating massive headaches for the corporate marketing teams trying to re-verify a listing in Peoria.

Late yesterday, Google confirmed the re-verification issues in a statement on the forum.

Our research indicates that the problem started much earlier than last week, and we are not convinced that the problem is yet solved.

Image from author, March 2025

However, Yext was reporting continuing issues on their system update page.

Image from author, March 2025

Don’t Make Changes

We strongly advise not making any changes to your listing at this time.

It appears Google does not yet have a handle on whatever is causing the increase in suspensions and re-verifications.

It obviously doesn’t have a handle on dealing with the large number of appeals. Thus, if your listing gets suspended, you will experience significant delays in getting reinstated.

We recommend you pause making any changes to your individual, API-managed, and bulk listings, at least until Google clarifies the issues or, more importantly, until support addresses the appeals backlog.



Featured Image: voronaman/Shutterstock

Q2 SEO & AI Update: How To Track & Optimize AI Search Performance [Webinar] via @sejournal, @hethr_campbell

Are you ready to ensure your SEO tools deliver the most accurate, real-time data in 2025 and beyond?

With the constant evolution of Google’s search algorithms, staying ahead requires not just adapting to changes but anticipating them. Join us on March 27, 2025, for our in-depth webinar, “Q2 SEO & AI Update: How To Track & Optimize AI Search Performance,” where we’ll explore how SEO professionals and B2B companies can effectively navigate these changes.

In this session, you’ll learn:

  • SERP Data Accuracy: Why precise SERP data is essential for SEO success and strategic decision-making.
  • AI-Driven Search Impact: How AI is changing SEO tracking and the adaptive strategies you need to employ.
  • Evaluating SEO Data Providers: Key factors for assessing the reliability and long-term viability of your data sources.

Exclusive Insights from Industry Experts

Gain insights from Bright Data’s top specialists, who will share actionable strategies that you can implement immediately to keep your data accurate and powerful using APIs. Prepare for an enlightening discussion on adapting your strategies to leverage AI for enhanced SEO visibility.

Live Q&A: Get Your Questions Answered

The webinar concludes with a LIVE Q&A session, allowing you to ask our hosts about specific challenges and opportunities in optimizing for AI-enhanced SERPs.

Don’t Miss Out!

Prepare yourself for the future of SEO data collection and strategy with this powerful new webinar. Embrace the tools and knowledge necessary to dominate the new landscape of AI-driven search results.

Can’t attend live? No worries—register anyway, and we’ll send you the recording to ensure you don’t miss out on these crucial insights.

Google Explains Why Indexed Pages May Not Appear In Search via @sejournal, @MattGSouthern

Google’s Martin Splitt explains why indexed pages may not appear in search results, highlighting relevance and ranking competition.

  • Indexed pages may not appear if other pages are more relevant or user engagement is low.
  • Google’s process involves discovery, crawling, indexing, and ranking for visibility.
  • Focus on high-quality, user-focused content to improve search visibility.