Stop Focusing On Google, It’s Time to Focus On Being Visible via @sejournal, @wburton27

Remember when “Just Google it” was the solution to all your search needs? Unfortunately, those days are changing fast.

While Google remains the king of search, the ground beneath its feet is shifting as brands, marketers, and end users are noticing that there are some new sheriffs in town.

Putting all your eggs in the Google basket might not be a wise move anymore.

Today’s search landscape isn’t just about algorithm updates and being visible on Google. It’s about recognizing that your audience exists across multiple touchpoints: traditional search engines like Google and Bing, AI chatbots, social platforms, and specialized marketplaces like Amazon.

The businesses that thrive won’t be the ones waiting to see what happens; they will be the pioneers already establishing a strong presence across this expanding universe of search.

Google Still Dominates, But It Is Being Challenged

Google is facing some competition.

StatCounter shows Google’s global search share dropped below 90% and remained there throughout the last quarter of 2024, marking the first such decline in nearly a decade.

This shift coincides with significant legal headwinds.

In 2025, Google faces multiple antitrust challenges, with a judge recently finding that Google has a monopoly in search and has acted to maintain it.

These legal troubles might cause Google to change its business practices and may have an impact on its market dominance, allowing other social and AI platforms to capture more of Google’s market share.

This does not mean that Google is going down; it just signifies that Google is no longer the only game in town, and relying on Google alone could be increasingly risky.

For example, if you’re an ecommerce retailer that generates 60-80% of your traffic from Google and your site’s visibility temporarily drops during a core update, say, over thin AI-generated content, you would be in big trouble.

If your marketing strategy does not have any alternative traffic sources, your revenue could potentially decrease by 40% or more in a matter of weeks.

Meanwhile, if your competitors have diversified their digital presence across multiple platforms, including AI shopping assistants and social commerce channels, they might experience only minor fluctuations in their traffic and sales.

It’s An Omnichannel World

Your audience does not think in terms of platforms; they think in terms of their needs.

For example, a user might ask ChatGPT for information on sustainable materials, browse Instagram for some home design inspiration, check Amazon for product comparisons, and then Google specific brands before making a purchase.

This changing customer journey means that businesses must be acutely aware of where their traffic originates and how much traffic comes from various sources.

The days of checking only Google Analytics for Google traffic are over.

In order to succeed, you must have a holistic view of your visibility across the entire digital landscape.

For example, my friend Claudia has an outdated kitchen and is looking to get a new one after 20 years of living in her home with her family.

Here is what Claudia’s journey looked like in this new ecosystem:

  • Claudia started out by going to ChatGPT, typing in “best kitchen design brands,” and finding information mentioning several designer brands.
  • Since the intent behind kitchen design is image-based, Claudia then searched Pinterest for visual inspiration and saved some images from the designer brands she had found in ChatGPT.
  • Claudia then looked to Reddit to gather feedback about specific brands and learn from others’ experiences.
  • She checked YouTube for installation tutorials but decided she needed a professional.
  • Claudia then Googled local contractors with high ratings and reviews, contacted one of them, and got a quote.
Screenshot from ChatGPT, April 2025

Now, if you’re a business that is focused only on Google, guess what? You would not win Claudia and clients like her, because you would miss multiple touchpoints in their user journeys; she searched across different channels and platforms. You must have content that reinforces your brand at every stage.

Don’t Fall Behind

The time to adopt an omnichannel strategy is now: stay ahead of trends, experiment with different platforms, and maintain strong performance on established channels like Google so you won’t be left behind.

Imagine if the following scenarios were to occur; what would happen to your business?

  • You lose 30% of your traffic overnight.
  • You can’t find where your customers spend time before they make purchase decisions.
  • You’re not visible on ChatGPT, Bing, YouTube, Reddit, etc.

One of the brands I consulted with in the financial industry noticed that retirement planning questions were being asked on AI platforms as well as on Google.

We created a comprehensive, citation-rich content strategy that got them mentioned in some major financial publications.

When users searched for retirement planning in ChatGPT, their brand was mentioned as a source, which drove leads and conversions.

Screenshot from ChatGPT, April 2025

AI Works Differently Than Traditional Search

AI chatbots like ChatGPT don’t work like Google’s algorithm. They don’t rank websites; instead, they gather information and identify authoritative sources.

If you want to be visible in ChatGPT, then you need to change your approach.

  • Being a recognized name in your industry increases your chances of being mentioned.
  • Being featured across multiple platforms strengthens your authority and increases your visibility in AI chatbots.
  • Getting referenced by other respected sources helps build trust.
  • Have clear, conversational, and structured content that AI chatbots can reference and find.
  • Be active in social communities like Reddit.
  • Build trust and credibility through positive reviews and ratings.

ChatGPT and other AI chatbots and platforms look more broadly at the digital ecosystem and get information from Quora, Reddit, social media, forum conversations, and reviews.

AI chatbots also understand long-tail queries in a more nuanced way than traditional search.

It’s All About Balance

People are not running for the hills and abandoning Google.

Google remains the king and is likely to retain its market share leadership for the foreseeable future, but things are changing.

To succeed today, you must implement an omnichannel SEO strategy, maintain a strong Google presence, and be where your audience is.

Wrapping Up

Search engines like Google will continue to evolve, alongside ChatGPT, social platforms, and other search technologies that are expected to emerge in the coming years.

But, the days of relying only on Google as your primary digital marketing channel are behind us.

Brands that are discoverable, credible, and helpful will be successful, wherever their audiences seek information.

Brands that win in 2025 won’t be asking “How do we rank better on Google?” but rather “How do we ensure we are visible on every channel and have content that resonates and answers all our customers’ questions?”

This shift in perspective, from platform-centric to audience-centric, is the true key to sustainable digital success.

Featured Image: 3rdtimeluckystudio/Shutterstock

30-Year SEO Pro Shows How To Adapt To Google’s Zero-Click Search via @sejournal, @martinibuster

Search marketer Michael Bonfils recently discussed how AI is disrupting search marketing and shared insights into what he feels is an appropriate response to one of the most difficult search environments he’s seen in his thirty years of experience.

Michael Bonfils (LinkedIn profile) has worked in digital marketing since virtually the dawn of it all, well before Google even existed. He’s a leading international digital marketer with experience across every aspect of digital marketing, from on-page SEO to digital advertising. Michael joined Gianluca Fiorelli (LinkedIn profile) on the Advanced Web Ranking podcast and shared his insights on the challenges AI is bringing to digital marketing and novel ideas for how to navigate them.

Brutal Environment For Digital Marketing

Gianluca mentioned there’s a perception gap with AI where on one side are marketers who are heralding the end of SEO and PPC and on the other side are the “AI bros” who cheerlead that everything is going to become even better, with better leads from ChatGPT, etc.

Bonfils shook his head and said:

“It’s neither going to be a disaster and it’s neither going to be an AI paradise.”

Gianluca asked him what trends he’s seeing. Michael responded that the main trend he’s seeing is that click volume has gone down since the introduction of AI. He said that in other periods when volume was down, click-through rates went up, as during the pandemic. But that’s not happening now: click-through rates are down, volume is down, but CPCs are at historic highs.

Michael observed,

“But now, …the level we’re at now is the worst time since 2019 during the pandemic and prior to that it was never that bad.

…If you want throw the CPC factor in, the CPC’s are historically higher than they have been for years. So now we’ve got this perfect problem, click through rates down, volume down, CPC’s up. What does that mean? ROI is getting hit and clients are leaning on organic to try to make up for whatever shortfall there is and they can’t find it, they can’t find the traffic.

So to answer your question, …now that we’re going into Europe with AI overviews, are they impacting things? One hundred percent. And they’ll continue to change. “

Later on they discussed how a lot of what Google is doing is reactionary, a response to external pressures from companies like Perplexity AI and OpenAI, and the search industry is caught in the middle of it.

AI Overviews Lead To Loss Of Strategic Data

Michael Bonfils discusses how AI Overviews lead to zero-click behavior, and while most SEOs stop right there, Michael points out that this situation affects the data available to marketers and, as a consequence, impacts content strategy.

Use IndexNow For AI Search And Shopping SEO via @sejournal, @martinibuster

Microsoft Bing published an announcement stating that the IndexNow search crawling technology is a powerful way for ecommerce companies to surface the latest and most accurate shopping-related information in AI Search and search engine shopping features.

Generative Search Requires Timely Shopping Information

Ecommerce sites typically depend on merchant feeds, search engine crawling, and updates to Schema.org structured data to communicate what’s for sale: new products, retired products, changes to prices, availability, and other important details. Each of those methods can be a point of failure due to slow crawling by search engines and inconsistent updating, which can delay the correct information from surfacing in AI search and shopping features.

IndexNow solves that problem. Content platforms like Wix, Duda, Shopify, and WooCommerce support IndexNow, a Microsoft technology that speeds up the indexing of new or updated content. Pairing IndexNow with Schema.org structured data ensures fast indexing so that the correct information surfaces in AI Search and shopping features.
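As an illustrative sketch of the protocol (not code from the announcement): an IndexNow submission is a single HTTP POST of a JSON body listing the changed URLs, authenticated by a key file you host on your own domain. The host, key, and URLs below are placeholders.

```python
import json

def build_indexnow_payload(host, key, urls, key_location=None):
    """Build the JSON body for an IndexNow submission.

    Per the IndexNow protocol, this body is POSTed with a
    Content-Type of application/json to an endpoint such as
    https://api.indexnow.org/indexnow, and the key must match a
    text file hosted on your site (e.g. https://<host>/<key>.txt).
    """
    payload = {"host": host, "key": key, "urlList": list(urls)}
    if key_location:  # optional pointer to the key file's URL
        payload["keyLocation"] = key_location
    return payload

# Placeholder site and key -- substitute your own values.
payload = build_indexnow_payload(
    "www.example-shop.com",
    "0123456789abcdef",
    [
        "https://www.example-shop.com/products/blue-mug",    # price changed
        "https://www.example-shop.com/products/old-kettle",  # retired product
    ],
)
print(json.dumps(payload, indent=2))
```

Submitting a URL this way only signals that it changed; search engines still fetch the page itself, which is where the Schema.org markup below comes in.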

IndexNow recommends the following Schema.org Product Type properties:

  • “title (name in JSON-LD)
  • description
  • price (list/retail price)
  • link (product landing page URL)
  • image link (image in JSON-LD)
  • shipping (especially important for Germany and Austria)
  • id (a unique identifier for the product)
  • brand
  • gtin
  • mpn
  • datePublished
  • dateModified
  • Optional fields to further enhance context and classification:
  • category (helps group products for search and shopping platforms)
  • seller (recommended for marketplaces or resellers)
  • itemCondition (e.g., NewCondition, UsedCondition)”
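To sketch how that list maps onto JSON-LD (note that in Schema.org vocabulary, price, availability, itemCondition, seller, and shipping details sit on a nested Offer rather than on the Product itself, and “title,” “link,” and “image link” correspond to name, url, and image), a product page might embed something like the following inside a <script type="application/ld+json"> tag. All names, URLs, and identifiers here are placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Ceramic Mug",
  "description": "A 350 ml dishwasher-safe ceramic mug.",
  "image": "https://www.example-shop.com/img/blue-mug.jpg",
  "url": "https://www.example-shop.com/products/blue-mug",
  "sku": "MUG-350-BLU",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "gtin": "00012345600012",
  "mpn": "MUG-350-BLU",
  "category": "Kitchen > Drinkware",
  "datePublished": "2025-01-15",
  "dateModified": "2025-04-01",
  "offers": {
    "@type": "Offer",
    "price": "12.99",
    "priceCurrency": "EUR",
    "itemCondition": "https://schema.org/NewCondition",
    "availability": "https://schema.org/InStock",
    "seller": { "@type": "Organization", "name": "Example Shop" },
    "shippingDetails": {
      "@type": "OfferShippingDetails",
      "shippingRate": { "@type": "MonetaryAmount", "value": "4.99", "currency": "EUR" },
      "shippingDestination": { "@type": "DefinedRegion", "addressCountry": "DE" }
    }
  }
}
```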

Read more at Microsoft Bing’s Blog:

IndexNow Enables Faster and More Reliable Updates for Shopping and Ads

Featured Image by Shutterstock/Paper piper

From Search To Discovery: Why SEO Must Evolve Beyond The SERP via @sejournal, @alexmoss

The search landscape is undergoing its biggest shift in a generation.

If you’ve been in SEO long enough to remember the glory days of the all-organic search engine results pages (SERP), you’ll know how much of this real estate has been gradually taken over by paid ads, other first-party products, and rich snippets.

Now, the most aggressive transition of all: AI Overviews (as well as search-based large language model platforms).

At BrightonSEO last month, I explored how this evolution is forcing us to rethink what SEO means and why discoverability, not just ranking, is the new north star.

The “Dawn” Of The Zero-Click Isn’t Just Over – It’s Now Assumed

We’ve been reading about the rise of zero-click searches for some time now, but this “takeover” has been much more noticeable over the past 12 months.

I recently searched [how to teach my child to tell the time] and had to scroll a good three pages down the SERP, past a parade of paid product ads, Google-owned assets, and AI Overview summaries.

Google and other search and discovery platforms want to keep users in their ecosystems. For SEO pros, this means traditional metrics such as click-through rate (CTR) are becoming less valuable by the day.

From Answer Engines To Assistant Engines

LLMs have changed not just the way a result is displayed to the user, but also turned the traditional browser-based search flow into a multi-step flow that the native SERP simply cannot support in the same way.

The research process is collapsing into a single, seamless exchange.

Traditional flow vs. multi-step flow. Image used with permission from Alain Schlesser, May 2025

But as technology accelerates, our own curiosity and research skills are at risk of declining or disappearing completely.

Assistant engines and wider LLMs are the new gatekeepers between our content and the person discovering that content – our potential “new audience.”

They parse, consume, understand, and then synthesize content, and that synthesis decides what they mention to the people (or agents) they interact with.

Structured data is still crucial, as context, transparency, and sentiment matter more than ever.

Personal LLM agent flow diagram by Alain Schlesser, used with permission, May 2025

Challenges Are Different, But Also The Same

As SEOs, our challenges with this new behavior affect the way we do – and report on – our jobs.

In reality, many are just old headaches in shiny new wrappers:

  • Attribution is a mess: With AI Overviews and LLMs synthesizing content, it’s harder than ever to see where your traffic comes from – or if you’re getting any at all. Some tools out there do monitor this, but it’s early days and no standard has emerged. Even Google has said it has no plans to add AI Overview insights to Search Console.
  • Traffic is fragmenting (again): We saw this with social media platforms at the beginning, where discovery happened outside the organic SERPs. Discovery is now happening everywhere, all at once. With attribution also harder to ascertain, this is a bigger challenge today.
  • Budgets are under scrutiny from fear, uncertainty, and doubt (FUD): The native SERP is changing too much, so some may assume there’s less (or no) value in doing SEO much anymore (untrue!).

The Shift Of Success Metrics

The days of vanity-led success metrics are coming to an end.

Similar to how our challenges are the same but different, this also applies to how we redefine success metrics:

Old hat → new hat:

  • Content → Context + sentiment
  • Keywords → Intent
  • Brand → Brand + sentiment
  • Rankings → Mentions
  • Links from external sources → Citations across various channels
  • SERP monopoly → Share of voice
  • E-E-A-T → Still E-E-A-T
  • Structured data → Entities, knowledge graph & vector embeddings
  • Answering → Assisting

What Can You Do About It?

Information can be aggregated, but personality can’t. This is why it’s still our responsibility to “assist the assistant” so it considers and includes you as part of that aggregated information and synthesized answer.

  • Stick to the fundamentals: Never neglect SEO 101.
  • Manage third-party perception: Third-party perspective is increasingly important, so make sure it is maintained and managed well to ensure positive brand sentiment.
  • Embrace structured data: Even if some say it’s becoming less crucial for LLMs to understand entities, structured data is being used right now inside major LLMs to output structured data within responses, giving them an established and standardised way to understand your content.
  • Educate stakeholders: Shift the conversation from rankings and clicks to discoverability and brand presence. The branded unlinked mention suddenly has more value than “acquiring X followed, non-branded anchor text links per month.”
  • Experiment with your content: Try new ways to produce and market your content beyond the traditional written word. Here, video is useful not only for humans but also for LLMs, which are now “watching” and understanding videos to inform their responses.
  • Create helpful, unique content: To add to the above, don’t produce for the sake of production.

LLMs.txt: The Potential To Be The New Standard

Keep an eye on emerging standards proposals such as llms.txt, which is one way some are adapting and contributing to how LLMs ingest our content, beyond the traditional approaches offered by robots.txt and XML sitemaps.

While some are skeptical about this standard, I believe it is still worth implementing now, given its potential benefits for the future.

There is virtually no risk in implementing something that takes little time or resources to produce, so long as you do so with a white-hat approach.
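For reference, the llms.txt proposal describes a plain markdown file served at the site root: an H1 with the site name, a short blockquote summary, and sections of annotated links (with an optional “Optional” section for lower-priority URLs). A minimal sketch, with all names and URLs invented:

```markdown
# Example Shop

> Example Shop sells sustainable kitchen fixtures and publishes design
> guides. The links below are the most useful pages for understanding
> our products and policies.

## Products

- [Product catalog](https://www.example-shop.com/products.md): full list with prices and availability
- [Shipping policy](https://www.example-shop.com/shipping.md): regions, rates, and delivery times

## Optional

- [Company history](https://www.example-shop.com/about.md): background on the brand
```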

Conclusion: Embrace Discoverability And New Metrics

SEO isn’t dead. It’s expanding, but at a rate we haven’t experienced before.

Discoverability is the new go-to success metric, but it’s not without flaws, especially as the way we search continues to change.

This is no longer about “ranking well.” This is now about being understood, surfaced, trusted, and discovered across every platform and assistant that matters.

Embrace and adapt to the changes, as they are going to continue for some time.

Featured Image: PeopleImages.com – Yuri A/Shutterstock

Googler’s Deposition Offers View Of Google’s Ranking Systems via @sejournal, @martinibuster

A Google engineer’s redacted testimony, published online by the U.S. Justice Department, offers a look inside Google’s ranking systems, giving an idea of Google’s quality scores and introducing a mysterious popularity signal that uses Chrome data.

The document offers a high level and very general view of ranking signals, providing a sense of what the algorithms do but not the specifics.

Hand-Crafted Signals

For example, it begins with a section about the “hand crafting” of signals, which describes the general process of taking data from quality raters, clicks, and so on, and applying mathematical and statistical formulas to generate a ranking score from three kinds of signals. “Hand crafted” means scaled algorithms tuned by search engineers; it doesn’t mean that they are manually ranking websites.

Google’s ABC Signals

The DOJ document lists three kinds of signals that are referred to as ABC Signals and correspond to the following:

  • A – Anchors (pages linking to the target pages),
  • B – Body (search query terms in the document),
  • C – Clicks (user dwell time before returning to the SERP)

The statement about the ABC signals is a generalization of one part of the ranking process. Ranking search results is far more complex and involves hundreds if not thousands of additional algorithms at every step of the ranking process, from indexing, link analysis, anti-spam processes, personalization, re-ranking, and other processes. For example, Liz Reid has discussed Core Topicality Systems as part of the ranking algorithm and Martin Splitt has discussed annotations as a part of understanding web pages.

This is what the document says about the ABC signals:

“ABC signals are the key components of topicality (or a base score), which is Google’s determination of how the document is relevant to the query.

T* (Topicality) effectively combines (at least) these three signals in a relatively hand-crafted way. Google uses to judge the relevance of the document based on the query terms.”
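The deposition doesn’t disclose any actual formula, but the general notion of hand-crafting a base score from a few interpretable signals can be illustrated with a deliberately simplified toy function. The weights, the 0-1 signal values, and the linear form are all invented for illustration:

```python
def topicality_score(anchor, body, clicks, weights=(0.3, 0.4, 0.3)):
    """Toy sketch only: combine three normalized (0-1) signal values --
    anchors, body matches, click behavior -- into one base score using
    hand-picked weights. Real systems use far more complex, tuned curves."""
    wa, wb, wc = weights
    return wa * anchor + wb * body + wc * clicks

# A page with strong body relevance but middling click behavior:
score = topicality_score(anchor=0.8, body=0.9, clicks=0.6)
print(round(score, 2))  # 0.78
```

The appeal of a hand-crafted form like this, per the testimony, is debuggability: when rankings break, engineers can inspect each term.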

The document offers an idea of the complexity of ranking web pages:

“Ranking development (especially topicality) involves solving many complex mathematical problems. For topicality, there might be a team of engineers working continuously on these hard problems within a given project.

The reason why the vast majority of signals are hand-crafted is that if anything breaks Google knows what to fix. Google wants their signals to be fully transparent so they can trouble-shoot them and improve upon them.”

The document compares their hand-crafted approach to Microsoft’s automated approach, saying that when something breaks at Bing it’s far more difficult to troubleshoot than it is with Google’s approach.

Interplay Between Page Quality And Relevance

An interesting point revealed by the search engineer is that page quality is independent of the query. If a page is determined to be high quality and trustworthy, it’s regarded as trustworthy across all related queries, which is what is meant by the word static: it’s not dynamically recalculated for each query. However, there are relevance-related signals in the query that can be used to calculate the final rankings, which shows how relevance plays a decisive role in determining what gets ranked.

This is what they said:

“Quality
Generally static across multiple queries and not connected to a specific query.

However, in some cases Quality signal incorporates information from the query in addition to the static signal. For example, a site may have high quality but general information so a query interpreted as seeking very narrow/technical information may be used to direct to a quality site that is more technical.

Q* (page quality (i.e., the notion of trustworthiness)) is incredibly important. If competitors see the logs, then they have a notion of “authority” for a given site.

Quality score is hugely important even today. Page quality is something people complain about the most…”

AI Gives Cause For Complaints Against Google

The engineer states that people complain about quality, and says that AI is making the situation worse.

He says about page quality:

“Nowadays, people still complain about the quality and AI makes it worse.

This was and continues to be a lot of work but could be easily reverse engineered because Q is largely static and largely related to the site rather than the query.”

eDeepRank – A Way To Understand LLM Rankings

The Googler lists other ranking signals, including one called eDeepRank, an LLM-based system that uses BERT, a language model.

He explains:

“eDeepRank is an LLM system that uses BERT, transformers. Essentially, eDeepRank tries to take LLM-based signals and decompose them into components to make them more transparent. “

That part about decomposing LLM signals into components seems to be a reference to making LLM-based ranking signals more transparent so that search engineers can understand why the LLM is ranking something.

PageRank Linked To Distance Ranking Algorithms

PageRank is Google’s original ranking innovation, and it has since been updated. I wrote about this kind of algorithm six years ago. Link distance algorithms calculate the distance from authoritative websites for a given topic (called seed sites) to other websites in the same topic. Sites that are further away from their respective seed sites are determined to be less trustworthy, while sites closer to the seed set are likelier to be more authoritative and trustworthy.

This is what the Googler said about PageRank:

“PageRank. This is a single signal relating to distance from a known good source, and it is used as an input to the Quality score.”

Read about this kind of link ranking algorithm: Link Distance Ranking Algorithms
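The idea described above can be sketched as a toy breadth-first traversal in which a score decays with each hop away from the trusted seed set. The site names, decay factor, and scoring function are invented for illustration; this is not Google’s implementation:

```python
from collections import deque

def link_distance_scores(graph, seeds, decay=0.5):
    """Toy link-distance scoring: sites reachable in fewer hops from
    trusted seed sites score higher. `graph` maps each site to the
    sites it links to; all site names here are made up."""
    distance = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        site = queue.popleft()
        for linked in graph.get(site, []):
            if linked not in distance:  # first (shortest) path wins
                distance[linked] = distance[site] + 1
                queue.append(linked)
    # Score decays with each hop from the seed set;
    # unreachable sites simply get no score at all.
    return {site: decay ** d for site, d in distance.items()}

graph = {
    "seed-medical-site.org": ["well-cited-clinic.com"],
    "well-cited-clinic.com": ["local-practice.com"],
    "local-practice.com": [],
    "unlinked-site.com": [],
}
scores = link_distance_scores(graph, seeds=["seed-medical-site.org"])
print(scores)
```

Here the seed scores 1.0, the site it links to scores 0.5, the site two hops out scores 0.25, and the unlinked site gets no score, mirroring the intuition that distance from the seed set stands in for trust.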

Cryptic Chrome-Based Popularity Signal

There is another signal whose name is redacted that’s related to popularity.

Here’s the cryptic description:

“[redacted] (popularity) signal that uses Chrome data.”

A plausible claim can be made that this confirms that the Chrome API leak is about actual ranking factors. However, many SEOs, myself included, believe that those APIs are developer-facing tools used by Chrome to show performance metrics like Core Web Vitals within the Chrome Dev Tools interface.

I suspect that this is a reference to a popularity signal that we might not know about.

The Google engineer does refer to another leak of documents that reference actual “components of Google’s ranking system,” but says they don’t contain enough information for reverse engineering the algorithm.

They explain:

“There was a leak of Google documents which named certain components of Google’s ranking system, but the documents don’t go into specifics of the curves and thresholds.

For example
The documents alone do not give you enough details to figure it out, but the data likely does.”

Takeaway

The newly released document summarizes a U.S. Justice Department deposition of a Google engineer that offers a general outline of parts of Google’s search ranking systems. It discusses hand-crafted signal design, the role of static page quality scores, and a mysterious popularity signal derived from Chrome data.

It provides a rare look into how signals like topicality, trustworthiness, click behavior, and LLM-based transparency are engineered and offers a different perspective on how Google ranks websites.

Featured Image by Shutterstock/fran_kie

How Referral Traffic Undermines Long-Term Brand Growth via @sejournal, @martinibuster

Mordy Oberstein, a search marketing professional whom I hold in high esteem, recently shared the provocative idea that referral traffic is not a brand’s friend and that every brand, as it matures, should wean itself from it. Referrals from other websites are generally considered a sign of a high-performing business, but relying on them is not a long-term strategy because it depends on sources that cannot be controlled.

Referral Traffic Is Necessary But…

Mordy Oberstein (LinkedIn profile), formerly of Wix, asserted in a Facebook post that relying on a traffic source, whether that’s another website or a search engine, offers a degree of vulnerability to maintaining steady traffic and performance.

He broke it down as a two-fold weakness:

  • Relying on the other site to keep featuring your brand.
  • Relying on Google to keep ranking that other site, which in turn sends visitors to your brand.

The flow of traffic can stop at either of those two points, which is a hidden weakness that can affect the long-term sustainability of healthy traffic and sales.

Mordy explained:

“It’s a double vulnerability…

1) Relying on being featured by the website (the traffic source)
2) Relying on Google to give that website …traffic (the channel)

There are two levels of exposure & vulnerability.

As your brand matures, you want to own your own narrative.

More referral traffic is not your friend. It’s why, as a brand matures, it should wean off of it.

Full disclosure, this is my opinion. I am sure a lot of people will disagree.”

Becoming A Destination

I’ve always favored promoting a site in a way that helps it become synonymous with a given topic, because that’s how to make it a default destination and encourage the kinds of signals that Google interprets as authoritative. I’ve done things like creating hats with logos to give away, annual product giveaways, and other promotional activities, both online and offline. While my competition was doing SEO busywork, I created fans. Promoting a site is basically just getting it in front of people, both online and offline.

Brand Authority Is An Excuse, Not A Goal

Some SEOs believe in a concept called Brand Authority, which is a misleading explanation for why a website ranks. The term Brand Authority is not about branding, and it’s not about authoritativeness, either. It’s just an excuse for why a site is top-ranked.

The phrase Brand Authority has its roots in PageRank. Big brand websites used to have a PageRank of 9 out of 10, or even 10/10, which enabled them to rank for virtually any keywords they wanted. A link from one of those sites practically guaranteed a top-ten ranking. But Google ended the outsized influence of PageRank because it resulted in less relevant results. That was around 2004-ish, about the time Google started using Navboost, a ranking signal that essentially measures how people feel about a site, which is what PageRank does, too.

This insight, that Google uses signals about how people feel about a site, is important because the feelings people have for a business are what being a brand is all about.

Marty Neumeier, a thought leader on how to promote companies (author of The Brand Gap) explained what being a brand is all about:

“Instead of creating the brand first, the company creates customers (through products and social media), the customers build the brand (through purchases and advocacy), and the customer-built brand sustains the company (through “tribal” loyalty). This model takes into account a profound and counterintuitive truth: a brand is not owned by the company, but by the customers who draw meaning from it. Your brand isn’t what you say it is. It’s what they say it is.”

Neumeier also explains how brand is about customer feelings:

“The best brands are vivid. They create clear mental pictures and powerful feelings in the minds and hearts of customers. They’re brought to life through their touchpoints, the places where customers experience them, from the first exposure to a brand’s name, to buying the product, to eventually making it part of who they are.”

That “tribal loyalty” is the kind of thing Google tries to measure. So when Danny Sullivan talks about differentiating your site to make it like a brand, he is not referring to so-called “brand authority.” He is talking about doing the kinds of things that influence people to feel positive about a site.

Getting Back To Mordy Oberstein

It seems to me that what he’s saying is that referral traffic is a stepping stone, a means to an end. It’s not the goal; it’s a step toward the goal of becoming a destination.

On the other side of that process, I think it’s important to maintain relevance with potential site visitors and customers, especially today with the rapid pace of innovation, generational change, new inventions, and new product models. Relevance to people has been a Google ranking signal for a long time, beginning with PageRank, then with additional signals like Navboost.

The factor that the SEO industry has largely missed is getting people to think positive thoughts about your site and your business, enough to share with other people.

Mordy’s insight about traffic is beautiful and elegant.

Read Mordy’s entire post on Facebook.

Featured Image by Shutterstock/Yunus Praditya

Google Clarifies: AI Overview Links Share Single Position In Search Console via @sejournal, @MattGSouthern

Google’s John Mueller has clarified that all links within AI Overviews (AIOs) share a single position in Google Search Console.

SEO consultant Gianluca Fiorelli asked Mueller how Search Console tracks position data for URLs in Google’s AI-generated answer boxes.

Mueller referenced Google’s official help docs, explaining:

“Basically an AIO counts as a block, so it’s all one position. It can be first position, if the block is shown first, but I don’t know if AIO is always shown first.”

This indicates that every website linked in an AI Overview receives the same position value in Search Console reports.

This occurs regardless of where the link appears in the overview panel, whether immediately visible or hidden until a user expands the box.

What Google’s Documentation Says

Google’s Search Console Help docs explain how AI Overview metrics work:

  • Position: “An AI Overview occupies a single position in search results, and all links in the AI Overview are assigned that same position.”
  • Clicks: “Clicking a link to an external page in the AI Overview counts as a click.”
  • Impressions: “Standard impression rules apply. To be counted as an impression, the link must be scrolled or expanded into view.”

The docs also note:

“Search Console doesn’t include data from experiments in Search Labs, as these experiments are still in active development.”
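One practical consequence of the shared position is that Search Console data cannot separate AIO appearances from standard listings. As a rough illustration, here is a pure-Python sketch over rows shaped like the Search Console API’s `searchanalytics.query` response (the query strings and numbers are invented; the API exposes no AI Overview dimension, so an AIO link and a classic #1 listing can both report position 1):

```python
# Rows shaped like the Search Console API's searchanalytics.query response.
# Values are made up for illustration only.
sample_rows = [
    {"keys": ["example query a"], "clicks": 12, "impressions": 400, "position": 1.0},
    {"keys": ["example query b"], "clicks": 5, "impressions": 220, "position": 1.0},
    {"keys": ["example query c"], "clicks": 3, "impressions": 150, "position": 4.2},
]

def clicks_by_position(rows):
    """Sum clicks per rounded position. Because every link in an AI Overview
    inherits the block's single position, a position-1 bucket can mix AIO
    links and standard #1 listings."""
    buckets = {}
    for row in rows:
        bucket = round(row["position"])
        buckets[bucket] = buckets.get(bucket, 0) + row["clicks"]
    return buckets
```

Running `clicks_by_position(sample_rows)` lumps the two position-1 rows together, which is exactly why AIO-specific click performance cannot be isolated from this report.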

The Missing Data Behind Google’s Click Claims

This discussion highlights an ongoing debate in the SEO community regarding the performance of links in AI Overviews.

Lily Ray, Vice President of SEO Strategy & Research at Amsive, recently pointed out Google’s year-old claim that websites receive more clicks when featured in AI Overviews, stating:

“I would love to see a single GSC report that confirms this statement, because every study so far has shown the opposite.”

Ray’s statement reflects the concerns of many SEO professionals, as Google has not provided data to support its claims.

Looking Ahead

While we now understand how position metrics are recorded, the question remains: Do AI Overview placements drive more or less traffic than traditional search listings?

Google claims one thing, but many people report different experiences.

Since all AIO links share the same position, it’s difficult to determine which specific placements perform better.

This debate highlights the need for more precise data about how AIOs affect website traffic compared to regular search results.


Featured Image: Roman Samborskyi/Shutterstock

The Rise Of Privacy-First Search Engines via @sejournal, @TaylorDanRW

Google has long held a firm grip on the search engine landscape, but that dominant veneer is starting to show cracks.

In recent months, regulatory scrutiny, public mistrust, and rising anxiety around AI have pushed digital privacy into the spotlight.

Millions of users are now evaluating their relationship with “big tech” and actively seeking alternatives, prioritizing trust and anonymity.

What was once relegated to being a niche concern is now a broader user shift, with privacy-first search engines gaining momentum across various demographics.

The Privacy Shift

Recent stats clearly show that people are becoming more privacy-aware and want greater control.

Norton reports that 85% of users globally want tighter reins on their data.

In the U.S., over 87% of voters back restrictions on the sale of personal data without consent, while 86% support limits on what companies can collect in the first place.

That awareness is turning into action.

A 2024 study found that 51% of users between 18 and 24 actively take steps to protect their digital footprint, a shift that is showing up in how people search, with visible changes in platform choice and behavior.

DuckDuckGo, Brave, And The “Privacy Engine” Movement

DuckDuckGo is at the forefront of this change. Since its launch in 2008, it’s grown into a major player with over 100 million daily searches.

Brave Search, integrated into the privacy-focused Brave browser, is also gaining ground. Built on an index from its own crawler and a number of “crowd-sourced” sources, it is committed to independent, unbiased results.

Brave reflects the demand for tools that serve users rather than advertisers.

These platforms highlight a growing appetite for search options among users who reject surveillance and value their own agency.

The Rise Of New Privacy Engines

Awareness around data tracking has driven more users to seek out search engines that don’t rely on surveillance-based business models.

Traditional engines like Google and Bing have come under fire for harvesting user data to fuel targeted advertising.

In contrast, privacy-first search engines are gaining traction by rejecting tracking, behavioral profiling, and data retention, offering users more control and transparency over how their search activity is handled.

While DuckDuckGo is the front-runner among privacy-focused search engines, there are a number of other players in this category. To better understand them, I reached out to their teams to dig deeper than the information publicly available online.

Swisscows

Image from author, May 2025

One rising contender is Swisscows, a Switzerland-based engine that recently marked its 10-year milestone.

It’s more than a search engine; it’s a whole ecosystem with encrypted messaging, secure cloud storage, VPN services, and an AI-powered summary tool focused on keeping user data private.

With roughly 25 million searches per month and a user base spanning Switzerland, the U.S., and Germany, Swisscows stands out for filtering out adult and violent content, making it popular among educators and families.

Its results come from its own index and Brave, chosen for their privacy-first approach.

“We don’t personalize or profile users,” the team told me. “That means more neutral, manipulation-free search results.”

Swisscows is also investing in semantic search and AI, aiming not to build chatbots but to improve information discovery and trend insights, hinting at a more ethical path for AI in search.

Startpage

Another major player is Startpage, which operates out of the Netherlands and handles billions of searches yearly. The company has also rolled out a private browsing app.

Startpage also doesn’t engage in user profiling. That means no tracking, no cookies by default, and no storing of IP addresses.

Users get results sourced from Google and Bing, but without the data collection that typically comes with them.

“People are simply done with being watched,” said the Startpage team. “As AI becomes more embedded in search, the demand for privacy is only increasing. Trust depends on clear policies and a commitment to not compromise user rights.”

Mojeek

Then there’s Mojeek, an independent engine with its own index and server infrastructure.

Unlike privacy-conscious tools that piggyback off bigger indexes, Mojeek runs its stack out of one of the UK’s most sustainable data centers.

By 2022, its index had hit 6 billion pages, a sizable feat for a standalone engine.

Mojeek doesn’t store search histories, use cookies, or track users. It delivers the same results to everyone, providing a transparent alternative to mainstream engines’ personalization-heavy approaches.

It’s also the default choice on several privacy-oriented browsers, like Privacy Browser, and is integrated into Pale Moon, SerenityOS, and Kagi Search.

What’s Fuelling The Shift?

This movement isn’t just about escaping ads or dodging trackers; it’s about reclaiming control.

AI-driven tools like ChatGPT, Google’s AI Overviews, and Bing AI are reshaping search by relying more on user data than ever.

As AI becomes more integrated into search engines, privacy becomes a central point of differentiation.

At the same time, regulatory pressure is intensifying. Governments are pushing back on unchecked data use, from the GDPR and the Digital Services Act in Europe to the proposed American Privacy Rights Act.

By the end of 2024, modern data protection laws were expected to cover three-quarters of the global population, reflecting a worldwide demand for stricter safeguards.

Optimizing For Privacy Search Engines

To optimize for privacy-first search engines like Swisscows and Startpage, marketers need to rethink their strategies.

Standard SEO tactics that depend heavily on tracking user behavior don’t hold up well when personalization is limited.

Instead, the focus shifts to a deeper understanding of the audience, what questions they’re asking, how they phrase them, and the intent behind their searches.

Creating content that directly answers real user needs, keeping the site structure intuitive, and using language that clearly reflects search intent all become central.

Without behavioral tracking, insight must come from sources like on-site search data, user reviews, forum conversations, and direct feedback.

In this space, winning in SEO is less about gaming the system and more about delivering practical, trustworthy information in a straightforward way.

The Future Of Search Is Changing

Traditional search engines are increasingly wrapped up in advertising and AI, while privacy-first options are emerging as safer, more ethical alternatives.

Whether it’s Swisscows with its commitment to content integrity or Startpage delivering Google-quality results without the tracking, these platforms represent a new direction shaped by more informed, privacy-conscious users.


Featured Image: Thapana_Studio/Shutterstock

Local SEO: How To Make More Customers Click, Choose & Walk Through Your Doors [Webinar] via @sejournal, @hethr_campbell

How do you turn local searches into real foot traffic?

If your business relies on being found locally, clicks alone aren’t enough. You need future customers to choose you and show up.

Whether you’re managing search visibility, local listings, or digital customer experience, this session will help you turn more searches into measurable visits and offline conversions.

Join us for “Local SEO: How To Make More Customers Click, Choose & Walk Through Your Doors” on Wednesday, May 28 at 2 PM ET. We’ll explore real consumer behavior and how it shapes your local SEO strategy.

Why This Webinar Is Worth Your Time:

Based on consumer research from over 2,000 individuals across the UK, US, France, and Germany, this session will give you a clear picture of what makes people take action.

In this session, you’ll learn: 

✅ What gets consumers to choose one business over another.
✅ Actionable tips to optimize local SEO strategies across Google, Apple, voice search, AI tools & more.
✅ How to improve visibility, clarity, and trust across every location you manage.
✅ Digital signals that matter most to consumers.

Presented by Krystal Taing (VP) and Paul Modaley (Content Marketing Manager) at Uberall, this event is built for businesses that want to capture more high-intent traffic and convert it into real-world outcomes across any industry.

What Makes This Session Different:

You won’t hear guesses or theories. 

You’ll walk away with real data and proven strategies based on how people search, decide, and shop in your area.

Let’s help you drive results for your local and multi-location brick-and-mortar businesses.

Can’t make it live? Sign up anyway, and we’ll send the full recording to your inbox.

How To Increase Google Discover Visibility Naturally Using These Ranking Signals via @sejournal, @rollerads

This post was sponsored by RollerAds. The opinions expressed in this article are the sponsor’s own.

Want more visibility in Google Discover?

Not sure how to get into Google’s personalized news feeds?

Discover isn’t like search. You don’t rank for keywords.

You get selected.

And that means the best way to get featured isn’t to optimize for keywords; it’s to optimize for specific algorithmic signals.

In this guide, we’ll cover the core ranking signals that help Google determine which content belongs in Discover feeds, and how you can naturally boost those signals using tools like push notifications.

Google Discover Optimization Tips: Which Signals Tell Google Your Content Belongs in Discover?

Google Discover uses a different algorithm from traditional search results.

While it still considers many of the same quality indicators, Discover visibility depends less on keywords and more on how your content performs in the real world.

Here are the most important content quality signals for Discover.

1. E-E-A-T: Experience, Expertise, Authoritativeness, Trust

A good rule of thumb is to follow the “E-E-A-T” guideline:

  • Experience: Firsthand, real-world familiarity with the subject.
  • Expertise: Deep knowledge and skill in your content niche.
  • Authoritativeness: Recognition from other trusted sources.
  • Trustworthiness: Accurate, unbiased, and reliable information.

2. Engagement Metrics

These tell Google your content resonates with users and may be worth promoting more widely.

3. Strong Visuals & Headlines

Discover is highly visual, so if your content doesn’t stand out immediately, users are likely to scroll past it.

Take time to polish headlines so they grab attention, but make sure they accurately reflect the content of your article or post.

Engaging headlines, images, and videos perform better, especially when those assets are optimized for mobile.

4. Technical SEO & Mobile Optimization

While you don’t need to “rank” per se, you do need a well-optimized site, which includes:

  • Fast load times: Consider page speed and overall efficiency. Use PageSpeed Insights to make sure your pages load quickly.
  • Mobile-friendly layouts: Google Discover is only available on mobile devices, as there is currently no desktop version.
  • Structured data: Google relies on structured data to categorize content and surface relevant suggestions for users. Adding structured data markup helps Google recognize and categorize your content, attracting more engaged and relevant users.
  • Internal linking & link building: These help you build your own network of content. Older articles count too, since they can serve as gateways to newer pieces.
  • RSS or Atom Feed: Allow users to follow you to receive updates quickly. Google generates a feed for you automatically, but you can connect your own.
  • Google Web Stories: Similar to Instagram, these stories appear under the Visual Stories banner on mobiles and serve to expand your reach. Stories are easy to create, engaging, interactive, and fun.
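To make the structured-data point above concrete, here is a minimal sketch that generates a schema.org Article object as a JSON-LD script tag (the headline, author, and URL are placeholders, not values from the article; validate real markup with Google’s Rich Results Test before relying on it):

```python
import json

def article_jsonld(headline, author, date_published, image_url):
    """Build a minimal schema.org Article object wrapped in a JSON-LD
    script tag, suitable for embedding in a page's <head>."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
        "image": [image_url],
    }
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"

# Hypothetical example values for illustration.
snippet = article_jsonld(
    "Example Headline", "Jane Doe", "2025-05-01", "https://example.com/hero.jpg"
)
```

A real implementation would typically be handled by your CMS or SEO plugin; the point is that the markup is plain JSON Google can parse to categorize the page.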

Track, test, improve. Use Google Search Console (GSC) to monitor your performance and statistics. Unlike Google Analytics, it has a dedicated tab for monitoring Google Discover traffic.

5. Freshness & Topical Relevance

Valuable content addresses and solves pain points.

For content to have a better chance of showing up in Discover feeds, it should be:

  • Accurate.
  • Timely.
  • Trending.
  • Helpful.
  • Continuously updated.

This is especially powerful if your content is tied to current events or spikes in interest, as shown in Google Trends.

To discover what users search for, try:

  • Google Search: Enter a query and scroll down to view related and popular requests.
  • Google Search’s Autocomplete: Start typing a search and observe the suggested autocomplete queries; these are the queries that many others regularly search for.
  • Google Trends: Identify how popular a content direction is in any part of the world. This is also great for identifying seasonality.

How Google Discover Works

1. Google Discover suggests your content, which should include all the positive signals mentioned above.
2. The Google app user engages with your content within Google Discover, adding to Google’s knowledge of how users interact with your website.
3. These engagements (visitor volume, time on page, user experience, etc.) indicate to Google that your content is well-suited for similar readers.
4. Google increases your reach and visibility on Google Discover.
5. Those new viewers engage with your content in a similar pattern.
6. The cycle repeats, spreading your optimized content to more Google Discover timelines.

This is known as a positive loop because your content consistently passes positive ranking signals back to Google’s Discover algorithm, thereby continuing to increase in engagement.

How Do I Create A Positive Loop & Show Up In Google Discover?

Now that you know what Google is looking for, here’s how to naturally boost those signals.

We know that Google Discover places your content based on:

  • High-clickthrough rates.
  • Long time-on-page.
  • Repeat visitors.

So, how can you increase those metrics?

By building a dedicated reader base that is always ready to consume your new content.

Push notifications are a great way to alert your dedicated readers that new content is out.

And that engagement feeds data back to the Google Discover algorithm.

How To Use Push Notifications To Boost These Google Signals

Many publishers avoid push notifications, believing they’re too promotional or might harm user experience (UX).

However, modern push notification platforms allow you to take a more hybrid approach, combining editorial updates with monetization to boost visibility.

Why Hybrid Push Notifications Help Boost Discover Visibility

Done right, push notifications help your content get discovered organically by:

  • Increasing CTR with a second wave of distribution.
  • Driving fast engagement shortly after publication.
  • Bringing back repeat readers to increase session depth.
  • Boosting behavioral signals that Google uses to judge quality.

In other words, push notifications support the very engagement metrics that can lead to more Discover visibility.
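One way to picture the hybrid approach is a simple scheduler that interleaves editorial and promotional messages so readers never see two promotions back to back. This is an illustrative sketch, not RollerAds’ actual logic, and the message strings are invented; real delivery would go through a push provider’s API:

```python
from itertools import zip_longest

def hybrid_schedule(editorial, promotional):
    """Interleave editorial and promotional pushes, editorial first,
    so promotional messages never stack consecutively."""
    queue = []
    for edit_msg, promo_msg in zip_longest(editorial, promotional):
        if edit_msg is not None:
            queue.append(("editorial", edit_msg))
        if promo_msg is not None:
            queue.append(("promo", promo_msg))
    return queue

# Hypothetical message lists for illustration.
schedule = hybrid_schedule(
    ["New guide: Discover ranking signals", "Case study: push + SEO"],
    ["Spring offer on subscriptions"],
)
```

With more editorial than promotional messages queued, the schedule naturally stays content-led, which is the balance the hybrid format aims for.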

When users receive a mix of informative and promotional pushes, each message feels fresh, encouraging clicks and boosting your CTR.

Higher engagement signals to Google that your content is valuable, increasing the chances of it being featured on Discover.

And since Discover traffic is largely made up of new visitors, each one becomes a fresh opportunity to grow your subscriber base.

Once users opt in, you can keep re-engaging them, creating a cycle of rising visibility, CTR, and traffic.

Image created by RollerAds, April 2025

How To Implement A Hybrid Push Format To Get On Discover Faster

In a recent case study, one RollerAds publisher increased their revenue from $0 to $60,000 per month by pairing great content with hybrid push notifications and Discover-optimized distribution. The key was creating content that signals quality and leveraging distribution to show it.

With a tool like RollerAds, you gain a streamlined way to:

  • Send personalized push notifications for your latest content.
  • Mix promotional and editorial messaging without spamming your readers.
  • Increase engagement, retention, and revenue simultaneously.

Simply register your site, get a custom strategy from your account manager, and start boosting content visibility without compromising user experience.

Even better? You can monetize this traffic directly with ad formats designed for Discover audiences: no intrusive pop-ups or poor user experience, just clear, engaging content with a side of revenue.

For SEJ readers, use the code SEJ30 to add +30% to your funds before July 1st, 2025.

Just show the code to your account manager on RollerAds before your first payment.

Getting featured on Google Discover isn’t just about luck; it’s about strategy.

From creating high-quality, relevant content to optimizing visuals, headlines, and mobile performance, every step counts. However, to truly stand out and amplify your chances, pairing content strategy with smart tools, such as hybrid push notifications from RollerAds, can make all the difference.

Engaging your audience through push updates not only drives more clicks but also signals content quality to Google, boosting your Discover reach. With the right monetization tools, you can convert that traffic into substantial revenue.

Image Credits

Featured Image: Image by RollerAds. Used with permission.

In-Post Image: Images by RollerAds. Used with permission.