Use IndexNow For AI Search And Shopping SEO via @sejournal, @martinibuster

Microsoft Bing published an announcement stating that the IndexNow search crawling technology is a powerful way for ecommerce companies to surface the latest and most accurate shopping-related information in AI Search and search engine shopping features.

Generative Search Requires Timely Shopping Information

Ecommerce sites typically depend on merchant feeds, search engine crawling, and updates to Schema.org structured data to communicate what’s for sale: new products, retired products, changes to prices, availability, and other important details. Each of those methods can be a point of failure; slow crawling by search engines and inconsistent updating can delay the correct information from surfacing in AI search and shopping features.

IndexNow solves that problem. Content platforms like Wix, Duda, Shopify, and WooCommerce support IndexNow, a Microsoft technology that speeds up the indexing of new or updated content. Pairing IndexNow with Schema.org structured data helps ensure fast indexing so that the correct information surfaces in AI Search and shopping features.
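For sites whose platform doesn’t handle this automatically, the IndexNow protocol is simple enough to script: you POST a JSON body listing the changed URLs, along with a verification key hosted on your own domain. The sketch below only builds the payload (the actual request is left commented out); the domain, key, and URL are placeholders you would replace with your own values.

```python
import json
import urllib.request  # used by the commented-out POST below

# Shared endpoint that forwards submissions to participating search engines.
INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow"

def build_indexnow_payload(host, key, urls):
    """Build the JSON body for a bulk IndexNow submission.

    The key must match the contents of a text file hosted at the
    keyLocation URL, which is how IndexNow verifies site ownership.
    """
    return {
        "host": host,
        "key": key,
        "keyLocation": f"https://{host}/{key}.txt",
        "urlList": urls,
    }

# Placeholder values for illustration only.
payload = build_indexnow_payload(
    "shop.example.com",
    "0123456789abcdef",
    ["https://shop.example.com/products/widget"],
)

# To actually notify search engines, POST the payload:
# req = urllib.request.Request(
#     INDEXNOW_ENDPOINT,
#     data=json.dumps(payload).encode("utf-8"),
#     headers={"Content-Type": "application/json; charset=utf-8"},
# )
# urllib.request.urlopen(req)
```

A submission like this would typically be triggered whenever a product’s price, availability, or description changes, so the updated page is recrawled quickly.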

IndexNow recommends the following Schema.org Product Type properties:

  • title (name in JSON-LD)
  • description
  • price (list/retail price)
  • link (product landing page URL)
  • image link (image in JSON-LD)
  • shipping (especially important for Germany and Austria)
  • id (a unique identifier for the product)
  • brand
  • gtin
  • mpn
  • datePublished
  • dateModified

Optional fields to further enhance context and classification:

  • category (helps group products for search and shopping platforms)
  • seller (recommended for marketplaces or resellers)
  • itemCondition (e.g., NewCondition, UsedCondition)
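Translated into schema.org’s JSON-LD vocabulary, a product page carrying those properties might look like the example below. All names, values, and URLs are placeholders, and the exact mapping (price inside an Offer, shipping via OfferShippingDetails) follows general schema.org conventions rather than a Bing-published template:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trail Running Shoe",
  "description": "Lightweight trail running shoe with a grippy outsole.",
  "image": "https://shop.example.com/images/trail-shoe.jpg",
  "url": "https://shop.example.com/products/trail-shoe",
  "productID": "SHOE-00123",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "gtin": "00012345678905",
  "mpn": "TRS-123",
  "itemCondition": "https://schema.org/NewCondition",
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock",
    "shippingDetails": {
      "@type": "OfferShippingDetails",
      "shippingDestination": {
        "@type": "DefinedRegion",
        "addressCountry": "DE"
      }
    }
  }
}
```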

Read more at Microsoft Bing’s Blog:

IndexNow Enables Faster and More Reliable Updates for Shopping and Ads

Featured Image by Shutterstock/Paper piper

From Search To Discovery: Why SEO Must Evolve Beyond The SERP via @sejournal, @alexmoss

The search landscape is undergoing its biggest shift in a generation.

If you’ve been in SEO long enough to remember the glory days of the all-organic search engine results pages (SERP), you’ll know how much of this real estate has been gradually taken over by paid ads, other first-party products, and rich snippets.

Now, the most aggressive transition of all: AI Overviews (as well as search-based large language model platforms).

At BrightonSEO last month, I explored how this evolution is forcing us to rethink what SEO means and why discoverability, not just ranking, is the new north star.

The “Dawn” Of The Zero-Click Isn’t Just Here – It’s Now Assumed

We’ve been reading about the rise of zero-click searches for some time now, but this “takeover” has been much more noticeable over the past 12 months.

I recently searched [how to teach my child to tell the time], and after scrolling through a parade of paid product ads, Google-owned assets, and the AI Overview summaries, I scrolled a good three pages down the SERP.

Google and other search and discovery platforms want to keep users in their ecosystems. For SEO pros, this means traditional metrics such as click-through rate (CTR) are becoming less valuable by the day.

From Answer Engines To Assistant Engines

LLMs have changed not just how a result is displayed to the user, but also turned the traditional browser-born search flow into a multi-step flow that the native SERP simply cannot support in the same way.

The research process is collapsing into a single, seamless exchange.

Traditional flow vs. multi-step flow. Image used with permission from Alain Schlesser, May 2025

But as technology accelerates, our own curiosity and research skills are at risk of declining, or disappearing completely.

Assistant engines and wider LLMs are the new gatekeepers between our content and the person discovering that content – our potential “new audience.”

They parse, consume, understand, and then synthesize content, and that synthesis decides what gets mentioned to whom (or what) they interact with.

Structured data is still crucial, as context, transparency, and sentiment matter more than ever.

Personal LLM agent flow diagram by Alain Schlesser, used with permission, May 2025

Challenges Are Different, But Also The Same

As SEOs, our challenges with this new behavior affect the way we do – and report on – our jobs.

In reality, many are just old headaches in shiny new wrappers:

  • Attribution is a mess: With AI Overviews and LLMs synthesizing content, it’s harder than ever to see where your traffic comes from – or whether you’re getting any at all. Some tools do monitor this, but it’s still too early for a standard to have emerged. Google has even said it has no plans to add AI Overview insights to Search Console.
  • Traffic is fragmenting (again): We saw this with social media platforms at the beginning, where discovery happened outside the organic SERPs. Discovery is now happening everywhere, all at once. With attribution also harder to ascertain, this is a bigger challenge today.
  • Budgets are under scrutiny from fear, uncertainty, and doubt (FUD): The native SERP is changing so much that some may assume there’s less (or no) value in doing SEO anymore (untrue!).

The Shift Of Success Metrics

The days of our current, vanity-led success metrics are coming to an end.

Similar to how our challenges are the same but different, this also applies to how we redefine success metrics:

Old hat → new hat:

  • Content → Context + sentiment
  • Keywords → Intent
  • Brand → Brand + sentiment
  • Rankings → Mentions
  • Links from external sources → Citations across various channels
  • SERP monopoly → Share of voice
  • E-E-A-T → Still E-E-A-T
  • Structured data → Entities, knowledge graph & vector embeddings
  • Answering → Assisting

What Can You Do About It?

Information can be aggregated, but personality can’t. This is why it’s still our responsibility to help “assist the assistant” to consider and include you as part of that aggregated information and synthesized answer.

  • Stick to the fundamentals: Never neglect SEO 101.
  • Third-party perspective is increasingly important, so ensure this is maintained and managed well to ensure positive brand sentiment.
  • Embrace structured data: Even if some say it’s becoming less crucial for LLMs to understand entities, structured data is being used right now inside major LLMs to output structured data within responses, giving them an established and standardised way to understand your content.
  • Educate stakeholders: Shift the conversation from rankings and clicks to discoverability and brand presence. A branded unlinked mention suddenly has more value than “acquiring X followed non-branded anchor text links pcm.”
  • Experiment with your content: Try new ways to produce and market your content beyond the traditional written word. Here, video is useful not only for humans but also for LLMs, which are now “watching” and understanding videos to aid their responses.
  • Create helpful, unique content: To add to the above, don’t produce for the sake of production.

LLMs.txt: The Potential To Be The New Standard

Keep an eye on emerging standards proposals, such as llms.txt, which is one way some are adapting and contributing to how LLMs ingest our content beyond our traditional approaches offered with robots.txt and XML sitemaps.
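The llms.txt proposal suggests a plain markdown file served at the site root: an H1 title, a short blockquote summary, then sections of annotated links pointing LLMs at your most useful content. A hypothetical example (all names and URLs are placeholders):

```
# Example Shop

> Example Shop sells trail running gear. The key product and policy
> pages for LLM consumption are listed below.

## Products

- [Trail running shoes](https://shop.example.com/shoes.md): full catalogue
- [Sizing guide](https://shop.example.com/sizing.md): fit and size charts

## Policies

- [Returns](https://shop.example.com/returns.md): 30-day return policy
```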

While some are skeptical about this standard, I believe it is still worth implementing now, given its potential benefits in the future.

There is virtually no risk in implementing something that doesn’t take much time or resources to produce, so long as you do so with a white-hat approach.

Conclusion: Embrace Discoverability And New Metrics

SEO isn’t dead. It’s expanding, but at a rate we haven’t experienced before.

Discoverability is the new go-to success metric, but it’s not without flaws, especially as the way we search continues to change.

This is no longer about “ranking well.” It’s about being understood, surfaced, trusted, and discovered across every platform and assistant that matters.

Embrace and adapt to the changes, as they’re going to continue for some time.


Featured Image: PeopleImages.com – Yuri A/Shutterstock

Googler’s Deposition Offers View Of Google’s Ranking Systems via @sejournal, @martinibuster

A Google engineer’s redacted testimony, published online by the U.S. Justice Department, offers a look inside Google’s ranking systems, giving an idea of Google’s quality scores and introducing a mysterious popularity signal that uses Chrome data.

The document offers a high level and very general view of ranking signals, providing a sense of what the algorithms do but not the specifics.

Hand-Crafted Signals

For example, it begins with a section about the “hand crafting” of signals, which describes the general process of taking data from quality raters, clicks, and so on, and applying mathematical and statistical formulas to generate a ranking score from three kinds of signals. Hand-crafted means scaled algorithms that are tuned by search engineers; it doesn’t mean that they are manually ranking websites.

Google’s ABC Signals

The DOJ document lists three kinds of signals that are referred to as ABC Signals and correspond to the following:

  • A – Anchors (pages linking to the target pages),
  • B – Body (search query terms in the document),
  • C – Clicks (user dwell time before returning to the SERP)
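The deposition doesn’t disclose the actual formula. Purely as an illustrative toy (not Google’s method), a “hand-crafted” combination of the three signals could be as simple as a weighted sum with tunable weights, which is exactly the kind of knob an engineer can inspect and troubleshoot when something breaks:

```python
def topicality_score(anchor, body, clicks, weights=(0.3, 0.4, 0.3)):
    """Toy 'hand-crafted' combination of A, B, C signals.

    Each input is assumed to be pre-normalized to [0, 1]; the weights
    are the kind of tunable knobs engineers could adjust and debug.
    This is an illustration of the idea only, not Google's formula.
    """
    wa, wb, wc = weights
    return wa * anchor + wb * body + wc * clicks

# Example: strong anchors and clicks, moderate body match.
score = topicality_score(anchor=0.8, body=0.6, clicks=0.9)
```

The point of such a hand-tuned form, as the document notes, is transparency: if rankings break, the contribution of each component is directly inspectable.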

The statement about the ABC signals is a generalization of one part of the ranking process. Ranking search results is far more complex and involves hundreds if not thousands of additional algorithms at every step of the ranking process, from indexing, link analysis, anti-spam processes, personalization, re-ranking, and other processes. For example, Liz Reid has discussed Core Topicality Systems as part of the ranking algorithm and Martin Splitt has discussed annotations as a part of understanding web pages.

This is what the document says about the ABC signals:

“ABC signals are the key components of topicality (or a base score), which is Google’s determination of how the document is relevant to the query.

T* (Topicality) effectively combines (at least) these three signals in a relatively hand-crafted way that Google uses to judge the relevance of the document based on the query terms.”

The document offers an idea of the complexity of ranking web pages:

“Ranking development (especially topicality) involves solving many complex mathematical problems. For topicality, there might be a team of engineers working continuously on these hard problems within a given project.

The reason why the vast majority of signals are hand-crafted is that if anything breaks Google knows what to fix. Google wants their signals to be fully transparent so they can trouble-shoot them and improve upon them.”

The document compares their hand-crafted approach to Microsoft’s automated approach, saying that when something breaks at Bing it’s far more difficult to troubleshoot than it is with Google’s approach.

Interplay Between Page Quality And Relevance

An interesting point revealed by the search engineer is that page quality is independent of the query. If a page is determined to be high quality and trustworthy, it’s regarded as trustworthy across all related queries; that is what is meant by the word static – it’s not dynamically recalculated for each query. However, relevance-related signals in the query can be used to calculate the final rankings, which shows how relevance plays a decisive role in determining what gets ranked.

This is what they said:

“Quality
Generally static across multiple queries and not connected to a specific query.

However, in some cases Quality signal incorporates information from the query in addition to the static signal. For example, a site may have high quality but general information so a query interpreted as seeking very narrow/technical information may be used to direct to a quality site that is more technical.

Q* (page quality (i.e., the notion of trustworthiness)) is incredibly important. If competitors see the logs, then they have a notion of “authority” for a given site.

Quality score is hugely important even today. Page quality is something people complain about the most…”

AI Gives Cause For Complaints Against Google

The engineer states that people complain about quality, and that AI aggravates the situation.

He says about page quality:

“Nowadays, people still complain about the quality and AI makes it worse.

This was and continues to be a lot of work but could be easily reverse engineered because Q is largely static and largely related to the site rather than the query.”

eDeepRank – A Way To Understand LLM Rankings

The Googler lists other ranking signals, including one called eDeepRank, an LLM-based system that uses BERT, a language model.

He explains:

“eDeepRank is an LLM system that uses BERT, transformers. Essentially, eDeepRank tries to take LLM-based signals and decompose them into components to make them more transparent.”

That part about decomposing LLM signals into components seems to be a reference to making the LLM-based ranking signals more transparent, so that search engineers can understand why the LLM is ranking something.

PageRank Linked To Distance Ranking Algorithms

PageRank is Google’s original ranking innovation, and it has since been updated. I wrote about this kind of algorithm six years ago. Link distance algorithms calculate the distance from authoritative websites for a given topic (called seed sites) to other websites in the same topic. These algorithms start with a seed set of authoritative sites in a given topic; sites that are further away from their respective seed site are determined to be less trustworthy, while sites closer to the seed set are likelier to be authoritative and trustworthy.

This is what the Googler said about PageRank:

“PageRank. This is a single signal relating to distance from a known good source, and it is used as an input to the Quality score.”

Read about this kind of link ranking algorithm: Link Distance Ranking Algorithms
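The general idea can be sketched as a breadth-first search over the link graph: every site’s score is simply its hop count from the nearest seed. This is a toy model over made-up sites, not Google’s implementation:

```python
from collections import deque

def link_distances(graph, seeds):
    """Shortest hop count from any seed site to each reachable site.

    graph maps a site to the sites it links to. Sites unreachable from
    the seed set get no entry, i.e., they earn no trust in this model.
    """
    dist = {seed: 0 for seed in seeds}
    queue = deque(seeds)
    while queue:
        site = queue.popleft()
        for target in graph.get(site, []):
            if target not in dist:
                dist[target] = dist[site] + 1
                queue.append(target)
    return dist

# Toy link graph: one authoritative seed, plus an isolated spam cluster.
web = {
    "seed.example": ["a.example", "b.example"],
    "a.example": ["c.example"],
    "b.example": ["c.example"],
    "c.example": [],
    "spam.example": ["spam2.example"],  # no path from the seed
}
distances = link_distances(web, ["seed.example"])
```

Here c.example sits two hops from the seed, while the spam cluster, never linked from the trusted neighborhood, receives no distance at all, mirroring how distance-based trust decays away from seed sites.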

Cryptic Chrome-Based Popularity Signal

There is another signal whose name is redacted that’s related to popularity.

Here’s the cryptic description:

“[redacted] (popularity) signal that uses Chrome data.”

A plausible claim can be made that this confirms that the Chrome API leak is about actual ranking factors. However, many SEOs, myself included, believe that those APIs are developer-facing tools used by Chrome to show performance metrics like Core Web Vitals within the Chrome Dev Tools interface.

I suspect that this is a reference to a popularity signal that we might not know about.

The Google engineer also refers to another leak of documents that named actual “components of Google’s ranking system,” but says those documents don’t provide enough information to reverse engineer the algorithm.

They explain:

“There was a leak of Google documents which named certain components of Google’s ranking system, but the documents don’t go into specifics of the curves and thresholds.

For example
The documents alone do not give you enough details to figure it out, but the data likely does.”

Takeaway

The newly released document summarizes a U.S. Justice Department deposition of a Google engineer that offers a general outline of parts of Google’s search ranking systems. It discusses hand-crafted signal design, the role of static page quality scores, and a mysterious popularity signal derived from Chrome data.

It provides a rare look into how signals like topicality, trustworthiness, click behavior, and LLM-based transparency are engineered and offers a different perspective on how Google ranks websites.

Featured Image by Shutterstock/fran_kie

How Referral Traffic Undermines Long-Term Brand Growth via @sejournal, @martinibuster

Mordy Oberstein, a search marketing professional whom I hold in high esteem, recently shared the provocative idea that referral traffic is not a brand’s friend and that every brand, as it matures, should wean itself from it. Referrals from other websites are generally considered a sign of a high-performing business, but they are not a long-term strategy because they depend on sources that cannot be controlled.

Referral Traffic Is Necessary But…

Mordy Oberstein (LinkedIn profile), formerly of Wix, asserted in a Facebook post that relying on a traffic source, whether that’s another website or a search engine, offers a degree of vulnerability to maintaining steady traffic and performance.

He broke it down as a two-fold weakness:

  • Relying on the other site to keep featuring your brand.
  • Relying on Google to keep ranking that other site which in turn sends visitors to your brand.

The flow of traffic can stop at either of those two points, which is a hidden weakness that can affect the long-term sustainability of healthy traffic and sales.

Mordy explained:

“It’s a double vulnerability…

1) Relying on being featured by the website (the traffic source)
2) Relying on Google to give that website …traffic (the channel)

There are two levels of exposure & vulnerability.

As your brand matures, you want to own your own narrative.

More referral traffic is not your friend. It’s why, as a brand matures, it should wean off of it.

Full disclosure, this is my opinion. I am sure a lot of people will disagree.”

Becoming A Destination

I’ve always favored promoting a site in a way that helps it become synonymous with a given topic, because that’s how to make it a default destination and encourage the kinds of signals that Google interprets as authoritative. I’ve done things like creating hats with logos to give away, running annual product giveaways, and other promotional activities, both online and offline. While my competition was doing SEO busywork, I created fans. Promoting a site is basically just getting it in front of people, both online and offline.

Brand Authority Is An Excuse, Not A Goal

Some SEOs believe in a concept called Brand Authority, which is a misleading explanation for why a website ranks. The term Brand Authority is not about branding, and it’s not about authoritativeness, either. It’s just an excuse for why a site is top-ranked.

The phrase Brand Authority has its roots in PageRank. Big brand websites used to have a PageRank of 9 out of 10, or even a 10/10, which enabled them to rank for virtually any keywords they wanted. A link from one of those sites practically guaranteed a top-ten ranking. But Google ended the outsized influence of PageRank because it resulted in less relevant results. That was around 2004-ish, about the time that Google started using Navboost, a ranking signal that essentially measures how people feel about a site – which is what PageRank does, too.

This insight, that Google uses signals about how people feel about a site, is important because the feelings people have for a business are what being a brand is all about.

Marty Neumeier, a thought leader on how to promote companies (author of The Brand Gap) explained what being a brand is all about:

“Instead of creating the brand first, the company creates customers (through products and social media), the customers build the brand (through purchases and advocacy), and the customer-built brand sustains the company (through “tribal” loyalty). This model takes into account a profound and counterintuitive truth: a brand is not owned by the company, but by the customers who draw meaning from it. Your brand isn’t what you say it is. It’s what they say it is.”

Neumeier also explains how brand is about customer feelings:

“The best brands are vivid. They create clear mental pictures and powerful feelings in the minds and hearts of customers. They’re brought to life through their touchpoints, the places where customers experience them, from the first exposure to a brand’s name, to buying the product, to eventually making it part of who they are.”

That “tribal loyalty” is the kind of thing Google tries to measure. So when Danny Sullivan talks about differentiating your site to make it like a brand, he is not referring to so-called “brand authority.” He is talking about doing the kinds of things that influence people to feel positive about a site.

Getting Back To Mordy Oberstein

It seems to me that what he’s saying is that referral traffic is a stepping stone, a means to an end – not the goal itself, but a step toward the goal of becoming a destination.

On the other side of that process, I think it’s important to maintain relevance with potential site visitors and customers, especially today with the rapid pace of innovation, generational change, new inventions, and new product models. Relevance to people has been a Google ranking signal for a long time, beginning with PageRank, then with additional signals like Navboost.

The factor that the SEO industry has largely missed is the part about getting people to think positive thoughts about your site and your business – enough to share with other people.

Mordy’s insight about traffic is beautiful and elegant.

Read Mordy’s entire post on Facebook.

Featured Image by Shutterstock/Yunus Praditya

Google Clarifies: AI Overview Links Share Single Position In Search Console via @sejournal, @MattGSouthern

Google’s John Mueller has clarified that all links within AI Overviews (AIOs) share a single position in Google Search Console.

SEO consultant Gianluca Fiorelli asked Mueller how Search Console tracks position data for URLs in Google’s AI-generated answer boxes.

Mueller referenced Google’s official help docs, explaining:

“Basically an AIO counts as a block, so it’s all one position. It can be first position, if the block is shown first, but I don’t know if AIO is always shown first.”

This indicates that every website linked in an AI Overview receives the same position value in Search Console reports.

This occurs regardless of where the link appears in the overview panel, whether immediately visible or hidden until a user expands the box.

What Google’s Documentation Says

Google’s Search Console Help docs explain how AI Overview metrics work:

  • Position: “An AI Overview occupies a single position in search results, and all links in the AI Overview are assigned that same position.”
  • Clicks: “Clicking a link to an external page in the AI Overview counts as a click.”
  • Impressions: “Standard impression rules apply. To be counted as an impression, the link must be scrolled or expanded into view.”

The docs also note:

“Search Console doesn’t include data from experiments in Search Labs, as these experiments are still in active development.”

The Missing Data Behind Google’s Click Claims

This discussion highlights an ongoing debate in the SEO community regarding the performance of links in AI Overviews.

Lily Ray, Vice President of SEO Strategy & Research at Amsive, recently pointed out Google’s year-old claim that websites receive more clicks when featured in AI Overviews, stating:

“I would love to see a single GSC report that confirms this statement, because every study so far has shown the opposite.”

Ray’s statement reflects the concerns of many SEO professionals, as Google has not provided data to support its claims.

Looking Ahead

While we now understand how position metrics are recorded, the question remains: Do AI Overview placements drive more or less traffic than traditional search listings?

Google claims one thing, but many people report different experiences.

Since all AIO links share the same position, it’s difficult to determine which specific placements perform better.

This debate highlights the need for more precise data about how AIOs affect website traffic compared to regular search results.


Featured Image: Roman Samborskyi/Shutterstock

The Rise Of Privacy-First Search Engines via @sejournal, @TaylorDanRW

Google has long held a firm grip on the search engine landscape, but that veneer of dominance is starting to crack.

In recent months, regulatory scrutiny, public mistrust, and rising anxiety around AI have pushed digital privacy into the spotlight.

Millions of users are now evaluating their relationship with “big tech” and actively seeking alternatives, prioritizing trust and anonymity.

What was once a niche concern is now a broader user shift, with privacy-first search engines gaining momentum across various demographics.

The Privacy Shift

Recent stats clearly show that people are becoming more privacy-aware and want greater control.

Norton reports that 85% of users globally want tighter reins on their data.

In the U.S., over 87% of voters back restrictions on the sale of personal data without consent, while 86% support limits on what companies can collect in the first place.

That awareness is turning into action.

A 2024 study found that 51% of users between 18 and 24 actively take steps to protect their digital footprint. That awareness is changing how people search, with visible shifts in platform choice and behavior.

DuckDuckGo, Brave, And The “Privacy Engine” Movement

DuckDuckGo is at the forefront of this change. Since its launch in 2008, it’s grown into a major player with over 100 million daily searches.

Brave Search, integrated into the privacy-focused Brave browser, is also gaining ground. Built on an index from its own crawler and a number of “crowd-sourced” sources, it is committed to ad-free, unbiased results.

Brave reflects the demand for tools that serve users rather than advertisers.

These platforms highlight a growing appetite for search options among a growing user base that rejects surveillance and upholds user agency.

The Rise Of New Privacy Engines

Awareness around data tracking has driven more users to seek out search engines that don’t rely on surveillance-based business models.

Traditional engines like Google and Bing have come under fire for harvesting user data to fuel targeted advertising.

In contrast, privacy-first search engines are gaining traction by rejecting tracking, behavioral profiling, and data retention, offering users more control and transparency over how their search activity is handled.

While DuckDuckGo is the front-runner when it comes to privacy-focused search engines, there are a number of players in this category. To better understand them, I reached out to their teams to dig deeper than the information just found online.

Swisscows

Image from author, May 2025

One rising contender is Swisscows, a Switzerland-based engine that recently marked its 10-year milestone.

It’s more than a search engine; it’s a whole ecosystem with encrypted messaging, secure cloud storage, VPN services, and an AI-powered summary tool focused on keeping user data private.

With roughly 25 million searches per month and a user base spanning Switzerland, the U.S., and Germany, Swisscows stands out for filtering out adult and violent content, making it popular among educators and families.

Its results come from its own index and Brave, chosen for their privacy-first approach.

“We don’t personalize or profile users,” the team told me. “That means more neutral, manipulation-free search results.”

Swisscows is also investing in semantic search and AI, aiming not to build chatbots but to improve information discovery and trend insights, hinting at a more ethical path for AI in search.

Startpage

Another major player is Startpage, which operates out of the Netherlands. The company has also rolled out a private browsing app, handling billions of searches yearly.

Startpage also doesn’t engage in user profiling. That means no tracking, no cookies by default, and no storing of IP addresses.

Users get results sourced from Google and Bing, without the data collection that typically comes with them.

“People are simply done with being watched,” said the Startpage team. “As AI becomes more embedded in search, the demand for privacy is only increasing. Trust depends on clear policies and a commitment to not compromise user rights.”

Mojeek

Then there’s Mojeek, an independent engine with its own indexing and server infrastructure.

Unlike privacy-conscious tools that piggyback off bigger indexes, Mojeek runs its own stack out of one of the UK’s most sustainable data centers.

By 2022, its index had hit 6 billion pages, a sizable feat for a standalone engine.

Mojeek doesn’t store search histories, use cookies, or track users. It delivers the same results to everyone, providing a transparent alternative to mainstream engines’ personalization-heavy approaches.

It’s also the default choice on several privacy-oriented browsers, like Privacy Browser, and is integrated into Pale Moon, SerenityOS, and Kagi Search.

What’s Fuelling The Shift?

This movement isn’t just about escaping ads or dodging trackers, but about reclaiming control.

AI-driven tools like ChatGPT, Google’s AI Overviews, and Bing AI are reshaping search by relying more on user data than ever.

As AI becomes more integrated into search engines, privacy becomes a central point of differentiation.

At the same time, regulatory pressure is intensifying. Governments are pushing back on unchecked data use, from the GDPR and the Digital Services Act in Europe to the proposed American Privacy Rights Act.

By the end of 2024, modern data protection laws were expected to cover three-quarters of the global population, reflecting a worldwide demand for stricter safeguards.

Optimizing For Privacy Search Engines

To optimize for privacy-first search engines like Swisscows and Startpage, marketers need to rethink their strategies.

Standard SEO tactics that depend heavily on tracking user behavior don’t hold up well when personalization is limited.

Instead, the focus shifts to a deeper understanding of the audience, what questions they’re asking, how they phrase them, and the intent behind their searches.

Creating content that directly answers real user needs, keeping the site structure intuitive, and using language that clearly reflects search intent has become a central focus.

Without behavioral tracking, insight must come from sources like on-site search data, user reviews, forum conversations, and direct feedback.

In this space, winning at SEO is less about gaming the system and more about delivering practical, trustworthy information in a straightforward way.

The Future Of Search Is Changing

Traditional search engines are increasingly wrapped up in advertising and AI. Still, privacy-first options are emerging as both safer and more ethical alternatives.

Whether it’s Swisscows with its commitment to content integrity or Startpage delivering Google-quality results without the tracking, these platforms represent a new direction shaped by more informed, privacy-conscious users.


Featured Image: Thapana_Studio/Shutterstock

Local SEO: How To Make More Customers Click, Choose & Walk Through Your Doors [Webinar] via @sejournal, @hethr_campbell

How do you turn local searches into real foot traffic?

If your business relies on being found locally, clicks alone aren’t enough. You need future customers to choose you and show up.

Whether you’re managing search visibility, local listings, or digital customer experience, this session will help you turn more searches into measurable visits and offline conversions.

Join us for “Local SEO: How To Make More Customers Click, Choose & Walk Through Your Doors” on Wednesday, May 28 at 2 PM ET. In this session, we’ll explore real consumer behavior and how it shapes your local SEO strategy.

Why This Webinar Is Worth Your Time:

Based on consumer research from over 2,000 individuals across the UK, US, France, and Germany, this session will give you a clear picture of what makes people take action.

In this session, you’ll learn: 

✅ What gets consumers to choose one business over another.
✅ Actionable tips to optimize local SEO strategies across Google, Apple, voice search, AI tools & more.
✅ How to improve visibility, clarity, and trust across every location you manage.
✅ Digital signals that matter most to consumers.

Presented by Krystal Taing (VP) and Paul Modaley (Content Marketing Manager) at Uberall, this event is built for businesses that want to capture more high-intent traffic and convert it into real-world outcomes across any industry.

What Makes This Session Different:

You won’t hear guesses or theories. 

You’ll walk away with real data and proven strategies based on how people search, decide, and shop in your area.

Let’s help you drive results for your local and multi-location brick-and-mortar businesses.

Can’t make it live? Sign up anyway, and we’ll send the full recording to your inbox.

How To Increase Google Discover Visibility Naturally Using These Ranking Signals via @sejournal, @rollerads

This post was sponsored by RollerAds. The opinions expressed in this article are the sponsor’s own.

Want more visibility in Google Discover?

Not sure how to get into Google’s personalized news feeds?

Discover isn’t like search. You don’t rank for keywords.

You get selected.

And that means the best way to get featured isn’t to optimize for keywords; it’s to optimize for specific algorithmic signals.

In this guide, we’ll cover the core ranking signals that help Google determine which content belongs in Discover feeds, and how you can naturally boost those signals using tools like push notifications.

Google Discover Optimization Tips: Which Signals Tell Google Your Content Belongs in Discover?

Google Discover uses a different algorithm from traditional search results.

While it still considers many of the same quality indicators, Discover visibility depends less on keywords and more on how your content performs in the real world.

Here are the most important content quality signals for Discover.

1. E-E-A-T: Experience, Expertise, Authoritativeness, Trust

A good rule of thumb is to follow the “E-E-A-T” guideline:

  • Experience: Firsthand, real-world familiarity with the subject.
  • Expertise: Deep knowledge and skill in your content niche.
  • Authoritativeness: Recognition from other trusted sources.
  • Trustworthiness: Accurate, unbiased, and reliable information.

2. Engagement Metrics

These tell Google your content resonates with users and may be worth promoting more widely.

3. Strong Visuals & Headlines

Discover is highly visual, so if your content doesn't stand out immediately, users are likely to scroll past it.

Take time to polish headlines so they grab attention, but make sure they accurately reflect the content of your article or post.

Engaging headlines, images, and videos perform better, especially when those assets are optimized for mobile.

4. Technical SEO & Mobile Optimization

While you don’t need to “rank” per se, you do need a well-optimized site, which includes:

  • Fast load times: Consider page speed and overall efficiency. Use PageSpeed Insights to ensure your web pages are optimized for user performance.
  • Mobile-friendly layouts: Google Discover is only available on mobile devices, as there is currently no desktop version.
  • Structured data: Google relies on structured data to categorize content and provide relevant suggestions for users. To attract more engaged and relevant users, you need to add tags and structure data so that Google can better recognize and categorize your content.
  • Internal linking & link building: These help you build a network of related content. Older articles matter, too, since they can serve as gateways to newer pieces.
  • RSS or Atom Feed: Allow users to follow you to receive updates quickly. Google generates a feed for you automatically, but you can connect your own.
  • Google Web Stories: Similar to Instagram, these stories appear under the Visual Stories banner on mobiles and serve to expand your reach. Stories are easy to create, engaging, interactive, and fun.
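
The structured data bullet above can be made concrete. Below is a minimal Python sketch that assembles Article markup as JSON-LD, the format Google parses to categorize content; the headline, URLs, dates, and author are hypothetical placeholders, and the output should be validated with Google's Rich Results Test before use.

```python
import json

# Build Article structured data as a plain dict, then serialize it to
# JSON-LD. All values below are hypothetical examples.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How To Increase Google Discover Visibility",
    "image": ["https://example.com/images/cover-1200x675.jpg"],
    "datePublished": "2025-05-20T08:00:00+00:00",
    "dateModified": "2025-05-22T09:30:00+00:00",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

json_ld = json.dumps(article_schema, indent=2)
print(json_ld)  # embed in a <script type="application/ld+json"> tag
```

Keeping the markup in code like this makes it easy to generate consistently across many pages from a CMS template.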

Track, test, improve. Use Google Search Console (GSC) to monitor your performance and statistics. Unlike Google Analytics, it has a dedicated tab for monitoring Google Discover traffic.

5. Freshness & Topical Relevance

Valuable content addresses and solves pain points.

For content to have a better chance of showing up in Discover feeds, it should be:

• Accurate.
• Timely.
• Trending.
• Helpful.
• Continuously updated.

This is especially powerful if your content is tied to current events or spikes in interest, as shown in Google Trends.

To discover what users search for, try:

• Google Search: Enter a query and scroll down to view related and popular requests.
• Google Search's Autocomplete: Start typing a search and observe the suggested autocomplete queries; these are the queries that many others regularly search for.
• Google Trends: Identify how popular a content direction is in any part of the world. This is also great for identifying seasonality.

How Google Discover Works

1. Google Discover suggests your content, which should include all the positive signals mentioned above.
2. The Google app user engages with your content within Google Discover, adding to Google's knowledge of how users interact with your website.
3. These engagements (visitor volume, time on page, user experience, etc.) indicate to Google that your content is well-suited for similar readers.
4. Google increases your reach and visibility on Google Discover.
5. Those new viewers engage with your content in a similar pattern.
6. The cycle repeats, spreading your optimized content to more Google Discover timelines.

This is known as a positive loop because your content consistently passes positive ranking signals back to Google's Discover algorithm, thereby continuing to increase in engagement.

How Do I Create A Positive Loop & Show Up In Google Discover?

Now that you know what Google is looking for, here's how to naturally boost those signals.

We know that Google Discover places your content based on:

• High clickthrough rates.
• Long time on page.
• Repeat visitors.

So, how can you increase those metrics?

By building a dedicated reader base that is always ready to consume your new content.

Push notifications are a great way to alert your dedicated readers that new content is out.

And their engagement feeds fresh behavioral data back to Google's Discover algorithm.

How To Use Push Notifications To Boost These Google Signals

Many publishers avoid push notifications, believing they're too promotional or might harm user experience (UX).

However, modern push notification platforms allow you to take a more hybrid approach, combining editorial updates with monetization to boost visibility.

Why Hybrid Push Notifications Help Boost Discover Visibility

Done right, push notifications help your content get discovered organically by:

• Increasing CTR with a second wave of distribution.
• Driving fast engagement shortly after publication.
• Bringing back repeat readers to increase session depth.
• Boosting behavioral signals that Google uses to judge quality.

In other words, push notifications support the very engagement metrics that can lead to more Discover visibility.

When users receive a mix of informative and promotional pushes, each message feels fresh, encouraging clicks and boosting your CTR.

Higher engagement signals to Google that your content is valuable, increasing the chances of it being featured on Discover.

And since Discover traffic is largely made up of new visitors, each one becomes a fresh opportunity to grow your subscriber base.

Once users opt in, you can keep re-engaging them, creating a cycle of rising visibility, CTR, and traffic.

Image created by RollerAds, April 2025

How To Implement The Hybrid Push Format To Get On Discover Faster

In a recent case study, one RollerAds publisher increased their revenue from $0 to $60,000 per month by pairing great content with hybrid push notifications and Discover-optimized distribution. The key was creating content that signals quality and leveraging distribution to show it.

With a tool like RollerAds, you gain a streamlined way to:

• Send personalized push notifications for your latest content.
• Mix promotional and editorial messaging without spamming your readers.
• Increase engagement, retention, and revenue simultaneously.

Simply register your site, get a custom strategy from your account manager, and start boosting content visibility without compromising user experience.

Even better? You can monetize this traffic directly with ad formats designed for Discover audiences: no intrusive pop-ups or poor user experience. Just clear, engaging content with a side of revenue.

For SEJ readers, use the code SEJ30 to add +30% to your funds before July 1st, 2025.

Just show the code to your account manager on RollerAds before your first payment.

Getting featured on Google Discover isn't just about luck; it's about strategy.

From creating high-quality, relevant content to optimizing visuals, headlines, and mobile performance, every step counts. However, to truly stand out and amplify your chances, pairing content strategy with smart tools, such as hybrid push notifications from RollerAds, can make all the difference.

Engaging your audience through push updates not only drives more clicks but also signals content quality to Google, boosting your Discover reach. With the right monetization tools, you can convert that traffic into substantial revenue.


Image Credits

Featured Image: Image by RollerAds. Used with permission.

In-Post Image: Images by RollerAds. Used with permission.

Cracking the SEO Code: Regain Control of Search Visibility in the Age of AI [Webinar] via @sejournal, @hethr_campbell

Trying to regain lost visibility in AI-powered search results?

As AI Overviews and answer engines continue to reshape how search works, organic visibility can disappear overnight. If your traffic has taken a hit, you may need a more complete strategy to recover and grow.

Join us for "Own The Total SERP: How To Regain Lost Visibility Across Paid, Organic and Local SEO." This webinar will introduce the TotalSERP strategy, a unified approach designed to help you reclaim visibility across the entire search landscape.

Why This Session Is Important

Search is no longer limited to paid or organic results. Success now comes from owning the full search engine results pages (SERPs), including local listings and AI-driven experiences.

On May 27, 2025, at 12 PM ET, you will learn:
✅ How to gain total SERP visibility across paid, organic, and local search
✅ How to use Gen AI to improve content and capture intent
✅ How to turn an integrated search strategy into measurable business results

This session is led by Bhavin Prashad, Associate Vice President of Digital Media, and Dan Lauer, SEO Strategist at DAC. They will walk you through the TotalSERP strategy and show how it can help you rebuild what Google's algorithm and AI may have taken away.

What Makes This Session Different

The TotalSERP strategy aligns your paid, organic, and local efforts into one consistent plan. It is designed to help you capture customers at every stage of their search journey.

Let's help you take back control of your visibility and drive results across every part of the search experience.

If you cannot attend live, go ahead and register. We will send you the full recording after the event.

The Triple-P Framework: AI & Search Brand Presence, Perception & Performance

As brands compete for market share across a whole range of AI platforms, each with its own way of presenting information, marketers are on red alert.

The three pillars of presence, perception, and performance that I discuss in this article may help marketers navigate this new era. This is especially true as search and AI undergo their biggest makeover yet.

What's driving this change?

AI isn't just retrieving information anymore; it's actively evaluating, framing, and recommending brands before prospects even click a link.

It's happening now, and it's accelerating.

Think about it. Today, in many ways, ChatGPT has become just as synonymous with AI as Google was when it launched core search.

More and more users and marketers are experimenting with and utilizing Google AIO, ChatGPT, Perplexity, and more.

According to a recent BrightEdge survey, over 53% of marketers use multiple (two or more) AI search platforms weekly.

AI Is Reshaping How Brands Are Presented And Perceived

Consider how buyers research options today: A traveler planning a Barcelona vacation once needed dozens of separate searches, each representing an opportunity for visibility.

Now? They ask one question to an AI assistant and receive a complete itinerary, compressing what once took 50 touchpoints into a single interaction.

AI is no longer a passive search engine. It's an active evaluator, interpreting intent, forming opinions, and determining which brands deserve attention.

In enterprise SEO and B2B contexts, the shift is even more pronounced. AI is effectively writing the request for proposal (RFP), establishing evaluation criteria, and creating shortlists without brands having direct input.

Take enterprise software evaluation, for instance. When a CIO asks an AI about the "best enterprise resource planning solutions," the AI's response typically features:

• A curated shortlist of vendors.
• Evaluation criteria that the AI deems relevant.
• Strengths and limitations of each solution.
• Recommendations based on various scenarios.

These responses don't just inform decisions. They frame the entire evaluation process before a vendor's content is visited.

The question isn't whether this transformation is happening. It's whether your brand is prepared for it.

Read more: 5 Key Enterprise SEO And AI Trends For 2025

The Triple-P Framework For AI Search Success

After analyzing thousands of AI search responses using our BrightEdge Generative Parser™, I've developed the Triple-P framework (Presence, Perception, and Performance) as a strategic compass for navigating this new landscape.

Let's break down each component.

Presence: Beyond Traditional Rankings

While Google still commands 89.71% of search market share, the ecosystem is diversifying rapidly:

• ChatGPT: 19% monthly traffic growth.
• Perplexity: 12% monthly traffic growth.
• Claude: 166% monthly traffic surge.
• Grok: 266% early-stage spike.

(Source: BrightEdge Generative Parser™, April 2025)

Our research shows that the presence of AI Overviews has nearly doubled since June 2024, with comparison features growing by 70-90% and product visualization features by 45-50% in B2B sectors.

Image from author, May 2025

For enterprise marketers, Google is always your starting point. However, it's not just about ranking on Google anymore; it's about showing up wherever AI models showcase your brand.

For example, consider these industry-specific implications:

• For CPG brands: When consumers ask about product sustainability, AI doesn't just list eco-friendly options; it evaluates authenticity based on consistent messaging across digital touchpoints.
• For SaaS companies: Buyers researching integration capabilities receive AI-curated assessments that either position you as a compatibility leader or exclude you entirely.
• For healthcare providers: Patient questions about treatment options trigger AI responses that cite the most authoritative content, not necessarily the highest-ranking websites.

We are in an era of compressed decision-making. Invisibility equals elimination.

Perception: When AI Forms Opinions

The most revealing insight from our research is that only 31% of AI-generated brand mentions are positive; of those, just 20% include direct recommendations.

Source: BrightEdge AI Catalyst and Generative Parser™, May 2025

This is a wake-up call for all marketers, especially those managing a brand.

Even when your brand appears in AI results, how it's framed varies dramatically depending on the AI model, training data, and interpretive logic.

In some AI engines, your brand may appear as the industry leader. In others, you may be completely absent.

What The Data Shows:

• Brands with strong pre-existing recognition receive more positive mentions in AI responses.
• Consistent messaging across digital touchpoints makes brands more likely to be cited positively.
• AI systems appear to "average" brand signals across the web when forming perceptions.

When we analyzed sentiment distribution (April 2025) in AI responses by industry, we saw significant variation, with clear patterns by vertical. For example:

• Finance: Positive mentions aligned with strong content on regulatory compliance and security.
• Healthcare: Positive mentions aligned with strong content, with accuracy and credibility as key factors.
• Retail: Positive mentions aligned with good customer experience and shopping content.
• Technology: Positive mentions aligned with content on innovation and reliability as primary criteria.

The implications are clear: Perception management is now as crucial as presence.

How does this play out in practice?

When brands implement coordinated perception management strategies across multiple channels, they see improvements in AI sentiment within 60-90 days.

Performance: New Metrics That Matter

The final P (Performance) requires entirely new measurement approaches.

When AI Overviews appear in search results, click-through rates often drop by up to 50%, according to internal BrightEdge data. Yet, conversion rates typically remain strong, suggesting AI qualifies leads before they reach your site.

We're entering an era where impressions will be high, click-through rates may drop, but conversions will increase.

As I explained at our recent quarterly briefing, AI filters options and delivers buyers who are closer to a decision.

The impact varies dramatically by query type:

• Informational queries: Reduction in clicks, minimal conversion impact.
• Navigational queries: Reduction in clicks, negligible conversion impact.
• Commercial queries: Reduction in clicks, higher conversion rates.
• Transactional queries: Reduction in clicks, higher conversion rates.

This pattern suggests AI is most effective at qualifying commercial intent, delivering more purchase-ready traffic.

And impressions matter now: they are a new brand metric.

Five Essential AI Search Metrics:

1. AI Presence Rate: Percentage of target queries where your brand appears in AI responses.
2. Citation Authority: How consistently you are cited as the primary source.
3. Share Of AI Conversation: Your semantic real estate in AI answers versus competitors.
4. Prompt Effectiveness: How well your content answers natural language prompts.
5. Response-To-Conversion Velocity: How quickly AI-influenced prospects convert.
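
To make the first and third metrics concrete, here is a minimal Python sketch computing AI Presence Rate and Share Of AI Conversation from logged responses. The queries and brand names are hypothetical placeholders; in practice, the response data would be collected by querying each AI platform for your priority prompts.

```python
# Hypothetical log of AI responses for tracked queries, each recording
# which brands were mentioned in the answer.
responses = [
    {"query": "best ERP software", "brands_mentioned": ["Acme", "Globex"]},
    {"query": "top ERP vendors",   "brands_mentioned": ["Globex"]},
    {"query": "ERP for retail",    "brands_mentioned": ["Acme", "Initech"]},
]

def ai_presence_rate(responses, brand):
    """Share of tracked queries whose AI response mentions the brand."""
    hits = sum(1 for r in responses if brand in r["brands_mentioned"])
    return hits / len(responses)

def share_of_ai_conversation(responses, brand):
    """The brand's share of all brand mentions across responses."""
    all_mentions = [b for r in responses for b in r["brands_mentioned"]]
    return all_mentions.count(brand) / len(all_mentions)

print(f"Presence rate: {ai_presence_rate(responses, 'Acme'):.0%}")
print(f"Share of conversation: {share_of_ai_conversation(responses, 'Acme'):.0%}")
```

Run against this sample data, Acme appears in two of three queries (a 67% presence rate) and accounts for two of five total brand mentions (a 40% share of conversation).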

Position within AI responses matters as much as position in traditional SERPs once did.

Monthly reporting cycles are becoming obsolete. AI-generated results can shift within hours, demanding real-time monitoring capabilities.

The DNA Of AI-Optimized Content

In my experience, content is more likely to be cited by AI when it has:

• Comprehensive coverage: Content addressing multiple related questions outperforms narrow content.
• Structured data implementation: Pages with robust schema markup see higher citation rates.
• Expert validation: Content with clear expert authorship signals receives more citations.
• Multi-format delivery: Topics presented in multiple formats (text, video, data visualizations) earn more citations.
• First-party data inclusion: Original research and proprietary data increase citation likelihood.

These patterns suggest AI systems are increasingly sophisticated in their ability to identify genuinely authoritative content versus content merely optimized for traditional ranking factors.

In my last article, I discussed how Google AIO, ChatGPT, and Perplexity differ and where they share some common optimization traits.

Five Actionable Strategies For Triple-P Success

Based on our extensive research, here are five implementation strategies aligned with this framework:

1. Adopt Entity-Based SEO

AI prioritizes content from known, trusted entities. Stop optimizing for fragmented keywords and start building comprehensive topic authority.

Our data shows that authoritative content is three times more likely to be cited in AI responses than narrowly focused pages.

Implementation Steps:

• Perform an entity audit: Identify how search engines currently understand your brand as an entity.
• Develop topical maps: Create comprehensive coverage of core topics rather than isolated keywords.
• Implement entity-based schema: Use structured data to explicitly define your brand's relationship to key topics.
• Build consistent entity references: Ensure name, address, and phone (NAP) consistency across all digital properties.
• Cultivate authoritative connections: Earn mentions and links from recognized authorities in your space.
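
The schema and NAP steps above can be sketched together. Below is a minimal Python sketch of an Organization node that ties the brand entity to its official profiles via sameAs and keeps NAP details in one machine-readable place; every name, URL, and contact detail here is a hypothetical placeholder.

```python
import json

# Entity-based schema sketch: one Organization node defining the brand
# entity, its NAP details, and its official profiles (sameAs).
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Acme Analytics",
    "url": "https://www.example.com",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Springfield",
        "addressCountry": "US",
    },
    "sameAs": [
        "https://www.linkedin.com/company/example",
        "https://en.wikipedia.org/wiki/Example",
    ],
    "knowsAbout": ["enterprise SEO", "AI search optimization"],
}

print(json.dumps(organization, indent=2))
```

Generating this markup from a single source of truth is one way to keep entity references consistent across every digital property.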

Enterprise brands implementing entity-based SEO will see an uplift in AI citations.

2. Implement Perception Management

With 69% of AI brand mentions not explicitly positive, you must actively shape sentiment.

Image from author, May 2025

Brands that implement proactive sentiment management strategies will see success.

Implementation Steps:

• Monitor AI sentiment: Establish baseline sentiment across AI platforms.
• Identify perception gaps: Compare AI perceptions against desired brand positioning.
• Address criticism proactively: Create content that honestly addresses common concerns.
• Amplify authentic strengths: Develop evidence-based content highlighting genuine advantages.
• Build consistent messaging: Align key messages across all digital touchpoints.

3. Integrate Real-Time Citation Monitoring

Tracking AI citations regularly is now vital to improve mention rates.

This requires capability beyond traditional rank tracking or Google Search Console analysis.

Implementation Steps:

• Deploy continuous monitoring: Track AI responses for priority queries across platforms.
• Implement competitor citation alerts: Get notified when competitors gain or lose citations.
• Conduct prompt variation testing: Analyze how different user phrasings affect your brand's inclusion.
• Track citation position: Monitor where within AI responses your brand appears.
• Measure citation authority: Assess whether you're positioned as a primary or secondary source.
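
The monitoring-and-alerting steps above can be sketched in a few lines. This is a minimal, hypothetical Python sketch of citation-change alerting: the fetch step is stubbed with canned response text, and a real monitor would pull live responses from each platform on a schedule and use more robust entity matching than a substring check.

```python
# Last known citation state per platform (hypothetical baseline).
previous = {"chatgpt": True, "perplexity": False}

def brand_cited(response_text: str, brand: str) -> bool:
    """Naive check for a brand mention in an AI response."""
    return brand.lower() in response_text.lower()

# Stubbed "latest" responses; a real monitor would fetch these live.
latest_responses = {
    "chatgpt": "Top picks include Globex and Initech.",
    "perplexity": "Acme Analytics leads on compliance features.",
}

for platform, text in latest_responses.items():
    cited_now = brand_cited(text, "Acme")
    if cited_now != previous[platform]:
        change = "gained" if cited_now else "lost"
        print(f"[alert] {change} citation on {platform}")
    previous[platform] = cited_now
```

With this sample data, the sketch reports a lost citation on chatgpt and a gained citation on perplexity, which is exactly the kind of change a competitor citation alert would surface.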

4. Deploy Across Core Search And AI Platforms

Companies that take an integrated approach across traditional search and multiple AI platforms will see a higher return on investment (ROI) on search investments.

The future belongs to unified measurement frameworks that connect traditional SEO metrics with emerging AI citation patterns.

Implementation Steps:

• Build unified dashboards: Integrate traditional search metrics with AI citation data.
• Map keyword-to-prompt relationships: Connect traditional keywords to conversational AI prompts.
• Analyze traffic source shifts: Track changing patterns between direct search and AI-referred traffic.
• Segment by AI platform: Monitor performance variations across different AI search environments.
• Connect to business outcomes: Tie AI presence metrics directly to conversion and revenue data.

5. Use AI To Win At AI

This isn't theoretical. It's delivering measurable results:

• BrightEdge Autopilot users averaged a 65% performance improvement.
• BrightEdge Copilot users saved 1.2 million content research hours.

The brands succeeding most in AI search leverage AI in their own workflows.

Implementation Steps:

• Automate content research: Use AI to identify comprehensive topic coverage opportunities.
• Implement AI-driven schema markup: Systematically structure data for machine interpretation.
• Deploy prompt effectiveness testing: Continuously test how well content answers real user prompts.
• Create AI-optimized content briefs: Define exactly what comprehensive coverage means for each topic.
• Analyze AI citation patterns: Identify what characteristics make competitor content citation-worthy.

Teams using AI for AI optimization will benefit from higher productivity and improved performance, gaining a must-have competitive edge in search and AI today.

What's Coming Next: AI-To-AI Marketing

Looking ahead two to three years, expect AI to evolve from an information assistant into a trusted advisor that buyers rely on for evaluation, comparison, and vendor selection.

We're already seeing early indicators of AI-to-AI marketing, where procurement teams use AI agents to automate research and vendor vetting.

Emerging Trends:

• Digital twin marketplaces: Buyers will interact with simulated versions of B2B solutions before speaking with vendors.
• Vertical-specific AI companions: Industry-specialized models for cybersecurity, manufacturing, and healthcare.
• AI agent purchasing: Autonomous systems not just researching but also completing transactions on users' behalf.
• Continuous entity validation: AI systems continuously monitoring brand claims against real-world evidence.
• Multi-modal search experiences: Voice, image, and text-based AI interactions requiring omnichannel optimization.

Read more: As Chatbots And AI Search Engines Converge: Key Strategies For SEO

The Trust Premium In AI Search

Consumers are more likely to trust brands they already recognize.

• AI functions as a trust bridge.
• When consumers delegate decision-making to AI, pre-existing brand familiarity becomes disproportionately influential.
• The impact is most pronounced in high-consideration purchases.

This creates both a challenge and an opportunity. Established brands must protect their advantage, while emerging brands must strategically build recognition signals detectable by AI.

Organizational Structure For AI Search Success

Leading organizations are already creating "collaborative intelligence" roles: specialists managing the interplay between human creativity and AI amplification.

Successful teams typically include:

• AI Search Strategists: Focus on overall presence, perception, and performance.
• Prompt Engineers: Specialize in understanding how users phrase requests to AI.
• Content Scientists: Develop evidence-based approaches to comprehensive coverage.
• AI Citation Analysts: Monitor and optimize for inclusion in AI responses.
• Schema Specialists: Ensure that machine-readable structure enhances entity understanding.

These cross-functional teams integrate with traditional SEO, content marketing, analytics, and business intelligence functions.

The Bottom Line

In this new landscape, the question isn't whether your website ranks. It's whether AI recommends your brand when it matters most.

The Triple-P framework gives you the structure to navigate this future with confidence.

Here's how I recommend getting started:

• Conduct an AI presence audit: Understand where your brand appears in AI responses across key platforms.
• Analyze sentiment distribution: Assess not just if you're mentioned, but how you're portrayed in AI-generated content.
• Connect AI metrics to business results: Start tracking the relationship between AI presence and conversion patterns.
• Identify entity perception gaps: Compare how AI systems understand your brand versus your desired positioning.
• Deploy real-time monitoring: Implement systems to track citation changes as they happen.

The branded AI search revolution isn't coming; it's already here.

The brands that embrace the Triple-P framework today will be the ones AI recommends tomorrow.

Note: In March 2025, BrightEdge surveyed over 1,000 of its customers who are marketers. Findings from this survey are referenced above.

More Resources:


Featured Image: Moon Safari/Shutterstock