The State Of AI Chatbots And SEO via @sejournal, @Kevin_Indig

Last week, I published a meta-analysis of AI Overviews and their impact on SEO.

Today, I publish an analysis of the research on AI chatbots and their potential impact on customer acquisition and purchase decisions.


I’ve analyzed 14 studies and research papers to answer five key questions:

    1. How valuable is AI chatbot visibility?
    2. How can you grow your AI chatbot visibility?
    3. How are people searching on AI chatbots?
    4. What challenges are associated with AI chatbots?
    5. Where are AI chatbots headed?

This analysis is perfect for you if you:

  • Are unsure about whether to invest in AI chatbot visibility.
  • Want an overview of the state of AI chatbots.
  • Look for ways to optimize for AI chatbots.

I don’t include AI Overviews in this analysis since I’ve covered them in depth in last week’s Memo.

Sources:


Get the spreadsheet.

How Valuable Is AI Chatbot Visibility?

While AI chatbot traffic currently represents a tiny percentage of overall traffic, the data shows early evidence for the value of citations and mentions.

AI chatbot adoption is skyrocketing, referral traffic to websites is growing, and traffic quality is high.

Adoption

ChatGPT has over 400 million weekly users as of January 2025.1

Semrush, 12/24: Most ChatGPT users are from the U.S. (25%) or India (12%), followed by Brazil, the UK, and Germany. 70% are male, and over 50% are between 18 and 34 years old.

Higher Visibility, 02/25: 71.5% of consumers use ChatGPT for search, but as a complement to Google rather than a replacement.

Ahrefs, 02/25: 63% of websites receive at least some traffic from AI sources. Only 0.17% of total visits came from AI chatbots, with top sites reaching up to 6%.

  • 98% of AI traffic comes from three AI chatbots: ChatGPT (> 50%), Perplexity (30.7%), and Gemini (17.6%).
  • Smaller sites get proportionally more visits from AI.

Semrush, 02/25: The generative AI market was valued at $67 billion in 2024 and is expected to grow annually by 24.4% through 2030.

Referral Traffic

Semrush, 12/24: ChatGPT referrals to websites grew by 60% between June and October.

Semrush, 02/25: ChatGPT’s reach has expanded dramatically, sending traffic to over 30,000 unique domains daily in November 2024, up from less than 10,000 in July.

  • Online services, education, and mass media are getting the most referral traffic from ChatGPT after filtering out authentication URLs. Retail, finance, and healthcare show lower volumes.

Growth Memo, 02/25: The quality of AI chatbot traffic is superior in several key metrics:

  • The average session duration is 10.4 minutes for AI chatbot referrals versus 8.1 minutes for Google traffic.
  • Users view more pages: 12.4 pages on average for AI chatbot referrals compared to 11.8 for Google traffic.

Impact On Purchase Decisions

Adobe, 10/24: 25% of Britons use AI while shopping online.

  • AI usage rose 10x between July and September, driving 10 billion visits to UK retail websites across ~100 million products.
  • Most shoppers are looking for deals:

In an Adobe survey of 5,000 U.S. consumers, 7 in 10 respondents who have used generative AI for shopping believe it enhances their experience. Additionally, 20% of respondents turn to generative AI to find the best deals, followed by quickly finding specific items online (19%) and getting brand recommendations (15%).

Semrush, 02/25: 46% of ChatGPT queries use the Search feature.

The research paper “A comparative study on the effect of ChatGPT recommendation and AI recommender systems on the formation of a consideration set” by Chang et al. looked at 471 consumers to understand:

  • Whether ChatGPT impacts consumer choices.
  • The process that impacts choices.
  • The impact on products with low-brand awareness vs. high-brand awareness.

Results:

  • ChatGPT does influence the consumer purchase journey and products recommended by ChatGPT are more likely to be adopted.
  • Products with low brand awareness see higher trust after a recommendation from ChatGPT.

My Take

  • ChatGPT had 560 million unique worldwide visitors in December 2024, compared to Google’s 6.5 billion. For comparison, that’s still small but about the size of X/Twitter today.
  • ChatGPT sending more referral traffic to a diverse list of domains is probably a strategic move to win the web over and establish itself more as an alternative to Google. I don’t think OpenAI has to do that. I think they strategically chose to.
  • So far, it seems young men in the U.S., BRIC, and Europe are the major users of ChatGPT. If that’s your target audience, optimizing for AI chatbot visibility should be a higher priority.
  • To be crystal clear, I don’t think anybody has to optimize for AI chatbot visibility. I’m confident that most industries will be fine doing classic SEO for years to come. Some will even be fine in a decade. However, you can’t unsee the rapid adoption, which leads us to a situation where two things are true: classic SEO still works, and there is a first-mover advantage on AI chatbots.

How Can You Grow Your AI Chatbot Visibility?

Improving AI chatbot visibility is a mix of known and new levers.

Crawlability

Being visible on AI chatbots starts with being visible to their crawlers. Crystal Carter, Head of SEO Communications at Wix, calls this “retrievability.”

Groomed XML sitemaps, strong internal linking, fast server response, and clean HTML are a good start.

LLM crawlers are less forgiving than Google when it comes to JavaScript and client-side rendering of critical SEO components. Avoid client-side rendering for those components at all costs.
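One quick sanity check for retrievability is to look at the raw HTML a crawler receives before any JavaScript runs and confirm your critical content is already there. A minimal sketch in Python; the marker strings and the example shell page are hypothetical, not a standard:

```python
import urllib.request

def has_critical_content(html: str, markers: list[str]) -> list[str]:
    """Return the markers missing from the raw (pre-JavaScript) HTML.
    An empty list means all critical content is server-rendered."""
    return [m for m in markers if m not in html]

def fetch_raw_html(url: str) -> str:
    """Fetch HTML roughly as an LLM crawler would see it: no JavaScript execution."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

# A client-side-rendered shell fails the check: its copy only exists after JS runs.
shell = "<html><body><div id='root'></div></body></html>"
print(has_critical_content(shell, ["<h1>", "10x10 storage unit"]))
# → ['<h1>', '10x10 storage unit']
```

If the check fails for your key templates, server-side rendering or pre-rendering those pages is the usual fix.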

Brand Strength

Ziff Davis, 11/24: A Ziff Davis study compares Domain Authority in curated (OpenWebText, OpenWebText2) with uncurated public web indices (Common Crawl, C4) to investigate how major AI companies like OpenAI, Google, and Meta trained their large language models. The unsurprising conclusion is that AI developers prefer curated text to train their models, naturally giving commercial publishers more visibility.

Semrush, 12/24: Google tends to show larger domains, ChatGPT smaller ones. The opposite is true for transactional searches: SearchGPT prefers larger domains, Google smaller ones.

Seer, 01/25: Backlinks showed no correlation with AI chatbot visibility.

Organic Ranks

Seer, 01/25: Brands ranking on page 1 of Google showed a strong correlation (~0.65) with LLM mentions. Bing rankings also mattered, but a little less (~0.5–0.6).

Semrush, 02/25: The overlap between Google, Perplexity, and ChatGPT search is low (25-35% on average). However, the overlap between ChatGPT search and Bing is much higher (average = 7 domains) than with Google (4 domains).

Go Off-Google

Semrush, 02/25: YouTube is the third largest domain by referral traffic from ChatGPT. Facebook, LinkedIn, and GitHub are in the top 10.

Growth Memo, 02/25: Amazon, eBay, and Walmart dominate in Google Search just as much as in AI chatbots.

My Take

  • There is a big question about how important backlinks are for AI chatbot visibility. It’s a trap to assume they have a direct impact. The way I read the data, they help with Google/Bing visibility, which passively translates into AI chatbot visibility. They might also help with LLM crawler discoverability. So they still matter, but not as much as the content itself.
  • The biggest lever seems to be citable content on and off of Google: Industry reports with exclusive research and data, original surveys and case studies, and thought leadership content from recognized experts.
  • I wouldn’t restrict myself from optimizing for AI chatbot visibility as a small business with little to no visibility on classic search engines.
  • Ecommerce is an outlier because the journey is so much more transactional than for B2B or media. On one hand, the strong visibility of big ecommerce platforms like Amazon provides a direct path for AI chatbot visibility for merchants. On the other hand, integrating with programs like Perplexity’s Buy With Pro seems worth trying out.

How Are People Searching On AI Chatbots?

Consumers use AI chatbots differently than Google unless they turn on search features.

Semrush, 02/25: 70% of ChatGPT queries represent entirely new types of intent that don’t fit traditional search categories (navigational, informational, commercial, transactional).

  • Users are asking longer, more complex questions, with non-search-enabled ChatGPT prompts averaging 23 words compared to 4.2 words when search is enabled.

Higher Visibility, 02/25: People use different AI chatbots for different user intents, e.g., Google for initial product research, ChatGPT for product comparison, and Instagram for discovering new products. However, almost 80% stick to traditional search engines for informational searches.

Growth Memo, 02/25: AI chatbots send significantly more traffic to homepages (22% on average) compared to Google (10%) yet still maintain higher engagement metrics. This trend suggests that AI chatbots are effectively preparing users for brand interactions.

My Take

  • It’s fascinating to see that when people turn on Search in ChatGPT, they use shorter queries and emulate their behavior on Google. I wonder if this behavior sticks over the long term or not. If so, we can assume a stronger carryover from players who dominate classic search engines today to AI chatbots. If not, it might open the field to new players.
  • I’ve long been dissatisfied with our broad classification of user intents (information, navigational, etc.). We had this wrong for a long time. It’s too coarse. 70% of use cases are likely task-related and don’t fit our model for classic search engines. AI chatbots are more than search engines but solve the same problems, just with different means. That’s also where I see Google lagging behind: Consumers already associate AI chatbots with tasks rather than finding information.

What Challenges Are Associated With AI Chatbots?

AI chatbots make for a compelling marketing channel but put marketers in front of tracking and bias problems.

Tracking

We can track the referral source for almost all AI chatbots, but some traffic can still fall into the direct traffic bucket.

Citations in ChatGPT typically include a “utm_source=chatgpt.com” parameter, but links in search results don’t have the parameter.2

Ahrefs, 02/25: AI traffic is likely underreported because AI chatbots like Copilot get clustered into direct while they’re actually referrals.
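If you want to quantify this in your own analytics exports, you can bucket sessions by referrer domain and the utm_source parameter. A rough sketch in Python; the referrer list is my own assumption and will need updating as new assistants launch:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical list of AI chatbot referrer domains; extend as needed.
AI_REFERRERS = {
    "chatgpt.com", "chat.openai.com", "perplexity.ai",
    "gemini.google.com", "copilot.microsoft.com",
}

def is_ai_chatbot_visit(referrer: str, landing_url: str) -> bool:
    """Flag a session as AI chatbot traffic via referrer domain or utm_source."""
    host = urlparse(referrer).netloc.removeprefix("www.")
    if host in AI_REFERRERS:
        return True
    # ChatGPT citations append utm_source=chatgpt.com to the landing URL.
    utm = parse_qs(urlparse(landing_url).query).get("utm_source", [])
    return any(u in AI_REFERRERS for u in utm)
```

Sessions that arrive with no referrer at all (the Copilot case Ahrefs describes) will still land in direct, so treat any number this produces as a floor, not a total.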

Brand Bias

Semrush, 12/24: Consumers are skeptical about AI output. 50% say they trust it more when it has been reviewed by a human.

In the paper “Global is Good, Local is Bad?” Kamruzzaman et al. conducted experiments with fill-in-the-blank questions across four product categories and 15 countries (English only). The researchers studied the effect of:

  • Brand attribute bias: global vs. local brands.
  • Socio-economic bias: luxury vs. non-luxury brands.
  • Geo bias: local brands when the domestic country is specified.

Results:

  • LLMs across multiple models (GPT-4o, Llama-3, Gemma-7B, Mistral-7B) consistently associate global brands with positive and local brands with negative attributes.
  • LLMs tend to recommend luxury brands to people from high-income countries. In contrast, non-luxury brands are more commonly suggested for people from low-income countries, even when models were given the flexibility to suggest the same brands for both groups.

The underlying reasons are that local brand names are underrepresented in LLM training data, and large companies can afford larger marketing campaigns and, therefore, create more bias.

In the paper “Generative AI Search Engines as Arbiters of Public Knowledge: An Audit of Bias and Authority” by Li et al., researchers tested how ChatGPT, Bing Chat, and Perplexity answer questions about four major topics: climate change, vaccination, alternative energy, and trust in media. They wanted to see if the AI showed bias in its answers and how it tried to appear trustworthy.

The results:

  • The AI tends to match the emotion of the question. If you ask a negative question, you get a negative answer.
  • Different topics got different emotional treatment, e.g., vaccination and alternative energy got more positive responses than climate change and media trust.
  • Bing Chat and Perplexity heavily cite news media and businesses.
  • Heavy reliance on U.S. sources (65% of sources), even when used in other countries.
  • Too many commercial/business sources, especially for topics like alternative energy.
  • Some models mix unreliable sources with good ones.
  • Answers often include uncertain language and hedging to avoid taking strong positions.

My Take

  • We’re used to significant tracking gaps from Google and Bing, so unless AI chatbots try to persuade site owners with more data, we’ll have to continue to operate with aggregate data, as I mentioned in Death of the Keyword.
  • AI chatbot bias is serious. User trust is key to winning, so I assume AI developers are aware and try to solve the problem. Until then, we have to factor bias in with our optimization strategies and do our best to clearly indicate the target audience for our product in our content.

Conclusion: Where It’s All Going

The data we have today shows that AI chatbots are developing into a significant customer acquisition channel with many familiar mechanics.

However, their task-based nature, bias, and demographics suggest we should be cautious when using the same approach as classic search engines.

Don’t forget: Search is just a means to an end. Ultimately, people search to solve problems, i.e., to complete tasks.

The fact that AI chatbots can skip the search part and do tasks on the spot means they’re superior to classic search engines. For this reason, I expect Google to add more agentic capabilities to AI Overviews or launch a new Gemini-based product in Search.

The underlying technology allows AI chatbots to fork off search engine ranks and develop their own signals. And it evolves rapidly.

The evolution so far went from machine learning in the pre-2022 era to early LLMs and now inference models (think: reasoning).

Better reasoning allows LLMs to recognize user intent even better than classic search engines, making it easier to train models on better sources to mention or cite.

This brings me to the question of whether Google/Bing incumbents will also dominate AI chatbots down the road. Right now, the answer is yes. But for how long?

Generational preferences could be the biggest driver of new platforms. The easiest way for Google to become irrelevant is to lose young people.

  • Semrush, 02/25: Searchers over 35 use Google more often than ChatGPT. People between 18 and 24 use ChatGPT 46.7% of the time, compared to Google with 24.7%.
  • Higher Visibility, 02/25: 82% of Gen Z occasionally use AI chatbots, compared to 42% of Baby Boomers.

There is a chance that multimodality will quickly play a more prominent role in AI chatbot adoption. So far, text interfaces dominate.

But Google already reports 10 billion monthly searches with Google Lens, and Meta’s Ray-Ban smart glasses are very successful. Unlike Google Search’s results page, the LLM answer format transfers easily to other devices and modalities, which could transform AI.3


1 ChatGPT now has 400 million weekly users — and a lot of competition

2 Deep Dive: Tracking How ChatGPT + Search & Others Send Users To Your Site

3 Google Lens Reaches 10 Billion Monthly Searches


Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: Why Is Google Not Indexing My Pages? via @sejournal, @rollerblader

This week’s Ask An SEO question comes from Harjeet:

“Hi! I have a website that provides information of warehouses all over United states. The problem is that only eight to nine pages are indexed by Google. But it has many dynamic pages.

For example, if you do a search for any location on homepage search bar, it will open the location page with the listings.

But Google is not indexing the location pages and warehouse listings as well. Can you help me solve this issue?”

That’s a great question, and I think I can help.

Having looked at your website, it appears you have a crawling issue caused by a lack of actual pages, not an indexing one.

I can verify you only have nine pages in Google’s index. This is because you only have nine actual pages on your website.

I want to start with an overview of some of the site issues. Then, after the overview, I’ll share how to resolve a large chunk of this so you can begin getting pages crawled and indexed.

Identifying The Gaps In Your Website Structure

It’s very hard for both consumers and search engines to navigate your website because you are missing links and easy-to-find navigation.

There are also two sitemaps, and one of them is incorrect. The good news is your correct one is listed in robots.txt, but it only has your site’s navigation in it and not the pages that are being created dynamically.

To start, list the most important subpages in your sitemap so search engines can find them more easily.
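For illustration, a minimal sitemap covering the most important location pages could look like this (the domain and paths are placeholders, not the reader’s actual URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/warehouses/pennsylvania/</loc>
    <lastmod>2025-03-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/warehouses/pennsylvania/philadelphia/</loc>
    <lastmod>2025-03-01</lastmod>
  </url>
</urlset>
```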

The robots.txt file also includes a disallow directive for all user agents, which is problematic because it blocks crawling instead of guiding crawlers to the proper folders and pages.

Specify “allow” for the important folders and pathways you want crawled. This guides spiders to the pages and folders you feel are most important.
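As a sketch, assuming a hypothetical /warehouses/ folder and a parameterized site search, the robots.txt could look like this:

```text
User-agent: *
# Block the site-search results that generate endless competing URLs
Disallow: /search
# Explicitly allow the location folders you want crawled
Allow: /warehouses/

Sitemap: https://www.example.com/sitemap.xml
```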

On top of the sitemap and robots.txt issues, there is no internal linking. Internal links are important to allow crawlers to move around your site and find new pages that need indexing.

Next is to look at the quality of your content and how it is displayed.

The content on your site is thin, and nothing demonstrates the kind of local expertise someone in the storage business would have.

Add unique content and location information to each page individually (more on this below in the how to fix this issue section).

The site is also missing meta robots tags and canonical links. Meta robots tags indicate whether a page should be indexed and whether its links should be followed.

Canonical links say which version is the official version of a page and can help deduplicate similar or competing pages. Adding these will help search engines know what to do as the pages and links are discovered.
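For example, each location page could carry a meta robots tag and a self-referencing canonical in its head (the URL is a placeholder):

```html
<head>
  <!-- Tell crawlers this page should be indexed and its links followed -->
  <meta name="robots" content="index, follow">
  <!-- Declare the official version of this page to deduplicate variants -->
  <link rel="canonical" href="https://www.example.com/warehouses/pennsylvania/philadelphia/">
</head>
```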

Last, you have no city or state-based pages as dedicated resources, but you do have them in a drop-down in the search box. The issue here is these pages don’t exist unless searched for.

If they don’t exist on an official URL, search engines cannot find them and index them easily.

There are other tech and content issues here, but I’m going to jump into fixing the main one, which is getting more pages indexed for you. And this is an incredibly easy one, so this is good news.

Filling The Missing SEO Elements

There are a few steps I would take if this were my site or project:

  1. Build a cities and/or regions and states folder structure.
  2. Get unique content for each.
  3. Add in breadcrumbs and internal links.
  4. Modify the robots.txt and sitemaps.
  5. Do local PR work, not spammy links and directories.

Build A Folder Structure

When you allow a site search to generate URLs for cities, states, and locations, you create a ton of competing URLs.

People can spell a city wrong or use an abbreviation: Philly, Philadelphia (and is it Missouri or PA), or neighborhoods like Fishtown or Center City if it is the Pennsylvania version of Philadelphia.

And if you live in the DMV (DC, MD, VA) like I do, you may type in Washington or Columbia Heights vs. the state or city name. Washington could also be either DC or the state.

Building a folder structure lets you guide people to the correct location and makes it easier for search engines to know where you offer services.

For example, EU companies can structure by country; individual countries like Mexico could have states like Jalisco and Oaxaca and the major cities in each.
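Using the Philadelphia example above, a hypothetical folder structure might look like:

```text
/warehouses/                                      (hub page)
/warehouses/pennsylvania/                         (state page)
/warehouses/pennsylvania/philadelphia/            (city page with listings)
/warehouses/pennsylvania/philadelphia/fishtown/   (optional neighborhood page)
```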

Create Unique Content

Original location-based content is easy to use for your niche. Each region has different climates, and cities do, too. Bring these into consideration when writing the copy.

If the region has more humidity, talk about how the building protects against mold, mildew, corrosion, and other humidity issues that impact storage and warehousing.

For areas that lose power because of hurricanes, snow, or even heat, talk about backup systems and refrigeration for temperature-critical items being stored.

You could include city-based content and localize with original talking points, including directions to the location. Don’t forget to include physical addresses, hours of operation, and phone numbers.

I wrote this guide to localized title tags and descriptions a while back; it applies here, too.

Add Breadcrumbs And Internal Links

Now that you have unique content for the states and major cities in an easy-to-navigate directory structure, help users and search engines find them.

Add breadcrumbs with breadcrumb schema to the site – ideally, at the top of the page so users can click and use them.
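Breadcrumb schema is typically added as JSON-LD. A minimal BreadcrumbList for a hypothetical city page could look like this (per the schema, the last item can omit its URL since it is the current page):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Warehouses",
      "item": "https://www.example.com/warehouses/" },
    { "@type": "ListItem", "position": 2, "name": "Pennsylvania",
      "item": "https://www.example.com/warehouses/pennsylvania/" },
    { "@type": "ListItem", "position": 3, "name": "Philadelphia" }
  ]
}
```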

In the unique copy for the state pages, link to the city pages. If it will benefit the user, link back to the main head state pages from the city pages.

An example for this would be if one city has people renting in other cities in that state, or if one location fills up, other locations that are close by and are easily accessible as an alternative.

The copy may mention looking for other warehousing options in that state or other city/region. Use the name of the state or city with the call to action and make them an internal link.

You’re helping the user find a solution, and search engines understand what each page is about.

This is how you build an effective and meaningful internal linking structure for SEO. The main goal is to help a customer; the benefit is you make it easier for search engines to understand your site structure.

Modify The Robots.txt And Sitemaps

At this point, you have a good site and user experience. Now is the time to make sure the states and cities are in the sitemap, and then modify your robots.txt to allow the folders.

You can request a crawl from Search Console for Google and Bing’s version of Webmaster Tools.

As the spiders access robots.txt, they will find your listed pages, then through internal links find other pages to crawl to find all your pages.

With proper meta robots and canonicals, they will see clean pathways and be able to crawl your site better.

Drive Demand Through Local PR

Anyone can list in directories and pay to play. Yes, some may help you, but the real benefit is local PR.

Getting your business featured in the media drives local demand. People in those regions begin searching for your company by name and adding your services as modifiers to their keywords.

This may or may not help with SEO, but it does one thing: It builds consumer trust and gets you localized, high-value backlinks. These can send trust signals and customers to you.

Local media includes local blogs that do not allow guest posting or sponsored posts, TV and print media like local newspapers, radio stations and podcasts with descriptions with links, and other platforms that people in the city or state use.

You may be able to add a PR bar with “As Seen In” featuring local media logos to build trust with that region.

Summary

The good news is that you don’t have an indexing problem. The pages that exist are indexed.

You have discovery problems because only nine pages actually exist on your website.

The solution here is to build the state and city-based pages, fill them with content and site structure, and then do the work to build trust with PR and trust-building activities.

I hope this helps. Thank you for asking your question – it’s a good one!

More Resources:


Featured Image: Sammby/Shutterstock

WordCamp Asia: No Plans For WordPress In 5 Years via @sejournal, @martinibuster

An awkward Q&A at WordCamp Asia 2025 saw Matt Mullenweg struggle to answer where WordPress will be in five years. Apparently caught off guard, he turned to the Lead Architect of Gutenberg for ideas, who couldn’t answer either.

Project Gutenberg

Gutenberg is a reimagining of how WordPress users can build websites without knowing any code, with a visual interface of blocks for different parts of a web page, which is supposed to make it easy. Conceived as a four-phase project, it has been in development since 2017 and is currently in phase three.

The four phases are:

  • Phase 1: Easier Editing
  • Phase 2: Customization
  • Phase 3: Collaborative Editing
  • Phase 4: Multilingual Support

There’s a perception that Project Gutenberg has not been enthusiastically received by the WordPress developer community or by regular users, even though there are currently 85.9 million installations of the Gutenberg WordPress editor.

However, at the end of a conference Q&A session, one developer told Matt Mullenweg that the people she speaks with are hesitant about using WordPress, and she expressed frustration about how difficult it is to use.

She said:

“Some of those hesitations were it’s easy to get overwhelmed. You know, when you look up how to learn WordPress, and I had to be really motivated… for myself to actually study it and kind of learn the basics of blocks… So do you have any advice on how I could convince my friends to start a WordPress site or how to address these challenges myself? You know like, getting overwhelmed and feeling like there’s just so much. I’m not a coder and things like that… any advice you can offer small business owners?”

The whole purpose of the Gutenberg block editor was to make it easier for non-coders to use WordPress. So a WordPress user asking for ideas on how to convince people to use WordPress presented an unflattering view of the success of the WordPress Gutenberg Project.

Where Will WordPress Be In Five Years?

Another awkward moment was when someone else asked Matt Mullenweg where he saw WordPress being in five years. The question seemingly caught him off guard as he was unable to articulate what the plan is for the world’s most popular content management system.

Mullenweg had been talking about the importance of AI and of some integrations being tested in the commercial version at WordPress.com. So the person asking the question asked if he had any other ideas beyond AI.

The person asked:

“If you have other ideas beyond AI or even how we consume WordPress five years from now that might be different from today.”

Matt Mullenweg answered:

“Yeah, it’s hard to think about anything except AI right now. And as I said a few years ago, before ChatGPT came out, learn AI deeply. Everyone in the room should be playing with it. Try out different models. Check out Grok, check out DeepSeek, two of the coolest ones that just launched.

And for WordPress, at that point will be past all the phases of Gutenberg. I think… I don’t know…”

It was at this point that Mullenweg called on Matías Ventura, Lead Architect of Gutenberg, to ask if he had any ideas about where WordPress is headed in five years.

He continued:

“Matías, what do you think? What’s post-Gutenberg? We’ve been working for so long, it’s…”

Matías Ventura, Lead Architect of Gutenberg, came up to a microphone to help Mullenweg answer the question he was struggling with.

Matías answered:

“I mean, hopefully we’ll be done by then so…”

Mullenweg commented:

“Sometimes that last 10% takes, you know, 90% of the time.”

Matías quipped that it can take a hundred years, then continued his answer, which essentially conceded that there is no five-year plan without saying so outright.

He continued his answer:

“I don’t know, I think, well in the talk I gave I… also reflected a bit that part of the thing is just discovering as we go, like figuring out how like, right now it’s AI that’s shaping reality but who knows, in a few decades what it would be. And to me, the only conviction is that yeah, we’ll need to adapt, we’ll need to change. And that’s part of the fun of it, I think. So I’m looking forward to whatever comes.”

Mullenweg jumped in at this point with his thoughts:

“That’s a good point of the, you know, how many releases we have of WordPress right now, 60 or whatever… 70 probably…. Outside of Gutenberg, we haven’t had a roadmap that goes six months or a year, or a couple versions, because the world changes in ways you can’t predict.

But being responsive is, I think, really is how organisms survive.

You know, Darwin, said it’s not the fittest of the species that survives. It’s the one that’s most adaptable to change. I think that’s true for software as well.”

Mullenweg Challenged To Adapt To Change

His statement about being adaptable to change set up another awkward moment at the 6:55:47 minute mark where Taco Verdonschot, co-owner of Progress Planner, stood up to the microphone and asked Mullenweg if he really was committed to being adaptable.

Taco Verdonschot is formerly of Yoast SEO and currently sponsored to work on WordPress by Emilia Capital (owned by Joost de Valk and Marieke van de Rakt).

Taco asked:

“I’m Taco, co-owner of Progress Planner. I was wondering, you were talking about adaptability before and survival of the fittest. That means being open to change. What we’ve seen in the last couple of months is that people who were talking about change got banned from the project. How open are you to discussing change in the project?”

Mullenweg responded:

“Sure. I don’t want to go too far into this but I will say that talking about change will not get you banned. There’s other behaviors… but just talking about change is something that we do pretty much every day. And we’ve changed a lot over the years. We’ve changed a lot in the past year. So yeah. But I don’t want to speak to anyone personally, you know. So keep it positive.”

Biggest Challenges WordPress Will Face In Next Five Years

Watch the question and answer at the 6:19:24 mark.

Data Shows Google AI Overviews Changing Faster Than Organic Search via @sejournal, @martinibuster

New research on AI Overviews and organic search results presents a fresh view on how AIO is evolving and suggests how to consider it for purposes of SEO.

Among the findings:

  • Their research showed that the AIO they were tracking were more volatile than the organic search results, changing at a faster rate.
  • AIO volatility doesn’t correlate with organic search volatility
  • They conclude that AIO is replacing featured snippets or “enhancing search results.”
  • It was also concluded that, for the purpose of SEO, AIO should be considered as something separate from the organic search.
  • Generative text changed for every query they looked at.

That last finding was really interesting and here is what they said about that:

“As far as I can tell, the generative text changed for every single query. However, our measure was looking for meaningful changes in the generative text which might reflect that Google had shifted the intent of the original query slightly to return different generative ranking pages.”

Another interesting insight was a caveat about search volatility: it shouldn’t be taken as a sign of a Google update, because it could be the influence of current events temporarily changing the meaning of a search query, which relates to Google’s freshness algorithm. I don’t know who the Authoritas people are, but hats off to them; that’s a reasonable take on search volatility.

You can read the AIO research report here; it’s very long, so set aside at least 20 minutes to read it:

AIO Independence From Organic SERPs

That research published by Authoritas got me thinking about AIO, particularly the part about the AIO independence from the search results.

My thought on that finding is that there may be two reasons why AIO and organic SERPs are somewhat decoupled:

  1. AIO is tuned to summarize answers to complex queries with data from multiple websites, stitching them together from disparate sources to create a precise long-form answer.
  2. Organic search results offer answers that are topically relevant but not precise, not in the same way that AIO is precise.

Those are important distinctions. They explain why organic and AIO search results change independently. They are on independent parallel paths.

Those insights are helpful for making sense of how AIO fits into overall marketing and SEO strategies. Wrap your head around the insight that AIO and Organic Search do different and complementary things and AIO will seem less scary and become easier to focus on.

A complex query is something AIO can handle better than the regular organic search results. An example of a complex question is asking "how" a general concept like men's fashion is influenced by an unrelated factor like military clothing. Organic search falls short here because Google's organic ranking algorithm generally surfaces topically relevant answers, while this kind of question demands a precise answer that may not exist on any single website.

What Is A Complex Query?

If complex queries trigger AI Overviews, where is the line? It's hard to say, because the line is moving: Google's AIO are constantly changing. A short TL;DR answer could arguably be that adding a word like "what" or "how" can make a query trigger an AIO.

Example Of A Complex Query

Here’s the query:

“How is men’s fashion influenced by military style?”

Here’s the AIO answer, a summary based on information combined from multiple websites:

“Men’s fashion is significantly influenced by military style through the adoption of practical and functional design elements like sharp lines, structured silhouettes, specific garments like trench coats, bomber jackets, cargo pants, and camouflage patterns, which originated in military uniforms and were later incorporated into civilian clothing, often with a more stylish aesthetic; this trend is largely attributed to returning veterans wearing their military attire in civilian life after wars, contributing to a more casual clothing culture.”

Here are the completely different websites and topics that AIO pulled that answer from:

Screenshot Of AIO Citations

The organic search results contain search results that are relevant to the topic (topically relevant), but don’t necessarily answer the question.

Information Gain Example

An interesting aspect of AI Overviews is a feature explained in a Google patent on information gain. The patent is explicitly in the context of AI assistants and AI search. It’s about anticipating the need for additional information beyond the answer to a question. So in the example of “how is men’s fashion influenced by military style,” there is a feature to show more information.

Screenshot Showing Information Gain Feature

The information gain section contains follow-up topics about:

  • Clean lines and structured fit
  • Functional design
  • Iconic examples of military clothing
  • Camouflage patterns
  • Post-war impact (how wars influenced what men wore after they returned home)

How To SEO For AIO?

I think it’s somewhat pointless to try to rank for information gain because what’s a main keyword and what’s a follow-up question? They’re going to switch back and forth. For example, someone may query Google about the influence of camouflage patterns, and one of the information gain follow-up questions may be about the influence of military clothing on camouflage.

The better way to think about AIO, as the Authoritas study suggests, is as a search feature (which is literally what it is) and to optimize for it the same way one optimized for featured snippets: in a nutshell, create content that is concise and precise.

Featured Image by Shutterstock/Sozina Kseniia

How to Optimize for GenAI Answers

ChatGPT is taking the world by storm. In November 2024, it reported 100 million weekly users, despite being only one of the popular generative AI platforms.

No business should ignore those channels, as consumers increasingly turn to genAI for product and brand recommendations.

Yet showing up in AI answers is tricky, and the tactics vary among platforms.

In August 2024, Seer Interactive, a marketing agency, compared the leading “answer engines.”

Answer Engines: AI vs. SEO (Seer Interactive), summarized:

  • Claude, Llama (training data): The LLM interprets the query and serves information from training data. Output: primarily text. Ability to influence: low; speed to influence: slow. Mechanisms: brand marketing, earned media.
  • Perplexity, Google AIO (search data): The LLM interprets the query and serves information primarily from a web index. Output: text and citation links. Ability to influence: medium; speed to influence: fast. Mechanisms: website content, earned media, organic social.
  • ChatGPT, Gemini (hybrid: training + search data): The LLM routes the response via training data or a web index based on the query. Output: text and citation links. Ability to influence: medium; speed to influence: medium. Mechanisms: content, brand, earned media, social.
  • SEO (Google, Bing, Yahoo): Crawl, index, and retrieval; the search engine serves the most relevant indexed webpages (10 blue links, SERP features, ads). Ability to influence: high; speed to influence: fast. Mechanisms: content, brand, earned media.

“Answer Engines: AI vs. SEO” from Seer Interactive.

Here’s my version.

  • Organic search — Knowledge sources: search index, knowledge graph. Output: citations, ads, search features. Optimization tactics: content, backlinks, branding.
  • ChatGPT, Gemini, Copilot — Knowledge sources: training data + search data + memory. Output: answers + citations. Optimization tactics: content, backlinks, branding.
  • Perplexity, Google AI Overviews — Knowledge sources: search index. Output: answers + citations. Optimization tactics: Google and Bing indexing + top rankings.
  • Claude, DeepSeek, Llama — Knowledge sources: training data. Output: answers (few links). Optimization tactics: branding.

Foundational search optimization tactics apply to genAI. Sites should be indexed and visible in Google and Bing to appear in answers on the leading platforms — ChatGPT, Gemini, Google’s AI Overviews, Microsoft Copilot, and Perplexity.

However, in my experience, some AI-optimization tactics are more important than others.

Fast, Light, Simple

AI crawlers that access a site will learn about the business and its purpose but may not link to it. Generative AI platforms often repurpose content without referencing the source. Anthropic’s Claude, Meta’s Llama, and now DeepSeek rarely include links.

Thus whether to allow those AI crawlers on a site is debatable. My advice to clients is this: Google has monetized our content for years, but we’ve all benefited from the visibility. So I usually suggest optimizing for AI platforms rather than blocking them.

The best AI-optimized sites are fast, light, and usable with JavaScript disabled. AI crawlers are immature, more or less. Most cannot render JavaScript and abort crawling slow-loading sites.

No Fluff

For years, Google’s machine learning favored featured snippets from pages with clear, concise, factual answers — even when the page itself wasn’t ranking organically near the top.

Recent case studies prove the point. One comes from search optimizer Matt Diggity, who shared examples on X of the ranking benefits in Google from brevity and clarity.

Search optimizer Matt Diggity posted on X the results from his “natural language processing” test.

Matt’s findings apply to all writing, including generative AI platforms.

In short, AI optimization aligns with commonsense organic search tactics. Optimizing for one will likely help the other.

Q&A: Passport CEO Talks Tariffs

President Trump’s tariffs have merchants scrambling to gauge the impact on imports, exports, and overall cost. There’s no better authority to assess that impact than Alex Yancher. He’s the CEO and co-founder of Passport, a global provider of cross-border logistics, localization, and support for ecommerce sellers.

He and I recently spoke on the state of tariffs, the likely impact, and how merchants should react. The entire audio of that conversation is embedded below. The transcript is edited for clarity and length.

Practical Ecommerce: What’s the status of the Trump tariffs?

Alex Yancher: Let’s break it down by country, starting with China, which seems to be the focus of the Trump administration. Plus, it’s the only new tariff in effect. The president implemented a 10% tariff on all imports from China starting February 4. That’s 10% incremental, on top of the existing 39% tariff from the Biden administration.

For goods from Canada and Mexico, the president announced a 25% tariff but reversed course within a day or two. We’re waiting for more information from the administration, but it doesn’t look like those new tariffs will occur.

President Trump’s salacious post in mid-February about reciprocal tariffs, meaning like-for-like, adds more uncertainty. If one of our industries is subject to a 50% tariff from a country, he suggested reciprocating with an equivalent 50% tariff.

An adjacent development relevant to ecommerce is changes to the U.S. de minimis rules. “De minimis” refers to excluding tariffs for shipments valued below a certain amount, currently at $800. A tariff could be 1,000%, but it’s waived if the item is under $800. Any revision to that rule would be huge for ecommerce sellers.

However, the administration removed the de minimis and then reversed the decision. So it’s still intact. We’re hearing rumblings that Trump will remove it again, at least for China-made goods. A change would be a massive regulatory hurdle to monitor and enforce — likely costing more money to oversee than it generates. So stay tuned.

PEC: Practical Ecommerce has long encouraged free trade and cross-border collaboration. Nonetheless, what’s President Trump’s rationale for tariffs?

Yancher: It seems to fit into three buckets. One is national security and border integrity, including fentanyl-related issues. The second bucket is allegations of unfair, unbalanced trade. We can see that in our trade deficit numbers. We have a trade deficit with virtually every country. Trump doesn’t like that imbalance.

The third bucket is the MAGA, America-first position, putting U.S. workers and companies first — ahead of free trade principles, inflation, and so forth.

Those are the three rationales, more or less.

PEC: Your company, Passport, facilitates trade in 180-plus countries. Are the first two reasons — national security and unfair trade — legit?

Yancher: There’s something to be said about a porous border regarding people and packages. We’ve had bipartisan legislation on oversight of packages coming in, such as drugs and illicit paraphernalia. Most countries are ahead of us in collecting rigorous data and information on incoming goods.

Passport is an internationalization company, as you mentioned. We’re smack in the middle of data flow. The information we must pass to foreign governments is typically much stricter than what the U.S. requires. The U.S. and Australia are the only countries with a high de minimis.

Another aspect of national security is ensuring the supply of critical medical products, such as personal protective equipment during Covid. We don’t want to rely on another country for those items.

In terms of unfair trade, it’s hard to say. U.S. consumers benefit from having access to low-cost goods. U.S. prices are lower for the most part than any other country. That’s partly because we have low tariffs.

PEC: Let’s move on to the impact on ecommerce merchants. What’s your advice?

Yancher: You’re safe if you manufacture in the U.S. unless you import components. You’re likely cheering for the administration to compel countries to lower their tariffs and thus expand your market.

If you manufacture goods in China, there’s presumably a reason you do it since there are already tariffs involved. And now your goods have just become 10% more expensive. So what do you do? Is 10% that meaningful? Are there other suppliers? The answer is case-dependent. Certainly companies are re-evaluating their bill of materials and their supply chains.

Merchants that ship directly from China to consumers in the U.S. are in a tough spot. The de minimis is almost certainly going away, likely very quickly. I advise those sellers to keep going until the bitter end while also devising a plan B.

Otherwise, those direct-from-China sellers may have to pay the duty on the retail sales price. There are ways of structuring the setup to pay the duty on the cost of goods sold, the manufacturer’s cost. But it’s unclear and risky. We’re talking about a lot of money and a big structural expense. Sellers in that position must devise a plan now.

PEC: Tell us about Passport.

Yancher: We help ecommerce merchants go global irrespective of their size. We work with small and large brands. We help them with front-end internationalization — collecting the correct amount of duties and taxes, displaying local currencies, and regulatory and fiscal compliance.

We recently acquired Brand Access, a company that helps enterprises set up local operations in-country. We’ll handle the logistics and warehousing, and act as their importer of record and seller of record. We’ll equip their front-end consumer experience for a high conversion rate.

We’re at PassportGlobal.com. We’ve launched a new site, TrumpTradeTracker.com, to help the industry stay current on all the trade changes and cut through the noise.

TikTok Ads Achieve Highest Short-Term ROI, Says Dentsu Study via @sejournal, @MattGSouthern

A recent Dentsu study, in partnership with TikTok, shows that advertisers on the platform achieve strong returns on investment (ROI) for both short-term and long-term sales.

The analysis (PDF link) focused on 15 brands in Norway, Denmark, Sweden, and Finland and compared TikTok’s performance with other media channels.

Dentsu found:

“With a short-term ROI of 11.8, TikTok ranks among the most effective media channels for driving immediate sales, according to dentsu benchmark data. In practical terms, this means that advertisers generate nearly 12 times their initial investment in sales revenue within just six weeks—establishing TikTok as one of the strongest performance marketing channels available.”

In other words, advertisers gained nearly 12 dollars in sales for every dollar spent in six weeks or less.

For comparison, Dentsu measured the average short-term ROI from all media at 8.7.

Other top findings from the Dentsu study include:

  • 75% of advertisers found that TikTok provided the highest ROI compared to other channels.
  • All advertisers saw a substantial boost in short-term sales from TikTok.
  • To achieve the best short-term results, combine lower-funnel platforms with TikTok.
  • TikTok’s sales impact lasts 3 to 4 weeks after a campaign ends.
  • Advertisers with the best ROI stayed active on TikTok.
  • User-generated content featuring creators had the strongest sales impact.

What About Long-Term Impact?

The data indicates that, in addition to the 11.8 ROI achieved in the first six weeks, a 4.5 ROI is observed within 10 months after exposure.

This suggests that TikTok can function as a performance channel for immediate results and a tool for building brand equity over the long term.

However, as you can see in the chart below, several ad platforms achieve better results than TikTok in the longer term.

Screenshot from: From Storytelling to Sales: Short and Long-Term ROI of TikTok. A Marketing Mix Modeling Study By Dentsu. February 2025.

Storytelling Ads Perform Better

The study examined two main types of user-generated content (UGC):

  • Storytelling UGC: Content with a narrative focus and no direct promotional offers.
  • Tactical UGC: Content centered on pricing, sales promotions, or direct product calls to action.

Dentsu found that storytelling-based UGC drove stronger sales results than tactical promotions.

The narrative approach generated better audience engagement and recall, translating to 70% higher ROI than promotional content.

What This Means

Dentsu’s data shows TikTok can help meet both short-term sales goals and long-term brand goals.

Advertisers looking for quick sales and lasting brand impact could consider including TikTok in their media strategy.

Methodology

Dentsu used a three-step approach to measure TikTok’s full impact:

  • Short-Term Sales: Tracked direct revenue within 6 weeks of ads running while filtering out other market factors.
  • Brand Metrics: Measured how TikTok advertising shifted consumer perceptions.
  • Long-Term Sales: Connected those brand perception changes to additional sales 1-10 months later.

The study analyzed actual spending and sales data from 15 Nordic brands over 2-3 years. Statistical models isolated TikTok’s contribution from external factors like seasonality, competition, and other marketing channels.

This methodology captures both immediate performance and longer-term brand effects in a single framework.


Featured Image: Poetra.RH/Shutterstock

Beyond Tools: A Google Ads Guide To Detecting And Preventing Click Fraud In Lead Generation

Click fraud in lead generation can drain your marketing budget and corrupt your data, leading to misguided strategic decisions.

While automated detection tools serve as a first line of defense, relying solely on them is not enough.

This guide presents practical, hands-on approaches to identify and combat click fraud in your lead generation campaigns in Google Ads.

Understanding Modern Click Fraud Patterns

Click fraud isn’t just about basic bots anymore. The people running these scams have gotten much smarter, and they’re using tricks that your regular fraud tools might miss.

It’s a big business, and if you think you are not affected, you are wrong.

Here’s what’s really happening to your ad budget: Real people in click farms are getting paid to click on ads all day long.

They use VPNs to hide where they’re really coming from, making them look just like normal customers. And they’re good at it.

The bots have gotten better, too. They now copy exactly how real people use websites: They move the mouse naturally, fill out forms like humans, and even make typing mistakes on purpose.

When these smart bots team up with real people, they become really hard to spot.

The scammers are also messing with your tracking in clever ways. They can trick your website into thinking they’re new visitors every time.

They can make their phones seem like they’re in your target city when they’re actually on the other side of the world.

If you’re counting on basic click fraud protection to catch all this, you’re in trouble. These aren’t the obvious fake clicks from years ago – they’re smart attacks that need smart solutions.

That being said, the good old competitor trying to click 50 times on your ad still exists and isn’t going away anytime soon.

Luckily, it is safe to say that Google can spot those obvious fraud clicks in many cases.

Google’s Click Fraud Dilemma: Walking The Revenue Tightrope

Google faces a tricky problem with click fraud.

Every fake click puts money in Google’s pocket right now, but too many fake clicks will drive advertisers away. This creates a conflict of interest.

Google needs to show that it’s fighting click fraud to keep advertisers happy and the ad platform and all of its networks healthy, but it can’t afford to catch every single fake click.

If it did, its ad revenue would drop sharply in the short term because it also runs the risk of blocking valid clicks if it goes in too aggressively.

But if it doesn’t catch enough fraud, advertisers will lose trust and move their budgets elsewhere.

Some advertisers say this explains why Google’s fraud detection isn’t as strict as it could be.

They argue Google has found a sweet spot where it catches just enough fraud to keep advertisers from leaving, but not so much that it seriously hurts its revenue.

This balance gets even harder as fraudsters get better at making fake clicks look real.

This is also why many advertisers don’t fully trust Google’s own click fraud detection and prefer to use third-party tools.

These tools tend to flag more clicks as fraudulent than Google does, suggesting Google might be more conservative in what it considers fraud.

The Over-Blocking Problem Of Third-Party Tools

Third-party click fraud tools have their own business problem: They need to prove they’re worth paying for every month.

This creates pressure to show lots of “blocked fraud” to justify their subscription costs. The result? Many of these tools are too aggressive and often block real customers by mistake.

Other tactics include reporting lots of suspicious traffic or activity.

Think about it. If a click fraud tool shows zero fraud for a few weeks, clients might think they don’t need it anymore and cancel.

So, these tools tend to set their detection rules very strict, marking anything slightly suspicious as fraud. This means they might block a real person who:

  • Uses a VPN for privacy.
  • Shares an IP address with others (like in an office).
  • Browses with privacy tools.
  • Has unusual but legitimate clicking patterns.

This over-blocking can actually hurt businesses more than the fraud these tools claim to stop.

It’s like a store security guard who’s so worried about shoplifters that they start turning away honest customers, too.

Why Click Fraud Tools Are Still Valuable

Despite these issues, click fraud tools are still really useful as a first line of defense.

They’re like security cameras for your ad traffic. They might not catch everything perfectly, but they give you a good picture of what’s happening.

Here’s what makes them worth using:

  • They quickly show you patterns in your traffic that humans would take weeks to spot.
  • Even if they’re sometimes wrong about individual clicks, they’re good at finding unusual patterns, like lots of clicks from the same place or at odd hours.
  • They give you data you can use to make your own decisions – you don’t have to block everything they flag as suspicious.

The key is to use these tools as a starting point, not a final answer. Look at their reports, but think about them carefully.

Are the “suspicious” clicks actually hurting your business? Do blocked users fit your customer profile?

Use the tool’s data along with your own knowledge about your customers to make smarter decisions about what’s really fraud and what’s not.

In terms of functionality, most third-party click fraud detection tools are somewhat similar to each other.

A simple Google search for “click fraud tool” shows the market leaders; the biggest differences are usually pricing and contract duration.

Tackling Click Fraud With Custom Solutions

After getting a first impression with third-party click fraud tools, it’s best to build a collection of custom solutions to tackle your individual scenario.

Every business has a different situation with different software environments, website systems, and monitoring.

For custom solutions, it’s recommended to work closely with your IT department or developer, as many solutions require some modification on your website.

The Basics: Selecting An Identifier

There are a handful of solutions to cover 80% of the basics.

The first way to do something against click fraud is to find a unique identifier to work with.

In most cases, this will be the IP address since you can exclude certain IP addresses from Google Ads, thus making it a good identifier to work with.

Other identifiers, like browser fingerprints, are also options. Once an identifier is chosen, make sure your server logs or internal tracking can monitor users and their identifiers for further analysis.
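As a minimal sketch of what that monitoring can look like, the snippet below counts ad clicks per IP address from already-parsed log entries and returns candidates for a Google Ads IP exclusion list. The field names and the threshold are illustrative assumptions, not part of any particular logging setup.

```javascript
// Hypothetical log entries: one object per ad click, with the visitor's IP.
// In practice these would come from your server logs or internal tracking.
function suspiciousIps(clicks, threshold = 20) {
  const counts = new Map();
  for (const { ip } of clicks) {
    counts.set(ip, (counts.get(ip) || 0) + 1);
  }
  // IPs at or above the threshold are exclusion candidates, not proven fraud.
  return [...counts.entries()]
    .filter(([, count]) => count >= threshold)
    .map(([ip]) => ip);
}

const clicks = [
  ...Array.from({ length: 25 }, () => ({ ip: "203.0.113.7" })), // one noisy IP
  { ip: "198.51.100.2" },
  { ip: "198.51.100.3" },
];
console.log(suspiciousIps(clicks)); // → ["203.0.113.7"]
```

Review the flagged IPs manually before excluding them; a shared office or VPN exit node can look noisy while being perfectly legitimate.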

The Basics: CAPTCHAs

Another basic tool, which is often forgotten, is CAPTCHAs.

CAPTCHAs can detect bots or fraudulent traffic. Google offers a free and simple-to-implement solution with reCAPTCHA.

CAPTCHAs might seem like an easy answer to bot traffic, but they come with serious downsides.

Every time you add a CAPTCHA, you’re basically telling your real users, “Prove you’re human before I trust you.” This creates friction, and friction kills conversions.

Most websites see a drop in form completions after adding CAPTCHAs if they are set too aggressively.

Smart CAPTCHAs can limit the frequency, but not all CAPTCHA providers allow that option, so choose your provider or solution wisely.
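For the server side, a sketch of verifying a reCAPTCHA token against Google’s standard siteverify endpoint might look like this (Node 18+, where fetch is global). The injectable fetchImpl parameter is my own addition to make the function testable, and the secret is a placeholder you would load from configuration.

```javascript
// Verify a reCAPTCHA response token server-side via the siteverify API.
// Returns true only if Google reports the token as valid.
async function verifyRecaptcha(token, secret, fetchImpl = fetch) {
  const res = await fetchImpl("https://www.google.com/recaptcha/api/siteverify", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({ secret, response: token }).toString(),
  });
  const data = await res.json();
  return data.success === true;
}
```

Always do this check on the server; a client-side-only CAPTCHA can be bypassed by posting directly to your form endpoint.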

The Basics: Honeypot Fields

Honeypot fields are hidden form fields that act as traps for bots.

The trick is simple but effective: Add extra fields to your form that real people can’t see, but bots will try to fill out.

Only bots reading the raw HTML will find these fields; regular users won’t even know they’re there. The key is to make these fields look real to bots.

Use names that bots love to fill in, like “url,” “website,” or “email2.” If any of these hidden fields get filled out, you know it’s probably a bot. Real people won’t see them, so they can’t fill them out.

Pro tip: Don’t just add “honeypot” or “trap” to your field names. Bots are getting smarter and often check for obvious trap names. Instead, use names that look like regular-form fields.
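A minimal server-side sketch of the check: assuming the form includes a hidden decoy field named "website" (hidden via CSS, not via the field name), any submission that fills it is treated as a likely bot.

```javascript
// Server-side honeypot check on a submitted form payload.
// "website" is the decoy field; real users never see it, so it stays empty.
function isLikelyBot(formData) {
  const honeypot = (formData.website || "").trim();
  return honeypot.length > 0;
}

console.log(isLikelyBot({ name: "Jane", email: "jane@example.com", website: "" })); // → false
console.log(isLikelyBot({ name: "Bot", email: "x@spam.example", website: "http://spam.example" })); // → true
```

A flagged submission can be silently dropped or routed to extra verification; either way, the human path stays completely friction-free.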

Advanced Validation Methods

Smart Form Validation: Email

Most businesses only check if an email address has an “@” symbol and looks roughly correct.

This basic approach leaves the door wide open for fake leads and spam submissions.

Modern email validation needs to go much deeper. Start by examining the email’s basic structure, but don’t stop there.

Look at the domain itself: Is it real? How long has it existed? Does it have proper mail server records?

These checks can happen in real time while your user fills out the form. It should be noted, however, that smart form validation usually requires some sort of third-party provider to check the details, which means you need to rely on external services.

A common mistake is blocking all free email providers like Gmail or Yahoo. This might seem logical, but it’s a costly error.

Many legitimate business users rely on Gmail for their day-to-day operations, especially small business owners.

Instead of blanket blocks, look for unusual patterns within these email addresses. A Gmail address with a normal name pattern is probably fine; one with a random string of characters should raise red flags.

For enterprise B2B sales, you expect bigger companies to sign up with their company domain email address, so blocking free mail providers might work.
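As a sketch of the “unusual pattern” idea (leaving out the domain and mail-server checks, which need a DNS lookup or an external service), the heuristic below flags local parts that look like random strings. The regex and thresholds are illustrative assumptions, not a production rule set.

```javascript
// Basic structural check: something@domain.tld with no whitespace.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]{2,}$/;

// Crude randomness heuristic: long local parts that are digit-heavy
// or contain no vowels tend to be machine-generated.
function looksRandom(localPart) {
  const digits = (localPart.match(/\d/g) || []).length;
  const vowels = (localPart.match(/[aeiou]/gi) || []).length;
  return localPart.length >= 10 && (digits / localPart.length > 0.4 || vowels === 0);
}

function scoreEmail(email) {
  if (!EMAIL_RE.test(email)) return "invalid";
  const local = email.split("@")[0];
  return looksRandom(local) ? "suspicious" : "ok";
}

console.log(scoreEmail("jane.doe@gmail.com"));    // → "ok"
console.log(scoreEmail("x8f3k29qpl0@gmail.com")); // → "suspicious"
console.log(scoreEmail("not-an-email"));          // → "invalid"
```

Treat “suspicious” as a signal for review or secondary verification, not an automatic block; real addresses occasionally look odd.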

Smart Form Validation: Phone

Phone validation goes far beyond just counting digits. Think about the logic of location first.

When someone enters a phone number with a New York area code but lists their address in California, that’s worth investigating.

But be careful with this approach – people move, they travel, and they keep their old numbers. The key is to use these mismatches as flags for further verification, not as automatic rejections.
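A sketch of that mismatch flag, using a deliberately tiny area-code map (a real system would use a complete dataset such as NANPA data). The return value is meant as a flag for review, never as an automatic rejection.

```javascript
// Hypothetical partial map of U.S. area codes to states.
const AREA_CODE_STATE = { "212": "NY", "213": "CA", "312": "IL" };

function phoneLocationMismatch(phone, state) {
  const digits = phone.replace(/\D/g, "");
  // Strip a leading country code "1" if present (11-digit numbers).
  const area = digits.length === 11 && digits.startsWith("1")
    ? digits.slice(1, 4)
    : digits.slice(0, 3);
  const expected = AREA_CODE_STATE[area];
  // Unknown area codes are not flagged; known mismatches are flags only.
  return expected !== undefined && expected !== state;
}

console.log(phoneLocationMismatch("(212) 555-0100", "CA")); // → true (NY code, CA address)
console.log(phoneLocationMismatch("+1 312 555 0199", "IL")); // → false
```

A flagged lead might get a confirmation email or a callback before it counts as qualified.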

The Art Of Smart Data Formatting

Data formatting isn’t just about making your database look neat. It’s about catching mistakes and fraud while making the form easy to complete for legitimate users.

Name fields are a perfect example.

While you want to catch obviously fake names like “asdfgh” or repeated characters, remember that real names come in an incredible variety of formats and styles.

Some cultures use single names, others have very long names, and some include characters that might look unusual to your system.
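A sketch of a conservative fake-name check that targets only obvious keyboard mashes and repeated characters, while leaving single, short, or accented names alone; the mash list is an illustrative assumption.

```javascript
// Conservative fake-name detector: flags only blatant junk input.
function nameLooksFake(name) {
  const n = name.trim().toLowerCase();
  if (n.length === 0) return true;
  // Same character repeated, e.g. "aaaaaa"
  if (/^(.)\1+$/.test(n)) return true;
  // Common keyboard-row mashes
  const mashes = ["asdf", "qwer", "zxcv", "hjkl"];
  return mashes.some((m) => n.includes(m));
}

console.log(nameLooksFake("asdfgh"));      // → true
console.log(nameLooksFake("bbbbbb"));      // → true
console.log(nameLooksFake("Madonna"));     // → false
console.log(nameLooksFake("José García")); // → false
```

Erring on the permissive side matters here: a false positive turns away a real lead, while a false negative costs you one junk row.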

Modify Your Google Ads Campaign Settings To Tackle Click Fraud

Google offers multiple campaign options to increase reach; on the downside, most of those options also invite more click fraud activity.

App Placements

Performance Max campaigns can place your ads across Google’s entire network, including in apps. While this broad reach can be powerful, it also opens the door to potential fraud.

The challenge is that you have limited control over where your ads appear, and some of these automatic placements can lead to wasted ad spend.

Kids’ games are often a major source of accidental and fraudulent clicks. These apps frequently have buttons placed near ad spaces, and children playing games can accidentally tap ads while trying to play.

What looks like engagement in your analytics is actually just frustrated kids trying to hit the “play” button.

Another issue comes from apps that use deceptive design to generate clicks. They might place clickable elements right where ads appear, or design their interface so users naturally tap where ads are located.

This isn’t always intentional fraud. Sometimes, it’s just poor app design, but it costs you money either way.

Unlike traditional campaigns, where you can easily exclude specific placements, Performance Max’s automation makes this more challenging.

The system optimizes for conversions, but it might not recognize that clicks from certain apps never lead to quality leads. By the time you spot the pattern, you’ve already spent money on these low-quality clicks.

Excluding app placements is a must for almost all advertisers. Very few advertisers benefit from app placements at all.

Partner And Display Network

Lead generation businesses face a unique challenge with Performance Max campaigns that ecommerce stores can largely avoid.

While ecommerce businesses can simply run Shopping-only campaigns and tap into high-intent product searches, lead gen businesses are stuck dealing with the full Performance Max package, including the often problematic Display Network.

The Display Network opens up your ads to a mass of websites, many of which might not be the quality placements you’d want for your business.

While Google tries to filter out bad actors, the display network still includes sites that exist primarily to generate ad clicks.

These sites might look legitimate at first glance, but they’re designed to encourage accidental clicks or attract bot traffic.

Some are specifically designed for server bot farms, as they run on expired domains and have no content besides ads.

Lead generation businesses don’t have this luxury. Their Performance Max campaigns typically run on all networks except shopping. This creates several problems:

  • The quality of clicks varies wildly. Someone might click your medical practice ad while trying to close a pop-up on a gaming site. They’ll never become a patient, but you still pay for that click.
  • Display placements can appear on sites that don’t match your brand’s professional image. Imagine a law firm’s ad showing up on a site full of questionable content – not ideal for building trust with potential clients.
  • Bot traffic and click farms often target display ads because they’re easier to interact with than shopping ads. You might see high click-through rates that look great until you realize none of these clicks are turning into leads.

All those are reasons to question PMax campaigns for lead gen, but that’s a decision every marketer has to make.

Advanced Google Ads Settings To Tackle Click Fraud

If the basics are implemented but there is still a higher amount of suspected click fraud, advanced solutions need to be implemented.

Besides excluding suspicious IP addresses, you can also build negative audiences.

The idea is to have a second success page for your lead generation form and only forward potential bots or fake sign-ups to this page.

To achieve that, your website needs to evaluate potential bots live during the sign-up process.

You can then set up a dedicated “bot pixel” on the second success page to send this audience’s data to Google.

Once enough data is retrieved, you can exclude this audience from your campaigns. This approach is a little trickier to implement but is worth the effort as those audience signals are of high quality if enough data is supplied.

Make sure the “bot pixel” fires only on the dedicated success page; otherwise, you risk mixing your audiences, which would render the system useless.
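
To make the evaluation step concrete, here is a minimal sketch of server-side bot scoring during sign-up. The signals, field names, and thresholds are illustrative assumptions, not a production-ready detection system:

```python
# Minimal sketch: score a sign-up for bot likelihood, then route suspected
# bots to the success page where the "bot pixel" fires. All signals and
# thresholds here are illustrative assumptions.

def bot_score(signup: dict) -> int:
    """Return a rough 0-100 bot-likelihood score for a sign-up."""
    score = 0
    # Hidden honeypot field was filled in: strong bot signal.
    if signup.get("company_website_hidden"):
        score += 60
    # Form submitted implausibly fast (seconds from page load to submit).
    if signup.get("seconds_to_submit", 999) < 3:
        score += 30
    # Known disposable email domains.
    domain = signup.get("email", "").rsplit("@", 1)[-1].lower()
    if domain in {"mailinator.com", "tempmail.com"}:
        score += 20
    return min(score, 100)

def success_page_for(signup: dict) -> str:
    """Route suspected bots to the page where the 'bot pixel' fires."""
    return "/thanks-internal" if bot_score(signup) >= 50 else "/thanks"
```

With routing like this in place, only suspected bots ever reach the page where the “bot pixel” fires, keeping the resulting Google audience clean.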

Filtering Fake Leads With Conditional Triggers

Another tracking-based strategy is to set up condition-based conversion tracking. Combined with hidden form fields, you can modify the conversion trigger not to send data if the hidden field was filled.

In that scenario, you filter bots out of conversion tracking and send back only real conversions to your campaign, thereby training the Google algorithm and bidding strategy only on real data.

You eliminate a majority of fake leads and traffic with this setup.
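
A minimal sketch of that filter, assuming a hidden honeypot field named “hp_field” (the field name and lead shapes are illustrative assumptions):

```python
# Sketch of condition-based conversion filtering with a hidden honeypot
# field. Humans never see the hidden field, so any value in it means the
# form was almost certainly filled by a bot.

def should_send_conversion(form_data: dict) -> bool:
    """Only report a conversion if the hidden honeypot field is empty."""
    return not form_data.get("hp_field")

leads = [
    {"email": "real@customer.com", "hp_field": ""},
    {"email": "bot@spam.net", "hp_field": "http://spam.example"},
]

# Only leads that pass the check are reported back as conversions.
real_conversions = [lead for lead in leads if should_send_conversion(lead)]
```

In practice this check lives in the tag manager trigger or the form handler, so the conversion event simply never fires for flagged submissions.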

Making Sign-Ups More Challenging To Improve Lead Quality

Another advanced strategy is to make the sign-up process a lot harder.

Tests have shown that bots often fail to complete much longer forms because they are usually trained on shorter, simpler forms that ask only for email, name, phone, and address.

Asking specific questions and working with dropdowns can dramatically increase lead quality. Note, however, that longer forms can also hurt the valid signup rate – a risk worth taking if you have to deal with bot and fraud traffic.

A fitting case was a car dealer I worked with. They had a form where people could offer their cars for sale and receive a price estimate.

The short version of the form had almost three times the signup rate, but it turned out later that many of those signups were spam or very low-quality leads.

A shorter form invites more spam simply because it is easy to complete. After switching to a longer form, signups dropped, but quality increased drastically.

The new form was almost 20 fields long, and potential clients had to upload pictures of their car.

It took a few minutes to finish the signup, but those who did were committed to doing business and open to discussing the sale, which also made it easier for the salespeople to follow up properly.

A Hard Truth About Lead Fraud

Let’s be honest: You can’t completely stop lead fraud. It’s like shoplifting in retail – you can reduce it, you can catch it faster, but you can’t eliminate it entirely.

The fraudsters are always getting smarter, and for every security measure we create, they’ll eventually find a way around it.

But here’s the good news: You don’t need perfect protection. What you need is a balanced approach that catches most of the bad leads while letting good ones through easily.

Think of it like running a store: You want security, but not so much that it scares away real customers.

The key is to layer your defenses. Use click fraud tools as your first line of defense, add smart form validation as your second, and keep a human eye on patterns as your final check.

Will some fake leads still get through? Yes. But if you can stop 90% of the fraud, you’re winning the battle.

Remember: Perfect is the enemy of good. Focus on making fraud expensive and difficult for the bad actors, while keeping your lead generation process smooth and simple for real prospects. That’s how you win in the long run.

More Resources:


Featured Image: BestForBest/Shutterstock

Local SEO Schema: A Complete Guide To Local Structured Data & Rich Results via @sejournal, @rio_seo

Structured data markup can work diligently behind the scenes to help your local business shine online.

It can add eye-catching rich results to your search results, like review stars, FAQs, and breadcrumbs, that grab attention and encourage more clicks.

Structured data uses the standardized Schema.org vocabulary to tell search engines – and even AI tools – exactly what your website is about, making it easier for customers to find you.

While it’s not a direct ranking boost, structured data plays a big role in making your business more visible, whether in traditional search results or AI-powered tools like Gemini or ChatGPT.

From a local ice cream shop to a hardware store, adding structured data can make a huge difference in boosting your local SEO and staying ahead in today’s ultra-saturated digital world. It often remains an untapped resource, despite its potential to significantly enhance your local SEO strategy.

This guide will equip you with actionable knowledge to use structured data markup to boost your local SEO and strengthen your visibility across search engines and AI platforms.

Why Does Schema Matter?

Structured data with schema conveys additional information to search engines so they can interpret and display your content more effectively, giving your business a competitive edge in search engine results pages (SERPs).

Google has consistently highlighted the importance of schema and structured data in delivering relevant, detailed information to users.

Implementing schema correctly can improve your visibility, attract more clicks, and even increase conversions.

Let’s clarify key terms related to schema and structured data markup.

Understanding Schema, Structured Data, Rich Results, And SERP Features

Local search marketers often use the terms “schema,” “schema markup,” and “structured data” interchangeably, but there are differences worth understanding.

Structured Data

Structured data is the format for organizing and describing information on a webpage. By implementing structured data markup on a page, you convey additional information and make it easier for search engines to accurately interpret your pages and display relevant snippets in SERPs.

Schema

Schema is an open-source, standardized vocabulary used to mark up structured data. Other vocabularies exist, but the search industry uses Schema.org, a collaborative initiative founded by Google, Bing, Yahoo, and Yandex in 2011.

This vocabulary enables webmasters to tag elements like business names, addresses, phone numbers, customer reviews, and services.

Pages using structured data with schema are eligible for rich results, which can significantly improve how your business appears in search results.

Rich Results

Rich results (also known as rich snippets) are enhanced search elements that provide more detailed and visually engaging information.

Rich results can also be referred to as “SERP features.” Examples include review stars, FAQs, and breadcrumbs.

Rich results not only improve click-through rates but also help your business stand out in competitive local search results.

SERP Features

A SERP feature is a specialized element on a search results page that provides extra information or functionality beyond standard results.

Examples include featured snippets, local packs, and knowledge panels. It is a broader category covering various elements, while a rich result enhances individual listings using structured data.

Why Structured Data Matters

Structured data is an integral part of any business’s local SEO strategy for myriad reasons. Let’s explore each reason in more depth.

Improved Search Engine Understanding

Structured data acts as a translator, turning your website’s content into a format that search engines can easily understand and classify.

This allows Google, large language models (LLMs), and other engines to identify key information such as your business hours, location, services, and customer ratings.

The better search engines understand your site, the more likely they are to display relevant information to users.

Enhanced SERP Visibility

Rich results generated from structured data are more visually appealing than standard search results.

For example, a local bakery using schema markup might appear with review stars, a photo, operating hours, and a “Place an Order” button directly in the SERPs.

This enhanced visibility can drive more traffic to your site and attract higher-quality leads.

Increased Click-Through Rates

Pages featuring rich results typically enjoy higher click-through rates (CTRs) compared to those with standard results.

By giving users detailed information upfront – such as pricing, availability, or reviews – you make it easier for them to decide to engage with your business.

Competitive Advantage

In saturated local markets, structured data markup can differentiate your business from competitors.

If your competitor’s listing only shows basic details while yours features rich elements like sitelinks or a star rating, potential customers are more likely to click on your result.

Voice Search Optimization

As voice search grows in popularity, structured data becomes even more important.

Devices like Google Assistant rely heavily on schema to deliver concise, accurate answers to voice queries.

For example, adding a “FAQ” schema to your site can make your business the top result when users ask questions like, “Where’s the best coffee shop near me?”

The Role Of Structured Data In Local SEO

As evidenced above, structured data serves as a vital tool for local businesses, helping search engines understand and present your information more effectively.

For businesses aiming to improve visibility in local search results, structured data provides an opportunity to display essential details in a highly appealing format.

With schema, businesses can highlight critical information such as:

  • Business hours, including holiday schedules.
  • Customer reviews and ratings.
  • Location details with maps and directions.
  • Product pricing and availability.
  • Events and promotions.

For example, a local bakery could use structured data to feature customer reviews, a “Place an Order” button, and seasonal promotions.

An event venue might showcase upcoming events with dates, times, and ticket links, making it easier for potential customers to engage directly from search results.

Practical Benefits For Local Businesses

Here’s how structured data benefits local businesses in practice:

  • Restaurant Example: A family-owned diner uses schema to display operational hours, reviews, and menu links, reducing barriers for diners looking for quick information.
  • Retail Example: A local bookstore features event details, such as upcoming author signings, directly in search results to attract customers.
  • Service Example: A home improvement company highlights service areas and customer testimonials, building credibility and attracting clicks from targeted local users.

These enhancements create a competitive edge by presenting detailed and relevant information before the customer even clicks on your website.

Data And Google Business Profile

Structured data on a location page doesn’t directly affect Google Business Profile (GBP) features like the Map Pack or reviews, but it enhances organic search features, such as rich results, by improving how search engines interpret your website.

While schema doesn’t directly impact GBP rankings, it complements them by ensuring consistent, accurate data across platforms, boosting credibility and visibility.

Including details like address, hours, and services in structured data helps Google associate your site with your GBP listing and can even fill gaps in unclaimed profiles.

Structured Data And Local Ranking Signals

Structured data is not a direct ranking signal in search engine algorithms, as confirmed by Google representatives like John Mueller.

However, it is essential for boosting a website’s visibility and engagement, both of which can impact search rankings.

By organizing information for easy interpretation, structured data improves how content appears in search results, encouraging clicks and interaction.

How Structured Data Impacts AI Results For Local Brands

The rise of AI in search engines and virtual assistants has redefined how structured data impacts digital visibility.

Once primarily a tool for helping search crawlers understand webpage content, structured data now plays a vital role in ensuring local landing pages perform well in AI-driven platforms like Gemini, Bing Chat, ChatGPT, and voice assistants such as Amazon Alexa and Google Assistant.

Structured Data: The Foundation For AI Optimization

Structured data is essential for AI systems like ChatGPT, helping them deliver accurate and relevant information.

Local landing pages using structured data, such as LocalBusiness or GeoCoordinates schema, provide a framework that AI can easily process for precise results.

For instance, structured data defining a business’s address, hours, and reviews allows AI platforms to seamlessly integrate this information into conversations.

Key Benefits Of Structured Data For AI

  1. Improved Contextual Understanding: Structured data helps AI systems understand relationships between key entities on a page. For example, linking a business’s name, address, and service area allows AI to provide more accurate answers for local queries like “electrician near me” or “top-rated gyms in San Diego.”
  2. Enhanced Rich Results: AI tools prioritize structured data to create detailed rich results. A local landing page with Review and AggregateRating schema can lead to AI displaying customer ratings and reviews directly in search results, fostering trust and engagement.
  3. Voice Search Optimization: Structured data enables voice assistants to deliver precise answers. For example, a local restaurant with schema data about its menu and hours will yield accurate responses to queries like “What time does Joe’s Diner open?”
  4. AI-Powered Features Integration: AI models like Google’s Search Generative Experience (SGE) synthesize content into conversational summaries. Local pages with detailed markup are more likely to be included in these overviews, giving businesses better visibility in AI-driven search environments.
Screenshot from search, Google, January 2025 – Brand Retailer Local Page AI Overview Example (webpages are utilizing advanced schema)

Localized Search Benefits Of Structured Data

AI search systems increasingly focus on localization, making structured data essential for businesses targeting specific geographic areas.

Key schema types that enhance localization include:

  • GeoCoordinates Schema: Ensures precise location information, allowing AI to integrate it into map-based results.
  • LocalBusiness Schema: Supplies essential business details like name, hours, and services offered.
  • Event Schema: Highlights local events and activities directly tied to the user’s location and query.

Practical Steps To Implement Structured Data For Local Pages

Structured data is essential for local business websites aiming to improve visibility in search engine results.

While many local sites have basic structured data enabled, implementing detailed and well-validated markup can significantly enhance search engine performance and qualify pages for rich results.

Below is a comprehensive guide to applying schema markup effectively.

Step 1: Select The Best Schema.org Category

Choosing the appropriate Schema.org category is critical for ensuring an accurate representation of your business in search results.

Schema.org provides various categories specifically tailored for local businesses. For example:

  • Ice Cream Shops: Use schema.org/IceCreamShop
  • Hardware Stores: Use schema.org/HardwareStore

If no specific category exists for your business, use the general schema.org/LocalBusiness.

Additionally, if you’re technically inclined, you can propose new categories via the Schema.org GitHub forum.

Screenshot from schema.org, January 2025 – Recommended Local Business Schema for a Hardware Store

Step 2: Implement Required Schema Properties

After selecting the correct category, include the following required schema properties to ensure validation and avoid disqualification from rich results:

  • url: The URL of the landing page.
  • name: Name of the business.
  • openingHours: Business operating hours.
  • telephone: Business contact number.
  • image: A relevant image (e.g., storefront).
  • logo: A link to your business logo.
  • address: Business address visible on the landing page.
  • geo: Geographical coordinates of your business.
  • areaServed: The service area, preferably specified as a ZIP code.
  • mainContentOfPage: The primary content of your landing page.
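
The required properties above map onto a JSON-LD block like the following Python sketch. All business details are placeholder values; embed the resulting JSON on the page inside a script tag of type application/ld+json:

```python
import json

# Example LocalBusiness JSON-LD covering the required properties from
# Step 2. Every business detail below is a placeholder for illustration.
local_business = {
    "@context": "https://schema.org",
    "@type": "HardwareStore",
    "url": "https://www.example-hardware.com/locations/springfield",
    "name": "Example Hardware - Springfield",
    "openingHours": "Mo-Sa 08:00-18:00",
    "telephone": "+1-555-010-0000",
    "image": "https://www.example-hardware.com/img/storefront.jpg",
    "logo": "https://www.example-hardware.com/img/logo.png",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
    },
    "geo": {"@type": "GeoCoordinates", "latitude": 39.7817, "longitude": -89.6501},
    "areaServed": "62701",
    "mainContentOfPage": "Hardware store serving Springfield since 1985.",
}

# The page embeds this string inside <script type="application/ld+json">.
json_ld = json.dumps(local_business, indent=2)
```

The resulting JSON is what the validators in Step 5 should be run against.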

Step 3: Add Highly Recommended Schema Properties

These properties are not required but are highly recommended for enhancing visibility:

  • review: A review of your business (only if the local landing page has visible reviews).
  • aggregateRating: The overall rating based on multiple reviews. Ensure compliance with Google’s Review Rich Results guidelines.
  • FAQPage: Mark up FAQ sections with this schema to appear as FAQ rich results.
  • alternateName: Alternative names for your business, e.g., “Acme Inc.” vs. “Acme Stores.”
  • sameAs: Links to third-party profiles like Facebook, YouTube, or Wikipedia.
  • hasMap: A link to your business’s location on Google Maps.
  • breadcrumb: Structured navigation schema to improve rich results in SERPs.
  • department: Internal departments or services within your business.
  • priceRange: A general indicator of your pricing, such as “$$$.”
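
Building on the required properties from Step 2, the recommended properties can be merged into the same JSON-LD object. A sketch with placeholder values:

```python
# A few of the recommended properties, with placeholder values, merged
# into a base LocalBusiness object built as in Step 2.
recommended_extras = {
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "87",
    },
    "alternateName": "Acme Stores",
    "sameAs": [
        "https://www.facebook.com/example",
        "https://www.youtube.com/@example",
    ],
    "hasMap": "https://maps.google.com/?cid=1234567890",
    "priceRange": "$$",
}

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Acme Inc.",
}
local_business.update(recommended_extras)
```

Because review and aggregateRating markup must reflect reviews actually visible on the page, only add those properties where the reviews appear.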

Step 4: Explore Advanced Schema Types

For businesses seeking even more advanced features, consider these schema types:

  • SearchAction: Powers what was formerly known as the sitelinks search box, letting users perform site-specific searches directly from the search engine results page (SERP), enhancing engagement and accessibility.
  • additionalType: Defines additional topical relevance, often using Wikipedia categories. For example, a sporting goods store might use the Wikipedia page for Sports Equipment.
  • headline: Helps local businesses optimize key pages – such as service offerings, promotions, and blog posts – by providing a clear, structured title that improves visibility in search results.
  • alternativeHeadline: Allows local businesses to add a secondary title variation, making content more discoverable for different search terms and customer queries related to local services.
  • significantLink: Highlights key pages that matter most for a local business, such as appointment booking, contact pages, or location-specific services, improving navigation and SEO.
  • contentLocation: Specifies the geographic area a business serves, helping search engines associate its services with a specific city or region, boosting local search rankings.

Step 5: Validate Your Schema Markup

Proper validation is critical for ensuring your structured data qualifies for rich results. Google provides several tools for this purpose:

  • Schema.org Structured Data Validator: Tests structured data directly by pasting your code into the tool. It flags both errors and warnings. While errors must be fixed, warnings are less critical and may not affect rich results.
Screenshot from Schema Markup Validator, January 2025 – Validated Schema Example (validated with no errors/warnings)
  • Rich Results Test: Google’s official tool to preview which rich results can be generated by your structured data.
  • Google Search Console Enhancement Reports: Monitors structured data across your site and provides enhancement reports, highlighting pages with errors or warnings. Notifications from Search Console should be addressed promptly to maintain performance.
    Screenshot from Google Search Console, January 2025 – Google Search Console Enhancements Reporting Example
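
Before pasting markup into the validators above, a quick local sanity check can catch missing required properties. A minimal sketch – the required list mirrors Step 2 of this guide and is not an official Google specification:

```python
import json

# Required LocalBusiness properties from Step 2 of this guide
# (an editorial checklist, not an official Google specification).
REQUIRED = {
    "url", "name", "openingHours", "telephone", "image",
    "logo", "address", "geo", "areaServed", "mainContentOfPage",
}

def missing_required(json_ld: str) -> set:
    """Return required properties absent from a JSON-LD string."""
    data = json.loads(json_ld)
    return REQUIRED - set(data)

# An incomplete snippet: only name and url are present.
snippet = (
    '{"@context": "https://schema.org", "@type": "LocalBusiness", '
    '"name": "Acme Stores", "url": "https://acme.example"}'
)
```

This does not replace Google’s Rich Results Test; it only catches obvious omissions before you get there.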

Step 6: Measure Rich Results Performance

Tracking your rich results’ performance helps you understand the impact of your schema implementation.

Third-party tools like Semrush offer “SERP feature” reports that show the aggregate rich results your site is earning. This data can be used to identify further optimization opportunities.

Screenshot from Semrush, Semrush SERP Features Trend Example

You Can’t Go Wrong With Implementing Good Structured Data

Adding structured data to your location pages is a powerful way to enhance local SEO and improve how search engines and AI systems display your business.

Structured data is especially important for AI, as it helps models like ChatGPT and search assistants better understand and showcase your business details.

It also ensures your website’s information aligns with your Google Business Profile, even if your listing is incomplete or unclaimed.

By making key information easy to find, structured data benefits both AI systems and customers.

With better visibility, higher click-through rates, and a stronger online presence, schema markup is a must for local businesses. Add it to your location pages today to stand out and connect with more customers.

Key Takeaways

  • Selecting the right Schema.org category is crucial for accurate business representation.
  • Implement required and recommended schema properties to qualify for rich results.
  • Validate your structured data using tools like Google’s Rich Results Test and the Schema.org Structured Data Markup Validator.
  • Monitor performance through Google Search Console and third-party tools.

By following these steps, local businesses can maximize the visibility and effectiveness of their structured data, ultimately driving more traffic and engagement through enhanced search results.

Special thanks to Chad Klingensmith, Sr. SEO Strategist at Rio SEO, for his extensive contributions to this article. His in-depth knowledge of structured data ensures the accuracy and relevance of the insights shared here.

More Resources:


Featured Image: pixadot.studio/Shutterstock