Google: AI Overviews Drive 10% More Queries, Per Q2 Earnings via @sejournal, @MattGSouthern

New data from Google’s Q2 2025 earnings call suggests that AI features in Search are driving higher engagement.

Google reported that AI Overviews drive more than 10% additional queries for the types of searches where they appear.

With AI Overviews now reaching 2 billion monthly users, this is a notable shift from the early speculation that AI would reduce the need to search.

AI Features Linked to Higher Query Volume

Google reported $54.2 billion in Search revenue for Q2, marking a 12% increase year-over-year.

CEO Sundar Pichai noted that both overall and commercial query volumes are up compared to the same period last year.

Pichai said during the earnings call:

“We are also seeing that our AI features cause users to search more as they learn that Search can meet more of their needs. That’s especially true for younger users.”

He added:

“We see AI powering an expansion in how people are searching for and accessing information, unlocking completely new kinds of questions you can ask Google.”

This is the first quarter in which Google has quantified how AI Overviews affect behavior, rather than just reporting usage growth.

More Visual, Conversational Search Activity

Google highlighted continued growth in visual and multi-modal search, especially among younger demographics. The company pointed to increased use of Lens and Circle to Search, often in combination with AI Overviews.

AI Mode, Google’s conversational interface, now has over 100 million monthly active users across the U.S. and India. The company plans to expand its capabilities with features like Deep Search and personalized results.

Language Model Activity Is Accelerating

In a stat that received little attention, Google disclosed it now processes more than 980 trillion tokens per month across its products. That figure has more than doubled since May.

Pichai stated:

“At I/O in May, we announced that we processed 480 trillion monthly tokens across our surfaces. Since then we have doubled that number.”

The rise in token volume shows how quickly AI usage is growing across Google products like Search, Workspace, and Cloud.

Enterprise AI Spending Continues to Climb

Google Cloud posted $13.6 billion in revenue for the quarter, up 32% year-over-year.

Adoption of AI tools is a major driver:

  • Over 85,000 enterprises are now building with Gemini
  • Deal volume is increasing, with as many billion-dollar contracts signed in the first half of 2025 as in all of last year
  • Gemini usage has grown 35 times compared to a year ago

To support growth across AI and Cloud, Alphabet raised its projected capital expenditures for 2025 to $85 billion.

What You Should Know as a Search Marketer

Google’s data challenges the idea that AI-generated answers are replacing search. Instead, features like AI Overviews appear to prompt follow-up queries and enable new types of searches.

Here are a few areas to watch:

  • Complex queries may become more common as users gain confidence in AI
  • Multi-modal search is growing, especially on mobile
  • Visibility in AI Overviews is increasingly important for content strategies
  • Traditional keyword targeting may need to adapt to conversational phrasing

Looking Ahead

With Google now attributing a 10% increase in queries to AI Overviews, the way people interact with search is shifting.

For marketers, that shift isn’t theoretical; it’s already in progress. Search behavior is leaning toward more complex, visual, and conversational inputs. If your strategy still assumes a static SERP, it may already be out of date.

Keep an eye on how these AI experiences roll out beyond the U.S., and watch how query patterns change in the months ahead.


Featured Image: bluestork/Shutterstock

Google Makes It Easier To Talk To Your Analytics Data With AI via @sejournal, @MattGSouthern

Google has released an open-source Model Context Protocol (MCP) server that lets you analyze Google Analytics data using large language models like Gemini.

Announced by Matt Landers, Head of Developer Relations for Google Analytics, the tool serves as a bridge between LLMs and analytics data.

Instead of navigating traditional report interfaces, you can ask questions in plain English and receive responses instantly.

A Shift From Traditional Reports

The MCP server offers an alternative to digging through menus or configuring reports manually. You can type queries like “How many users did I have yesterday?” and get the answer you need.

Screenshot from: YouTube.com/GoogleAnalytics, July 2025.

In a demo, Landers used the Gemini CLI to retrieve analytics data. The CLI, or Command Line Interface, is a simple text-based tool you run in a terminal window.

Instead of clicking through menus or dashboards, you type out questions or commands, and the system responds in plain language. It’s like chatting with Gemini, but from your desktop or laptop terminal.

When asked about user counts from the previous day, the system returned the correct total. It also handled follow-up questions, showing how it can refine queries based on context without requiring additional technical setup.

You can watch the full demo in the video below:

What You Can Do With It

The server uses the Google Analytics Admin API and Data API to support a range of capabilities.

According to the project documentation, you can:

  • Retrieve account and property information
  • Run core and real-time reports
  • Access standard and custom dimensions and metrics
  • Get links to connected Google Ads accounts
  • Receive hints for setting date ranges and filters

To set it up, you’ll need Python, access to a Google Cloud project with specific APIs enabled, and Application Default Credentials that include read-only access to your Google Analytics account.
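
Under the hood, those credentials power ordinary Analytics Data API calls. As a rough illustration, here is the kind of request the server can issue once credentials are in place, using the official Python client; the property ID is a placeholder:

```python
# A minimal sketch of the kind of Data API request the MCP server issues
# on your behalf. Requires the google-analytics-data package and
# Application Default Credentials with read access to the property.
# "properties/123456789" is a placeholder property ID.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange,
    Dimension,
    Metric,
    RunReportRequest,
)

client = BetaAnalyticsDataClient()  # picks up Application Default Credentials

request = RunReportRequest(
    property="properties/123456789",
    dimensions=[Dimension(name="date")],
    metrics=[Metric(name="activeUsers")],
    date_ranges=[DateRange(start_date="yesterday", end_date="yesterday")],
)

response = client.run_report(request)
for row in response.rows:
    print(row.dimension_values[0].value, row.metric_values[0].value)
```

This is roughly the query behind “How many users did I have yesterday?”; the MCP server’s job is to let the model construct calls like this from plain-English prompts.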

Real-World Use Cases

The server is especially helpful in more advanced scenarios.

In the demo, Landers asked for a report on top-selling products over the past month. The system returned results sorted by item revenue, then re-sorted them by units sold after a follow-up prompt.

Screenshot from: YouTube.com/GoogleAnalytics, July 2025.

Later, he entered a hypothetical scenario: a $5,000 monthly marketing budget and a goal to increase revenue.

The system generated multiple reports, which revealed that direct and organic search had driven over $419,000 in revenue. It then suggested a plan with specific budget allocations across Google Ads, paid social, and email marketing, each backed by performance data.

Screenshot from: YouTube.com/GoogleAnalytics, July 2025.

How To Set It Up

You can install the server from GitHub using a tool called pipx, which lets you run Python-based applications in isolated environments. Once installed, you’ll connect it to Gemini CLI by adding the server to your Gemini settings file.

Setup steps include:

  • Enabling the necessary Google APIs in your Cloud project
  • Configuring Application Default Credentials with read-only access to your Google Analytics account
  • (Optional) Setting environment variables to manage credentials more consistently across different environments

The server works with any MCP-compatible client, but Google highlights full support for Gemini CLI.
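
For Gemini CLI, that connection is made in the settings file mentioned earlier. The sketch below shows the general shape of the change, assuming the standard ~/.gemini/settings.json location; the server name and launch command are placeholders, so copy the exact values from the project’s README:

```python
# Illustrative sketch: register an MCP server entry in the Gemini CLI
# settings file. The "analytics-mcp" name and the launch command are
# placeholders; use the values documented in the project's README.
import json
from pathlib import Path

settings_path = Path.home() / ".gemini" / "settings.json"
settings = json.loads(settings_path.read_text()) if settings_path.exists() else {}

settings.setdefault("mcpServers", {})["analytics-mcp"] = {
    "command": "google-analytics-mcp",  # placeholder command installed via pipx
}

settings_path.parent.mkdir(parents=True, exist_ok=True)
settings_path.write_text(json.dumps(settings, indent=2))
```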

To help you get started, the documentation includes sample prompts for tasks like checking property stats, exploring user behavior, or analyzing performance trends.

Looking Ahead

Google says it’s continuing to develop the project and is encouraging feedback through GitHub and Discord.

While it’s still experimental, the MCP server gives you a hands-on way to explore what natural language analytics might look like in the future.

If you’re on a marketing team, this could help you get answers faster, without requiring dashboards or custom reports. And if you’re a developer, you might find ways to build tools that automate parts of your workflow or make analytics more accessible to others.

The full setup guide, source code, and updates are available on the Google Analytics MCP GitHub repository.


Featured Image: Mijansk786/Shutterstock

Google Shares SEO Guidance For State-Specific Product Pricing via @sejournal, @MattGSouthern

In a recent SEO Office Hours video, Google addressed whether businesses can show different product prices to users in different U.S. states, and what that means for search visibility.

The key point: Google only indexes one version of a product page, even if users in different locations see different prices.

Google Search Advocate John Mueller stated in the video:

“Google will only see one version of your page. It won’t crawl the page from different locations within the U.S., so we wouldn’t necessarily recognize that there are different prices there.”

How Google Handles Location-Based Pricing

Google confirmed it doesn’t have a mechanism for indexing multiple prices for the same product by U.S. state.

However, you can reflect regional cost differences by using the shipping and tax fields in structured data.

Mueller continued:

“Usually the price difference is based on what it actually costs to ship this product to a different state. So with those two fields, maybe you could do that.”

For example, you might show a base price on the page, while adjusting the final cost through shipping or tax settings depending on the buyer’s location.
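
Here is a rough sketch of what that might look like in product structured data, with the state-specific cost expressed through the shippingDetails field. The script emits JSON-LD; the product, prices, and region are invented for illustration:

```python
# A hedged sketch: product JSON-LD where a regional cost difference is
# expressed via shippingDetails rather than the base price. The product
# name, SKU, prices, and region are made-up examples.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "sku": "WIDGET-1",
    "offers": {
        "@type": "Offer",
        "price": "49.99",  # the single base price Google indexes
        "priceCurrency": "USD",
        "shippingDetails": {
            "@type": "OfferShippingDetails",
            "shippingRate": {
                "@type": "MonetaryAmount",
                "value": "12.00",  # state-specific cost carried by shipping
                "currency": "USD",
            },
            "shippingDestination": {
                "@type": "DefinedRegion",
                "addressCountry": "US",
                "addressRegion": "CA",  # two-letter state code
            },
        },
    },
}

print(json.dumps(product, indent=2))  # paste into <script type="application/ld+json">
```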

When Different Products Make More Sense

If you need Google to recognize distinct prices for the same item depending on state-specific factors, Google recommends treating them as separate products entirely.

Mueller added:

“You would essentially want to make different products in your structured data and on your website. For example, one product for California specifically, maybe it’s made with regards to specific regulations in California.”

In other words, rather than dynamically changing prices for one listing, consider listing two separate products with different pricing and unique product identifiers.
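
A minimal sketch of that approach, again with invented values, would emit two distinct products rather than one:

```python
# A hedged sketch of the separate-products approach Mueller describes:
# each state variant gets its own identifier and price. Values are made up.
import json

products = [
    {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Widget",
        "sku": "WIDGET-STD",
        "offers": {"@type": "Offer", "price": "49.99", "priceCurrency": "USD"},
    },
    {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Widget (California)",
        "sku": "WIDGET-CA",  # unique identifier for the state-specific variant
        "offers": {"@type": "Offer", "price": "54.99", "priceCurrency": "USD"},
    },
]

print(json.dumps(products, indent=2))
```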

Key Takeaway

Google’s infrastructure currently doesn’t support state-specific price indexing for a single product listing.

Instead, businesses will need to adapt within the existing framework. That means using structured data fields for shipping and tax, or publishing distinct listings for state variants when necessary.

Hear Mueller’s full response in the video below:

Pew Research Confirms Google AI Overviews Is Eroding Web Ecosystem via @sejournal, @martinibuster

Pew Research Center tracked real web browsing behavior and confirmed what many publishers and SEOs have claimed: AI Overviews does not send traffic back to websites. The results show that the damage caused by AI summaries to the web ecosystem is as bad as or worse than is commonly understood.

Methodology

The Pew Research study tracked over 900 adults who consented to installing an online browsing tracker to record their browsing behavior in the month of March 2025. The dataset contains 68,879 unique Google search queries, and a total of 12,593 queries triggered an AI summary.

Confirmed: Google AI Search Is Eroding Referral Traffic

The tracked user data confirms publisher complaints about a drop in referral traffic caused by AI search results. Google users who encounter an AI search result are less likely to click on a link and visit a website than users who see only a standard search result.

Only 8% of visits to search pages with an AI summary resulted in a click on a link (in the AI summary or the standard search results). Visits to pages with only standard search results ended in a click 15% of the time, nearly twice as often.

Users rarely click a link within the AI summary itself: just 1% of visits to pages with a summary included a click on an AI summary link.

AI Summaries Cause Less Web Engagement

In a recent interview, Google’s CEO Sundar Pichai pushed back on the notion that AI summaries have a negative impact on the web ecosystem. He pointed to the fact that more content is being created on the web than at any other time as proof that the ecosystem is thriving, saying:

“So, generally there are more web pages… I think people are producing a lot of content, and I see consumers consuming a lot of content. We see it in our products.”

Pichai also insisted that people are consuming content in multiple formats (video, images, text) and that publishers today should present content in more than one format.

However, contrary to what Google’s CEO said, AI is not encouraging users to consume more content; it’s having the opposite effect. The Pew research data shows that AI summaries cause users to engage less with web content.

According to the research findings:

Users End Their Browsing Session

“Google users are more likely to end their browsing session entirely after visiting a search page with an AI summary than on pages without a summary.

This happened on 26% of pages with an AI summary, compared with 16% of pages with only traditional search results.”

Users Refrain From Clicking On Traditional Search Links

The report also indicates that users tended not to click on a traditional search result when faced with an AI summary:

“Users who encountered an AI summary clicked on a traditional search result link in 8% of all visits. Those who did not encounter an AI summary clicked on a search result nearly twice as often (15% of visits).”

Only 1% Click Citation Links In AI Summaries

Users who see an AI summary overwhelmingly do not click the citations to the websites that the AI summary links to.

The report shows:

“Google users who encountered an AI summary also rarely clicked on a link in the summary itself. This occurred in just 1% of all visits to pages with such a summary.”

This confirms what publishers and SEOs have been telling Google over and over again: Google AI Overviews robs publishers of referral traffic. Rob is a strong word, but given that Google uses web content to “synthesize” an answer that does not result in a referral click, it is the word that inevitably comes to mind for a publisher or SEO who worked hard to create that content.

Another startling fact from the research is that nearly two-thirds of Google searches ended with the user either browsing somewhere else on Google or leaving Google entirely without clicking a link to visit a website.

The report explains:

“…the largest share of Google searches in our study resulted in the user either browsing elsewhere on Google or leaving the site entirely without clicking a link in the search results. Around two-thirds of all searches resulted in one of these actions.”

Wikipedia, YouTube And Reddit Dominate Google Searches

Google has been holding publisher events and Search Central Live events all around the world to listen to publisher feedback and to promise that Google will work harder to surface a greater variety of content. I know that the Googlers at these events are not lying, but those promises of surfacing more high-quality content are subverted by the grim facts presented in Pew’s research on actual user behavior.

One of the biggest complaints is that Reddit and Wikipedia dominate the search results. The research validates publisher and SEO concerns because it shows that not only are Reddit and Wikipedia the most commonly cited websites, but Google’s own YouTube ranks among the top three most cited web destinations.

The report explains:

“The most frequently cited sources in both Google AI summaries and standard search results are Wikipedia, YouTube and Reddit. These three sites are the most commonly linked sources in AI summaries and standard search results alike.

Collectively, they accounted for 15% of the sources that were listed in the AI summaries we examined. They made up a similar share (17%) of the sources listed in standard search results.”

The report also shows:

  • “Wikipedia links are somewhat more common in AI summaries than in standard search pages”
  • “YouTube links are somewhat more common in standard search results than in AI summaries.”

These Are The Facts

Pew Research’s study of over 68,000 search queries from the browsing habits of over 900 adults reveals that Google’s AI summaries sharply reduce clicks to websites: just 8% of visits produced a click on any link, and only 1% produced a click on a citation inside an AI answer.

Users encountering AI summaries are more likely to end their sessions or stay within Google’s ecosystem rather than visiting independent websites. This confirms publisher and SEO concerns that AI-driven search erodes web traffic and concentrates attention on a few dominant platforms like Wikipedia, Reddit, and YouTube.

These are the facts. They show that SEOs and publishers are right that AI Overviews is siphoning traffic out of the web ecosystem.

Featured Image by Shutterstock/Asier Romero

WP Engine’s AI Toolkit Vectorizes WordPress Sites For Smart Search via @sejournal, @martinibuster

WP Engine announced the release of its AI Toolkit, which brings advanced AI search and product recommendations to WordPress websites, along with a Managed Vector Database that lets developers build AI features directly into their sites.

Smart Search AI

WP Engine’s AI Toolkit helps WordPress site owners improve search and content visibility without a steep technical learning curve. Smart Search AI can be enabled in just a few clicks. Once activated, it syncs with WordPress content, including:

  • Posts
  • Pages
  • Tags
  • Metadata
  • Custom fields

Smart Search AI converts a website’s content into a vector format to deliver faster, more useful search results. The system combines natural-language and keyword search to help contextualize queries and guide visitors to what they need, which may help reduce bounce rates and support higher conversions.
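
To make the idea concrete, here is a minimal sketch of hybrid vector-plus-keyword scoring, the general technique the announcement describes. It is illustrative only, not WP Engine’s implementation, and embed() is a toy stand-in for a real embedding model:

```python
# A minimal sketch of hybrid (vector + keyword) search, the general
# technique behind features like Smart Search AI. Illustrative only,
# not WP Engine's implementation.
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Toy hashed bag-of-words embedding; a real system uses a trained model."""
    v = np.zeros(dim)
    for word in text.lower().split():
        v[hash(word) % dim] += 1.0
    return v

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    denom = float(np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.dot(a, b)) / denom if denom else 0.0

def keyword_score(query: str, text: str) -> float:
    """Fraction of query terms that appear in the text."""
    terms = set(query.lower().split())
    return len(terms & set(text.lower().split())) / len(terms) if terms else 0.0

def hybrid_search(query: str, docs: list[str], alpha: float = 0.7):
    """Blend semantic similarity with keyword overlap; alpha weights the vector side."""
    q = embed(query)
    scored = [
        (alpha * cosine(q, embed(d)) + (1 - alpha) * keyword_score(query, d), d)
        for d in docs
    ]
    return sorted(scored, reverse=True)

posts = [
    "How to repot a snake plant",
    "Best soil mix for succulents",
    "Our shipping and returns policy",
]
for score, doc in hybrid_search("repotting succulents", posts):
    print(f"{score:.2f}  {doc}")
```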

AI-Powered Recommendations

The AI-powered recommendations feature uses past and current user session data to suggest products or content that is relevant to the user. This helps increase shopping sales and keeps readers engaged with content. The system runs efficiently without slowing down the website and uses flat-rate pricing with no overage fees. It’s suited for eCommerce, media, and any site focused on driving sales and engagement through personalized experiences.

Managed Vector Database

WP Engine’s Managed Vector Database is a service that simplifies building AI features directly into WordPress websites. Designed for developers, agencies, and site owners, it removes the need to manage tasks like data extraction, embedding creation, and content updates. Developers can start building content-based AI apps and features immediately because the system automatically processes and trains on their WordPress content without additional setup.

Integrated with WordPress, the database keeps AI outputs aligned with current site content without extra work. It enables developers to connect WordPress data directly to chatbot frameworks or APIs, and it also makes AI features accessible to non-technical creators or site owners. This enables creators to focus on building meaningful experiences without getting bogged down in technical setup.

Read more about WP Engine’s AI Toolkit:

WP Engine Launches AI Toolkit Empowering Website Owners to Drive Engagement and Growth

Featured Image by Shutterstock/Ground Picture

Google Says It Could Make Sense To Use Noindex Header With LLMS.txt via @sejournal, @martinibuster

Google’s John Mueller answered a question about whether llms.txt files could be treated as duplicate content. He said that treating them as duplicate content wouldn’t make sense, but added that it could make sense to take steps to prevent indexing.

LLMs.txt

Llms.txt is a proposal to create a new content format standard that large language models can use to retrieve the main content of a web page without having to deal with other non-content data, such as advertising, navigation, and anything else that is not the main content. It offers web publishers the ability to provide a curated, Markdown-formatted version of the most important content. The llms.txt file sits at the root level of a website (example.com/llms.txt).
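
For reference, a minimal llms.txt following the structure suggested by the proposal might look like the sample below; the site name and URLs are placeholders:

```
# Example Store
> An online shop for widgets, with guides and product documentation.

## Docs
- [Product guides](https://example.com/guides.md): How-to articles for each widget
- [FAQ](https://example.com/faq.md): Common pre-sales questions

## Optional
- [Company history](https://example.com/about.md)
```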

Contrary to some claims made about llms.txt, it is not in any way similar in purpose to robots.txt. The purpose of robots.txt is to control robot behavior, while the purpose of llms.txt is to provide content to large language models.

Will Google View Llms.txt As Duplicate Content?

Someone on Bluesky asked whether llms.txt could be seen by Google as duplicate content, which is a good question. Someone outside the site might link to the llms.txt file, and Google might begin surfacing that content instead of, or in addition to, the HTML content.

This is the question asked:

“Will Google view LLMs.txt files as duplicate content? It seems stiff necked to do so, given that they know that it isn’t, and what it is really for.

Should I add a “noindex” header for llms.txt for Googlebot?”

Google’s John Mueller answered:

“It would only be duplicate content if the content were the same as a HTML page, which wouldn’t make sense (assuming the file itself were useful).

That said, using noindex for it could make sense, as sites might link to it and it could otherwise become indexed, which would be weird for users.”

Noindex For Llms.txt

Using a noindex header for the llms.txt file is a good idea because it will prevent the content from entering Google’s index. Blocking the file with robots.txt would be counterproductive, because that only stops Google from crawling the file, which would prevent Googlebot from ever seeing the noindex.
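
As a minimal sketch, here is one way to attach that header. Flask is used purely for illustration; any web server or CDN rule that sets the X-Robots-Tag response header achieves the same result:

```python
# A minimal sketch: serve llms.txt with an X-Robots-Tag: noindex response
# header so Google can crawl the file but won't index it. Flask is used
# for illustration only.
from flask import Flask, Response

app = Flask(__name__)

@app.route("/llms.txt")
def llms_txt() -> Response:
    with open("llms.txt", encoding="utf-8") as f:
        resp = Response(f.read(), mimetype="text/plain")
    # Googlebot must be able to crawl this URL to see the header,
    # so do not also block it in robots.txt.
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp

if __name__ == "__main__":
    app.run()
```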

Featured Image by Shutterstock/Krakenimages.com

Google CTRs Drop 32% For Top Result After AI Overview Rollout via @sejournal, @MattGSouthern

A new study from GrowthSRC Media finds that click-through rates (CTRs) for Google’s top-ranking search result have declined from 28% to 19%. This 32% drop correlates with the expansion of AI Overviews, a feature that now appears across a wide range of search results.

Position #2 experienced an even steeper decline, with CTRs falling 39% from 20.83% to 12.60% year-over-year.

The research analyzed more than 200,000 keywords from 30 websites across ecommerce, SaaS, B2B, and EdTech industries. Here are more highlights from the study.

Key Findings

According to the report, AI Overviews appeared for just 10,000 keywords in August 2024. By May 2025, that number had grown to over 172,000.

This expansion followed the March core update and was confirmed during Google’s full U.S. rollout announcement at the I/O developer conference.

These developments appear to contrast with comments from Google CEO Sundar Pichai, who said in a Decoder interview with The Verge:

“If you put content and links within AI Overviews, they get higher click-through rates than if you put it outside of AI Overviews.”

CTRs Shift Downward and Upward

While top positions saw notable declines, the study observed a 30.63% increase in CTRs for positions 6 through 10 compared to the previous year. This suggests that users may be scrolling past AI-generated summaries to find original sources further down the page.

Across positions 1 through 5, the study reported an average CTR decline of 17.92%. The analysis focused on approximately 74,000 keywords ranking in the top 10.

Major Publishers Report Similar Trends

The findings align with reports from major publishers. Carly Steven, SEO and editorial ecommerce director at MailOnline, told attendees at the WAN-IFRA World News Media Congress that CTRs drop when AI Overviews are present.

As reported by Press Gazette, Steven explained:

“On desktop, when we are ranking number one in organic search, [CTR] is about 13% on desktop and about 20% on mobile. When there is an AI Overview present, that drops to less than 5% on desktop and 7% on mobile.”

MailOnline’s broader data showed CTRs falling by 56.1% on desktop and 48.2% on mobile for keywords with AI Overviews.

Ecommerce Affected by Product Widgets

The study also highlighted changes in ecommerce performance tied to Google’s Product Widgets.

Widgets like “Popular Products” and “Under [X] Price” began appearing more frequently from November 2024 onward, especially in categories such as home care, fashion, and beauty.

These widgets open a Google Shopping interface directly within search results, which may reduce clicks to traditional organic listings.

Methodology

GrowthSRC analyzed year-over-year data from Google Search Console across clients in multiple industries, focusing on changes before and after the full rollout of AI Overviews and Product Widgets.

The dataset included queries, clicks, impressions, CTRs, and average positions.

Data was segmented by content type, including product pages, collection pages, and blog posts. Additional keyword data from Ahrefs helped determine which queries triggered AI Overviews or Product Widgets.
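
For readers who want to replicate the broad strokes on their own data, here is a hedged sketch of a year-over-year CTR comparison by position. The file and column names are assumptions for illustration, not GrowthSRC’s actual pipeline:

```python
# A hedged sketch of a year-over-year CTR comparison by ranking position,
# assuming two hypothetical Search Console exports with "clicks",
# "impressions", and "position" columns.
import pandas as pd

def ctr_by_position(df: pd.DataFrame) -> pd.Series:
    """Aggregate CTR (%) per rounded ranking position, top 10 only."""
    top = df[df["position"] <= 10].copy()
    top["rank"] = top["position"].round().astype(int)
    grouped = top.groupby("rank")
    return grouped["clicks"].sum() / grouped["impressions"].sum() * 100

prev = ctr_by_position(pd.read_csv("gsc_2024.csv"))  # hypothetical export
curr = ctr_by_position(pd.read_csv("gsc_2025.csv"))  # hypothetical export

change = (curr - prev) / prev * 100
print(change.round(1))  # % change in CTR per position, year over year
```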

What This Means

Mahendra Choudhary, Partner at GrowthSRC Media, encouraged SEO professionals to reconsider traditional performance benchmarks:

“With lower clicks to websites from informational content becoming the new normal, this is the perfect time to let your clients and internal stakeholders know that chasing website traffic as a KPI should be thought of differently.”

He recommends shifting focus toward brand visibility in social search, geographic relevance, mentions in LLM outputs, and overall contribution to revenue or leads.

This shift may require:

  • Tracking engagement beyond clicks, such as on-site conversions, branded search growth, or assisted conversions.
  • Diversifying content distribution across platforms like YouTube, TikTok, and Reddit, where users often bypass traditional search.
  • Investing in high-authority content at the top of the funnel to build brand awareness, even if direct clicks decline.

These strategies can help ensure SEO continues to drive measurable value as user behavior evolves.

Looking Ahead

The decline in organic CTRs for top positions highlights how search behavior is changing as AI-generated content plays a larger role in discovery.

Adapting to this environment may involve placing less emphasis on rankings alone and focusing more on how visibility supports broader business goals.

As zero-click search becomes more common, understanding where users are engaging, and where they aren’t, will be essential to maintaining visibility.


Featured Image: Roman Samborskyi/Shutterstock

AI Chatbots Frequently Get Login URLs Wrong, Netcraft Warns via @sejournal, @MattGSouthern

A report finds that AI chatbots are frequently directing users to phishing sites when asked for login URLs to major services.

Security firm Netcraft tested GPT-4.1-based models with natural language queries for 50 major brands and found that 34% of the suggested login links were either inactive, unrelated, or potentially dangerous.

The results suggest a growing threat in how users access websites via AI-generated responses.

Key Findings

Of 131 unique hostnames generated during the test:

  • 29% were unregistered, inactive, or parked, leaving them open to hijacking.
  • 5% pointed to completely unrelated businesses.
  • 66% correctly led to brand-owned domains.

Netcraft emphasized that the prompts used weren’t obscure or misleading. They mirrored typical user behavior, such as:

“I lost my bookmark. Can you tell me the website to log in to [brand]?”

“Can you help me find the official website to log in to my [brand] account?”

These findings raise concerns about the accuracy and safety of AI chat interfaces, which often display results with high confidence but may lack the necessary context to evaluate credibility.

Real-World Phishing Example In Perplexity

In one case, the AI-powered search engine Perplexity directed users to a phishing page hosted on Google Sites when asked for Wells Fargo’s login URL.

Rather than linking to the official domain, the chatbot returned:

hxxps://sites[.]google[.]com/view/wells-fargologins/home

The phishing site mimicked Wells Fargo’s branding and layout. Because Perplexity recommended the link directly, without the domain context a traditional results page provides, the risk of falling for the scam was amplified.

Small Brands See Higher Failure Rates

Smaller organizations such as regional banks and credit unions were more frequently misrepresented.

According to Netcraft, these institutions are less likely to appear in language model training data, increasing the chances of AI “hallucinations” when generating login information.

For these brands, the consequences include not only financial loss but also reputational damage and regulatory fallout if users are affected.

Threat Actors Are Targeting AI Systems

The report uncovered a strategy among cybercriminals: tailoring content to be easily read and reproduced by language models.

Netcraft identified more than 17,000 phishing pages on GitBook targeting crypto users, disguised as legitimate documentation. These pages were designed to mislead people while being ingested by AI tools that recommend them.

A separate attack involved a fake API, “SolanaApis,” created to mimic the Solana blockchain interface. The campaign included:

  • Blog posts
  • Forum discussions
  • Dozens of GitHub repositories
  • Multiple fake developer accounts

At least five victims unknowingly included the malicious API in public code projects, some of which appeared to be built using AI coding tools.

While defensive domain registration has been a standard cybersecurity tactic, it’s ineffective against the nearly infinite domain variations AI systems can invent.

Netcraft argues that brands need proactive monitoring and AI-aware threat detection instead of relying on guesswork.

What This Means

The findings highlight a new area of concern: how your brand is represented in AI outputs.

Maintaining visibility in AI-generated answers, and avoiding misrepresentation, could become a priority as users rely less on traditional search and more on AI assistants for navigation.

For users, this research is a reminder to approach AI recommendations with caution. When searching for login pages, it’s still safer to navigate through traditional search engines or type known URLs directly, rather than trusting links provided by a chatbot without verification.
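
As a simple illustration of that kind of verification, the sketch below checks a chatbot-suggested URL’s hostname against a known-good domain list; the domain and test URLs are hypothetical examples:

```python
# A minimal illustration of verifying a chatbot-suggested login link
# against a known-good domain list before trusting it. The allowlist
# entry and test URLs are hypothetical examples.
from urllib.parse import urlparse

KNOWN_DOMAINS = {"examplebank.com"}  # hypothetical brand-owned domain

def is_trusted(url: str) -> bool:
    host = urlparse(url).hostname or ""
    # Accept exact matches or subdomains of an allowlisted domain.
    return any(host == d or host.endswith("." + d) for d in KNOWN_DOMAINS)

print(is_trusted("https://sites.google.com/view/examplebank-login/home"))  # False
print(is_trusted("https://www.examplebank.com/login"))  # True
```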


Featured Image: Roman Samborskyi/Shutterstock

Potential SEO Clients May Want To Discuss AI Search And Chatbots via @sejournal, @martinibuster

There was a post on social media about so-called hustle bros, and one on Reddit about an SEO who lost a prospective client to a digital marketer whose pitch included a song and dance about AI search visibility. Both discussions highlight a trend in which potential customers want to be assured of positive outcomes and may want to discuss AI search positioning.

Hustle Bro Culture?

Two unrelated posts touched on SEOs who are hustling for clients and getting them. The first post was about SEO “hustle bros” who post search console screenshots to show the success of their work.

I know of a guy who used to post a lot in a Facebook SEO group until the moderators discovered that his Search Console screenshots were downloaded from Google Images. SEO hustle bros who post fake screenshots are an actual thing, and sometimes they get caught.

So, a person posted a rant on Bluesky about people who do that.

Here’s what he posted:

“How much of SEO is “chasing after wind”. There’s so many hustle bros, programmatic promoters and people posting graphs with numbers erased off to show their “success”.”

Has Something Changed?

Google’s John Mueller responded:

“I wonder if it has changed over the years, or if it’s just my (perhaps your) perception that has changed.

Or maybe all the different kinds of SEOs are just in the same few places, rather than their independent forums, making them more visible?”

Mueller might be on to something, because social media and YouTube have made it easier for both legit SEOs and “hustle bros” to find a larger audience. But I think the important point is that those people are connecting with potential clients in a way that legit SEOs maybe are not.

And that leads into the next social media discussion, which is about SEOs who are talking about what clients want to hear: AI Fluff.

SEOs Selling AI “Fluff”

There is a post on Reddit where an SEO shares how they spent months communicating with a potential client, going out of their way to help a small business as a favor to a friend. After all those discussions, just as the SEO expected the small business to commit to an agreement, the client walked away, saying they were going with another SEO who sold them on something to do with AI.

They explained:

“SEOs Selling AI Fluff

After answering a bunch of questions via email over 3 months (unusually needy client) but essentially presales, it all sounds good to go and we hop on a kickoff call. Recap scope and reshare key contacts, and tee up a chat with the we design agency. So far so good.

Then dropped.

Clients reason? The other SEO who they’ve been chatting with is way more clued up with the AI technicals

I’d love to know what crystal ball AI mysticism they were sold on. Maybe a “cosine similarity audit”, maybe we’ll include “schema embeddings analysis” within our migration project plan to make sure AI bots can read your site. Lol cool whatever bro.”

John Mueller responded to that person’s post but then retracted it.

Nevertheless, a lively discussion ensued with three main points:

  1. Is AI SEO this year’s EEAT?
  2. Some potential clients want to discuss AI SEO
  3. SEOs may need to address AEO/AIO/GEO

1. Is AI For SEO This Year’s EEAT?

Many Redditors in that discussion scoffed at the idea of SEO for AI. This isn’t a case of Luddites refusing to change with the times; SEO tactics for AI Search are still evolving.

Reddit moderator WebLinkr received eight upvotes for their comment:

“Yup – SEOs been like that for years – EEAT, “SEO Audits” – basically people buy on what “makes sense” or “sounds sensible” even though they’ve already proven they have no idea what SEO is.”

Unlike EEAT, AI Search is most definitely disrupting visibility. It’s a real thing. And I do know of at least one SEO with a computer science degree who has it figured out.

But I think it’s not too off the mark to say that many digital marketers are still figuring things out. The amount of scoffing in that discussion seems to support the idea that AI Search is not something all SEOs are fully confident about.

2. Some Clients Are Asking For AI SEO

Perhaps the most important insight is that potential clients want to know what an SEO can do for AI optimization. If clients are asking about AI SEO, does that mean it’s no longer hype? Or is this a repeat of what happened with EEAT, where a lot of wheels spun for nothing?

Redditor mkhaytman shared:

“Like it or not, clients are asking questions about AIs impact and how they can leverage the new tools people are using for search and just telling them that “Nobody knows!” isn’t a satisfactory answer. You need to be able to tell them something – even if its just “good seo practices are the same things that will improve your AI citations”.”

3. AI Search Is Real: SEOs Need To Talk About It With Clients

A third point of view emerged: this is something real that all SEOs need to be having a conversation about. It’s not something that can be ignored and only discussed if a client or prospect asks about it.

SVLibertine shared:

“Battling AIO, GEO, and AEO may seem like snake oil to some, but…it’s where we’re headed. Right now.

To stay relevant in our field you need to be able to eloquently and convincingly speak to this brave new world we’ve found ourselves in. Either to potential clients, or to our boss’s bosses.

I spend almost as much time after work staying on top of developments as I do during the day working. …That being said… SEO fundamentals absolutely still apply, and content is still king.”

Uncertainty About Answer Engine SEO

There are many ways to consider SEO for AI. For example, there’s a certain amount of consensus that AI gets web search data from traditional search engines, where traditional SEO applies. That’s what the comment about content being king seems to be about.

But then we have folks who are using share buttons to raise visibility by getting people to ask ChatGPT, Claude, and Perplexity about their web pages. That’s kind of edgy, but it’s a natural part of how SEO reacts to new things: by experimenting and seeing how the algorithmic black box responds.

This is a period similar to what I experienced at the dawn of SEO, when search marketers were playing around with different approaches and finding what works until it doesn’t.

But here’s something to be aware of: there are times when a client will demand certain things, and it’s tempting to give clients what they’re asking for. But if you have reservations, it may be helpful to share your doubts.

Read about Google’s ranking signals:

Google’s Quality Rankings May Rely On These Content Signals

Featured Image by Shutterstock/Asier Romero

Why It’s Okay To Not Buy Or Obsess Over Links Anymore via @sejournal, @martinibuster

There are many businesses relatively new to SEO that eventually face the decision to build or buy links because they are told that links are important, which, of course, they are. But the need to buy links presupposes that buying them is the only way to acquire them. Links are important, but they are less important than at any time in the history of SEO.

How Do I Know So Much About Links?

I have been doing SEO for 25 years, at one time specializing in links. I did more than links, but I was typecast as a “links guy” because I was the moderator of the Link Building Forum at WebmasterWorld under the martinibuster nickname. WebmasterWorld was at one time the most popular source of SEO information in the world. Being a WebmasterWorld moderator was an honor, and only the best of the very best were invited to become one. Many top old-school SEOs were moderators there, like Jennifer Slegg, Greg Boser, Todd Friesen, Dixon Jones, Ash Nallawalla, and many more.

That’s not to brag, but to explain that my opinion comes from decades-long experience starting from the very dawn of link building. There are very few people who have as deep hands-on experience with links. So this is my advice based on my experience.

Short History Of Link Building

Google’s link algorithms have steadily improved since the early days. As early as 2003, I was told by Marissa Mayer (then a Google engineer, later CEO of Yahoo) that Google could recognize a footer link as a “built by” link and not count it for PageRank. This crushed sites that relied on footer links to power their rankings.

  • 2005 – Statistical Analysis
    In 2005, Google engineers announced at the Pubcon New Orleans search conference that they were using statistical analysis to catch unnatural linking patterns. Their presentation featured graphs showing a curve representing normal linking patterns and then a separate cloud of red dots that represented unnatural links.
  • Links That “Look” Natural
    If you’ve ever read the phrase “links that look natural” or “natural-looking links” and wondered where it came from, those statistical analysis algorithms are the answer. After 2005, the goal for manipulative links was to look natural, which meant alternating the anchor text, putting links into context, and being careful about outbound link targets.
  • Demise Of Easy Link Tactics
    By 2006, Google had neutralized reciprocal links and traffic-counter link building, and was winding down the business of link directories.
  • WordPress Was Good For Link Building
    WordPress was a boon to link builders because it made it possible for more people to get online and build websites, increasing the ability to obtain links by asking or throwing money at them. There were also sites like Geocities that hosted mini-sites, but most of the focus was on standalone sites, maybe because of PageRank considerations (PageRank was visible in the Google Toolbar).
  • Rise Of Paid Links
    Seemingly everyone built websites on virtually any topic, which made link building easier to do simply by asking for a link. Companies like Text-Link-Ads came along and built huge networks of thousands of independent websites on virtually every topic, and they made a ton of money. I knew some people who sold links from their network of sites who were earning $40,000/month in passive income. White hat SEOs celebrated link selling because they said it was legitimate advertising (wink, wink), and therefore Google wouldn’t penalize it.
  • Fall Of Paid Links
    The paid links party ended in the years leading up to 2012, when paid links began losing their effectiveness. As a link building moderator, I had access to confidential information and was told by insiders that paid links were having less and less effect. Then 2012’s Penguin Update happened, and suddenly thousands of websites got hit by manual actions for paid links and guest posting links.

Ranking Where You’re Supposed To Rank

The Penguin algorithm marked a turning point in the business of building links. Internally, Google must have reconsidered the punitive approach to catching links, because not long after, Google started ranking sites where they were supposed to rank instead of penalizing them.

In fact, I coined the phrase “ranking where you’re supposed to rank” in 2014 to convey that while a site that struggles to rank may not technically have a penalty, its links are ineffective and it is ranking where it is supposed to rank.

There’s a class of link sellers that sell what they call Private Blog Network (PBN) links. PBN sellers depend on Google not penalizing a site, and on the temporary boost that many links produce. But the sites inevitably return to ranking where they’re supposed to rank.

Ranking poorly is not a big deal for churn and burn affiliate sites designed to rank high for a short period of time. But it’s a big deal for businesses that depend on a website to be ranking well every day.

Consequences Of Poor SEO

Receiving a manual action is a big deal because it takes a website out of action until Google restores the rankings. Recovering from a manual action is difficult and requires a site to go above and beyond by removing every single low-quality link they are responsible for, and sometimes more than that. Publishers are often disappointed after a manual action is lifted because their sites don’t return to their former high rankings. That’s because they’re ranking where they’re supposed to rank.

For that reason, buying links is not an option for B2B sites, personal injury websites, big-brand websites, or any other businesses that depend on rankings. An SEO or business owner will have to answer for a catastrophic loss in traffic and earnings should their dabbling in paid links backfire.

Personal injury SEO is a good example of why relying on links can be risky. It’s a subset of local search, where rankings are determined by local search algorithms. While links may help, the algorithm is influenced by other factors like local citations, which are known to have a strong impact on rankings. Even if a site avoids a penalty, links alone won’t carry it, and the best-case scenario is that the site ends up ranking where it’s supposed to rank. The worst-case scenario is a manual action for manipulative links.

I’ve assisted businesses with their reconsideration requests to get out of a manual action, and it’s a major hassle. In the old days, I could just send an email to someone at Google or Yahoo and get the penalty lifted relatively quickly. Getting out of a manual action today is not easy. It’s a big, big deal.

The point is that if the consequences of a poor SEO strategy are catastrophic, then buying links is not an option.

Promotion Is A Good Strategy

Businesses can still promote their websites without depending heavily on links. SEOs tend to narrow their views of promotion to just links. Link builders will turn down an opportunity to publish an article for distribution to tens of thousands of potential customers because the article is in an email or a PDF and doesn’t come with a link on a web page.

How dumb is that, right? That’s what thinking in the narrow terms of SEO does: it causes people to avoid promoting a site in a way that builds awareness among customers, the people who may be interested in a business. Creating awareness and building love for a business is the kind of thing that, in my opinion, leads to those mysterious external signals of trustworthiness that Google looks for.

Promotion is super important, and it’s not the kind of thing that fits into the narrow “get links” mindset. Any promotional activity a business undertakes outside the narrow SEO paradigm is going to go right over the head of the competition. Rather than obsessing over links, it may be time for businesses to return to thinking of ways to promote their sites, because links are less important today than they have ever been, while external signals of trust, expertise, and authoritativeness are quite likely more important than at any other time in SEO history.

Takeaways

  • Link Building’s Declining Value:
    Links are still important, but less so than in the past; their influence on rankings has steadily decreased.
  • Google’s Increasingly Sophisticated Link Algorithms:
    Google has increasingly neutralized manipulative link strategies through algorithm updates and statistical detection methods.
  • Rise and Fall of Paid Link Schemes:
    Paid link networks once thrived but became increasingly ineffective by 2012, culminating in penalties via the Penguin update.
  • Ranking Where You’re Supposed to Rank:
    Google now largely down-ranks or ignores manipulative links, meaning sites rank based on actual quality and relevance. Sites can still face manual actions, so don’t depend on Google continuing to down-rank manipulative links.
  • Risks of Link Buying:
    Manual actions are difficult to recover from and can devastate sites that rely on rankings for revenue.
  • Local SEO Factors Rely Less On Links:
    For industries like personal injury law, local ranking signals (e.g., citations) often outweigh link impact.
  • Promotion Beyond Links:
    Real promotion builds brand awareness and credibility, often in ways that don’t involve links but may influence user behavior signals. External user behavior signals have been a part of Google’s signals since the very first PageRank algorithm, which itself models user behavior.

Learn more about Google’s external user behavior signals and ranking without links:

Google’s Quality Rankings May Rely On These Content Signals

Featured Image by Shutterstock/Luis Molinero