As generative search becomes the default for tools like ChatGPT, Gemini, and Claude, fewer people are clicking through to traditional search results. If your content isn’t part of their training data or grounding sources, it’s effectively invisible.
And that means one thing: you’re no longer just optimizing for humans or search engines. You’re optimizing for machines that summarize the internet.
Introducing Generative Engine Optimization (GEO)
In this tactical webinar, we’ll break down what it takes to get your brand cited, linked, and quoted in AI-generated content, intentionally.
Ways to increase your AIO (AI Overview) brand presence.
Proven SEO & GEO workflows you can copy today.
Learn How To Influence LLMs
This isn’t theory. We’ll walk through the specific strategies SEOs and marketers are using right now to shape what language models say, and don’t say, about their brands.
Expect insights on:
How foundational training data is gathered (and how you might influence it).
Which formats and language structures improve your chances of being cited.
This is for SEOs, content strategists, and marketing leads who want to stay relevant as AI redefines the playing field.
Why This Webinar Is A Must-Attend
Whether you’re refining your search strategy or trying to future-proof your brand visibility, this session offers high-ROI insights you can apply immediately.
✅ Actionable examples
✅ Real-world GEO workflows
✅ Early looks at emerging standards like MCP, A2A, and llms.txt
📍 Designed for experienced marketers ready to lead change.
Reserve Your Spot Or Get The Recording
🛑 Can’t make it live? No problem. Register anyway, and we’ll send you the full recording so you don’t miss a thing.
Over the years, Google has limited how websites can control their appearance in search results.
Here’s what sites cannot control in Google search.
Sitelinks
For some searches, especially involving brand names, Google shows links below the listing title. These are called sitelinks. Unfortunately, Google’s algorithm often displays sitelinks that are irrelevant or unimportant to the site’s business.
Owners have no control over these URLs. The only methods to remove a sitelink are to delete the page or add the noindex meta tag, but both would also remove the page from all Google searches.
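For reference, that directive is the standard robots meta tag placed in the page’s head (shown here in its generic form):

```html
<!-- Removes this page from all Google results, sitelinks included -->
<meta name="robots" content="noindex">
```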
Here are sitelinks for a “Practical Ecommerce” query:
Websites have little control over sitelinks, such as this example for Practical Ecommerce.
Listing title
The listing title is the most prominent section of a search snippet and largely influences the number of clicks. Google used to display only a page’s title tag for the listing.
A few years ago, however, Google began displaying titles based on search queries, for relevance. The result is often fewer clicks.
There’s no way to stop Google from rewriting a page title. In my experience, using the HTML title as the H1 heading increases the likelihood that Google will keep it, since it aligns the listing title with what searchers see on the subsequent page.
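As a simple illustration of that alignment (the page title here is made up):

```html
<head>
  <title>How to Build a Website: A Step-by-Step Guide</title>
</head>
<body>
  <!-- Matching the title tag keeps the SERP title consistent with the page -->
  <h1>How to Build a Website: A Step-by-Step Guide</h1>
</body>
```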
Google now decides SERP listing titles based on the query, such as “how to build a website.”
Listing description
A page’s HTML meta description summarizes its content. Google has long considered meta descriptions as hints rather than directives. It displays meta descriptions only if relevant to the query.
Websites can influence listing descriptions, which appear below the title, by including summary paragraphs, conclusions, and short answers on a page. Depending on the query, Google could display part of those sections in a description.
Otherwise, sites have no control over the SERP snippet’s description.
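For reference, the hint Google may or may not use is the standard meta description tag (the content here is illustrative):

```html
<meta name="description" content="A concise summary of the page that Google may display if it matches the query.">
```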
A listing may or may not use the page’s HTML meta description.
AI Overviews
Google’s AI Overviews are artificial intelligence-generated answers on top of search results.
AI Overviews typically satisfy searchers’ needs, thereby eliminating the need to click. Hence many site owners prefer Google not to use their content in AI Overviews. I know of no way to block Google from using a site’s content in AI Overviews while still indexing it for conventional SERPs.
The Google-Extended directive in a site’s robots.txt file blocks Gemini but not AI Overviews. A nosnippet meta tag will likely block AI Overviews, as well as all SERP snippet descriptions.
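For reference, here is what each of those controls looks like (illustrative; as noted, neither blocks AI Overviews alone while preserving snippets):

```
# robots.txt: blocks Gemini, but not AI Overviews
User-agent: Google-Extended
Disallow: /
```

```html
<!-- Likely blocks AI Overviews, but also removes all snippet descriptions -->
<meta name="robots" content="nosnippet">
```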
AI Overviews typically satisfy searchers’ needs, thereby eliminating the need to click. This example is for the query “how to build a website.”
Featured snippets
Featured snippets used to appear at the top of Google SERPs to provide quick answers to a query. They now appear in the middle of SERP pages, if at all, given the rise of AI Overviews.
Featured snippets typically decrease the number of clicks to a linked URL. Websites have no control over appearing in a featured snippet or its content.
A nosnippet meta tag instructs search engines not to display a page in a featured snippet, but it also removes descriptions from the page’s non-featured listing.
A well-structured page — short FAQs, HTML headings, concise summaries — can influence the contents of a featured snippet, but there’s no guarantee.
In short, Google is reducing websites’ control over SERPs as it prioritizes what searchers seek. Sites can influence their SERP appearance by focusing on concise content, well-structured pages, and appropriate headings.
Last week, I walked through the shift from keyword-first to topic-first SEO – and why that mindset change matters more than ever for long-term visibility in both search and large language models (LLMs).
This week, we’re getting tactical. Because understanding the shift is one thing, operationalizing it across your team is another.
In this issue, Amanda and I are breaking down:
How to build and use a topic map and matrix (with a map template for premium readers).
Why a deep understanding of your audience is crucial to true topical depth.
Guidance for internal + external linking by topic (with tool recommendations).
For premium readers: Practical advice on measuring SEO performance by topic.
If you’re trying to build durable organic visibility and authority for your brand – and not just chase hacks for AI overviews – this is your blueprint.
Image Credit: Kevin Indig
How To Operationalize A Topic-First SEO Strategy
Last week, we covered how you need to shift from keywords to topics (if you haven’t already).
But what if you’re not quite sure how to operationalize this approach across your team?
Let’s talk about how to do that.
To earn lasting visibility – and not short-term visibility bought by hacky LLM visibility tricks – your brand needs to signal to search engines and LLMs that it’s an authority in topics related to your offerings for the intended audience you serve.
You’ll do this by:
Building a map of your parent topics.
Using audience research and personas as lenses to create content through.
Expanding with subtopics and “zero-volume” content creation, because fringe content adds depth.
Optimizing both your internal and external links with a topic-first approach.
Build A Map Of Your Parent Topics
First up, you need to build your topic map.
(You know, if you don’t already have an old doc or spreadsheet out there collecting dust, buried in your Google Drive, with your core topic pillars and subtopics already stored.)
This is the first step in building a thorough persona-based SEO topic matrix.
A topic matrix is a strategic framework that compiles your brand’s key topics, subtopics, and content formats needed to comprehensively cover a subject area for search visibility.
It helps align content with user intent, target personas, and search visibility opportunities, creating a roadmap for developing topical authority and minimizing keyword cannibalization.
If you haven’t built one before, this is going to look different from keyword lists of the past, and it might be organized like this:
Image Credit: Kevin Indig
Amanda interjecting here: Even if you have built one before, stick with us. We’ve got a visual for you below that will help communicate to stakeholders how/why a topic-first approach matters to earning visibility and authority for your brand’s core offerings. Plus, premium subscribers get the ready-to-go template.
Later, once your topic matrix is complete, you’ll use your keyword universe to select priority keywords to pair with your overall topic and individual pages.
Instead of living in keyword lists, you’ll live in a topic map, prioritizing meeting the needs of separate personas or ideal customer profiles (ICPs) in your target audience, and later pairing search queries that best help the people you serve find you.
To start building a list of your parent topics, you need to:
Outline the exact topics your brand needs to own. This is where you start. (And many of you reading this already have this locked in.)
Inventory your existing content: What topics do you cover already? What topics do you actually need to cover? Where are the gaps? Which ones convert the best?
Make sure you log all your core offerings (i.e., features, services, core products) as topics or subtopics themselves.
These are the “buckets” under which all other content should logically live (regardless of the persona, funnel stage, or search intent you’re optimizing for).
Think of them as your brand’s semantic backbone, so to speak … these are the foundational topics that every page ultimately ladders up to.
Here’s how to determine them:
1. Start with your offerings.
What services do you provide?
What features or products do you sell?
What problems do you solve?
2. Group offerings into themes.
Which of those offerings can be grouped under a broader topic?
What high-level conversations do your users consistently return to?
3. Refine for relevance.
You’re aiming for topics broad enough to support many subtopics, but specific enough to reflect your unique authority in your area of expertise.
Let’s look at an example of a fictional DTC brand that also offers some B2B services: Kind Habitat. (Needs a better name, but let’s move on. 😆)
Kind Habitat offers eco-friendly home furnishings and sustainable materials via a small ecommerce store as well as residential and commercial interior design services.
Let’s say its target audience includes homeowners, renters, residential and commercial property managers, as well as both residential builders and designers that focus on sustainability and eco-friendly values.
With that in mind, its ecommerce products and design services could all be mapped to five simplified but distinct core topics:
Sustainable interior design.
Eco-friendly building materials.
Zero-waste living.
Sustainable furniture shopping.
Green home upgrades.
Every piece of content they create should tie back to one or more of these core topics, and that ensures the site builds deep, durable authority in its niche.
(And keep in mind, this is a simplified example here. You might have up to 10 parent topics … or more, depending on the breadth of your offerings or expertise areas.)
Next up, you’re going to work to expand your topic map, starting with audience research.
Use Audience Research And Personas
Here’s where those personas your brand invested so heavily in come into play. You’ll need to map out (1) who you’re solving problems for and (2) how their queries change based on unique persona, intent, audience type, or industry sector.
But how do you know if you’ve identified the right people (personas) and their queries?
You can spend tens of thousands investing in deep buyer persona market research.
But if your resources are limited, talk to your sales team. Talk to your customer care team. And (gasp) talk to your customers and/or leads who didn’t buy from you.
And if you’re just starting out and don’t have sales or customer teams in place, have your founder dig into their email inbox, LinkedIn DMs, etc., and mine for information.
As SparkToro’s Amanda Natividad states in “How to Turn Audience Research Into Content Ideas” (a great read, btw):
Questions are content gold. Each question represents an information gap you can fill with valuable content. [1]
Then, your job is to take the collected information gaps and fold them into your overall topic matrix.
Keep in mind, though, when optimizing for your core topics, you’ll also need to target different intents across the topic and the funnel via different perspectives, pain points, and viewpoints (a.k.a. “ranch-style SEO”).
Here’s an exciting bonus to investing in this approach: Persona-aligned content that offers deep topic coverage and unique perspectives can bring natural information gain to the overall topical conversation.
Screenshot from LinkedIn, July 2025 (Image Credit: Kevin Indig)
Your topics can be expanded exponentially in many directions, based on the people you’re creating content for and the problems they have:
People:
Core audiences.
Crafted personas.
Multiple sectors (if applicable to your product or service).
Problems:
Core problem/needs your brand solves for each audience.
Unique problems experienced by each persona that your brand solves.
Core problems unique to multiple sectors (and in the language of those sectors).
Let’s circle back to our fictional example with Kind Habitat, that sustainable interior design firm with a quickly-made-up name and a mini ecommerce store.
Here’s what the “people and problems” they’d optimize their core topics for would look like:
Homeowner: Stan, 45, high-income earner, second-time homeowner in suburban area, looking to renovate sustainably.
Renter: Nicole, 31, mid-income earner, long-term rent-controlled apartment in a big city with values of sustainability, who is researching sustainable home decor and design.
Property Manager: Quinn, 25, mid-income earner, entry-level property manager for small local firm that values zero-waste construction and sustainable renovations.
Builder: JP, 57, high-income earner, owns sustainable building firm, seeking zero-waste, low-toxin approach to new builds and prioritizing energy-efficient design in luxury homes.
Designer: Sydney, 29, mid-income earner, junior to mid-level associate at a commercial interior design firm seeking both products and plans for sustainable furnishings and design.
Multiple sectors (if applicable to your product or service): Residential real estate, property managers for multi-family housing, real estate portfolios, or commercial real estate, sustainable building firms, individual homeowners, and renters interested in sustainable design.
Keep in mind, you could fan out your audience even further with three to five individual audience personas under each audience type.
And once your audience data is finally ready to go, you’d then expand into the problems faced by each audience, persona, and sector across each targeted topic.
Once you have your core topics covered (and have addressed your core features, offerings, services, audience pain points, and organic audience questions, etc.), you’d expand even further into content that offers unique perspectives, hot takes, and even digs into current events related to your industry or product/services.
That’s … a lot of content.
Using Amanda’s topic map visual, here’s what it could look like … for just one parent topic.
You could just keep going. For-ev-er.
(But your content doesn’t have to. If you establish your brand as an authority by publishing content with depth of coverage and information gain baked in, you can accomplish a lot with a tight, well-developed library of pages.)
Here’s what I’d recommend if you have the team members or freelancers on hand:
Assign specific team members or freelancers to cover core topics. Essentially, you’d have trained writer-SMEs for each major topic you’d like to target across your strategy. That way, content can be produced more accurately … and faster.
Divvy up work based on personas. If you have multiple audience types, like the Kind Habitat example, assign production to your team based on different personas/audiences, so your content producers can home in on the needs of – and the way they speak to – each persona.
Use AI to scale topic coverage while tailoring to persona type. A tool like AirOps can help you build out workflows based on specific topics and specific personas; that way, you’re creating iterations of core pieces of work geared toward the specific needs, pain points, and problems of each industry sector, persona, etc.
When refreshing older content to combat content decay, refresh by topics. Don’t just refresh one page that has experienced a decline. Work on keeping content decay in check by refreshing subtopics/clusters as a whole whenever possible. Assign one producer/individual contributor to work on the cluster of related pages.
Expand With Subtopics, Because Fringe Content Adds Depth
Once you’ve mapped your audience and their problems across your core topics, you need to expand your coverage with subtopics, especially the ones that live on the edges and directly speak to your target ICPs. This is the kind of content that rarely shows up in a traditional keyword list, although you can definitely map specific keywords and intents to these pages in order to adjacently optimize for organic visibility.
However, you won’t always have a clear “search volume” number for this type of content.
Sometimes this content is going to be messy. Sometimes it’s going to be weird.
You need to thoroughly know your core audience and understand their most pressing needs and questions that you can solve for. (Even the fringe ones.)
But this “fringe content” is what makes your site actually helpful, authoritative, and hard to replicate.
Think of it this way: The best organic search strategies don’t just optimize for the top 10 questions on a topic – they anticipate the next 100.
They dig into the side doors, caveats, gotchas, exceptions, industry language quirks, and debates.
You must go beyond building clusters and instead build context for your brand within your targeted topic.
Here’s where to look when expanding with meaningful subtopics:
Sales calls with leads, customer care questions, and actual customer interviews: There’s a gold mine here, and every brand has it. (Yes, even yours.) Use it to your advantage. I recommend tools like Gong/Chorus + Humata AI to help.
Reddit + Quora discussions: Look for questions that no one has great concrete answers to or resources/solutions for. Use a tool like Gummy Search to streamline this research.
Context that will build out your topic environment: You’re not just building a tidy cluster with “best X tools,” “top tools for Y,” and “X vs Y.” Ask: What misconceptions need to be cleared up? What advanced tips only experts talk about when they talk shop? Lean on your internal SMEs, or invest in paying SMEs hourly, getting connected to them via platforms like JustAnswer.
Wikipedia table of contents and footnotes: While this might initially sound like strange guidance, if you truly feel you’ve covered your core topics for all your ICPs from multiple perspectives and for all their common pain points, this approach can help you branch out into connected subtopics. Caveat: Of course, don’t invest in covering subtopics that don’t matter to your ICPs … or angles they already understand thoroughly. (This research is very manual. If you have a workaround you’d suggest, send it my way.)
People Also Ask questions in the SERP: Keep these in mind: They still exist for a reason. Use your standard SEO tools like Semrush, Ahrefs, etc., to explore these within your topic.
So, with topic-first optimization at the center, should you be organizing your internal links by topic instead of just navigation structure or blog recency?
Um, yes – definitely. And if you weren’t doing that already, the time to start is now.
Topic-based internal linking is one of the most powerful (and underutilized) ways to reinforce topical authority.
Most content teams default to one of two internal linking strategies:
Navigation-based linking: whatever shows up in your menu or footer.
Date-based linking: linking to “recent posts” regardless of topic relevance.
The problem? These methods serve the convenience of the content management system (CMS), not the reader or search engine.
A topic-first internal linking strategy intentionally:
Connects all relevant pages under a single topic or persona target.
Links related subtopics together to increase crawl depth and surface additional value.
Boosts orphaned or underperforming assets with contextually relevant links.
You can simplify this task with an SEO tool like Clearscope, Surfer, Ahrefs, etc.
Tools like these surface internal linking opportunities within the pages you’re monitoring. The feature then suggests relevant anchor text and shows exactly where to add the URLs.
The manual part? Having your content producers or SEO analysts determine if the tool’s suggested page is in the right topic cluster to warrant an anchor link. (But you can also set up topic clusters/content segments within tools like Clearscope that can help guide your producers.)
Used with permission from 4aGoodCause, a top monthly giving platform for nonprofits.
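If you want to sanity-check this at scale without a paid tool, a small script can flag “cluster orphans.” This is a minimal sketch, assuming you already have a URL-to-topic mapping and an internal link edge list (for example, from a crawler export); the URLs and topics below are made up for illustration:

```python
from collections import defaultdict

# Hypothetical inputs: topic assignment per URL and internal link pairs
# (source page -> target page), e.g., pulled from a site crawl export.
page_topics = {
    "/sustainable-interior-design/": "sustainable interior design",
    "/low-toxin-paint-guide/": "sustainable interior design",
    "/zero-waste-kitchen-ideas/": "zero-waste living",
}
internal_links = [
    ("/sustainable-interior-design/", "/low-toxin-paint-guide/"),
]

# Count inbound links each page receives from pages in its own topic cluster.
in_cluster_inbound = defaultdict(int)
for source, target in internal_links:
    if page_topics.get(source) == page_topics.get(target):
        in_cluster_inbound[target] += 1

# Pages with zero in-cluster inbound links are candidates for new internal links.
for url, topic in sorted(page_topics.items()):
    if in_cluster_inbound[url] == 0:
        print(f"Cluster orphan: {url} ({topic})")
```

The output is simply a to-do list for your producers: each flagged URL needs at least one contextually relevant link from its own cluster.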
But you should be employing a topic-based backlink strategy, too.
You don’t just want backlinks. You want links that have authority in your target topics and/or with your audience.
For instance, our example from earlier, Kind Habitat, doesn’t need low-quality backlinks from around the globe to build topical authority in the sustainable interior design niche.
This brand needs to invest in backlinks that include:
High-authority sites in similar topics, like ThisOldHouse.com, MarthaStewart.com, Houzz.com, and HomeAdvisor.com.
Local and regional publications for this brand’s service areas.
Manufacturers of sustainable, low-toxin home building products and materials.
Professional associations for interior designers, builders, and property managers who value sustainable and green design.
Here’s the payoff of taking a topic-first approach: Once you shift your strategy to cover core topics deeply – across the right audience segments and intent layers – you unlock a Topical Authority Flywheel.
Here’s how it works:
Better coverage → Better engagement and organic links → Better visibility across more queries.
Image Credit: Kevin Indig
When your site deeply addresses a topic, you not only become more useful to your audience, but you also are more visible to search engines and LLMs.
You build the kind of brand context that LLMs surface and that Google’s evolving AI-driven results reward.
And yes, it’s measurable.
Track your performance by topic, not just by page or keyword.
If you’ve mapped and organized your content well, you can group related URLs and monitor how the topic as a whole performs (a rough scripting sketch follows the list below):
Watch how refreshed or expanded topic clusters improve in average rank, CTR, and conversions over time.
Look for early signals of lift within the first 10-30 days after refreshing or publishing a comprehensive set of content on a given topic.
Monitor link velocity. Strong topic clusters reap rewards.
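Here’s that sketch: a minimal example assuming a Search Console performance export saved as a CSV with page, clicks, impressions, and position columns (the file name and URL prefixes are made-up assumptions):

```python
import pandas as pd

# Hypothetical mapping of URL path prefixes to topic clusters.
TOPIC_PREFIXES = {
    "/sustainable-interior-design/": "sustainable interior design",
    "/zero-waste-living/": "zero-waste living",
}

def topic_for(page_url: str) -> str:
    # Assign a page to the first matching topic prefix, else "unmapped".
    for prefix, topic in TOPIC_PREFIXES.items():
        if prefix in page_url:
            return topic
    return "unmapped"

# Assumed export: one row per page with clicks, impressions, and avg position.
df = pd.read_csv("search_console_pages.csv")
df["topic"] = df["page"].map(topic_for)

# Roll the page-level metrics up to the topic level.
by_topic = df.groupby("topic").agg(
    clicks=("clicks", "sum"),
    impressions=("impressions", "sum"),
    avg_position=("position", "mean"),
)
by_topic["ctr"] = by_topic["clicks"] / by_topic["impressions"]
print(by_topic.sort_values("clicks", ascending=False))
```

Run it on exports from before and after a cluster refresh to compare topic-level CTR and average position over time.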
Operationalizing a topic-first approach isn’t just about traffic.
It’s about building a defensible edge in search/LLM visibility by doing the thing many brands are still missing: going deep, not wide.
Featured Image: Paulo Bobita/Search Engine Journal
Google’s John Mueller answered a question about whether llms.txt could be viewed as duplicate content. He said that treating it as duplicate content wouldn’t make sense, but added that it could make sense to take steps to prevent it from being indexed.
LLMs.txt
Llms.txt is a proposal to create a new content format standard that large language models can use to retrieve the main content of a web page without having to deal with other non-content data, such as advertising, navigation, and anything else that is not the main content. It offers web publishers the ability to provide a curated, Markdown-formatted version of the most important content. The llms.txt file sits at the root level of a website (example.com/llms.txt).
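For illustration, a minimal llms.txt following the proposal’s Markdown conventions might look like this (the site name and URLs are made up):

```markdown
# Kind Habitat

> Kind Habitat sells eco-friendly home furnishings and offers sustainable
> residential and commercial interior design services.

## Services
- [Interior design](https://example.com/design.md): Overview of design offerings

## Products
- [Sustainable furniture](https://example.com/furniture.md): Core product catalog
```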
Contrary to some claims made about llms.txt, it is not in any way similar in purpose to robots.txt. The purpose of robots.txt is to control robot behavior, while the purpose of llms.txt is to provide content to large language models.
Will Google View Llms.txt As Duplicate Content?
Someone on Bluesky asked if llms.txt could be seen by Google as duplicate content, which is a good question. It could happen that someone outside of the website might link to the llms.txt and that Google might begin surfacing that content instead of or in addition to the HTML content.
“Will Google view LLMs.txt files as duplicate content? It seems stiff necked to do so, given that they know that it isn’t, and what it is really for.
Should I add a “noindex” header for llms.txt for Googlebot?”
Google’s John Mueller answered:
“It would only be duplicate content if the content were the same as a HTML page, which wouldn’t make sense (assuming the file itself were useful).
That said, using noindex for it could make sense, as sites might link to it and it could otherwise become indexed, which would be weird for users.”
Noindex For Llms.txt
Using a noindex header for llms.txt is a good idea because it will prevent the content from entering Google’s index. Blocking the file with robots.txt is not the way to do it: that would only stop Google from crawling the file, which would prevent it from ever seeing the noindex.
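Because a plain-text file can’t carry a robots meta tag, the noindex has to be sent as an HTTP response header. A minimal sketch for Apache with mod_headers enabled (nginx has an equivalent add_header directive):

```apacheconf
# .htaccess: serve llms.txt with a noindex header so it can still be
# crawled (and the directive seen) but never indexed.
<Files "llms.txt">
  Header set X-Robots-Tag "noindex"
</Files>
```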
A new study from GrowthSRC Media finds that click-through rates (CTRs) for Google’s top-ranking search result have declined from 28% to 19%. This 32% drop correlates with the expansion of AI Overviews, a feature that now appears across a wide range of search results.
Position #2 experienced an even steeper decline, with CTRs falling 39% from 20.83% to 12.60% year-over-year.
The research analyzed more than 200,000 keywords from 30 websites across ecommerce, SaaS, B2B, and EdTech industries. Here are more highlights from the study.
Key Findings
According to the report, AI Overviews appeared for just 10,000 keywords in August 2024. By May 2025, that number had grown to over 172,000.
This expansion followed the March core update and was confirmed during Google’s full U.S. rollout announcement at the I/O developer conference.
These developments appear to contrast with comments from Google CEO Sundar Pichai, who said in a Decoder interview with The Verge:
“If you put content and links within AI Overviews, they get higher click-through rates than if you put it outside of AI Overviews.”
CTRs Shift Downward and Upward
While top positions saw notable declines, the study observed a 30.63% increase in CTRs for positions 6 through 10 compared to the previous year. This suggests that users may be scrolling past AI-generated summaries to find original sources further down the page.
Across positions 1 through 5, the study reported an average CTR decline of 17.92%. The analysis focused on approximately 74,000 keywords ranking in the top 10.
Major Publishers Report Similar Trends
The findings align with reports from major publishers. Carly Steven, SEO and editorial ecommerce director at MailOnline, told attendees at the WAN-IFRA World News Media Congress that CTRs drop when AI Overviews are present.
“On desktop, when we are ranking number one in organic search, [CTR] is about 13% on desktop and about 20% on mobile. When there is an AI Overview present, that drops to less than 5% on desktop and 7% on mobile.”
MailOnline’s broader data showed CTRs falling by 56.1% on desktop and 48.2% on mobile for keywords with AI Overviews.
Ecommerce Affected by Product Widgets
The study also highlighted changes in ecommerce performance tied to Google’s Product Widgets.
Widgets like “Popular Products” and “Under [X] Price” began appearing more frequently from November 2024 onward, especially in categories such as home care, fashion, and beauty.
These widgets open a Google Shopping interface directly within search results, which may reduce clicks to traditional organic listings.
Methodology
GrowthSRC analyzed year-over-year data from Google Search Console across clients in multiple industries, focusing on changes before and after the full rollout of AI Overviews and Product Widgets.
The dataset included queries, clicks, impressions, CTRs, and average positions.
Data was segmented by content type, including product pages, collection pages, and blog posts. Additional keyword data from Ahrefs helped determine which queries triggered AI Overviews or Product Widgets.
What This Means
Mahendra Choudhary, Partner at GrowthSRC Media, encouraged SEO professionals to reconsider traditional performance benchmarks:
“With lower clicks to websites from informational content becoming the new normal, this is the perfect time to let your clients and internal stakeholders know that chasing website traffic as a KPI should be thought of differently.”
He recommends shifting focus toward brand visibility in social search, geographic relevance, mentions in LLM outputs, and overall contribution to revenue or leads.
This shift may require:
Tracking engagement beyond clicks, such as on-site conversions, branded search growth, or assisted conversions.
Diversifying content distribution across platforms like YouTube, TikTok, and Reddit, where users often bypass traditional search.
Investing in high-authority content at the top of the funnel to build brand awareness, even if direct clicks decline.
These strategies can help ensure SEO continues to drive measurable value as user behavior evolves.
Looking Ahead
The decline in organic CTRs for top positions highlights how search behavior is changing as AI-generated content plays a larger role in discovery.
Adapting to this environment may involve placing less emphasis on rankings alone and focusing more on how visibility supports broader business goals.
As zero-click search becomes more common, understanding where users are engaging, and where they aren’t, will be essential to maintaining visibility.
There was a post on social media about so-called hustle bros, and one on Reddit about an SEO who lost a prospective client to a digital marketer whose pitch included a song and dance about AI search visibility. Both discussions highlight a trend in which potential customers want to be assured of positive outcomes and may want to discuss AI search positioning.
Hustle Bro Culture?
Two unrelated posts touched on SEOs who are hustling for clients and getting them. The first post was about SEO “hustle bros” who post search console screenshots to show the success of their work.
I know of a guy who used to post a lot in a Facebook SEO group until the moderators discovered that his Search Console screenshots were downloaded from Google Images. SEO hustle bros who post fake screenshots are an actual thing, and sometimes they get caught.
So, a person posted a rant on Bluesky about people who do that.
“How much of SEO is “chasing after wind”. There’s so many hustle bros, programmatic promoters and people posting graphs with numbers erased off to show their “success”.”
Has Something Changed?
Google’s John Mueller responded:
“I wonder if it has changed over the years, or if it’s just my (perhaps your) perception that has changed.
Or maybe all the different kinds of SEOs are just in the same few places, rather than their independent forums, making them more visible?”
Mueller might be on to something, because social media and YouTube have made it easier for both legit SEOs and “hustle bros” to find a larger audience. But the important point to consider is that those people are connecting with potential clients in a way that legit SEOs might not be.
And that leads into the next social media discussion, which is about SEOs who are talking about what clients want to hear: AI Fluff.
SEOs Selling AI “Fluff”
There is a post on Reddit where an SEO shares how they spent months communicating with a potential client, going out of their way to help a small business as a favor to a friend. After all the discussions, just as the SEO expected the small business to commit to an agreement, the client walked away, saying they were going with another SEO who sold them on something to do with AI.
“After answering a bunch of questions via email over 3 months (unusually needy client) but essentially presales, it all sounds good to go and we hop on a kickoff call. Recap scope and reshare key contacts, and tee up a chat with the web design agency. So far so good.
Then dropped.
Client’s reason? The other SEO who they’ve been chatting with is way more clued up with the AI technicals.
I’d love to know what crystal ball AI mysticism they were sold on. Maybe a “cosine similarity audit”, maybe we’ll include “schema embeddings analysis” within our migration project plan to make sure AI bots can read your site. Lol cool whatever bro.”
John Mueller responded to that person’s post but then retracted it.
Nevertheless, a lively discussion ensued with three main points:
Is AI SEO this year’s EEAT?
Some potential clients want to discuss AI SEO
SEOs may need to address AEO/AIO/GEO
1. Is AI For SEO This Year’s EEAT?
Many Redditors in that discussion scoffed at the idea of SEO for AI. This isn’t a case of luddites refusing to change with the times. SEO tactics for AI Search are still evolving.
Reddit moderator WebLinkr received eight upvotes for their comment:
“Yup – SEOs been like that for years – EEAT, “SEO Audits” – basically people buy on what “makes sense” or “sounds sensible” even though they’ve already proven they have no idea what SEO is.”
Unlike EEAT, AI Search is most definitely disrupting visibility. It’s a real thing. And I do know of at least one SEO with a computer science degree who has it figured out.
But I think it’s not too off the mark to say that many digital marketers are still figuring things out. The amount of scoffing in that discussion seems to support the idea that AI Search is not something all SEOs are fully confident about.
2. Some Clients Are Asking For AI SEO
Perhaps the most important insight is that potential clients want to know what an SEO can do for AI optimization. If clients are asking about AI SEO, does that mean it’s no longer hype? Or is this a repeat of what happened with EEAT where it was a lot of wheels spinning for nothing?
Redditor mkhaytman shared:
“Like it or not, clients are asking questions about AIs impact and how they can leverage the new tools people are using for search and just telling them that “Nobody knows!” isn’t a satisfactory answer. You need to be able to tell them something – even if its just “good seo practices are the same things that will improve your AI citations”.”
3. AI Search Is Real: SEOs Need To Talk About It With Clients
A third point of view emerged: this is something real that all SEOs need to be having a conversation about. It’s not something that can be ignored and only discussed if a client or prospect asks about it.
SVLibertine shared:
“Battling AIO, GEO, and AEO may seem like snake oil to some, but…it’s where we’re headed. Right now.
To stay relevant in our field you need to be able to eloquently and convincingly speak to this brave new world we’ve found ourselves in. Either to potential clients, or to our boss’s bosses.
I spend almost as much time after work staying on top of developments as I do during the day working. …That being said… SEO fundamentals absolutely still apply, and content is still king.”
Uncertainty About Answer Engine SEO
There are many ways to consider SEO for AI. For example, there’s a certain amount of consensus that AI gets web search data from traditional search engines, where traditional SEO applies. That’s what the comment about content being king seems to be about.
But then we have folks who are using share buttons to raise visibility by getting people to ask ChatGPT, Claude, and Perplexity about their web pages. That’s kind of edgy, but it’s a natural part of how SEO reacts to new things: by experimenting and seeing how the algorithmic black box responds.
This is a period similar to what I experienced at the dawn of SEO, when search marketers were playing around with different approaches and finding what works until it doesn’t.
But here’s something to be aware of: there are times when a client will demand certain things, and it’s tempting to give clients what they’re asking for. But if you have reservations, it may be helpful to share your doubts.
There are many businesses relatively new to SEO that eventually face the decision to build or buy links because they are told that links are important, which, of course, they are. But the need to buy links presupposes that buying them is the only way to acquire them. Links are important, but less important than at any time in the history of SEO.
How Do I Know So Much About Links?
I have been doing SEO for 25 years, at one time specializing in links. I did more than links, but I was typecast as a “links guy” because I was the moderator of the Link Building Forum at WebmasterWorld under the martinibuster nickname. WebmasterWorld was at one time the most popular source of SEO information in the world. Being a WebmasterWorld moderator was an honor, and only the best of the very best were invited to become one. Many top old-school SEOs were moderators there, like Jennifer Slegg, Greg Boser, Todd Friesen, Dixon Jones, Ash Nallawalla, and many more.
That’s not to brag, but to explain that my opinion comes from decades-long experience starting from the very dawn of link building. There are very few people who have as deep hands-on experience with links. So this is my advice based on my experience.
Short History Of Link Building
Google’s link algorithms have steadily improved since the early days. As early as 2003, I was told by Google engineer Marissa Mayer (later CEO of Yahoo) that Google was able to recognize that a link in the footer was a “built by” link and not count it for PageRank. This crushed sites that relied on footer links to power their rankings.
2005 – Statistical Analysis
In 2005, Google engineers announced at the Pubcon New Orleans search conference that they were using statistical analysis to catch unnatural linking patterns. Their presentation featured graphs showing a curve representing normal linking patterns and then a separate cloud of red dots that represented unnatural links.
Links That “Look” Natural
If you’ve ever read the phrase “links that look natural” or “natural-looking links” and wondered where that came from, statistical analysis algorithms are the answer. After 2005, the goal for manipulative links was to look natural, which meant doing things like alternating the anchor text, putting links into context, and being careful about outbound link targets.
Demise Of Easy Link Tactics
By 2006, Google had neutralized reciprocal links and traffic-counter link building, and was winding down the business of link directories.
WordPress Was Good For Link Building
WordPress was a boon to link builders because it made it possible for more people to get online and build websites, increasing the ability to obtain links by asking for them or throwing money at them. There were also sites like Geocities that hosted mini-sites, but most of the focus was on standalone sites, maybe because of PageRank considerations (PageRank was visible in the Google Toolbar).
Rise Of Paid Links
Seemingly everyone built websites on virtually any topic, which made link building easier to do simply by asking for a link. Companies like Text-Link-Ads came along and built huge networks of thousands of independent websites on virtually every topic, and they made a ton of money. I knew some people who sold links from their network of sites who were earning $40,000/month in passive income. White hat SEOs celebrated link selling because they said it was legitimate advertising (wink, wink), and therefore Google wouldn’t penalize it.
Fall Of Paid Links
The paid links party ended in the years leading up to 2012, when paid links began losing their effectiveness. As a link building moderator, I had access to confidential information and was told by insiders that paid links were having less and less effect. Then 2012’s Penguin Update happened, and suddenly thousands of websites got hit by manual actions for paid links and guest posting links.
Ranking Where You’re Supposed To Rank
The Penguin Algorithm marked a turning point in the business of building links. Internally at Google, there must have been a conversation about the punitive aspect of catching links, because not long after, Google started ranking sites where they were supposed to rank instead of penalizing them.
In fact, I coined the phrase “ranking where you’re supposed to rank” in 2014 to show that while sites with difficulty ranking may not technically have a penalty, their links are ineffective and they are ranking where they are supposed to rank.
There’s a class of link sellers that sell what they call Private Blog Network (PBN) links. PBN sellers depend on Google not penalizing a site, and on the temporary boost that many links produce. But the sites inevitably return to ranking where they’re supposed to rank.
Ranking poorly is not a big deal for churn and burn affiliate sites designed to rank high for a short period of time. But it’s a big deal for businesses that depend on a website to be ranking well every day.
Consequences Of Poor SEO
Receiving a manual action is a big deal because it takes a website out of action until Google restores the rankings. Recovering from a manual action is difficult and requires a site to go above and beyond by removing every single low-quality link they are responsible for, and sometimes more than that. Publishers are often disappointed after a manual action is lifted because their sites don’t return to their former high rankings. That’s because they’re ranking where they’re supposed to rank.
For that reason, buying links is not an option for B2B sites, personal injury websites, big-brand websites, or any other businesses that depend on rankings. An SEO or business owner will have to answer for a catastrophic loss in traffic and earnings should their dabbling in paid links backfire.
Personal injury SEO is a good example of why relying on links can be risky. It’s a subset of local search, where rankings are determined by local search algorithms. While links may help, the algorithm is influenced by other factors like local citations, which are known to have a strong impact on rankings. Even if a site avoids a penalty, links alone won’t carry it, and the best-case scenario is that the site ends up ranking where it’s supposed to rank. The worst-case scenario is a manual action for manipulative links.
I’ve assisted businesses with their reconsideration requests to get out of a manual action, and it’s a major hassle. In the old days, I could just send an email to someone at Google or Yahoo and get the penalty lifted relatively quickly. Getting out of a manual action today is not easy. It’s a big, big deal.
The point is that if the consequences of a poor SEO strategy are catastrophic, then buying links is not an option.
Promotion Is A Good Strategy
Businesses can still promote their websites without depending heavily on links. SEOs tend to narrow their views of promotion to just links. Link builders will turn down an opportunity to publish an article for distribution to tens of thousands of potential customers because the article is in an email or a PDF and doesn’t come with a link on a web page.
How dumb is that, right? That’s what thinking in the narrow terms of SEO does: it causes people to avoid promoting a site in a way that builds awareness in customers—the people who may be interested in a business. Creating awareness and building love for a business is the kind of thing that, in my opinion, leads to those mysterious external signals of trustworthiness that Google looks for.
Promotion is super important, and it’s not the kind of thing that fits into the narrow “get links” mindset. Any promotional activity a business undertakes outside the narrow SEO paradigm is going to go right over the head of the competition. Rather than obsessing over links, it may be a turning point for all businesses to return to thinking of ways to promote the site, because links are less important today than they ever have been, while external signals of trust, expertise, and authoritativeness are quite likely more important today than at any other time in SEO history.
Takeaways
Link Building’s Declining Value: Links are still important, but less so than in the past; their influence on rankings has steadily decreased.
Google’s Increasingly Sophisticated Link Algorithms: Google has increasingly neutralized manipulative link strategies through algorithm updates and statistical detection methods.
Rise and Fall of Paid Link Schemes: Paid link networks once thrived but became increasingly ineffective by 2012, culminating in penalties via the Penguin update.
Ranking Where You’re Supposed to Rank: Google now largely down-ranks or ignores manipulative links, meaning sites rank based on actual quality and relevance. Sites can still face manual actions, so don’t depend on Google continuing to down-rank manipulative links.
Risks of Link Buying: Manual actions are difficult to recover from and can devastate sites that rely on rankings for revenue.
Local SEO Factors Rely Less On Links: For industries like personal injury law, local ranking signals (e.g., citations) often outweigh link impact.
Promotion Beyond Links: Real promotion builds brand awareness and credibility, often in ways that don’t involve links but may influence user behavior signals. External user behavior signals have been a part of Google’s signals since the very first PageRank algorithm, which itself models user behavior.
Google’s John Mueller and Martin Splitt discussed making changes to a web page, observing the SEO effect, and the importance of tracking those changes. There has long been hesitation around making too many SEO changes because of a patent filed years ago about monitoring frequent SEO updates to catch attempts to manipulate search results. In that context, Mueller’s answer to this question is meaningful as a signal of what’s considered safe.
Does this mean it’s okay now to keep making changes until the site ranks well? Yes, no, and probably. The issue was discussed on a recent Search Off the Record podcast.
Is It Okay To Make Content Changes For SEO Testing?
The context of the discussion was a hypothetical small business owner who has a website and doesn’t really know much about SEO. The situation is that they want to try something out to see if it will bring more customers.
Martin Splitt set up the discussion as the business owner asking different people for their opinions on how to update a web page but receiving different answers. Splitt then asked whether going ahead and changing the page is safe to do.
Martin asked:
“And I want to try something out. Can I just do that or do I hurt my website when I just try things out?”
Mueller affirmed that it’s okay to go ahead and try things out, commenting that most content management systems (CMS) enable a user to easily make changes to the content.
He responded:
“…for the most part you can just try things out. One of the nice parts about websites is, often, if you’re using a CMS, you can just edit the page and it’s live, and it’s done. It’s not that you have to do some big, elaborate …work to put it live.”
In the old days, Google used to update its index once a month. So SEOs would make their web page changes and then wait for the monthly update to see if those changes had an impact. Nowadays, Google’s index is essentially on a rolling update, responding to new content as it gets indexed and processed, with SERPs being re-ranked in reaction to changes, including user trends where something becomes newsworthy or seasonal (that’s where the freshness algorithm kicks in).
Making changes to a small site that doesn’t have much traffic is an easy thing. Making changes to a website responsible for the livelihood of dozens, scores, or even hundreds of people is a scary thing. So when it comes to testing, you really need to balance the benefits against the possibility that a change might set off a catastrophic chain of events.
Monitoring The SEO Effect
Mueller and Splitt next talked about being prepared to monitor the changes.
Mueller continued his answer:
“It’s very easy to try things out, let it sit for a couple of weeks, see what happens and kind of monitor to see is it doing what you want it to be doing. I guess, at that point, when we talk about monitoring, you probably need to make sure that you have the various things installed so that you actually see what is happening.
Perhaps set up Search Console for your website so that you see the searches that people are doing. And, of course, some way to measure the goal that you want, which could be something perhaps in Analytics or perhaps there’s, I don’t know, some other way that you track in person if you have a physical store, like are people actually coming to my business after seeing my website, because it’s all well and good to do SEO, but if you have no way of understanding has it even changed anything, you don’t even know if you’re on the right track or recognize if something is going wrong.”
Something that Mueller didn’t mention is the impact on user behavior on a web page. Does the updated content make people scroll less? Does it make them click on the wrong thing? Do people bounce out at a specific part of the web page?
That’s the kind of data Google Analytics does not provide because that’s not what it’s for. But you can get that data with a free Microsoft Clarity account. Clarity is a user behavior analytics SaaS app. It shows you where (anonymized) users are on a page and what they do. It’s an incredible window on web page effectiveness.
Martin Splitt responded:
“Yeah, that’s true. Okay, so I need a way of measuring the impact of my changes. I don’t know, if I make a new website version and I have different texts and different images and everything is different, will I immediately see things change in Search Console or will that take some time?”
Mueller responded that the amount of time it takes for changes to show up in Search Console depends on how big the site is and the scale of the changes.
Mueller shared:
“…if you’re talking about something like a homepage, maybe one or two other pages, then probably within a week or two, you should see that reflected in Search. You can search for yourself initially.
That’s not forbidden to search for yourself. It’s not that something will go wrong or anything. Searching for your site and seeing, whatever change that you made, has that been reflected. Things like, if you change the title to include some more information, you can see fairly quickly if that got picked up or not.”
When Website Changes Go Wrong
Martin next talks about what I mentioned earlier: when a change goes wrong. He makes the distinction between a technical change and changes for users. A technical change can be tested on a staging site, which is a sandboxed version of the website that search engines or users don’t see. This is actually a pretty good thing to do before updating WordPress plugins or doing something big like swapping out the template. A staging site enables you to test technical changes to make sure there’s nothing wrong. Giving the staged site a crawl with Screaming Frog to check for broken links or other misconfigurations is a good idea.
Mueller said that changes for SEO can’t be tested on a staged site, which means that whatever changes are made, you have to be prepared for the consequences.
Listen to the Search Off the Record episode from about the 24-minute mark.
The CEO of Conductor started a LinkedIn discussion about the future of AI SEO platforms, suggesting that the established companies will dominate and that 95 percent of the startups will disappear. Others argued that smaller companies will find their niche and that startups may be better positioned to serve user needs.
Besmertnik published his thoughts on why top platforms like Conductor, Semrush, and Ahrefs are better positioned to provide the tools users will need for AI chatbot and search visibility. He argued that the established companies have over a decade of experience crawling the web and scaling data pipelines, with which smaller organizations cannot compete.
Conductor’s CEO wrote:
“Over 30 new companies offering AI tracking solutions have popped up in the last few months. A few have raised some capital to get going. Here’s my take: The incumbents will win. 95% of these startups will flatline into the SaaS abyss.
…We work with 700+ enterprise brands and have 100+ engineers, PMs, and designers. They are all 100% focused on an AI search only future. …Collectively, our companies have hundreds of millions of ARR and maybe 1000x more engineering horsepower than all these companies combined.
Sure we have some tech debt and legacy. But our strengths crush these disadvantages…
…Most of the AEO/GEO startups will be either out of business or 1-3mm ARR lifestyle businesses in ~18 months. One or two will break through and become contenders. One or two of the largest SEO ‘incumbents’ will likely fall off the map…”
Is There Room For The “Lifestyle” Businesses?
Besmertnik’s remarks suggested that smaller tool companies earning one to three million dollars in annual recurring revenue, what he termed “lifestyle” businesses, would continue as viable companies but stood no chance of moving upward to become larger and more established enterprise-level platforms.
Rand Fishkin, cofounder of SparkToro, defended the smaller “lifestyle” businesses, saying that running one feels like cheating at business, happiness, and life.
He wrote:
“Nothing better than a $1-3M ARR “lifestyle” business.
…Let me tell you what I’m never going to do: serve Fortune 500s (nevermind 100s). The bureaucracy, hoops, and friction of those orgs is the least enjoyable, least rewarding, most avoid-at-all-costs thing in my life.”
Not to put words into Rand’s mouth, but what he seems to be saying is that it’s absolutely worthwhile to scale a business to the point where the work-life balance makes sense for the owner and their “lifestyle.”
Case For Startups
Not everyone agreed that established brands would successfully transition from SEO tools to AI search. Critics argued that startups are not burdened by legacy SEO ideas and infrastructure, and are better positioned to create AI-native solutions that more closely follow how users interact with AI chatbots and search.
Daniel Rodriguez, cofounder of Beewhisper, suggested that the next generation of winners may not be “better Conductors,” but rather companies that start from a completely different paradigm based on how AI users interact with information. His point of view suggests that legacy advantages may not be foundations for building strong AI search tools, but rather are more like anchors, creating a drag on forward advancement.
He commented:
“You’re 100% right that the incumbents’ advantages in crawling, data processing, and enterprise relationships are immense.
The one question this raises for me is: Are those advantages optimized for the right problem? All those strengths are about analyzing the static web – pages, links, and keywords.
But the new user journey is happening in a dynamic, conversational layer on top of the web. It’s a fundamentally different type of data that requires a new kind of engine.
My bet is that the 1-2 startups that break through won’t be the ones trying to build a better Conductor. They’ll be the ones who were unburdened by legacy and built a native solution for understanding these new conversational journeys from day one.”
Venture Capital’s Role In The AI SEO Boom
Mike Mallazzo, Ads + Agentic Commerce @ PayPal, questioned whether there’s a market to support multiple breakout startups and suggested that venture capital interest in AEO and GEO startups may not be rational. He believes that the market is there for modest, capital-efficient companies rather than fund-returning unicorns.
Mallazzo commented:
“I admire the hell out of you and SEMRush, Ahrefs, Moz, etc– but y’all are all a different breed imo– this is a space that is built for reasonably capital efficient, profitable, renegade pirate SaaS startups that don’t fit the Sand Hill hyper venture scale mold. Feels like some serious Silicon Valley naivete fueling this funding run….
Even if AI fully eats search, is the analytics layer going to be bigger than the one that formed in conventional SEO? Can more than 1-2 of these companies win big?”
New Kinds Of Search Behavior And Data?
Right now, the industry is still figuring out what needs to be tracked and what actually matters for AI visibility. For example, brand mentions are emerging as an important metric, but are they really? Will brand mentions put customers into the ecommerce checkout cart?
And then there’s the reality of zero-click searches: AI search largely wipes out the consideration stage of the customer’s purchasing journey, and the data from that stage is swallowed up with it. So if you’re going to talk about tracking the user’s journey and optimizing for it, this is a piece of the data puzzle that still needs to be solved.
Michael Bonfils, a 30-year search marketing veteran, raised these questions in a discussion about zero-click searches and how to survive them, saying:
“This is, you know, we have a funnel, we all know which is the awareness consideration phase and the whole center and then finally the purchase stage. The consideration stage is the critical side of our funnel. We’re not getting the data. How are we going to get the data?
So who is going to provide that? Is Google going to eventually provide that? Do they? Would they provide that? How would they provide that?
But that’s very important information that I need because I need to know what that conversation is about. I need to know what two people are talking about that I’m talking about …because my entire content strategy in the center of my funnel depends on that greatly.”
There’s a real question about what type of data these companies are providing to fill the gaps. The established platforms were built for the static web, keyword data, and backlink graphs. But the emerging reality of AI search is personalized and queryless. So, as Michael Bonfils suggested, the buyer journeys may occur entirely within AI interfaces, bypassing traditional SERPs altogether, which is the bread and butter of the established SEO tool companies.
AI SEO Tool Companies: Where Your Data Will Come From Next
If the future of search is not about search results and the attendant search query volumes but a dynamic dialogue, the kinds of data that matter and the systems that can interpret them will change. Will startups that specialize in tracking and interpreting conversational interactions become the dominant SEO tools? Companies like Conductor have a track record of expertly pivoting in response to industry needs, so how it will all shake out remains to be seen.
I’ve spent years working with Google’s SEO tools, and while there are countless paid options out there, Google’s free toolkit remains the foundation of my optimization workflow.
These tools show you exactly what Google considers important, and that offers invaluable insights you can’t get anywhere else.
Let me walk you through the five Google tools I use daily and why they’ve become indispensable for serious SEO work.
1. Lighthouse
Screenshot from Chrome DevTools, July 2025
When I first discovered Lighthouse tucked away in Chrome’s developer tools, it felt like finding a secret playbook from Google.
This tool has become my go-to for quick site audits, especially when clients come to me wondering why their perfectly designed website isn’t ranking.
Getting Started With Lighthouse
Accessing Lighthouse is surprisingly simple.
On any webpage, press F12 (Windows) or Command+Option+C (Mac) to open developer tools. You’ll find Lighthouse as one of the tabs. Alternatively, right-click any page, select “Inspect,” and navigate to the Lighthouse tab.
What makes Lighthouse special is its comprehensive approach. It evaluates performance, accessibility, best practices, and SEO (earlier Lighthouse versions also scored progressive web app support).
While accessibility might not seem directly SEO-related, I’ve learned that Google increasingly values sites that work well for all users.
Real-World Insights From The Community
The developer community has mixed feelings about Lighthouse, and I understand why.
As _listless noted, “Lighthouse is great because it helps you identify easy wins for performance and accessibility.”
However, CreativeTechGuyGames warned about the trap of chasing perfect scores: “There’s an important trade-off between performance and perceived performance.”
I’ve experienced this firsthand. One client insisted on achieving a perfect 100 score across all categories.
We spent weeks optimizing, only to find that some changes actually hurt user experience. The lesson? Use Lighthouse as a guide, not gospel.
Why Lighthouse Matters For SEO
The SEO section might seem basic as it checks things like meta tags, mobile usability, and crawling issues, but these fundamentals matter.
I’ve seen sites jump in rankings just by fixing the simple issues Lighthouse identifies. It validates crucial elements like:
Proper viewport configuration for mobile devices.
Title and meta description presence.
HTTP status codes.
Descriptive anchor text.
Hreflang implementation.
Canonical tags.
Mobile tap target sizing.
One frustrating aspect many developers mention is score inconsistency.
As one Redditor shared, “I ended up just re-running the analytics WITHOUT changing a thing and I got a performance score ranging from 33% to 90%.”
I’ve seen this too, which is why I always run multiple tests and focus on trends rather than individual scores.
Making The Most Of Lighthouse
My best advice? Use the “Opportunities” section for quick wins. Export your results as JSON to track improvements over time.
And remember what one developer wisely stated: “You can score 100 on accessibility and still ship an unusable [website].” The scores are indicators, not guarantees of quality.
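To make that JSON export habit concrete, here’s a minimal sketch, assuming reports were saved with Lighthouse’s --output=json flag into a folder (the lighthouse-reports name is a placeholder) and that the report format keeps its top-level fetchTime and categories fields:

```python
import json
from pathlib import Path

# Collect every exported Lighthouse report in a folder and log the
# category scores so you can track trends instead of single runs.
# Assumes reports were saved with --output=json; the JSON layout
# (top-level "fetchTime" and "categories", scores from 0 to 1) is
# current as of recent Lighthouse versions but may change.

def summarize_reports(report_dir: str) -> list[dict]:
    rows = []
    for path in sorted(Path(report_dir).glob("*.json")):
        report = json.loads(path.read_text())
        row = {"file": path.name, "fetched": report.get("fetchTime")}
        for name, category in report.get("categories", {}).items():
            score = category.get("score")
            # Lighthouse scores are 0-1; convert to the familiar 0-100 scale.
            row[name] = round(score * 100) if score is not None else None
        rows.append(row)
    return rows

if __name__ == "__main__":
    for row in summarize_reports("lighthouse-reports"):
        print(row)
```

Run it after each audit and compare the printed rows over time; as noted above, the trend across runs matters more than any single score.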
2. PageSpeed Insights
Screenshot from pagespeed.web.dev, July 2025
PageSpeed Insights transformed from a nice-to-have tool to an essential one when Core Web Vitals became ranking considerations.
What sets PageSpeed Insights apart is its combination of lab data (controlled test results) and field data (real user experiences from the Chrome User Experience Report).
This dual approach has saved me from optimization rabbit holes more times than I can count.
The field data is gold as it shows how real users experience your site over the past 28 days. I’ve had situations where lab scores looked terrible, but field data showed users were having a great experience.
This usually means the lab test conditions don’t match your actual user base.
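If you want to pull both datasets programmatically, the public PageSpeed Insights API (the v5 runPagespeed endpoint) returns the lab Lighthouse result and the CrUX field data in one response. A hedged sketch follows; the API key is an optional placeholder, and field data simply won’t be present for low-traffic sites:

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def lab_vs_field(url: str, api_key: str | None = None) -> None:
    """Fetch one PSI run and print the lab score next to CrUX field data."""
    params = {"url": url, "strategy": "mobile"}
    if api_key:  # a key raises quota limits; placeholder, not required
        params["key"] = api_key
    data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()

    # Lab data: the synthetic Lighthouse run that PSI executes.
    lab = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"Lab performance score: {round(lab * 100)}")

    # Field data: real Chrome users over the trailing collection window.
    field = data.get("loadingExperience", {})
    print("Field overall:", field.get("overall_category", "no CrUX data"))
    for metric, values in field.get("metrics", {}).items():
        print(f"  {metric}: p75={values['percentile']} ({values['category']})")

lab_vs_field("https://www.example.com")
```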
Community Perspectives On PSI
The Reddit community has strong opinions about PageSpeed Insights.
NHRADeuce perfectly captured a common frustration: “The score you get from PageSpeed Insights has nothing to do with how fast your site loads.”
While it might sound harsh, there’s truth to it since the score is a simplified representation of complex metrics.
Practical Optimization Strategies
Through trial and error, I’ve developed a systematic approach to PSI optimization.
Arzishere’s strategy mirrors mine: “Added a caching plugin along with minifying HTML, CSS & JS (WP Rocket).” These foundational improvements often yield the biggest gains.
DOM size is another critical factor. As Fildernoot discovered, “I added some code that increased the DOM size by about 2000 elements and PageSpeed Insights wasn’t happy about that.” I now audit DOM complexity as part of my standard process.
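For a quick first pass at DOM complexity, a rough sketch like the one below counts elements in the server-rendered HTML. It’s only an approximation: JavaScript-injected nodes won’t be counted, so treat the result as a floor rather than the figure Lighthouse reports.

```python
import requests
from bs4 import BeautifulSoup

def dom_element_count(url: str) -> int:
    """Rough DOM-size check: counts elements in the server-rendered HTML.

    Client-rendered nodes won't appear here, so this is a floor,
    not the number Lighthouse reports for the fully rendered page.
    """
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    return len(soup.find_all(True))  # True matches every tag

count = dom_element_count("https://www.example.com")
# Lighthouse starts warning around ~800 nodes and flags ~1,400+.
print(f"{count} elements in the initial HTML")
```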
Mobile optimization deserves special attention. A Redditor asked the right question: “How is your mobile score? Desktop is pretty easy with a decent theme and Litespeed hosting and LScaching plugin.”
In my experience, mobile scores are typically 20-30 points lower than desktop, and that’s where most of your users are.
The Diminishing Returns Reality
Here’s the hard truth about chasing perfect PSI scores: “You’re going to see diminishing returns as you invest more and more resources into this,” as E0nblue noted.
I tell clients to aim for “good” Core Web Vitals status rather than perfect scores. The jump from 50 to 80 is much easier and more impactful than 90 to 100.
3. Safe Browsing Test
Screenshot from transparencyreport.google.com/safe-browsing/search, July 2025
The Safe Browsing Test might seem like an odd inclusion in an SEO toolkit, but I learned its importance the hard way.
A client’s site got hacked, flagged by Safe Browsing, and disappeared from search results overnight. Their organic traffic dropped to zero in hours.
Understanding Safe Browsing’s Role
Google’s Safe Browsing protects users from dangerous websites by checking for malware, phishing attempts, and deceptive content.
As Lollygaggindovakiin explained, “It automatically scans files using both signatures of diverse types and uses machine learning.”
The tool lives in Google’s Transparency Report, and I check it monthly for all client sites. It shows when Google last scanned your site and any current security issues.
The integration with Search Console means you’ll get alerts if problems arise, but I prefer being proactive.
Community Concerns And Experiences
The Reddit community has highlighted some important considerations.
One concern raised by Nextdns is false positives: “Google is falsely flagging apple.com.akadns.net as malicious.” While rare, false flags can happen, which is why regular monitoring matters.
Privacy-conscious users raise valid concerns about data collection.
As Mera-beta noted, “Enhanced Safe Browsing will send content of pages directly to Google.” For SEO purposes, standard Safe Browsing protection is sufficient.
Why SEO Pros Should Care
When Safe Browsing flags your site, Google may:
Remove your pages from search results.
Display warning messages to users trying to visit.
Drastically reduce your click-through rates.
Impact your site’s trust signals.
I’ve helped several sites recover from security flags. The process typically takes one to two weeks after cleaning the infection and requesting a review.
That’s potentially two weeks of lost traffic and revenue, so prevention is infinitely better than cure.
Best Practices For Safe Browsing
My security checklist includes:
Weekly automated scans using the Safe Browsing API for multiple sites (a minimal sketch follows this list).
Immediate investigation of any Search Console security warnings.
Regular audits of third-party scripts and widgets.
Monitoring of user-generated content areas.
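For those weekly scans, here’s a minimal sketch against the Safe Browsing Lookup API (v4 threatMatches:find). The API key and client ID are placeholders; you’d provision a key in Google Cloud first.

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder; create one in Google Cloud
LOOKUP_URL = "https://safebrowsing.googleapis.com/v4/threatMatches:find"

def check_sites(urls: list[str]) -> list[dict]:
    """Ask the Safe Browsing v4 Lookup API whether any URL is flagged."""
    payload = {
        "client": {"clientId": "weekly-seo-audit", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": [
                "MALWARE",
                "SOCIAL_ENGINEERING",
                "UNWANTED_SOFTWARE",
            ],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }
    resp = requests.post(
        LOOKUP_URL, params={"key": API_KEY}, json=payload, timeout=30
    )
    resp.raise_for_status()
    # An empty response body means nothing matched a threat list.
    return resp.json().get("matches", [])

matches = check_sites(["https://www.example.com", "https://client-site.example"])
print("Flagged:", matches or "none")
```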
4. Google Trends
Screenshot from Google Trends, July 2025
Google Trends has evolved from a curiosity tool to a strategic weapon in my SEO arsenal.
With updates now happening every 10 minutes and AI-powered trend detection, it’s become indispensable for content strategy.
Beyond Basic Trend Watching
What many SEO pros miss is that Trends isn’t just about seeing what’s popular. I use it to validate content ideas before investing resources.
Community Perspectives On Google Trends
The Reddit community offers balanced perspectives on Google Trends.
Maltelandwehr highlighted its unique value: “Some of the data in Google Trends is really unique. Even SEOs with monthly 7-figure budgets will use Google Trends for certain questions.”
However, limitations exist. As Dangerroo_2 clarified, “Trends does not track popularity, but search demand.”
This distinction matters since a declining trend doesn’t always mean fewer total searches, just decreasing relative interest.
For niche topics, frustrations mount. iBullyDummies complained, “Google has absolutely ruined Google Trends and no longer evaluates niche topics.” I’ve found this particularly true for B2B or technical terms with lower search volumes.
Advanced Trends Strategies
My favorite Trends hacks include:
The Comparison Method: I always compare terms against each other rather than viewing them in isolation. This reveals relative opportunity better than absolute numbers (see the sketch after this list).
Category Filtering: This prevents confusion between similar terms. The classic example is “jaguar” where without filtering, you’re mixing car searches with animal searches.
Rising Trends Mining: The “Rising” section often reveals opportunities before they become competitive. I’ve launched successful content campaigns by spotting trends here early.
Geographic Arbitrage: Finding topics trending in one region before they spread helps you prepare content in advance.
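For the comparison method, the sketch below uses pytrends, an unofficial, community-maintained Python client for Google Trends. It can break whenever Google changes its endpoints, so treat it as illustrative rather than a supported API; the keywords and timeframe are examples.

```python
from pytrends.request import TrendReq

# Compare terms head-to-head instead of one at a time, mirroring the
# comparison method above. Trends accepts up to five terms per payload.
pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(
    ["crm software", "sales automation"],
    timeframe="today 12-m",
    geo="US",
)

interest = pytrends.interest_over_time()  # weekly relative interest, 0-100
if not interest.empty:
    # Drop the bookkeeping column and inspect the most recent weeks.
    print(interest.drop(columns=["isPartial"]).tail())
```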
Addressing The Accuracy Debate
Some prefer paid tools, as Contentwritenow stated: “I prefer using a paid tool like BuzzSumo or Semrush for trends and content ideas simply because I don’t trust Google Trends.”
While I use these tools too, they pull from different data sources. Google Trends shows actual Google search behavior, which is invaluable for SEO.
I always combine Trends data with absolute volume estimates from other tools.
5. Google Search Console
No list of Google SEO tools would be complete without Search Console.
If the other tools are your scouts, Search Console is your command center, showing exactly how Google sees and ranks your site.
Why Search Console Is Irreplaceable
Search Console provides data you literally cannot get anywhere else. As Peepeepoopoobutler emphasized, “GSC is the accurate real thing. But it doesn’t really give suggestions like ads does.”
That’s exactly right. While it won’t hold your hand with optimization suggestions, the raw data it provides is gold.
The tool offers:
Actual search queries driving traffic (not just keywords you think matter).
True click-through rates by position.
Index coverage issues before they tank your traffic.
Core Web Vitals data for all pages.
Manual actions and security issues that could devastate rankings.
I check Search Console daily, and I’m not alone.
Successful site owner ImportantDoubt6434 shared, “Yes monitoring GSC is part of how I got my website to the front page.”
The Performance report alone has helped me identify countless optimization opportunities.
Setting Up For Success
Getting started with Search Console is refreshingly straightforward.
As Anotherbozo noted, “You don’t need to verify each individual page but maintain the original verification method.”
I recommend domain-level verification for comprehensive access. You can “verify ownership by site or by domain (second level domain),” but domain verification gives you data across all subdomains and protocols.
The verification process takes minutes, but the insights last forever. I’ve seen clients discover they were ranking for valuable keywords they never knew about, simply because they finally looked at their Search Console data.
Hidden Powers Of Search Console
What many SEO pros miss are the advanced capabilities lurking in Search Console.
Seosavvy revealed a powerful strategy: “Google search console for keyword research is super powerful.” I couldn’t agree more.
By filtering for queries with high impressions but low click-through rates, you can find content gaps and optimization opportunities your competitors miss.
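Here’s a minimal sketch of that filter using the Search Console API via google-api-python-client. It assumes a service account (the gsc-key.json filename is a placeholder) that has been added as a user on the property; the site URL, dates, and thresholds are illustrative.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Service-account key file is a placeholder name; the account must be
# added as a user on the Search Console property before this works.
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "gsc-key.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="sc-domain:example.com",  # domain-property syntax
    body={
        "startDate": "2025-06-01",
        "endDate": "2025-06-30",
        "dimensions": ["query"],
        "rowLimit": 5000,
    },
).execute()

# Surface queries people see often but rarely click.
for row in response.get("rows", []):
    if row["impressions"] > 1000 and row["ctr"] < 0.02:
        print(row["keys"][0], row["impressions"], f"{row['ctr']:.1%}")
```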
The structured data reports have saved me countless hours. CasperWink mentioned working with schemas, “I have already created the schema with a review and aggregateRating along with confirming in Google’s Rich Results Test.”
Search Console will tell you if Google can actually read and understand your structured data in the wild, something testing tools can’t guarantee.
Sitemap management is another underutilized feature. Yetisteve correctly stated, “Sitemaps are essential, they are used to give Google good signals about the structure of the site.”
I’ve diagnosed indexing issues just by comparing submitted versus indexed pages in the sitemap report.
The Reality Check: Limitations To Understand
Here’s where the community feedback gets really valuable.
Experienced user SimonaRed warned, “GSC only shows around 50% of the reality.” This is crucial to understand since Google samples and anonymizes data for privacy. You’re seeing a representative sample, not every single query.
Some find the interface challenging. As UncleFeather6000 admitted, “I feel like I don’t really understand how to use Google’s Search Console.”
I get it because the tool has evolved significantly, and the learning curve can be steep. My advice? Start with the Performance report and gradually explore other sections.
Recent changes have frustrated users, too. “Google has officially removed Google Analytics data from the Search Console Insights tool,” Shakti-basan noted.
This integration loss means more manual work correlating data between tools, but the core Search Console data remains invaluable.
Making Search Console Work Harder
Through years of daily use, I’ve developed strategies to maximize Search Console’s value:
The Position 11-20 Gold Mine: Filter for keywords ranking on page two. These are your easiest wins since Google already thinks you’re relevant; you just need a push to page one (a small filtering sketch follows this list).
Click-Through Rate Optimization: Sort by impressions, then look for low CTR. These queries show demand but suggest your titles and descriptions need work.
Query Matching: Compare what you think you rank for versus what Search Console shows. The gaps often reveal content opportunities or user intent mismatches.
Page-Level Analysis: Don’t just look at site-wide metrics. Individual page performance often reveals technical issues or content problems.
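To make the first two strategies concrete, here’s a small, self-contained sketch that buckets exported Search Console rows into the position 11-20 and low-CTR lists. It assumes rows shaped like the API or CSV export (query, impressions, ctr, position); the thresholds are illustrative.

```python
# Bucket exported Search Console rows into the two filters above.
# Row shape mirrors the API/CSV export: query, impressions, ctr, position.

def bucket_opportunities(rows: list[dict]) -> dict[str, list[str]]:
    # Page-two keywords: already relevant, one push from page one.
    page_two = [r["query"] for r in rows if 11 <= r["position"] <= 20]
    # High demand, weak snippet: candidates for title/description rewrites.
    weak_ctr = [
        r["query"]
        for r in rows
        if r["impressions"] >= 1000 and r["ctr"] < 0.02
    ]
    return {"page_two_gold_mine": page_two, "ctr_rewrites": weak_ctr}

sample = [
    {"query": "crm pricing", "impressions": 4200, "ctr": 0.011, "position": 12.4},
    {"query": "best crm", "impressions": 9100, "ctr": 0.054, "position": 6.1},
]
print(bucket_opportunities(sample))
```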
Integrating Search Console With Other Tools
The magic happens when you combine Search Console data with the other tools:
Use Trends to validate whether declining traffic is due to ranking drops or decreased search interest.
Cross-reference PageSpeed Insights recommendations with pages showing Core Web Vitals issues in Search Console.
Verify Lighthouse mobile-friendliness findings against Mobile Usability reports.
Monitor Safe Browsing status directly in the Security Issues section.
Mr_boogieman asked rhetorically, “How are you tracking results without looking at GSC?” It’s a fair question.
Without Search Console, you’re flying blind, relying on third-party estimations instead of data straight from Google.
Bringing It All Together
These five tools form the foundation of effective SEO work. They’re free, they’re official, and they show you exactly what Google values.
While specialized SEO platforms offer additional features, mastering these Google tools ensures your optimization efforts align with what actually matters for rankings.
My workflow typically starts with Search Console to identify opportunities, uses Trends to validate content ideas, employs Lighthouse and PageSpeed Insights to optimize technical performance, and includes Safe Browsing checks to protect hard-won rankings.
Remember, these tools reflect Google’s current priorities. As search algorithms evolve, so do these tools. Staying current with their features and understanding their insights keeps your SEO strategy aligned with Google’s direction.
The key is using them together, understanding their limitations, and remembering that tools are only as good as the strategist wielding them. Start with these five, master their insights, and you’ll have a solid foundation for SEO success.