Google Defends Parasite SEO Crackdown As EU Opens Investigation via @sejournal, @MattGSouthern

Google has defended its enforcement of site reputation abuse policies after the European Commission announced an investigation into whether the company unfairly demotes news publishers in search results.

The company published a blog post stating the investigation “is misguided and risks harming millions of European users” and that it “risks rewarding bad actors and degrading the quality of search results.”

Google’s Chief Scientist for Search, Pandu Nayak, wrote the response.

Background

The European Commission announced an investigation under the Digital Markets Act examining whether Google’s anti-spam policies unfairly penalize legitimate publisher revenue models.

Publishers complained that Google demotes news sites running sponsored content and third-party promotional material. EU antitrust chief Teresa Ribera said:

“We are concerned that Google’s policies do not allow news publishers to be treated in a fair, reasonable and non-discriminatory manner in its search results.”

Google updated its site reputation abuse policy last year to combat parasite SEO. The practice involves spammers paying publishers to host content on established domains to manipulate search rankings.

The policy targets content like payday loan reviews on educational sites, casino content on medical sites, or third-party coupon pages on news publishers. Google provided specific examples in its announcement including weight-loss pill spam and payday loan promotions.

Manual enforcement began shortly after. Google issued penalties to major publishers including Forbes, The Wall Street Journal, Time and CNN in November 2024.

Google later updated the policy to clarify that first-party oversight doesn’t exempt content primarily designed to exploit ranking signals.

Google’s Defense

Google’s response emphasized three points.

First, Google stated that a German court dismissed a similar claim, ruling the anti-spam policy was “valid, reasonable, and applied consistently.”

Second, Google says its policy protects users from scams and low-quality content. Allowing pay-to-play ranking manipulation would “enable bad actors to displace sites that don’t use those spammy tactics.”

Third, Google says smaller creators support the crackdown. The company claims its policy “helps level the playing field” so legitimate sites competing on content quality aren’t outranked by sites using deceptive tactics.

Nayak argues the Digital Markets Act is already making Search “less helpful for European businesses and users,” and says the new probe risks rewarding bad actors.

The company has relied exclusively on manual enforcement so far. Google confirmed in May 2024 that it hadn’t launched algorithmic actions for site reputation abuse, only manual reviews by human evaluators.

Google added site reputation abuse to its Search Quality Rater Guidelines in January 2025, defining it as content published on host sites “mainly because of that host site’s already-established ranking signals.”

Why This Matters

The investigation creates a conflict between spam enforcement and publisher business models.

Google maintains parasite SEO degrades search results regardless of who profits. Publishers argue sponsored content with editorial oversight provides legitimate value and revenue during challenging times for media.

The distinction matters. If Google’s policy captures legitimate publisher-advertiser partnerships, it restricts how news organizations monetize content. If the policy only targets manipulative tactics, it protects search quality without limiting legitimate monetization.

The EU’s position suggests regulators view Google’s enforcement as potentially discriminatory. The Digital Markets Act prohibits gatekeepers from unfairly penalizing others, with fines up to 10% of global revenue for violations.

Google addressed concerns about the policy in December 2024, confirming that affiliate content properly marked isn’t affected and that publishers must submit reconsideration requests through Search Console to remove penalties.

The updated policy documentation clarified that simply having third-party content isn’t a violation unless explicitly published to exploit a site’s rankings.

The policy has sparked debate in the SEO community about whether Google should penalize sites based on business arrangements rather than content quality.

Looking Ahead

The European Commission has opened the investigation under the Digital Markets Act and will now gather evidence and define the specific DMA provisions under examination.

Google will receive a formal statement of objections outlining alleged violations. The company can respond with arguments defending its policies.

DMA investigations move faster than traditional antitrust cases. Publishers may submit formal complaints providing evidence of traffic losses and revenue impacts.

The outcome could force changes to how Google enforces spam policies in Europe or validate its current approach to protecting search quality.


Featured Image: daily_creativity/Shutterstock

llms.txt: The Web’s Next Great Idea, Or Its Next Spam Magnet via @sejournal, @DuaneForrester

At a recent conference, I was asked if llms.txt mattered. I’m personally not a fan, and we’ll get into why below. I listened to a friend who told me I needed to learn more about it as she believed I didn’t fully understand the proposal, and I have to admit that she was right. After doing a deep dive on it, I now understand it much better. Unfortunately, that only served to crystallize my initial misgivings. And while this may sound like a single person disliking an idea, I’m actually trying to view this from the perspective of the search engine or the AI platform. Why would they, or why wouldn’t they, adopt this protocol? And that POV led me to some, I think, interesting insights.

We all know that search is not the only discovery layer anymore. Large-language-model (LLM)-driven tools are rewriting how web content is found, consumed, and represented. The proposed protocol, called llms.txt, attempts to help websites guide those tools. But the idea carries the same trust challenges that killed earlier “help the machine understand me” signals. This article explores what llms.txt is meant to do (as I understand it), why platforms would be reluctant, how it can be abused, and what must change before it becomes meaningful.

Image Credit: Duane Forrester

What llms.txt Hoped To Fix

Modern websites are built for human browsers: heavy JavaScript, complex navigation, interstitials, ads, dynamic templates. But most LLMs, especially at inference time, operate in constrained environments: limited context windows, single-pass document reads, and simpler retrieval than traditional search indexers. The original proposal from Answer.AI suggests adding an llms.txt markdown file at the root of a site, which lists the most important pages, optionally with flattened content so AI systems don’t have to scramble through noise.

Supporters describe the file as “a hand-crafted sitemap for AI tools” rather than a crawl-block file. In short, the theory: Give your site’s most valuable content in a cleaner, more accessible format so tools don’t skip it or misinterpret it.
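
To make the format concrete, here is a minimal, hypothetical llms.txt along the lines of the Answer.AI proposal: a markdown file at the site root with an H1 site name, a short blockquote summary, and sections of links with brief descriptions. The company, URLs, and descriptions below are invented for illustration only.

# Example Widgets Co.

> Example Widgets Co. makes industrial widgets. This file lists the pages most useful to AI tools.

## Docs

- [Widget sizing guide](https://www.example.com/docs/sizing.md): How to choose a widget size
- [Installation FAQ](https://www.example.com/docs/install-faq.md): Answers to common installation questions

## Optional

- [Company history](https://www.example.com/about/history.md): Background reading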

The Trust Problem That Never Dies

If you step back, you discover this is a familiar pattern. Early in the web’s history, something like the meta keywords tag let a site declare what it was about; it was widely abused and ultimately ignored. Similarly, authorship markup (rel=author, etc) tried to help machines understand authority, and again, manipulation followed. Structured data (schema.org) succeeded only after years of governance and shared adoption across search engines. llms.txt sits squarely inside this lineage: a self-declared signal that promises clarity but trusts the publisher to tell the truth. Without verification, every little root-file standard becomes a vector for manipulation.

The Abuse Playbook (What Spam Teams See Immediately)

What concerns platform policy teams is plain: If a website publishes a file called llms.txt and claims whatever it likes, how does the platform know that what’s listed matches the live content users see, or can be trusted in any way? Several exploit paths open up:

  1. Cloaking through the manifest. A site lists pages in the file that are hidden from regular visitors or behind paywalls, then the AI tool ingests content nobody else sees.
  2. Keyword stuffing or link dumping. The file becomes a directory stuffed with affiliate links, low-value pages, or keyword-heavy anchors aimed at gaming retrieval.
  3. Poisoning or biasing content. If agents trust manifest entries more than the crawl of messy HTML, a malicious actor can place manipulative instructions or biased lists that affect downstream results.
  4. Third-party link chains. The file could point to off-domain URLs, redirect farms, or content islands, making your site a conduit or amplifier for low-quality content.
  5. Trust laundering. The presence of a manifest might lead an LLM to assign higher weight to listed URLs, so a thin or spammy page gets a boost purely by appearance of structure.

The broader commentary flags this risk. For instance, some industry observers argue that llms.txt “creates opportunities for abuse, such as cloaking.” And community feedback apparently confirms minimal actual uptake: “No LLM reads them.” That absence of usage ironically means fewer real-world case studies of abuse, but it also means fewer safety mechanisms have been tested.

Why Platforms Hesitate

From a platform’s viewpoint, the calculus is pragmatic: New signals add cost, risk, and enforcement burden. Here’s how the logic works.

First, signal quality. If llms.txt entries are noisy, spammy, or inconsistent with the live site, then trusting them can reduce rather than raise content quality. Platforms must ask: Will this file improve our model’s answer accuracy or create risk of misinformation or manipulation?

Second, verification cost. To trust a manifest, you need to cross-check it against the live HTML, canonical tags, structured data, site logs, etc. That takes resources. Without verification, a manifest is just another list that might lie.

Third, abuse handling. If a bad actor publishes an llms.txt manifest that lists misleading URLs which an LLM ingests, who handles the fallout? The site owner? The AI platform? The model provider? That liability issue is real.

Fourth, user-harm risk. An LLM citing content from a manifest might produce inaccurate or biased answers. This just adds to the current problem we already face with inaccurate answers and people following incorrect, wrong, or dangerous answers.

Google has already stated that it will not rely on llms.txt for its “AI Overviews” feature and continues to follow “normal SEO.” And John Mueller wrote: “FWIW no AI system currently uses llms.txt.” So the tools that could use the manifest are largely staying on the sidelines. This reflects the idea that a root-file standard without established trust is a liability.

Why Adoption Without Governance Fails

Every successful web standard has shared DNA: a governing body, a clear vocabulary, and an enforcement pathway. The standards that survive all answer one question early … “Who owns the rules?”

Schema.org worked because that answer was clear. It began as a coalition between Bing, Google, Yahoo, and Yandex. The collaboration defined a bounded vocabulary, agreed syntax, and a feedback loop with publishers. When abuse emerged (fake reviews, fake product data), those engines coordinated enforcement and refined documentation. The signal endured because it wasn’t owned by a single company or left to self-police.

Robots.txt, in contrast, survived by being minimal. It didn’t try to describe content quality or semantics. It only told crawlers what not to touch. That simplicity reduced its surface area for abuse. It required almost no trust between webmasters and platforms. The worst that could happen was over-blocking your own content; there was no incentive to lie inside the file.

llms.txt lives in the opposite world. It invites publishers to self-declare what matters most and, in its full-text variant, what the “truth” of that content is. There’s no consortium overseeing the format, no standardized schema to validate against, and no enforcement group to vet misuse. Anyone can publish one. Nobody has to respect it. And no major LLM provider today is known to consume it in production. Maybe they are, privately, but publicly, no announcements about adoption.

What Would Need To Change For Trust To Build

To shift from optional neat-idea to actual trusted signal, several conditions must be met, and each of these entails a cost in either dollars or human time, so again, dollars.

  • First, manifest verification. A signature or DNS-based verification could tie an llms.txt file to site ownership, reducing spoof risk. (cost to website)
  • Second, cross-checking. Platforms should validate that the URLs listed correspond to live, public pages, and identify mismatches or cloaking via automated checks; a rough sketch of this follows the list. (cost to engine/platform)
  • Third, transparency and logging. Public registries of manifests and logs of updates would make dramatic changes visible and allow community auditing. (cost to someone)
  • Fourth, measurement of benefit. Platforms need empirical evidence that ingesting llms.txt leads to meaningful improvements in answer correctness, citation accuracy, or brand representation. Until then, this is speculative. (cost to engine/platform)
  • Finally, abuse deterrence. Mechanisms must be built to detect and penalize spammy or manipulative manifest usage. Without that, spam teams simply assume negative benefit. (cost to engine/platform)
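
As a rough illustration of the cross-checking idea above, a platform (or a site owner auditing their own manifest) could fetch llms.txt, pull out the listed URLs, and confirm each one resolves to a live, public page. This is a minimal sketch under those assumptions, not anything any platform is known to run; the domain is a placeholder.

import re
import urllib.request

SITE = "https://www.example.com"  # placeholder domain

def fetch(url):
    # Plain GET with a short timeout; returns status code and body text.
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.status, resp.read().decode("utf-8", errors="replace")

# 1. Fetch the manifest from the site root.
status, manifest = fetch(SITE + "/llms.txt")

# 2. Extract markdown-style links: [title](url)
listed_urls = re.findall(r"\]\((https?://[^)\s]+)\)", manifest)

# 3. Confirm each listed URL is live and public (no auth wall, no 404).
for url in listed_urls:
    try:
        code, _ = fetch(url)
        print(url, "OK" if code == 200 else f"unexpected status {code}")
    except Exception as err:
        print(url, "failed:", err)

A real check would go further, comparing the manifest's flattened text against the rendered page to catch cloaking, which is exactly the verification cost the bullet describes.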

Until those elements are in place, platforms will treat llms.txt as optional at best or irrelevant at worst. So maybe you get a small benefit? Or maybe not…

The Real Value Today

For site owners, llms.txt still may have some value, but not as a guaranteed path to traffic or “AI ranking.” It can function as a content alignment tool, guiding internal teams to identify priority URLs you want AI systems to see. For documentation-heavy sites, internal agent systems, or partner tools that you control, it may make sense to publish a manifest and experiment.

However, if your goal is to influence large public LLM-powered results (such as those by Google, OpenAI, or Perplexity), you should tread cautiously. There is no public evidence those systems honor llms.txt yet. In other words: Treat llms.txt as a “mirror” of your content strategy, not a “magnet” pulling traffic. Of course, this means building the file(s) and maintaining them, so factor in the added work vs. whatever return you believe you will receive.

Closing Thoughts

The web keeps trying to teach machines about itself. Each generation invents a new format, a new way to declare “here’s what matters.” And each time the same question decides its fate: “Can this signal be trusted?” With llms.txt, the idea is sound, but the trust mechanisms aren’t yet baked in. Until verification, governance, and empirical proof arrive, llms.txt will reside in the grey zone between promise and problem.



This post was originally published on Duane Forrester Decodes.


Featured Image: Roman Samborskyi/Shutterstock

Secrets Of A Wildly Successful Website via @sejournal, @martinibuster

Back in 2005, I intuited that there were wildly successful Internet enterprises that owed nothing to SEO. These successes intrigued me because they happened according to undocumented rules outside of the SEO bubble. These sites have stories and lessons about building success.

Turning Your Enthusiasm Into Success

In 2005 I interviewed the founder of the Church Of The Flying Spaghetti Monster, which at the time had a massive PageRank score of 7. The founder explained that promotion was never part of a plan; in fact, he denied having any success plan at all. He simply put the visual material out there and let people hotlink the heck out of it, at a rate of 40 GB per day back in 2005.

The site is controversial because it was created in response to an idea called Intelligent Design, an ideology that holds that aspects of the universe and life are the products of an unseen intelligent hand and not products of undirected processes like evolution and natural selection. This article is not about religion; it’s about how someone leveraged their passion to create a wildly successful website.

The point is, there was no direct benefit to hotlinking, only the indirect benefits of putting his name out there and having it seen, known, and remembered. It’s the essence of what we talk about when we talk about brand and mindshare building. Which is why I say that this interview is wildly relevant in 2013. Many of my most innovative methods for obtaining links are rooted in identifying latent opportunities related to indirect benefits. There is a lot of opportunity there because most of the industry is focused on the direct-benefits/ROI mindset. Without further ado, here is the interview. Enjoy!

Secrets Of A Wildly Popular Website

The other day I stumbled across a successful website called the Church of the Flying Spaghetti Monster, which does about 40 GB of traffic (including hotlinks) every single day. The site was created as a response to a social, cultural, political, and religious issue of the day.

Many of you are interested in developing strategies for creating massively popular sites, so the following story of this hyper-successful website (PR 7, in case you were wondering) may be of interest.

Creating a website to react to controversy or a current event is an old but perhaps forgotten method for receiving links. Blogs fit into this plan very nicely. The following is the anatomy of a website created purely for the passion of it. It was not created for links or monetary benefit. Nevertheless, it has accomplished what thousands of link-hungry, money-grubbing webmasters aspire to every day. Ha!

So let’s take a peek behind the scenes of a wildly successful site that also makes decent change. The following is an interview with Bobby Henderson, the man behind the site.

Can you give me a little history of the Church of the Flying Spaghetti Monster website?

“The site was never planned. ‘The letter’ had been written and sent off – with no reply – for months before it occurred to me to post it online.”

Have you ever built a website before, what is your web background?

“I made a website for the Roseburg, Oregon school district when I was in high school.

With the Flying Spaghetti Monster (FSM) site, I want things to be as plain and non-shiny as possible. Screw aesthetics. I don’t want it to look slick and well-designed at all. I prefer it to be just slapped together, with new content added frequently. I love it when people give me tips to make the site better. It’s received well over 100 million hits at this point, so maybe there’s something to this content-instead-of-shiny-ness thing.”

What made you decide to build your website?

“The idea of a Flying Spaghetti Monster was completely random. I wrote the letter at about 3am one night, for no particular reason other than I couldn’t sleep. And there must have been something in news about ID that day.

After posting the letter online, it was “discovered” almost immediately. It got boingboing’ed within a couple of weeks, and blew up from there. I’ve done zero “promotion”. Promotion is fake. None of the site was planned; it has evolved over the months. Same with the whoring-out, the t-shirts, etc. None of that stuff was my idea. People asked for it, so I put it up. I can remember telling a friend that I would be shocked if one person bought a t-shirt. Now there have been around 20k sold.”

To what do you attribute the support of your site from so many people?

“I believe the support for the FSM project comes from spite…

I get 100-200 emails a day. Depends on the news, though. I got maybe 300 emails about that “pirate” attack on the cruise-ship. Incidentally, the reason we saw no change in global weather was because they were not real pirates. Real pirates don’t have machine guns and speedboats. (editor’s note: The FSM dogma asserts a connection between pirates and global warming)”

Were you surprised at how the site took off?

“Yes, of course I’m surprised the site took off. And it blows my mind that it’s still alive. Yesterday was the highest-traffic day yet, with 3.5 million hits (most of those hits were hotlinked images).”

What advice do you have to others who have a site they want to promote?

“Advice… OK, here’s something. A lot of people go out of their way to stop hotlinking. I go out of my way to allow it – going so far as paying for the extra bandwidth to let people steal my stuff. Why? It’s all part of the propaganda machine. It would be easy enough to prevent people from hotlinking FSM images. But I WANT people to see my propaganda, so why not allow it?

It’s like advertising, requiring zero effort by me. I am paying for about 40GB in bandwidth every day in just hijacked images – and it’s totally worth it, because now the Flying Spaghetti Monster is everywhere.”

Seeing how your deity is a flying spaghetti monster, I am curious… do you like eating spaghetti?

“No comment.”

Featured Image by Shutterstock/Elnur

Data Shows How AI Overviews Is Ranking Shopping Keywords via @sejournal, @martinibuster

BrightEdge’s latest research shows that Google’s AI Overviews are now appearing in ways that reflect what BrightEdge describes as “deliberate, aggressive choices” about where AI shows up and where it does not. These trends show marketers where AI search is showing up within the buyer’s journey and what businesses should expect.

The data indicates that Google is concentrating AI in parts of the shopping process where it gives clear informational value, particularly during research and evaluation. This aligns AI Overviews with the points in the shopping journey where users need help comparing options or understanding product details.

BrightEdge reports that Google retained only about 30 percent of the AI Overview keywords that appeared at the peak of its September 1 through October 15, 2025 research window. The retained queries also tended to have higher search volume than the removed ones, which BrightEdge notes is the opposite pattern observed in 2024. This fits with the higher retention in categories where shoppers look for explanations, comparisons, and instructional information.

BrightEdge explains:

“The numbers paint an interesting story: Google retained only 30% of its peak AI Overview keywords. But here’s what makes 2025 fundamentally different: those retained keywords have HIGHER search volume than removed ones—the complete opposite of 2024. Google isn’t just pulling back; it’s being strategic about which searches deserve AI guidance.”

The shifting behavior of AI Overviews shows how actively Google is tuning its system. BrightEdge observed a spike from 9 percent to 26 percent coverage on September 18 before returning to 9 percent soon after. This change signals ongoing testing. The year-over-year overlap of AI Overview keywords is only 18 percent, which BrightEdge calls a “massive reshuffling” that shows “active experimentation” and requires marketers to plan for change rather than stability. The volatility shows Google may be experimenting or responding to user trends and that the queries shown in AI Overviews can change over time.

My opinion is that Google is likely responding to user trends, testing how users respond to AI Overviews, and then using the data to show more AI Overviews if those reactions are positive.

AI Is A Comparison And Evaluation Layer

BrightEdge’s research indicates that AI Overviews aligns with shopper intent. Google places AI in research queries such as “best TV for gaming,” continues support for evaluation queries like “Samsung vs LG,” and then withdraws when users show purchase intent with searches like “Samsung S95C price.”

These examples show that AI serves as an educational and comparison layer, not a transactional one. When a shopper reaches a buying decision, Google steps back and lets traditional results handle the final step. This apparent alignment with comparison and evaluation means Google is confident in using AI Overviews as a part of the shopping journey.

Usefulness Varies Across Categories

The data shows that AI’s usefulness varies across categories, and Google adjusts AI Overview keyword retention accordingly. Categories that retained AI Overviews, such as Grocery, TV and Home Theater, and Small Appliances, share a pattern.

Users rely on comparison, explanation, and instruction during their decisions. In contrast, categories with low retention, like Furniture and Home, rely on visual browsing rather than text-based evaluation. This limits the value of AI. Google’s category patterns show that AI appears more often in categories where text-based information (such as comparison, explanation, and instruction) guides decisions.

Google’s keyword filtering clarifies how AI fits into the shopping journey. Among retained queries, a little more than a quarter are evaluation or comparison searches, including “best [product]” and “X vs Y” terms. These are queries where users need background and guidance. In contrast, Google removes bottom-funnel keywords: price, buy, deals, and specific product names. This shows Google’s focus is on how useful AI is for each intent. AI educates and guides but does not handle the final purchase step.

Shopping Trends Influence AI Appearance

The shopping calendar shapes how AI appears in search results. BrightEdge describes the typical shopping journey as consisting of research in November, evaluation and comparison in early December, and buying in late December. AI helps shoppers understand options in November, assists with comparisons in early December, and by late December, AI tends to be less influential and traditional search results tend to complete the sale.

This makes November the key moment for making evaluation and comparison content easier for AI to cite. Once December arrives, the chance for AI-driven discovery shrinks because consumers have moved on to the final leg of their shopping journey, purchase.

These findings mean that brands should align their content strategies with the points in the journey where AI Overviews are active. BrightEdge advises identifying evaluation and transactional pages, ensuring that comparison content is indexed early, and watching category-specific retention patterns. The data indicates two areas where brands can focus their efforts. One is supporting AI during research and review stages. The other is improving organic search visibility for purchasing queries. The 18 percent year-over-year consistency figure also shows that flexibility is needed because the queries shown in AI Overviews change frequently.

Although the behavior of AI Overviews may seem volatile, BrightEdge’s research suggests that the changes follow a consistent pattern. AI surfaces when people are learning and evaluating and withdraws when users shift into buying. Categories that require explanations or comparisons see the highest retention in AI Overviews, and November remains the key period when AI can use that content. The overall pattern gives brands a clearer view of how AI fits into the shopping journey and how user intent shapes where AI shows up.

Read BrightEdge’s report:
Google AI Overview Holiday Shopping Test: The 57% Pullback That Changes Everything

Featured Image by Shutterstock/Misselss

Why Web Hosting Is A Critical Factor To Maximize SEO Results via @sejournal, @MattGSouthern

Most SEO professionals obsess over content, links, and technical implementations. We track algorithm updates and audit on-page elements with precision. But there’s one factor that determines whether all that work can deliver results.

Your web hosting controls every user’s first interaction with your site. It determines load speeds, uptime consistency, and Core Web Vitals scores before anyone reads a word you’ve written.

Here’s the reality. Your hosting provider isn’t a commodity service. It’s the infrastructure that either supports or sabotages your SEO efforts. When technical SEO fails, the problem can be traced back to hosting limitations you don’t know exist.

Your Host Controls The Metrics Google Measures

Core Web Vitals are a key part of how hosting impacts SEO, because slow servers produce slow pages. These metrics measure outcomes that your server infrastructure largely determines.

Your Largest Contentful Paint (LCP) score starts with server response time. When Google’s crawler requests your page, your host must respond, process the request, and start delivering content.

Fast servers respond in under 200 milliseconds. Slower infrastructure takes 500+ milliseconds, degrading your LCP before optimization work matters.
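
If you want to sanity-check your own host against that threshold, one rough approach is to time how long the server takes to start returning a response. The sketch below uses only Python’s standard library and a placeholder URL; it approximates server response time as seen from your machine (network latency included), not a lab-grade TTFB measurement.

import time
import urllib.request

URL = "https://www.example.com/"  # placeholder: swap in your own homepage

def response_time_ms(url):
    # Time from sending the request until the first byte of the response arrives.
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read(1)  # force at least the first byte to be received
        return (time.perf_counter() - start) * 1000

# Average a few runs to smooth out network noise.
samples = [response_time_ms(URL) for _ in range(5)]
print(f"approximate server response: {sum(samples) / len(samples):.0f} ms")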

Research analyzing 7,718 businesses across 676 sectors found that sites in the top 10 ranking positions consistently showed faster server response times than competitors. Google’s algorithm recognizes and rewards infrastructure quality.

Your hosting provider controls these metrics through several factors:

  • SSD storage processes read/write operations dramatically faster than traditional hard drives.
  • HTTP/3 protocol support reduces latency by 3-7% compared to HTTP/2. [1, 2]
  • Content Delivery Networks distribute content to servers closer to users, eliminating distance delays.

Sites on infrastructure optimized for Core Web Vitals consistently achieve LCP under 2.5 seconds and INP under 200 milliseconds. These are Google’s “good” thresholds. Sites on legacy infrastructure struggle to meet these benchmarks regardless of front-end optimization.
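
One way to see whether real users on your infrastructure are hitting those thresholds is Chrome UX Report field data. The sketch below queries the CrUX API for p75 LCP and INP and compares them to the “good” thresholds; the origin and API key are placeholders, and the request and field names reflect my reading of the CrUX API documentation, so treat it as a starting point rather than a drop-in script.

import json
import urllib.request

API_KEY = "YOUR_CRUX_API_KEY"       # placeholder
ORIGIN = "https://www.example.com"  # placeholder

endpoint = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"
body = json.dumps({
    "origin": ORIGIN,
    "formFactor": "PHONE",
    "metrics": ["largest_contentful_paint", "interaction_to_next_paint"],
}).encode("utf-8")

req = urllib.request.Request(endpoint, data=body, headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req, timeout=10) as resp:
    metrics = json.loads(resp.read())["record"]["metrics"]

# p75 values are reported in milliseconds.
lcp_p75 = float(metrics["largest_contentful_paint"]["percentiles"]["p75"])
inp_p75 = float(metrics["interaction_to_next_paint"]["percentiles"]["p75"])

print(f"LCP p75: {lcp_p75:.0f} ms ({'good' if lcp_p75 <= 2500 else 'needs work'})")
print(f"INP p75: {inp_p75:.0f} ms ({'good' if inp_p75 <= 200 else 'needs work'})")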

Distance Still Matters In A Connected World

Server location introduces physical limitations that no optimization can overcome. Data travels through fiber optic cables at roughly two-thirds the speed of light, so distance matters. A California server serving New York users introduces approximately 70 milliseconds of latency from physical distance alone.

This affects SEO through Core Web Vitals performance. Geographic distance introduces latency that affects page load times. Sites struggle to meet Core Web Vitals thresholds when server infrastructure sits far from their primary audience, as distance contributes to performance problems that optimization alone can’t fully resolve.

The solution depends on your architecture. Shared, VPS, and dedicated hosting place your site on physical servers in specific data centers. Choose data centers close to your primary audience to reduce latency.

Cloud hosting distributes content differently. It serves content from multiple geographic points, mitigating distance penalties. But it requires careful configuration to ensure search engines can efficiently crawl your distributed content.

Uptime Affects How Often Google Crawls Your Site

Google allocates crawl budget partly based on your site’s reliability. When crawlers consistently encounter server timeouts, Google reduces crawl frequency to avoid wasting resources on unreliable infrastructure.

This creates a compounding problem.

Lower crawl frequency means new content takes longer to appear in search results. Updated pages don’t get re-indexed promptly. For sites publishing time-sensitive content or competing in fast-moving markets, hosting-related crawl delays can mean missing ranking opportunities.

Industry standard uptime guarantees of 99.9% translate to roughly 8.8 hours of downtime per year, or about 1.44 minutes daily. This sounds negligible, but timing matters. If those minutes occur when Google’s crawler attempts to access your site, you’ve lost that crawl opportunity. If they occur during peak traffic, you’ve lost conversions and sent negative signals to algorithms.
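
The arithmetic behind those figures is simple, and a quick sketch makes it easy to compare uptime tiers side by side:

HOURS_PER_YEAR = 24 * 365  # 8,760

for guarantee in (0.999, 0.9995, 0.9999):
    downtime_hours = HOURS_PER_YEAR * (1 - guarantee)       # hours of allowed downtime per year
    downtime_minutes_daily = 24 * 60 * (1 - guarantee)      # minutes of allowed downtime per day
    print(f"{guarantee:.2%} uptime -> {downtime_hours:.1f} hours/year, "
          f"{downtime_minutes_daily:.2f} minutes/day")

For the 99.9% guarantee this works out to about 8.76 hours per year and 1.44 minutes per day, matching the figures above.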

The business impact varies by industry:

  • Ecommerce sites lose immediate sales and long-term ranking potential.
  • News properties miss brief windows when content is most valuable.
  • Local businesses miss moments when potential customers search for their services.

Any host claiming 100% uptime should raise skepticism. Server maintenance, network routing issues, and data center problems ensure some downtime will occur. Select providers whose infrastructure design minimizes both frequency and duration of outages.

Modern Protocols Create Measurable Performance Advantages

Google’s Page Experience signals extend beyond Core Web Vitals to security and modern web standards. HTTPS has been a confirmed ranking factor since 2014, and its importance continues growing.

Modern hosts include free SSL certificates through services like Let’s Encrypt as standard features. Legacy providers may charge for SSL or create barriers that discourage upgrading to secure connections.

Beyond basic HTTPS, hosting infrastructure determines whether you can leverage protocols that improve performance. HTTP/2 introduced multiplexing capabilities that reduce latency. HTTP/3 further reduces latency through improved connection handling and better performance on unreliable networks.

These improvements translate to measurable Core Web Vitals gains. HTTP/3 can reduce page load times by 3-7% compared to HTTP/2, particularly for mobile users. Since mobile performance increasingly drives rankings, hosting infrastructure supporting the latest protocols provides competitive advantages.
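
A quick way to see whether a host advertises HTTP/3 is to look for an h3 entry in the Alt-Svc response header, which servers use to announce protocol support. A minimal sketch (placeholder URL; this only checks the advertisement over an ordinary HTTP/1.1 request, not an actual HTTP/3 connection):

import urllib.request

URL = "https://www.example.com/"  # placeholder: a page served by your host

with urllib.request.urlopen(URL, timeout=10) as resp:
    alt_svc = resp.headers.get("Alt-Svc", "")

# Servers that support HTTP/3 typically advertise it as "h3" in Alt-Svc.
if "h3" in alt_svc:
    print("HTTP/3 advertised via Alt-Svc:", alt_svc)
else:
    print("No HTTP/3 advertisement found. Alt-Svc header:", alt_svc or "absent")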

Security extends beyond encryption to broader concerns. Hosts with modern security practices protect against DDoS attacks that cause downtime, implement rate limiting that prevents bot traffic from overwhelming your server, and maintain updated server software preventing exploitation of vulnerabilities.

Scalability Prevents Success From Becoming A Problem

One of hosting’s most overlooked SEO implications emerges when you succeed. Content goes viral. A campaign drives unexpected traffic. Your site appears on a major news outlet. Suddenly, the hosting plan adequate for normal traffic becomes a bottleneck.

Server resource limits (CPU, RAM, bandwidth) determine how many simultaneous users your site can serve before performance degrades. When your infrastructure can’t handle success, SEO consequences arrive quickly.

The worst-case scenario sees viral success damaging your organic performance. Content driving traffic performs poorly for new visitors, creating negative signals. Meanwhile, Google reduces crawl frequency across your site, delaying indexation of new content designed to capitalize on visibility.

Hosting providers offering easy scaling paths prevent this. Cloud platforms can automatically scale resources to match traffic demands. Traditional providers with multiple plan tiers allow upgrades without changing providers or migrating your site, reducing technical risk and preserving existing configuration.

Evaluating Hosts As Strategic Infrastructure

The hosting decision requires evaluating providers as infrastructure partners whose capabilities enable or constrain your SEO strategy, not as feature checklists to compare.

Before selecting hosting, audit your requirements. Geographic distribution of your target audience determines whether server location matters or CDN coverage is essential. Content publication frequency affects how much crawl consistency matters. Traffic patterns indicate whether you need spike-handling resources or steady-state capacity.

Consider these strategic factors when evaluating hosts:

  • Review network infrastructure and data center locations relative to your primary markets.
  • Verify track record on actual uptime rather than advertised guarantees.
  • Examine scaling options to ensure you can grow without migration disruption.
  • Evaluate technical support quality. 24/7 availability and demonstrated expertise matter during problems affecting organic performance.

Third-party monitoring services track real-world performance across major hosts, providing verification beyond marketing claims.

Why Infrastructure Determines Your SEO Ceiling

Web hosting functions as a multiplier on SEO efforts. Excellent hosting won’t compensate for poor content, but poor hosting can completely undermine excellent optimization work.

Think of hosting as a building’s foundation. A weak foundation limits how high you can build and how much weight the structure can support. You can create architectural marvels on that foundation, but they remain vulnerable. Similarly, you can implement sophisticated SEO strategies on inadequate infrastructure, but those strategies will consistently underperform their potential.

The most successful SEO programs recognize infrastructure as a strategic investment rather than a commodity expense. They select hosting providers whose capabilities align with performance requirements, whose geographic distribution matches their audience, and whose technical sophistication supports modern web standards and protocols.

As search algorithms increasingly emphasize user experience through metrics like Core Web Vitals, the hosting decision becomes more consequential. The gap between sites on modern infrastructure and those on legacy systems will widen. The organic visibility advantages of fast, reliable, geographically distributed hosting will compound over time as Google’s algorithm continues refining how it measures and rewards site performance.

Your hosting provider should be a strategic partner in your SEO program, not just a vendor in your technology stack. The infrastructure decisions you make today determine the ceiling on your organic performance potential for months or years to come.

Good hosting runs in the background without you thinking about it. That’s what an SEO-friendly web host should do: Enable your optimization work to deliver results rather than limiting what’s possible.



Featured Image: N Universe/Shutterstock

Is AI Search SEO Leaving Bigger Opportunities Behind? via @sejournal, @martinibuster

A recent podcast by Ahrefs raised two issues about optimizing for AI search that can cause organizations to underperform and miss out on opportunities to improve sales. The conversation illustrates a gap between realistic expectations for AI-based trends and what can be achieved through overlooked opportunities elsewhere.

YouTube Is Second Largest Search Engine

The first thing noted in the podcast is that YouTube is the second-largest search engine by queries entered in the search bar. More people type search queries into YouTube’s search bar than any other search engine except Google itself. So it absolutely makes sense for companies to seriously consider how a video strategy can work to increase traffic and brand awareness.

It should be a no-brainer that businesses figure out YouTube, and yet many businesses are rushing to spend time and money optimizing for answer engines like Perplexity and ChatGPT, which have a fraction of the traffic of YouTube.

Patrick Stox explained:

“YouTube is the second largest search engine. There’s a lot of focus on all these AI assistants. They’re in total driving less than 1% of your traffic. YouTube might be a lot more. I don’t know how much it’s going to drive traffic to the website, but there’s a lot of eyes on it. I know for us, like we see it in our signups, …they sign up for Ahrefs.

It’s an incredible channel that I think as people need to diversify, to kind of hedge their bets on where their traffic is coming from, this would be my first choice. Like go and do more video. There’s your action item. If you’re not doing it, go do more video right now.”

Tim Soulo, Ahrefs CMO, expressed curiosity that so many people are looking two or three years ahead for opportunities that may or may not materialize on AI assistants, while overlooking the real benefits available today on YouTube.

He commented:

“I feel that a lot of people get fixated on AI assistants like ChatGPT and Perplexity and optimizing for AI search because they are kind of looking three, five years ahead and they are kind of projecting that in three, five years, that might be the dominant thing, how people search.

…But again, if we focus on today, YouTube is much more popular than ChatGPT and YouTube has a lot more business potential than ChatGPT. So yeah, definitely you have to invest in AI search. You have to do the groundwork that would help you rank in Google, rank in ChatGPT and everything. …I don’t see YouTube losing its relevance five years from now. I can only see it getting bigger and bigger because the new generation of people that is growing up right now, they are very video oriented. Short form video, long form video. So yeah, definitely. If you’re putting all your eggs in the basket of ChatGPT, but not putting anything in YouTube, that’s a big mistake.”

Patrick Stox agreed with Tim, noting that Instagram and TikTok are big for short-form videos that are wildly popular today, and encouraged viewers and listeners to see how video can fit into their marketing.

Some of the disconnect regarding SEO and YouTube is that SEOs may feel that SEO is about Google, and YouTube is therefore not their domain of responsibility. I would counter that YouTube should be a part of SEOs’ concern because people use it for reviews, how-to information, and product research, and the searches on YouTube are second only to Google.

SEO/AEO/GEO Can’t Solve All AI Search Issues

The second topic they touched on was the expectations placed on SEO to solve all of a business’s traffic and visibility problems. Patrick Stox and Tim Soulo suggested that high rankings and a satisfactory marketing outcome begin and end with a high-quality product, service, and content. Problems at the product or service end cause friction and result in negative sentiment on social media. This isn’t something that you can SEO yourself out of.

Patrick Stox explained:

“We only have a certain amount of control, though. We can go and create a bunch of pages, a bunch of content. But if you have real issues, like if everyone suddenly is like Nvidia’s graphics cards suck and they’re saying that on social media and Reddit and everything, YouTube, there’s only so much you can do to combat that.

…And there might be tens of thousands of them and there’s one of me. So what am I gonna do? I’m gonna be a drop in the bucket. It’s gonna be noise in the void. The internet is still the one controlling the narrative. So there’s only so much that SEOs are gonna be able to do in a situation like that.

…So this is going to get contentious in a lot of organizations where you’re going to have to do something that the execs are going to be yelling, can’t you just change that, make it go away?”

Tim and Patrick went on to use the example of their experience with a pricing change they made a few years ago, where customers balked at the changes. Ahrefs made the change because they thought it would make their service more affordable, but despite their best efforts to answer user questions and get control of the conversation, the controversy wouldn’t go away, so they ultimately decided to give users what they wanted.

The point is that positive word of mouth isn’t necessarily an SEO issue, even though SEO/GEO/AEO is now expected to get out there and build positive brand associations so that they’re recommended by AI Mode, ChatGPT, and Perplexity.

Takeaways

  • Find balance between AI search and immediate business opportunities:
    Some organizations may focus too heavily on optimizing for AI assistants at the expense of video and multimodal search opportunities.
  • YouTube’s marketing power:
    YouTube is the second-largest search engine and a major opportunity for traffic and brand visibility.
  • Realistic expectations for SEO:
    SEO/GEO/AEO cannot fix problems rooted in poor products, services, or customer sentiment. Long-term visibility in AI search depends not just on optimization, but on maintaining positive brand sentiment.

Watch the video at about the 36-minute mark.

Featured Image by Shutterstock/Collagery

Budget SEO For Capacity, Not Output via @sejournal, @Kevin_Indig

Marketing leaders are still budgeting to grow clicks in 2026, even though AI Overviews cut organic traffic in half and AI Mode kills it almost entirely.

Image Credit: Kevin Indig

Meanwhile, close to 60% of those who responded to my recent poll report their stakeholders don’t understand the value of brand mentions in LLMs.

The SEO budget conversation has to move from “Why isn’t SEO driving more clicks?/What can we do to drive more traffic?” to “What capabilities do we need to build authority in new discovery channels?”

In 2026, the best marketing teams will stop measuring SEO success by clicks and start treating it as what it really is: a capacity and influence system.

1. Traffic-Based ROI Is A Decayed Model

Marketing budgets, on average, rose modestly in the last 12 months. Overall, marketing budgets are up 3.31%. And digital marketing spending specifically is up 7.25%.

SEO gets less than 10% of the marketing budget despite being one of the most efficient channels.

Image Credit: Kevin Indig

And for years, marketers invested this sliver of SEO budget like paid media – spend more, get more clicks. It’s time to let this go. There’s discomfort here, of course: We’re losing a significant leading indicator with traffic stagnation. In theory, SEO now appears to take “longer” to show results.

As Google dials up AI in the search results, organic clicks are destined to shrink. AI surfaces decouple visibility from clicks. Your brand can appear in every AI response and get zero measurable traffic. In Semrush’s AI Mode study, 92-94% of AI Mode sessions produced no external clicks. (But that doesn’t mean people buy less. The opposite could be true.) Slowed growth in clicks is not a performance issue of an SEO team – it’s a system feature, and it’s the future of search. Platforms want users to stay within their ecosystems.

The implication: Traffic no longer equals demand. Brand visibility happens upstream inside AI responses, UGC threads, and recommendation loops that don’t often show in your analytics.

Image Credit: Kevin Indig

2. SEO Budgets Are Capacity Allocation, Not Spend-To-Output Trading

With paid ads, you’re buying impressions. Double your spend, you roughly double your impressions (with diminishing returns). There’s a direct, measurable relationship.

But most SEO costs are fixed: salaries, tool subscriptions, infrastructure. You pay for capacity regardless of whether your team delivers a 10% or 50% lift.

65% of those surveyed by Search Engine Journal don’t expect a reduction in SEO budget for 2026.

When deciding on next year’s budget, the question “What ROI do we expect from this spend?” is an outdated one. Instead, you need to answer this question: “What capabilities do we need to earn visibility?”

The variable isn’t spend; it’s prioritization and execution quality:

  • Paid media is transactional: Spend → user impression → user click.
  • SEO is compounding: Optimization → brand visibility → user impressions → brand influence.

Your SEO dollars don’t buy results. They buy the ability to earn trust and surface in the right systems.

3. Design Your SEO Budget Around Influence, Not Output In 2026

Your budget planning must be scenario-based, not traffic-forecasted.

Because your SEO costs are mostly fixed, you can model it out: “If we allocate 40% of capacity to digital PR, 30% to technical SEO, 20% to content operations, and 10% to foundational research, what visibility outcomes can we reasonably expect?”
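
As a toy example of that kind of scenario modeling (the team size and hours below are hypothetical, echoing the split above), you can translate fixed capacity into hours per workstream and sanity-check whether each priority actually gets enough time:

# Hypothetical: a 3-person SEO team with roughly 160 productive hours each per month.
monthly_capacity_hours = 3 * 160

allocation = {
    "digital PR": 0.40,
    "technical SEO": 0.30,
    "content operations": 0.20,
    "foundational research": 0.10,
}

for workstream, share in allocation.items():
    hours = monthly_capacity_hours * share
    print(f"{workstream}: {share:.0%} of capacity -> {hours:.0f} hours/month")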

Allocate resources by priority, not by historical traffic performance. Strategize your resources for the zero-click world ahead:

  1. Digital PR: Third-party signals drive 85% of brand visibility in LLMs. Digital PR and high-quality, topically related backlink investment are crucial. The biggest gains come when you hit the upper boundaries of link quality/authority over volume.
  2. Technical SEO + UX: Get the foundation right. Agents need to review your site and make recommendations or decisions quickly.
  3. Audience + first-party data research: Users are making decisions about brands within the AI Mode outputs – know your audience and which search surfaces they use. Data from one study showed 71% of companies that exceeded revenue goals had documented personas.
  4. Content operations + re-optimizations: Content recency is non-negotiable, and LLMs prefer it. Some evidence shows refreshing every ~90 days could be a competitive edge.
  5. Additive content rich with information gain: Evergreen content is less valuable. Additive content that provides net-new takes, insights, and conversations is rewarded.
  6. Engineering + design support for interactive tools: Once the validation click is earned, you must provide value that’s worth on-page engagement.
  7. Video and custom graphics: Organic low-fi video content and custom graphics are earning highly visible mid-output placement in AIOs. Don’t let restricted resources stop you from investing in this visibility lever.

Your brand’s prioritization could vary based on audience, goals, and – of course – capacity.



Featured Image: Paulo Bobita/Search Engine Journal

Google Is Not Diminishing The Use Of Structured Data In 2026 via @sejournal, @martinibuster

A recent announcement on the Google Search Central blog gave a Redditor the impression that Google was significantly reducing the use of structured data, causing them to ask if it’s worthwhile to use it anymore.

The person on Reddit posted:

“Google just posted a new update — they’re removing support for some structured data types starting in January 2026. Dataset already works only in Dataset Search, and rich results are getting more selective.

So… is schema still worth it? Or are we moving past it entirely?”

Matt Southern covered the blog post (Google Deprecates Practice Problem Structured Data In Search), focusing on the specific structured data that Google was deprecating. Google’s blog post, authored by John Mueller, could, if read quickly, be accidentally interpreted to be more alarming than it was intended to be.

Google’s announcement explained:

“We’re constantly working to simplify the search results page, so that it’s quick and easy to find the information and websites you’re looking for. As part of this effort, we regularly evaluate all of our existing features to make sure they’re still useful, both for people searching on Google and for website owners.

Through this process, we’ve identified some features that aren’t being used very often and aren’t adding significant value to users. In these cases, we’ve found that other advancements on the search results page are able to get people what they’re looking for more seamlessly. So we’re beginning to phase these lesser-used features out.

For most searches, you likely won’t notice a major difference — most of these features didn’t trigger often and weren’t interacted with much by users. But overall, this update will simplify the page and improve the speed of search results.”

Ending with the following sentence:

“Starting in January 2026, we’ll remove support for the structured data types in Search Console and its API.”

Google’s Search Features Are Always Changing

Someone responded to the initial post to reassure them that Google’s search features and the structured data that triggers them are always changing. That’s true. Google Search has consistently been in a state of change, and never more visibly on the front end than it is today with AI search.

Google’s John Mueller responded to the Redditor’s observation that Google is constantly changing, affirming that markup types (which include Schema.org structured data) are always changing.

He responded:

“Exactly. Understand that markup types come and go, but a precious few you should hold on to (like title, and meta robots).”

Structured Data Curation Is Automatic

Keeping up with Schema.org structured data is easy with any modern content management system, whether through plugins or native functionality, because these stay in step with Google’s structured data guidance. So, in general, it’s not something that a publisher or SEO needs to think about. Publishers on WordPress just need to keep their plugins updated.

Featured Image by Shutterstock/pathdoc

How To Cultivate Brand Mentions For Higher AI Search Rankings via @sejournal, @martinibuster

Building brand awareness has long been an important but widely overlooked part of SEO. AI search has brought this activity to the forefront. The following ideas should help you form a strategy for earning brand-name mentions at a ubiquitous scale, with the goal of achieving similar ubiquity in AI search results.

Tell People About The Site

SEOs and businesses can become overly concerned with getting links and forget that the more important thing to do is to get the word out about a website. A website must have unique qualities that will positively impress people and make them enthusiastic about the brand. If the site you’re trying to build traffic to lacks those unique qualities, then building links or brand awareness can become a futile activity.

User behavior signals have been a part of Google’s algorithms since 2004, when the Navboost signals kicked in, and the recent Google antitrust lawsuit shows that user behavior signals have continued to play a role. What has changed is that SEOs have noticed that AI search results tend to recommend sites that are recommended by other sites – brand mentions.

The key to all of this has been to tell other sites about your site and make it clear to potential consumers or website visitors what makes your site special.

  • So the first task is always to make a site special in every possible way.
  • The second task is to tell others about the site in order to build word of mouth and top-of-mind brand presence.

Optimizing a website for users and cultivating awareness of that site are the building blocks of the external signals of authoritativeness, expertise, and popularity that Google always talks about.

Downside of Backlink Searches

Everyone knows how to do a backlink search with third-party tools, but a lot of the data consists of garbage-y sites; that’s not the tool’s fault, it’s just the state of the Internet. In any case, a backlink search is limited: it doesn’t surface the conversations real people are having about a website.

In my experience, a better way to do it is to identify all instances of where a site is linked from another site or discussed by another site.

Brand And Link Mentions

Some websites have bookmark and resource pages. These are low-hanging fruit.

Search for a competitor’s links:

example.com site:.com “bookmarks” -site:example.com

example.com site:.com “resources” -site:example.com

The “-site:example.com” removes the competitor site from the search results, showing you just the sites that might mention the full URL of the site, which may or may not be linked.

The TLD segmented variants are:

example.com site:.net "resources" 
example.com site:.org "resources" 
example.com site:.edu "resources" 
example.com site:.ai "resources" 
example.com site:.net "links" 
example.com site:.org "links" 
example.com site:.edu "links" 
example.com site:.ai "links" 
Etc.
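
If you want to work through these variants systematically, a small script can generate the whole grid of TLD-segmented queries to paste into the search box one at a time. This is a minimal sketch; the competitor domain is a placeholder.

competitor = "example.com"  # placeholder competitor domain
footprints = ["bookmarks", "resources", "links"]
tlds = [".com", ".net", ".org", ".edu", ".ai"]

for footprint in footprints:
    for tld in tlds:
        # Exclude the competitor's own site so only third-party mentions show up.
        print(f'{competitor} site:{tld} "{footprint}" -site:{competitor}')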

The goal is not necessarily to get links. It’s to build awareness of the site and build popularity.

Brand Mentions By Company Name

One way to identify brand mentions is to search by company name using the TLD segmentation technique. Making a broad search for a company’s name will only get you some of the brand mentions. Segmenting the search by TLD will reveal a wider range of sites.

Segmented Brand Mention Search

The following assumes that the competitor’s site is on the .com domain and you’re limiting the search to .com websites.

Competitor's Brand Name site:.com -site:example.com

Segmented Variants:

Competitor's Brand Name site:.org
Competitor's Brand Name site:.edu
Competitor's Brand Name site:.Reddit.com
Competitor's Brand Name site:.io
etc.

Sponsored Articles

Sponsored articles are indexed by search engines and ranked in AI search surfaces like AI Mode and ChatGPT. These can present opportunities to purchase a sponsored post that enables you to present your message with links that are nofollow and a prominent “sponsored post” disclaimer at the top of the web page – all in compliance with Google and FTC guidelines.

Brand Mentions: Authoritativeness Is Key

The thing that some SEOs never learned is that authoritativeness is important and quite likely millions of dollars have been wasted on paying for links from low-quality blogs and higher quality sites.

ChatGPT and AI Mode have been found to recommend sites that are mentioned on high-quality, authoritative sites. Do not waste time or money paying for mentions on low-quality sites.

Some Ways To Search

Product/Service/Solution Search

Name Of Product Or Service Or Problem Needing Solving site:.com “sponsored article”
Name Of Product Or Service Or Problem Needing Solving site:.net “sponsored article”
Name Of Product Or Service Or Problem Needing Solving site:.org “sponsored article”
Name Of Product Or Service Or Problem Needing Solving site:.edu “sponsored article”
Name Of Product Or Service Or Problem Needing Solving site:.io “sponsored article”
etc.

Sponsored Post Variant

Name Of Product Or Service Or Problem Needing Solving site:.com “sponsored post”
Name Of Product Or Service Or Problem Needing Solving site:.net “sponsored post”
Name Of Product Or Service Or Problem Needing Solving site:.org “sponsored post”
Name Of Product Or Service Or Problem Needing Solving site:.edu “sponsored post”
Name Of Product Or Service Or Problem Needing Solving site:.io “sponsored post”
etc.

Key insight: Test whether “sponsored post” or “sponsored article” provides better results or just more results. Using quotation marks, or if necessary the Verbatim search tool, stops Google from stemming the query and prevents it from showing a mix of both “post” and “article” results. By forcing Google to be specific, you force it to show more search results across the two variants.

Competitor Search

Competitor’s Brand Name site:.com “sponsored post”
Competitor’s Brand Name site:.net “sponsored post”
Competitor’s Brand Name site:.org “sponsored post”
Competitor’s Brand Name site:.edu “sponsored post”
Competitor’s Brand Name site:.io “sponsored post”
etc.

Pure Awareness Building With Zero Internet Presence

This method of getting the word out is pure gold, especially for B2B but also for professional services such as the legal niches. There are organizations and associations that print magazines or send out newsletters to thousands, sometimes tens of thousands, of people who are an exact match for the audience you want to build top-of-mind brand recognition with.

Emails and magazines do not have links, and that’s okay. The goal is to build brand-name recognition with positive associations. What better way than getting interviewed in a newsletter or magazine? What better way than submitting an article to a newsletter or magazine?

Don’t Forget PDF Magazines

Not all magazines are print; many are distributed as PDFs or in other digital formats. For example, I subscribe to a surf fishing magazine that is published entirely in a proprietary web format that can only be viewed by subscribers. If I were a fishing company, I would make an effort to meet some of the article authors, in addition to the publishers, at fishing industry conferences where they appear as presenters and in product booths.

This kind of outreach happens in person; it’s called relationship building.

Getting back to the industry organizations and associations, this is an entire topic in itself and I’ll follow up with another article, but many of the techniques covered in this guide will work with this kind of brand building.

Using the filetype search operator in combination with the TLD segmentation will yield some of these kinds of brand building opportunities.

[product/service/keyword/niche] filetype:pdf site:.com newsletter
[product/service/keyword/niche] filetype:pdf site:.org newsletter
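
To keep track of these PDF and newsletter prospects, you could generate the segmented queries and dump them into a spreadsheet. Here is a minimal sketch, assuming hypothetical niches and an output filename of your choosing.

import csv

niches = ["surf fishing tackle", "personal injury law"]  # hypothetical niches/keywords
tlds = [".com", ".org", ".net", ".us", ".edu"]

# One row per niche/TLD combination, using the filetype:pdf pattern above.
rows = [
    (niche, tld, f"{niche} filetype:pdf site:{tld} newsletter")
    for niche in niches
    for tld in tlds
]

with open("newsletter_prospects.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["niche", "tld", "query"])
    writer.writerows(rows)

print(f"Wrote {len(rows)} queries to newsletter_prospects.csv")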

1. Segment the search for opportunities by TLD: .net/.com/.org/.us/.edu, etc.
Segmenting by TLD will help you discover different kinds of brand-building opportunities. Websites on a Dot Org domain often link to a site for different reasons than a Dot Com website does. Dot Org domains represent article-writing projects, free links on a links page, newsletter article opportunities, and charity link opportunities, just to name a few.

2. Consider Segmenting Dot Com Searches
The Dot Com TLD yields an overabundance of search results, not all of them useful. This makes it imperative to segment the results to find all available opportunities. Even if you’re only interested in Dot Com sites, segmenting the search is what surfaces the useful ones.

Ways to segment the Dot Com are by:

  • A. Kinds of sites (blog/shopping related keywords/product or service keywords/forum/etc.)
    This is pretty straightforward. If you’re looking for brand mentions, be sure to add keywords to the searches that are directly relevant to what your business is about. If your site is about car injuries, then sites about cars, as well as specific makes, models, and kinds of automobiles, are how you would segment a .com search.
  • B. Context – Audience Relevance Not Keyword Match
    Context of a sponsored article is important. This is not about whether the website content matches what your site, business, product, or service is about. What’s important is to identify whether the audience reach is an exact match for the audience that will be interested in your product, business, or service.
  • C. Quality And Authoritativeness
    This is not about third-party metrics related to links. This is just about making a common sense judgment about whether a site where you want a mention is well-regarded by those who are likely to be interested in your brand. That’s it.

Takeaway

The thing I want you to walk away with is that it’s useful to simply tell people about a site and to make as many people as possible aware of it. Identify opportunities to get them to tell a friend. There is no better recommendation than the one that comes from a friend or from a trusted organization. This is the true source of authoritativeness and popularity.

Featured Image by Shutterstock/Bird stocker TH

Why Strategic Review Is The Missing Layer In Many SEO Campaigns via @sejournal, @coreydmorris

Whether you call your SEO efforts a strategy, a campaign, or a channel, many SEO programs start strong but slowly drift. That drift can take the form of reports getting routine, dashboards taking over for thinking, and teams moving into a mode of “doing SEO” versus challenging and building it.

In many cases, there’s an initial audit, a roadmap, and then a turn to implementation. Those are all good things, and I strongly advocate for the right level of strategy, research, and planning before moving into any level of ongoing work. However, monthly reports or dashboards with little reflection can lead to stale tactics.

When activity, tactics, and implementation are the biggest part of what is reported on and/or measured, I question whether enough strategic thinking is happening.

A strategic review adds a structured, periodic checkpoint within the process to assess performance. That includes a mixture of team (and resource/partner/vendor) alignment, execution, and continued connection to the overall business goals that SEO is mapped to impact.

Similar to a retrospective at the end of a sprint in agile methodology, it is time to look back at what worked, what didn’t, and where we need to go next with the overall SEO investment. This is different from just a set of reports and metrics; it is time for true reflection and recalibration beyond measurement.

Why Strategy Is Often Missing

There are some common reasons SEO teams and resources skip strategic review and don’t have the layer fully in place. At times, SEO can seem like an ongoing checklist of things to audit, crawl, fix, and optimize. It can also feel like something that is always on or never-ending.

While all of those things are true to some degree, SEO is a longer-term discipline before it shows return on investment (ROI), so there’s pressure to present activity as progress before tangible results arrive, and this can be hard to change once habits and patterns become embedded in the process.

Agency and client relationships can become rooted in deliverables and lose strategic direction over time. Or, a lack of ownership can exist where no one person or entity truly feels accountable for stepping back and considering if the strategy is still right and delivering.

Risks Of Skipping Strategy

When teams lack or drift from strategy, they run the risk of optimizing for the wrong things, whether that is topics, content, context, or chasing the wrong key performance indicators (KPIs). Pursuing traffic and metrics that show activity and progress alone, disconnected from the bottom line, becomes a danger when they can’t convert at some point.

Additionally, silos can exist, and insights can stay within the silos. When SEO is reduced to activities, tactics, and just actions, learnings from content, dev, brand, product development, customer service, leadership, and other functions aren’t shared with SEO, and vice versa.

Plus, in a world where new information, strategies, and opportunities seemingly emerge daily across how SEO works, AI search, and other areas of change, assumptions about intent, audience behavior, and connections to the bottom line can become outdated quickly.

Integrating Strategy Into Ongoing SEO

Establish A Cadence

The ideal timing for how often to revisit strategy, or how it integrates into the ongoing SEO effort, is different for everyone. Whether it is quarterly, monthly, or on some other frequency that matches the speed at which SEO can and will be implemented, along with the speed of the rest of the moving parts in digital marketing, it is important to lock it in. Adjust where necessary, but do not keep pushing it down the road.

Since SEO is often an indefinitely ongoing investment, I like the use of sprints and agile thinking, and in this case, building the strategic review into the agile process. Ultimately, the goal is to avoid drifting far enough into a void that strategy problems start happening yet go missed or ignored.

Dig Deep Enough

However and whenever you build strategic review into the process, there are some key questions to ask, no matter how formally you structure it.

This starts with strategy alignment. Are our current goals still the right ones to anchor to? Do they map out to business outcomes versus indicators or vanity metrics? Can we get deep enough in measurement of impact and attribution?

From there, execution and focus are important to review. This includes looking at the tactics that had an impact versus those that didn’t. And, to fully understand why.

Now, we can set our sights on the next sprint or period, looking forward. Consider the opportunities ahead, including trends, SERP features, audience behaviors, AI, and anything else that has emerged that needs to be factored into the effort.

Bring People Together

A tale as old as time in SEO is having the best plans stalled by a lack of resources or a strong resource plan. This means we need to make sure we have the right people, whether they are on the team, in another department, freelance, or at a vendor company, booked and lined up to help us implement.

Better yet, have them in the room with you for any part of the strategic review so they can learn from the insights you’re seeing and help shape the plan, sharing their subject matter expertise and perspective. This is your chance to break down silos and get more integration of SEO with other functions.

Be Structured

I have to confess that I love to iterate and try new things with processes. That’s part of what drew me into SEO over 20 years ago. However, I think that there has to be consistency in the approach and process. You don’t want to spend too much time overdoing it in ongoing strategic reviews. At the same time, you don’t want to be too shallow and gloss over it.

I recommend borrowing agile retrospective agenda formats and structures to look at what to start, stop, and continue, and to plan what’s next. Borrow from them if you are struggling to come up with a simple yet powerful review process and criteria.

Revise The Plan

It might feel like a given that you’ll take the work you did and integrate it into your plan and efforts. I simply want to wrap up here by stating the obvious that you need to feed insights into the next period’s plan. That could also include adjusting goals, KPIs, and tactical priorities.

The key is to take things from talk and spreadsheets to action. Especially if your efforts have multiple layers, integrations of teams, or client/agency relationships.

Wrapping Up

SEO is a long game, but progress happens in shorter cycles. It can become a routine, a checklist, or a thing to “do” over time. Often, outdated strategies and tactics come from a lack of frequent enough critical strategic review and adjustment.

My goal for you is to not encounter these issues, or to find out later than you wished that your SEO has been drifting or has gotten stale and isn’t delivering (and hasn’t for some time). The most strategic SEO efforts aren’t always the busiest or the most activity-filled; they are focused on quality and have mechanisms in place, revisited often enough, to adapt intentionally.

The best SEO teams and efforts aren’t just executing; they’re evolving.


Featured Image: Master1305/Shutterstock