What link building should be trying to accomplish, in my opinion, is proving that a site is trustworthy and making sure the machine understands what topic your web pages fit into. The way to communicate trustworthiness is to be careful about what sites you obtain links from and to be super careful about what sites your site links out to.
Context Of Links Matter
Maybe it doesn’t have to be said but I’ll say it: It’s important now more than ever that the page your link is on has relevant content on it and that the context for your link is an exact match for the page that’s being linked to.
Outgoing Links Can Signal A Site Is Poisoned
Also make sure that the outgoing links are to legitimate sites, not to sites that are low quality or in problematic neighborhoods. If those kinds of links are anywhere on the site it’s best to consider the entire site poisoned and ignore it.
The reason I say to consider the site poisoned is the link distance ranking algorithm concept where inbound links tell a story about how trustworthy a site is. Low quality outbound links are a signal that something’s wrong with the site. It’s possible that a site like that will have its ability to pass PageRank removed.
Reduced Link Graph
The link graph can be thought of as a map of the internet, with websites connected to each other by links. The Reduced Link Graph is what you get when the spammy sites are kicked out of that graph and only the legit sites are kept for ranking purposes and link propagation.
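To make the concept concrete, here is a toy sketch in TypeScript with hypothetical site names. It is an illustration of the idea, not Google's actual system: any site whose outbound links point into a known bad neighborhood is treated as poisoned and dropped, and what remains is the reduced link graph.

```typescript
// Toy illustration of a reduced link graph (not Google's implementation).
// A link graph maps each site to the sites it links out to.
type LinkGraph = Map<string, Set<string>>;

// Hypothetical example data.
const graph: LinkGraph = new Map([
  ["example-blog.com", new Set(["university.edu", "industry-news.com"])],
  ["industry-news.com", new Set(["example-blog.com"])],
  ["linkseller.biz", new Set(["casino-spam.xyz", "example-blog.com"])],
  ["casino-spam.xyz", new Set(["linkseller.biz"])],
]);

// Sites already judged to be in problematic neighborhoods.
const badNeighborhood = new Set(["casino-spam.xyz"]);

// A site is treated as "poisoned" if any of its outbound links point
// into the bad neighborhood; poisoned sites are dropped entirely.
function reduceLinkGraph(full: LinkGraph, bad: Set<string>): LinkGraph {
  const poisoned = new Set<string>(bad);
  for (const [site, outlinks] of full) {
    for (const target of outlinks) {
      if (bad.has(target)) poisoned.add(site);
    }
  }
  // Keep only non-poisoned sites, and strip links that point at poisoned ones.
  const reduced: LinkGraph = new Map();
  for (const [site, outlinks] of full) {
    if (poisoned.has(site)) continue;
    reduced.set(site, new Set([...outlinks].filter((t) => !poisoned.has(t))));
  }
  return reduced;
}

console.log(reduceLinkGraph(graph, badNeighborhood));
// linkseller.biz and casino-spam.xyz are gone; only the trusted core remains.
```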
Search engines are at a point where they can rank websites based on the content alone. Links still matter but the content itself is now the highest level ranking factor. I suspect that in general the link signal isn't very healthy right now. Fewer people are blogging across all topics. Some topics have a healthy blogging ecosystem but in general there aren't professors blogging about technology in the classroom and there aren't HR executives sharing workplace insights and so on like there used to be ten or fifteen years ago.
Links for Inclusion
I’m of the opinion that links increasingly are useful for determining if a site is legit, high quality, and trustworthy, deeming it worthy for consideration in the search results. In order to stay in the SERPs it’s important to think about the outbound links on your site and the sites you obtain links from. Think in terms of reduced link graphs, with spammy sites stuck on the outside within their own spammy cliques and the non-spam on the inside within the trusted Reduced Link Graph.
In my opinion, you must be in the trusted Reduced Link Graph in order to stay in play.
Is Link Building Over?
Link building is definitely not over. It's still important. What needs to change is how links are acquired. The age of blasting out emails at scale is over. There aren't enough legitimate websites to make that worthwhile. It's better to be selective and targeted about which sites you get a (free) link from.
Something else that’s becoming increasingly important is citations, other sites talking about your site. An interesting thing right now is that sponsored articles, sometimes known as native advertising, will get cited in AI search engines, including Google AI Overviews and AI Mode. This is a great way to get a citation in a way that will not hurt your rankings as long as the sponsored article is clearly labeled as sponsored and the outbound links are nofollowed.
Takeaways
Links As Trust And Context Signals, Not Drivers Of Ranking: Links increasingly function to confirm that a site is legitimate and topically aligned, rather than to directly push rankings through volume or anchor text manipulation as in the old days.
The Reduced Link Graph Matters: Search engines filter out spammy or low-quality sites, leaving a smaller trusted network where links and associations still count. Being outside this trusted graph puts sites at risk of exclusion.
Content Matters, Links Qualify: Search engines can rank many pages based on content alone, but links can still act as a gatekeeper for credibility and inclusion, especially for competitive topics.
Outbound Links Are A Risk Signal: Linking out to low-quality or problematic sites can damage a site's perceived trustworthiness and its ability to pass value.
Traditional Link Building Is Obsolete: Scaled outreach, anchor text strategies, and chasing volume are ineffective in an AI-driven search environment.
Citations Are Rising In Importance: Mentions and discussions of a website can cause a site to rank better in AI search engines.
Sponsored Articles: Sponsored articles that are properly labeled as sponsored content and contain nofollowed links are increasingly surfaced in AI search features and contribute to visibility.
Link building is still relevant, but not in the way it used to be. Its function now is likely more about establishing whether a site is legitimate and clearly associated with a real topic area, not to push rankings through volume, anchors, or scale. Focusing on clean outbound links, selective relationships with trusted sites, and credible citations keeps a site inside the trusted reduced link graph, which is the condition that allows strong content to compete and appear in both traditional search results and AI-driven search surfaces.
Google’s Danny Sullivan offered advice to SEOs who have clients asking for updates on what they’re going to do for AI SEO. He acknowledged it’s easier to give the advice than it is to have to actually tell clients, but he also said that advancements in content management systems drive technical SEO into the background, enabling SEOs and publishers to focus on the content.
What To Tell Clients
Danny Sullivan acknowledged that SEOs are in a tough spot with clients. He didn’t suggest specifics for how to rank better in AI search (although later in the podcast he did offer suggestions for what to do to rank better in AI search).
But he did offer suggestions for what to tell clients.
Danny explained:
“And the other thing is, and I’ve seen a number of people remark on this, is this concern that, well, I’ve been doing SEO, but now I’m getting clients or people saying to me, but I need the new stuff. I need the new stuff. And I can’t just tell them it’s the same old stuff.
So I don’t know if you feel like you need to dress it up a bit more, but I think the way you dress it up is to say, These are continuing to be the things that are going to make you successful in the long-term. I get you want the fancy new type of thing, but the history is that the fancy new type of thing doesn’t always stick around if we go off and do these particular types of things…
I’m keeping an eye on it, but right now, the best advice I can tell you when it comes to how we’re going to be successful with our AEO is that we continue on doing the stuff that we’ve been doing because that is what it’s built on.
Which is easy for me to say ’cause I don’t got someone banging on the door to say, Well, actually we do. And so we are doing that.
So that’s why, as part of the podcast, it’s just to kind of reassure that, look, just because the formats are changing didn’t mean you have to change everything that you had to do and that everything you had to shift around.”
Downside Of Prioritizing AEO/GEO For AI Search Visibility
There are many in the SEO community who are suggesting fairly spammy things to do to rank better in AI chatbots like ChatGPT, like creating listicles that recommend themselves as best whatever. Others are doing things like tweaking keyword phrases, the kind of thing SEOs stopped doing by 2005 or 2006.
The problem with making dramatic changes to content in order to rank better in chatbots is that ChatGPT, Perplexity, and Anthropic Claude each account for only a fraction of a percent of search traffic, with Claude close to zero and ChatGPT estimated to be 0.2% – 0.5%.
So it absolutely makes zero sense to prioritize AEO/GEO over Google and Bing search at this point because the return on the investment is close to zero. It’s a different story when it comes to Google AI Overviews and AI Mode, but the underlying ranking systems for both AI interfaces remain Google’s classic search.
Danny shared that focusing on things that are specific to AI risks complicating what should be simple.
Google’s Danny Sullivan shared:
“And in fact, that the more that you dramatically shift things around, and start doing something completely different, or the more that you start thinking I need to do two different things, the more that you may be making things far more complicated, not necessarily successful in the long term as you think they are.”
Technical SEO Is Needed Less?
John Mueller followed up by mentioning that the advanced state of content management systems today means that SEOs and publishers no longer have to spend as much time on technical SEO issues because most CMS’s have the basics of SEO handled virtually out of the box. Danny Sullivan said that this frees up SEOs and creators to focus on their content, which he insisted will be helpful for ranking in AI search surfaces.
John Mueller commented:
“I think that makes a lot of sense. I think one of the things that perhaps throws SEOs off a little bit is that in the early days, there was a lot of almost like a technical transition where people initially had to do a lot of technical specific things to make their site even kind of accessible in search. And at some point nowadays, I think if you’re using a popular CMS like WordPress or Wix or any of them, basically you don’t have to worry about any of those technical details.
So it’s almost like that technical side of things is a lot less in the foreground now, and you can really focus on the content, and that’s really what users are looking for. So it’s like that, almost like a transition from technical to content side with regards to SEO.”
This echoed a previous statement from earlier in the podcast where Danny remarked on how some people have begun worrying less about SEO and focusing on content.
Danny said:
“But we really just want you to focus on your content and not really worry about this. If your content is on the web and generally accessible as most people’s content is, that’s it.
I’ve actually been heartened that I’ve seen a number of people saying things like: I don’t even want to think about this SEO stuff anymore. I’m just getting back into the joy of writing blogs.
I’m like, yes, great. That’s what we want you to do. That’s where we think you’re going to find your most success.”
Listen to Danny Sullivan’s remarks at about the 8 minute mark:
Google’s John Mueller and Danny Sullivan discussed their thoughts on AI search and what SEOs and creators should be doing to make sure their content is surfaced. Danny showed some concern for folks who were relying on commodity content that is widely available.
What Creators Should Focus On For AI
John Mueller asked Danny Sullivan what publishers should be focusing on right now that’s specific to AI. Danny answered by explaining what kind of content you should not focus on and what kind of content creators should be focusing on.
He explained that the kind of content that creators should not focus on is commodity content. Commodity content is web content that consists of information that’s widely available and offers no unique value, no perspective, and requires no expertise. It is the kind of content that’s virtually interchangeable with any other site’s content because they are all essentially generic.
While Danny Sullivan did not mention recipe sites, his discussion about commodity content immediately brought recipe sites to mind because those kinds of sites seemingly go out of their way to present themselves as generically as possible, from the way the sites look to the "I'm just a mom of two kids" bio to the recipes they provide. In my opinion, what Danny Sullivan said should make creators consider what they bring to the web that makes them notable.
To explain what he meant by commodity content, Danny used the example of publishers who used to optimize a web page for the time that the Super Bowl game began. His description of the long preamble they wrote before giving the generic answer of what time the Super Bowl starts reminded me again of recipe sites.
At about the twelve minute mark John Mueller asked Danny:
“So what would you say web creators should focus on nowadays with all of the AI?”
Danny answered:
“A key thing is to really focus on is the original aspect. Not a new thing.
These are not new things beyond search, but if you’re really trying to reframe your mind about what’s important, I think that on one hand, there’s a lot of content that is just kind of commodity content, factual information, and I think that the… LLM, AI systems are doing a good job of presenting that sort of stuff.
And it’s not originating from any type of thing.
So the classic example, as you know, will make people laugh, …but every year we have this little American football thing called the Super Bowl, which is our big event.
…But no one ever can seem to remember what time it’s on.
…Multiple places would then all write their “what time does the Super Bowl start in 2011?” post. And then they would write these giant long things.
…So, you know, and then at some point, we could see enough information and we have data feeds and everything else that we just kind of said, you do a search and …the Super Bowl is going to be at 3:30.
…I think the vast majority of people say, that’s a good thing. Thank you for just telling me the time of the Super Bowl.
It wasn’t super original information.”
Commodity Content Is Not Your Strength
Next Danny considered some of the content people are publishing today, encouraging them to think about the generic nature of their content and to give some thought to how they can share something more original and unique.
Danny continued his answer:
“I think that is a thing people need to understand, is that more of this sort of commodity stuff, it isn’t going to necessarily be your strength.
And I do worry that some people, even with traditional SEO, focus on it too much.
There are a number of sites I know from the research and things that I’ve done that get a huge amount of traffic for the answer to various popular online word-solving games.
It’s just every day I’m going to give you the answer to it. …and that is great. Until the system shifts or whatever, and it’s common enough, or we’re pulling it from a feed or whatever, and now it’s like, here’s the answer.”
Bring Your Expertise To AI
Danny next suggested that people who are concerned about showing up in AI should start exploring how to express their authentic experience or expertise. He said this advice applies not just to text content but also to video and podcast content.
He continued:
“Your original voice is that thing that only you can provide. It’s your particular take.
And so that’s what we think was our number one thing when we’re telling people is like, this is what we think your strength is going to be.
As we go into this new world, is already what you should be doing, but this is what your strength that you should be doing is focus on that original content.
I think related to that is this idea that people are also seeking original content that’s, …authentic to them, which typically means it’s a video, it’s a podcast…
…And you’ve seen that in the search we’ve already done, where we brought in more social, more experiential content. Not to take away from the expert takes, it’s just that people want that.
Sometimes you’re just wanting to know someone’s firsthand experience alongside some expert take on it as well.
But if you are providing those expert takes, you’re doing reviews or whatever, and you’ve done that in the written form, you still have the opportunity to be doing those in videos and podcasts and so on. Those are other opportunities.
So those are things that, again, it’s not unique to the AI formats, but they just may be, as you’re thinking about, how do I reevaluate what I’m doing overall in this era, that these are things you may want to be considering with it from there.”
John Mueller agreed that it makes sense to bring your unique voice to content in order to make it stand out. Danny's point treats visibility in AI-driven search as a matter of differentiation rather than optimization. The emphasis is not on adapting content to a new format, but on creating a recognizable voice and perspective with which to stand out. Given that AI Search is still classic search under the hood, it makes sense to stand out from competitors with unique content that people will recognize and recommend.
Listen to the passage at around the twelve minute mark:
Trust is commonly understood to be a standalone quality that is passed between sites regardless of link neighborhood or topical vertical. What I’m going to demonstrate is that “trust” is not a thing that trickles down from a trusted site to another site. The implication for link building is that many may have been focusing on the wrong thing.
Six years ago I was the first person to write about link distance ranking algorithms, which are a way to create a map of the Internet that begins with a group of sites that are judged to be trustworthy. These sites are called the seed set. The seed set links to other sites, which in turn link to ever-increasing groups of other sites. The sites closer to the original seed set tend to be trustworthy websites. The sites that are furthest away from the seed set tend to be not trustworthy.
Google still counts links as part of the ranking process, so it's likely that there continues to be a trusted seed set, and the further away a site is linked from those seeds, the likelier it is to be considered spam.
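Here is a minimal sketch of that distance idea in TypeScript, using made-up sites and an arbitrary hop threshold. It illustrates the seed set concept only; it is not a reconstruction of Google's algorithm.

```typescript
// Minimal sketch of link distance from a trusted seed set (illustrative only).
type LinkGraph = Map<string, string[]>;

const graph: LinkGraph = new Map([
  ["trusted-seed.org", ["reputable-site.com"]],
  ["reputable-site.com", ["niche-blog.com"]],
  ["niche-blog.com", ["random-directory.net"]],
  ["random-directory.net", ["spammy-site.xyz"]],
  ["spammy-site.xyz", []],
]);

const seeds = ["trusted-seed.org"];

// Breadth-first search: each site's score is the number of link hops
// from the nearest seed. Sites never reached simply get no score.
function linkDistances(g: LinkGraph, seedSet: string[]): Map<string, number> {
  const dist = new Map<string, number>();
  const queue: string[] = [];
  for (const s of seedSet) {
    dist.set(s, 0);
    queue.push(s);
  }
  while (queue.length > 0) {
    const site = queue.shift()!;
    for (const target of g.get(site) ?? []) {
      if (!dist.has(target)) {
        dist.set(target, dist.get(site)! + 1);
        queue.push(target);
      }
    }
  }
  return dist;
}

const distances = linkDistances(graph, seeds);
// A hypothetical threshold: anything more than 3 hops from a seed
// is treated as likely spam in this toy model.
for (const [site, d] of distances) {
  console.log(site, d, d > 3 ? "likely spam" : "within trusted distance");
}
```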
Circling back to the idea of trust as a ranking-related factor, trust is not a thing that is passed from one site to another. Trust, in this context, is not even a part of the conversation. Sites are judged trustworthy by the link distance between the site in question and the original seed set. So you see, there is no trust that is conveyed from one site to another.
Trustworthiness is part of the E-E-A-T standard of what constitutes a quality website, but that is a judgment about a site, not a substance that flows through links. Trust should never be considered as a thing that is passed from one site to another, because that mechanism does not exist.
The takeaway is that link building decisions based on the idea of trust propagated through links are built on an outdated premise. What matters is whether a site sits close to trusted seed sites within the same topical neighborhood, not whether it receives a link from a widely recognized or authoritative domain. This insight transforms link evaluation into a relevance problem rather than a reputation problem. This insight should encourage site owners to focus on earning links that reinforce topical alignment instead of chasing links that appear impressive but have little, if any, ranking value.
Why Third Party Authority Metrics Are Inaccurate
The second thing about the link distance ranking algorithms that I think is quite cool and elegant is that websites naturally coalesce around each other according to their topics. Some topics are highly linked and some, like various business association verticals, are not well linked at all. The consequence is that those poorly linked sites that are nevertheless close to the original seed set do not acquire much “link equity” because their link neighborhoods are so small.
What that means is that a low-linked vertical can be a part of the original seed set and still display low third-party authority scores. The implication is that the third-party link metrics that measure how many inbound links a site has fail. They fail because third-party authority metrics follow the old and outdated PageRank scoring method that counts the number of inbound links a site has. PageRank was created around 1998 and is so old that the patent on it has expired.
The seed set paradigm does not measure inbound links. It measures the distance from sites that are judged to be trustworthy. That has nothing to do with how many links those seed set sites have and everything to do with them being trustworthy, which is a subjective judgment.
That’s why I say that third-party link authority metrics are outdated. They don’t follow the seed set paradigm, they follow the old and outdated PageRank paradigm. The insight to take away from this is that many highly trustworthy sites are being overlooked for link building purposes because link builders are judging the quality of a site by outdated metrics that incorrectly devalue sites in verticals that aren’t well linked but are actually very close to the trustworthy seed set.
The Importance Of Link Neighborhoods
Let's circle back to the observation that websites tend to naturally link to other sites that are on the same topic. What's interesting about this is that the seed sets can be chosen according to topic verticals. Some verticals have a lot of inbound links and some verticals are in their own little corner of the Internet and aren't linked to from outside of their clique.
A link distance ranking algorithm can thus be used to calculate relevance according to whatever neighborhood a site is located in. Majestic does something like that with their Trust Flow and Topical Trust Flow metrics, which actually start with trusted seed sites. Topical Trust Flow breaks that score down into specific topic categories. The Topical Trust Flow metric shows how relevant a website is for a given topic.
My point isn’t that you should use that metric, although I think it’s the best one available today. The point is that there is no context for thinking about trustworthiness as something that spreads from link to link.
Once you can think of links in the paradigm of distance within a topic category, it becomes easier to understand why a link from a university website or some other so-called "high trust" site isn't necessarily that good or useful. I know this for certain because there was a time, before distance ranking, when the topic of the linking site didn't matter, but it matters very much now and has for a long time.
The takeaways here are:
It is counterproductive to go after so-called “high trust” links from verticals that are well outside of the topic of the website you’re trying to get a link to.
This means that it’s more important to get links from sites that are in the right topic or from a context that exactly matches the topic, from a website that’s in an adjacent topical category.
For example, a site like The Washington Post is not a part of the Credit Repair niche. Any "trust" that may be calculated from a Washington Post link to a Credit Repair site will likely be dampened to zero. Of course it will. Remember, seed set trust distance is calculated within groups within a niche. There is no trust passed from one link to another link. It is only the distance that is counted.
Logically, it makes sense to assume that there will be no validating effect between topically irrelevant sites; an off-topic site is not a relevant website for the purposes of the seed set trust calculations.
Takeaways
Trust is not something that's passed by links: Link distance ranking algorithms do not deal with "trust." They only measure how close a site is to a trusted seed set within a topic.
Link distance matters more than link volume: Ranking systems based on link distance assess proximity to trusted seed sites, not how many inbound links a site has.
Topic-based link neighborhoods shape relevance: Websites naturally cluster by topic, and link value is likely evaluated within those topical clusters rather than across the entire web. A non-relevant link can still have some small value, but irrelevant links largely stopped working almost twenty years ago.
Third-party authority metrics are misaligned with modern link ranking systems: Some third-party metrics rely on outdated PageRank-style link counting and fail to account for seed set distance and topical context.
Low-link verticals are undervalued by SEOs: Entire niches that are lightly linked can still sit close to trusted seed sets, yet appear weak in third-party metrics, causing them to be overlooked by link builders.
Relevance outweighs perceived link strength: Links from well-known but topically irrelevant sites likely contribute little or nothing compared to links from closely related or adjacent topic sites.
Modern link evaluation is about topical proximity, not “trust” or raw link counts. Search systems measure how close a site is to trusted seed sites within its own topic neighborhood, which means relevant links from smaller, niche sites can matter more than links from famous but unrelated domains.
This knowledge should enable smarter link building by focusing efforts on contextually relevant websites that may actually strengthen relevance and rankings, instead of chasing outdated link authority scores that no longer reflect how search works.
Link building outreach is not just blasting out emails. There’s also a conversation that happens when someone emails you back with a skeptical question. The following are tactics to use for overcoming skeptical responses.
In my opinion it’s always a positive sign when someone responds to an email, even if they’re skeptical. I consider nearly all email responses to be indicators that a link is waiting to happen. This is why a good strategy that anticipates common questions will help you convert skeptical responses into links.
Many responses tend to be questions. What they are asking, between the lines, is for you to help them overcome their suspicions. Anytime you receive a skeptical response, try to view it as them asking you, “Help me understand that you are legitimate and represent a legitimate website that we should be linking to.”
The question is asked between the lines. The answer should similarly be addressed between the lines. Ninety-nine percent of the time, a between-the-lines question should not be answered directly. The perfect way to answer those questions, the perfect way to address an underlying concern, is to answer it in the same way you received it, between the lines.
Common (and sometimes odd) questions that I used to get include:
Who are you?
Who do you work for?
How did you get my email address?
Before I discuss how I address those questions, I want to mention something important that I do not do. I do not try to actively convert the respondent in the first response. In my response to their response to my outreach, I never ask them to link to the site.
The question of linking is already hanging in the air and is the object of their email to you; there is no need to bring it up. If in your response you ask them again to link to your site, it will tilt them back to being suspicious of you, raising the odds of losing the link.
In trout fishing, the successful angler crouches so that the trout does not see you. The successful angler may even wear clothing that helps them blend into the background. The best anglers imitate the crane, a fish-eating bird that stands perfectly still, imperceptibly inching closer to its prey. This is done to avoid being noticed. Your response should imitate the crane or the camouflaged angler. You should put yourself into the mindset of anything but a marketer asking for a link.
Your response must not be to immediately ask for a link because that in my opinion will just lose the link. So don’t do it just yet.
Tribal Affinity
One approach that I used to use successfully is what I called the Tribal Affinity approach. For a construction/home/real estate related campaign, I used to approach it with the mindset of a homeowner. I wouldn't say that I'm a homeowner (even though I was), I would just think in terms of what I would say as a homeowner contacting a company to suggest a real estate or home repair type of link. In the broken link or suggest-a-link strategy, I would say that the three links I am suggesting for their links page have been useful to me.
Be A Mirror
A tribal affinity response that was useful to me is to mirror the person I’m outreaching to, to assume the mindset of the person I am responding to. So for example, if they are a toy collector then your mindset can also be a toy collector. If the outreach target is a club member then your outreach mindset can be an enthusiast of whatever the club is about. I never claim membership in any particular organization, club or association. I limit my affinity to mirroring the same shared mindset as the person I’m outreaching to.
Assume The Mindset
Another approach is to assume the mindset of someone who happened upon the links page with a broken link or missing a good quality link. When you get into the mindset the text of your email will be more natural.
Thus, when someone responds by challenging me, asking how I found their site or who I'm working for, my response is to just stick to my mindset of a homeowner and respond accordingly.
And really, what’s going on is that they’re not really asking how you found their site. What they’re really asking, between the lines, is if you’re a marketer of some kind. You can go ahead and say yes, you are. Or you can respond between the lines and say that you’re just a homeowner. Up to you.
There are many variations to this approach. The important points are:
Responses that challenge you are not necessarily hostile but are often link conversions waiting to happen.
Never respond to a response by asking for a link.
Put yourself into the right mindset. Thinking like a marketer will usually lead to a conversion dampening response.
Put yourself into the mindset that mirrors the person you outreach to.
Get into the mindset that gives you a plausible reason for finding their site and the best words for asking for a link will write themselves.
Few CEOs ever ask hard questions about their company website. They’ll sign off on multimillion-dollar redesigns, approve ad budgets, and endorse “digital transformation” plans, but rarely ask how much enterprise value their digital infrastructure is actually creating.
That’s a problem, because the website is no longer a marketing artifact. It’s the factory floor of digital value creation. Every lead, sale, customer interaction, and data signal runs through it. When the site performs well, it compounds growth. When it underperforms, it silently leaks shareholder value.
Executives don’t need to understand HTML or crawl budgets. But they do need to ask sharper questions. They need to ask the kind that expose hidden risk, surface inefficiencies, and align digital investments with measurable business outcomes. In the age of AI-driven search, where visibility and trust are determined algorithmically, these questions aren’t optional. They’re fiduciary.
Why CEOs Must Ask – Even If SEOs Believe It Is "Beneath" Them
There’s a persistent misconception in digital circles: that CEOs shouldn’t concern themselves with SEO, site performance, or technical issues. “That’s marketing’s job,” people say. But the truth is, these issues directly affect the metrics that boards and investors care about most – operating margin, revenue growth, capital efficiency, and risk mitigation.
When a website is treated as an expense line rather than a capital asset, accountability disappears. Teams chase traffic over value, marketing spend rises to offset organic losses, and executives are left with fragmented data that hides the real cost of inefficiency.
A CEO’s job isn’t to approve color palettes or keyword lists. It’s to ensure the digital infrastructure is producing measurable returns on invested capital just as they would for a factory, logistics system, or data center.
The Cost Of Not Asking
Every company has a “digital balance sheet,” even if it’s never been documented. Behind every campaign and click lies a network of dependencies, from page speed and content accuracy to structured data, discoverability, and cross-market alignment. When those systems falter, the losses are invisible but compounding:
Organic visibility declines, forcing paid media spend to rise.
AI search engines misattribute content or cite competitors instead.
Global teams duplicate content, fragmenting authority and wasting budget.
In one multinational I audited, over $5 million per month in paid search spend was compensating for lost organic traffic caused by broken hreflang tags and indexation gaps.
A similar disconnect played out publicly when the CMO of a major retail brand was asked during an earnings call about their online holiday strategy. He confidently declared, “As the largest reseller in our category, we’ll dominate the season online.” Within seconds, a reporter searched the category term, and the brand didn’t appear on page one. The CMO was stunned. He had assumed offline dominance guaranteed online visibility. It didn’t.
That thirty-second fact-check illustrated a billion-dollar truth: market leadership offline doesn’t ensure findability online. Without the right questions and governance, digital equity erodes silently until someone outside the company exposes it.
No CEO would tolerate that level of inefficiency in their supply chain. Yet it happens online every day, unnoticed, because few know which questions to ask.
The 10 Questions Every CEO Should Be Asking
These questions aren’t tactical; they’re financial. They surface whether the digital system that represents your brand to the world is operating efficiently, effectively, and in alignment with corporate goals.
1. Are we treating the website as a capital asset or a cost center?
Why it matters: Capital assets require lifecycle planning, maintenance, and reinvestment.
Executive red flag: Budgets are reset annually with no cumulative accountability.

2. What's our digital yield – the value per visit or per impression?
Why it matters: Links traffic and investment to tangible business outcomes.
Executive red flag: Traffic grows, revenue stays flat.

3. Where are we leaking value?
Why it matters: Surfaces inefficiencies across SEO, paid, content, and conversion funnels.
Executive red flag: Paid media dependency rises while organic visibility declines.

4. How fast can we diagnose and fix a problem?
Why it matters: Measures organizational agility and governance maturity.
Executive red flag: Issues discovered only after quarterly reports.

5. Do we have digital "command and control"?
Why it matters: Reveals whether teams, agencies, and regions share accountability.
Executive red flag: Multiple CMSs, duplicated content, and conflicting data.

6. How does our web performance translate to shareholder metrics?
Why it matters: Connects digital KPIs to ROIC and margin.
Executive red flag: Dashboards report sessions, not value.

7. Who owns web effectiveness?
Why it matters: Ownership drives accountability and resourcing.
Executive red flag: Everyone claims a piece; no one owns the outcome.

8. Are we findable, understandable, and trusted by both humans and machines?
Why it matters: Future-proofs the brand in AI-driven search.
Executive red flag: Generative engines cite competitors, not us.

9. How resilient is our digital ecosystem?
Why it matters: Tests readiness for migrations, rebrands, and AI shifts.
Executive red flag: Every platform change causes a traffic cliff.

10. What are we learning from our data that informs decisions?
Why it matters: Turns analytics into strategy, not hindsight.
Executive red flag: Insights exist but never reach decision-makers.
Each question reframes a “marketing” issue as a governance issue. When CEOs ask these questions, they encourage teams to think systemically, connecting content, code, and conversion as interdependent components of a single digital value chain.
From Questions To Action: Building A Culture Of Digital Accountability
Asking the right questions isn’t micromanagement – it’s leadership through intent.
When a CEO defines the Commander’s Intent for digital, it brings clarity of purpose, alignment of teams, and shared metrics, and it changes how the organization approaches the web. Instead of chasing redesigns or vanity KPIs, teams operate with a shared understanding:
“Our website’s job is to create enterprise value – measurable, sustainable, and scalable.”
That intent cascades into structure:
Visibility: Reporting evolves from traffic to contribution value.
Speed: Teams track time-to-detect and time-to-resolve issues.
Alignment: Marketing, IT, and product teams operate under a unified governance framework.
This is where the Web Effectiveness Score or Digital Value Creation Framework bridges web metrics (load time, index coverage) to enterprise KPIs (ROIC, margin, growth). Once that link is visible, executives start managing digital performance as a financial asset because it is.
The CEO’s Digital Playbook
CEOs who ask these questions consistently outperform those who don’t – not because they know more about SEO, but because they lead with system awareness. When they do:
Wasted Spend Decreases. Duplicative content, overlapping agencies, and redundant tools are identified and rationalized.
Visibility and Trust Increase. Content becomes findable, structured, and cited by both search engines and generative AI.
Risk Declines. Technical debt, migration shocks, and compliance failures are detected early.
Innovation Accelerates. Modular systems and shared data layers enable faster experimentation.
Enterprise Value Compounds. Web performance improvements flow into revenue growth and cost efficiency.
This is the same logic CFOs apply to physical assets. The only difference is that digital assets rarely appear on the balance sheet, so their underperformance remains invisible until a crisis.
Why Now: The AI Search Inflection Point
The rise of generative search makes these questions urgent. Search is no longer a static list of links; it’s a recommendation system. AI engines evaluate authority, trust, and structured data across the web to synthesize answers.
If your website isn’t structured, trusted, and machine-readable, your company risks digital disintermediation and being invisible in the ecosystems that shape decisions. For CEOs, that’s not a marketing problem; it’s an enterprise risk.
As AI systems determine which brands get cited and recommended, your digital infrastructure becomes the new supply chain for relevance and reputation.
Final Thought
The CEOs who win the next decade won’t outspend their competitors – they’ll out-align them. They’ll treat digital infrastructure with the same financial discipline as physical assets, measure contribution instead of activity, and lead teams to think in systems rather than silos.
Every boardroom already measures financial capital. It’s time to start measuring digital capital, and your website is where it compounds.
In the AI era, your website isn’t just how people find you. It’s how machines define you.
Informational sites can easily decline into a crisis of search visibility. No site is immune. Here are five ways to manage content to maintain steady traffic, increase the ability to adapt to changing audiences, and make confident choices that help the site maintain growth momentum over time.
1. Create A Mix Of Content Types
Publishers are in a constant race to publish what's latest because being first to publish can be a source of massive traffic. The main problem with these kinds of sites is that publishing content about current events carries risks that put the sustainability of the publication into question.
Current events quickly become stale and no longer relevant to an audience.
Unforeseen events like an industry strike, accidents, world events, and pandemics can disrupt interest in a topic.
The focus then is to identify content topics that are reliably relevant to the website’s current audience. This kind of content is called evergreen content, and it can form a safety net of reliable traffic that can sustain the business during slow cycles.
An example of the mixed approach to content that comes to mind is how the New York Times has a standalone recipes section on a subdomain of the main website. It also has a separate section dedicated to product reviews called The Wirecutter.
Another example is the entertainment niche, which in addition to industry news also publishes interviews with stars and essays about popular movies. Music websites publish the latest news but also content based on snippets from interviews with famous musicians in which the musicians make interesting statements about songs, inspirations, and cultural observations.
Rolling Stone magazine publishes content about music but also about current events like politics that align with their reader interests.
All three of those examples expand into adjacent topics in order to strengthen their ability to attract steady, reliable traffic.
2. Evergreen Content Also Needs Current Event Topics
Conversely, evergreen topics can generate new audience reach and growth by expanding to cover current events. Content sites about recipes, gardening, home repairs, DIY, crafts, parenting, personal finance, and fitness are all examples of topics that feature evergreen content and can also expand to cover current events. The flow of traffic derived from trending topics is an excellent source of devoted readers who return to read evergreen content and end up recommending the site to friends for both current events and evergreen topics.
Current events can be related to products and even to statements by famous people. If you enjoy creating content or making discoveries, then you’ll enjoy the challenge of discovering new sources of trending topics.
If you don’t already have a mix of evergreen and ephemeral content, then I would encourage you to seek opportunities to focus on those kinds of articles. They can help sustain traffic levels while feeding growth and life into the website.
3. Beware Of Old Content
Google evaluates the total content of a website in order to generate a quality score. Google is vague about these whole-site evaluations. We only know that they do it and that a good evaluation can have a positive effect on traffic.
However, what happens when the site becomes top-heavy with old, stale content that’s no longer relevant to site visitors? This can become a drag on a website. There are multiple ways of handling this situation.
Content that is absolutely out of date, of no interest to anyone, and therefore no longer useful should be removed. The criterion for judging content is usefulness, not the age of the content. The reason to prune this content is that it's possible a whole-site evaluation may conclude that most of the website is made up of unhelpful, outdated web pages. This could be a negative drag on site performance.
There’s nothing inherently wrong with old content as long as it’s useful. For example, the New York Times keeps old movie reviews in archives that are organized by year, month, day, category, and article title.
The URL slug for the movie review of E.T. looks like this: /1982/06/11/movies/et-fantasy-from-spielberg.html
Screenshot Of Archived Article
Take Decisive Steps
Useful historical content can be archived.
Older content that is out of date can be rehabilitated.
Content that’s out of date and has been superseded by new content can be redirected with a 301 response code to the new content.
Content that is out of date and objectively useless should be removed from the website and allowed to show a 404 response code.
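For the redirect and removal cases above, here is a minimal sketch of how this might be wired up in a Node/Express app. The URLs are hypothetical, and your CMS or server configuration likely has its own, more appropriate way to do the same thing.

```typescript
// Minimal sketch (Node + Express, hypothetical URLs) of the redirect and
// removal cases above: superseded pages return a 301 to their replacement,
// removed pages return a 404.
import express from "express";

const app = express();

// Outdated pages that have a newer replacement: 301 redirect.
const redirects: Record<string, string> = {
  "/2019/old-buying-guide": "/2025/updated-buying-guide",
  "/2018/discontinued-product-review": "/2025/current-product-roundup",
};

// Outdated pages with no useful replacement: let them 404.
const removed = new Set(["/2012/defunct-event-coverage"]);

app.use((req, res, next) => {
  const target = redirects[req.path];
  if (target) {
    res.redirect(301, target); // permanent redirect to the superseding page
    return;
  }
  if (removed.has(req.path)) {
    res.status(404).send("Not found"); // removed content returns 404
    return;
  }
  next(); // everything else is served normally
});

app.listen(3000);
```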
4. Topic Interest
Something that can cause traffic to decline on an informational site is waning interest. Technological innovation can cause the popularity of another product to decline, dragging website traffic along with it. For example, I consulted for a website that reported its traffic was declining. The site still ranked for its keywords, but a quick look at Google Trends showed that interest in the website topic was declining. This was several months after the introduction of the iPhone, which negatively impacted a broad category of products that the website was centered on.
Always keep track of how interested your audience is in your topic. Follow influencers in your niche topic on social media to gauge what they are talking about and whether there are any shifts in the conversation that indicate waning interest or growing interest in a related topic.
Always try out new subcategories of your topic that cross over with your readership to see if there is an audience there that can be cultivated.
Another nuance to consider is the difference between temporary dips in interest and long-term structural decline. Some topics experience predictable cycles driven by seasons, economic conditions, or news coverage, while others face permanent erosion as user needs change or alternatives emerge. Misreading a cyclical slowdown as a permanent decline can lead to unnecessary pivots, while ignoring structural shifts can leave a site over-invested in content that no longer aligns with how people search, buy, or learn.
Monitoring topic interest is less about reacting to short-term fluctuations and more about keeping aware of topical interest and trends. By monitoring audience behavior, tracking broader trends, and experimenting at the edges of the core topic, an informational site can adjust gradually rather than being forced into abrupt changes after traffic has already declined. This ongoing attention helps ensure that content decisions remain grounded in how interest evolves over time.
5. Differentiate
Something that happens to a lot of informational websites is that competitors in a topic tend to cover the exact same stories and even have similar styles of photos, about pages, and bios.
B2B software sites have images of people around a laptop, images of a serious professional, and people gesturing at a computer or a whiteboard.
Recipe sites feature the Flat Lay (food photographed from above), the Ingredient Still Life portrait, and action shots of ingredients grated, sprinkled, or in mid-air.
Websites tend to converge into homogeneity in the images they use and the kind of content that’s shared, based on the idea that if it’s working for competitors, then it may be a good approach. But sometimes it’s best to step out of the pack and do things differently.
Evolve your images so that they stand out or catch the eye, try a different way of communicating your content, identify the common concept that everyone uses, and see if there’s an alternate approach that makes your site more authentic.
For example, a recipe site can show photographic bloopers or discuss what can go wrong and how to fix or avoid it. Being real is authentic. So why not show what underbaked looks like? Instagram and Pinterest are traffic drivers, but does that mean all images must be impossibly perfect? Maybe people might respond to the opposite of homogeneity and fake perfection.
The thing that’s almost always missing from product reviews is photos of the testers actually using the products. Is it because the reviews are fake? Hm… Show images of the products with proof that they’ve been used.
Takeaways
Sustainable traffic can be cultivated with a mix of evergreen and timely content. Find the balance that works for your website.
Evergreen content performs best when it is periodically refreshed with up-to-date details.
Outdated content that lacks utility or meaning in people’s lives can quietly grow to suppress site-wide performance. Old pages should be reviewed for usefulness and then archived, updated, redirected, or removed.
Audience interest in a topic can decline even if rankings remain stable. Monitoring search demand and cultural shifts helps publishers know when it’s time to expand into adjacent topics before traffic erosion becomes severe.
Differentiation matters as much as coverage. Sites that mirror competitors in visuals, formats, and voice risk blending into sameness, while original presentation and proof of authentic experience build trust and attention.
Search visibility declines are not caused by a single technical flaw or isolated content mistake but by gradual misalignment between what a site publishes and what audiences continue to value. Sites that rely too heavily on fleeting interest, allow outdated material to accumulate, or follow competitors into visual and editorial homogeneity risk signaling mediocrity rather than relevance and inspiring enthusiasm. Sustained performance depends on actively managing content, balancing evergreen coverage with current events, pruning what’s no longer useful, and making deliberate choices that distinguish the site as useful, authentic, and credible.
Safari 26.2 adds support for measuring Largest Contentful Paint (LCP) and the Event Timing API, which is used to calculate Interaction to Next Paint (INP). This enables site owners to collect LCP and INP data from Safari users through the browser Performance API using their own analytics and real user monitoring tools.
LCP And INP In Apple Safari Browser
LCP is a Core Web Vital and a ranking signal. Interaction To Next Paint (INP), also a Core Web Vitals metric, measures how quickly your website responds to user interactions. Native Safari browser support enables accurate measurement, which closes a long-standing blind spot for performance diagnostics of site visitors using Apple devices.
INP is a particularly critical measurement because it reports on the total time between a user’s action (click, tap, or key press) and the visual update on the screen. It tracks the slowest interaction observed during a user’s visit. INP is important because it enables site owners to know if the page feels “frozen” or laggy for site visitors. Fast INP scores translate to a positive user experience for site visitors who are interacting with the website.
This change will have no effect on public tools like PageSpeed Insights and CrUX data because they are Chrome-based.
However, Safari site visitors can now be included in field performance data where site owners have configured measurement, such as in Google Analytics or other performance monitoring platforms.
The following analytics packages can now be configured to surface these metrics from Safari browser site visitors:
Google Analytics (GA4, via Web Vitals or custom event collection)
Adobe Analytics
Matomo
Amplitude (with performance instrumentation)
Mixpanel (with custom event pipelines)
Custom / In-House Monitoring
Apple Safari’s update also enables Real User Monitoring (RUM) platforms to surface this data for site owners:
“Safari 26.2 adds support for two tools that measure the performance of web applications, Event Timing API and Largest Contentful Paint.
The Event Timing API lets you measure how long it takes for your site to respond to user interactions. When someone clicks a button, types in a field, or taps on a link, the API tracks the full timeline — from the initial input through your event handlers and any DOM updates, all the way to when the browser paints the result on screen. This gives you insight into whether your site feels responsive or sluggish to users. The API reports performance entries for interactions that take longer than a certain threshold, so you can identify which specific events are causing delays. It makes measuring “Interaction to Next Paint” (INP) possible.
Largest Contentful Paint (LCP) measures how long it takes for the largest visible element to appear in the viewport during page load. This is typically your main image, a hero section, or a large block of text — whatever dominates the initial view. LCP gives you a clear signal about when your page feels loaded to users, even if other resources are still downloading in the background.”
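For site owners who want to collect this data themselves, here is a minimal TypeScript sketch using the browser Performance API. The /analytics endpoint is hypothetical, and the INP calculation is simplified to the single longest interaction; production tooling such as the web-vitals library handles interaction grouping and percentiles more carefully.

```typescript
// Minimal sketch: collecting LCP and Event Timing entries in the browser
// and beaconing them to a hypothetical analytics endpoint.
let lcp = 0;
let longestInteraction = 0; // rough stand-in for INP (simplified)

// Largest Contentful Paint: keep the latest (largest) candidate reported.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    lcp = entry.startTime;
  }
}).observe({ type: "largest-contentful-paint", buffered: true });

// Event Timing: entries are reported for interactions above a duration
// threshold; track the slowest one seen during the visit.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    longestInteraction = Math.max(longestInteraction, entry.duration);
  }
}).observe({ type: "event", buffered: true, durationThreshold: 40 });

// Send the collected values when the page is hidden.
addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") {
    navigator.sendBeacon(
      "/analytics", // hypothetical collection endpoint
      JSON.stringify({ lcp, longestInteraction })
    );
  }
});
```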
Safari 26.2 provides new data that is critical for SEO and for monitoring the user experience, information that site owners rely on. Safari traffic represents a significant share of site visits. These improvements make it possible for site owners to have a more complete view of the real user experience across more devices and browsers.
Honestly, I don’t think Google can handle this at all. The scale is unprecedented. They went after publishers manually with the site reputation abuse update. More expired domain abuse is reaching the top of the SERPs than at any time I can remember in recent history. They’re fighting a losing battle, and they’ve taken their eye off the ball.
In a microcosm, this is what’s happening (Image Credit: Harry Clarkson-Bennett)
A few years ago, search was getting on top of the various spam issues "creative" SEOs were trialling. The prospect of being nerfed by a spam update, and Google's willingness to invest in and care about the quality of search, seemed to be winning the war. Trying to recover from these penalties is nothing short of disastrous. Just ask anybody hit by the Helpful Content update.
But things have shifted. AI is haphazardly rewriting the rules, and big tech has bigger, more poisonous fish to fry. This is not a great time to be a white hat SEO.
TL;DR
Google is currently losing the war against spam, with unprecedented scale driven by AI-generated slop, and expired domain and PBN abuse.
Google’s spam detection monitors four key groups of signals – content, links, reputational, and behavioral.
Data from the Google Leak suggests its most capable detection focuses on link velocity and anchor text.
AI “search” is dozens of times more expensive than traditional search. This enormous cost and focus on new AI products is leading to underinvestment in core spam-fighting.
How Does Google’s Spam Detection System Work?
Via SpamBrain. Previously, the search giant rolled out Penguin, Panda, and RankBrain to make better decisions based on links and keywords.
And right now, badly.
SpamBrain is designed to identify content and websites engaging in spammy activities with apparently “shocking” accuracy. I don’t know whether shocking in this sense is meant in a positive or negative way right now, but I can only parrot what is said.
Over time, the algorithm learns what is and isn’t spam. Once it has clearly established signals associated with spammy sites, it’s able to create a neural network.
Much like the concept of seed sites, if you have the spammiest websites mapped out, you can accurately score everyone else against them. Then you can analyse signals at scale – content, links, behavioral, and reputational signals – to group sites together.
Inputs (content, linking, reputational, and behavioral signals).
Hidden layer (clustering and comparing each site to known spam ones).
Outputs (spam or not spam).
If your site is bucketed in the same group as obviously spammy sites when it comes to any of the above, that is not a good sign. The algorithm works on thresholds. I imagine you need to sail pretty close to the wind for long enough to get hit by a spam update.
But if your content is relatively thin and low value add, you’re probably halfway there. Add some dangerous links into the mix, some poor business decisions (parasite SEO being the most obvious example), and scaled content abuse, and you’re doomed.
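To picture how a threshold model like that might work, here is a purely speculative TypeScript sketch. The signal groups come from the list above, but the weights and threshold are invented for illustration; nobody outside Google knows how SpamBrain actually scores sites.

```typescript
// Purely illustrative threshold model (not SpamBrain, whose internals are unknown).
// Each signal group gets a 0-1 score; a weighted sum past a threshold
// buckets the site with known spam.
interface SiteSignals {
  content: number;    // e.g. thin or scaled content indicators
  links: number;      // e.g. anchor text and link velocity anomalies
  reputation: number; // e.g. association with known spam neighborhoods
  behavior: number;   // e.g. manipulated engagement signals
}

// Hypothetical weights and threshold, chosen only for the example.
const weights = { content: 0.3, links: 0.35, reputation: 0.2, behavior: 0.15 };
const spamThreshold = 0.6;

function isBucketedAsSpam(s: SiteSignals): boolean {
  const score =
    s.content * weights.content +
    s.links * weights.links +
    s.reputation * weights.reputation +
    s.behavior * weights.behavior;
  return score >= spamThreshold;
}

// A site with thin content and risky links sails close to the wind:
console.log(
  isBucketedAsSpam({ content: 0.7, links: 0.8, reputation: 0.4, behavior: 0.3 })
); // true in this toy model
```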
Lots of these are grossly intertwined. Expired domain abuse and PBNs. Keyword stuffing is a little old hat, but link spam is still very much alive and well. Scaled content abuse is at an all-time high across the internet.
The more content you have spread across multiple, semantically similar websites, the more effective you can be. The more you use exact and partial match anchors to funnel that authority toward "money" pages, the richer you will become.
Millions of page views have been sent to expired and drop domain abusers (Image Credit: Harry Clarkson-Bennett)
From changing the state pension age to free bus passes and TV licenses, the spammers know the market. They know how to incite emotions. Hell hath no fury like a pensioner scorned, and while you can forgive the odd slip-up, nobody can be this generous.
The people who have been working by the book are being sidelined. But the opportunities in the black hat world are booming. Which is, in fairness, quite fun.
Not hard to see what the problem is… (Image Credit: Harry Clarkson-Bennett)
Award-winning journalist Jean-Marc Manach's research has found over 8,300 AI-generated news websites in French and over 300 in English (the tip of the iceberg, trust me).
He estimates two of these site owners have become millionaires.
By leveraging authoritative expired domains and PBNs (more on that next), SEOs – the people still ruining the internet – know how to game the system: by faking clicks, manipulating engagement signals, and utilizing past link equity effectively.
Expired Domain Abuse
The big daddy. Black hat ground zero.
If you engage even a little bit with a black hat community, you’ll know how easy it is right now to leverage expired domains. In the example below, someone had bought the London Road Safety website (a once highly authoritative domain) and turned it into a single-page “best betting sites not on GamStop” site.
Betting and crypto are ground zero for all things black hat, just because there’s so much money involved.
I’m not an expert here, but I believe the process is as follows:
Purchase an expired, valuable domain with a strong, clean backlink history (no manual penalties). Ideally, a few of them.
Then you can begin to create your own PBN with unique hosting providers, nameservers, and IP addresses, with a variety of authoritative, aged, and newer domains.
This domain (or domains) then becomes your equity/authority stronghold.
Spin up multiple TLD variations of the domain, e.g., instead of .com it becomes .org.uk.
Add a mix of exact and partial match anchors from a PBN to the money site to signal its new focus.
Either add a 301 redirect for a short period of time to the money variation of the domain or canonicalize to the variation.
These scams are always short-term plays. But they can be worth tens or hundreds of thousands of pounds when done well. And they are back, and I believe more valuable than ever.
Right now, I think it’s as simple as buying an old charity domain, adding a quick reskin, and voila. A 301 or other equity-passing tactic, and your single-page site about ‘best casinos not on GamStop’ is printing money. Even in the English-speaking market.
A PBN (or Private Blog Network) is a network of websites that someone controls, all linking back to the money site – the variation of the site designed to generate revenue, typically from advertising or affiliates.
The sites in a private blog network have to be completely distinct from each other. They cannot share breadcrumbs that Google can trace (a quick footprint check, sketched after this list, shows how easily shared infrastructure gives a network away). Each site needs a standalone:
Hosting provider.
IP address.
Nameserver.
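Those breadcrumbs are exactly what a footprint check looks for. Here’s a minimal sketch of one, assuming the dnspython package and a made-up list of domains; it simply flags nameservers or IPs shared across the set, which is the kind of trace a network operator is trying to avoid and a search engine can follow.

```python
# Rough footprint check: do these (hypothetical) domains share infrastructure?
# Requires: pip install dnspython
import socket
from collections import defaultdict

import dns.resolver

domains = ["example-blog-one.com", "example-blog-two.com", "example-money-site.com"]

by_nameserver = defaultdict(list)
by_ip = defaultdict(list)

for domain in domains:
    try:
        for ns in dns.resolver.resolve(domain, "NS"):
            by_nameserver[str(ns.target).lower()].append(domain)
        by_ip[socket.gethostbyname(domain)].append(domain)
    except Exception as exc:  # NXDOMAIN, timeouts, etc.
        print(f"Lookup failed for {domain}: {exc}")

# Any nameserver or IP shared by two or more domains is the kind of
# breadcrumb a PBN operator tries to avoid, and a search engine can trace.
for label, groups in (("nameserver", by_nameserver), ("IP", by_ip)):
    for key, shared in groups.items():
        if len(shared) > 1:
            print(f"Shared {label} {key}: {shared}")
```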
The reason PBNs are so valuable is that you can build up an enormous amount of link equity and falsified topical authority while spreading risk. Expired domains are risky because they’re expensive, and once they get a penalty, they’re doomed. PBNs spread the risk. Like the heads of a Hydra: one dies, another rises up.
Protecting the tier 1 asset (the purchased aged or expired domain) is paramount. Instead of pointing links directly to the money site, you can link to the sites that link to the money site.
This indirectly boosts the value of the money site, protecting it from Google’s prying eyes.
What Does The Google Leak Show About Spam?
As always, this is an inexact science. Barely even pseudo-science really. I’ve got the tinfoil hat on and a lot of string connecting wild snippets of information around the room to make this work. You should follow Shaun Anderson here.
If I take every mention of the word “spam” in the module names and descriptions, there are around 115, once I’ve removed any nonsense. Then we can categorize those into content, links, reputational, and behavioral signals.
Taking it one step further, these modules can be classified as relating to things like link building, anchor text, content quality, et al. This gives us a rough sense of what matters in terms of scale.
Anchor text makes up the lion’s share of spammy modules, based on data from the Google Leak (and my own flawed categorization) (Image Credit: Harry Clarkson-Bennett)
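The counting exercise itself is easy to reproduce. A rough sketch, assuming you’ve exported the leaked attribute names and descriptions to a CSV of your own; the file name, column names, and category keywords below are my assumptions, not anything from the leak.

```python
# Sketch of the categorization exercise: pull attribute names/descriptions that
# mention "spam" and bucket them with crude keyword matching.
import csv
from collections import Counter

CATEGORY_KEYWORDS = {
    "links": ["anchor", "link", "outlink", "backlink"],
    "content": ["content", "keyword", "doc", "gibberish"],
    "behavioral": ["click", "nav", "user"],
    "reputational": ["trust", "authority", "source", "site"],
}

def categorize(text: str) -> str:
    text = text.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "uncategorized"

counts = Counter()
with open("leak_attributes.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):  # expects "name" and "description" columns
        blob = f"{row['name']} {row['description']}"
        if "spam" in blob.lower():
            counts[categorize(blob)] += 1

print(counts)  # e.g. Counter({'links': ..., 'content': ...}); your buckets will differ
```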
A few examples:
spambrainTotalDocSpamScore calculates a document’s overall spam score.
The IndexingDocjoinerAnchorPhraseSpamInfo and IndexingDocjoinerAnchorSpamInfo modules identify spammy anchor phrases by looking at the number and velocity of links, the days they were discovered, and when the spike ended.
GeostoreSourceTrustProto helps evaluate the trustworthiness of a source.
Really, the takeaway is how important links are from a spam perspective. Particularly anchor text. The velocity at which you gain links matters, as does the anchor text and the surrounding content. Linking seems to be where Google’s algorithm is most capable of identifying red and amber flags.
If your link velocity graph spiked with exact match anchors to highly commercial pages, that’s a flag. Once a site is pinged for this type of content or link-related abuse, the behavioral and reputational signals are analysed as part of SpamBrain.
If these corroborate and your site exceeds certain thresholds, you’re doomed. It’s why this has (until recently) been a relatively fine art.
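As a back-of-the-envelope illustration of what a velocity-plus-anchors flag might look like, here’s a toy sketch that buckets newly discovered links by week and flags weeks where volume spikes and exact-match commercial anchors dominate. The data and thresholds are invented; Google’s actual thresholds are unknown.

```python
# Toy link-velocity check: flag weeks where new links spike and exact-match
# commercial anchors dominate. Data and thresholds are made up for illustration.
from collections import defaultdict
from datetime import date
from statistics import mean

# (date discovered, anchor text, is_exact_match_commercial)
links = [
    (date(2025, 3, 3), "harry's blog", False),
    (date(2025, 3, 10), "best casino bonus", True),
    (date(2025, 3, 10), "best casino bonus", True),
    (date(2025, 3, 11), "best casino bonus", True),
    (date(2025, 3, 12), "online casino offers", True),
    (date(2025, 3, 13), "best casino bonus", True),
    (date(2025, 4, 1), "useful resource", False),
]

weeks = defaultdict(list)
for discovered, anchor, is_exact in links:
    weeks[discovered.isocalendar()[:2]].append(is_exact)  # bucket by (year, week)

baseline = mean(len(anchors) for anchors in weeks.values())
for week, anchors in sorted(weeks.items()):
    velocity = len(anchors)
    exact_share = sum(anchors) / velocity
    if velocity > 2 * baseline and exact_share > 0.7:
        print(f"Week {week}: {velocity} new links, "
              f"{exact_share:.0%} exact-match commercial -> flag")
```

The real system presumably weighs far richer signals, but the shape (velocity relative to a baseline, plus anchor composition) matches what those leak modules describe.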
Ultimately, They’re Just Investing Less In Traditional Search
As Martin McGarry pointed out, they just care a bit less … They have bigger, more hallucinogenic fish to fry.
Image Credit: Harry Clarkson-Bennett
In 2025, we have had four updates with a combined duration of c. 70 days. In 2024, we had seven that lasted almost 130 days. Productivity levels we can all aspire to.
It’s Not Hard To Guess Why…
The bleeding-edge search experience is changing. Google is rolling out preferred publisher sources globally and inline linking more effectively in its AI products. Much-needed changes.
I think we’re seeing the real-time moulding of the new search experience in the form of The Google Web Guide. A personalized mix of trusted sources, AI Mode, a more classic search interface, and something inspirational. I suspect this might be a little like a Discover-lite feed. A place in the traditional search interface where content you will almost certainly like is fed to you to keep you engaged.
Once you understand enough about a user to bucket them into specific groups, you can saturate a market over the course of a few days in Discover. Less, even. But the problem is the economics of it all. Ten blue links are cheap. AI is not. At any level.
According to Google, when someone chooses a preferred source, they click through to that site twice as often on average. So I suspect it’s worth taking seriously.
While Google hasn’t released public information on this, it’s no secret that AI searches are significantly more expensive than the classic 10 blue links. Traditional search is largely static and retrieval-based. It relies on pre-indexed pages to serve a list of links and is very cheap to run.
An AI Overview is generative. Google has to run a large language model to summarize and generate a natural language answer. AI Mode is significantly worse. The multi-turn, conversational interface processes the entire dialogue in addition to the new query.
Given the query fan-out technique – where dozens of searches are run in parallel – this process demands significantly more computational power.
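Purely to illustrate why fan-out multiplies cost (not to describe Google’s internals): one user query becomes N sub-queries executed in parallel, so the retrieval and generation work scales with N. A minimal sketch with a stand-in fetch function:

```python
# Illustration of query fan-out: one user query becomes several parallel
# sub-queries, multiplying retrieval work. fetch_results() is a stand-in.
from concurrent.futures import ThreadPoolExecutor

def fetch_results(sub_query: str) -> str:
    # Stand-in for a retrieval call (index lookup, ranking, snippet generation).
    return f"results for: {sub_query}"

def fan_out(user_query: str) -> list[str]:
    # In practice an LLM would generate these; hard-coded here for illustration.
    sub_queries = [
        f"{user_query} reviews",
        f"{user_query} price comparison",
        f"best alternatives to {user_query}",
        f"{user_query} common problems",
    ]
    with ThreadPoolExecutor(max_workers=len(sub_queries)) as pool:
        return list(pool.map(fetch_results, sub_queries))

print(fan_out("waterproof hiking boots"))  # one query in, four retrieval calls out
```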
Custom chips, efficiencies, and caching can reduce the cost of this. But this is one of Google’s biggest challenges, and I suspect it’s exactly why Barry believes AI Mode won’t be the default search experience. I’d be surprised if it isn’t just applied at a search/personalization level, too. There are plenty of branded and navigational searches where this would be an enormous waste of money.
And these guys really love money.
According to The IET, if the population of London (>9.7 million) each asked ChatGPT to write a 100-word email, this would require 4,874,000 litres of water to cool the servers – equivalent to filling over seven 25m swimming pools.
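Those figures roughly check out on the back of an envelope, assuming a 25m pool holds somewhere around 625,000 litres (the pool volume is my assumption, not part of the original claim):

```python
# Quick sanity check on the IET figures. The pool volume is an assumption
# (roughly 25m x 12.5m x 2m), not part of the original claim.
litres_total = 4_874_000
londoners = 9_700_000
pool_litres = 625_000

print(litres_total / londoners)    # ~0.5 litres of cooling water per 100-word email
print(litres_total / pool_litres)  # ~7.8 pools, i.e. "over seven"
```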
When you add a line in your footer describing something you or your business did, it’s taken as read. Spammy, low-quality tactics work more effectively than heavy lifting.
Ideally, we wouldn’t live in a world where low-lift shit outperforms proper marketing efforts. But here we are.
Like in 2012, “best” lists are on the tip of everyone’s tongue. Basic SEO is making a comeback because that’s what is currently working in LLMs. Paid placements, reciprocal link exchanges. You name it.
Image Credit: Harry Clarkson-Bennett
If it’s half-arsed, it’s making a comeback.
As these models rely on Google’s index for searches that the model cannot confidently answer (RAG), Google’s spam engine matters more than ever. In the same way that I think publishers need to take a stand against big tech and AI, Google needs to step up and take this seriously.
I’m Not Sure Anyone Is Going To…
I’m not even sure they want to right now. OpenAI has signed some pretty extraordinary contracts, and its revenue is light-years away from where it needs to be. And Google’s capex is through the roof.
So, things like quality and accuracy are not at the top of the list. Consumer and investor confidence is not that high. They need to make some money. And private companies can be a bit laissez-faire when it comes to reporting on revenue and profits.
New funding has to be thrown at data centres (Image Credit: Harry Clarkson-Bennett)
Let’s see them post-hoc rationalize their way out of this one. That’s it. Thank you for reading and subscribing to my last update of the year. Certainly been a year.
There are multiple reasons why a site can drop in rankings due to a core algorithm update. The reasons may reflect specific changes to the way Google interprets content, a search query, or both. The change could also be subtle, like an infrastructure update that enables finer relevance and quality judgments. Here are eight commonly overlooked reasons why a site may have lost rankings after a Google core update.
Ranking Where It’s Supposed To Rank?
If the site was previously ranking well and now it doesn’t, it could be what I call “it’s ranking where it’s supposed to rank.” That means that some part of Google’s algorithm has caught up to a loophole that the page was intentionally or accidentally taking advantage of and is currently ranking it where it should have been ranking in the first place.
This is difficult to diagnose because a publisher might believe that the web pages or links were perfect the way they previously were, but in fact there was an issue.
Topic Theming Defines Relevance
A part of the ranking process is determining what the topic of a web page is. Google admitted a year ago that a core topicality system is a part of the ranking process. The concept of topicality as part of the ranking algorithm is real.
The so-called Medic Update of 2018 brought this part of Google’s algorithm into sharp focus. Suddenly, sites that were previously relevant for medical keywords were nowhere to be found because they dealt in folk remedies, not medical ones. What happened was that Google’s understanding of what keyword phrases were about became more topically focused.
Bill Slawski wrote about a Google patent (Website representation vector) that describes a way to classify websites by knowledge domains and expertise levels that sounds like a direct match to what the Medic Update was about.
The patent describes part of what it’s doing:
“The search system can use information for a search query to determine a particular website classification that is most responsive to the search query and select only search results with that particular website classification for a search results page. For example, in response to receipt of a query about a medical condition, the search system may select only websites in the first category, e.g., authored by experts, for a search results page.”
Google’s interpretation of what it means to be relevant became increasingly about topicality in 2018 and continued to be refined in successive updates over the years. Instead of relying on links and keyword similarity, Google introduced a way to identify and classify sites by knowledge domain (the topic) in order to better understand how search queries and content are relevant to each other.
Returning to the medical queries, the reason many sites lost rankings during the Medic Update was that their topics were outside the knowledge domain of medical remedies and science. Sites about folk and alternative healing were permanently locked out of ranking for medical phrases, and no amount of links could ever restore their rankings. The same thing happened across many other topics and continues to affect rankings as Google’s ability to understand the nuances of topical relevance is updated.
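A heavily simplified way to picture the patent’s idea: represent a site and a handful of knowledge domains as vectors, then assign the site to whichever domain it sits closest to. The vectors below are invented placeholders; a real system would derive them from content, link, and authorship signals.

```python
# Toy version of the "website representation vector" idea: compare a site's
# vector against knowledge-domain centroids. All numbers here are invented.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

domain_centroids = {
    "medical / scientific": np.array([0.9, 0.1, 0.1]),
    "folk / alternative remedies": np.array([0.2, 0.9, 0.1]),
    "general lifestyle": np.array([0.3, 0.3, 0.8]),
}

# Hypothetical representation of a site that mostly covers alternative remedies.
site_vector = np.array([0.25, 0.85, 0.2])

scores = {domain: cosine(site_vector, centroid)
          for domain, centroid in domain_centroids.items()}
print(max(scores, key=scores.get))  # -> folk / alternative remedies

# For a query classified as medical, a site bucketed in the folk-remedies
# domain simply isn't in the candidate pool, no matter how many links it has.
```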
Example Of Topical Theming
A way to think of topical theming is to consider that keyword phrases can be themed by topic. For example, the keyword phrase “bomber jacket” is related to military clothing, flight clothing, and men’s jackets. At the time of writing, Alpha Industries, a manufacturer of military clothing, is ranked number one in Google. Alpha Industries is closely related to military clothing because the company not only focuses on selling military-style clothing, it started out as a military contractor producing clothing for America’s military, so it’s closely identified by consumers with military clothing.
Screenshot Showing Topical Theming
So it’s not surprising that Alpha Industries ranks #1 for bomber jacket because it ticks both boxes for the topicality of the phrase Bomber Jacket:
Shopping > Military clothing
Shopping > Men’s clothing
If your page was previously ranking and now it isn’t, then it’s possible that the topical theme was redefined more sharply. The only way to check this is to review the top ranked sites, focusing, for example, on the differences between ranges such as position one and two, or sometimes positions one through three or positions one through five. The range depends on how the topic is themed. In the example of the Bomber Jacket rankings, positions one through three are themed by “military clothing” and “Men’s clothing.” Position three in my example is held by the Thursday Boot Company, which is themed more closely with “men’s clothing” than it is with military clothing. Perhaps not coincidentally, the Thursday Boot Company is closely identified with men’s fashion.
This is a way to analyze the SERPs to understand why sites are ranking and why others are not.
Topic Personalization
Sometimes the topical themes are not locked into place because user intents can change. In that case, opening a new browser or searching a second time in a different tab might cause Google to change the topical theme to a different topical intent.
In the case of the “bomber jacket” search results, the hierarchy of topical themes can change to:
Informational > Article About Bomber Jackets
Shopping > Military clothing
Shopping > Men’s clothing
The reason for that is directly related to the user’s information need which informs the intent and the correct topic. In the above case it looks like the military clothing theme may be the dominant user intent for this topic but the informational/discovery intent may be a close tie that’s triggered by personalization. This can vary by previous searches but also by geographic location, a user’s device, and even by the time of day.
The takeaway is that there may not be anything wrong with a site; it’s just ranking for a more specific topical intent. So if personalization shifts the topic so that your page no longer ranks, a solution may be to create another page focused on the additional topic theme that Google is ranking.
Authoritativeness
In one sense, authoritativeness can be seen as external validation of a website’s expertise as a go-to source for a product, service, or content topic. While the expertise of the author contributes to authoritativeness, and authoritativeness in a topic can be inherent to a website, ultimately it’s third-party recognition from readers, customers, and other websites (in the form of citations and links) that communicates a website’s authoritativeness back to Google as a validating signal.
The above can be reduced to these four points:
Expertise and topical focus originate within the website.
Authoritativeness is the recognition of that expertise.
Google does not assess that recognition directly.
Third-party signals can validate a site’s authoritativeness.
To that we can add the previously discussed Website Representation Vector patent that shows how Google can identify expertise and authoritativeness.
What’s going on then is that Google selects relevant content and then winnows that down by prioritizing expert content.
“Google’s automated systems are designed to use many different factors to rank great content. After identifying relevant content, our systems aim to prioritize those that seem most helpful. To do this, they identify a mix of factors that can help determine which content demonstrates aspects of experience, expertise, authoritativeness, and trustworthiness, or what we call E-E-A-T.”
Authoritativeness is not about how often a site publishes about a topic; any spammer can do that. It has to be about more than that. E-E-A-T is a standard to hold your site up to.
Stuck On Page Two Of Search Results? Try Some E-E-A-T
Speaking of E-E-A-T, many SEOs have the mistaken idea that it’s something they can add to websites. That’s not how it works. At the 2025 New York City Search Central Live event, Google’s John Mueller confirmed that E-E-A-T is not something you add to web pages.
He said:
“Sometimes SEOs come to us or like mention that they’ve added EEAT to their web pages. That’s not how it works. Sorry, you can’t sprinkle some experiences on your web pages. It’s like, that doesn’t make any sense.”
Clearly, content reflects qualities of authoritativeness, trustworthiness, expertise, and experience, but it’s not something that you add to content. So what is it?
E-E-A-T is just a standard to hold your site up to. It’s also a subjective judgment made by site visitors. A subjective judgment is like how a sandwich can taste great, with the “great” part being the subjective judgment. It is a matter of opinion.
One thing that is difficult for SEOs to diagnose is when their content is missing that extra something to push their site onto the first page of the SERPs. It can feel unfair to see competitors ranking on the first page of the SERPs even though your content is just as good as theirs.
The differences usually indicate that the top-ranked web pages are optimized for people. Another reason is that more people know about them because they take a multimodal approach to content, whereas the site on page two of the SERPs mainly communicates via text.
In SERPs where Google prefers to rank government and educational sites for a particular keyword phrase, with the exception of one commercial site, I almost always find evidence that the commercial site’s content and outreach resonate with visitors in ways that competing websites’ do not. Websites that focus on multimodal, people-optimized content and experiences are usually what I find in those weird outlier rankings.
So if your site is stuck on page two, revisit the top-ranked web pages and identify ways that those sites are optimized for people and multimodal content. You may be surprised to see what makes those sites resonate with users.
Temporary Rankings
Some rankings are not made to last. This is the case with a new site or new page ranking boost. Google has a thing where it tastes a new site to see how it fits with the rest of the Internet. A lot of SEOs crow about their client’s new website conquering the SERPs right out of the gate. What you almost never hear about is when those same sites drop out of the SERPs.
This isn’t a bad thing. It’s normal. It simply means that Google has tried the site and now it’s time for the site to earn its place in the SERPs.
There’s Nothing Wrong With The Site?
Many site publishers find it frustrating to be told that there’s nothing wrong with their site even though it lost rankings. What’s going on may be that the site and web page are fine, but that the competitors’ pages are finer. These kinds of issues are typically where the content is fine and the competitors’ content is about the same but is better in small ways.
This is the one form of ranking drop that many SEOs and publishers easily overlook because SEOs generally try to identify what’s “wrong” with a site, and when nothing obvious jumps out at them, they try to find something wrong with the backlinks or something else.
This inability to find something wrong leads to recommendations like filing link disavows to get rid of spam links or removing content to fix perceived but not actual problems (like duplicate content). They’re basically grasping at straws to find something to fix.
But sometimes it’s not that something is wrong with the site. Sometimes it’s just that there’s something right with the competitors.
What can be right with competitors?
Links
User experience
Image content (for example, site visitors are reflected in image content).
Multimodal approach
Strong outreach to potential customers
In-person marketing
Word-of-mouth promotion
Better advertising
Optimized for people
SEO Secret Sauce: Optimized For People
Optimizing for people is a common blind spot. It’s a subset of conversion optimization, which is about subtle signals that indicate a web page contains what the site visitor needs.
Sometimes that need is to be recognized and acknowledged. It can be reassurance that you’re available right now or that the business is trustworthy.
For example, a client’s site featured a badge at the top of the page that said something like “Trusted by over 200 of the Fortune 500.” That badge whispered, “We’re legitimate and trustworthy.”
Another example is how a business identified that most of their site visitors were mothers of boys, so their optimization was to prioritize images of mothers with boys. This subtly recognized the site visitor and confirmed that what’s being offered is for them.
Nobody loves a site because it’s heavily SEO’d, but people do love sites that acknowledge the site visitor in some way. This is the secret sauce that’s invisible to SEO tools but helps sites outrank their competitors.
It may be helpful to avoid mimicking what competitors are doing and instead differentiate the site and its outreach in ways that make people like your site more. When I say outreach, I mean actively seeking out places where your typical customer might be hanging out and figuring out how you can make your pitch there. Third-party signals have long been strong ranking factors at Google, and now, with AI Search, what people and other sites say about your site is increasingly playing a role in rankings.
Takeaways
Core updates sometimes correct over-ranking, not punish sites: Ranking drops sometimes reflect Google closing loopholes and placing pages where they should have ranked all along rather than identifying new problems.
Topical theming has become more precise: Core updates sometimes make existing algorithms more precise. Google increasingly ranks content based on topical categories and intent, not just keywords or links.
Topical themes can change dynamically: Search results may shift between informational and commercial themes depending on context such as prior searches, location, device, or time of day.
Authoritativeness is externally validated: Recognition from users, citations, links, and broader awareness can be the difference between one site ranking and another not.
SEO does not control E-E-A-T, and it can’t be reduced to an on-page checklist: While qualities of expertise and authoritativeness are inherent in content, they are judgments inferred from external signals, not something SEOs can add directly to content.
Temporary ranking boosts are normal: New pages and sites are tested briefly, then must earn long-term placement through sustained performance and reception.
Competitors may simply be better for users: Ranking losses often occur because competitors outperform in subtle but meaningful ways, not because the losing site is broken.
People-first optimization is a competitive advantage: Sites that resonate emotionally, visually, and practically with visitors often outperform purely SEO-optimized pages.
Ranking changes after a core update sometimes reflect clearer judgments about relevance, authority, and usefulness rather than newly discovered web page flaws. As Google sharpens how it understands topics, pages increasingly compete on how well they align with what users are actually trying to accomplish and which sources people already recognize and trust. The lasting advantage comes from building a site that resonates with actual visitors, earns attention beyond search, and gives Google consistent evidence that users prefer it over alternatives. Marketing, the old-fashioned tell-people-about-your-business approach to promotion, should not be overlooked.
Featured Image by Shutterstock/Silapavet Konthikamee