Google’s Updated Raters Guidelines Refine Concept Of Low Quality via @sejournal, @martinibuster

Google’s Search Quality Rater Guidelines were updated a few months ago, and several of the changes closely track the talking points shared by Googlers at the 2025 Search Central Live events. Among the most consequential updates are those to the sections defining the lowest quality pages, which more clearly reflect the kinds of sites Google wants to exclude from the search results.

Section 4.0 Lowest Quality Pages

Google added a new definition of the Lowest Rating in the Lowest Quality Pages section. While Google has always been concerned about removing low quality sites from the search results, this change to the raters guidelines likely reflects an emphasis on weeding out a specific kind of low quality website.

The new guideline focuses on identifying the publisher’s motives for publishing the content.

The previous definition said:

“The Lowest rating is required if the page has a harmful purpose, or if it is designed to deceive people about its true purpose or who is responsible for the content on the page.”

The new version keeps that sentence but adds one that asks the quality rater to consider the underlying motives of the publisher responsible for the web page. The guidance directs raters to weigh how the page benefits a site visitor and to judge whether its purpose is entirely to benefit the publisher.

The addition to this section reads:

“The Lowest rating is required if the page is created to benefit the owner of the website (e.g. to make money) with very little or no attempt to benefit website visitors or otherwise serve a beneficial purpose.”

There’s nothing wrong with being motivated to earn an income from a website. What Google is looking at is whether the content only serves that purpose or whether there is also a benefit for the user.

Focus On Effort

The next change is focused on identifying how much effort was put into creating the site. This doesn’t mean that publishers must now document how much time and effort was put into creating the content. This section is simply about looking for evidence that the content is indistinguishable from content on other sites and offers no clear advantages over the content found elsewhere on the Internet.

This part about the main content (MC) was essentially rewritten:

“● The MC is copied, auto-generated, or otherwise created without adequate effort.”

The new version has more nuance about the main content (MC):

“● The MC is created with little to no effort, has little to no originality and the MC adds no value compared to similar pages on the web”

Three things to unpack there:

  1. Content created with little to no effort
  2. Contains little to no originality
  3. Main content adds no additional value

Publishers who focus on keeping up with competitors should be careful that they’re not simply creating the same thing as their competitors. Arguing that it’s different because it covers the same topic only better doesn’t change the fact that it’s the same thing. Even if the content is “ten times better,” it’s still basically the same thing as the competitor’s content, only ten times more of it.

A Word About Content Gap Analysis

Some people are going to lose their minds about what I’m going to say about this, but keep an open mind.

There is a popular SEO process called Content Gap Analysis. It’s about reviewing competitors to identify topics that competitors are writing about that are missing on the client’s site, then copying those topics to fill the content gap.

That is precisely the kind of thing that leads to unoriginality and content that is indistinguishable from everything else on the Internet. It’s the number one reason I would never use a software program that scrapes top-ranked sites and suggests topics based on what competitors are publishing. It results in virtually indistinguishable content and pure unoriginality.

Who wants to jump from one site to another and read the exact same recipes, even if they have more images, graphs, and videos? Copying a competitor’s content “but doing it better” is not original.

Scraping Google’s PAAs (People Also Asked) just like everyone else does not result in original content. It results in content that’s exactly the same as that of everyone else scraping PAAs.

While the practice of content gap analysis is about writing about the same thing only better, it’s still unoriginal. Saying it’s better doesn’t change the fact that it’s the same thing.

Lack of originality is a huge issue with Internet content and it’s something that Google’s Danny Sullivan discussed extensively at the recent Google Search Central Live in New York City.

Instead of looking for information gaps, it’s better to review your competitors’ weaknesses. Then look at their strengths. Then compare both to your own weaknesses and strengths.

A competitor’s weakness can become your strength. This is especially valuable information when competing against a bigger and more powerful competitor.

Takeaways

1. Google’s Emphasis on Motive-Based Quality Judgments

  • Quality raters are now encouraged to judge not just content, but the intent behind it.
  • Pages created purely for monetization, with no benefit to users, should be rated lowest.
  • This may signal Google’s intent to refine its ability to weed out low quality content based on the user experience.

2. Effort and Originality Are Now Central Quality Signals

  • Low-effort or unoriginal content is explicitly called out as justification for the lowest rating.
  • This may signal that Google’s algorithms will increasingly focus on surfacing content with higher levels of originality.
  • Content that doesn’t add distinctive value over competitors may struggle in the search results.

3. Google’s Raters Guidelines Reflect Public Messaging

  • Changes to the Guidelines mirror talking points in recent Search Central Live events.
  • This suggests that Google’s algorithms may become more precise at evaluating things like originality, added value, and the effort put into creating the content.
  • This means publishers should (in my opinion) consider ways to make their sites more original than other sites, to compete by differentiation.

Google updated its Quality Rater Guidelines to draw a sharper line between content that helps users and content that only helps publishers. Pages created with little effort, no originality, or no user benefit are now listed as examples of the lowest quality, even if they seem more complete than competing pages.

Google’s Danny Sullivan cited travel sites that all have the same sidebar introducing the smiling site author, along with other hallmarks of the genre, as an example of how sites become indistinguishable from each other.

The reason publishers do that is that they see what Google is ranking and assume that’s what Google wants. In my experience, that’s not the case. In my opinion, it may be useful to think about what you can do to make a site more original.

Download the latest version of Google’s Search Quality Raters Guidelines here (PDF).

Featured Image by Shutterstock/Kues

Google Answers Why Landing Page Ranks For An E-Commerce Query via @sejournal, @martinibuster

Google’s John Mueller answered a question on Bluesky about why an e-commerce page with minimal content is ranking, illustrating that sometimes optimized content isn’t enough.

E-Commerce Search Results

A person posted their concerns about an e-commerce site that was ranking in the search results with barely any content. In fact, the domain that was ranking redirects to another domain. On the face of it, it looks like something is not right.

Why would Google rank what is essentially a landing page with virtually zero content for a redirected domain?

Why A Landing Page Ranks

The company with the landing page had acquired another company and subsequently joined the two domains. There was nothing wrong or spammy going on; one business bought another business, which happens every day.

The person asking the question dropped a URL and a screenshot of the landing page and asked:

“How does Google think this would be the best result and also, do you think this is a relevant result for users?”

Google’s John Mueller answered:

“It looks like a normal ecommerce site to me. They could have handled the site-migration a bit more gracefully (and are probably losing a lot of “SEO value” by doing this instead of a real migration), but it doesn’t seem terrible for users.”

Site Migration

Mueller expanded further on his comment about the site migration.

He posted:

“Our guidance for site migrations is at https://developers.google.com/search/docs/crawling-indexing/site-move-with-url-changes . What they’re doing is a “soft or crypto redirect”, and they’re doing it “N:1″ (meaning all old pages go there). Both of these make transfering information about the old site hard / impossible.”
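Google’s site-move documentation, which Mueller linked, recommends page-level 301 redirects that map each old URL to its new equivalent. As a rough illustration of the difference between that and the “N:1” pattern Mueller describes, here is a minimal Python sketch; the domains, paths, and the redirect_target function are hypothetical, invented for illustration rather than taken from the site in question.

```python
# Minimal sketch of 1:1 vs. N:1 redirect handling (hypothetical URLs).
OLD_TO_NEW = {
    "/products/blue-widget": "https://new-domain.example/shop/blue-widget",
    "/products/red-widget": "https://new-domain.example/shop/red-widget",
    "/about": "https://new-domain.example/about-us",
}

CATCH_ALL = "https://new-domain.example/"  # the "N:1" pattern: every old URL lands here


def redirect_target(old_path: str, one_to_one: bool = True) -> tuple:
    """Return an HTTP status and Location for a request hitting the old domain."""
    if one_to_one and old_path in OLD_TO_NEW:
        # Page-level 301s preserve the context of each old page.
        return 301, OLD_TO_NEW[old_path]
    # Funneling everything to one page loses page-level information.
    return 301, CATCH_ALL


if __name__ == "__main__":
    print(redirect_target("/products/blue-widget"))         # 1:1 mapping
    print(redirect_target("/products/blue-widget", False))  # N:1 catch-all
```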

Sometimes Google ranks pages that seem like they don’t belong. But sometimes the rankings make sense when looked at from a different perspective, particularly the perspective of what’s good and makes sense for the user. Rankings change all the time, and the rankings for that page could fade after a certain amount of time. But waiting for a competitor to drop away isn’t really a good SEO strategy. Google’s Danny Sullivan had some good advice about differentiating a site for better rankings.

Channel Reporting Is Coming To Performance Max Campaigns via @sejournal, @brookeosmundson

Google launched substantial upgrades to its Performance Max campaigns today.

In its announcement, it introduced long-anticipated reporting features that will provide advertisers with much-needed visibility into how their campaigns perform across different Google surfaces.

These updates include new channel-level reporting, full search terms data, and expanded asset performance metrics.

The goal?

To help marketers better understand, evaluate, and optimize their Performance Max campaigns.

The rollout is expected to begin with an open beta for channel performance reporting in the coming weeks.

For advertisers managing budget and strategy across a mix of formats and inventory, these reporting enhancements mark a meaningful step forward in understanding where results are coming from and how to take informed action.

Advertiser Feedback is Directly Shaping PMax’s Direction

According to Google, Performance Max is now used by over one million advertisers.

In 2024 alone, Google implemented more than 90 improvements to Performance Max, leading to measurable gains in both conversions and conversion value.

But alongside performance, advertisers have consistently asked for better transparency and reporting.

Google’s latest announcements make clear that advertiser feedback has played a central role in shaping these enhancements.

The goal is to deliver clearer insights, support decision-making, and increase control—without sacrificing the benefits of automation.

Channel Performance Reporting Is Coming To Performance Max

Channel-level reporting is the most significant update in this release.

For the first time, advertisers will be able to view results by channel: Search, YouTube, Display, Discover, Gmail, Maps, and Search partners.

The new “Channel performance” page will show:

  • Visual breakdowns of performance by surface
  • Campaign-level metrics for each channel, including clicks, conversions, and spend
  • A downloadable table with key performance data
  • Diagnostics to surface missed opportunities or setup issues

You’ll be able to find Channel performance reporting in the “Insights & reports” tab on the left-hand side of the Google Ads interface. See the example below of how the report will function.

For example, if Maps isn’t generating traffic, diagnostics might suggest adding a location asset. Or if YouTube is outperforming, advertisers can shift their focus to high-impact video creatives.

The ability to view spend and conversion value by channel adds clarity that Performance Max has previously lacked.

Search Terms Reporting Reaches (Almost) Full Visibility

Another major enhancement is the addition of full search terms reporting.

Advertisers will now be able to see the actual queries driving performance – similar to what’s available in standard Search and Shopping campaigns.

With this rollout, marketers can:

  • Identify top-performing search terms
  • Create tailored assets around those queries
  • Apply negative keywords or brand exclusions when needed

For agencies managing multiple clients or accounts at scale, this change improves daily workflow efficiency.

Rather than relying solely on limited theme-level insights or making assumptions about what’s driving performance, teams can now analyze exact queries.

This supports better keyword refinement, more accurate exclusions, and tighter alignment between campaign objectives and user behavior, all within the familiar framework of Search best practices.
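For teams that want to operationalize this, the exported report can also be mined programmatically. The sketch below is a hypothetical Python example using pandas; the file name and column names (search_term, clicks, cost, conversions) are assumptions, so adjust them to match the actual export from the Google Ads UI.

```python
# Sketch: flag zero-conversion, high-cost queries as candidates for review
# as negative keywords or brand exclusions. File and column names are assumed.
import pandas as pd

df = pd.read_csv("pmax_search_terms.csv")  # hypothetical export of the search terms report

summary = (
    df.groupby("search_term", as_index=False)[["clicks", "cost", "conversions"]]
    .sum()
)

# Terms that spend meaningful budget without converting are worth a manual review.
candidates = summary[(summary["cost"] > 10) & (summary["conversions"] == 0)]
print(candidates.sort_values("cost", ascending=False).head(20))
```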

Privacy thresholds will still apply, but the reporting experience will be much more detailed than before.

At launch, this feature will be available in the Google Ads UI only, with API support expected later.

For marketers focused on search intent, this change makes Performance Max a more actionable channel.

More Granular Asset Metrics Across Campaign Types

Asset reporting is also expanding. In addition to conversion data, advertisers will now see:

  • Impressions
  • Clicks
  • Cost
  • Conversion Value

Example of expanded asset-level reporting in Performance Max. Image Credit: Google, April 2025

These new metrics will apply across Performance Max, Search, and Display. This allows advertisers to evaluate creative performance at a deeper level.

Want to know if your video is driving more conversions than your static image? Now you can. Want to see if your headline gets more clicks than your call-to-action? The data is there.

These insights support better creative testing and stronger Ad Strength scores, all based on performance—not assumptions.

Built-In Diagnostics Help Spot Gaps and Missed Opportunities

Google is also adding diagnostics that flag potential performance issues. These insights will live within the Channel performance page and highlight areas for improvement.

For example:

  • If you’re not showing on Maps, diagnostics might suggest adding a location feed or location asset
  • If Search delivery is limited, landing page relevance could be the cause
Image credit: Google, April 2025

This feature won’t give full control over where ads appear, but it does provide better visibility into what’s working and what’s not.

Channel exclusions are still not available in Performance Max, but Google confirmed it’s exploring future control options. For now, diagnostics serve as a step toward more informed decision-making.

Why These Updates Matter For Advertisers

This round of updates helps address a long-standing challenge with Performance Max: the lack of visibility.

Advertisers have embraced the campaign type for its scale and automation, but often struggled to understand the “how” behind performance.

With these new features, advertisers will gain:

  • Channel-level transparency
  • Deeper search intent insights
  • Clearer creative performance metrics
  • Actionable recommendations to fix delivery issues

These aren’t just incremental changes. They reshape how marketers can evaluate and optimize PMax.

The updates make it easier to align creative strategy, understand channel contribution, and refine search targeting.

It’s also clear that Google is listening. The inclusion of diagnostics, downloadable tables, and more detailed reporting shows a strong response to real-world feedback.

These updates also signal a broader industry shift toward hybrid automation models, where AI handles scale but humans still guide strategy with the help of robust data.

As marketers continue to seek clarity on campaign performance, updates like these help reinforce trust in automated systems by making them easier to measure and manage.

More details are expected at Google Marketing Live. But this release signals a new phase for Performance Max: one that balances automation with greater accountability and insight.

SEO Rockstar Names 7 SEO Fundamentals To Win In AI Search via @sejournal, @martinibuster

Todd Friesen, one of the most experienced digital marketers in our industry, recently posted on LinkedIn that the core fundamentals that apply to traditional search engines work exactly the same for AI search optimization. His post quickly received dozens of comments and more than a hundred likes, indicating that he’s not the only one who believes there’s no need to give SEO another name.

Who Is Todd Friesen?

Todd has had a long career in SEO, formerly of Salesforce and other top agencies and businesses. Like me, he was a moderator at the old WebmasterWorld Forums, only he’s been doing SEO for even longer. Although he’s younger than I am, I totally consider him my elder in the SEO business. Todd Friesen, along with Greg Boser, was an SEO podcasting pioneer with their SEO Rockstars show.

AEO – Answer Engine Optimization

There’s been a race to give a name to optimizing web content for AI search engines, with few details on why it merits a new name.

We find ourselves today with five names for the exact same thing:

  1. AEO (Answer Engine Optimization)
  2. AIO (AI Optimization)
  3. CEO (Chat Engine Optimization)
  4. GEO (Generative Engine Optimization)
  5. LMO (Language Model Optimization)

There are many people today who agree that optimizing for an AI search engine is fundamentally the same as optimizing for a traditional search engine. There’s little case for a new name when even an AI search engine like Perplexity uses a version of Google’s PageRank algorithm for ranking authoritative websites.

Todd Friesen’s post on LinkedIn made the case that optimizing for AI search engines is essentially the same thing as SEO:

“It is basically fundamental SEO and fundamental brand building. Can we stop over complicating it?

– proper code (html, schema and all that)
– fast and responsive site
– good content
– keyword research (yes, we still do this)
– coordination with brand marketing
– build some links
– analytics and reporting (focus on converting traffic)
– rinse and repeat”

SEO For AI = The Same SEO Fundamentals

Todd Friesen is right. While there’s room for quibbling about the details, the overall framework for SEO, regardless of whether it’s for an AI search engine, can be reduced to these seven fundamentals of optimization.

Digital Marketer Rosy Callejas (LinkedIn Profile) agreed that there were too many names for the same thing:

“Too many names! SEO vs AEO vs GEO”

Kevin Doory (LinkedIn Profile), Director of SEO at Razorfish, commented:

“The ones that talk about what they do, can change the names to whatever they want. The rest of us will just do the darn things.”

SEO Consultant Don Rhoades (LinkedIn Profile) agreed:

“Still SEO after all these (failed) attempts to distance from it by “thought leaders” – eg: inbound marketing, growth hacking, and whatever other nomenclature du jour they decide to cook up next.”

Ryan Jones (LinkedIn Profile), Senior Vice President, SEO at Razorfish (and founder of SERPrecon.com), commented on the ridiculousness of the GEO name:

“GEO is a terrible name”

Pushback On AEO Elsewhere

A discussion on Bluesky saw Google’s John Mueller commenting on the motivations for creating hype.

Preeti Gupta posted her opinion on Bluesky:

“It is absolutely wild to me that in this debate of GEO/AEO and SEO, everyone is saying that building a brand is not a requisite for SEO, but it is important for GEO/AEO.

Like bro, chill. This AI stuff didn’t invent the need for building a brand. It existed way before it. smh.”

Google’s John Mueller responded:

“You don’t build an audience online by being reasonable, and you don’t sell new things / services by saying the current status is sufficient.”

What Do You Think?

What’s your opinion? Is SEO for AI fundamentally the same as for regular search engines?

Google & Apple Maps: 20% of Local Searches Now Start Here via @sejournal, @MattGSouthern

New research shows that map platforms have become key search engines for local businesses.

One in five consumers now searches directly in map apps instead of traditional search engines.

BrightLocal’s Consumer Search Behavior study found that Google, Apple, and Bing Maps make up 20% of all local searches.

This is a big part of search traffic that many marketers might be missing in their local SEO plans.

The Rise of Map-First Search Behavior

The research found that 15% of consumers use Google Maps as their first choice for local searches. This makes it the second most popular platform after Google Search (45%).

The study reads:

“Another significant finding is the prominence of Google Maps in local search. 15% of consumers said they would use Google Maps as their first port of call, meaning they are searching local terms—which could be brand or non-brand terms—directly in Google Maps.”

It continues:

“Google Maps, Apple Maps, and Bing Maps combined make up 20% of default local search platforms. This reinforces the importance of ensuring you’re optimizing for both map packs and organic search listings. You might have a strong presence in the SERPs, but if consumers are looking for businesses like yours on a map search, you need to ensure you’re going to be found there, too.”

This change shows that consumers favor visual, location-based searches for local businesses, especially when making spontaneous decisions.

Generational Differences in Map Usage

Different age groups use map platforms at different rates:

  • 18% of Gen Z consumers use Google Maps as their primary local search tool, three percentage points higher than the average.
  • 21% of Millennials use Google Maps as their default local search platform.
  • 5% of Millennials prefer Apple Maps as their primary local search option.

Younger consumers appear to be more comfortable using maps to discover local businesses. This might be because they’re used to doing everything on mobile devices.

What Consumers Look for in Map Results

The study found key information that drives consumer decisions when using maps:

  • 85% of consumers say contact information and opening hours are “important” or “very important”
  • 46% rate business contact information as “very important”
  • Nearly half (49%) of consumers “often” or “always” plan their route to a business after searching

Map-based searches have high potential to convert browsers into customers, the report notes:

“Almost half of consumers (49%) said that they ‘often’ or ‘always’ go on to plan their travel route to the chosen business. This suggests two things: one, how quickly consumers seem to be making their decisions, and two, that consumers are conducting local business research with the aim of visiting in the very near future.”

SEO Implications for Local Businesses

For SEO pros and local marketers, these findings highlight several actions to take:

  • Prioritize optimizing map listings beyond your Google Business Profile.
  • Ensure accuracy across all map platforms, not just Google.
  • Focus on complete business information, especially contact details and hours.
  • Monitor the “justifications” in map results, which can be sourced from your business information, reviews, and website.
  • Treat maps as a primary search channel rather than an afterthought.

BrightLocal highlights:

“So, don’t lose out to potential customers by not having a correct address, phone number, or email address listed on your platforms—and be sure to check your opening hours are up to date.”

Looking Ahead

Map platforms are evolving from simple navigation tools into search engines that drive sales and revenue.

If you treat map listings as an afterthought, you risk missing many motivated, ready-to-buy consumers.

As search continues to fragment across platforms, investing specific resources in optimizing your map presence, beyond standard local SEO, is increasingly essential for businesses that rely on local traffic.


Featured Image: miss.cabul/Shutterstock

GoDaddy Is Offering Leads To Freelancers And Agencies via @sejournal, @martinibuster

GoDaddy launched a new partner program called GoDaddy Agency that matches web developers with leads for small to mid-sized businesses (SMBs). It provides digital agencies with tools, services, and support to help them grow what they offer their customers.

The new program is available to U.S.-based freelancers and web development agencies. GoDaddy offers the following benefits:

  • Client leads
    Partners are paired with SMBs based on expertise and business goals. GoDaddy delivers high-intent business referrals from GoDaddy’s own Web Design Services inquiries.
  • Commission revenue opportunities
    Partners can earn up to 20% commission on each new client purchase.
  • Access to premium WordPress tools
  • Co-branded marketing
    Top-performing partners benefit from more exposure through joint marketing campaigns.
  • Dedicated Support
    Every agency is assigned an Agency Success Manager who can help them navigate ways to benefit more from the program.

Joseph Palumbo, Go-to-Market and Agency Programs Director at GoDaddy, explained:

“The GoDaddy Agency Program is all about helping agencies grow. We give partners the tools, support, and referrals they need to take on more clients and bigger projects—without adding more stress to their day. It’s like having a team behind your team.”

For WordPress Developers And More

I asked GoDaddy if this program is exclusively for WordPress developers. They answered:

“GoDaddy has a wide variety of products to help make any business successful. So, this isn’t just about WordPress. We have plenty of website solutions, like Managed WordPress, Websites + Marketing or VPS for application development. Additionally, we have other services like email through Office 365, SSL certificates and more.”

Advantage Of Migrating Customers To GoDaddy

I asked GoDaddy what advantages a developer at another host can gain by bringing all of their clients over to GoDaddy.

They answered:

“First, our extensive product portfolio and diverse hosting selection allows agencies to house all and any projects at GoDaddy, allowing them to simplify their operations and giving them the opportunity to manage their business from a single dashboard and leverage a deep connection with a digital partner that understands their challenges and opportunities.

On top of that, there’s the growth potential. Every day, we get calls from customers who want websites that are too complex for us to design and build. So, we have created a system that instead of directing those customers elsewhere, we can connect with Web agencies that are better suited to handle their requests.

If a digital agency becomes a serious partner and the work they do meets our standards, and they have great customer service , etc. we can help make connections that are mutually beneficial to our customers and our partners.”

Regarding my question about WordPress tools offered to agency partners, a spokesperson answered:

“We have a wide variety of AI tools to help them get their jobs done faster. From website design via AI to product descriptions and social posts. Beyond our AI tools, agency partners that use WordPress can work directly with our WordPress Premium Support team. This is a team of WordPress experts and developers who can assist with anything WordPress-related whether hosted at GoDaddy or somewhere else.”

Takeaways

When was the last time your hosting provider gave you a business lead? The Agency partner program is an innovative ecosystem that supports agencies and freelancers who partner with GoDaddy.

It makes sense for a web host to share business leads from customers who are actively in the market for web development work with partner agencies and freelancers who can use them. It’s a win-win for the web host and the agency partners, and an opportunity that’s worth looking into.

GoDaddy’s new Agency Program connects U.S.-based web developers, freelancers and agencies with high-intent leads from small-to-mid-sized businesses while offering commissions, tools, and support to help agencies grow their client base and streamline operations. The program is a unique ecosystem that enables developers to consolidate hosting, leverage WordPress and AI tools, and benefit from co-marketing and personalized support.

  • Client Acquisition via Referrals:
    GoDaddy matches agency partners with high-intent SMB leads generated from its own service inquiries.
  • Revenue Opportunities:
    Agencies can earn up to 20% commission on client purchases made through the program.
  • Consolidated Hosting and Tools:
    Agencies can manage multiple client types using GoDaddy’s product ecosystem, including WordPress, VPS, and Websites + Marketing.
  • Premium WordPress and AI Support:
    Partners gain access to a dedicated WordPress Premium Support team and AI-powered productivity tools (e.g., design, content generation).
  • Co-Branded Marketing Exposure:
    High-performing partners receive increased visibility through joint campaigns with GoDaddy.
  • Dedicated Success Management:
    Each partner is assigned an Agency Success Manager for personalized guidance and program optimization.
  • Incentive for Migration from Other Hosts:
    GoDaddy provides a centralized platform offering simplicity, scale, and client acquisition opportunities for agencies switching from other providers.

Read more about the GoDaddy Agency program:

GoDaddy Agency: A New Way to Help Digital Consultants Grow

Apply to join the Agency Program here.

Google On Diluting SEO Impact Through Anchor Text Overuse via @sejournal, @martinibuster

Google’s John Mueller answered a question about internal site navigation where an SEO was concerned about diluting the ability to rank for a keyword phrase by using the same anchor text in four sitewide sections across an entire website.

Link In Four Navigational Areas

The person asking the question had a client with four navigational areas that appear across the entire site. One link is repeated in each of the four navigational areas, using the exact same anchor text. The concern is that using that phrase multiple times across the entire site might cause it to appear overused.

Roots of Why SEOs Worry About Anchor Text Overuse

There’s a longtime concern in the SEO industry about overusing anchor text. The root of this concern is that overusing internal anchor text could be seen as signaling an intent to manipulate the search engines. The concern arose in 2005 because of Google’s announced use of statistical analysis, which can identify unnatural linking patterns.

Over the years that concern has evolved into worrying about “diluting” the impact of anchor text, which has no foundation in anything Google has said, although Google is on record as saying that it dampens sitewide links.

Google has in the past made it known that it divides a page into its constituent parts such as the header section (where the logo is), the sitewide navigation, sidebars, main content, in-content navigation, advertising and footers.

We know that Google has been doing this since at least 2004 (a Googler confirmed it to me at a search event), and most definitely around 2006, when Google began dampening the effect of external and internal sitewide links so that they counted as only one link, not with the full power of 2,000 or however many links there were.

Back in the day, people were selling sitewide links at a premium because they were said to harness the entire PageRank power of the site. So Google announced that those links, including internal sitewide links, would be dampened, and it began recognizing paid links and blocking the PageRank from transferring.

We could see the power of sitewide links through Google’s browser toolbar, which contained a PageRank meter, so when the change happened we were able to confirm the effect in the toolbar and in rankings.

That’s why sitewide links are no longer an SEO thing. It has nothing to do with dilution.

Sitewide Links And Dilution 2025

Today we find an SEO who’s worrying about a sitewide anchor text link being “diluted.”

So, we already know that Google recognizes sidebars, menus, and footers and separates them out from the main content, and we know that Google doesn’t count a sitewide link as a multiple but rather as if it existed on only one page. That means we already know the answer to the question: no, it’s not going to be a big deal, because it’s a navigational sitewide link, which is not meaningful other than to tell Google that it’s an important page for the site.

A sitewide navigational link is important, but it’s not the same as a contextual link from within content. A contextual link is meaningful because it says something about the page being linked to. One is not better than the other; they’re just different kinds of links.

This is the question that the SEO asked:

“Hey @johnmu.com a client has 4 navs. A Main Menu, Footer Links, Sidebar Quicklinks & a Related Pages Mini-Nav in posts. Not for SEO but they have quadrupled the internal link profile to a key page on a single anchor.

Any risk that we’re diluting the ability to rank that keyword with “overuse”?”

Someone else answered the question with a link to a Search Engine Journal article about a site that links from every page to every other page of the site, which is a different situation entirely. That’s a type of site architecture from the old days called a flat site architecture. It was created by SEOs for the purpose of spreading PageRank across all pages of the site to help them all rank.

Google’s John Mueller responded with a comment about that flat site structure and an answer to the query posed by the SEO:

“I think (it’s been years) that was more about a site that links from all pages to all pages, where you lose context of how pages sit within the site. I don’t think that’s your situation. Having 4 identical links on a page to another page seems fine & common to me, I wouldn’t worry about that.”

Lots Of Duplication

The SEO responded that the duplicated content in the sidebars was HTML and not “navigation,” and that they were concerned this introduced a lot of duplication.

He wrote:

“Its 4 duplicated navs on every page of the site, semantically the side bar and related pages are not navs, they’re html, list structured links so lots of duplication IMO”

I think that Mueller’s answer still applies. It doesn’t matter whether they are semantically sidebars and related pages. What’s important is that they are not the main content, which is what Google is focused on.

Google’s Martin Splitt went into detail about this four years ago when he talked about the Centerpiece Annotation.

Martin talks about how they identify related content links and other stuff that’s not the main content:

“And then there’s this other thing here, which seems to be like links to related products but it’s not really part of the centerpiece. It’s not really main content here. This seems to be additional stuff.

And then there’s like a bunch of boilerplate or, “Hey, we figured out that the menu looks pretty much the same on all these pages and lists.”

So the answer for the SEO is that it doesn’t matter whether those links are in a sidebar, menu navigation, or related links. Google identifies them as something other than the main content and, for the purposes of analyzing the web page, sets them aside. Google doesn’t care if that stuff appears all over the site; it’s not main content.
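Google’s internal systems aren’t public, but the general idea of separating boilerplate link areas from main-content links can be illustrated with an ordinary HTML parser. The following Python sketch uses BeautifulSoup on a made-up page; it’s an illustration of the concept only, not a representation of how Google actually does it.

```python
# Illustration: separate links in boilerplate areas (nav, aside, footer) from
# links inside the main content. Requires: pip install beautifulsoup4
from bs4 import BeautifulSoup

html = """
<header><nav><a href="/key-page">Key Page</a></nav></header>
<aside><a href="/key-page">Key Page</a></aside>
<main><p>Our <a href="/key-page">guide to the key topic</a> explains it in depth.</p></main>
<footer><a href="/key-page">Key Page</a></footer>
"""

soup = BeautifulSoup(html, "html.parser")

# Links that appear in sitewide/boilerplate sections.
boilerplate_links = [
    a["href"]
    for section in soup.find_all(["nav", "aside", "footer"])
    for a in section.find_all("a", href=True)
]

# Contextual links found inside the main content.
main_links = [a["href"] for a in soup.find("main").find_all("a", href=True)]

print("Boilerplate links:", boilerplate_links)  # three copies of /key-page
print("Contextual links:", main_links)          # one /key-page with descriptive anchor text
```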

Read the original discussion on Bluesky.

Featured Image by Shutterstock/Photobank.kiev.ua

Google Discover Desktop Data Already Trackable In Search Console via @sejournal, @MattGSouthern

Google Discover desktop data is already trackable in Search Console. Here’s how to prepare ahead of the full rollout.

  • Data suggests Google Discover has been in testing on desktop for over 16 months.
  • Desktop Discover data reveals lower traffic than mobile (only 4% in the US).
  • Publishers can access their desktop Discover performance now in Search Console.

Google Clarifies Googlebot-News Crawler Documentation via @sejournal, @martinibuster

Google updated its Google News crawler documentation to correct an error that implied publisher crawler preferences addressed to Googlebot-News influence the News tab in Google Search.

Google News Tab

The Google News tab is a category of search that is displayed near the top of the search results pages (SERPs). The News tab filter displays search results from news publishers. Content that’s shown in the News tab generally comes from sites that are eligible to be displayed in Google News and must meet Google’s news content policies.

What Changed

Google’s changelog noted that the user agent description incorrectly stated that publisher preferences influence what’s shown in the Google News tab.

They explained:

“The description for how crawling preferences addressed to Googlebot-News mistakenly stated that they’d affect the News tab on Google, which is not the case.”

The mention of the News tab in Google Search was removed from this sentence:

“Crawling preferences addressed to the Googlebot-News user agent affect all surfaces of Google News (for example, the News tab in Google Search and the Google News app).”

The corrected version now reads:

“Crawling preferences addressed to the Googlebot-News user agent affect the Google News product, including news.google.com and the Google News app.”
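For reference, “crawling preferences addressed to the Googlebot-News user agent” are robots.txt rules that name that crawler. The Python sketch below uses the standard library’s robotparser to show how rules for Googlebot-News can differ from rules for the regular Googlebot; the robots.txt contents and URL are hypothetical examples, not taken from Google’s documentation.

```python
# Sketch: robots.txt rules addressed to Googlebot-News vs. the general Googlebot.
# The rules and URL below are hypothetical.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: Googlebot-News
Disallow: /premium/

User-agent: Googlebot
Disallow:
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

url = "https://example.com/premium/story.html"
print(parser.can_fetch("Googlebot-News", url))  # False: excluded from Google News surfaces
print(parser.can_fetch("Googlebot", url))       # True: still crawlable for regular Search
```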

Read the updated Googlebot-News user agent documentation here:

Googlebot News

Featured Image by Shutterstock/Asier Romero

Google’s John Mueller: Updating XML Sitemap Dates Doesn’t Help SEO via @sejournal, @MattGSouthern

Google’s John Mueller clarifies that automatically changing XML sitemap dates doesn’t boost SEO and could make it harder for Google to find actual content updates.

The “Freshness Signal” Myth Busted

On Reddit’s r/SEO forum, someone asked if competitors ranked better by setting their XML sitemap dates to the current date to send a “freshness signal” to Google.

Mueller’s answer was clear:

“It’s usually a sign they have a broken sitemap generator setup. It has no positive effect. It’s just a lazy setup.”

The discussion shows a common frustration among SEO pros. The original poster was upset after following Google’s rules for 15 years, only to see competitors using “spam tactics” outrank established websites.

When asked about sites using questionable tactics yet still ranking well, Mueller explained that while some “sneaky things” might work briefly, updating sitemap dates isn’t one of them.

Mueller said:

“Setting today’s date in a sitemap file isn’t going to help anyone. It’s just lazy. It makes it harder for search engines to spot truly updated pages. This definitely isn’t working in their favor.”

XML Sitemaps: What Works

XML sitemaps help search engines understand your website structure and when content was last updated. While good sitemaps are essential for SEO, many people misunderstand the impact they have on rankings.

According to Google, the lastmod tag in XML sitemaps should show when a page was truly last updated. When used correctly, this helps search engines know which pages have new content that needs to be recrawled.

Mueller confirms that faking these dates doesn’t help your rankings and may prevent Google from finding your real content updates.
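As a concrete illustration, here is a minimal Python sketch that sets each lastmod value from the page file’s actual modification time instead of today’s date. The URLs and file paths are hypothetical, and this is just one simple way to generate a sitemap, not a recommendation of a specific tool.

```python
# Sketch: build a sitemap whose <lastmod> reflects real modification times.
# URLs and file paths are hypothetical examples.
import os
from datetime import datetime, timezone

PAGES = {
    "https://example.com/guide": "site/guide.html",
    "https://example.com/contact": "site/contact.html",
}


def lastmod(path: str) -> str:
    """Return the file's actual modification time as a W3C datetime string."""
    mtime = datetime.fromtimestamp(os.path.getmtime(path), tz=timezone.utc)
    return mtime.strftime("%Y-%m-%dT%H:%M:%S+00:00")


def build_sitemap(pages: dict) -> str:
    entries = "".join(
        f"  <url>\n    <loc>{url}</loc>\n    <lastmod>{lastmod(path)}</lastmod>\n  </url>\n"
        for url, path in pages.items()
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}</urlset>\n"
    )


if __name__ == "__main__":
    print(build_sitemap(PAGES))
```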

What This Means for Your SEO

Mueller’s comments remind us that while some SEO tactics might seem to improve rankings, correlation isn’t causation.

Sites ranking well despite questionable methods are likely succeeding due to other factors, rather than manipulated sitemap dates.

For website owners and SEO professionals, the advice is:

  • Keep your XML sitemaps accurate
  • Only update lastmod dates when you change content
  • Focus on creating valuable content instead of technical shortcuts
  • Be patient with ethical SEO strategies – they provide lasting results

It can be frustrating to see competitors seemingly benefit from questionable tactics. However, Mueller suggests these advantages don’t last long and can backfire.

This exchange confirms that Google’s smart algorithms can recognize and eventually ignore artificial attempts to manipulate ranking signals.


Featured Image:  Keronn art/Shutterstock