Archive

News

Yoast SEO Functionality Is Now Available Within Google Docs via @sejournal, @martinibuster

Yoast SEO announced a new feature that brings SEO and readability analysis into Google Docs, allowing publishers and teams to apply search marketing best practices as content is created rather than as an editing step after the fact.
Two Functionalities Carry Over To Google Docs
Yoast SEO is providing SEO optimization and readability feedback within the Google Docs editing environment.
The SEO feedback uses the familiar traffic-light system, offering visual confirmation that content is search optimized according to Yoast SEO’s metrics for keywords, structure, and optimization.
The readability analysis offers feedback on paragraph structure, sentence length, and headings to help writers create engaging content, which matters more than ever as search engines increasingly prioritize high-quality content.
According to Yoast SEO:
“The Google Docs add-on tool is available to all Yoast SEO Premium subscribers, offering them a range of advanced optimization tools. For those not yet subscribed to Yoast Premium, the add-on is also available as a single purchase, making it accessible to a broader audience.
For those managing multiple team members, additional Google accounts can be linked for just $5 a month per account or annually for a 10% discount ($54). This flexibility ensures that anyone who writes content and in-house marketing teams managing multiple projects can benefit from high-quality SEO guidance.”
This new offering is an interesting step for Yoast SEO. Best known as the developer of the Yoast SEO WordPress plugin, the company has since expanded to Shopify and is now breaking out of the CMS paradigm entirely to cover the optimization work that happens before content ever reaches a CMS.
Read more at Yoast SEO:
Optimize your content directly in Google Docs with Yoast SEO

Read More »
digital marketing

Stop Paying the Google Ads Tax Without Realizing It [Webinar] via @sejournal, @hethr_campbell

Most brands don’t know they’re wasting money on branded ads. Are you one of them?
What if your Google Ads strategy is quietly draining your budget? Many advertisers are paying high CPCs even when there’s no real competition. It’s often because they’re unknowingly bidding against themselves.
Join BrandPilot AI on July 17, 2025, for a live session with Jenn Paterson and John Beresford as they explain The Uncontested Paid Search Problem and how to stop it before it eats into your performance.
In this data-backed session, you’ll learn:

Why CPCs rise even without competitor bidding
How to detect branded ad waste in your own account
What this hidden flaw is costing your brand
Tactical strategies to reclaim lost budget and improve your results

Why this matters:
Brands are overspending on Google Ads without knowing the real reason. If you’re running branded search campaigns, this session will show you how to identify and fix what’s costing you the most.
Register today to protect your spend and improve performance. If you can’t attend live, sign up anyway and we’ll send you the full recording after the event.

Read More »
News

Internet Marketing Ninjas Acquired By Previsible via @sejournal, @martinibuster

Internet Marketing Ninjas has been acquired by SEO consultancy Previsible, an industry leader co-founded by a former head of SEO at eBay. The acquisition brings link building and digital PR expertise to Previsible. While both companies are now under shared ownership, they will continue to operate as separate brands.
Internet Marketing Ninjas
Founded in 1999 by Jim Boykin as We Build Pages, the Internet Marketing Ninjas consultancy’s story is one of steady innovation and pivoting in response to changes at Google. In my opinion, Jim’s talent was his ability to scale the latest tactics so they could be offered to a large number of clients, and to nimbly ramp up new strategies as Google changed. The names of the people he employed are a who’s who of legendary marketers.
In the early days of SEO, when reciprocal linking was all the rage, Jim Boykin became known as a bulk provider of that service; when directories became a hot tactic, he scaled that one too, making it easy for business owners to pick up links fast. Over time, providing links became increasingly harder, yet Boykin kept innovating with strategies that made it easy for customers to attain links. I’ve long been an admirer of Boykin because he is the rare individual who can be both a brilliant SEO strategist and a savvy business person.
Jordan Koene, CEO and co-founder at Previsible, commented:
“Previsible believes that the future of discovery and search lies at the intersection of trust and visibility. Our acquisition of Internet Marketing Ninjas brings one of the most experienced trusted-link and digital PR teams into our ecosystem. As search continues to evolve beyond keywords into authority, reputation, and real-world relevance, link strategies are essential for brands to stand out.”
Previsible and Internet Marketing Ninjas will continue to operate as separate brands, leveraging Boykin’s existing team for their expertise.
Jim Boykin explained:
“Combining forces with Previsible kicks off an incredibly exciting new chapter for Internet Marketing Ninjas. We’re not just an SEO company anymore, we’re at the forefront of the future of digital visibility. Together with Previsible, we’re leading the charge in both search and AI-driven discovery.
By merging decades of deep SEO expertise with bold, forward-thinking innovation, we’re meeting the future of online marketing head-on. From Google’s AI Overviews to ChatGPT and whatever comes next, our newly united team is perfectly positioned to help brands get found, build trust, and be talked about across the entire digital landscape. I’m absolutely stoked about what we’re building together and how we’re going to shape the next era of internet marketing.”
Previsible’s acquisition of Internet Marketing Ninjas merges long-standing experience in link building while retaining the distinct brands and teams that make each consultancy a search marketing leader. The partnership will enable clients to increase visibility by bringing the expertise of both companies together.

Read More »
News

YouTube Clarifies Monetization Update: Targeting Spam, Not Reaction Channels via @sejournal, @MattGSouthern

YouTube has responded to concerns surrounding its upcoming monetization policy update, clarifying that the July 15 changes are aimed at improving the detection of inauthentic content.
The update isn’t a crackdown on popular formats like reaction videos or clip compilations.
The clarification comes from Renee Richie, a creator liaison at YouTube, after a wave of confusion and concern followed the initial announcement.
Richie said in a video update:
“If you’re seeing posts about a July 2025 update to the YouTube Partner Program monetization policies and you’re concerned it’ll affect your reaction or clips or other type of channel. This is a minor update to YouTube’s long-standing YPP policies to help better identify when content is mass-produced or repetitive.”
Clarifying What’s Changing
Richie explained that the types of content targeted by the update (mass-produced and repetitious material) were already ineligible for monetization under the YouTube Partner Program (YPP).
The update doesn’t change the rules but is intended to enhance how YouTube enforces them.
That distinction is important: while the policy itself isn’t new, enforcement may reach creators who were previously flying under the radar.
Why Creators Were Concerned
YouTube’s original announcement said the platform would “better identify mass-produced and repetitious content,” but didn’t clearly define those terms or how the update would be applied.
This vagueness led to speculation that reaction videos, clip compilations, or commentary content might be targeted, especially if those formats reuse footage or follow repetitive structures.
Richie’s clarification helps narrow the scope of the update, but it doesn’t explicitly exempt all reaction or clips channels. Channels relying on recycled content without significant added value may run into issues.
Understanding The Policy Context
YouTube’s Partner Program has always required creators to produce “original” and “authentic” content to qualify for monetization.
The July 15 update reiterates that standard, while providing more clarity around what the platform considers inauthentic today.
According to the July 2 announcement:
“On July 15, 2025, YouTube is updating our guidelines to better identify mass-produced and repetitious content. This update better reflects what ‘inauthentic’ content looks like today.”
YouTube emphasized two patterns in particular:

Mass-produced content
Repetitious content

While some reaction or commentary videos could fall under these categories, Richie’s statement suggests that the update is not meant to penalize formats that include meaningful creative input.
What This Means
Transformative content, such as reactions, commentary, and curated clips with original insights or editing, is still eligible for monetization.
But creators using these formats should ensure they’re offering something new or valuable in each upload.
The update appears aimed at:

Auto-generated or templated videos with minimal variation
Reposted or duplicated content with little editing or context
Channels that publish near-identical videos in large quantities

For creators who invest in original scripting, commentary, editing, or creative structure, this update likely won’t require changes. But those leaning on low-effort or highly repetitive content strategies may be at increased risk of losing monetization.
Looking Ahead
The updated policy will take effect on July 15. Channels that continue to publish content flagged as mass-produced or repetitive after this date may face removal from the Partner Program.
While Richie’s clarification aims to calm fears, it doesn’t override the enforcement language in the original announcement. Creators still have time to review their libraries and adjust strategies to ensure compliance.

Featured Image: Roman Samborskyi/Shutterstock

Read More »
SEO

Stop Retrofitting. Start Commissioning: The New Role Of SEO In The Age Of AI via @sejournal, @billhunt

For most of its history, SEO has been a reactive discipline, being asked to “make it rank” once a site is built, with little input into the process.
Even crazier, most SEO professionals are assigned a set of key performance indicators (KPIs) for which they are accountable, metrics tied to visibility, engagement, and revenue.
Still, they have no real control over the underlying systems that affect them. These metrics often rely on the performance of disconnected teams, including content, engineering, brand, and product, which don’t always share the same objectives.
When my previous agency, Global Strategies, was acquired by Ogilvy, I recommended that our team be viewed as building inspectors, not just an SEO package upsell added at the end, but involved at key phases when architects, engineers, and tradespeople had laid out the structural components.
Ideally, we’d come in after the site framing (wireframes) was complete, reviewing the plumbing (information architecture), electrical (navigation and links), and foundation (technical performance), but before the drywall and paint obscured what lies beneath.
We’d validate that the right materials were used and that construction followed a standard fit for long-term performance.
However, in reality, we were rarely invited into the planning stages because that was creative, and we were just SEO. We were usually brought in only after launch, tasked with fixing what had already been buried behind a visually appealing design.
Despite fighting for it, I was never a complete fan of this model; it made sense in the early days of search, when websites were simple, and ranking factors were more forgiving.
SEO practitioners identified crawl issues, adjusted metadata, optimized titles, fixed broken links, and retrofitted pages with keywords and internal links.
That said, I have long advocated for eliminating the need for most SEO actions by integrating the fixes into the roles and workflows that initially broke them.
Through education, process change, and content management system (CMS) innovation, much of what SEO fixes could, and should, become standard practice.
However, this has been a challenging sell, as SEO has often been viewed as less important than design, development, or content creation.
It was easier to assign SEO the role of cleanup crew rather than bake best practices into upstream systems and roles. We worked around CMS limitations, cleaned up after redesigns, and tried to reverse-engineer what Google wanted from the outside in.
But that role of identifying and fixing defects is no longer enough. And in the AI-driven search environment, it’s becoming obsolete.
Search Has Changed. Our Role Must Too.
Search engines today do far more than index and rank webpages. They extract answers, synthesize responses, and generate real-time content previews.
What used to be a linear search journey (query > list of links > website) has become a multi-layered ecosystem of zero-click answers, AI summaries, featured snippets, and voice responses.
Traditional SEO fundamentals (indexability, content relevance, and backlinks) still matter in this environment, but only as part of a larger system.
The new currency of visibility is semantic clarity, machine-readability, and multi-system integration. SEO is no longer about optimizing a page. It’s about orchestrating a system.
This complexity requires us to transition from being just an inspector to becoming the Commissioning Authority (CxA) to meet the demands of this shift.
What Is A Commissioning Authority?
In modern architecture and construction, a Commissioning Authority is a specialized professional who ensures that all building systems, including HVAC, electrical, plumbing, safety, and lighting, function as intended in combination.
They are brought in not just to inspect but also to validate, test, and orchestrate performance.
They work on behalf of the building owner, aligning the construction output with the original design intent and operational goals. They look at interoperability, performance efficiency, long-term sustainability, and documentation.
They are not passive checkers. They are active enablers of success.
Why SEO Needs Commissioning Authorities
The modern website is no longer a standalone asset. It is a network of interconnected systems:

Content strategy.
CMS structure.
Design and front-end frameworks.
Analytics and tagging layers.
Schema and structured data.
Internationalization and localization.
Page speed and Core Web Vitals.
AI answer optimization.

Today’s SEO professional, whatever the latest alphabet-soup acronym du jour may be, and especially tomorrow’s, must be a Commissioning Authority for these systems. That means:

Being involved at the blueprint stage, not just post-launch.
Advocating for search visibility as a performance outcome.
Ensuring that semantic signals, not just visual elements, are embedded in every page.
Testing and validating that the site performs in AI environments, not just traditional search engine results pages (SERPs).

The Rise Of The Relevance Engineer
A key function within this evolved CxA role is that of the Relevance Engineer, a concept and term introduced by Mike King of iPullRank.
Mike has been one of the most vocal and insightful leaders on the transformation of SEO in the AI era, and his view is clear: The discipline must fundamentally evolve, both in practice and in how it is positioned within organizations.
Mike King’s perspective underscores that treating AI-driven search as simply an extension of traditional SEO is dangerously misguided.
Instead, we must embrace a new function, Relevance Engineering, which focuses on optimizing for semantic alignment, passage-level competitiveness, and probabilistic rankings, rather than deterministic keyword-based tactics.
The Relevance Engineer ensures:

Each content element is structured and chunked for generative AI consumption.
Content addresses layered user intent, from informational to transactional.
Schema markup and internal linking reinforce topical authority and entity associations.
The site’s architecture supports passage-level understanding and AI summarization.

In many ways, the Relevance Engineer is the semantic strategist of the SEO team, working hand-in-hand with designers, developers, and content creators to ensure that relevance is not assumed but engineered.
In construction terms, this might resemble a systems integration specialist. This expert ensures that electrical, plumbing, HVAC, and automation systems function individually and operate cohesively within an innovative building environment.
Relevance Engineering is more than a title; it’s a mindset shift. It emphasizes that SEO must now live at the intersection of information science, user experience, and machine interpretability.
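To make the schema and entity point above concrete, here is a minimal sketch, not the author’s prescribed method, of how a page’s Article JSON-LD might declare “about” and “mentions” entity references. It is written in Python purely for illustration, and every name, URL, and entity in it is a hypothetical placeholder.

```python
import json

# Hypothetical article metadata; every name and URL below is a placeholder.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How Relevance Engineering Reshapes SEO",
    "author": {"@type": "Person", "name": "Jane Example"},
    "publisher": {"@type": "Organization", "name": "Example Media"},
    # "about" and "mentions" tie the page to named entities, which is one way
    # schema markup can reinforce topical authority and entity associations.
    "about": [{
        "@type": "Thing",
        "name": "Search engine optimization",
        "sameAs": "https://en.wikipedia.org/wiki/Search_engine_optimization",
    }],
    "mentions": [{
        "@type": "Thing",
        "name": "Large language model",
        "sameAs": "https://en.wikipedia.org/wiki/Large_language_model",
    }],
}

# Print the JSON-LD that would be embedded in a <script type="application/ld+json"> tag.
print(json.dumps(article_jsonld, indent=2))
```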
From Inspector To CxA: How The Role Shifts

SEO Pillar | Old Role: Building Inspector | New Role: Commissioning Authority
Indexability | Check crawl blocks after build | Design architecture for accessibility and rendering
Relevance | Patch in keywords post-launch | Map content to entity models and query intent upfront, guided by a Relevance Engineer
Authority | Chase links to weak content | Build a structured reputation and concept ownership
Clickability | Tweak titles and meta descriptions | Structure content for AI previews, snippets, and voice answers
User Experience | Flag issues in testing | Embed UX, speed, and clarity into the initial design

Looking Ahead: The Next Generation Of SEO
As AI continues to reshape search behavior, SEO pros must adapt again. We will need to:

Understand how content is deconstructed and repackaged by large language models (LLMs).
Ensure that our information is structured, chunked, and semantically aligned to be eligible for synthesis.
Advocate for knowledge modeling, not just keyword optimization.
Encourage cross-functional integration between content, engineering, design, and analytics.

The next generation of SEO leaders will not be optimization specialists.
They will be systems thinkers, semantic strategists, digital performance architects, storytellers, performance coaches, and, importantly, master negotiators who can advocate for and steer the organizational, infrastructural, and content changes needed to thrive.
They will also be force multipliers – individuals or teams who amplify the effectiveness of everyone else in the process.
By embedding structured, AI-ready practices into the workflow, they enable content teams, developers, and marketers to do their jobs better and more efficiently.
The Relevance Engineer and Commissioning Authority roles are not just tactical additions but strategic leverage points that unlock exponential impact across the digital organization.
Final Thought
Too much article space has been wasted arguing over what to call this new era – whether SEO is dead, what the acronym should be, or what might or might not be part of the future.
Meanwhile, far too little attention has been devoted to the structural and intellectual shifts organizations must make to remain competitive in a search environment reshaped by AI.
If we, as an industry, do not start changing the rules, roles, and mindset now, we’ll again be scrambling when the CEO demands to know why the company missed profitability targets, only to realize we’re buying back traffic we should have earned.
We’ve spent 30 years trying to retrofit what others built into something functional for search engines – pushing massive boulders uphill to shift monoliths into integrated digital machines. That era is over.
The brands that will thrive in the AI search era are those that elevate SEO from a reactive function to a strategic discipline with a seat at the planning table.
The professionals who succeed will be those who speak the language of systems, semantics, and sustained performance – and who take an active role in shaping the digital infrastructure.
The future of SEO is not about tweaking; it’s about taking the reins. It’s about stepping into the role of Commissioning Authority, aligning stakeholders, systems, and semantics.
And at its core, it will be driven by the precision of relevance engineering, and amplified by the force multiplier effect of integrated, strategic influence.

Featured Image: Jack_the_sparow/Shutterstock

Read More »
SEO

Beyond Keywords: Leveraging Technical SEO To Boost Crawl Efficiency And Visibility via @sejournal, @cshel

For all the noise around keywords, content strategy, and AI-generated summaries, technical SEO still determines whether your content gets seen in the first place.
You can have the most brilliant blog post or perfectly phrased product page, but if your site architecture looks like an episode of “Hoarders” or your crawl budget is wasted on junk pages, you’re invisible.
So, let’s talk about technical SEO – not as an audit checklist, but as a growth lever.
If you’re still treating it like a one-time setup or a background task for your dev team, you’re leaving visibility (and revenue) on the table.
This isn’t about obsessing over Lighthouse scores or chasing 100s in Core Web Vitals. It’s about making your site easier for search engines to crawl, parse, and prioritize, especially as AI transforms how discovery works.
Crawl Efficiency Is Your SEO Infrastructure
Before we talk tactics, let’s align on a key truth: Your site’s crawl efficiency determines how much of your content gets indexed, updated, and ranked.
Crawl efficiency means how well search engines can access and process the pages that actually matter.
The longer your site’s been around, the more likely it’s accumulated detritus – outdated pages, redirect chains, orphaned content, bloated JavaScript, pagination issues, parameter duplicates, and entire subfolders that no longer serve a purpose. Every one of these gets in Googlebot’s way.
Improving crawl efficiency doesn’t mean “getting more crawled.” It means helping search engines waste less time on garbage so they can focus on what matters.
Technical SEO Areas That Actually Move The Needle
Let’s skip the obvious stuff and get into what’s actually working in 2025, shall we?
1. Optimize For Discovery, Not “Flatness”
There’s a long-standing myth that search engines prefer flat architecture. Let’s be clear: Search engines prefer accessible architecture, not shallow architecture.
A deep, well-organized structure doesn’t hurt your rankings. It helps everything else work better.
Logical nesting supports crawl efficiency, elegant redirects, and robots.txt rules, and makes life significantly easier when it comes to content maintenance, analytics, and reporting.
Fix it: Focus on internal discoverability.
If a critical page is five clicks away from your homepage, that’s the problem, not whether the URL lives at /products/widgets/ or /docs/api/v2/authentication.
Use curated hubs, cross-linking, and HTML sitemaps to elevate key pages. But resist flattening everything into the root – that’s not helping anyone.
Example: A product page like /products/waterproof-jackets/mens/blue-mountain-parkas provides clear topical context, simplifies redirects, and enables smarter segmentation in analytics.
By contrast, dumping everything into the root turns Google Analytics 4 analysis into a nightmare.
Want to measure how your documentation is performing? That’s easy if it all lives under /documentation/. Nearly impossible if it’s scattered across flat, ungrouped URLs.
Pro tip: For blogs, I prefer categories or topical tags in the URL (e.g., /blog/technical-seo/structured-data-guide) instead of timestamps.
Dated URLs make content look stale – even if it’s fresh – and provide no value in understanding performance by topic or theme.
In short: organized ≠ buried. Smart nesting supports clarity, crawlability, and conversion tracking. Flattening everything for the sake of myth-based SEO advice just creates chaos.
2. Eliminate Crawl Waste
Google has a crawl budget for every site. The bigger and more complex your site, the more likely you’re wasting that budget on low-value URLs.
Common offenders:

Calendar pages (hello, faceted navigation).
Internal search results.
Staging or dev environments accidentally left open.
Infinite scroll that generates URLs but not value.
Endless UTM-tagged duplicates.

Fix it: Audit your crawl logs.
Disallow junk in robots.txt. Use canonical tags correctly. Prune unnecessary indexable pages. And yes, finally remove that 20,000-page tag archive that no one – human or robot – has ever wanted to read.
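As a starting point for that audit, here is a minimal sketch, standard library only, that tallies Googlebot requests by top-level URL section from a combined-format access log. The log path and line pattern are assumptions to adjust for your server, and matching on the user-agent string alone is a rough filter; a real audit would also verify Googlebot via reverse DNS.

```python
import re
from collections import Counter
from urllib.parse import urlparse

# Assumed access log location and a simple combined-log-format pattern;
# adjust both to match your server's actual logging setup.
LOG_PATH = "access.log"
LINE_RE = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[\d.]+".*"(?P<agent>[^"]*)"')

def crawl_budget_by_section(log_path: str, top: int = 20) -> None:
    """Count Googlebot requests per first URL path segment."""
    sections = Counter()
    with open(log_path, encoding="utf-8", errors="ignore") as handle:
        for line in handle:
            match = LINE_RE.search(line)
            if not match or "Googlebot" not in match.group("agent"):
                continue
            path = urlparse(match.group("url")).path.strip("/")
            section = "/" + path.split("/")[0] if path else "/"
            sections[section] += 1
    for section, hits in sections.most_common(top):
        print(f"{hits:>8}  {section}")

if __name__ == "__main__":
    crawl_budget_by_section(LOG_PATH)
```

If one section dominates the output and it is not content you care about, that is your crawl waste.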
3. Fix Your Redirect Chains
Redirects are often slapped together in emergencies and rarely revisited. But every extra hop adds latency, wastes crawl budget, and can fracture link equity.
Fix it: Run a redirect map quarterly.
Collapse chains into single-step redirects. Wherever possible, update internal links to point directly to the final destination URL instead of bouncing through a series of legacy URLs.
Clean redirect logic makes your site faster, clearer, and far easier to maintain, especially when doing platform migrations or content audits.
And yes, elegant redirect rules require structured URLs. Flat sites make this harder, not easier.
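As one way to build that redirect map, here is a minimal sketch using the third-party requests library (assumed to be installed); it follows each URL in a hypothetical list and reports any that resolve through more than one hop.

```python
import requests

# Hypothetical list of internal URLs to check; in practice, feed this from
# your sitemap or an export of internal links.
URLS_TO_CHECK = [
    "https://www.example.com/old-page",
    "https://www.example.com/blog/legacy-post",
]

def report_redirect_chains(urls):
    """Flag URLs that resolve through more than one redirect hop."""
    for url in urls:
        try:
            response = requests.get(url, allow_redirects=True, timeout=10)
        except requests.RequestException as exc:
            print(f"ERROR   {url}  ({exc})")
            continue
        hops = response.history  # each entry is one intermediate redirect response
        if len(hops) > 1:
            chain = " -> ".join(r.url for r in hops) + f" -> {response.url}"
            print(f"{len(hops)} hops: {chain}")
        elif len(hops) == 1:
            print(f"1 hop:  {url} -> {response.url}")

if __name__ == "__main__":
    report_redirect_chains(URLS_TO_CHECK)
```

Anything reported with two or more hops is a candidate for collapsing into a single-step redirect and for updating the internal links that point at it.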
4. Don’t Hide Links Inside JavaScript
Google can render JavaScript, but large language models generally don’t. And even Google doesn’t render every page immediately or consistently.
If your key links are injected via JavaScript or hidden behind search boxes, modals, or interactive elements, you’re choking off both crawl access and AI visibility.
Fix it: Expose your navigation, support content, and product details via crawlable, static HTML wherever possible.
LLMs like those powering AI Overviews, ChatGPT, and Perplexity don’t click or type. If your knowledge base or documentation is only accessible after a user types into a search box, LLMs won’t see it – and won’t cite it.
Real talk: If your official support content isn’t visible to LLMs, they’ll pull answers from Reddit, old blog posts, or someone else’s guesswork. That’s how incorrect or outdated information becomes the default AI response for your product.
Solution: Maintain a static, browsable version of your support center. Use real anchor links, not JavaScript-triggered overlays. Make your help content easy to find and even easier to crawl.
Invisible content doesn’t just miss out on rankings. It gets overwritten by whatever is visible. If you don’t control the narrative, someone else will.
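One quick way to check this is to look at the raw, unrendered HTML the way a non-rendering crawler or LLM would. The minimal sketch below, standard library only, fetches a hypothetical page and lists the anchor hrefs present before any JavaScript runs; if key navigation or support links are missing from this output, they are effectively invisible to those systems.

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen

# Hypothetical page to inspect; swap in your own navigation or support URL.
PAGE_URL = "https://www.example.com/support/"

class AnchorCollector(HTMLParser):
    """Collect href values from <a> tags in raw, unrendered HTML."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def links_in_static_html(url):
    request = Request(url, headers={"User-Agent": "static-link-check/0.1"})
    with urlopen(request, timeout=10) as response:
        html = response.read().decode("utf-8", errors="ignore")
    parser = AnchorCollector()
    parser.feed(html)
    return parser.hrefs

if __name__ == "__main__":
    for href in links_in_static_html(PAGE_URL):
        print(href)
```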
5. Handle Pagination And Parameters With Intention
Infinite scroll, poorly handled pagination, and uncontrolled URL parameters can clutter crawl paths and fragment authority.
It’s not just an indexing issue. It’s a maintenance nightmare and a signal dilution risk.
Fix it: Prioritize crawl clarity and minimize redundant URLs.
While rel="next"/rel="prev" still gets thrown around in technical SEO advice, Google retired support for it years ago, and most content management systems don’t implement it correctly anyway.
Instead, focus on:

Using crawlable, path-based pagination formats (e.g., /blog/page/2/) instead of query parameters like ?page=2. Google often crawls but doesn’t index parameter-based pagination, and LLMs will likely ignore it entirely.
Ensuring paginated pages contain unique or at least additive content, not clones of page one.
Avoiding canonical tags that point every paginated page back to page one, which tells search engines to ignore the rest of your content.
Using robots.txt or meta noindex for thin or duplicate parameter combinations (especially in filtered or faceted listings).
Defining parameter behavior in Google Search Console only if you have a clear, deliberate strategy. Otherwise, you’re more likely to shoot yourself in the foot.

Pro tip: Don’t rely on client-side JavaScript to build paginated lists. If your content is only accessible via infinite scroll or rendered after user interaction, it’s likely invisible to both search crawlers and LLMs.
Good pagination quietly supports discovery. Bad pagination quietly destroys it.
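To spot the canonical problem described above, here is a minimal sketch, standard library only, that fetches a hypothetical path-based pagination series and prints the canonical URL each page declares; paginated pages that all canonicalize back to page one get flagged for review. The regex is deliberately naive (it assumes rel appears before href in the link tag), so treat it as a rough check rather than a full parser.

```python
import re
from urllib.request import Request, urlopen

# Hypothetical path-based pagination series to audit.
PAGINATED_URLS = [f"https://www.example.com/blog/page/{n}/" for n in range(1, 6)]

# Naive pattern: assumes rel="canonical" appears before href, the common case.
CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def canonical_of(url):
    """Return the canonical URL declared in the page's raw HTML, if any."""
    request = Request(url, headers={"User-Agent": "pagination-audit/0.1"})
    with urlopen(request, timeout=10) as response:
        html = response.read().decode("utf-8", errors="ignore")
    match = CANONICAL_RE.search(html)
    return match.group(1) if match else None

if __name__ == "__main__":
    for url in PAGINATED_URLS:
        canonical = canonical_of(url)
        status = "OK" if canonical in (None, url) else "CHECK"
        print(f"{status:>5}  {url}  canonical={canonical}")
```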
Crawl Optimization And AI: Why This Matters More Than Ever
You might be wondering, “With AI Overviews and LLM-powered answers rewriting the SERP, does crawl optimization still matter?”
Yes. More than ever.
Why? AI-generated summaries still rely on indexed, trusted content. If your content doesn’t get crawled, it doesn’t get indexed. If it’s not indexed, it doesn’t get cited. And if it’s not cited, you don’t exist in the AI-generated answer layer.
AI search agents (Google, Perplexity, ChatGPT with browsing) don’t pull full pages; they extract chunks of information. Paragraphs, sentences, lists. That means your content architecture needs to be extractable. And that starts with crawlability.
If you want to understand how that content gets interpreted – and how to structure yours for maximum visibility – this guide on how LLMs interpret content breaks it down step by step.
Remember, you can’t show up in AI Overviews if Google can’t reliably crawl and understand your content.
Bonus: Crawl Efficiency For Site Health
Efficient crawling is more than an indexing benefit. It’s a canary in the coal mine for technical debt.
If your crawl logs show thousands of pages that are no longer relevant, or crawlers spending 80% of their time on pages you don’t care about, your site is disorganized. It’s a signal.
Clean it up, and you’ll improve everything from performance to user experience to reporting accuracy.
What To Prioritize This Quarter
If you’re short on time and resources, focus here:

Crawl Budget Triage: Review crawl logs and identify where Googlebot is wasting time.
Internal Link Optimization: Ensure your most important pages are easily discoverable.
Remove Crawl Traps: Close off dead ends, duplicate URLs, and infinite spaces.
JavaScript Rendering Review: Use tools like Google’s URL Inspection Tool to verify what’s visible.
Eliminate Redirect Hops: Especially on money pages and high-traffic sections.

These are not theoretical improvements. They translate directly into better rankings, faster indexing, and more efficient content discovery.
TL;DR: Keywords Matter Less If You’re Not Crawlable
Technical SEO isn’t the sexy part of search, but it’s the part that enables everything else to work.
If you’re not prioritizing crawl efficiency, you’re asking Google to work harder to rank you. And in a world where AI-powered search demands clarity, speed, and trust – that’s a losing bet.
Fix your crawl infrastructure. Then, focus on content, keywords, and experience, expertise, authoritativeness, and trustworthiness (E-E-A-T). In that order.

Featured Image: Candy Shapes/Shutterstock

Read More »
News

Google Explains Why Link Disavow Files Aren’t Processed Right Away via @sejournal, @martinibuster

Filing link disavows is generally a futile way to deal with spammy links, but they are useful for dealing with unnatural links an SEO or a publisher is responsible for creating, which can require urgent action. But how long does Google take to process them? Someone asked John Mueller that exact question, and his answer provides insight into how link disavows are handled internally at Google.
Google’s Link Disavow Tool
The link disavow tool is a way for publishers and SEOs to manage unwanted backlinks that they don’t want Google to count against them. It literally means that the publisher disavows the links.
The tool was created by Google in response to requests from SEOs for an easy way to disavow paid links they were responsible for obtaining and were unable to remove from the websites where they were placed. The link disavow tool is accessible via Google Search Console and lets users upload a plain text file listing the URLs or domains whose links they don’t want counted against them in Google’s index.
Google’s official guidance for the disavow tool has always been that it’s for use by SEOs and publishers who want to disavow paid or otherwise unnatural links that they are responsible for obtaining and are unable to have removed. Google expressly says that the vast majority of sites do not need to use the tool, especially for low-quality links that they had nothing to do with.
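For reference, the disavow file itself is just a UTF-8 text file with one URL or domain: entry per line and # comment lines. The minimal sketch below writes one from hypothetical placeholder domains and URLs; the resulting .txt file is what gets uploaded in Search Console.

```python
# Minimal sketch: write a disavow file in the plain-text format Google Search
# Console accepts (one URL or "domain:" entry per line, "#" for comments).
# The domains and URLs below are hypothetical placeholders.
UNWANTED_DOMAINS = ["spammy-links.example", "paid-directory.example"]
UNWANTED_URLS = ["https://blog.example.net/sponsored-post-123"]

def write_disavow_file(path):
    lines = ["# Links disavowed for example.com"]
    lines += [f"domain:{domain}" for domain in UNWANTED_DOMAINS]
    lines += UNWANTED_URLS
    with open(path, "w", encoding="utf-8") as handle:
        handle.write("\n".join(lines) + "\n")

if __name__ == "__main__":
    write_disavow_file("disavow.txt")
```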
How Google Processes The Link Disavow Tool
A person asked Mueller on Bluesky for details about how Google processes newly added links in a disavow file.
He posted:
“When we add domains to the disavow, i.e top up the list. Can I assume the new domains are treated separately as new additions.
You don’t reprocess the whole thing?”
John Mueller answered that the order of the domains and URLs on the list didn’t matter.
His response:
“The order in the disavow file doesn’t matter. We don’t process the file per-se (it’s not an immediate filter of “the index”), we take it into account when we recrawl other sites naturally.”
The answer is interesting because he says that Google doesn’t process the link disavow file “per-se,” which likely means it isn’t acted on in that moment. The “filtering” of a disavowed link happens when the linking sites are naturally recrawled.
So another way to look at it is that the link disavow file doesn’t trigger anything, but the data contained in the file is acted upon during the normal course of crawling.
Featured Image by Shutterstock/Luis Molinero

Read More »