Google Launches June 2024 Spam Update via @sejournal, @MattGSouthern

Google has announced the rollout of the June 2024 spam update, which aims to further improve search results by targeting websites that violate Google’s spam policies.

According to a statement, the update, which began on June 20, is expected to take up to one week to roll out fully.

Background On Google’s Spam Updates & Policies

Google regularly updates its systems to reduce low-quality and spammy content from its search results.

Spam updates target websites that break Google’s rules, such as:

  • Automatically generating content solely to improve search rankings.
  • Buying or selling links to manipulate rankings.
  • Having thin, duplicated, or poor-quality content.
  • Tricking users with hidden redirects or other deceptive techniques.

Google’s last spam update was released in March.

Despite the March update impacting many spammy websites, some AI-generated content still managed to rank well in search results.

An analysis by Search Engine Journal’s Roger Montti found that some AI spam sites ranked for over 217,000 queries, with more than 14,900 ranking in the top 10 search results.

The sites employed tactics such as rapid content churn, AI-generated images, and templated article structures, exploiting a loophole that allowed new content to receive an initial ranking boost.

Potential Impact On Search Results

The June spam update will likely refine Google’s spam detection capabilities further.

However, past experiences have shown that closing loopholes can inadvertently impact legitimate websites.

As with any significant update, the June spam update may result in fluctuations in search rankings for some websites.

Websites that engage in practices that violate Google’s spam policies or rely heavily on AI-generated content may see a decline in their search visibility.

On the other hand, some websites may benefit from the update, as they will face less competition from spammy websites in search results.

Looking Ahead

Google says the June 2024 spam update may take up to one week to roll out fully.

Once the rollout is complete, Google will post an update on its Search Status Dashboard, and you can assess the update’s impact on your search rankings.


Featured Image: Danishch/Shutterstock

New Bluehost Agency Partner Program For WordPress Agencies via @sejournal, @martinibuster

Bluehost announced a partner program expressly designed to support WordPress agencies and freelancers that serve small-to-medium-sized businesses (SMBs). The program offers revenue-generating opportunities in the form of commissions, exclusive discounts, priority customer service, and other benefits that will help agencies grow their client base and earn more revenue.

Focus On WordPress Websites

Bluehost is an active member of the WordPress community and helps develop WordPress core itself by directly sponsoring six core contributors. That makes Bluehost well-positioned to offer the products, community, service, and revenue-generating opportunities that align with the goals of WordPress-based development agencies and freelancers serving SMBs.

A key element of the Agency Partner Program is Bluehost Cloud, a managed WordPress hosting platform with a 100% uptime SLA. Bluehost’s managed WordPress Cloud is designed as a secure, high-performance solution, which makes it ideal for freelancers and agencies that depend on performant hosting.

Exclusive Benefits for Partner Agencies

Acceptance into the program grants agencies early access to Bluehost’s referral program (commissions), product discounts, learning webinars, access to priority customer support, and membership in an exclusive LinkedIn network.

According to the Bluehost announcement:

“By partnering with Bluehost, agencies can now provide their clients with the highest quality customer service, WordPress expertise and some of the most comprehensive hosting products, including Bluehost Cloud, Yoast SEO and eCommerce plug-ins.”

The Bluehost Agency Partner Program offers WordPress agencies and freelancers the resources to level up their service offerings, generate new revenue streams, and deliver superior results for their clients. It’s a win-win partnership that may be worth looking into.

Visit the Bluehost Partner Program page:

Early Applications: Introducing the Bluehost Agency Partner Program.

Read the official announcement here:

Bluehost Unlocks New Opportunities For WordPress Agencies

Featured Image by Shutterstock/Shift Drive

Reddit Traffic Up 39%: Is Google Prioritizing Opinions Over Expertise? via @sejournal, @MattGSouthern

Reddit’s website traffic has grown 39% compared to the previous year, according to data from Similarweb.

This growth seems fueled by Reddit’s increased visibility in Google search results.

Why is Reddit growing so fast, and what does this mean for businesses and SEO professionals?

Here’s our take on it.

Why Is Reddit Growing?

Several factors, including Google prioritizing “helpful content” from discussion forums in a recent algorithm update, have likely contributed to Reddit’s improved search rankings and visibility.

A report from Business Insider indicates that more people are now finding Reddit through Google searches than by directly visiting the reddit.com website.

Mordy Oberstein, Wix’s Head of SEO, shared recent data showing a consistent increase in the share of Reddit sources appearing in Google’s Discussion and Forums SERP feature.

Lily Ray, Senior Director of SEO and Head of Organic Research at Amsive Digital, tweeted about Reddit’s increased visibility in Google search results.

She noted that Reddit had appeared in “Discussions and Forums” for various medical queries in recent weeks, though it no longer did at the time of her posts.

Ray also observed that the number of Discussion and Forum features with multiple Reddit URLs has decreased slightly over the past months.

Google’s $60 Million Deal with Reddit

Google recently signed a $60 million deal to license Reddit data for AI products.

The timing of the deal and Reddit’s search growth raise questions.

Google has denied a direct connection between the deal and Reddit’s search visibility, but the coincidence is notable.

Implications For Marketers & SEO Professionals

Reddit’s newfound dominance in Google search results presents business challenges and opportunities.

Challenges

Roger Montti, a staff writer for Search Engine Journal, raises concerns about the expertise and trustworthiness of Reddit content:

In the article, “Let’s Be Real: Reddit In Google Search Lacks Credibility,” Montti states:

“Opinions shared on Reddit by people who lack expertise and are sharing opinions in anonymity qualify as dubious. Yet Google is not only favoring Reddit in the search results, it is also paying millions of dollars for access to content that is lacking in expertise, experience, authoritativeness and trustworthiness.”

This is challenging because it means your expert-written content could get outranked by the opinions of anonymous Reddit users.

Opportunities

Search Engine Journal founder Brent Csutoras offers a more optimistic view, believing marketers should lean into Reddit’s newfound prominence.

In the article, “Why Every Marketer Should Be On Reddit,” Csutoras states:

“If your brand has something meaningful to say and is interested in truly connecting with your audience, then yes, you should be on Reddit.”

However, Reddit’s community-driven nature requires a delicate approach, Csutoras adds:

“Reddit communities can be highly negative toward self-serving promotion. But if you put in the effort and solve people’s needs and problems, Reddit has the potential to be a high-performance channel.”

Why SEJ Cares

SEO professionals and marketers should be mindful that expert-written resources could be outranked by Reddit threads that reflect personal opinions rather than authoritative information.

However, by providing genuine value and respecting Reddit’s community guidelines, businesses may be able to leverage the platform’s prominence for increased visibility and audience engagement.


Featured Image: rafapress/Shutterstock

Is Google Crawling Your Site A Lot? That Could Be A Bad Sign via @sejournal, @MattGSouthern

According to a recent LinkedIn post by Gary Illyes, Analyst at Google, you should be cautious if Google starts aggressively crawling your website.

While an uptick in crawling can be a good sign, Illyes says it may indicate underlying issues.

Illyes cautions:

“Don’t get happy prematurely when search engines unexpectedly start to crawl like crazy from your site.”

He says there are two common problems to watch out for: infinite spaces and website hacks.

Infinite Spaces Could Cause Crawling Spike

An issue Illyes highlighted is sites with “infinite spaces”—areas like calendar modules or endlessly filterable product listings that can generate unlimited potential URLs.

If a site is crawled a lot already, crawlers may get extra excited about infinite spaces.

Illyes explains:

“If your site generally has pages that search users find helpful, crawlers will get excited about these infinite spaces for a time.”

He recommends using the robots.txt file to block crawlers from accessing infinite spaces.
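As an illustration, a robots.txt along these lines could block crawlers from infinite spaces (the paths below are made-up examples, not from Illyes’ post):

```
User-agent: *
# Hypothetical calendar module that generates unlimited date URLs
Disallow: /calendar/
# Hypothetical endlessly filterable product listings
Disallow: /*?filter=
```

Google supports the `*` wildcard in Disallow paths, so the second rule blocks any URL containing that filter query parameter, regardless of the path before it.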

Hacked Sites Can Trigger Crawling Frenzy

Another troubling cause of a crawling spike is a security breach where hackers inject spam onto a reputable site.

Crawlers may initially interpret this as new content to index before realizing it’s malicious.

Illyes states:

“If a no-good-doer somehow managed to get access…they might flood your site with, well, crap… crawlers will get excited about these new pages for a time and happily crawl them.”

Remain Skeptical Of Crawling Spikes

Rather than assuming a crawling spike is positive, Illyes suggests treating it as a potential issue until the root cause is identified.

He states:

“Treat unexpected sharp increases in crawling as a symptom…until you can prove otherwise. Or, you know, maybe I’m just a hardline pessimist.”

Fixing Hacked Sites: Help From Google

For hacked sites, Illyes pointed to a page that includes a video with further assistance:

Here are the key points.

Tips From Google’s Video

Google’s video outlines the steps in the recovery process.

1. Identify The Vulnerability

The first crucial step is finding how the hacker gained access. Tools like Google Search Console (formerly Webmaster Tools) can assist in detecting issues.

2. Fix The Vulnerability

Once the security hole is identified, it must be closed to prevent any future unauthorized access. This could involve updating software, changing passwords, etc.

3. Clean The Hacked Content

Check the entire site’s content and code to remove any spam, malware, defaced pages, or other injections by the hacker. Security plugins like Wordfence can assist in this process.

4. Harden Security

Beyond fixing the specific vulnerability, take additional measures to harden the site’s security. This could include enabling firewalls, limiting user permissions, and more frequent software updates.

5. Request A Review

Once the vulnerability is patched and the hacked content is removed, you can request that Google review the site; any security warnings or blacklist entries are lifted once it’s verified as clean.

The video notes that the review process is faster for malware issues (days) than for spam issues (weeks) because Google must inspect spam cleanup efforts more closely.

Additional Tips From Google’s John Mueller

Google’s John Mueller has previously offered specific advice on recovering from the SEO impact of hacked pages:

  1. Use the URL removal tool to deindex the hacked pages quickly.
  2. Focus on improving the overall site quality beyond removing hacked content.
  3. Lingering impacts may persist for months until the site recovers Google’s trust.

Why SEJ Cares

Website security is crucial for all businesses, as hacked content can impact trust and search engine rankings.

Google’s Gary Illyes pointed out that sudden spikes in crawling activity could indicate security breaches or technical issues that need immediate attention.


Featured Image: Stacey Newman/Shutterstock

Razorfish R-Index Turns Consumer Data Silos Into Strategic Insights via @sejournal, @martinibuster

Razorfish launched a new technology called R-Index that measures disparate online and offline customer interactions (including paid and owned interactions) and generates prescriptive insights on consumer sentiment, brand performance, and business impact. R-Index turns otherwise disconnected data into strategic insights on consumer journeys and brand sentiment.

R-Index is based on a custom algorithm that leverages Google Cloud, BigQuery, and a suite of machine learning and Vertex AI capabilities to analyze what customers are doing at every step, providing actionable insights and helping brands learn how to engage with customers better.

What Is R-Index About?

I interviewed Razorfish to get a better idea of what R-Index is and why it’s an important tool for brands.

I asked Razorfish about what’s being measured:

“R-Index helps measure brand performance, consumer sentiment, and business impact. It includes a brand’s experience touchpoints across the consumer journey, including paid and owned interactions.”

The press release notes that there’s an abundance of data about “moments that matter,” but that its inherently disparate nature makes it challenging to get a holistic picture of what it all means. So I asked them to elaborate on that.

“The holistic journey looks different for different consumers and consumer journeys. R-Index aims to capture how consumers start their journeys through purchase and loyalty, and distill how resonant each of these touchpoints are along the journey into a single, easy-to-use metric.

A moment that matters is a specific engagement that a consumer has with any of our experience touchpoints, whether that’s marketing, going to a website, etc. These are the moments where we see more engagement based on our observations. They can be different across consumers and segments.

As we analyze what the consumer is doing across the full journey, we’re identifying touchpoints that are resonating more and helping brands refine and optimize those experiences. This could be increasing the frequency, delivering a more personalized message, or focusing on a specific touchpoint. But with R-Index, we’re capitalizing on this behavioral data and using it to serve consumers better.”

What are the concrete real-world “touchpoints” you are referring to?

“Real-world touchpoints include call data, CRM information, web traffic, mobile app clicks, ad traffic or offline interactions like TV. As the number of avenues for consumers to interact with a brand continues to increase, data from those sources is continuing to fragment and shift further into silos.

Similarly, despite recent delays, third-party cookies will continue to deprecate and newer regulations will further the challenges in data collection, making it vital for brands to be able to access and process any and all data options into one source.

When you think about how traditional measurement tools have looked at performance (ex: acquisition and how that works across specific channels, paid media, or television) they aren’t really connected to measuring the actual sentiment or perception of consumers and how these translate into specific business value for brands.

And sentiment descriptions really differ from brand to brand, as some labels that are considered “negative” for one brand might not be the same for another.

R-index is meant to aggregate all the different touchpoints that a consumer could theoretically interact with and get to a perspective of what’s actually driving either positive sentiment and resonance for consumers or what areas need to be optimized for better experiences.”

Tell me more about the insights and how R-Index provides a more “nuanced view”?

“R-Index is a simpler tool to get the insights that are needed to help a brand optimize overall performance, dive into the specific drivers of that performance for a brand, and make those experiences more resonant and relevant for their core consumers. R-Index provides more insights into what’s truly driving positive and helpful consumer experiences, and driving resonance for brands across the entire marketing mix and marketing investment.

Even if you have specific segments of consumers, they can behave very differently based on how they’re interacting with the touchpoints. While the aim is not to drill down to any one specific customer, it can provide improved segment understanding to make each touchpoint more appropriate and personalized.

There are many measurement solutions in the market that can look at channel performance or sentiment performance in a silo, but R-Index is putting everything together in one place. R-Index has the components of being more dynamic, being able to scale and being able to plug into a number of different tools and AI capabilities to provide predictive optimized recommendations at scale.

The combination and connectivity of the data being pulled, the AI capabilities, and rigorous testing of the tool is helping drive the more nuanced views of insights that provide prescriptive strategic recommendations and analyses of data with greater detail. The definition and understanding of a brand’s audience segments will be deeper than ever before.

R-Index is prescriptive, providing automated insights and recommendations, and allows for drill-down insights at granular levels for components that make up the index score.

R-Index’s capabilities go beyond simply understanding what ads work to understanding how nuances across media investment, macroeconomic data, etc., impact overall consumer perceptions and interactions with brands, and how to best refine experiences to be resonant to consumers with those insights in mind.”

A Powerful Tool For Actionable Insights

R-Index is a powerful marketing insight tool that measures brand performance, consumer sentiment, and business impact and provides prescriptive recommendations to help marketers and marketing teams improve consumer experiences and business outcomes.

Read more about R-Index:

Razorfish Unveils R-Index, A Proprietary Data Solution for Creating Unified Experiences in Collaboration with Google Cloud

Featured Image by Shutterstock/PCH.Vector

YouTube Tests Crowdsourced Annotations For Videos via @sejournal, @MattGSouthern

YouTube is piloting a new experimental feature allowing users to add contextual notes to videos to provide supplemental information.

The “Video Context Notes” feature, currently being tested on mobile in the United States for English language videos, allows invited contributors to write short annotations.

Screenshot from: blog.youtube.com, June 2024.

In its announcement, YouTube describes how it intends for people to use context notes:

“These notes could clarify when footage contains parody material, point out if a product review is outdated due to a newer version release, or confirm whether viral clips actually depict current events.”

Notes build on other YouTube efforts to present context alongside videos, such as information panels and disclosure labels for altered or synthetic media.

However, YouTube recognizes there’s potential for inaccurate or unsuitable notes during the experimental phase, stating:

“We anticipate there will be mistakes – notes that aren’t a great match for the video or potentially incorrect information. That’s part of how we’ll learn from the experiment.”

Availability

A limited number of YouTube channels in good standing will be invited to write and attach context notes to videos.

Viewers in the U.S. will be able to see and rate the helpfulness of these notes.

Third-party evaluators, the same contracted personnel who provide feedback on YouTube’s search and recommendation systems, will also assess the quality and accuracy of posted notes.

Their ratings and viewer input will be processed through a “bridging-based algorithm” to determine which notes get published broadly.

YouTube explains in the announcement:

“If many people who have rated notes differently in the past now rate the same note as helpful, then our system is more likely to show that note under a video.”

As the pilot progresses, YouTube plans to explore having contributors rate each other’s notes to further train the note-publishing system.
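YouTube hasn’t published how its bridging-based algorithm works, but the quoted rule can be illustrated with a toy sketch (our own simplified interpretation, not YouTube’s actual system): a note scores higher when raters whose past votes often disagreed now agree that it’s helpful.

```python
from itertools import combinations

def bridging_score(note_ratings, past_ratings):
    """Toy bridging-style score for one note.

    note_ratings: {rater_id: True if "helpful", False otherwise}
    past_ratings: {rater_id: list of bools, their votes on earlier notes}

    A note scores higher when raters whose past votes often
    disagreed now agree that the note is helpful.
    """
    helpful = [r for r, vote in note_ratings.items() if vote]
    if len(helpful) < 2:
        return 0.0  # agreement across perspectives is impossible
    disagreements = []
    for a, b in combinations(helpful, 2):
        history_a, history_b = past_ratings[a], past_ratings[b]
        # fraction of past notes on which this pair voted differently
        diff = sum(x != y for x, y in zip(history_a, history_b)) / len(history_a)
        disagreements.append(diff)
    # average past disagreement among the raters who now agree
    return sum(disagreements) / len(disagreements)
```

Under this sketch, two raters with opposite voting histories who both mark a note helpful produce the highest score, while agreement among raters who always voted alike contributes nothing, which captures the “rated notes differently in the past” condition in YouTube’s description.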

Why SEJ Cares

Letting users add context could add another layer of credibility to videos, such as confirming or debunking the presenter’s claims.

While there are bound to be some growing pains, if YouTube can get this new notes system right, it could raise the bar for transparency when it comes to video content across the web.


Featured Image: Queenmoonlite Studio/Shutterstock

Google’s Unconventional Advice On Fixing Broken Backlinks via @sejournal, @martinibuster

Google’s Gary Illyes recently answered the question of whether one should spend time fixing backlinks with wrong URLs that are pointing to a website, known as broken backlinks. The answer is interesting because it suggests a way of considering this issue in a completely unorthodox manner.

Google: Should Broken Backlinks Be Fixed?

During a recent Google SEO Office Hours podcast, a question was asked about fixing broken backlinks:

“Should I fix all broken backlinks to my site to improve overall SEO?”

Google’s Gary Illyes answered:

“You should fix the broken backlinks that you think would be helpful for your users. You can’t possibly fix all the links, especially once your site grew to the size of a mammoth. Or brontosaurus.”

Unconventional Advice

Assessing broken backlinks for those that are the most helpful for “users” is an unconventional way to decide whether to fix them. The conventional SEO practice is to fix a broken backlink to ensure that a site receives the maximum available link equity. His advice runs counter to standard SEO practice, but it shouldn’t be dismissed out of hand because there may be something useful there.

Keep an open mind and be open to different ways of considering solutions. Something I like about his approach is that it’s a shortcut for determining whether a backlink is useful. For example, if the link points to a product that is no longer sold or supported in any way, a 404 response is the best thing to show search crawlers and users. So there is some validity to his way of looking at it.

Why Broken Backlinks Should Be Fixed

Fixing these kinds of backlinks is not a big deal; it’s one of the easier SEO chores and a quick win.

While any benefit is hard to measure, it’s nonetheless worth doing it for site visitors who might follow the wrong URL to the webpage that they’re looking for.

Check Backlinks After A Link Building Campaign

It’s also important to check backlinks after a link building campaign, even months after asking for a link, because site owners sometimes add links weeks or months later and may add the wrong URL. It happens; I know from experience.

Broken Backlinks That Do & Don’t Matter

The kinds of broken backlinks that usually (but not always) matter are the ones that show up as 404 errors on your server logs or in the Google Search Console.

There are two kinds of broken backlinks that matter:

  1. A backlink that’s broken because the linked page no longer exists or the URL changed.
  2. The URL of the backlink is misspelled.

Then there are backlinks that matter less and the reasons for that are:

  • Because the broken backlink is from a low quality website that doesn’t send any traffic
  • The link is to an outdated webpage that doesn’t matter and should return a 404 response
  • It’s just a random link created by an AI chatbot, spambot, or a spam web page.

How To Identify Broken Backlinks

Identifying any kind of broken backlink is (arguably) best done by reviewing 404 errors generated by visits to pages that no longer exist or to URLs that are misspelled. If the link matters, there will be web traffic from the broken backlink to a 404 page.

You might not be able to see where that link is coming from, although it may be possible to search for the broken URL and possibly find it.

The server log may show the IP address and user agent of the site visitor that created the broken link and from there a site owner can make the judgment call of whether it’s a spam or hacker bot, a search engine bot or an actual user. The Redirection WordPress plugin and the Wordfence plugin can be helpful for site owners that don’t have access to server logs.

A SaaS backlink tool can be useful for finding broken links, but many sites, particularly those that have been around a while, have a lot of backlinks, and combing through them all is a lot of work to find links that don’t even send traffic. If a broken link sends traffic, you’ll know it because it will show up as a 404 error response.
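As a rough sketch of the 404-review approach, a short script can surface 404 hits that arrived via an external referrer. The regular expression assumes the widely used combined log format, and the host name is a placeholder; adjust both for your own server.

```python
import re
from collections import Counter

# Matches the common Apache/nginx combined log format; the
# field positions here are an assumption, adjust as needed.
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)"'
)

def broken_backlink_candidates(log_lines, own_host="example.com"):
    """Count 404 hits that arrived via an external referrer --
    the most likely signs of a broken backlink worth fixing."""
    hits = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if not m:
            continue
        external = m["referer"] not in ("", "-") and own_host not in m["referer"]
        if m["status"] == "404" and external:
            hits[(m["path"], m["referer"])] += 1
    # most frequently hit broken URLs first
    return hits.most_common()
```

The referrer column tells you which external page carries the broken link, which is exactly the judgment-call input the article describes: a 404 with real external traffic is worth fixing, while one with no referrer traffic probably isn’t.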

Fixing Broken Backlinks

Fixing links that no longer exist can be done by recreating the resource or by redirecting requests for the missing web page to a web page that is substantially similar.

Fixing a link to a misspelled URL is easily done by redirecting the misspelled URL to the correct URL.
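For example, on an Apache server with mod_alias enabled, a single .htaccess rule can 301-redirect a misspelled URL to the correct one (both paths below are invented for illustration):

```apache
# Permanent redirect from the misspelled path to the correct path
Redirect 301 /wigdets/blue-widget /widgets/blue-widget
```

A 301 (permanent) redirect tells both visitors and search engines that the correct URL is the one to use going forward.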

Another way to fix it is to contact the site that’s linking to the wrong URL but there are three things to consider before doing that.

1. The site owner may decide that they don’t want to link to the site and remove the link altogether.

2. The site owner may decide to add a no-follow link attribute to the corrected URL.

3. There are other sites that may have copied the web page and/or the link and are thus also linking to the wrong URL.

Simply adding a redirect from the misspelled URL to the correct URL fixes the problem without any risk that the backlink is going to be removed or nofollowed.

Broken Backlinks & Link Reclamation

Identifying broken backlinks is something that many site owners might stumble on when investigating 404 errors. Some call this link reclamation, but “link reclamation” is basically just another name for fixing broken backlinks.

Regardless, fixing these kinds of inbound links is one of the few SEO quick wins that can actually benefit a site owner. It could be part of a site audit, especially when limited to finding opportunities in 404 error responses, because those are links that are either being crawled or used by potential site visitors.

Listen to the podcast at the 5:32 mark for the answer on fixing broken backlinks:

Featured Image by Shutterstock/Roman Samborskyi

Google Answers Question About Toxic Link Sabotage via @sejournal, @martinibuster

Google’s Gary Illyes answered a question about how to notify Google that someone is poisoning their backlink profile with “toxic links” which is a problem that many people have been talking about for at least fifteen years.

Question About Alerting Google To Toxic Links

Gary narrated the question:

“Someone’s asking, how to alert Google of sabotage via toxic links?”

And this is Gary’s answer:

“I know what I would do: I’d ignore those links.

Generally Google is really, REALLY good at ignoring links that are irrelevant to the site they’re pointing at. If you feel like it, you can always disavow those “toxic” links, or file a spam report.”

Disavow Links If You Feel Like It

Gary linked to Google’s explainer about disavowing links where it’s explained that the disavow tool is for a site owner to tell Google about links that they are responsible for in some way, like paid links or some other link scheme.

This is what it advises:

“If you have a manual action against your site for unnatural links to your site, or if you think you’re about to get such a manual action (because of paid links or other link schemes that violate our quality guidelines), you should try to remove the links from the other site to your site. If you can’t remove those links yourself, or get them removed, then you should disavow the URLs of the questionable pages or domains that link to your website.”

Google suggests that a link disavow is only necessary when two conditions are met:

  1. “You have a considerable number of spammy, artificial, or low-quality links pointing to your site,
    AND
  2. The links have caused a manual action, or likely will cause a manual action, on your site.”

Both of the above conditions must be met before filing a link disavow.
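If both conditions apply, the disavow file itself is a plain text file uploaded through Search Console, with one `domain:` entry or full URL per line (the entries below are hypothetical examples, not real sites from the article):

```
# Lines starting with # are comments
domain:spammy-links-example.com
https://another-example.net/paid-link-page.html
```

A `domain:` line disavows every link from that domain, while a URL line disavows only links from that specific page.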

Origin Of The Phrase Toxic Links

As Google became better at penalizing sites for low quality links and paid links, some in the highly competitive gambling industry started creating low quality links to sabotage their competitors. The practice was called negative SEO.

The phrase “toxic links” wasn’t heard until after the Penguin link updates in 2012, which required penalized sites to remove all the paid and low-quality links they had created and then disavow the rest. An industry grew up around disavowing links, and it was that industry that invented the phrase “toxic links” for use in its marketing.

Confirmation That Google Is Able To Ignore Links

I have shared this anecdote before and I’ll share it here again. Someone I knew contacted me and said that their site lost rankings from negative SEO links. I took a look and their site had a ton of really nasty looking links. So out of curiosity (and because I knew that the site was this person’s main income), I emailed someone at Google Mountain View headquarters about it. That person checked it and replied that the site didn’t lose rankings because of the links. They lost rankings because of a Panda update related content issue.

That was around 2012, and it showed me how good Google was at ignoring links. If Google was that good at ignoring really bad links back then, they’re probably even better at it now, twelve years later, with the SpamBrain AI.

Listen to the question and answer at the 8:22 mark:

Featured Image by Shutterstock/New Africa

Google On Traffic Diversity As A Ranking Factor via @sejournal, @martinibuster

Google’s SearchLiaison tweeted encouragement to diversify traffic sources, being clear about the reason he was recommending it. Days later, someone followed up to ask if traffic diversity is a ranking factor, prompting SearchLiaison to reiterate that it is not.

What Was Said

The question of whether traffic diversity is a ranking factor arose from a previous tweet in a discussion about whether a site owner should focus on off-site promotion.

Here’s the question from the original discussion that was tweeted:

“Can you please tell me if I’m doing right by focusing on my site and content – writing new articles to be found through search – or if I should be focusing on some off-site effort related to building a readership? It’s frustrating to see traffic go down the more effort I put in.”

SearchLiaison split the question into component parts and answered each one. When it came to the part about off-site promotion, SearchLiaison (who is Danny Sullivan), shared from his decades of experience as a journalist and publisher covering technology and search marketing.

I’m going to break down his answer so that it’s clearer what he meant.

This is the part from the tweet that talks about off-site activities:

“As to the off-site effort question, I think from what I know from before I worked at Google Search, as well as my time being part of the search ranking team, is that one of the ways to be successful with Google Search is to think beyond it.”

What he is saying here is simple, don’t limit your thinking about what to do with your site to thinking about how to make it appeal to Google.

He next explains that sites that rank tend to be sites that are created to appeal to people.

SearchLiaison continued:

“Great sites with content that people like receive traffic in many ways. People go to them directly. They come via email referrals. They arrive via links from other sites. They get social media mentions.”

What he’s saying there is that you’ll know you’re appealing to people if they’re discussing and sharing your site on social media and if other sites are citing it with links.

Other signs that a site is doing well are when people engage in the comments section, send emails asking follow-up questions, and send emails of thanks with anecdotes of their success or satisfaction with a product or advice.

Consider this: fast fashion site Shein at one point didn’t rank for its chosen keyword phrases (I know because I checked out of curiosity). But at the time it was virally popular and making huge amounts of sales by gamifying site interaction and engagement, propelling it to become a global brand. A similar strategy propelled Zappos when it pioneered no-questions-asked returns and cheerful customer service.

SearchLiaison continued:

“It just means you’re likely building a normal site in the sense that it’s not just intended for Google but instead for people. And that’s what our ranking systems are trying to reward, good content made for people.”

SearchLiaison explicitly said that diversified traffic is not a ranking factor.

He added this caveat to his tweet:

“This doesn’t mean you should get a bunch of social mentions, or a bunch of email mentions because these will somehow magically rank you better in Google (they don’t, from how I know things).”

Despite The Caveat…

A journalist tweeted this:

“Earlier this week, @searchliaison told people to diversify their traffic. Naturally, people started questioning whether that meant diversity of traffic was a ranking factor.

So, I asked @iPullRank what he thought.”

SearchLiaison of course answered that he explicitly said it’s not a ranking factor and linked to his original tweet that I quoted above.

He tweeted:

“I mean that’s not exactly what I myself said, but rather repeat all that I’ll just add the link to what I did say:”

The journalist responded:

“I would say this is calling for publishers to diversify their traffic since you’re saying the great sites do it. It’s the right advice to give.”

And SearchLiaison answered:

“It’s the part of “does it matter for rankings” that I was making clear wasn’t what I myself said. Yes, I think that’s a generally good thing, but it’s not the only thing or the magic thing.”

Not Everything Is About Ranking Factors

There is a longstanding practice by some SEOs to parse everything that Google publishes for clues to how Google’s algorithm works. This happened with the Search Quality Raters guidelines. Google is unintentionally complicit because it’s their policy to (in general) not confirm whether or not something is a ranking factor.

This habit of searching for “ranking factors” leads to misinformation. It takes more acuity to read research papers and patents to gain a general understanding of how information retrieval works, but it’s more work to understand something than to skim a PDF for ranking factors.

The worst approach to understanding search is to invent hypotheses about how Google works and then pore through a document to confirm those guesses (and falling into the confirmation bias trap).

In the end, it may be more helpful to back off from exclusively optimizing for Google and focus at least as much on optimizing for people (which includes optimizing for traffic). I know it works because I’ve been doing it for years.

Featured Image by Shutterstock/Asier Romero