Google’s E-E-A-T & The Myth Of The Perfect Ranking Signal via @sejournal, @MattGSouthern

Few concepts have generated as much buzz and speculation in SEO as E-E-A-T.

Short for Experience, Expertise, Authoritativeness, and Trustworthiness, this framework has been a cornerstone of Google’s Search Quality Evaluator Guidelines for years.

But despite its prominence, how E-E-A-T relates to Google’s ranking algorithms remains unclear.

In a recent episode of Google’s Search Off The Record podcast, Search Director & Product Manager Elizabeth Tucker addressed this complex topic.

Her comments offer insights into how Google evaluates and ranks content.

No Perfect Match

One key takeaway from Tucker’s discussion of E-E-A-T is that no single ranking signal perfectly aligns with all four elements.

Tucker explained:

“There is no E-E-A-T ranking signal. But this really is for people to remember it’s a shorthand, something that should always be a consideration, although, you know, different types of results arguably need different levels of E-E-A-T.”

This means that while Google’s algorithms do consider factors like expertise, authoritativeness, and trustworthiness when ranking content, there isn’t a one-to-one correspondence between E-E-A-T and any specific signal.

The PageRank Connection

However, Tucker did offer an example of how one classic Google ranking signal – PageRank – aligns with at least one aspect of E-E-A-T.

Tucker said:

“PageRank, one of our classic Google ranking signals, probably is sort of along the lines of authoritativeness. I don’t know that it really matches up necessarily with some of those other letters in there.”

For those unfamiliar, PageRank is an algorithm that measures the importance and authority of a webpage based on the quantity and quality of links pointing to it.

In other words, a page with many high-quality inbound links is seen as more authoritative than one with fewer or lower-quality links.
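
Tucker’s description is conceptual, but the classic algorithm itself is public. Here’s a minimal sketch of the original power-iteration method, purely illustrative; Google’s production signal is far more elaborate:

```python
# Minimal power-iteration PageRank, per the original 1998 paper.
# Illustrative only -- Google's production signal is far more complex.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # ignore dangling pages in this sketch
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# A page that accumulates links from well-linked pages scores highest.
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
print(pagerank(graph))
```

Each page’s score is, in effect, a weighted vote from the pages linking to it, which is why the signal lines up with authoritativeness rather than expertise or trust.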

Tucker’s comments suggest that while PageRank may be a good proxy for authoritativeness, it doesn’t necessarily capture the other elements of E-E-A-T, like expertise or trustworthiness.

Why SEJ Cares

While it’s clear that E-E-A-T matters, Tucker’s comments underscore that it’s not a silver bullet to ranking well.

Instead of chasing after a mythical “E-E-A-T score,” websites should create content that demonstrates their expertise and builds user trust.

This means investing in factors like:

  • Accurate, up-to-date information
  • Clear sourcing and attribution
  • Author expertise and credentials
  • User-friendly design and navigation
  • Secure, accessible web infrastructure

By prioritizing these elements, websites can send strong signals to users and search engines about the quality and reliability of their content.

The E-E-A-T Evolution

It’s worth noting that E-E-A-T isn’t a static concept.

Tucker explained in the podcast that Google’s understanding of search quality has evolved over the years, and the Search Quality Evaluator Guidelines have grown and changed along with it.

Today, E-E-A-T is just one of the factors that Google considers when evaluating and ranking content.

However, the underlying principles – expertise, authoritativeness, and trustworthiness – will likely remain key pillars of search quality for the foreseeable future.

Listen to the full podcast episode below:


Featured Image: salarko/Shutterstock

Google Warns Of Soft 404 Errors And Their Impact On SEO via @sejournal, @MattGSouthern

In a recent LinkedIn post, Google Analyst Gary Illyes raised awareness about two issues plaguing web crawlers: soft 404s and other “crypto” errors.

These seemingly innocuous mistakes can negatively affect SEO efforts.

Understanding Soft 404s

Soft 404 errors occur when a web server returns a standard “200 OK” HTTP status code for pages that don’t exist or contain error messages. This misleads web crawlers, causing them to waste resources on non-existent or unhelpful content.

Illyes likened the experience to visiting a coffee shop where every item is unavailable despite being listed on the menu. While this scenario might be frustrating for human customers, it poses a more serious problem for web crawlers.

As Illyes explains:

“Crawlers use the status codes to interpret whether a fetch was successful, even if the contents of the page is basically just an error message. They might happily go back to the same page again and again wasting your resources, and if there are many such pages, exponentially more resources.”

The Hidden Costs Of Soft Errors

The consequences of soft 404 errors extend beyond the inefficient use of crawler resources.

According to Illyes, these pages are unlikely to appear in search results because they are filtered out during indexing.

To combat this issue, Illyes advises serving the appropriate HTTP status code when the server or client encounters an error.

This allows crawlers to understand the situation and allocate their resources more effectively.

Illyes also cautioned against rate-limiting crawlers with messages like “TOO MANY REQUESTS SLOW DOWN,” as crawlers cannot interpret such text-based instructions.

Why SEJ Cares

Soft 404 errors can impact a website’s crawlability and indexing.

Addressing these issues lets crawlers focus on fetching and indexing pages with valuable content, potentially improving the site’s visibility in search results.

Eliminating soft 404 errors can also lead to more efficient use of server resources, as crawlers won’t waste bandwidth repeatedly visiting error pages.

How This Can Help You

To identify and resolve soft 404 errors on your website, consider the following steps:

  1. Regularly monitor your website’s crawl reports and logs to identify pages returning HTTP 200 status codes despite containing error messages.
  2. Implement proper error handling on your server to ensure that error pages are served with the appropriate HTTP status codes (e.g., 404 for not found, 410 for permanently removed); a minimal sketch follows this list.
  3. Use tools like Google Search Console to monitor your site’s coverage and identify any pages flagged as soft 404 errors.
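
To illustrate step 2 (and the rate-limiting point above), here’s a minimal sketch in Python using Flask; any server framework works the same way, and the routes, content store, and slugs are hypothetical placeholders:

```python
# Minimal Flask sketch: serve real HTTP status codes instead of soft 404s.
# Routes, content store, and slugs are hypothetical placeholders.
from flask import Flask, abort

app = Flask(__name__)

ARTICLES = {"seo-basics": "Article body..."}  # stand-in content store
REMOVED = {"old-promo"}                       # permanently deleted slugs

@app.route("/articles/<slug>")
def article(slug):
    if slug in REMOVED:
        abort(410)  # gone for good: tells crawlers to drop the URL
    if slug not in ARTICLES:
        abort(404)  # truly missing: not a 200 page that says "not found"
    return ARTICLES[slug]

@app.errorhandler(429)
def too_many_requests(error):
    # Rate-limit with a status code and header, not body text
    # that crawlers can't parse.
    return "Too Many Requests", 429, {"Retry-After": "120"}

if __name__ == "__main__":
    app.run()
```

Googlebot treats a 429 response as a signal to slow its crawl rate, which is exactly what a plain-text “SLOW DOWN” message inside a 200 response cannot communicate.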

Proactively addressing soft 404 errors can improve your website’s crawlability, indexing, and SEO.


Featured Image: Julia Tim/Shutterstock

WordPress Plugin Supply Chain Attacks Escalate via @sejournal, @martinibuster

WordPress plugins continue to come under attack from hackers who use credentials stolen in other data breaches to gain direct access to plugin code. What makes these supply chain attacks particularly concerning is that the compromise looks to users like a normal plugin update.

Supply Chain Attack

In a typical vulnerability, a flaw in the software’s code allows an attacker to inject malicious code or launch some other kind of attack. A supply chain attack is different: the software itself, or a component of it (such as a third-party script used within the software), is directly altered with malicious code. The result is that the software itself delivers the malicious files.

The United States Cybersecurity and Infrastructure Security Agency (CISA) defines a supply chain attack (PDF):

“A software supply chain attack occurs when a cyber threat actor infiltrates a software vendor’s network and employs malicious code to compromise the software before the vendor sends it to their customers. The compromised software then compromises the customer’s data or system.

Newly acquired software may be compromised from the outset, or a compromise may occur through other means like a patch or hotfix. In these cases, the compromise still occurs prior to the patch or hotfix entering the customer’s network. These types of attacks affect all users of the compromised software and can have widespread consequences for government, critical infrastructure, and private sector software customers.”

In this specific attack on WordPress plugins, the attackers used stolen password credentials to access developer accounts with direct access to plugin code. They then added malicious code that creates administrator-level user accounts on every website running the compromised plugins.

Today, Wordfence announced that additional WordPress plugins have been identified as compromised, and more may yet be discovered. It’s worth understanding what’s going on and being proactive about protecting the sites under your control.

More WordPress Plugins Attacked

Wordfence issued an advisory that more plugins were compromised, including the highly popular PowerPress Podcasting plugin by Blubrry.

These are the newly discovered compromised plugins announced by Wordfence:

  • WP Server Health Stats (wp-server-stats): 1.7.6
    Patched Version: 1.7.8
    10,000 active installations
  • Ad Invalid Click Protector (AICP) (ad-invalid-click-protector): 1.2.9
    Patched Version: 1.2.10
    30,000+ active installations
  • PowerPress Podcasting plugin by Blubrry (powerpress): 11.9.3 – 11.9.4
    Patched Version: 11.9.6
    40,000+ active installations
  • Seo Optimized Images (seo-optimized-images): 2.1.2
    Patched Version: 2.1.4
    10,000+ active installations
  • Pods – Custom Content Types and Fields (pods): 3.2.2
    Patched Version: No patched version needed currently.
    100,000+ active installations
  • Twenty20 Image Before-After (twenty20): 1.6.2, 1.6.3, 1.5.4
    Patched Version: No patched version needed currently.
    20,000+ active installations

These are the first group of compromised plugins:

  • Social Warfare
  • Blaze Widget
  • Wrapper Link Element
  • Contact Form 7 Multi-Step Addon
  • Simply Show Hooks

More information about the WordPress plugin supply chain attack is available in the related coverage linked at the end of this article.

What To Do If Using A Compromised Plugin

Some of the plugins have been updated to fix the problem, but not all of them. Regardless of whether a compromised plugin has been patched and the developer’s password reset, site owners should check their database to make sure no rogue admin accounts have been added to the WordPress website.

The attack creates administrator accounts with the usernames “Options” or “PluginAuth,” so those are the usernames to watch for. It’s also a good idea to look for any unrecognized new admin-level user accounts, in case the attack has evolved and the hackers are using different usernames.
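
For site owners comfortable querying the database directly, the check can be scripted. The sketch below is one way to do it, assuming the default wp_ table prefix, placeholder database credentials, and the mysql-connector-python package:

```python
# Sketch: flag suspicious administrator accounts in a WordPress database.
# Assumes the default wp_ table prefix; credentials are placeholders.
import mysql.connector

SUSPICIOUS_NAMES = {"Options", "PluginAuth"}  # usernames reported by Wordfence

conn = mysql.connector.connect(
    host="localhost", user="dbuser", password="dbpass", database="wordpress"
)
cursor = conn.cursor()

# wp_usermeta stores roles in wp_capabilities as serialized PHP;
# a LIKE match is enough to spot the administrator role here.
cursor.execute("""
    SELECT u.ID, u.user_login, u.user_registered
    FROM wp_users u
    JOIN wp_usermeta m ON m.user_id = u.ID
    WHERE m.meta_key = 'wp_capabilities'
      AND m.meta_value LIKE '%administrator%'
""")

for user_id, login, registered in cursor.fetchall():
    flag = "SUSPICIOUS" if login in SUSPICIOUS_NAMES else "review"
    print(f"[{flag}] admin #{user_id}: {login} (registered {registered})")

conn.close()
```

The same information is visible under Users in the WordPress dashboard; a script is simply faster to run across many sites.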

Site owners who use the free or Pro version of the Wordfence WordPress security plugin are notified if a compromised plugin is discovered on their site. Pro-level users receive malware signatures for immediately detecting infected plugins.

The official Wordfence warning announcement about these new infected plugins advises:

“If you have any of these plugins installed, you should consider your installation compromised and immediately go into incident response mode. We recommend checking your WordPress administrative user accounts and deleting any that are unauthorized, along with running a complete malware scan with the Wordfence plugin or Wordfence CLI and removing any malicious code.

Wordfence Premium, Care, and Response users, as well as paid Wordfence CLI users, have malware signatures to detect this malware. Wordfence free users will receive the same detection after a 30 day delay on July 25th, 2024. If you are running a malicious version of one of the plugins, you will be notified by the Wordfence Vulnerability Scanner that you have a vulnerability on your site and you should update the plugin where available or remove it as soon as possible.”

Read more:

WordPress Plugins Compromised At The Source – Supply Chain Attack

3 More Plugins Infected in WordPress.org Supply Chain Attack Due to Compromised Developer Passwords

Featured Image by Shutterstock/Moksha Labs

Google’s Search Dilemma: The Battle With ‘Not’ & Prepositions via @sejournal, @MattGSouthern

While Google has made strides in understanding user intent, Search Director & Product Manager Elizabeth Tucker says certain types of queries remain challenging.

In a recent episode of Google’s Search Off The Record podcast, Tucker discussed some lingering pain points in the company’s efforts to match users with the information they seek.

Among the top offenders are searches containing the word “not” and queries involving prepositions. Tucker revealed:

“Prepositions, in general, are another hard one. And one of the really big, exciting breakthroughs was the BERT paper and transformer-based machine learning models when we started to be able to get some of these complicated linguistic issues right in searches.”

BERT, or Bidirectional Encoder Representations from Transformers, is a neural network-based technique for natural language processing that Google began leveraging in search in 2019.

The technology is designed to understand the nuances and context of words in searches rather than treating queries as a bag of individual terms.
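
For a concrete sense of what that means, the sketch below uses the open-source Hugging Face transformers library with a public BERT checkpoint (not Google’s production systems) to show that the same word gets a different vector depending on its neighbors:

```python
# Sketch: BERT assigns context-dependent vectors to the same word.
# Uses the public bert-base-uncased checkpoint via Hugging Face
# transformers, not anything from Google's production stack.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_for(sentence, word):
    """Return BERT's contextual vector for `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return outputs.last_hidden_state[0, tokens.index(word)]

a = embedding_for("shoes not made in china", "china")
b = embedding_for("shoes made in china", "china")

# The two vectors differ because "not" changes the surrounding context.
print(torch.cosine_similarity(a, b, dim=0).item())
```

A bag-of-words system would represent “china” identically in both queries; a contextual model does not, which is what gives ranking systems a chance to tell negated intent apart from the affirmative version.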

‘Not’ There Yet

Despite the promise of BERT and similar advancements, Tucker acknowledged that Google’s ability to parse complex queries is still a work in progress.

Searches with the word “not” remain a thorn in the search engine’s side, Tucker explains:

“It’s really hard to know when ‘not’ means that you don’t want the word there or when it has a different kind of semantic meaning.”

For example, Google’s algorithms could interpret a search like “shoes not made in China” in multiple ways.

Does the user want shoes made in countries other than China, or are they looking for information on why some shoe brands have moved their manufacturing out of China?

This ambiguity poses a challenge for websites trying to rank for such queries. If Google can’t match the searcher’s intent with the content on a page, it may struggle to surface the most relevant results.

The Preposition Problem

Another area where Google’s algorithms can stumble is prepositions, which show the relationship between words in a sentence.

Queries like “restaurants with outdoor seating” or “hotels near the beach” rely on prepositions to convey key information about the user’s needs.

For SEO professionals, this means that optimizing for queries with prepositions may require some extra finesse.

It’s not enough to include the right keywords on a page; the content needs to be structured to communicate the relationships between those keywords.
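
One concrete way to state those relationships is schema.org structured data. Here’s a hypothetical sketch (the restaurant and its details are invented) that generates JSON-LD with Python:

```python
# Sketch: schema.org JSON-LD that states "restaurant WITH outdoor seating"
# explicitly, rather than hoping nearby keywords imply it. The business
# details below are hypothetical.
import json

markup = {
    "@context": "https://schema.org",
    "@type": "Restaurant",
    "name": "Example Bistro",
    "servesCuisine": "Italian",
    "amenityFeature": {
        "@type": "LocationFeatureSpecification",
        "name": "Outdoor seating",
        "value": True,
    },
}

# Paste the output into a <script type="application/ld+json"> tag.
print(json.dumps(markup, indent=2))
```

Markup like this states outright that outdoor seating is a feature of the restaurant, rather than leaving the search engine to infer the relationship from nearby keywords.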

The Long Tail Challenge

The difficulties Google faces with complex queries are particularly relevant to long-tail searches—those highly specific, often multi-word phrases that make up a significant portion of all search traffic.

Long-tail keywords are often seen as a golden opportunity for SEO, as they tend to have lower competition and can signal a high level of user intent.

However, if Google can’t understand these complex queries, it may be harder for websites to rank for them, even with well-optimized content.

The Road Ahead

Tucker noted that Google is actively improving its handling of these linguistically challenging queries, but a complete solution may still be a way off.

Tucker said:

“I would not say this is a solved problem. We’re still working on it.”

In the meantime, users may need to rephrase their searches or try different query formulations to find the information they’re looking for – a frustrating reality in an age when many have come to expect Google to understand their needs intuitively.

Why SEJ Cares

While BERT and similar advancements have helped Google understand user intent, the search giant’s struggles with “not” queries and prepositions remind us that there’s still plenty of room for improvement.

As Google continues to invest in natural language processing and other AI-driven technologies, it remains to be seen how long these stumbling blocks will hold back the search experience.

What It Means For SEO

So, what can SEO professionals and website owners do in light of this information? Here are a few things to keep in mind:

  1. Focus on clarity and specificity in your content. The more you can communicate the relationships between key concepts and phrases, the easier it will be for Google to understand and rank your pages.
  2. Use structured data and other technical SEO best practices to help search engines parse your content more effectively.
  3. Monitor your search traffic and rankings for complex queries, and be prepared to adjust your strategy if you see drops or inconsistencies.
  4. Monitor Google’s efforts to improve its natural language understanding and be ready to adapt as new algorithms and technologies emerge.

Listen to the full podcast episode below:

Google Completes June 2024 Spam Update Rollout via @sejournal, @MattGSouthern

Google has officially confirmed the completion of its June 2024 spam update, a week-long process aimed at enhancing search result quality by targeting websites that violate the company’s spam policies.

The update began on June 20, 2024, and was announced via Google’s Search Central Twitter account.

Google’s Search Status Dashboard shows the update finished on June 27 at 9:10 PDT.

This spam update is part of Google’s ongoing efforts to combat web spam and improve user experience.

It’s important to note that this is not the algorithmic component of the site reputation abuse update, which Google has clarified is yet to be implemented.

Key Points Of The June 2024 Spam Update

  1. The update targets websites violating Google’s spam policies.
  2. It is separate from the anticipated site reputation abuse algorithmic update.
  3. The rollout process lasted approximately one week.

Google’s spam updates typically focus on eliminating various forms of web spam, including:

  • Automatically generated content aimed solely at improving search rankings
  • Purchased or sold links intended to manipulate rankings
  • Thin, duplicated, or poor-quality content
  • Hidden redirects or other deceptive techniques

This latest update follows Google’s previous spam update in March 2024.

Despite that update’s impact, some AI-generated content performed well in search results.

An analysis by Search Engine Journal’s Roger Montti revealed that certain AI spam sites ranked for over 217,000 queries, with more than 14,900 ranking in the top 10 search results.

The June update is expected to refine Google’s spam detection capabilities further. However, as with previous updates, it may cause fluctuations in website search rankings.

Those engaging in practices that violate Google’s spam policies or heavily relying on AI-generated content may see a decline in their search visibility.

Conversely, legitimate websites adhering to Google’s guidelines may benefit from reduced competition from spammy sites in search results.

SEO professionals and website owners are advised to review their sites for spammy practices and ensure compliance with Google’s Webmaster Guidelines.

For more information about the June 2024 spam update and its potential impact, refer to Google’s official communication channels, including the Google Search Central Twitter account and the Google Search Status Dashboard.


Featured Image: ninefotostudio/Shutterstock

Google Reveals Its Methods For Measuring Search Quality via @sejournal, @MattGSouthern

How does Google know if its search results are improving?

As Google rolls out algorithm updates and claims to reduce “unhelpful” content, many wonder about the true impact of these changes.

In an episode of Google’s Search Off The Record podcast, Google Search Director & Product Manager Elizabeth Tucker discusses how Google measures search quality.

This article explores Tucker’s key revelations, the implications for marketers, and how you can adapt to stay ahead.

Multifaceted Approach To Measurement

Tucker, who transitioned to product management after 15 years as a data scientist at Google, says it’s difficult to determine whether search quality is improving.

“It’s really hard,” she admitted, describing a comprehensive strategy that includes user surveys, human evaluators, and behavioral analysis.

Tucker explained:

“We use a lot of metrics where we sample queries and have human evaluators go through and evaluate the results for things like relevance.”

She also noted that Google analyzes user behavior patterns to infer whether people successfully find the information they seek.

The Moving Target Of User Behavior

Tucker revealed that users make more complex queries as search quality improves.

This creates a constantly shifting landscape for Google’s teams to navigate.

Tucker observed:

“The better we’re able to do this, the more interesting and difficult searches people will do.”

Counterintuitive Metrics

Tucker shared that in the short term, poor search performance might lead to increased search activity as users struggle to find information.

However, this trend reverses long-term, with sustained poor performance resulting in decreased usage.

Tucker cautioned:

“A measurement that can be good in the long term can be misleading in the short term.”

Quantifying Search Quality

To tackle the challenge of quantifying search quality, Google relies on an expansive (and expanding) set of metrics that gauge factors like relevance, accuracy, trustworthiness, and “freshness.”

But numbers don’t always tell the full story, Tucker cautioned:

“I think one important thing that we all have to acknowledge is that not everything important is measurable, and not everything that is measurable is important.”

For relatively straightforward queries, like a search for “Facebook,” delivering relevant results is a comparatively simple task for modern search engines.

However, more niche or complex searches demand rigorous analysis and attention, especially concerning critical health information.
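
Google doesn’t publish its internal metrics, but the “sample queries, collect human ratings, aggregate” workflow Tucker describes can be illustrated with a standard information-retrieval metric such as NDCG. The ratings below are invented for illustration:

```python
# Sketch: aggregating human relevance ratings into a standard IR metric
# (NDCG). Google's real metrics are not public; these ratings are invented.
import math

def dcg(ratings):
    """Discounted cumulative gain: top positions count the most."""
    return sum(r / math.log2(i + 2) for i, r in enumerate(ratings))

def ndcg(ratings):
    """Normalize against the ideal (best possible) ordering."""
    ideal = dcg(sorted(ratings, reverse=True))
    return dcg(ratings) / ideal if ideal else 0.0

# Rater scores (0 = irrelevant ... 3 = highly relevant) for the top 5
# results of two sampled queries.
samples = {
    "facebook": [3, 3, 2, 1, 1],                 # easy navigational query
    "shoes not made in china": [1, 0, 2, 0, 1],  # harder long-tail query
}

for query, ratings in samples.items():
    print(f"{query!r}: NDCG = {ndcg(ratings):.3f}")
```

The point isn’t the specific formula: it’s that position-weighted human ratings, sampled across many queries, turn “are results getting better?” into a number that can be tracked over time.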

The Human Element

Google aims to surface the most helpful information for searchers’ needs, which are as diverse as they are difficult to pin down at the scale Google operates.

Tucker says:

“Understanding if we’re getting it right, where we’re getting it right, where needs focus out of those billions of queries – man, is that a hard problem.”

As developments in AI and machine learning push the boundaries of what’s possible in search, Tucker sees the “human element” as a key piece of the puzzle.

From the search quality raters who assess real-world results to the engineers and product managers, Google’s approach to quantifying search improvements blends big data with human insight.

Looking Ahead

As long as the web continues to evolve, Google’s work to refine its search quality measurements will be ongoing, Tucker says:

“Technology is constantly changing, websites are constantly changing. If we just stood still, search would get worse.”

What Does This Mean?

Google’s insights can help align your strategies with Google’s evolving standards.

Key takeaways include:

  1. Quality over quantity: Given Google’s focus on relevance and helpfulness, prioritize creating high-quality, user-centric content rather than aiming for sheer volume.
  2. Embrace complexity: Develop content that addresses more nuanced and specific user needs.
  3. Think long-term: Remember that short-term metrics can be misleading. Focus on sustained performance and user satisfaction rather than quick wins.
  4. Holistic approach: Like Google, adopt a multifaceted approach to measuring your content’s success, combining quantitative metrics with qualitative assessments.
  5. Stay adaptable: Given the constant changes in technology and user behavior, remain flexible and ready to adjust your strategies as needed.
  6. Human-centric: While leveraging AI and data analytics, don’t underestimate the importance of human insight in understanding and meeting user needs.

As Tucker’s insights show, this user-first approach is at the heart of Google’s efforts to improve search quality – and it should be at the center of every marketer’s strategy as well.

Listen to the discussion on measuring search quality in the video below, starting at the 17:39 mark:


Featured Image: Screenshot from YouTube.com/GoogleSearchCentral, June 2024

WordPress Plugins Compromised At The Source via @sejournal, @martinibuster

WordPress.org and Wordfence have published warnings about hackers adding malicious code to plugins at the source, leading to widespread infections via updates.

Five Compromised Plugins… To Date

Typically, a plugin contains a weakness (a vulnerability) that allows an attacker to compromise individual sites running that version of the plugin. These compromises are different because the plugins themselves don’t contain a vulnerability. Instead, the attackers injected malicious code directly at the source of the plugin, forcing an update that then spreads to all sites using the plugin.

Wordfence first noticed one plugin containing malicious code. When they added its details to their database, they discovered four other plugins compromised with a similar kind of malicious code. Wordfence immediately notified WordPress of its findings.

Wordfence shared details of the affected plugins:

“Social Warfare 4.4.6.4 – 4.4.7.1
Patched Version: 4.4.7.3

Blaze Widget 2.2.5 – 2.5.2
Patched Version: None

Wrapper Link Element 1.0.2 – 1.0.3
Patched Version: It appears that someone removed the malicious code, however, the latest version is tagged as 1.0.0 which is lower than the infected versions. This means it may be difficult to update to the latest version, so we recommend removing the plugin until a properly tagged version is released.

Contact Form 7 Multi-Step Addon 1.0.4 – 1.0.5
Patched Version: None

Simply Show Hooks 1.2.1
Patched Version None”

WordPress closed all five plugins at the official plugin repository and published a notice on each plugin’s page stating that it is closed and unavailable.

Screenshot Of A Delisted WordPress Plugin

The infected plugins generate rogue admin accounts that phone home to a server, and the attacked websites are altered with SEO spam links added to the footer. Sophisticated malware can be hard to catch because hackers actively hide their code through obfuscation, making it look, for example, like a string of numbers. Wordfence noted that this specific malware was not sophisticated and was easy to identify and track.

Wordfence made an observation about this curious quality of the malware:

“The injected malicious code is not very sophisticated or heavily obfuscated and contains comments throughout making it easy to follow. The earliest injection appears to date back to June 21st, 2024, and the threat actor was still actively making updates to plugins as recently as 5 hours ago.”

WordPress Issues Advisory On Compromised Plugins

The WordPress advisory states that the attackers identified plugin developers with “committer access” (meaning they can commit code to the plugin), then matched those developers against credentials exposed in other data breaches. The hackers used those credentials to access the plugins at the code level and inject their malicious code.

WordPress explained:

“On June 23 and 24, 2024, five WordPress.org user accounts were compromised by an attacker trying username and password combinations that had been previously compromised in data breaches on other websites. The attacker used access to these 5 accounts to issue malicious updates to 5 plugins those users had committer access to.

…The affected plugins have had security updates issued by the Plugins Team to protect user security.”

The fault for these compromises apparently lies with the plugin developers’ security practices. WordPress’ official announcement reminded plugin developers of best practices to prevent these kinds of compromises from happening.

How To Know If Your Site Is Compromised?

At this point, only five plugins are known to be compromised with this specific malicious code. Wordfence said the hackers create admins with the usernames “Options” or “PluginAuth,” so one way to check whether a site is compromised is to look for any new admin accounts, especially ones with those usernames.

Wordfence recommended that affected sites using any of the five plugins delete rogue administrator-level user accounts, run a malware scan with the Wordfence plugin, and remove the malicious code.

Someone in the comments asked if they should be worried even if they don’t use any of the five plugins:

“Do you think we need to be worried about other plug-in updates? Or was this limited to these 5 plug-ins.”

Chloe Chamberland, the Threat Intelligence Lead at Wordfence responded:

“Hi Elizabeth, at this point it appears to be isolated to just those 5 plugins so I wouldn’t worry too much about other plugin updates. However, out of extra caution, I would recommend reviewing the change-sets of any plugin updates prior to updating them on any sites you run to make sure no malicious code is present.”

Two other commenters noted that they had at least one of the rogue admin accounts on sites that didn’t use any of the five known affected plugins. At this time it’s not known if any other plugins are affected.

Read Wordfence’s advisory and explanation of what is going on:

Supply Chain Attack on WordPress.org Plugins Leads to 5 Maliciously Compromised WordPress Plugins

Read the official WordPress.org announcement:

Keeping Your Plugin Committer Accounts Secure

Featured Image by Shutterstock/Algonga

Google: “Our Ranking Systems Aren’t Perfect” via @sejournal, @martinibuster

Google’s SearchLiaison responded to a plea on X (formerly Twitter) about ridiculously poor search results, acknowledging that Google’s reviews algorithm could be doing a better job and outlining what’s being done to stop rewarding sites that shouldn’t be ranking in the first place.

Questioning Google’s Search Results

The exchange with Google began with a post about a high-ranking site that allegedly falls short of Google’s guidelines.

@dannyashton tweeted:

“This review has been ranking #1 on Google for “Molekule Air Mini+ review” for the past six months.

It is 50% anecdotal and 50% marketing messaging. It doesn’t share in-depth original research.

So, how did they make it to the top of Google?”

Followed by:

“Instead of a third-party review (which is likely what searchers are looking for), Google ranks an article backed by the brand:

Searchers land in an advertorial built off marketing materials:

So little care that they even left briefing notes in the published version 😞

And I think I found the reason why it ranks #1… Money.”

The general responses to the tweets were sympathetic, such as this one:

“WILD.

And this is on page 1…

Is this what writing for readers is? Is this what people need/want?

I think of folks like my mom here who wouldn’t know better and to dig more.

It looks and seems nice, must be trustworthy.

I mean, that’s their goals, right? Dupe and dip.”

Google’s Algorithms Aren’t Perfect

SearchLiaison responded to those tweets, explaining that he personally goes through the feedback submitted to Google and discusses it with the search team. He also spoke about the monumental scale of ranking websites: Google indexes trillions of web pages, so the ranking process must itself be scaled and automated.

SearchLiaison tweeted:

“Danny, I appreciate where you’re coming from — just as I appreciated the post that HouseFresh originally shared, as well as this type of feedback from others. I do. I also totally agree that the goal is for us to reward content that’s aligned with our guidance. From the HouseFresh post itself, there seemed to be some sense that we had actually improved over time:

“In our experience, each rollout of the Products Review Update has shaken things up, generally benefitting sites and writers who actually dedicated time, effort, and money to test products before they would recommend them to the world.”

That said, there’s clearly more we should be doing. I don’t think this is particularly new, as I’ve shared before that our ranking systems aren’t perfect and that I see content that we ought to do better by, as well as content we’re rewarding when we shouldn’t.

But it’s also not a system where any individual reviews content and says “OK, that’s great — rank it better” or “OK that’s not great, downrank it.” It simply wouldn’t work for a search engine that indexes trillions of pages of content from across the web to operate that way. You need scalable systems. And you need to keep working on improving those systems.

That’s what we’ll keep doing. We’re definitely aware of these concerns. We’ve seen the feedback, including the feedback from our recent form. I’ve personally been through every bit of that feedback and have been organizing it so our teams can look further at different aspects. This is in addition to the work they’re already doing, based on feedback we’ve already seen.”

Some takeaways from SearchLiaison’s statement:

  1. Google agrees that its algorithms should reward content aligned with its guidance (presumably guidance about good reviews, helpfulness, and spam).
  2. He acknowledged that the current ranking systems can still improve at rewarding useful content and not rewarding inappropriate content.
  3. Google’s systems are scaled.
  4. Google is committed to listening to feedback and working toward improving its algorithms.
  5. SearchLiaison confirmed that they are reviewing the feedback and organizing it for further analysis, to identify what needs attention for ranking improvements.

What Is Taking So Long To Fix Google?

Someone else questioned Google’s process for rolling out updates that subsequently shake things up. It’s a fair question: it makes sense to test a ranking update to make sure the changes improve the quality of the sites being ranked, not the opposite.

@mikefutia tweeted:

“Danny, aren’t all your ‘system improvements’ fully tested BEFORE rolling them out?

Surely your team was aware of the shakeup in the SERPs that these last few updates would cause.

Completely legitimate hobby sites written by passionate creators getting absolutely DECIMATED by these updates.

All in favor of Reddit, Pinterest, Quora, Forbes, Business Insider, and other nonsense gaining at their expense.

I guess what I’m saying is — surely this was not a surprise.

You guys knew this carnage was coming as a direct result of the updates.

And now — here we are, NINE months later — and there have been ZERO cases of these legitimate sites recovering. In fact, the March update just made it 100x worse.

And so Google is saying ‘yeah we f-d up, we’re working on it.’

But the question is—and I think I speak on behalf of thousands of creators when I ask—’What the hell is taking so long?’”

We know that Google’s third-party quality raters review search results before an update is rolled out. But clearly, many creators, site owners, and search marketers feel that Google’s search results are going the wrong way with every update.

SearchLiaison’s response is a good one because it acknowledges that Google is not perfect and that they are actively trying to improve the search results. But that does nothing to help the thousands of site owners who are disappointed in the direction that Google’s algorithm is headed.

Featured Image by Shutterstock/ivan_kislitsin

Google Announces New GA4 Features As Universal Analytics Sunset Nears via @sejournal, @MattGSouthern

As the July 1, 2024 shutdown date for Universal Analytics (UA) draws near, Google has announced new features and improvements for Google Analytics 4 (GA4).

These enhancements give marketers deeper insights and tools for cross-channel measurement and budget optimization.

Expanded Cross-Channel Reporting

GA4 is getting improved cross-channel reporting capabilities.

You will soon be able to integrate data from third-party advertising partners such as Pinterest, Reddit, and Snap directly into GA4 properties.

This will allow for a more complete view of campaign performance across platforms.

Additionally, GA4 will introduce aggregated impressions from linked Campaign Manager 360 accounts in the advertising workspace.

This feature will give advertisers a thorough overview of campaign performance across the entire marketing funnel.

AI-Powered Insights

Google is leveraging its AI capabilities to provide users with generated insights.

These AI-driven summaries will explain data trends and fluctuations using plain language, enabling businesses to make faster, more informed decisions based on their analytics data.

Advanced Planning & Budgeting Tools

Later this year, GA4 will introduce cross-channel budgeting features, including a projections report.

This tool will allow advertisers to track media pacing and projected performance against target objectives across multiple channels.

This addition should improve marketers’ ability to optimize media spend and allocate budgets more effectively.

Privacy-First Approach

GA4 continues to prioritize user privacy while delivering effective measurement solutions.

Upcoming features include support for Chrome Privacy Sandbox APIs and improvements to enhanced conversions.

Google says these updates will offer a complete picture of cross-channel conversion attribution in a privacy-safe manner.

Preparing For The Future

Steve Ganem, Director of Product Management for Google Analytics, highlights the platform’s commitment to adaptability:

“Google Analytics 4 is truly built to be durable for the future. We’ll continue to invest in giving you a tool that helps answer fundamental questions about your business across your consumer’s entire path to purchase, despite ongoing changes in the measurement landscape.”

As the sunset date for Universal Analytics approaches, Google encourages users who haven’t yet made the switch to complete their migration to GA4.

The company also reminds UA users to download any historical data they wish to retain before the July 1 shutdown date.
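
For anyone retrieving historical data programmatically rather than through the interface, a minimal sketch using the Universal Analytics Reporting API v4 might look like this (the view ID and key-file path are placeholders):

```python
# Sketch: pull historical Universal Analytics data via the Reporting API v4
# before the shutdown. View ID and key-file path are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account-key.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

response = analytics.reports().batchGet(
    body={
        "reportRequests": [{
            "viewId": "XXXXXXXX",  # your UA view ID
            "dateRanges": [{"startDate": "2020-01-01",
                            "endDate": "2024-06-30"}],
            "metrics": [{"expression": "ga:sessions"}],
            "dimensions": [{"name": "ga:date"}],
        }]
    }
).execute()

for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"][0], row["metrics"][0]["values"][0])
```

Once Universal Analytics access is shut off, this API is expected to stop returning data as well, so any exports need to happen before the deadline.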


Featured Image: Muhammad Alimaki/Shutterstock