Google Discusses Core Topicality Systems via @sejournal, @martinibuster

Google’s latest Search Off the Record shared a wealth of insights on how Google Search actually works. Google’s John Mueller and Lizzi Sassman spoke with Elizabeth Tucker, Director, Product Management at Google, who shared insights into the many systems that work together to rank web pages, including a mention of a topicality system.

Google And Topicality

In everyday usage, “topicality” refers to relevance to the present moment. In search, however, “topicality” is about matching the topic of a search query with the content on a web page. Machine learning models play a strong role in helping Google understand what users mean.

An example Elizabeth Tucker mentions is BERT (Bidirectional Encoder Representations from Transformers), a language model that helps Google understand a word within the context of the words that come before and after it (there’s more to it, but that’s a thumbnail explanation).

Elizabeth explains the importance of matching topically relevant content to a search query within the context of user satisfaction.

Googler Lizzi Sassman asked about user satisfaction, and Tucker answered that search has many dimensions and many systems, citing topical relevance as an example.

Lizzi asked (at about the 4:20 minute mark):

“In terms of the satisfaction bit that you mentioned, are there more granular ways that we’re looking at? What does it mean to be satisfied when you come away from a search?”

Elizabeth answered:

“Absolutely, Lizzi. Inside Search Quality, we think about so many important dimensions of search. We have so many systems. Obviously we want to show content that’s topically relevant to your search. In the early days of Google Search, that was sometimes a challenge.

Our systems have gotten much better, but that is still sometimes, for especially really difficult searches, we can struggle with. People search in so many ways: Everything from, of course, typing in keywords, to speaking to Google and using normal everyday language. I’ve seen amazing searches. “Hey Google, who is that person who, years ago, did this thing, and I don’t remember what it was called.” You know, these long queries that are very vague. And it’s amazing now that we have systems that can even answer some of those.”

Takeaway:

An important takeaway from that exchange is that there are many systems working together, with topicality being just one of them. Many in the search marketing community tend to focus on the importance of one thing like Authority or Helpfulness but in reality there are many “dimensions” to search and it’s counterproductive to reduce the factors that go into search to one, two or three concepts.

Biases In Search

Google’s John Mueller asked Elizabeth whether biases in search are something Google thinks about. She answered that there are many kinds of biases Google watches out for and tries to catch. Tucker then explains the different kinds of search results that may be topically relevant (such as evergreen and fresh) and how striking that balance is something Google focuses on getting right.

John asked (at the 05:24 minute mark):

“When you look at the data, I assume biases come up. Is that a topic that we think about as well?”

Elizabeth answered:

“Absolutely. There are all sorts of biases that we worry about when you’re looking for information. Are we disproportionately showing certain types of sites, are we showing more, I don’t know, encyclopedias and evergreen results or are we showing more fresh results with up-to-date information, are we showing results from large institutional sites, are we showing results from small blogs, are we showing results from social media platforms where we have everyday voices?

We want to make sure we have an appropriate mix that we can surface the best of the web in any shape or size, modest goals.”

Core Topicality Systems (And Many Others)

Elizabeth next reiterated that she works with many kinds of systems in search. This is worth keeping in mind because the search community knows about only a few of these systems when in fact there are many more.

That means it’s important not to focus on just one, two, or three systems when trying to debug a ranking problem, but instead to keep an open mind that it might be something else entirely, not just helpfulness or E-E-A-T or some other familiar concept.

John Mueller asked whether Google Search responds by demoting a site when users complain about certain search results.

She covers several points, including that most of the systems she works on have nothing to do with demoting sites. It’s worth underlining that she works with many systems and many signals (not just the handful of signals the search marketing community tends to focus on).

One set of systems she mentions is the core topicality systems. What does that mean? She explains that it’s about matching the topic of the search query. She says “core topicality systems,” plural, so that probably means multiple systems and algorithms.

John asked (at the 11:20 minute mark):

“When people speak up loudly, is the initial step to do some kind of a demotion where you say “Well, this was clearly a bad site that we showed, therefore we should show less of it”? Or how do you balance the positive side of things that maybe we should show more of versus the content we should show less of?”

Elizabeth answered:

“Yeah, that’s a great question. So I work on many different systems. It’s a fun part of my job in Search Quality. We have many signals, many systems, that all need to work together to produce a great search result page.

Some of the systems are by their nature demotative, and webspam would be a great example of this. If we have a problem with, say, malicious download sites, that’s something we would probably want to fix by trying to find out which sites are behaving badly and try to make sure users don’t encounter those sites.

Most of the systems I work with, though, actually are trying to find the good. An example of this: I’ve worked with some of our core topicality systems, so systems that try to match the topic of the query.

This is not so hard if you have a keyword query, but language is difficult overall. We’ve had wonderful breakthroughs in natural language understanding in recent years with ML models, and so we want to leverage a lot of this technology to really make sure we understand people’s searches so that we can find content that matches that. This is a surprisingly hard problem.

And one of the interesting things we found in working on, what we might call, topicality, kind of a nerdy word, is that the better we’re able to do this, the more interesting and difficult searches people will do.”

How Google Is Focused On Topics In Search

Elizabeth returns to discussing topicality, this time referring to it as the “topicality space” and how much effort Google has expended on getting it right. Of particular importance, she highlights how Google used to be very focused on keywords, with the clear implication that it isn’t as focused on them anymore, explaining the importance of topicality.

She discusses it at the 13:16 minute mark:

“So Google used to be very keyword focused. If you just put together some words with prepositions, we were likely to go wrong. Prepositions are very difficult or used to be for our systems. I mean, looking back at this, this is laughable, right?

But, in the old days, people would type in one, two, three keywords. When I started at Google, if a search had more than four words, we considered it long. I mean, nowadays I routinely see long searches that can be 10-20 words or more. When we have those longer searches, understanding what words are important becomes challenging.

For example, this was now years and years ago, maybe close to ten years ago, but we used to be challenged by searches that were questions. A classic example is “how tall is Barack Obama?” Because we wanted pages that would provide the answer, not just match the words how tall, right?

And, in fact, when our featured snippets first came about, it was motivated by this kind of problem. How can we match the answer, not just keyword match on the words in the question? Over the years, we’ve done a lot of work in, what we might call, the topicality space. This is a space that we continue to work in even now.”

The Importance Of Topics And Topicality

There is a lot to unpack in Tucker’s answer. When thinking about Google’s search ranking algorithms, it’s helpful to also consider the core topicality systems, which help Google understand search query topics and match them to web page content. That underlines the importance of thinking in terms of topics instead of focusing hard on ranking for keywords.

A common mistake I see among people struggling with rankings is a strong focus on keywords. For many years I’ve been encouraging an alternative approach that stresses the importance of thinking in terms of topics. That’s a multidimensional way to think of SEO. Optimizing for keywords is one-dimensional. Optimizing for a topic is multidimensional and aligns with how Google Search ranks web pages, in that topicality is an important part of ranking.

Listen to the Search Off The Record podcast starting at about the 4:20 minute mark and then fast forward to the 11:20 minute mark:

Featured Image by Shutterstock/dekazigzag

Top 10 Digital Marketing Trends For 2024 via @sejournal, @gregjarboe

It’s been a year of considerable disruptions in digital marketing so far.

Right now, the industry is dealing with the integration of generative AI and the impact this is going to have on user behavior and how people search, alongside the relentless updates that Google keeps throwing at us.

SEO is changing, and the industry is trying to adapt while accepting the uncertainty.

But it’s not all catastrophic; there is a lot of opportunity ahead for those who can evolve to embrace the new.

To help marketers and brands thrive amidst uncertainty, I’ve outlined trends to focus on, guided by strategic insights and Yogi Berra’s timeless wisdom:

“Predictions are hard, especially about the future.” – Yogi Berra

Digital marketers can no doubt relate to Yogi’s sentiment, acknowledging the challenge of what lies ahead.

These, then, are the top 10 digital marketing trends for 2024:

1. Strategy: “If You Don’t Know Where You Are Going, You Might Wind Up Someplace Else.”

Why is “strategy” this year’s top trend instead of the latest technology?

Well, as Yogi once observed, “If you don’t know where you are going, you might wind up someplace else.”

According to Spencer Stuart’s 2024 CMO Tenure Study, the average tenure of chief marketing officers (CMOs) at Fortune 500 companies in 2023 was 4.2 years.

The study also found the average tenure of CMOs at B2B companies was 4.5 years. It was 4.0 years for CMOs at B2C companies. And it was just 3.1 years at the consumer-heavy top 100 advertisers.

So, developing a digital marketing strategy that will generate above-average results within a couple of years is the key to keeping your job as the CMO of a big brand.

And if you don’t master the art and science of creating a digital marketing strategy that generates business outcomes, then you won’t land one of the CMO jobs that turn over each year.

In other words, learning to use the latest technology is necessary, but it won’t get digital marketing leaders and executives where they want to go.

2. Generative AI: “Predictions Are Hard, Especially About The Future.”

Yogi also said, “Predictions are hard, especially about the future.” So, it’s tempting to ask generative AI tools to predict their own future.

OpenAI’s ChatGPT 3.5 says:

“AI and machine learning have been transforming digital marketing for years, but 2024 is poised to see these technologies become even more integral. From chatbots and predictive analytics to personalized content and ad targeting, AI will enhance customer experiences and drive efficiency. Brands leveraging AI can expect to see improved ROI and deeper customer insights.”

Google’s Gemini (formerly Bard) says:

“It ain’t science fiction anymore. Artificial intelligence (AI) is taking center stage, from crafting personalized chatbots that convert like crazy to optimizing campaigns with laser precision. Don’t fear the robot takeover, embrace it!”

And Anthropic’s Claude 3 says:

“Artificial intelligence writing assistants like Claude have been making waves, and in 2024 we’ll see these tools become ubiquitous in content marketing. They’ll help scale content creation while maintaining quality.”

But AI can’t see the big picture for your organization. It can’t empathize with people. And it can’t be creative like you. So, AI needs you in the driver’s seat to make it work effectively.

3. SEO: “It Ain’t Over Till It’s Over.”

Some pundits think SEO is dead. But as Yogi declared, “It ain’t over till it’s over.”

That’s because SEO pros have the remarkable ability to adapt to constant change or new information. Often, this means adjusting to the latest Google algorithm updates. But this also includes rethinking strategies based on the recent Google API “leak.”

Now, Rand Fishkin and Mike King were the first to report on the leaked documents. Although Google has officially acknowledged that these internal documents are authentic, it has also cautioned against jumping to conclusions based on the leaked files alone.

What should savvy SEO pros do?

Well, I’ve known Fishkin for more than 20 years. And he has the experience, expertise, authoritativeness, and trustworthiness (E-E-A-T) you’ve heard about.

So, I’m going to follow Fishkin’s recommendations, including:

  • Hire writers with established reputational authority that Google already associates with quality content.
  • Supplement link-building with public relations to increase branded search demand. (I’ll say more on this below.)
  • “Think about SEO as being more geographically specific than you think it is even for web search results.”
  • Move beyond parsing Google’s public statements and embrace experimentation and testing to uncover what produces results.

4. Link Building: “Always Go To Other People’s Funerals; Otherwise, They Won’t Go To Yours.”

I spotted this trend a long time ago, and I spoke about it at SES London 2009 in a session titled, “Beyond Linkbait: Getting Authoritative Mentions Online.”

Back then, I said link bait tactics can be effective “if you focus on the underlying quality as well as ingenuity needed to get other websites to link to you.”

I also provided a couple of case studies that showed British SEO professionals how to “approach journalists, bloggers, and other authoritative sources to enhance your company’s online reputation, whether or not you get links.”

But getting authoritative mentions without links didn’t translate. People on the other side of the pond thought I was saying something unintentionally funny like, “Always go to other people’s funerals; otherwise, they won’t go to yours.”

Hopefully, Fishkin’s recommendation will enable a lot more SEO pros to finally understand the underlying wisdom of supplementing link building with public relations.

As he clearly explained at MozCon, “If you get a whole bunch of links in one day and nothing else, guess what? You manipulated the link graph. If you’re really a big brand, people should be talking about you.”

5. Paid Media: “It’s Déjà Vu All Over Again.”

Everyone knows that Google, Meta, and other paid media are adding AI to their advertising platforms faster than the speed of sound. So, this might be mistaken as background noise.

But I’ve spotted the signal in the noise. Today’s frenzy to provide AI solutions is remarkably like the frenzy to provide programmatic solutions a decade ago. As Yogi said, “It’s déjà vu all over again.”

This means that digital marketers – and their agencies – can quickly refresh their “programmatic” workflow and turn it into “AI” best practices.

For example, Google touted a five-step programmatic workflow five years ago.

It consisted of:

  • Organize audience insights.
  • Design compelling creative.
  • Execute with integrated technology.
  • Reach audiences across screens.
  • Measure the impact.

Why is today’s process of buying and selling digital media in an automated fashion so similar? Because AI is just fulfilling the early promise of programmatic to engage with consumers in the moments that matter most.

But there’s one significant difference between then and now.

As you’ll read below, it’s the improved ability to integrate your advertising platforms with your analytics platform to measure the impact of campaigns on brand awareness and lead generation.

6. Analytics: “You Can Observe A Lot By Watching.”

Performance marketers integrated their advertising platforms with their analytics platform more than a decade ago to measure the impact of their campaigns on “conversions.”

But brand marketers rarely focused on their analytics data because “brand awareness” was something they measured when consumers initially saw their display ads or watched their video ads.

A funny thing happened after Google Analytics 4 rolled out last summer. A “Business objectives” collection replaced the “Life cycle” collection of reports, and one business objective you can now track is “Raise brand awareness.”

For example, brand marketers can now use traffic acquisition, demographic details, user acquisition, as well as which pages and screens users visit to measure brand awareness in places that are less vulnerable to ad fraud.

Another business objective you can now track is “Generate leads.”

So, digital marketers can measure any user action that’s valuable to their organization, including:

  • Scrolling to 90% or more of their blog post.
  • Downloading a whitepaper.
  • Subscribing to their newsletter.
  • Playing at least 50% of a product video.
  • Completing a tutorial.
  • Submitting a registration form.
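Events like the ones above can be sent to GA4 server-side through the Measurement Protocol, which accepts a JSON payload of named events. Here is a minimal sketch of building such a payload; the `client_id` value and the `whitepaper_download` parameter are illustrative placeholders, not values from the article:

```python
import json

def build_ga4_event_payload(client_id, event_name, params):
    """Build a Google Analytics 4 Measurement Protocol payload.

    The payload would be POSTed to
    https://www.google-analytics.com/mp/collect?measurement_id=...&api_secret=...
    using the measurement_id and api_secret from your GA4 property.
    """
    return {
        "client_id": client_id,      # pseudonymous user identifier
        "events": [
            {
                "name": event_name,  # e.g. "generate_lead", "scroll"
                "params": params,    # arbitrary event parameters
            }
        ],
    }

# Example: record a whitepaper download as a lead-generation event.
payload = build_ga4_event_payload(
    client_id="555.1234567890",
    event_name="generate_lead",
    params={"source": "whitepaper_download", "value": 1},
)
print(json.dumps(payload, indent=2))
```

In practice most sites fire these events from the browser via gtag.js or Google Tag Manager; the server-side payload above is just one way to see the event structure explicitly.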

And as Yogi noted, “You can observe a lot by watching.”

7. Content Marketing: “When You Come To A Fork In The Road, Take It.”

In the summer of 2020, the Content Marketing Institute and MarketingProfs fielded their annual survey and found that “Content marketers are resilient. Most have met the challenges of the pandemic head-on.”

In response to the pandemic, B2B and B2C marketers:

  • Increased time spent talking with customers.
  • Revisited their customer/buyer personas.
  • Re-examined the customer journey.
  • Changed their targeting/messaging strategy.
  • Changed their distribution strategy.
  • Adjusted their editorial calendar.
  • Put more resources toward social media/online communities.
  • Changed their website.
  • Changed their products/services.
  • Adjusted their key performance indicators (KPIs).
  • Changed their content marketing metrics (e.g., set up new analytics/dashboards).

In other words, many content marketers totally overhauled their process for creating a content marketing plan from stem to stern.

For some, 2020 was the year of quickly adapting their content marketing strategy. For others, it was the year to finally develop one.

According to BrightEdge, content marketers are now “preparing for a Searchquake,” a tectonic shift in the content marketing landscape triggered by Google’s Search Generative Experience (SGE).

But content marketers now know exactly what to do. As Yogi directed, “When you come to a fork in the road, take it.”

8. Video Creation: “If You Can’t Imitate Him, Don’t Copy Him.”

I teach an online class at the New Media Academy in Dubai on “Influencer Marketing and AI.” This may seem like an odd combination of topics, but they’re related to another class I teach on “Engaging Audiences through Content.”

I tell my students that creating great content is hard. That’s why marketers start using influencers or AI to create video content that their audience will find valuable and engaging. Then, they learn that there’s more to learn.

For example, AI can create realistic and imaginative scenes from text instructions. But AI can’t be creative like humans. So, the heart of every great video is still innovative, surprising, human-led creativity.

I show them “OpenAI Sora’s first short film – ‘Air Head,’ created by shy kids,” a Toronto-based production company.

Then, I ask them to apply what they have learned by using Synthesia, Runway, or invideo AI to generate a short video for their capstone project.

Invariably, they report that AI video generators can create realistic and imaginative scenes from text instructions but aren’t creative like shy kids.

Or, as Yogi put it, “If you can’t imitate him, don’t copy him.”

9. Influencer Marketing: “Nobody Goes There Anymore. It’s Too Crowded.”

The Influencer Marketing Hub says, “Most marketers believe finding and selecting the best, most relevant influencers to be the most difficult part of influencer marketing.”

That’s ironic because HypeAuditor offers an influencer discovery platform that enables marketers to search through a database of 137.5 million influencers on Instagram, YouTube, TikTok, X (formerly Twitter), and Twitch.

It also enables marketers to apply filters to discover the perfect partners for their brand.

This apparent contradiction reminds me of Yogi’s comment, “Nobody goes there anymore. It’s too crowded.”

But it also indicates that most marketers are looking at influencer identification through the wrong end of the telescope. What should they do instead?

Well, I show the students in my “Influencer Marketing and AI” class how to use SparkToro to get a free report on the audience that searches for “Dubai.”

Infographic showcasing digital marketing trends for 2024 with monthly searches and demographics for Dubai. Image from SparkToro, June 2024

SparkToro estimates that 446,000 to 654,000 people search for “Dubai” monthly. And it uncovers the websites they visit, the keywords they search for, and their gender demographics.

Screenshot of a list showing accounts related to Dubai and their affinity scores. Image from SparkToro, June 2024

SparkToro also identifies the sources of influence for this audience, including high-affinity accounts and hidden gems, so marketers can invest in the right ones.

10. Social Media: “The Future Ain’t What It Used To Be.”

I’m a big believer in “the rule of three.”

So, I wasn’t startled when I received an email from Jennifer Radke inviting me to attend “an exciting webinar focused on a high-level look into using ChatGPT for social media!”

But I was shocked when Katie Delahaye Paine shared a link to new research by Asana’s Work Innovation Lab and Meltwater, which found that “only 28% of marketing professionals have received training on how to use AI tools effectively.”

I was also horrified when I read a column by Mark Ritson in MarketingWeek that argued, “AI’s strength is automating high-volume, short-term marketing activity, which means social media could become a cesspool of synthetic content.”

Hey, I was having lunch with Chris Shipley in 2004 when she coined the term “social media.” So, I remember when social media still had a promising future.

But, as Yogi once declared, “The future ain’t what it used to be.”

So, social media marketers have three options:

  • They can get upskilled to use AI tools more effectively.
  • They can get reskilled to identify the right influencers.
  • They can update their resumes and look for new jobs.

Picking Digital Marketing Trends Is Like Playing Moneyball

Some skeptics may question this counter-intuitive lineup of the top 10 digital marketing trends for 2024. Some of my selections seem to throw out conventional wisdom.

I recently watched the movie Moneyball (2011) for a second time. I was reminded that the Oakland Athletics baseball team’s general manager, Billy Beane (Brad Pitt), and assistant general manager, Peter Brand (Jonah Hill), used sabermetrics to analyze players.

This produced an epiphany: Picking digital marketing trends is like playing Moneyball. If you want to win against competitors with bigger budgets, then you need to find strategic insights, critical data, tactical advice, and digital marketing trends that conventional wisdom has overlooked.

And where did I come up with the whimsical idea of matching each trend with one of Yogi’s memorable quotes? Was it inspiration or hallucination?

I recently watched the documentary It Ain’t Over (2022) for the first time. It’s about New York Yankee Hall of Fame catcher Yogi Berra. And it supported Yogi’s claim, “I really didn’t say everything I said.”

But sportswriters kept attributing these Yogi-isms to the catcher because these “distilled bits of wisdom … like good country songs … get to the truth in a hurry,” as Allan Barra, the author of a book on Yogi, has explained.

And that strategic insight produced this year’s update – by a human – as opposed to last year’s top 10 digital marketing trends by ChatGPT.



Featured Image: SuPatMaN/Shutterstock

Google’s E-E-A-T & The Myth Of The Perfect Ranking Signal via @sejournal, @MattGSouthern

Few concepts have generated as much buzz and speculation in SEO as E-E-A-T.

Short for Experience, Expertise, Authoritativeness, and Trustworthiness, this framework has been a cornerstone of Google’s Search Quality Evaluator Guidelines for years.

But despite its prominence, how E-E-A-T relates to Google’s ranking algorithms remains unclear.

In a recent episode of Google’s Search Off The Record podcast, Search Director & Product Manager Elizabeth Tucker addressed this complex topic.

Her comments offer insights into how Google evaluates and ranks content.

No Perfect Match

One key takeaway from Tucker’s discussion of E-E-A-T is that no single ranking signal perfectly aligns with all four elements.

Tucker explained:

“There is no E-E-A-T ranking signal. But this really is for people to remember it’s a shorthand, something that should always be a consideration, although, you know, different types of results arguably need different levels of E-E-A-T.”

This means that while Google’s algorithms do consider factors like expertise, authoritativeness, and trustworthiness when ranking content, there isn’t a one-to-one correspondence between E-E-A-T and any specific signal.

The PageRank Connection

However, Tucker did offer an example of how one classic Google ranking signal – PageRank – aligns with at least one aspect of E-E-A-T.

Tucker said:

“PageRank, one of our classic Google ranking signals, probably is sort of along the lines of authoritativeness. I don’t know that it really matches up necessarily with some of those other letters in there.”

For those unfamiliar, PageRank is an algorithm that measures the importance and authority of a webpage based on the quantity and quality of links pointing to it.

In other words, a page with many high-quality inbound links is seen as more authoritative than one with fewer or lower-quality links.
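As a rough illustration of that idea, here is a toy power-iteration sketch of the classic PageRank calculation. Google's production signal is far more elaborate; this only shows the core mechanic that a page's score is fed by the scores of the pages linking to it:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank via power iteration.

    `links` maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its score evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# "home" is linked to by both other pages, so it ends up most authoritative.
graph = {"home": ["about"], "about": ["home"], "blog": ["home"]}
ranks = pagerank(graph)
assert max(ranks, key=ranks.get) == "home"
```

The damping factor (0.85 in the original paper) models a surfer who occasionally jumps to a random page instead of following links.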

Tucker’s comments suggest that while PageRank may be a good proxy for authoritativeness, it doesn’t necessarily capture the other elements of E-E-A-T, like expertise or trustworthiness.

Why SEJ Cares

While it’s clear that E-E-A-T matters, Tucker’s comments underscore that it’s not a silver bullet to ranking well.

Instead of chasing after a mythical “E-E-A-T score,” websites should create content that demonstrates their expertise and builds user trust.

This means investing in factors like:

  • Accurate, up-to-date information
  • Clear sourcing and attribution
  • Author expertise and credentials
  • User-friendly design and navigation
  • Secure, accessible web infrastructure

By prioritizing these elements, websites can send strong signals to users and search engines about the quality and reliability of their content.

The E-E-A-T Evolution

It’s worth noting that E-E-A-T isn’t a static concept.

Tucker explained in the podcast that Google’s understanding of search quality has evolved over the years, and the Search Quality Evaluator Guidelines have grown and changed along with it.

Today, E-E-A-T is just one of the factors that Google considers when evaluating and ranking content.

However, the underlying principles – expertise, authoritativeness, and trustworthiness – will likely remain key pillars of search quality for the foreseeable future.

Listen to the full podcast episode below:


Featured Image: salarko/Shutterstock

Google Warns Of Soft 404 Errors And Their Impact On SEO via @sejournal, @MattGSouthern

In a recent LinkedIn post, Google Analyst Gary Illyes raised awareness about two issues plaguing web crawlers: soft 404s and other “crypto” errors.

These seemingly innocuous mistakes can negatively affect SEO efforts.

Understanding Soft 404s

Soft 404 errors occur when a web server returns a standard “200 OK” HTTP status code for pages that don’t exist or contain error messages. This misleads web crawlers, causing them to waste resources on non-existent or unhelpful content.

Illyes likened the experience to visiting a coffee shop where every item is unavailable despite being listed on the menu. While this scenario might be frustrating for human customers, it poses a more serious problem for web crawlers.

As Illyes explains:

“Crawlers use the status codes to interpret whether a fetch was successful, even if the contents of the page is basically just an error message. They might happily go back to the same page again and again wasting your resources, and if there are many such pages, exponentially more resources.”

The Hidden Costs Of Soft Errors

The consequences of soft 404 errors extend beyond the inefficient use of crawler resources.

According to Illyes, these pages are unlikely to appear in search results because they are filtered out during indexing.

To combat this issue, Illyes advises serving the appropriate HTTP status code when the server or client encounters an error.

This allows crawlers to understand the situation and allocate their resources more effectively.

Illyes also cautioned against rate-limiting crawlers with messages like “TOO MANY REQUESTS SLOW DOWN,” as crawlers cannot interpret such text-based instructions.
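The advice boils down to: signal errors and rate limits with status codes, not with "200 OK plus an error message." A minimal sketch of that decision logic, where `pages` and `removed_paths` stand in for whatever your CMS or router actually knows about (both are hypothetical names):

```python
from http import HTTPStatus

def status_for_request(path, pages, removed_paths, over_rate_limit=False):
    """Pick the HTTP status a crawler should see for a given path."""
    if over_rate_limit:
        # Signal "slow down" with a status code, not a text message.
        return HTTPStatus.TOO_MANY_REQUESTS  # 429
    if path in removed_paths:
        return HTTPStatus.GONE               # 410: permanently removed
    if path not in pages:
        return HTTPStatus.NOT_FOUND          # 404: unknown page
    return HTTPStatus.OK                     # 200: real content

pages = {"/": "<h1>Home</h1>", "/pricing": "<h1>Pricing</h1>"}
removed = {"/old-promo"}

assert status_for_request("/pricing", pages, removed) == HTTPStatus.OK
assert status_for_request("/ghost", pages, removed) == HTTPStatus.NOT_FOUND
assert status_for_request("/old-promo", pages, removed) == HTTPStatus.GONE
assert status_for_request("/", pages, removed, over_rate_limit=True) == HTTPStatus.TOO_MANY_REQUESTS
```

A crawler that receives 404, 410, or 429 can stop or slow down appropriately, instead of happily re-fetching an error page served with 200.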

Why SEJ Cares

Soft 404 errors can impact a website’s crawlability and indexing.

By addressing these issues, crawlers can focus on fetching and indexing pages with valuable content, potentially improving the site’s visibility in search results.

Eliminating soft 404 errors can also lead to more efficient use of server resources, as crawlers won’t waste bandwidth repeatedly visiting error pages.

How This Can Help You

To identify and resolve soft 404 errors on your website, consider the following steps:

  1. Regularly monitor your website’s crawl reports and logs to identify pages returning HTTP 200 status codes despite containing error messages.
  2. Implement proper error handling on your server to ensure that error pages are served with the appropriate HTTP status codes (e.g., 404 for not found, 410 for permanently removed).
  3. Use tools like Google Search Console to monitor your site’s coverage and identify any pages flagged as soft 404 errors.

Proactively addressing soft 404 errors can improve your website’s crawlability, indexing, and SEO.


Featured Image: Julia Tim/Shutterstock

Google’s Search Dilemma: The Battle With ‘Not’ & Prepositions via @sejournal, @MattGSouthern

While Google has made strides in understanding user intent, Director & Product Manager Elizabeth Tucker says certain queries remain challenging.

In a recent episode of Google’s Search Off The Record podcast, Tucker discussed some lingering pain points in the company’s efforts to match users with the information they seek.

Among the top offenders were searches containing the word “not” and queries involving prepositions, Tucker reveals:

“Prepositions, in general, are another hard one. And one of the really big, exciting breakthroughs was the BERT paper and transformer-based machine learning models when we started to be able to get some of these complicated linguistic issues right in searches.”

BERT, or Bidirectional Encoder Representations from Transformers, is a neural network-based technique for natural language processing that Google began leveraging in search in 2019.

The technology is designed to understand the nuances and context of words in searches rather than treating queries as a bag of individual terms.

‘Not’ There Yet

Despite the promise of BERT and similar advancements, Tucker acknowledged that Google’s ability to parse complex queries is still a work in progress.

Searches with the word “not” remain a thorn in the search engine’s side, Tucker explains:

“It’s really hard to know when ‘not’ means that you don’t want the word there or when it has a different kind of semantic meaning.”

For example, Google’s algorithms could interpret a search like “shoes not made in China” in multiple ways.

Does the user want shoes made in countries other than China, or are they looking for information on why some shoe brands have moved their manufacturing out of China?

This ambiguity poses a challenge for websites trying to rank for such queries. If Google can’t match the searcher’s intent with the content on a page, it may struggle to surface the most relevant results.
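A toy sketch shows why a plain bag-of-words view struggles with negation: a document stating the opposite of what the searcher wants still matches almost every query term. This is purely illustrative and in no way represents Google's actual ranking systems:

```python
# Bag-of-words overlap discards word order and negation scope, so a
# document with the opposite meaning still scores highly.
query = "shoes not made in china"
doc_a = "these shoes are made in china"      # opposite of the likely intent
doc_b = "these shoes are not made in china"  # matches the likely intent

def bag_overlap(query: str, doc: str) -> int:
    """Count query terms that appear anywhere in the document."""
    return len(set(query.split()) & set(doc.split()))

print(bag_overlap(query, doc_a))  # 4 of 5 query terms match the wrong document
print(bag_overlap(query, doc_b))  # 5 of 5
```

Context-aware models like BERT exist precisely to read "not" in relation to the words around it rather than as an isolated token.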

The Preposition Problem

Another area where Google’s algorithms can stumble is prepositions, which show the relationship between words in a sentence.

Queries like “restaurants with outdoor seating” or “hotels near the beach” rely on prepositions to convey key information about the user’s needs.

For SEO professionals, this means that optimizing for queries with prepositions may require some extra finesse.

It’s not enough to include the right keywords on a page; the content needs to be structured to communicate the relationships between those keywords.

The Long Tail Challenge

The difficulties Google faces with complex queries are particularly relevant to long-tail searches—those highly specific, often multi-word phrases that make up a significant portion of all search traffic.

Long-tail keywords are often seen as a golden opportunity for SEO, as they tend to have lower competition and can signal a high level of user intent.

However, if Google can’t understand these complex queries, it may be harder for websites to rank for them, even with well-optimized content.

The Road Ahead

Tucker noted that Google is actively improving its handling of these linguistically challenging queries, but a complete solution may still be a way off.

Tucker said:

“I would not say this is a solved problem. We’re still working on it.”

In the meantime, users may need to rephrase their searches or try different query formulations to find the information they’re looking for – a frustrating reality in an age when many have come to expect Google to understand their needs intuitively.

Why SEJ Cares

While BERT and similar advancements have helped Google understand user intent, the search giant’s struggles with “not” queries and prepositions remind us that there’s still plenty of room for improvement.

As Google continues to invest in natural language processing and other AI-driven technologies, it remains to be seen how long these stumbling blocks will hold back the search experience.

What It Means For SEO

So, what can SEO professionals and website owners do in light of this information? Here are a few things to keep in mind:

  1. Focus on clarity and specificity in your content. The more you can communicate the relationships between key concepts and phrases, the easier it will be for Google to understand and rank your pages.
  2. Use structured data and other technical SEO best practices to help search engines parse your content more effectively.
  3. Monitor your search traffic and rankings for complex queries, and be prepared to adjust your strategy if you see drops or inconsistencies.
  4. Monitor Google’s efforts to improve its natural language understanding and be ready to adapt as new algorithms and technologies emerge.
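On point 2, structured data gives search engines an explicit, machine-readable statement of what a page is about. A minimal schema.org Article payload, built in Python for illustration (all field values are placeholders, not from this article):

```python
import json

# Minimal schema.org Article markup with placeholder values. The
# resulting JSON-LD is what you would embed in a page inside a
# <script type="application/ld+json"> tag.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "author": {"@type": "Person", "name": "Example Author"},
    "datePublished": "2024-06-01",
}

print(json.dumps(article, indent=2))
```

Validate real markup with a tool such as Google's Rich Results Test before deploying it.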

Listen to the full podcast episode below:

Google Completes June 2024 Spam Update Rollout via @sejournal, @MattGSouthern

Google has officially confirmed the completion of its June 2024 spam update, a week-long process aimed at enhancing search result quality by targeting websites that violate the company’s spam policies.

The update began on June 20, 2024, and was announced via Google’s Search Central Twitter account.

Google’s Search Status Dashboard shows the update finished on June 27 at 9:10 PDT.

This spam update is part of Google’s ongoing efforts to combat web spam and improve user experience.

It’s important to note that this is not the algorithmic component of the site reputation abuse update, which Google has clarified is yet to be implemented.

Key Points Of The June 2024 Spam Update

  1. The update targets websites violating Google’s spam policies.
  2. It is separate from the anticipated site reputation abuse algorithmic update.
  3. The rollout process lasted approximately one week.

Google’s spam updates typically focus on eliminating various forms of web spam, including:

  • Automatically generated content aimed solely at improving search rankings
  • Purchased or sold links intended to manipulate rankings
  • Thin, duplicated, or poor-quality content
  • Hidden redirects or other deceptive techniques

This latest update follows Google’s previous spam update in March 2024.

Despite that update’s impact, some AI-generated content performed well in search results.

An analysis by Search Engine Journal’s Roger Montti revealed that certain AI spam sites ranked for over 217,000 queries, with more than 14,900 ranking in the top 10 search results.

The June update is expected to refine Google’s spam detection capabilities further. However, as with previous updates, it may cause fluctuations in website search rankings.

Those engaging in practices that violate Google’s spam policies or heavily relying on AI-generated content may see a decline in their search visibility.

Conversely, legitimate websites adhering to Google’s guidelines may benefit from reduced competition from spammy sites in search results.

SEO professionals and website owners are advised to review their sites for spammy practices and ensure compliance with Google’s Webmaster Guidelines.

For more information about the June 2024 spam update and its potential impact, refer to Google’s official communication channels, including the Google Search Central Twitter account and the Google Search Status Dashboard.


Featured Image: ninefotostudio/Shutterstock

Google Reveals Its Methods For Measuring Search Quality via @sejournal, @MattGSouthern

How does Google know if its search results are improving?

As Google rolls out algorithm updates and claims to reduce “unhelpful” content, many wonder about the true impact of these changes.

In an episode of Google’s Search Off The Record podcast, Google Search Director of Product Management Elizabeth Tucker discusses how Google measures search quality.

This article explores Tucker’s key revelations, the implications for marketers, and how you can adapt to stay ahead.

Multifaceted Approach To Measurement

Tucker, who transitioned to product management after 15 years as a data scientist at Google, says it’s difficult to determine whether search quality is improving.

“It’s really hard,” she admitted, describing a comprehensive strategy that includes user surveys, human evaluators, and behavioral analysis.

Tucker explained:

“We use a lot of metrics where we sample queries and have human evaluators go through and evaluate the results for things like relevance.”

She also noted that Google analyzes user behavior patterns to infer whether people successfully find the information they seek.

The Moving Target Of User Behavior

Tucker revealed that users make more complex queries as search quality improves.

This creates a constantly shifting landscape for Google’s teams to navigate.

Tucker observed:

“The better we’re able to do this, the more interesting and difficult searches people will do.”

Counterintuitive Metrics

Tucker shared that in the short term, poor search performance might lead to increased search activity as users struggle to find information.

However, this trend reverses long-term, with sustained poor performance resulting in decreased usage.

Tucker cautioned:

“A measurement that can be good in the long term can be misleading in the short term.”

Quantifying Search Quality

To tackle the challenge of quantifying search quality, Google relies on an expansive (and expanding) set of metrics that gauge factors like relevance, accuracy, trustworthiness, and “freshness.”

But numbers don’t always tell the full story, Tucker cautioned:

“I think one important thing that we all have to acknowledge is that not everything important is measurable, and not everything that is measurable is important.”

For relatively straightforward queries, like a search for “Facebook,” delivering relevant results is a comparatively simple task for modern search engines.

However, more niche or complex searches demand rigorous analysis and attention, especially concerning critical health information.

The Human Element

Google aims to surface the most helpful information for searchers’ needs, which are as diverse as they are difficult to pin down at the scale Google operates.

Tucker says:

“Understanding if we’re getting it right, where we’re getting it right, where needs focus out of those billions of queries – man, is that a hard problem.”

As developments in AI and machine learning push the boundaries of what’s possible in search, Tucker sees the “human element” as a key piece of the puzzle.

From the search quality raters who assess real-world results to the engineers and product managers, Google’s approach to quantifying search improvements blends big data with human insight.

Looking Ahead

As long as the web continues to evolve, Google’s work to refine its search quality measurements will be ongoing, Tucker says:

“Technology is constantly changing, websites are constantly changing. If we just stood still, search would get worse.”

What Does This Mean?

Google’s insights can help align your strategies with Google’s evolving standards.

Key takeaways include:

  1. Quality over quantity: Given Google’s focus on relevance and helpfulness, prioritize creating high-quality, user-centric content rather than aiming for sheer volume.
  2. Embrace complexity: Develop content that addresses more nuanced and specific user needs.
  3. Think long-term: Remember that short-term metrics can be misleading. Focus on sustained performance and user satisfaction rather than quick wins.
  4. Holistic approach: Like Google, adopt a multifaceted approach to measuring your content’s success, combining quantitative metrics with qualitative assessments.
  5. Stay adaptable: Given the constant changes in technology and user behavior, remain flexible and ready to adjust your strategies as needed.
  6. Human-centric: While leveraging AI and data analytics, don’t underestimate the importance of human insight in understanding and meeting user needs.

As Tucker’s insights show, this user-first approach is at the heart of Google’s efforts to improve search quality – and it should be at the center of every marketer’s strategy as well.

Listen to the discussion on measuring search quality in the video below, starting at the 17:39 mark:


Featured Image: Screenshot from YouTube.com/GoogleSearchCentral, June 2024

SEO Cost Calculator: How Much Should You Budget For SEO Services? via @sejournal, @ChuckPrice518

Digital is the primary marketing channel for many companies.

Many owners and executives still have difficulty budgeting for online marketing.

Budgeting for SEO can be complex and influenced by factors like project scope, industry competition, and specific services needed. There is no universal formula for calculating costs.

This article explores key SEO pricing components and how to calculate and plan your budget.

What Businesses Don’t Understand About Investing In SEO

SEO is an area where you truly get what you pay for. Investing adequately in SEO services can significantly impact your online presence and business growth.

According to recent data, over half of all SEO professionals work with monthly budgets ranging from $500 to $5,000, with 28.6% reporting budgets in the $1,001-$5,000 range.

SEO budgets chart. Image from Search Engine Journal, May 2024

Many business owners are reluctant to invest in SEO, often because they lack understanding of how search marketing works and are too busy running their businesses to learn about SEO.

Most industries follow a standardized, step-by-step process to achieve specific outcomes.

Many business owners mistakenly assume SEO works the same way, treating it as a commodity.

This misconception leads them to fall for low-cost offers like $99/month “guaranteed page one” services from spammers and scammers, which never deliver meaningful results.

The Cost Of Cheap SEO

I belong to several internet marketing groups on Facebook. The number of noobs posing as SEO professionals and taking on clients is truly frightening.

It’s common to see a question like: “I just landed a client that wants to rank for [keyword x] – how do I do it?”

A close second is people using link schemes, specifically private blog networks and third-party pages known as parasite SEO, without ever explaining the risk to clients. Many use AI to generate content at scale without fact-checking.

However, AI can be a powerful tool when used ethically in SEO.

AI helps automate data analysis, identify patterns, and streamline content creation and optimization, which in turn helps to reduce SEO costs.

If business owners were just throwing money away by hiring an incompetent SEO, that would be bad enough. Unfortunately, the collateral damage from “cheap SEO” can go much deeper.

It can draw a Google penalty and virtually wipe out a website’s visibility on the web.

Business owners must remember that they’re ultimately responsible for any SEO work performed on their site. They should discuss the specific tactics service providers use before entering into an agreement.

Managing Your Resources

With Google utilizing 200+ (and likely many more) ranking factors, it’s easy to become intimidated and paralyzed.

The good news is that if you focus on just three factors, you can still crush it, regardless of your niche.

Here’s what you need to pay attention to:

1. Information Architecture

Your site should:

2. Content

Your site’s content should conform to best practices as disclosed in the Search Quality Ratings Guidelines with an emphasis on:

3. Backlinks

  • It must be natural. Avoid popular link schemes like private blog networks (PBNs) and paid guest posts. Instead, focus on building real links from topically relevant websites with high-quality content.
  • Quality is key: A lower number of high trust/high authority/relevant links can outperform a large quantity of lower quality links.

You Manage What You Measure – Set Goals

Before establishing a budget, you must define specific goals for the campaign.

Your goals should include measurable results, a defined timeframe, and an actual measurement for success.

At one time, success was measured solely by keyword rankings. While rankings remain an important metric, they are not the most important.

I would argue that the most important metrics are those that directly impact the bottom line. Organic sessions, goal conversions, and revenue fall into that category.

Goal setting could include improving organic sessions by X%, increasing conversions by Y per month, and/or increasing revenues by Z%.

When setting goals, it’s important to keep a couple of things in mind.

First, they need to be achievable. Stretch goals are fine, but pie-in-the-sky benchmarks can actually work as a disincentive.

Equally important: you need to give the campaign time to work.

According to Google,

“…in most cases, SEOs need four months to a year to help your business first implement improvements and then see potential benefit.”

Developing A Budget

Your goals will determine what tactics are needed for success. This, in turn, sets up a framework for developing an action plan and the budget necessary to support that plan.

This brings us full circle to positioning and paying attention to those factors that move the dial.

The answers to those questions will determine priorities as well as the volume of work needed to reach your goals.

In many cases, the actual work performed will be the same, regardless of budget level. The difference is the volume of work performed.

If you add twice the content and twice the links at budget level “B” compared to budget level “A,” you are more likely to achieve earlier success at the higher budget.

That said, the right budget is one you can afford, without losing sleep, for a minimum of 6 and ideally 12 months.

It takes time to properly plan, implement, and tweak a campaign to evaluate its success.

Also, remember that the lower the budget, the longer the journey.

How Much Can You Expect To Spend On SEO?

To execute a local campaign, you could budget between $1,001 and $5,000 per month, the most common budget range among SEO professionals SEJ surveyed in 2023.

The budget will likely be higher for a national or international campaign, with many SEO pros working with budgets exceeding $10,000 per month for broader campaigns.

Some firms offer a “trial package” at a lower price with no contract. This allows prospective clients to test their services while minimizing risk.

There are some options if you can’t afford to retain a top-level SEO pro. The most common is a one-time website SEO audit with actionable recommendations.

Just fixing your website will often lead to a meaningful boost in organic traffic. Content development and keyword analysis are other areas where you can get help from a pro for a one-time fixed rate.

Another option is to become an expert and do it yourself.

SEO Cost Calculator – Measuring Organic Search (SEO) ROI

The following is a calculator commonly used for (incorrectly) measuring return on investment for SEO.

Organic Search ROI Calculation Assuming “One Shots”

Example: selling blue widgets

  • Number of new customers acquired via organic search in a given month: 10
  • Average net income (profit) per order: $100
  • Total profits from new organic search customers in a given month: $1,000
  • Monthly marketing budget (expense): $2,500
  • ROI: monthly profits from new customers ($1,000) minus spend, divided by spend ($2,500) = -60%

The flaw in the above calculator is that it fails to take into consideration the lifetime value of a new customer.

Online retailers need repeat business to grow. By not calculating the lifetime value of a new customer, the true ROI is grossly understated.

The right way to calculate ROI is to build lifetime value into the calculator.

To calculate the cost of SEO and its true ROI, use this formula:

Average lifetime profits from new customers acquired in one month, minus monthly organic marketing spend, divided by that spend.

Organic Search SEO ROI Calculation Assuming Lifetime Value

Same example: selling blue widgets

  • Number of new customers acquired via organic search in a given month: 10
  • Average net income (profit) per order: $100
  • Total profits from new organic search customers in a given month: $1,000
  • Average number of orders per customer over a “lifetime”: 5
  • Total average lifetime profit: $5,000
  • Monthly marketing budget (expense): $2,500
  • ROI: average lifetime profits from new customers ($5,000) minus spend, divided by spend ($2,500) = 100%

As you can see, that one variable makes a huge difference in how the ROI is stated.
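The two calculations above reduce to a single formula. A minimal sketch using the article’s illustrative numbers and the standard net-ROI convention, (profit − spend) ÷ spend:

```python
def roi_pct(profit: float, spend: float) -> float:
    """Net ROI as a percentage: (profit - spend) / spend * 100."""
    return (profit - spend) / spend * 100.0

customers = 10            # new organic-search customers in a month
profit_per_order = 100.0  # average net profit per order
monthly_spend = 2500.0    # monthly organic marketing budget

one_shot_profit = customers * profit_per_order  # $1,000: first order only
lifetime_profit = one_shot_profit * 5           # $5,000: 5 orders per customer

print(roi_pct(one_shot_profit, monthly_spend))  # -60.0
print(roi_pct(lifetime_profit, monthly_spend))  # 100.0
```

Swapping in lifetime profit is the only change between the two runs, which is the article’s point: the lifetime-value variable flips the campaign from an apparent loss to a clear gain.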

SEO Campaigns Are Long-Term Investments

Unlike PPC, an organic search campaign will not yield immediate results.

A comprehensive SEO campaign will involve a combination of technical SEO, content marketing, and link-building. Even when executed to perfection, it takes time for Google to recognize and reward these efforts.

That said, the traffic earned from these efforts is often the most consistent and highest-converting among all channels.

FAQ

How do SEO professionals measure success?

The top metrics used to measure SEO performance are click-through rate (CTR), keyword rankings, and branded vs. non-branded traffic.

What is the most common budget range for SEO campaigns?

The most common SEO budget range is between $1,001 and $5,000 per month, with 28.6% of respondents working within this range.

What are the primary factors that affect SEO budgeting?

Determining the appropriate budget for SEO involves considering several key components that can influence the overall cost. These factors include the scope of the SEO project, the level of competition within the industry, and the specific types of SEO services that are required. For example, a small business in a niche market with low competition might budget around $1,000 per month for local SEO services, focusing on optimizing its Google My Business profile and building local citations. In contrast, an ecommerce company targeting an audience in a highly competitive industry might need to budget $5,000 to $10,000 monthly for a comprehensive SEO strategy that includes extensive link building and technical SEO audits.

What risks are associated with choosing low-cost SEO services?

Opting for low-cost SEO services poses significant risks to a business. These services often fail to comply with ethical SEO practices, resulting in the use of tactics such as link schemes or private blog networks (PBNs), which can be detrimental to a site’s reputation and rankings. Such practices can potentially attract penalties from Google, severely compromising a site’s online visibility and trustworthiness. It is crucial for business owners to be vigilant and discerning when selecting SEO professionals to avoid these harmful consequences.

More resources:


Featured Image: pattarawat/Shutterstock

Google: “Our Ranking Systems Aren’t Perfect” via @sejournal, @martinibuster

Google’s SearchLiaison responded on X (formerly Twitter) to a plea about ridiculously poor search results, acknowledging that Google’s reviews algorithm could be doing a better job and outlining what’s being done to stop rewarding sites that shouldn’t be ranking in the first place.

Questioning Google’s Search Results

The exchange with Google began with a post about a high-ranking site that allegedly falls short of Google’s guidelines.

@dannyashton tweeted:

“This review has been ranking #1 on Google for “Molekule Air Mini+ review” for the past six months.

It is 50% anecdotal and 50% marketing messaging. It doesn’t share in-depth original research.

So, how did they make it to the top of Google?”

Followed by:

“Instead of a third-party review (which is likely what searchers are looking for), Google ranks an article backed by the brand:

Searchers land in an advertorial built off marketing materials:

So little care that they even left briefing notes in the published version 😞

And I think I found the reason why it ranks #1… Money.”

The general responses to the tweets were sympathetic, such as this one:

“WILD.

And this is on page 1…

Is this what writing for readers is? Is this what people need/want?

I think of folks like my mom here who wouldn’t know better and to dig more.

It looks and seems nice, must be trustworthy.

I mean, that’s their goals, right? Dupe and dip.”

Google’s Algorithms Aren’t Perfect

SearchLiaison responded to those tweets, explaining that he personally goes through the feedback submitted to Google and discusses it with the search team. He also noted the monumental scale of ranking websites: Google indexes trillions of web pages, so the ranking process must be scaled and automated.

SearchLiaison tweeted:

“Danny, I appreciate where you’re coming from — just as I appreciated the post that HouseFresh originally shared, as well as this type of feedback from others. I do. I also totally agree that the goal is for us to reward content that’s aligned with our guidance. From the HouseFresh post itself, there seemed to be some sense that we had actually improved over time:

“In our experience, each rollout of the Products Review Update has shaken things up, generally benefitting sites and writers who actually dedicated time, effort, and money to test products before they would recommend them to the world.”

That said, there’s clearly more we should be doing. I don’t think this is particularly new, as I’ve shared before that our ranking systems aren’t perfect and that I see content that we ought to do better by, as well as content we’re rewarding when we shouldn’t.

But it’s also not a system where any individual reviews content and says “OK, that’s great — rank it better” or “OK that’s not great, downrank it.” It simply wouldn’t work for a search engine that indexes trillions of pages of content from across the web to operate that way. You need scalable systems. And you need to keep working on improving those systems.

That’s what we’ll keep doing. We’re definitely aware of these concerns. We’ve seen the feedback, including the feedback from our recent form. I’ve personally been through every bit of that feedback and have been organizing it so our teams can look further at different aspects. This is in addition to the work they’re already doing, based on feedback we’ve already seen.”

Some takeaways from SearchLiaison’s statement are:

1. Google agrees that their algorithms should reward content that is aligned with their guidance (presumably guidance about good reviews, helpfulness, and spam).

2. He acknowledged that the current ranking systems can still improve at rewarding useful content and not rewarding inappropriate content.

3. Google’s systems are scaled.

4. Google is committed to listening to feedback and working toward improving their algorithms.

5. SearchLiaison confirmed that they are reviewing the feedback and organizing it for further analysis to identify what needs attention for improvement to rankings.

What Is Taking So Long To Fix Google?

Someone else questioned Google’s process for rolling out updates that subsequently shake things up. It’s a good question because it makes sense to test an update to rankings to make sure the changes improve the quality of sites being ranked, not the opposite.

@mikefutia tweeted:

“Danny, aren’t all your ‘system improvements’ fully tested BEFORE rolling them out?

Surely your team was aware of the shakeup in the SERPs that these last few updates would cause.

Completely legitimate hobby sites written by passionate creators getting absolutely DECIMATED by these updates.

All in favor of Reddit, Pinterest, Quora, Forbes, Business Insider, and other nonsense gaining at their expense.

I guess what I’m saying is — surely this was not a surprise.

You guys knew this carnage was coming as a direct result of the updates.

And now — here we are, NINE months later — and there have been ZERO cases of these legitimate sites recovering. In fact, the March update just made it 100x worse.

And so Google is saying ‘yeah we f-d up, we’re working on it.’

But the question is—and I think I speak on behalf of thousands of creators when I ask—’What the hell is taking so long?’”

We know that Google’s third party quality raters review search results before an update is rolled out. But clearly there are many creators, site owners and search marketers who feel that Google’s search results are going the wrong way with every update.

SearchLiaison’s response is a good one because it acknowledges that Google is not perfect and that they are actively trying to improve the search results. But that does nothing to help the thousands of site owners who are disappointed in the direction that Google’s algorithm is headed.

Featured Image by Shutterstock/ivan_kislitsin

Google Announces New GA4 Features As Universal Analytics Sunset Nears via @sejournal, @MattGSouthern

As the July 1, 2024 shutdown date for Universal Analytics (UA) draws near, Google has announced new features and improvements for Google Analytics 4 (GA4).

These enhancements give marketers deeper insights and tools for cross-channel measurement and budget optimization.

Expanded Cross-Channel Reporting

GA4 is getting improved cross-channel reporting capabilities.

You will soon be able to integrate data from third-party advertising partners such as Pinterest, Reddit, and Snap directly into GA4 properties.

This will allow for a more complete view of campaign performance across platforms.

Additionally, GA4 will introduce aggregated impressions from linked Campaign Manager 360 accounts in the advertising workspace.

This feature will give advertisers a thorough overview of campaign performance across the entire marketing funnel.

AI-Powered Insights

Google is leveraging its AI capabilities to provide users with generated insights.

These AI-driven summaries will explain data trends and fluctuations using plain language, enabling businesses to make faster, more informed decisions based on their analytics data.

Advanced Planning & Budgeting Tools

Later this year, GA4 will introduce cross-channel budgeting features, including a projections report.

This tool will allow advertisers to track media pacing and projected performance against target objectives across multiple channels.

This addition should improve marketers’ ability to optimize media spend and allocate budgets more effectively.

Privacy-First Approach

GA4 continues to prioritize user privacy while delivering effective measurement solutions.

Upcoming features include support for Chrome Privacy Sandbox APIs and improvements to enhanced conversions.

Google says these updates will offer a complete picture of cross-channel conversion attribution in a privacy-safe manner.

Preparing For The Future

Steve Ganem, Director of Product Management for Google Analytics, highlights the platform’s commitment to adaptability:

“Google Analytics 4 is truly built to be durable for the future. We’ll continue to invest in giving you a tool that helps answer fundamental questions about your business across your consumer’s entire path to purchase, despite ongoing changes in the measurement landscape.”

As the sunset date for Universal Analytics approaches, Google encourages users who haven’t yet made the switch to complete their migration to GA4.

The company also reminds UA users to download any historical data they wish to retain before the July 1 shutdown date.


Featured Image: Muhammad Alimaki/Shutterstock