Google Quietly Ends Covid-Era Rich Results via @sejournal, @martinibuster

Google has removed the Covid-era structured data associated with the Home Activities rich results, which had allowed online events to be surfaced in Search since August 2020, and published a note about the removal in the Search documentation changelog.

Home Activities Rich Results

The structured data for the Home Activities rich results allowed providers of online livestreams, pre-recorded sessions, and other online events to make those events findable in Google Search.

The original documentation has been completely removed from the Google Search Central webpages and now redirects to a changelog notation explaining that the Home Activities rich result is no longer available for display.

The original purpose was to allow people to discover things to do from home while in quarantine, particularly online classes and events. Google’s rich results surfaced details on how to watch, descriptions of the activities, and registration information.

Providers of online events were required to use Event or Video structured data. Publishers and businesses that have this kind of structured data should be aware that the rich result is no longer surfaced. It’s not necessary to remove the structured data if doing so is a burden; publishing structured data that isn’t used for rich results doesn’t hurt anything.
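
For reference, the kind of markup these providers used looks roughly like the following minimal sketch, combining the schema.org Event type with a VirtualLocation; every name, date, and URL here is a placeholder value:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Online Yoga Class",
  "startDate": "2020-09-01T18:00:00-05:00",
  "eventAttendanceMode": "https://schema.org/OnlineEventAttendanceMode",
  "location": {
    "@type": "VirtualLocation",
    "url": "https://example.com/live-stream"
  },
  "organizer": {
    "@type": "Organization",
    "name": "Example Studio",
    "url": "https://example.com"
  }
}
</script>

As noted above, leaving markup like this in place doesn’t hurt anything; it simply no longer triggers the Home Activities display.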

The changelog for Google’s official documentation explains:

“Removing home activity documentation
What: Removed documentation on home activity structured data.

Why: The home activity feature no longer appears in Google Search results.”

Read more about Google’s Home Activities rich results:

Google Announces Home Activities Rich Results

Read the Wayback Machine’s archive of Google’s original announcement from 2020:

Home activities

Featured Image by Shutterstock/Olga Strel

YouTube Rolls Out Thumbnail A/B Testing To All Channels via @sejournal, @MattGSouthern

YouTube will launch a new “Thumbnail Test & Compare” feature for all channels over the next few weeks.

This tool allows you to upload and test up to three different thumbnails for each video to see which performs best.

How Thumbnail Testing Works

The ‘Thumbnail Test & Compare’ feature lets you upload multiple thumbnail options when publishing a new YouTube video.

During the testing period, YouTube will randomly display the different thumbnails to some of the video’s viewers.

After collecting enough data, which takes around two weeks, YouTube analyzes which thumbnail generated the highest “watch time share” from viewers.

It will then designate one of three potential outcomes:

  • Winner: A clear winner outperforming the other options based on watch time. The winning thumbnail is automatically applied.
  • Preferred: One thumbnail likely performed better than others, but the results are less certain statistically.
  • None: No thumbnail emerged as a clear winner. The original uploaded thumbnail is kept.

You can manually select your preferred video thumbnail even if it isn’t the winning option.

For a full demonstration, see the video below:

YouTube Thumbnail Best Practices

As part of the demonstration, YouTube outlined best practices for designing and testing thumbnails.

YouTube suggests creators start by testing thumbnails on a limited number of older videos to get initial guidance. Then, apply any learnings to testing thumbnails for more recent videos.

For thumbnail design itself, YouTube offers these tips:

Balance & Focal Point

“Ensure your images are balanced with a focal point to direct viewers’ attention towards.”

High Contrast

“Utilize high contrast allowing your subject to stand out against the background in both Light and Dark mode.”

Facial Expressions

“If there’s a face in your thumbnail, consider emotion. Be expressive and consider how you want viewers to feel when seeing your thumbnail.”

Concise Text

“With text, remember that fewer words can be impactful while too much text can be difficult to process while scrolling.”

Depth & Blank Space

“When it comes to depth of field keep your background in mind and play with blank space.”

Rollout To All Eligible Channels

All channels can access the ‘Thumbnail Test & Compare’ feature in the YouTube Studio desktop application. To do so, the “Advanced features” setting must be enabled.

YouTube is gradually rolling this out over the next few weeks to all channels that have opted in and meet those requirements.

The company says it will provide updates on expanding availability, such as potential mobile app support, in the future.

Optimizing For Watch Time

In an FAQ addressing common questions, YouTube explains that thumbnails are judged solely based on their ability to drive watch time, not other metrics like click-through rates.

YouTube states:

“We want to make sure that your thumbnail and content gets you the highest amount of viewer engagement, so we are optimizing for overall watch time share over other metrics.

We believe that this metric is the best way to guide your content strategy decisions & support your chances of success on the platform.”

Why SEJ Cares

The Thumbnail Test & Compare tool addresses a pain point by allowing true A/B testing. Previously, creators had to rely on best guesses or small-sample polls when selecting thumbnails for new videos.

By optimizing for watch time as the key success metric, YouTube is putting an emphasis on long-term viewer engagement over short-term clicks.

However, it’s understandable that some channels may also want data on how thumbnails impact initial impressions and click-through rates.

How This Can Help You

Smarter, higher-performing thumbnails could boost your content in YouTube’s recommendations and keep viewers watching more videos.

Video openers and thumbnails are the first make-or-break moments on YouTube, so having data-backed tools to perfect those first impressions could be a difference-maker.


Featured Image: Chayjitti Hongmanee/Shutterstock

Google’s Structured Data Update May Boost Merchant Sales via @sejournal, @martinibuster

Google updated their structured data guidelines to reflect support for a sitewide return policy within the Organization structured data. This eliminates the need to repeat return policy information in every product listing’s structured data and can result in more traffic and sales for online merchants.

This doesn’t mean merchants are required to change their current structured data; the old method remains valid. This simply adds a more streamlined alternative that reduces the size of product structured data.

Improvement To Brand Knowledge Panel

Google’s change to the Organization structured data will be reflected in the brand panel that Google shows when someone searches for a brand name. The updated brand panel will feature a new entry showing the company’s return policy.

Screenshot Of Brand Knowledge Panel Example

Benefits Of Organization-Level Return Policy

As part of this change, Google is adding search features in Knowledge Panels and Brand Panels that can show a merchant’s return policies. This means a merchant’s listing will be eligible to show a return policy, which in turn can encourage a higher clickthrough rate from the search engine results pages (SERPs) and a higher conversion rate.

Research conducted by the International Council of Shopping Centers (ICSC) in 2024 shows that online shoppers are strongly influenced by a merchant’s returns policy.

They discovered:

“82% of respondents said that when shopping online, return policies influence whether they decide to purchase from a retailer.

… If retailers charged a fee to ship back purchases made online, nearly three-fourths (71%) of respondents said they’d likely stop shopping online from that company altogether, while 6 in 10 said they’d likely stop shopping online with retailers that shortened the free return window.”

Clearly, a return policy can be a way to generate more online sales, and Google’s new support for sitewide return policy structured data helps communicate that information to online shoppers directly from search.

Google’s announcement explained:

“A return policy is a major factor considered by shoppers when buying products online, and so last year we enabled the extraction of structured data return policies for individual products. Today we’re adding support for return policies at the organization level as well, which means you’ll be able to specify a general return policy for your business instead of having to define one for each individual product you sell.

Adding a return policy to your organization structured data is especially important if you don’t have a Merchant Center account and want the ability to provide a return policy for your business. Merchant Center already lets you provide a return policy for your business, so if you have a Merchant Center account we recommend defining your return policy there instead.

…If your site is an online or local business, we recommend using one of the OnlineStore, or LocalBusiness subtypes of Organization.

We hope this addition makes it easier for you to add return policies for your business, and enable them to be shown across Google shopping experiences.”

Google Updates Organization Structured Data Documentation

Google added a new section to their Organization structured data documentation to reflect support for this new way to show return policies in the search results.

The new documentation states:

“MerchantReturnPolicy
Use the following properties to describe general return policies for your entire Organization, if applicable to your business. If you have specific policies for individual products, use merchant listing markup instead.”
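
Based on that documentation and the schema.org vocabulary it references, a sitewide return policy might look something like this minimal sketch; the store name, URL, and policy values are placeholders to adapt to your own business:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "OnlineStore",
  "name": "Example Store",
  "url": "https://www.example.com",
  "hasMerchantReturnPolicy": {
    "@type": "MerchantReturnPolicy",
    "applicableCountry": "US",
    "returnPolicyCategory": "https://schema.org/MerchantReturnFiniteReturnWindow",
    "merchantReturnDays": 30,
    "returnMethod": "https://schema.org/ReturnByMail",
    "returnFees": "https://schema.org/FreeReturn"
  }
}
</script>

The OnlineStore type follows Google’s recommendation, quoted above, to use an OnlineStore or LocalBusiness subtype of Organization.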

Read Google’s announcement:

Adding markup support for organization-level return policies

Read the new MerchantReturnPolicy documentation on Google’s official Organization structured data page:

Organization structured data – MerchantReturnPolicy

Google’s Gary Illyes: Lastmod Signal Is Binary via @sejournal, @MattGSouthern

In a recent LinkedIn discussion, Gary Illyes, Analyst at Google, revealed that the search engine takes a binary approach when assessing a website’s lastmod signal from sitemaps.

The revelation came as Illyes encouraged website owners to upgrade to WordPress 6.5, which now natively supports the lastmod element in sitemaps.

When Mark Williams-Cook asked if Google has a “reputation system” to gauge how much to trust a site’s reported lastmod dates, Illyes stated, “It’s binary: we either trust it or we don’t.”

No Shades Of Gray For Lastmod

The lastmod tag indicates the date of the most recent significant update to a webpage, helping search engines prioritize crawling and indexing.

Illyes’ response suggests Google doesn’t factor in a website’s history or gradually build trust in the lastmod values being reported.

Google either accepts the lastmod dates provided in a site’s sitemap as accurate, or it disregards them.

This binary approach reinforces the need to implement the lastmod tag correctly and only specify dates when making meaningful changes.
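
In sitemap terms, that means an entry like the following sketch (the URL and date are placeholders), with the lastmod value updated only when the page itself meaningfully changes:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/sample-page</loc>
    <!-- Change this date only when the page content meaningfully changes -->
    <lastmod>2024-04-15</lastmod>
  </url>
</urlset>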

Illyes commended the WordPress developer community for their work on version 6.5, which automatically populates the lastmod field without extra configuration.

Accurate Lastmod Essential For Crawl Prioritization

While convenient for WordPress users, the native lastmod support is only beneficial if Google trusts you’re using it correctly.

Inaccurate lastmod tags could lead to Google ignoring the signal when scheduling crawls.

Illyes’ confirmation of Google’s stance shows there’s no room for error when using this tag.

Why SEJ Cares

Understanding how Google acts on lastmod can help ensure Google displays new publish dates in search results when you update your content.

It’s an all-or-nothing situation – if the dates are deemed untrustworthy, the signal could be disregarded sitewide.

With the information revealed by Illyes, you can ensure your implementation follows best practices to the letter.


Featured Image: Danishch/Shutterstock

Google Reminds Websites To Use Robots.txt To Block Action URLs via @sejournal, @MattGSouthern

In a LinkedIn post, Gary Illyes, an Analyst at Google, reiterated long-standing guidance for website owners: Use the robots.txt file to prevent web crawlers from accessing URLs that trigger actions like adding items to carts or wishlists.

Illyes highlighted the common complaint of unnecessary crawler traffic overloading servers, often stemming from search engine bots crawling URLs intended for user actions.

He wrote:

“Looking at what we’re crawling from the sites in the complaints, way too often it’s action URLs such as ‘add to cart’ and ‘add to wishlist.’ These are useless for crawlers, and you likely don’t want them crawled.”

To avoid this wasted server load, Illyes advised blocking access in the robots.txt file for URLs with parameters like “?add_to_cart” or “?add_to_wishlist.”

As an example, he suggests:

“If you have URLs like:
https://example.com/product/scented-candle-v1?add_to_cart
and
https://example.com/product/scented-candle-v1?add_to_wishlist

You should probably add a disallow rule for them in your robots.txt file.”
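
Expressed as robots.txt rules, that suggestion might look like the following sketch. The patterns mirror Illyes’ example parameters, and the * wildcard (which Google’s robots.txt parser supports) matches any product path; adjust them to your own site’s action URLs:

User-agent: *
# Block action URLs that only trigger cart or wishlist behavior
Disallow: /*?add_to_cart
Disallow: /*?add_to_wishlist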

While using the HTTP POST method can also prevent the crawling of such URLs, Illyes noted crawlers can still make POST requests, so robots.txt remains advisable.
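
To illustrate the POST alternative, an add-to-cart control built as a form rather than a crawlable GET link might look like this hypothetical sketch (the action path is a placeholder):

<!-- Submitting via POST avoids exposing a crawlable "?add_to_cart" GET URL.
     Crawlers can still send POST requests, so keep the robots.txt rule too. -->
<form action="/product/scented-candle-v1/cart" method="post">
  <button type="submit">Add to cart</button>
</form>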

Reinforcing Decades-Old Best Practices

Alan Perkins, who engaged in the thread, pointed out that this guidance echoes web standards introduced in the 1990s for the same reasons.

Quoting from the 1994 document titled “A Standard for Robot Exclusion”:

“In 1993 and 1994 there have been occasions where robots have visited WWW servers where they weren’t welcome for various reasons…robots traversed parts of WWW servers that weren’t suitable, e.g. very deep virtual trees, duplicated information, temporary information, or cgi-scripts with side-effects (such as voting).”

The robots.txt standard, proposing rules to restrict well-behaved crawler access, emerged as a “consensus” solution among web stakeholders back in 1994.

Obedience & Exceptions

Illyes affirmed that Google’s crawlers fully obey robots.txt rules, with rare exceptions thoroughly documented for scenarios involving “user-triggered or contractual fetches.”

This adherence to the robots.txt protocol has been a pillar of Google’s web crawling policies.

Why SEJ Cares

While the advice may seem rudimentary, the re-emergence of this decades-old best practice underscores its relevance.

By leveraging the robots.txt standard, sites can stop overzealous crawlers from hogging bandwidth with unproductive requests.

How This Can Help You

Whether you run a small blog or a major e-commerce platform, following Google’s advice to leverage robots.txt for blocking crawler access to action URLs can help in several ways:

  • Reduced Server Load: You can reduce needless server requests and bandwidth usage by preventing crawlers from hitting URLs that invoke actions like adding items to carts or wishlists.
  • Improved Crawler Efficiency: Giving more explicit rules in your robots.txt file about which URLs crawlers should avoid can lead to more efficient crawling of the pages/content you want to be indexed and ranked.
  • Better User Experience: With server resources focused on actual user actions rather than wasted crawler hits, end-users will likely experience faster load times and smoother functionality.
  • Stay Aligned with Standards: Implementing the guidance puts your site in compliance with the widely adopted robots.txt protocol standards, which have been industry best practices for decades.

Revisiting robots.txt directives could be a simple but impactful step for websites looking to exert more control over crawler activity.

Illyes’ messaging indicates that the ancient robots.txt rules remain relevant in our modern web environment.


Featured Image: BestForBest/Shutterstock

Something Weird Is Going On In Google’s SERPs via @sejournal, @martinibuster

People are always complaining that there’s something wrong with Google’s search results, but what’s going on with the results for queries containing the acronym “SEO” is in a class by itself and has to be seen to be believed.

Anomalies In Search Results

An anomaly is something that deviates from the norm or what’s expected. A lot of the time, when there’s something wrong with the search engine results pages (SERPs), the anomaly is explainable. For example, queries that combine a geographical element with a relatively longtail phrase tend to generate weird results. Another driver of strange search results is when there simply isn’t enough data about a specific combination of words, which sometimes leads to offensive search results.

What’s happening with a particular group of keyword phrases related to the word “SEO” is not any of those kinds of anomalies. It’s a true anomaly.

Here are the keywords that Google is (arguably) getting wrong:

  • SEO program
  • What is an SEO program?
  • SEO New York (City)
  • SEO NYC
  • SEO Conference
  • SEO Events
  • SEO Education
  • SEO Awards
  • SEO-USA.Org

The site that’s ranking for all those SEO search queries (and probably more) is called SEO-USA.org. The acronym SEO in that website stands for Sponsors for Educational Opportunity. It’s not a spam site; it’s a legitimate non-profit website that’s been around since 1963. The purpose of the non-profit is to provide mentorship to underserved young people to help them get into colleges and universities. That program evolved into SEO Scholars, an eight-year academic program that helps talented young people through high school and college.

“SEO Scholars creates a more equitable society by closing the academic opportunity gap for motivated young people, setting the standard for academics, mentorship, community, peer-to-peer support, and a powerful, lifelong network.”

SEO-USA.org Is Not Relevant For SEO

The acronym SEO is heavily associated with online marketing. A search for “SEO” in Google spawns suggestions that are all relevant to SEO in the sense of search marketing.

Google Trends shows that the phrases “SEO Scholars” and “SEO Scholars Application” are not widely searched in the United States; most of the searches occur in New York. Yet SEO-USA.org is top ranked for the group of keywords listed above in areas outside of New York.

Screenshot Of SERPs For Keyword Phrase “SEO Awards”

It’s kind of obvious that SEO-USA.org is not relevant for the most commonly understood meaning for the acronym SEO.

Could Backlinks Be The Reason?

It’s possible that the reason SEO-USA.org is ranking for all of those phrases is backlinks. A search for the domain name, restricted to .edu sites, shows almost seventy .edu websites that link to the SEO-USA.org domain name.

This is the advanced search that shows scores of .edu sites that link to or mention SEO-USA.org:

"seo-usa.org" site:.edu"

Screenshot Of Site:.EDU Search

There is also a large number of high-quality sites with .org domains that link to SEO-USA.org, which is observable using the following advanced search:

"seo-usa.org" site:.org -site:seo-usa.org"

On the surface it looks clear that backlinks are the reason why SEO-USA.org ranks for irrelevant keywords.

But of course, the most obvious answer isn’t always the right answer. There’s more to the picture.

Why Links Probably Don’t Explain The Rankings

If links were the reason for SEO-USA.org’s rankings, then it would follow that virtually every keyword phrase related to SEO would be littered with .edu and .org websites, but that’s not the case.

I’ve been doing SEO for about 25 years now and I remember the days when sites that had the maximum level of PageRank used to rank for virtually anything. Also, dot edu links were regarded as powerful because SEOs were able to rank quite well with them.

Google’s algorithms improved, and the effect of .edu links started to wane because the context of a link started counting more: the words in the title element and in the surrounding text influenced how much a link counted. I know this, too, from my experience.

Another important change in Google’s link ranking algorithms was to dampen the effect of the sheer quantity of links. It used to be that an avalanche of links was enough to help a site rank above more authoritative sites. I know this from my experience, too.

But the effect of a huge number of links also changed in many ways. For example, hundreds of links from one domain stopped counting as hundreds of links and began counting as just one link. The position of a link within a page also mattered more. There were lots of changes that whittled down the power of links, so that fewer and fewer links counted for the wrong reasons.

I’m kind of skeptical that links are the reason why SEO-USA.org ranks.

What’s The Answer?

For some reason, a relevance factor is not kicking in, which allows the (arguably) irrelevant SEO-USA.org site to rank for keywords it probably shouldn’t rank for.

I think that’s a clue to why that site is ranking where it should not. It’s slipping through because something is missing that would ordinarily be there to keep it out.

It may very well be that there’s a factor related to trustworthiness that is allowing that site to slip through. That’s just speculation. Do you have any ideas?

Featured Image by Shutterstock/SS 360

Google Ranking Systems & Signals: How To Adapt Your SEO Strategy In 2024 & Beyond via @sejournal, @sejournal

Have you noticed a dip in your search rankings lately?

Are you feeling frustrated and anxious about your website’s performance?

Given the state of SEO this past year, we’d be surprised if you didn’t.

As the search landscape continues to evolve, we’re seeing a surge in volatility, with high-quality content often outranked by spam pages.

And with Google’s algorithms becoming more and more complex, traditional best practices no longer seem to cut it.

So, what does this mean for you and your strategy?

How can you navigate these complexities and boost your search rankings?

Our new ebook, Google Ranking Systems & Signals 2024, is the ultimate resource for understanding the recent ranking trends and unlocking sustainable SEO success.

You’ll get expert insights and analysis from seasoned SEO professionals, digital marketing strategists, industry thought leaders, and more.

Our featured experts include:

  • Adam Riemer, President, Adam Riemer Marketing.
  • Aleh Barysevich, Founder, SEO PowerSuite.
  • Andrea Volpini, Co-Founder and CEO, WordLift.
  • Dan Taylor, Partner & Head of Technical SEO, SALT.agency.
  • Erika Varangouli, Head of Branded Content at Riverside.fm.
  • Helen Pollitt, Head of SEO, Car & Classic.
  • Kevin Indig, Writer of the Growth Memo.
  • Kevin Rowe, Founder & Head of Digital PR Strategy, PureLinq.
  • Ludwig Makhyan, Global Head of Technical SEO, EssilorLuxottica.
  • Mordy Oberstein, Head of SEO Brand at Wix.
  • Scott Stouffer, CTO and Co-Founder, Market Brew.

Download the ebook to learn about the latest developments in Google Search, and how to meet the challenges of today’s competitive search environment.

From the rise of spam content on SERPs to the most reliable ranking factors, this comprehensive guide covers it all.

We also address where different types of content belong and offer advice on whether you should diversify your acquisition channels or pivot to gated content models.

Explore the following topics inside:

  • Why Is Search Full Of Spam?
  • What Are The Top Ranking Factors That SEO Pros Can Rely On Right Now?
    • The Top 3 Ranking Factors
    • Freshness & Content Maintenance
    • “Ranking” In Search Generative Experience
  • Staying Indexed Is The New SEO Challenge
  • Where Does Your Best Content Belong?
  • Proactively Embracing SEO Disruption By Focusing On User Needs
  • Making Sense Of Ranking In 2024

Whether you’re a seasoned professional or just starting out, this ebook is full of practical tips and actionable strategies to help you improve your website’s visibility and drive organic traffic.

Grab your copy of Google Ranking Systems & Signals 2024 today, and start optimizing your website for success in 2024 and beyond!


Featured Image: Paulo Bobita/Search Engine Journal

Google Issues Statement About CTR And HCU via @sejournal, @martinibuster

In a series of tweets, Google’s SearchLiaison responded to a question that connected click-through rates (CTR) and HCU (Helpful Content Update) with how Google ranks websites, remarking that if the associated ideas were true it would be impossible for any new website to rank.

Users Are Voting With Their Feet?

SearchLiaison was responding to a tweet that quoted an interview answer by Google CEO Sundar Pichai: “Users vote with their feet.”

Here is the tweet:

“If the HCU (Navboost, whatever you want to call it) is clicks/user reaction based – how could sites hit by the HCU ever hope to recover if we’re no longer being served to Google readers?

@sundarpichai “Users vote with their feet”,

Okay I’ve changed my whole site – let them vote!”

The above tweet appears to connect Pichai’s statement to Navboost, user clicks and rankings. But as you’ll see below, Sundar’s statement about users voting “with their feet” has nothing to do with clicks or ranking algorithms.

Background Information

Sundar Pichai’s answer about users voting “with their feet” has nothing to do with clicks.

The problem with the interview question (and Sundar Pichai’s answer) is that the question and answer are in the context of “AI-powered search and the future of the web.”

The interviewer at The Verge used a site called HouseFresh as an example of a site that’s losing traffic because of Google’s platform shift to the new AI Overviews.

But the HouseFresh site’s complaints predate AI Overviews. Their complaints are about Google ranking low quality “big media” product reviews over independent sites like HouseFresh.

HouseFresh wrote:

“Big media publishers are inundating the web with subpar product recommendations you can’t trust…

Savvy SEOs at big media publishers (or third-party vendors hired by them) realized that they could create pages for ‘best of’ product recommendations without the need to invest any time or effort in actually testing and reviewing the products first.”

Sundar Pichai’s answer has nothing to do with why HouseFresh is losing traffic. His answer is about AI Overviews. HouseFresh’s issues are about low quality big brands outranking them. Two different things.

  • The Verge-affiliated interviewer was mistaken to cite HouseFresh in connection with Google’s platform shift to AI Overviews.
  • Furthermore, Pichai’s statement has nothing to do with clicks and rankings.

Here is the interview question published on The Verge:

“There’s an air purifier blog that we covered called HouseFresh. There’s a gaming site called Retro Dodo. Both of these sites have said, “Look, our Google traffic went to zero. Our businesses are doomed.”

…Is that the right outcome here in all of this — that the people who care so much about video games or air purifiers that they started websites and made the content for the web are the ones getting hurt the most in the platform shift?”

Sundar Pichai answered:

“It’s always difficult to talk about individual cases, and at the end of the day, we are trying to satisfy user expectations. Users are voting with their feet, and people are trying to figure out what’s valuable to them. We are doing it at scale, and I can’t answer on the particular site—”

Pichai’s answer has nothing to do with ranking websites and has no connection whatsoever to the HCU. It means that users are determining whether or not AI Overviews are helpful to them.

SearchLiaison’s Answer

Let’s reset the context of SearchLiaison’s answer, here is the tweet (again) that started the discussion:

“If the HCU (Navboost, whatever you want to call it) is clicks/user reaction based – how could sites hit by the HCU ever hope to recover if we’re no longer being served to Google readers?

@sundarpichai “Users vote with their feet”,

Okay I’ve changed my whole site – let them vote!”

Here is SearchLiaison’s response:

“If you think further about this type of belief, no one would ever rank in the first place if that were supposedly all that matters — because how would a new site (including your site, which would have been new at one point) ever been seen?

The reality is we use a variety of different ranking signals including, but not solely, “aggregated and anonymized interaction data” as covered here:”

The person who started the discussion responded with:

“Can you please tell me if I’m doing right by focusing on my site and content – writing new articles to be found through search – or if I should be focusing on some off-site effort related to building a readership? It’s frustrating to see traffic go down the more effort I put in.”

When a client says something like “writing new articles to be found through search,” I always follow up with questions to understand what they mean. I’m not commenting on the person who made the tweet; I’m just making an observation about past conversations I’ve had with clients. When a client says something like that, they sometimes mean they’re researching Google keywords and competitor sites and using that keyword data verbatim within their content instead of relying on their own expertise and understanding of what readers want and need.

Here’s SearchLiaison’s answer:

“As I’ve said before, I think everyone should focus on doing whatever they think is best for their readers. I know it can be confusing when people get lots of advice from different places, and then they also hear about all these things Google is supposedly doing, or not doing, and really they just want to focus on content. If you’re lost, again, focus on that. That is your touchstone.”

Site Promotion To People

SearchLiaison next addressed the excellent question about off-site promotion, strongly emphasizing a focus on readers. A lot of SEOs focus on promoting sites to Google, which is what link building is all about.

Promoting sites to people is super important. It’s one of the things that I see high ranking sites do and, although I won’t mention specifics, I believe it feeds into higher rankings in an indirect way.

SearchLiaison continued:

“As to the off-site effort question, I think from what I know from before I worked at Google Search, as well as my time being part of the search ranking team, is that one of the ways to be successful with Google Search is to think beyond it.

Great sites with content that people like receive traffic in many ways. People go to them directly. They come via email referrals. They arrive via links from other sites. They get social media mentions.

This doesn’t mean you should get a bunch of social mentions, or a bunch of email mentions because these will somehow magically rank you better in Google (they don’t, from how I know things). It just means you’re likely building a normal site in the sense that it’s not just intended for Google but instead for people. And that’s what our ranking systems are trying to reward, good content made for people.”

What About False Positives?

The phrase false positive is used in many contexts and one of them is to describe the situation of a high quality site that loses rankings because an algorithm erroneously identified it as low quality. SearchLiaison offered hope to high quality sites that may have seen a decrease in traffic, saying that it’s possible that the next update may offer a positive change.

He tweeted:

“As to the inevitable “but I’ve done all these things when will I recover!” questions, I’d go back to what we’ve said before. It might be the next core update will help, as covered here:

It might also be that, as I said here, it’s us in some of these cases, not the sites, and that part of us releasing future updates is doing a better job in some of these cases:”

SearchLiaison linked to a tweet by John Mueller from a month ago where he said that the search team is looking for ways to surface more helpful content.

“I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.”

Is Your Site High Quality?

Everyone likes to think that their site is high quality, and most times it is. But there are also cases where a site publisher does “everything right” in terms of following SEO practices, unaware that those “good SEO practices” are backfiring on them.

One example, in my opinion, is the widely practiced strategy of copying what competitors are doing but “doing it better.” I’ve been hands-on involved in SEO for well over 20 years and that’s an example of building a site for Google and not for users. It’s a strategy that explicitly begins and ends with the question of “what is Google ranking and how can I create that?”

That kind of strategy can create patterns that overtly signal that a site is not created for users. It’s also a recipe for creating a site that offers nothing new compared to what Google is already ranking. So before assuming that everything is fine with the site, be certain that everything is indeed fine with the site.

Featured Image by Shutterstock/Michael Vi

Google Analytics Update To Improve Paid Search Attribution via @sejournal, @MattGSouthern

Google has announced an update to the attribution models in Google Analytics 4 (GA4) to improve the accuracy of paid search campaigns.

Google plans to roll out adjustments over the next two weeks to address a longstanding issue where conversions originating from paid search were mistakenly attributed to organic search traffic.

According to the company’s statement, this misattribution occurs with single-page applications when the “gclid” parameter — a unique identifier for paid search clicks — fails to persist across multiple page views.

As a result, conversions that should have been credited to paid search campaigns were incorrectly assigned to organic search channels.
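
For context, the gclid arrives only on the query string of the initial landing URL. A single-page application typically reports later screens as virtual page views, roughly like the hypothetical gtag.js sketch below, and those later events no longer carry the gclid in the URL, which is where the paid-search association could be lost:

<script>
  // Hypothetical route-change handler in a single-page app.
  // Assumes the standard gtag.js snippet is already installed on the page.
  // The landing URL may include ?gclid=..., but after client-side navigation
  // the current URL usually no longer carries it.
  function reportVirtualPageView(title) {
    gtag('event', 'page_view', {
      page_title: title,
      page_location: window.location.href
    });
  }
</script>

No tagging changes are required on your end; per Google’s statement, the fix is to capture campaign information from the initial event on each page.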

Improved Conversion Attribution Methodology

To address this problem, Google is modifying how it attributes conversions to ensure campaign information is captured from the initial event on each page.

Under the new methodology, the attribution will be updated to reflect the appropriate traffic source if a user exits the site and returns through a different channel.

This change is expected to increase the number of conversions attributed to paid search campaigns, potentially impacting advertising expenditures for marketers leveraging Google Ads.

Preparation & Review Recommended

In light of the impending update, Google strongly advises advertisers to review their budget caps and make necessary adjustments before the changes take effect.

As more conversions may be assigned to paid search efforts, campaign spending levels could be affected.

Manage budgets proactively to stay aligned with evolving performance data.

Why SEJ Cares

Improved attribution accuracy gives you a clearer picture of how well your paid search advertising works.

This will allow you to make smarter decisions about where to spend your marketing budget and how to improve your paid search campaigns based on precise data.

How This Can Help You

With more accurate conversion data, you can:

  • Gain a clearer picture of your paid search campaigns’ actual impact and return on investment (ROI).
  • Optimize campaigns based on reliable performance metrics, allowing for more effective budget allocation and targeting strategies.
  • Identify areas for improvement or expansion within your paid search efforts, informed by precise attribution data.
  • Make data-driven decisions regarding budget adjustments, bid strategies, and overall campaign management.

To get the most out of these changes, review your budget caps and make necessary adjustments to anticipate the potential increase in conversions attributed to paid search campaigns.

Staying ahead will make it easier to adapt to the new attribution method and leverage the improved data.


Featured Image: Piotr Swat/Shutterstock

Google Gives Merchants New Insights Into Shopping Search Performance via @sejournal, @MattGSouthern

Google has introduced a feature in Search Console that allows merchants to track their product listings in the Google Search Image tab.

This expanded functionality can help businesses better understand their visibility across Google’s shopping experiences.

Where To Find ‘Merchant Listings Performance’ In Search Console

The new data is accessible through the “Performance” report under the “Google Search Image” tab.

From there, you can monitor the performance of your listings across various Google surfaces.

This includes information on impressions, clicks, and other key metrics related to your product showcases.

By integrating merchant listing performance into Search Console, businesses get a more comprehensive view of their product visibility to optimize their strategies accordingly.

Eligibility & Shopping Section In Search Console

To qualify for merchant listing reports, a website must be identified by Google as an online merchant primarily selling physical goods or services directly to consumers.

Affiliate sites or those that redirect users to other platforms for purchase completion are not considered eligible.

Once a site is recognized as an online merchant, Search Console will display a “Shopping” section in its navigation bar.

This dedicated area houses tools and reports tailored to shopping experiences, including:

  1. Product Snippet Rich Report: Providing insights into product snippet structured data on the site, enabling enhanced search result displays with visual elements like ratings and prices.
  2. Merchant Listing Rich Report: Offering analytics on merchant listing structured data, which enables more comprehensive search results, often appearing in carousels or knowledge panels.
  3. Shopping Tab Listings: Information and guidance on enabling products to appear in the dedicated Shopping tab within Google Search results.

Google’s automated systems determine a site’s eligibility as an online merchant based on the presence of structured data and other factors.
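
As a rough illustration, the product snippet structured data these reports analyze uses the schema.org Product type; every name and value in this sketch is a placeholder:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Scented Candle",
  "image": "https://www.example.com/images/candle.jpg",
  "description": "A placeholder product description.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>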

In Summary

This new feature in Google’s Search Console provides valuable information about the visibility of your product listings in search results.

You can use these insights to make changes and improve your products’ visibility so that more potential customers can find them.


Featured Image: T. Schneider/Shutterstock