Google: Can 10 Pages Impact Sitewide Rankings? via @sejournal, @martinibuster

Google’s John Mueller answered a question about sitewide impacts on a site with ten pages that lost rankings in the March/April 2024 core update and then experienced a sitewide collapse in May.

Can 10 Pages Trigger A Sitewide Penalty?

The person asking the question on Reddit explained that they had ten pages (out of 20,000 pages) that were hit by the Helpful Content Update (HCU) in September 2023. They subsequently updated the pages, which eventually recovered their rankings and traffic. Things were fine until the same ten pages got slammed by the March/April core update. The precise date of the second ranking drop event was April 20th.

Up to that point the rest of the site was fine. Only the same ten pages were affected. That changed on May 7th when the site experienced a sitewide drop in rankings across all 20,000 pages of the website.

Their question was whether the ten problematic pages triggered a sitewide impact or whether the May 7th collapse was due to the site reputation abuse penalties announced on May 6th.

A Note About Diagnosing Ranking Drops

I’m not commenting specifically on the person who asked the question, but the question has the appearance of correlating ranking drops with specific parts of announced algorithm updates.

Here is the exact wording:

“Our website has about 20K pages, and we found that around 10 pages were hit by HCU in September. We updated those articles and saw a recovery in traffic, but after the March core update around April 20, the same pages were hit again, likely due to HCU. On May 7th, we saw a sharp drop in rankings across the board, and suspect that a sitewide classifier may have been applied.

Question: Can an HCU hit on 10 pages cause a sitewide classifier for 20K pages? Or on May 7th reputation abuse update may had an impact?”

In general it’s reasonable to assume that a ranking drop is connected to a recently announced Google update when the dates of both events match. However, it bears pointing out that a core algorithm update can affect multiple things (for example query-content relevance) and it should be understood that the HCU is no longer a single system.

The person asking the question is following a pattern I often see: assuming that a ranking drop is due to something wrong with the site. That’s not always the case; it could be a change in how Google interprets a search query (among many other potential reasons).

The other potential mistake is assuming that the problem is related to a specific algorithm. The person asking the question assumes they were hit by the HCU system, which is something that no longer exists. All the elements of the HCU were subsumed into the core ranking algorithm as signals.

Here is what Google’s documentation says about what happened to the HCU:

Is there a single “helpful content system” that Google Search uses for ranking?
Our work to improve the helpfulness of content in search results began with what we called our “helpful content system” that was launched in 2022. Our processes have evolved since. There is no one system used for identifying helpful content. Instead, our core ranking systems use a variety of signals and systems.

While Google still looks for helpfulness in content, there is no longer a helpful content system that’s de-ranking pages on specific dates.

The other potential evidence of faulty correlation is when the Redditor asked if their May 7th sitewide collapse was due to the site reputation abuse penalties. The site reputation abuse penalties weren’t actually in effect by May 7th. On May 6th it was announced that site reputation abuse manual actions would begin at some point in the near future.

Those are two examples of how it can be misleading to correlate site ranking anomalies with announced updates. There is more to diagnosing updates than correlating traffic patterns to announced updates. Site owners and SEOs who diagnose problems in this manner risk approaching the solution like someone who’s focusing on the map instead of looking at the road.

Properly diagnosing issues requires understanding the full range of technical issues that can impact a site and the algorithmic changes that can happen on Google’s side (especially unannounced changes). I have over 20 years of experience and know enough to identify anomalies in the SERPs that indicate changes to how Google is approaching relevance.

Complicating the diagnosis is that sometimes it’s not something that needs “fixing” but rather that the competition is doing something more right than the sites that lost rankings. “More right” can be a wide range of things.

Ten Pages Caused Sitewide “Penalty”?

John Mueller responded by first addressing the specific issue of sitewide ranking collapse, remarking that he doesn’t think it’s likely that ten pages would cause 20,000 other pages to lose rankings.

John wrote:

“The issues more folks post about with regards to core updates tend to be site-wide, and not limited to a tiny subset of a site. The last core update was March/April, so any changes you’d be seeing from May would be unrelated. I’m not sure how that helps you now though :-), but I wouldn’t see those 10 pages as being indicative of something you need to change across 20k other pages.”

Sometimes It’s More Than Announced Updates

John Mueller didn’t offer a diagnosis of what is wrong with the site; that’s impossible to say without actually seeing it. SEOs on YouTube, Reddit, and Facebook routinely correlate ranking drops with recently announced updates, but as I wrote earlier in this article, that can be a mistake.

When diagnosing a drop in rankings it’s important to look at the site, the competition and the SERPs.

Do:

  • Inspect the website
  • Review a range of keywords and respective changes in the SERPs
  • Inspect the top ranked sites

Don’t:

  • Assume that a ranking drop is associated with a recent update and stop your investigation right there.

Google’s John Mueller alludes to the complexity of diagnosing ranking drops by mentioning that sometimes it’s not even about SEO, which is 100% correct.

John explained:

“Based on the information you posted, it’s also impossible to say whether you need to improve / fix something on those 20k pages, or if the world has just moved on (in terms of their interests, their expectations & your site’s relevance).

It sounds like you did find things to make more “helpful” on those 10 pages, maybe there’s a pattern? That’s something for you to work out – you know your site, its content, its users best. This isn’t an easy part of SEO, sometimes it’s not even about SEO.”

Look At The Road Ahead

It’s been a trend now that site owners focus on recent announcements by Google as clues to what is going on with their sites. It’s a reasonable thing to do and people should 100% keep doing that. But don’t make that the limit of your gaze because there is always the possibility that there is something else going on.

Featured Image by Shutterstock/vovan

Google Quietly Fixed Site Names In Search Results via @sejournal, @martinibuster

Google resolved a site names issue that had been ongoing since September 2023 that prevented a website’s site name from properly appearing when an inner page was ranked in the search results.

Site Names In The Search Results

A site name is exactly what it sounds like: the name of a website as it’s displayed in the search engine results pages (SERPs). The feature allows users to identify which site a given result comes from.

If your site name is Acme Anvil Company, and that’s how the company is known, then Google wants to display Acme Anvil Company in the search results. If Acme Anvil Company is better known as the AAC and that’s what the company wants to show in the SERPs, then that’s what Google wants to show.

Google allows site owners to use the “WebSite” structured data on the home page to specify the correct site name that Google should use.
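As a rough sketch of what that markup can look like (the company name, alternate name, and URL below are placeholder values, not part of Google’s documentation), the “WebSite” JSON-LD for a home page might be generated like this:

```python
import json

# Minimal sketch of "WebSite" structured data for specifying a site name;
# "Acme Anvil Company", "AAC", and the URL are placeholders.
site_name_markup = {
    "@context": "https://schema.org",
    "@type": "WebSite",
    "name": "Acme Anvil Company",
    "alternateName": "AAC",  # optional shorter name the site prefers
    "url": "https://www.example.com/",
}

# Emit the JSON-LD block to embed in the home page's <head>.
jsonld = (
    '<script type="application/ld+json">\n'
    + json.dumps(site_name_markup, indent=2)
    + "\n</script>"
)
print(jsonld)
```

The markup goes on the home page only; Google then propagates the chosen name to inner pages when they appear in the SERPs.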

Problem Propagating Site Names

On September 7, 2023, Google published a warning in its site name documentation acknowledging that it was having problems propagating the site name to the inner pages of a site when those inner pages were shown in the SERPs.

This is the warning that was published:

“Known issue: site name isn’t appearing for internal pages
In some cases, a site name shown for a home page may not have propagated to appear for other pages on that site. For example, example.com might be showing a site name that’s different from example.com/internal-page.html.

We’re actively working to address this. We will update this help page when this issue is resolved. In the meantime, if your home page is showing the site name you prefer, understand that it should also appear for your internal pages eventually.”

Google Fixes Site Name Problem

The documentation for the site name problem was recently removed. A changelog for Google documentation noted this:

“Resolving the issue with site names and internal pages
What: Removed the warning about the issue that was preventing new site names from propagating to internal pages.

Why: The issue has been resolved. Keep in mind that it takes time for Google to recrawl and process the new information, including recrawling your internal pages.”

There’s no word on what caused the site name propagation problem, but it is interesting that it was finally fixed after all this time. One has to wonder whether it took so long because it was a low priority or because something changed on the backend of Google’s systems that allowed them to finally fix the issue.

Read Google’s updated site names documentation:

Provide a site name to Google Search

Featured Image by Shutterstock/Cast Of Thousands

Google On How It Manages Disclosure Of Search Incidents via @sejournal, @martinibuster

Google’s latest Search Off The Record podcast discussed examples of disruptive incidents that can affect crawling and indexing, and the criteria for deciding whether or not to disclose the details of what happened.

Complicating the decision to make a statement is that there are times when SEOs and publishers report that Search is broken when, from Google’s point of view, it’s working the way it’s supposed to.

Google Search Has A High Uptime

The interesting part of the podcast began with the observation that Google Search itself (the home page with the search box) has an “extremely” high uptime and rarely ever goes down or becomes unreachable. Most of the reported issues were due to network routing problems on the Internet itself rather than a failure within Google’s infrastructure.

Gary Illyes commented:

“Yeah. The service that hosts the homepage is the same thing that hosts the status dashboard, the Google Search Status Dashboard, and it has like an insane uptime number. …the number is like 99.999 whatever.”

John Mueller jokingly responded with the word “nein” (pronounced like the number nine), which means “no” in German:

“Nein. It’s never down. Nein.”
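To put a number “like 99.999” in perspective, a back-of-the-envelope calculation (assuming the figure means five-nines uptime over a year) converts an uptime percentage into a yearly downtime budget:

```python
# Rough downtime budget implied by an uptime percentage.
MINUTES_PER_YEAR = 365.25 * 24 * 60  # ≈ 525,960

def downtime_minutes(uptime_percent: float) -> float:
    """Minutes of allowed downtime per year at a given uptime percentage."""
    return MINUTES_PER_YEAR * (1 - uptime_percent / 100)

print(round(downtime_minutes(99.999), 1))  # five nines: ≈5.3 minutes per year
print(round(downtime_minutes(99.9), 1))    # three nines: ≈526 minutes (~8.8 hours) per year
```

In other words, each extra “nine” cuts the allowed downtime by a factor of ten.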

The Googlers admit that the rest of Google Search on the backend does experience outages and they explain how that’s dealt with.

Crawling & Indexing Incidents At Google

Google’s ability to crawl and index web pages is critical for SEO and earnings. Disruption can lead to catastrophic consequences particularly for time-sensitive content like announcements, news and sales events (to name a few).

Gary Illyes explained that there’s a team within Google called Site Reliability Engineering (SRE) that’s responsible for making sure the public-facing systems run smoothly. There’s an entire Google subdomain devoted to site reliability, where the team explains that it approaches the task of keeping systems operational as a software problem. They watch over services like Google Search, Ads, Gmail, and YouTube.

The SRE page explains the complexity of their mission as being very granular (fixing individual things) to fixing larger scale problems that affect “continental-level service capacity” for users that measure in the billions.

Gary Illyes explains (at the 3:18 minute mark):

“Site Reliability Engineering org publishes their playbook on how they manage incidents. And a lot of the incidents are caught by incidents being issues with whatever systems. They catch them with automated processes, meaning that there are probers, for example, or there are certain rules that are set on monitoring software that looks at numbers.

And then, if the number exceeds whatever value, then it triggers an alert that is then captured by a software like an incident management software.”

February 2024 Indexing Problem

Gary next explains how the February 2024 indexing problem is an example of how Google monitors and responds to incidents that could impact users in search. Part of the response is figuring out if it’s an actual problem or a false positive.

He explains:

“That’s what happened on February 1st as well. Basically some number went haywire, and then that opened an incident automatically internally. Then we have to decide whether that’s a false positive or it’s something that we need to actually look into, as in like we, the SRE folk.

And, in this case, they decided that, yeah, this is a valid thing. And then they raised the priority of the incident to one step higher from whatever it was.

I think it was a minor incident initially and then they raised it to medium. And then, when it becomes medium, then it ends up in our inbox. So we have a threshold for medium or higher. Yeah.”

Minor Incidents Aren’t Publicly Announced

Gary Illyes next explained that they don’t communicate every little incident because most of the time it won’t even be noticed by users. The most important consideration is whether an incident affects users; problems that do are automatically boosted to a higher priority level. Gary said he didn’t work in SRE, so he was unable to comment on the exact number of users that need to be affected before Google decides to make a public announcement.

Gary explained:

“SRE would investigate everything. If they get a prober alert, for example, or an alert based on whatever numbers, they will look into it and will try to explain that to themselves.

And, if it’s something that is affecting users, then it almost automatically means that they need to raise the priority because users are actually affected.”

Incident With Images Disappearing

Gary shared another example of an incident, this time about images that weren’t showing up for users. It was decided that although the user experience was affected, it wasn’t degraded to the point that it kept users from finding what they were searching for; Google remained usable. Thus, it’s not just whether users are affected by an incident that causes an escalation in priority, but also how badly the user experience is affected.

The case of the images not displaying was a situation in which they decided not to make a public statement because users could still find the information they needed. Although Gary didn’t mention it, it sounds like an issue recipe bloggers have encountered in the past where images stopped showing.

He explained:

“Like, for example, recently there was an incident where some images were missing. If I remember correctly, then I stepped in and I said like, “This is stupid, and we should not externalize it because the user impact is actually not bad,” right? Users will literally just not get the images. It’s not like something is broken. They will just not see certain images on the search result pages.

And, to me, that’s just, well, back to 1990 or back to 2008 or something. It’s like it’s still usable and still everything is dandy except some images.”

Are Publishers & SEOs Considered?

Google’s John Mueller asked Gary whether the threshold for making a public announcement was a degraded user experience alone, or whether the experience of publishers and SEOs was also considered.

Gary answered (at about the 8 minute mark):

“So it’s Search Relations, not Site Owners Relations, from Search perspective.

But by extension, like the site owners, they would also care about their users. So, if we care about their users, it’s the same group of people, right? Or is that too positive?”

Gary apparently sees his role primarily as Search Relations in the general sense of Google’s users. That may come as a surprise to many in the SEO community because Google’s own documentation for the Search Off The Record podcast explains the role of the Search Relations team differently:

“As the Search Relations team at Google, we’re here to help site owners be successful with their websites in Google Search.”

Listening to the entire podcast, it’s clear that Googlers John Mueller and Lizzi Sassman are strongly focused on engaging with the search community. So maybe there’s a language issue that caused his remark to be interpreted differently than he intended?

What Does Search Relations Mean?

Google explained that they have a process for deciding what to disclose about disruptions in search and it is a 100% sensible approach. But something to consider is that the definition of “relations” is that it’s about a connection between two or more people.

Search is a relation(ship). It is an ecosystem in which two partners work together: the creators (SEOs and site owners) produce the content, and Google makes it available to its users.

Featured Image by Shutterstock/Khosro

Google Case Study Shows Importance Of Structured Data via @sejournal, @martinibuster

Google published a case study that shows how using structured data and following best practices improved discoverability and brought more search traffic. The case study was about the use of Video structured data but the insights shared are applicable across a range of content types.

The new case study is about an Indonesian publisher called Vidio.

How CDNs Can Cause Indexing Problems

One of the interesting points in the case study concerns how CDNs can serve image and video files through expiring URLs. The new documentation specifically notes that it’s important for the CDN to use stable URLs, and it links to another Google documentation page that goes into more detail.

Google explains that some CDNs use quickly expiring URLs for video and thumbnail files and encourages publishers and SEOs to use just one stable URL for each video. Something interesting to note is that not only does this help Google index the files, it also helps Google collect user interest signals.

This is what the documentation advises:

“Some CDNs use quickly expiring URLs for video and thumbnail files. These URLs may prevent Google from successfully indexing your videos or fetching the video files. This also makes it harder for Google to understand users’ interest in your videos over time.

Use a single unique and stable URL for each video. This allows Google to discover and process the videos consistently, confirm they are still available and collect correct signals on the videos.”

Implementing The Correct Structured Data

Google highlighted the importance of using the correct structured data and validating it with Google’s structured data testing tool.
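As an illustrative sketch (not Vidio’s actual markup; the title, description, and URLs below are placeholder values), VideoObject JSON-LD with a single stable contentUrl might look like this:

```python
import json

# Hypothetical VideoObject markup of the kind the case study describes;
# all names and URLs are placeholders.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Example episode title",
    "description": "Short description of the video.",
    "thumbnailUrl": ["https://cdn.example.com/thumbs/ep1.jpg"],
    "uploadDate": "2024-01-15T08:00:00+07:00",
    "duration": "PT45M",
    # Per the documentation quoted above: one stable, non-expiring URL
    # per video, so Google can index it and track interest over time.
    "contentUrl": "https://cdn.example.com/videos/ep1.mp4",
}

print(json.dumps(video_markup, indent=2))
```

Validating output like this before deployment catches missing or malformed properties early, which is the practice the case study credits.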

These are the results of the above work:

“Within a year of implementing VideoObject markup, Vidio saw improvements in impressions and clicks on their video pages. While the number of videos that Vidio published from Q1 2022 to Q1 2023 increased by ~30%, adding VideoObject markup made their videos eligible for display in various places on Google.

This led to an increase of ~3x video impressions and close to 2x video clicks on Google Search. Vidio also used the Search Console video indexing report and performance report, which helped them to identify and fix issues for their entire platform.”

Indexing + Structured Data = More Visibility

A key to better search performance was ensuring that Google is able to crawl the URLs, something that can easily be overlooked in the rush to correlate a drop in rankings with a recent algorithm update. Never rule anything out during a site audit.

Another important recommendation from the case study is to ensure that the proper structured data is being used. The appropriate structured data can help a webpage qualify for improved search visibility through one of Google’s enhanced search features, such as rich results.

Read Google’s case study:

How Vidio brought more locally relevant video-on-demand (VOD) content to Indonesian users through Google Search

Featured Image by Shutterstock/Anton Vierietin

Rand Fishkin At MozCon: Rethinking Strategies Amid Google API “Leak” via @sejournal, @MattGSouthern

At the MozCon industry conference this week, Rand Fishkin, the outspoken former CEO of Moz and founder of SparkToro, shared his opinion on how SEOs and marketers should potentially adjust strategies based on his interpretation of the recent Google API leaks.

In a packed session with Dr. Pete Meyers, Fishkin laid out specific ways he believes the leaked information, which has not been verified, could impact best practices.

Fishkin firmly believes the leaks contradict Google’s public statements about its systems.

“Google has been unkind and unfair. They have been abusive about this,” Fishkin stated, though these are his opinions based on reviewing the leaks.

On Google’s lack of transparency, Fishkin states:

“Google has told us off and on that they don’t use clicks for ranking. And I always heard it, maybe this is charitable on my part, as we don’t use capital ‘C’ clicks for capital ‘R’ ranking. And the truth is, I think even that was charitable on my case.

And we’ve seen in not just these documents, but anyone who’s familiar with Andrew Navick’s testimony last year, it’s really confirming a lot of what we saw, a lot of what we saw with Navboost.”

He adds:

“They have lied through either omission or misinformation.”

Fishkin’s Recommendations

Fishkin admittedly speculated and provided concrete examples of how SEO strategies could change if his interpretations of the leaks were accurate.

However, these are his opinions, not directives. Among his potential recommendations:

1. Invest In Author/Entity Authority

Surprised by the continued emphasis on authorship and entity signals in the leaked code, Fishkin said brands should prioritize hiring writers with established reputational authority that Google already associates with quality content.

Fishkin said this is what he’s going to do differently:

“We’re going to hire a content marketer, basically a part-time content person, to make sure that the SparkToro blog has a couple of new posts on it every week.

And all that authorship and entity stuff made me think we should find someone who already has a profile.”

2. Supplement Link-building With Public Relations

According to Fishkin, the leaks uncovered potential evidence that Google devalues links to sites without sufficient brand awareness and search volume.

As a result, he recommends accompanying traditional link acquisition with broader brand-building efforts like PR and advertising to increase branded search demand.

Fishkin stated:

“If you get a whole bunch of links in one day and nothing else, guess what? You manipulated the link graph.

If you’re really a big brand, people should be talking about you.”

3. Embrace Geographic Nuance

With abundant references to geographic and country-specific signals throughout the code, Fishkin cautioned against one-size-fits-all global strategies.

What works for major markets like the US may prove ineffective for smaller regions where Google needs more data.

Fishkin advised attendees:

“I would encourage you to think about SEO as being more geographically specific than you think it is even for web search results.”

4. Rediscover Experimentation

More than anything, Fishkin hopes the leaks will catalyze a renewed sense of curiosity and skepticism within SEO.

On the value of experimentation, Fishkin says:

“We’ve seen it over and over. One thing we’ve lost, I feel like, is that spirit of experimentation. And with these things coming out where I don’t think we can take what Google says for granted, how do you see, how do we get that back?”

He challenged practitioners to move beyond regurgitating Google’s public statements and instead embrace testing to uncover what drives results.

Referring to an unexplained metric surfaced in the leaks, Fishkin states:

“My dream would be that if I were to come back to MozCon next year, somebody would be on this stage, and they’d be like, ‘Guys, I figured out what Keto score is.’ Publish that. I’ll amplify it.”

A Wakeup Call?

In many ways, Fishkin framed the leaks as a pivotal moment for an industry he believes has grown insular, conflict-averse, and too accepting of Google’s carefully crafted narratives.

His call to action left some energized and others put off by its unrestrained bluntness.

But whether one admires Fishkin’s brash delivery or not, the leaks have undeniably cracked open Google’s black box.

For those willing to dig into the technical details and chart their path through testing, Fishkin argues lucrative opportunities await those who stop taking Google’s word as gospel.

A Word Of Caution Regarding The Google API Leak

Doubts have emerged about the true nature and significance of this “leak.”

Evidence suggests the data may be connected to Google’s public Document AI Warehouse API rather than exposing the ranking system’s inner workings. The information also appears to be at least five years old.

While Fishkin’s plans to adjust his SEO tactics are interesting, they should be taken with a grain of salt, given the ongoing debate over what the data really signifies.

It illustrates the importance of vetting sources when evaluating any supposed “insider information” about how search engines operate.

As the discussion around the Google “leak” continues, be careful not to fall victim to confirmation bias—seeing the data through the lens of pre-existing theories rather than objectively assessing it.


Featured Image: Taken by author at MozCon, June 2024. 

Your Guide To Dominating Local Search Marketing via @sejournal, @meetsoci

This post was sponsored by SOCi. The opinions expressed in this article are the sponsor’s own.

As a marketer, you may feel like the ground is shifting under your feet with so many changes in the world of search. From Google’s recent announcement to release AI Overviews to all U.S. users to OpenAI revealing GPT-4o, there’s a lot to keep up with.

How will these changes impact your search efforts? Do you need to shift your search strategy?

We have the answers for you and more!

In this blog, we’ll explain how search marketing has changed, what this means for your brand, and share tactics to improve your online visibility. At the end, we’ll also introduce our new game-changer for local search management.

Let’s get into it!

The Evolution Of Search Marketing

As search evolves, many marketers are worried about their brand remaining visible online. Because AI-generated search experiences are so new, now isn’t the time to make any drastic changes to your search marketing strategies.

You can test how your brand appears in generative AI (genAI) results (what we’ve dubbed GAIRs), but there’s no reason to sound an alarm — at least not yet.

Today, nearly three-quarters of consumers conduct local searches at least once a week. Similarly, in the U.S., over 800 million monthly searches contain some variation of “near me,” and more than 5 million keywords are related to “near me.”

Focusing on conventional local SEO efforts is the best way for your brand to ensure its visibility in traditional and GAIRs.

Local SEO for businesses with multiple locations involves incorporating a local SEO strategy for each business location. A multi-location SEO strategy, when done correctly, will boost your local search rankings, help you gain local customers, and improve brand awareness.

If your business doesn’t have multiple locations, you can still follow the tactics below to ensure your business is visible to your target audience in your specific area.

5 Ways To Improve Your Online Visibility

Now that you understand how search has evolved and the importance of local SEO, let’s dive into five local SEO tactics your brand can leverage to boost online visibility.

1. Claim & Optimize Local Listings

Local listings are online profiles of local businesses. They appear on search engines, local directories, and platforms like Google, Apple Maps, Yelp, Bing, and Facebook.

To increase your visibility on Google and beyond, your brand must claim local listings across all major local directories and remove duplicate listings.

Additionally, you need consistent and accurate information across all listings. At a minimum, your local listings should include the following information:

  • Name, address, and phone number (NAP) citations.
  • Business categories. (Example: Sushi restaurant)
  • Business hours, especially during holidays and major events
  • Products and services your business offers.
  • Links to your website and social media profiles.
  • Attributes. (Example: Curbside pickup or wheelchair-accessible seating)
  • High-quality photos and videos.

After optimizing your local listings, you can focus on your local pages.

2. Create Local Pages For Each Location

A local page, sometimes called a local landing page, is a web page you create for an individual store location or franchisee. It’s similar to local listings but lives on your site rather than an external directory like Yelp or Google.

Your multi-location business might have dozens or hundreds of local pages, each containing specific information about that store and the surrounding area.

Local pages should contain most of the business information found on your local listings. However, they’re also high-conversion pages. Therefore, they should also contain calls to action (CTAs) such as “order now” buttons or promotional sales and discounts.

Well-designed and optimized local pages can help your business appear high in local organic search results. As mentioned, these higher rankings often lead to more conversions and business for your stores!

3. Leverage A Store Locator

Store locators are similar to local pages. A store locator is a web page that lists all of your local stores or third-party dealers that sell your products.

Store locators help move website visitors through the customer journey by displaying valuable location information and unique details about each store. They make it easier for customers to purchase online and to contact or visit local stores.

Well-optimized and compatible store locators and local pages will help improve:

  • Local search rankings.
  • Website traffic and online conversions.
  • Analytics, such as where visitors are searching and coming from.

4. Implement An Online Reputation Management Strategy

While reputation management might not be something you’d consider when you think of improving your online visibility, you’d be surprised. According to local SEO experts, high numerical Google ratings are the sixth highest ranking factor in Google’s local pack and finder. At the same time, the quantity of native Google reviews (with text) is the eighth ranking factor.

A high quantity and quality of reviews don’t just affect local search rankings — they also impact conversion rates. According to our State of Google Reviews research report, an increase in one full star on a Google Business Profile (GBP) corresponds with a 44% increase in conversions.

To improve your reputation management strategy and gain more reviews:

  1. Respond to existing reviews in a personalized manner to show customers you value their feedback.
  2. Utilize social media to encourage customer feedback, ratings, and reviews.
  3. Make leaving a review accessible! Include links to your GBP on your website and in emails.
  4. Monitor the feedback that your business receives from reviews and make adjustments accordingly.

5. Create Unique Content

Generating localized content for your local pages, website, and listings is also essential. You want to ensure that your localized content optimizes and targets specific areas.

For instance, if you’re targeting the keyword “sporting goods store Seattle,” you want to update your URL, title tag, meta description, and headings with locally relevant keywords.
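As a sketch of this kind of local keyword templating, the snippet below builds a URL slug, title tag, and meta description from a store record. The business name, field names, and template wording are hypothetical examples, not SOCi functionality.

```python
# Illustrative sketch: templating locally relevant URL slugs, title tags,
# and meta descriptions for a multi-location site.

def build_local_meta(business, category, city, state):
    """Build a URL slug, title tag, and meta description with local keywords."""
    slug = f"/{category.replace(' ', '-')}-{city.lower()}-{state.lower()}"
    title = f"{business} | {category.title()} in {city}, {state}"
    description = (
        f"Visit {business}, your local {category} in {city}, {state}. "
        f"Shop in store or order online today."
    )
    return {"slug": slug, "title": title, "description": description}

meta = build_local_meta("Acme Outfitters", "sporting goods store", "Seattle", "WA")
print(meta["title"])  # Acme Outfitters | Sporting Goods Store in Seattle, WA
```

Generating these elements from one store record per location keeps the local keyword consistent across URL, title, and description for every page.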

You should also leverage local images, including photos of your stores and products. Remember to include geo-targeted meta descriptions, alternative text, and descriptions within your images.

Types of local content your brand can create include but are not limited to:

  • Blogs.
  • Surveys.
  • Infographics.
  • Whitepapers.
  • Social media content.
  • Neighborhood guides.
  • User-generated content (UGC).

For a more in-depth look at what it takes to improve your brand’s local SEO strategy, download our Top 10 Things You Should Be Doing in Local SEO Now guide!

How SOCi Can Help

Now that you understand what goes into creating a solid local search strategy, it’s time to boost your brand’s visibility. As marketers, you get how crucial search marketing is, but let’s be real, coming up with a plan to roll it out on a big scale is easier said than done.

That’s where SOCi comes in! We’ve built SOCi for more than a decade to ensure multi-location businesses rank well on local search and social media platforms, can create engaging content, and have the ability to manage each location’s online reputation.

We’ve enhanced our CoMarketing Cloud with SOCi Genius, an AI layer that helps automate your daily localized marketing tasks. As part of SOCi Genius, we recently released Genius Search, a game-changer in search marketing!

As the newest innovation within the CoMarketing Cloud, Genius Search transcends traditional listings management by offering a dynamic, data-driven local search strategy that aligns with evolving consumer behaviors and market trends.

Genius Search uses top data signals, such as reviews, search keywords and volume, weather, and holidays, to deliver monthly AI-powered recommendations that can be accepted with the click of a button. Once accepted, these optimizations instantly improve your business listings’ rankings by relating them directly to each location’s community.

It’s time to level up your local search strategy, and SOCi is here to help. Request a personalized demo today for more insight on Genius Search and our other Genius products!



Image Credits

Featured Image: Image by SOCi. Used with permission.

SEO In Crisis? Moz Search Scientist Warns Of Challenges Ahead via @sejournal, @MattGSouthern

Are the days of organic SEO numbered? That’s the idea raised by a search scientist’s assessment of Google’s AI-powered disruptions.

At MozCon’s 20th annual conference, Tom Capper, Moz’s Senior Search Scientist, provided a data-driven reality check.

Capper warned attendees:

“At the end of this talk, I’m going to tell you that full-funnel organic marketing is borderline impossible in 2024 for most businesses.”

He examined how Google’s AI overview results, aggressive monetization, and evolving search intents pose challenges for companies relying on SEO.

Additionally, in an exclusive interview with Search Engine Journal, Capper highlighted potential paths forward for those willing to pivot.

Photo taken by author at MozCon, June 2024.

The Zero-Click Threat

Capper opened by chronicling the rise of search “intents” like informational, navigational, commercial, and transactional queries.

Google’s new AI Overview feature, which generates direct answers at the top of the page, has proven particularly disruptive for informational searches.

“Organic is a really tough game for informational intent,” said Capper, displaying data that informational searches have the lowest share of voice for traditional organic results due to AI Overviews and other SERP features.

Photo taken by author at MozCon, June 2024.

He also noted 21% of informational searches now surface a Featured Snippet result, which can satisfy users without a click.

“You basically can’t play at the top of the funnel,” he stated bluntly.

AI Overviews A “Mistake”

In the exclusive interview, Capper cautioned that Google’s rush to implement AI overviews could negatively impact the company’s brand image:

“I think Google has gone too soon and rushed this, and yeah, I do think it’s a mistake. That is a little bit dangerous for SEO in that if Google suffers, then that’s disruptive for our industry as well.”

The Commercial Battleground

While the data is dire for informational content, Capper says commercial searches represent a “sweet spot.”

However, these valuable mid-funnel queries have become a “turbulent” and “incredibly contested” battleground.

Weighing in on the Google product reviews update and other recent changes, Capper said:

“Commercial is where a lot of this [Google’s search quality issues] plays out…it’s become an incredibly volatile section.”

Major sites like Amazon, Reddit, and YouTube dominate commercial results alongside a glut of price listings and review rich results. This raises the bar for smaller sites trying to rank.

“There are arts, hobbies, real estate – much more realistic to try and compete in here,” Capper advised.

He warned publishers who rely solely on easily answered questions:

“If that’s what you’ve been doing, you’ve probably been suffering for a long time…If you’re not willing to pivot to any other kind of content, then yeah, sure, go. Find a different channel.”

The Paid & Local Future?

At the bottom funnel, Capper described transactional searches as “pay-to-play unless you’re a brick-and-mortar business.”

Google’s monetization of product listings and its experimental map embeds for transactional queries continue to squeeze out organic visibility.

However, Capper highlighted local SEO as a promising path forward, stating:

“If you can do well in local search, I think even in a worst-case scenario AI Overview rollout, you would still be doing well here.”

Adapting To The Changing Landscape

Despite the challenges posed by AI-powered search features, Capper believes there are still opportunities for organic marketing success.

He offers the following recommendations:

  • Target informational queries that don’t have a featured snippet, allowing for better organic visibility.
  • Focus on less competitive commercial queries in verticals like arts, hobbies, and real estate.
  • Leverage local search optimization for transactional queries, even for businesses without a brick-and-mortar presence.
  • Use keyword modifiers like “best,” “compare,” “top,” and “reviews” to identify commercial intent queries.
Photo taken by author at MozCon, June 2024.

Looking To The Future

When asked about his advice for SEO professionals who may be disheartened by the AI search revolution, Capper suggests adapting and focusing on creating high-quality, authoritative content.

Capper stated in the exclusive interview:

“If you’ve got any willingness at all to write something more interesting, then I think you can still play in organic.”

Ultimately, Capper remains optimistic about the future of organic search.

In the interview, he points out that Google’s business model depends on sending organic traffic to other sites:

“I don’t think Google will ever reach the point where Google doesn’t send traffic at all because, ultimately, that’s its business model.

People expect when they search Google that they will end up going to other websites; if people don’t have that expectation, they won’t click on ads; if people aren’t clicking on ads, Google doesn’t make any money.”

In Summary

While informational and transactional searches have become challenging to rank for organically, Capper’s research suggests there are opportunities in commercial and local spaces.

To adapt, he recommends focusing on less competitive commercial topics, leveraging local SEO for transactional queries, and creating content beyond simply answering basic questions.


Featured Image: KieferPix/Shutterstock

The Rise Of Reddit: How You Can Leverage The Platform That’s Revolutionizing Search via @sejournal, @hethr_campbell

It’s no secret that Reddit is making major waves in the digital marketing landscape.

But what does that mean for your strategy?

Join us live on June 12 as we dive into how Reddit is shaping the future of search and how you can leverage it to your advantage. 

With Google investing $60 million to access Reddit’s real-time content and OpenAI integrating it into ChatGPT, Reddit’s visibility on search engines and AI platforms has skyrocketed. 

So if your goal is to get your brand and content in front of evolving search audiences, knowing how to navigate Reddit is now essential. 

In this insightful webinar, our Managing Partner and Co-Owner, Brent Csutoras, will lead an engaging discussion about how to strategically position your brand on Reddit to capitalize on its growing influence.

As a Reddit expert with over 18 years of experience on the platform, Brent will guide you through how you can effectively navigate Subreddits and engage with communities without violating platform rules.

Here are some key talking points we’ll cover during the presentation: 

  • Reddit as an Influencer Over the Years: Explore Reddit’s major impact on SERPs, as well as its evolution.
  • Recent Changes and Partnerships: Discuss Google’s and OpenAI’s recent investments and how they enhance Reddit’s significance.
  • Understanding Reddit: Get a comprehensive look at the platform, how it works, how to approach it, and how to become an active member.
  • Engagement Opportunities: Identify and outline various opportunities on Reddit.

Reddit has always been a goldmine for real-time conversations and target audiences, but recent developments have catapulted its significance to new heights. 

Don’t miss this chance to discover Reddit best practices and unlock the power of authentic engagement in the AI-powered search era.

Sign up now and learn how taking advantage of this platform can help you captivate your target audience and boost brand visibility.

Be sure to stay for the live Q&A session, as Brent will be answering your most pressing Reddit questions after the presentation. 

Can’t make it on the 12th? We’ve got you covered! Simply register here and we’ll send you a recording of the webinar to view at your convenience.

Google AI Overviews: New Research Offers Insights via @sejournal, @martinibuster

New research by BrightEdge offers a snapshot of the kinds of queries that tend to show Google AI Overviews (AIO) and provides insights into the kinds of queries and verticals where AIO are more prevalent.

The findings show dramatic differences in the amount of AI Overviews shown across different verticals in a way that reflects the kinds of queries that are common. This effect works in reverse as well, where some verticals experience less AIO search features.

Is This A Paradigm Shift?

While BrightEdge calls it the greatest paradigm shift in decades, I think that’s understating shifts to Google search in the recent past, not just the ones in 2024. Something that’s not widely understood is that Google Search has been an AI Search engine since at least 2015 with the introduction of RankBrain and other subsequent changes to the backend side of search.

The big change in Search this year is that AI is more obvious on the front-end as a Feature in Search, largely replacing the role that Featured Snippets once played. Perhaps more importantly there may have been an infrastructure change at the beginning of 2024.

BrightEdge Generative Parser

BrightEdge has a technology, called the Generative Parser, which tracks and analyzes patterns in Google’s AI search features. BrightEdge used their Generative Parser to produce research findings about Google’s new AI Overviews (AIO) search feature.

Albert Gouyet, VP of Operations at BrightEdge said this about the BrightEdge Generative Parser:

“It’s fascinating to see the BrightEdge Generative Parser™ giving marketers a front-row seat into how AI in search is developing and giving the community a glimpse into the future. For marketers who rely on organic traffic, early indications suggest that AI will help reach new customers and present new opportunities to create content that serves multiple needs and elevates brand performance.”

What Triggers AIO

BrightEdge’s report indicates that Featured Snippets and questions were likely to trigger the AIO feature. Featured Snippets are answers to questions that are created with direct quotes from websites. BrightEdge found that AI Overviews were more likely to appear when there was also a Featured Snippet.

What Doesn’t Trigger AI Overviews

The research showed that local search queries were the least likely to trigger an AI Overview search result. That makes sense because a user is looking for a structured search result (business names, addresses, phone numbers), information that can’t be usefully summarized.

Similarly, search queries that generate sitelinks were also less likely to trigger AIO. Sitelinks are search results related to branded searches which feature multiple links to inner pages of a website. For example, searching for the name of a clothing store can generate a search result that features inner pages for women’s clothes, men’s clothes, etc. This also makes sense because it’s the kind of search query that is best answered with direct data and not a summary.

Verticals Most Likely To Contain AIO

Search results that tended to feature AI Overviews were wildly different when compared by verticals (verticals meaning specific industries or topics). This likely doesn’t mean that Google was targeting specific verticals for showing more AIO. Search features are always tied to how helpful they are, and that helpfulness is tested with the Search Quality Raters, workers who test out new kinds of search results and rate them for helpfulness and other criteria.

Search queries related to Healthcare tended to generate AI Overviews at a rate of 63% of the time. That makes sense for search queries that are information-seeking.

B2B technology queries tended to generate AIO results 32% of the time while Ecommerce search queries triggered AI Overviews 23% of the time.

Interestingly, restaurants and travel related queries did not tend to trigger AIO results.

AIO Shown Less Often Than SGE

Another interesting data point is that AIO is triggered 20% less often than Search Generative Experience (SGE) answers were.

BrightEdge offered three insights related to why AIO is shown less than the experimental SGE was.

  1. “This indicates that AI is getting more precise when generating helpful experiences.
  2. This is likely because AI now caters better to people’s needs, such as looking for summaries, recommendations, or conversational experiences.
  3. Ultimately, Google is getting better at selecting answers.”

BrightEdge research pointed out that Google is improving its ability to anticipate follow-up questions by providing AI search summaries that more completely answer a question.

They write:

“Since Google I/O, the overlap between citations in AI and traditional results has diminished. Google is ensuring users do not get the same results in the two types of different results. It is also now delivering on its promise to do the second, third, and fourth search for you. AI is beginning to anticipate the following question and give options before a user even asks. This often happens with ‘what,’ ‘where,’ and ‘how’ intent-based queries.”

Early Days Of AIO

Google has received overwhelmingly negative reviews from users and the news media about the quality of Google’s AI Overviews, which in turn can lead to trust issues. BrightEdge’s report can be considered a snapshot of Google AIO today and I’m certain BrightEdge will be back with new data in the future when Google’s (AI) SERPs eventually change again.

Featured image by Shutterstock/Marco Lazzarini

2,596: How To Make The Most Out Of Google’s Leaked Ranking Factors via @sejournal, @Kevin_Indig

Over the last week, I observed many arguments against digging deep into the 2,596 pages.

But the only question we should ask ourselves is, “How can I test and learn as much as possible from these documents?”

SEO is an applied science where theory is not the end goal but the basis for experiments.

Image Credit: Lyna ™

Boost your skills with Growth Memo’s weekly expert insights. Subscribe for free!

14,000 Test Ideas

You couldn’t ask for a better breeding ground for test ideas. But we cannot test every factor the same way. They have different types (number/integer: range, Boolean: yes/no, string: word/list) and reaction times (meaning the speed at which they lead to a change in organic rank).

As a result, we can A/B test fast and active factors while we have to before/after test slow and passive ones.

A 2x2 grid for prioritizing tests by speed. (Image Credit: Kevin Indig)

Test ranking factors systematically by:

  1. Select a ranking factor.
  2. Select the impacted (success) metric.
  3. Define where you test.
  4. Define the type of test.

Flowchart detailing four steps of testing ranking factors systematically. (Image Credit: Kevin Indig)
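The test-type selection can be sketched as a small decision helper: passive factors are monitored rather than tested, fast factors suit A/B tests, and slow ones need before/after comparisons. The factor names and the exact mapping below are illustrative assumptions, not taken from the leak.

```python
# Sketch: map a ranking factor's properties to a test design (step 4 above).

def pick_test(reaction_speed, controllable):
    """Choose a test type from a factor's reaction speed and controllability."""
    if not controllable:
        return "monitor only"        # passive factors can't be tested directly
    if reaction_speed == "fast":
        return "A/B test"            # fast, active factors suit A/B tests
    return "before/after test"       # slow factors need before/after comparison

# Hypothetical factor catalog: (reaction speed, directly controllable?)
factors = {
    "intrusive_interstitials": ("fast", True),
    "title_optimization":      ("fast", True),
    "core_web_vitals":         ("slow", True),
    "domain_authority":        ("slow", False),
}

for name, (speed, active) in factors.items():
    print(f"{name}: {pick_test(speed, active)}")
```

Keeping the mapping explicit makes it easy to challenge: if a factor reacts faster or slower than assumed, only its catalog entry changes, not the test logic.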

Ranking Factors

Most ranking factors in the leak are integers, meaning they work on a spectrum, but some Boolean factors are easy to test:

  • Image compression: Yes/No?
  • Intrusive interstitials: Yes/No?
  • Core Web Vitals: Yes/No?

Factors you can directly control:

  • UX (navigation, font size, line spacing, image quality).
  • Content (fresh, optimized titles, not duplicative, rich in relevant entities, focus on one user intent, high effort, crediting original sources, using canonical forms of a word instead of slang, high-quality UGC, expert author).
  • User engagement (high rate of task completion).

Demoting (negative) ranking factors:

  • Links from low-quality pages and domains.
  • Aggressive anchor text (unless you have an extremely strong link profile).
  • Poor navigation.
  • Poor user signals.

Factors you can only influence passively:

  • Title match and relevance between source and linked document.
  • Link clicks.
  • Links from new and trusted pages.
  • Domain authority.
  • Brand mentions.
  • Homepage PageRank.

Start with an assessment of your performance in the area you want to test in. A straightforward use case would be Core Web Vitals.

Metrics

Pick the right metric for the right factor based on the description in the leaked document or your understanding of how a factor might impact a metric:

  • Crawl rate.
  • Indexing (Yes/No).
  • Rank (for main keyword).
  • Click-through rate (CTR).
  • Engagement.
  • Keywords a page ranks for.
  • Organic clicks.
  • Impressions.
  • Rich snippets.

Where To Test

Find the right place to test:

  • If you’re skeptical, use a country-specific domain or a site where you can test with low risk. If you have a site in many languages, you can roll out changes based on the leaks in one country and compare relative performance against your core country.
  • You can limit tests to a single page type or subdirectory to isolate the impact as much as possible.
  • Limit tests to pages addressing a specific type of keyword (e.g., “Best X”) or user intent (e.g., ”Read reviews”).

Some ranking factors are sitewide signals, like site authority, and others are page-specific, like click-through rates.

Considerations

Ranking factors can work with or against each other since they’re part of an equation.

Humans are notoriously bad at intuitively understanding functions with many variables, which means we most likely underestimate how much goes into achieving a high rank score, but also how a few variables can significantly impact the outcome.

The high complexity of the relationship between ranking factors shouldn’t keep us from experimenting.

Aggregators can test more easily than Integrators because they have more comparable pages, which lead to more significant outcomes. Integrators, which have to create content themselves, have differences between every page that dilute test results.

My favorite test: One of the best things you can do for your understanding of SEO is scoring ranking factors by your own perception and then systematically challenge and test your assumptions. Create a spreadsheet with each ranking factor, give it a number between zero and one based on your idea of its importance, and multiply all factors.
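This scoring exercise is simple to sketch. The factor names and weights below are illustrative placeholders for your own perception, not values from the leak.

```python
# Minimal sketch of the "score and multiply" exercise: assign each factor
# a perceived importance between 0 and 1, then multiply into one score.
import math

perceived_weights = {
    "content_quality": 0.9,
    "title_relevance": 0.7,
    "link_quality":    0.6,
    "user_engagement": 0.8,
}

composite = math.prod(perceived_weights.values())
print(f"Composite score: {composite:.3f}")  # Composite score: 0.302
```

Multiplication (rather than addition) mirrors the point above about many-variable functions: a single low-weighted factor drags the whole composite down, no matter how strong the others are.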

Monitoring Systems

Testing only gives us an initial answer to the importance of ranking factors. Monitoring allows us to measure relationships over time and come to more robust conclusions.

The idea is to track metrics that reflect ranking factors (for example, CTR could reflect title optimization) and chart them over time to see whether optimization bears fruit. The idea is no different from regular (or what should be regular) monitoring, except for the new metrics.

You can build monitoring systems in:

  • Looker.
  • Amplitude.
  • Mixpanel.
  • Tableau.
  • Domo.
  • Geckoboard.
  • GoodData.
  • Power BI.

The tool is not as important as the right metrics and URL path.

Example Metrics

Measure metrics by page type or a set of URLs over time to measure the impact of optimizations.

Note: I’m using thresholds based on my personal experience that you should challenge.

User Engagement:

  • Average number of clicks on navigation.
  • Average scroll depth.
  • CTR (SERP to site).

Backlink Quality:

  • % of links with high topic-fit/title-fit between source and target.
  • % of links of pages that are younger than 1 year.
  • % of links from pages that rank for at least one keyword in the top 10.

Page Quality:

  • Average dwell time (compared between pages of the same type).
  • % users who spend at least 30 seconds on the site.
  • % of pages that rank in the top 3 for their target keyword.

Site Quality:

  • % of pages that drive organic traffic.
  • % of zero-click URLs over the last 90 days.
  • Ratio between indexed and non-indexed pages.
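The site-quality metrics above can be computed from page-level data. The records and field names below are hypothetical, not a specific analytics tool’s schema.

```python
# Sketch: computing the site-quality metrics from hypothetical page records.

pages = [
    {"url": "/a", "indexed": True,  "organic_clicks_90d": 120},
    {"url": "/b", "indexed": True,  "organic_clicks_90d": 0},
    {"url": "/c", "indexed": False, "organic_clicks_90d": 0},
    {"url": "/d", "indexed": True,  "organic_clicks_90d": 45},
]

total = len(pages)
driving_traffic = sum(1 for p in pages if p["organic_clicks_90d"] > 0)
zero_click = sum(1 for p in pages if p["organic_clicks_90d"] == 0)
indexed = sum(1 for p in pages if p["indexed"])

print(f"% of pages driving organic traffic: {driving_traffic / total:.0%}")
print(f"% of zero-click URLs (last 90 days): {zero_click / total:.0%}")
print(f"Indexed to non-indexed ratio: {indexed}:{total - indexed}")
```

Running the same computation per page type or URL set, before and after a change, gives the over-time comparison the monitoring section describes.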

It’s ironic that the leak happened shortly after Google started showing AI-generated answers in results (AI Overviews) because we can use AI to find SEO gaps based on the leak.

One example is title matching between source and target for backlinks. With common SEO tools, we can pull titles, anchor text, and surrounding content of the link for referring and target pages.

We can then rate the topical proximity or token overlap with common AI tools, Google Sheets/Excel integrations, or local LLMs and basic prompts like “Rate the topical proximity of the title (column B) compared to the anchor (column C) on a scale of 1 to 10 with 10 being exactly the same and 1 having no relationship at all.”
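As a rough, deterministic stand-in for the prompt-based rating, token overlap between title and anchor can approximate topical proximity on the same 1-10 scale. This Jaccard heuristic is an illustration, not the AI approach described above.

```python
# Sketch: rate title/anchor topical proximity on a 1-10 scale via
# Jaccard token overlap (a simple non-AI stand-in for the LLM rating).

def proximity_score(title, anchor):
    """Return 1-10: 10 = identical token sets, 1 = no overlap."""
    title_tokens = set(title.lower().split())
    anchor_tokens = set(anchor.lower().split())
    if not title_tokens or not anchor_tokens:
        return 1
    jaccard = len(title_tokens & anchor_tokens) / len(title_tokens | anchor_tokens)
    return max(1, round(jaccard * 10))

print(proximity_score("best running shoes 2024", "best running shoes"))  # high overlap
print(proximity_score("best running shoes 2024", "click here"))          # no overlap
```

A heuristic like this scales to large link exports for free; the LLM rating is better at catching topical matches that share no literal tokens (e.g., synonyms), so the two can be combined.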

A spreadsheet displaying SEO page titles, anchors, AI ratings, and explanations. Using AI to rate title-match between link sources and targets. (Image Credit: Kevin Indig)

A Leak Of Their Own

Google’s ranking factor leak isn’t the first time the inner workings of a big platform’s algorithm have become available to the public:

1. In January 2023, a Yandex leak revealed many ranking factors that we also found in the latest Google leak. The underwhelming reaction surprised me just as much back then as today.

2. In March 2023, Twitter published most parts of its algorithm. Similar to the Google leak, it lacked “context” between the factors, but it was insightful nonetheless.

Twitter’s algorithm in a system chart. (Image Credit: Kevin Indig)

3. Also in March 2023, Instagram’s chief Adam Mosseri published an in-depth follow-up post on how the platform ranks content in different parts of its product.

Despite the leaks, there are no known cases of a user or brand hacking the platform in a clean, ethical way.

The more a platform rewards engagement in its algorithm, the harder it is to game. And yet, the Google algorithm leak is quite interesting because it’s an intent-driven platform where users indicate their interest through searches instead of behavior.

As a result, knowing the ingredients for the cake is a big step forward, even without knowing how much of each to use.

I cannot understand why Google has been so secretive about ranking factors all along. I’m not saying it should have published them to the degree of the leak. But it could have incentivized a better web with fast, easy-to-navigate, good-looking, informative sites.

Instead, it left people guessing too much, which led to a lot of poor content, which led to algorithm updates that cost many businesses a lot of money.




Featured Image: Paulo Bobita/Search Engine Journal