Ask An SEO: Balancing Content That Converts With Content That Builds Brand Authority via @sejournal, @MordyOberstein

This week’s Ask an SEO question comes from Rachel P., who wants to find the middle ground between conversion-focused pages and authority-building content:

“How do I balance content that converts with content that builds brand authority? It feels like my CRO pages don’t rank well, and my blog posts don’t convert.”

It’s hard not to think of your marketing in parts. There’s the part of the strategy that focuses on brand awareness, while another aspect focuses on conversions or whatever key performance indicator you’re tracking.

That might be how you’ve been taught to do marketing, but that’s not how marketing actually works, in my opinion. I know that’s a contentious statement. I’ll do my best to back it up.

Marketing Is One Song

All of your marketing should work together to present a unified process that supports itself at each step of the way. Everything should work together harmoniously.

The question here almost assumes they don’t. We have content meant to build authority, content meant to convert, and so on. What if we just had content?

Instead of siloing all of our efforts, what if we looked at it all as being about “generating momentum”? Producing a certain level of brand inertia and energy that could be captured or bottled up.

What if we looked at it as a process across multiple touchpoints, pages, and platforms (many of which are offline) of deepening the connection between our brands and our audience or consumer?

In that framework, it’s more about ensuring you give your audience the opportunity to move further down the rabbit hole than it is about “ranking your conversion” pages.

To me, the notion of “getting our conversion pages to rank” is more aligned with how you work and far less aligned with how your audience works.

You want to give your audience “access” to convert or reinforcement to convert as they feel connected and engaged with you.

The connection and sense of engagement are what drive conversion. To paraphrase baseball legend Yogi Berra, conversions are 90% about connection; the other half is product or service awareness.

In simple terms, I would focus more on how to build yourself up and engage an audience than on worrying about how to balance conversion rate-optimized content and informational content.

If your audience is engaged with your informational content and no one cares about your landing pages, who cares if the latter ranks well?

Use the blog to reinforce your offering. If that’s where people want to engage you, then engage them there.

Throw a banner up on the side of the blog. Include a call to action here or there. Add screenshots and tie the content into your product (naturally), etc.

Meet your audience where they are engaged and connected to you, and guide them to details on your offering from there.

This means I wouldn’t worry about balancing content and ranking. I’d worry about, “How do I engage my audience and remind them at the right time of my offering? How do I ensure that if they are engaged and want to move things forward, they have the access they need to be able to do that?”

What Does This Mean Practically?

Instead of building out two separate sets of pages in this case, think of them as one thing.

Authority and trust create a connection. Connection is the straw that stirs the drink. It’s what takes what might be a sterile statement about your brand’s offering and makes it intriguing.

Often, I find marketers think of this in reverse order: “My landing pages will rank or whatever, and then the consumer will see my informational content and be reassured. This reassurance will enable them to convert.”

It’s the opposite. In whatever way, whether it be via social, informational content, or company advocates, people form a connection with you.

This connection emotionally enables them to even consider opening their wallets or recommending your offering, etc. (i.e., conversions).

In my opinion, what you want to do is simply remind and reinforce. In those moments of connection, you want to remind the consumer of your offering, reinforce its value, and ensure they have easy access to it.

YouTubers sometimes do a great job of this. They offer informational value that creates a connection with their audience. Then, they ask you to subscribe (which in and of itself is a conversion).

They will reinforce that connection with additional content, and then, at some point, will mention their product, service, or whatever it is they offer.

They’ll tell you to check out the link in the description (because as I mentioned, you need to offer easy entry points), and take it from there.

I feel it’s more important to engage your audience and make sure the material you want them to convert with is easy to reach than to worry about how various sets of content with various intents perform.

If your audience is really engaged with your social media content, make sure links are included in your profile and post about your offering. (I like the 80/20 rule here: 80% value-driven content, 20% promotional. I often go 90/10.)

My point is, it’s not about page types. It’s about engaging and connecting with your audience and properly utilizing that connection to make your audience aware of your offering and to make it as frictionless as possible thereafter.

New Ways For A New Web

This sort of dichotomy between areas of marketing and aspects of your marketing strategy works for what I call the “old web.” To be honest, it didn’t work as well as we all bragged about on the “old web” either.

Regardless, I see the “new web” as being about success by refinement. In this era of the web, having a consistent message and presence that creates a strong and demarcated identity matters most.

We’re in a web environment where there has to be a strong connection to the audience if you expect them to notice you and purchase from you.

That means taking better advantage of opportunities to create pathways to products within those moments of audience engagement – not separate and siloed conversion paths.

It also means pushing product pathways in a far less aggressive and inorganic way as well, but that’s a conversation for another day.


Ask An SEO: How AI Is Changing Affiliate Strategies via @sejournal, @rollerblader

This week’s Ask an SEO question about affiliate strategies comes from Mike R:

“How is AI changing affiliate marketing strategy in 2025? I’m concerned my current approach will become obsolete but don’t know which new techniques are actually worth adopting.”

Great question, Mike. I’m seeing a few trends and strategies that are changing, for the better and for the worse.

When AI is used properly in the affiliate marketing channel, it can help businesses and brands grow.

If any of the three types of businesses (defined below) in affiliate marketing use it in a way that AI and large language models are not ready for “yet,” it can backfire.

I’m answering this question in three parts, as I’m unsure which side of the industry you’re on.

For the record: The affiliate channel is not at risk (i.e., affiliate marketing is not dead) because affiliate marketing is more than content websites that create lists or write reviews and coupon sites that intercept the end of the sale.

Affiliate marketing is a mix of all marketing channels, including email, SMS, online and offline communities, PPC, media buying, and even print media.

It is not going to be as impacted by AI as SEO and content marketing – and in many ways, it will likely grow and scale from it.

1. Affiliates (Content Creators, Publishers, Media Houses, Etc.)

Affiliates are the party that promotes another brand in hopes of earning a commission.

Here’s some of what I’m seeing regarding the use of AI and its impact on affiliate revenue.

Programmatic SEO And Content Creation

Programmatic SEO is not new, and using LLMs to create content or lists is burning what were quality sites to the ground.

It is almost never a good idea; it doesn’t matter if AI can spin up content and get it publish-ready in minutes.

In the early 2000s, affiliates and SEO professionals would use pre-AI article spinners to create massive quantities of content from one or two professionally written and fact-checked articles, then publish them to blogs and third-party publishing platforms like Squidoo.

This is equivalent to affiliates publishing their content on Reddit or LinkedIn Pulse to rank it.

The algorithms caught up and penalized the affiliate websites. Squidoo and some of the third-party platforms managed to stay afloat as they had trust and a strong user base for a while.

Next, PHP became the go-to for programmatic SEO, and affiliates would generate shopping lists or pages with unique mixes of products and descriptions via merchant data feeds and network-provided tools. Then, these got penalized. Again, nothing new.

Media companies have been getting penalized and devalued for years for this, and plenty of content creators, too.

If an affiliate manager is telling you to use LLMs to create content, or someone is using LLMs and AI to do programmatic SEO, look for advice elsewhere.

I’ve watched multiple quality sites fall since ChatGPT, Perplexity, and others began writing and spinning their content.

Content And Creator Value

In traditional affiliate marketing, if an affiliate is not making sales, even if they send quality traffic, they get ignored. LLMs have changed this 100%.

I’ve seen affiliates, including bloggers, YouTubers, forums, and social media influencers, being sourced and cited by AI systems.

If a brand is not in the content being used for fact-checking (grounding) and sourcing, it begins to disappear from outputs and results. I’m seeing this firsthand.

Placements that don’t drive traffic or sales, or that sit at number seven to 10 on a list, now have value. The citations and mentions from the resources that LLMs trust can help your brand gain visibility in AI.

Affiliates can and should begin charging extra fees for these placements until the LLMs begin penalizing or ignoring pay-to-play content.

We’re likely a couple of years away from their algorithms being anywhere near that advanced, so it is a prime opportunity while Google is reducing traffic to publishers via AI Overviews.

Coupon Sites For Top And End-Of-Sale Touchpoints

I think coupon sites are going to take a substantial hit, as AI is starting to create its own lists of coupons that work.

These AI-generated lists also include where and how to save, where to shop, and current deals on specific products. For example: “I want to buy a pair of Asics Kayano 32 men’s running shoes and get them on sale. Where can I find a deal?”

Right now, Google’s AI Overviews are populating lists of where to find deals, and it is showing the coupon sites as the sources to the right. These sites are likely getting clicks now.

I’ve seen ChatGPT pull the codes directly, removing the need to click through to the coupon website and set its affiliate tracking. It does show the website the code came from, though – there’s just no reason to click since you get the code in the output.

One interesting thing is that ChatGPT may pull in vanity codes.

When ChatGPT’s output features these vanity codes, the influencer the code was sourced from (or a coupon site) could get credit for the sale, throwing attribution off: the coupon triggered the commission even though the user was in the LLM the whole time.

The influencer did not have anything to do with this transaction, but they’ll be getting credit.

The brand may now pay more money to the influencer, when, in reality, it should be ChatGPT – that is where the customers are, not the influencer.

By showing where to find deals and which deals are available by product (not brand), AI eliminates one of the deal and coupon sites’ top-of-funnel traffic strategies for brands.

The biggest hit I see coupon sites taking is ranking in search engines for “brand + coupon” for the last-second click from someone who is already in the brand’s shopping cart.

If Google AI Overviews creates its own coupon lists as the output, like ChatGPT is doing, there is no reason to click on a coupon website and click their affiliate links.

But, don’t count deal and coupon sites out. They still have email lists and social media accounts that can drive top-funnel traffic, and they can reintroduce customers who have forgotten about you by utilizing their own internal databases of shoppers.

2. Affiliate Manager And Affiliate Management Agencies

These are the people who manage programs by recruiting affiliates into the program, giving the affiliates the tools they need, and ensuring the data on the network is tracked and accurate so the brands being promoted have the sales and touchpoints they’re looking for.

Content Sites That Lost Traffic

Some managers hit the panic button because they relied on content sites and publishers who have SEO rankings, but AI Overviews is using affiliate and publisher content and not sending the same amount of traffic to the publishers.

This reduces the number of clicks and traffic. The publishers are still driving traffic, but it is coming in via Google and not the affiliate channel.

With that said, affiliate managers can shift their focus to channels not as impacted by AI Overviews, including:

  • Discord.
  • Platforms like Skool.
  • Social media groups.
  • YouTube channels.
  • Influencers.

Fraud Sign Ups

From speaking to others, it appears that high-quality publisher accounts are being created en masse as fronts for fraud and fake affiliate accounts.

I’ve had conversations with people hired by the fake affiliate accounts who are paid to talk to the affiliate manager, making these sites look even more legit. We’ll have back-and-forth emails, and in some cases, a call.

Once the traffic and sales start, it turns out to be stolen credit cards or program violations. In some instances, the person or websites they applied with no longer exist.

Interestingly, they may reactivate a year later, thinking you have forgotten about them – the site magically reappears once they know you’re not checking.

Always evaluate a site, and if the content is being generated by LLMs or AI, it may be best to reject it and reduce the risk of a fake account.

AI content may rank temporarily, but this is not a long-term strategy. If your brand is being written about by AI and spun out to a site via programmatic SEO, there is a reasonable chance that the details won’t be as factual or as on-brand as they should be.

An affiliate who cannot take the time to create good content and use AI to edit, versus using AI to create and then edit, should not be trusted in your affiliate program.

Non-Factual Information And False Claims

When your affiliates are generating content or fact-checking via LLMs and AI, they’re not doing their jobs as your partners to promote your program factually, with correct talking points, and following brand guidelines.

There’s a reasonable chance that incorrect claims about financial products, medical treatments, or even books to buy and read will be in the content you, as a brand, are paying to have made.

Even if you’re paying on a performance basis, you are approving this content to be live and represent your brand. This is why affiliates in your program using AI to create content are at high risk.

Set rules and enforce them so that your brand cannot be included in any AI-created content, or remove the affiliate from your program until they’re ready to treat your brand or your clients’ brands with the same care as you do.

Partner Matching And Approvals

One interesting use of AI for affiliate management is merchant and affiliate matching using machine learning and AI by agencies and larger brands.

Just because a partner does well in one vertical or with one affiliate program that has a similar audience, it does not mean it is a good match for others.

  • One program may allow end-of-sale touchpoints while the other does not. The top partners that use low-value clicks should not be allowed in a similar program that does not (or will not) match it. If the programs are on auto-approve or using AI to approve affiliates that do well in specific verticals, the TOS is likely no longer being enforced.
  • A partner may make a ton of T-shirt sales in one program, but their audience may not respond to the colors, social causes, or price points of another merchant. If the affiliate is part of AI matching and starts to lose money because they got matched to new T-shirt shops, they may move away from the affiliate channel or focus on it less because they’re making less money and getting bad recommendations from the agencies and managers.
  • If the program trusts AI to do matching, but has restrictions like requiring advertising disclosures or using factual information, the machine learning likely won’t be able to check for this, and partners that are not a fit can get in.
  • Automating approvals because they pass an AI review or scan is risky, as AI will miss things that an experienced affiliate manager will find, like advertising disclosures in the wrong space and false claims in the industry or space in content.

One exception to using AI for matching is using it to build a list of potential partners from a database. But automatically approving everyone on that list just because they appear in the output is problematic.

Each affiliate that is recommended still needs to be vetted by hand to make sure they meet the requirements of the new program.

Recruitment And List Building

Some of the best uses of AI, especially LLMs, have been building lists of potential partners.

You can train GPTs to validate the lists, remove current partners so you don’t accidentally email or call them, do a gap analysis, and even customize the recruitment email to a very strong degree.

No, it isn’t perfect, but you can save hours each week on the manual tasks of discovery, validation, and outreach.
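
Even the “remove current partners” step alone is worth automating before any outreach happens. A trivial Python sketch of that filter (the domains are made up for illustration):

```python
# Drop current partners from a freshly built prospect list
# so nobody already on the roster gets a cold recruitment email.
current_partners = {"runningblog.example", "dealsite.example"}

prospects = [
    "runningblog.example",   # already a partner -> filtered out
    "marathonnews.example",
    "shoereviews.example",
]

new_outreach = [p for p in prospects if p not in current_partners]
print(new_outreach)  # ['marathonnews.example', 'shoereviews.example']
```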

The recruitment emails still need to be reviewed and sent manually, but it is a massive time-saver.

We manually review every email before it goes out and have to do a decent chunk of rewriting, but we’re saving large amounts of time, too.

We also pre-schedule the emails using a database tool, but we’ve slowly begun implementing new discovery and drafting methods, and they’re turning out to be fantastic.

I was a non-believer in AI for this at first, but now I’m about ready to double down, especially as the systems advance.

3. Affiliate Networks

These are the tracking and payment platforms that power the affiliate programs.

Affiliates rely on them to accurately record sales and release payments.

Affiliate managers use them to track progress, simplify paying partners around the world, and generate reports based on the key performance indicators (KPIs) their company uses.

Better Controls

All of the networks we’re working on have an influx of AI-generated sites. I’ve talked to agencies and managers on the ones we don’t work on, and they’re seeing the same.

The networks would be wise to add filters and alerts that tell affiliate managers whether an affiliate is human or AI-generated – in other words, whether the site and promotional method have any quality control behind them.

There are no advanced controls in place on any networks that I’ve seen specifically for AI affiliates. But most networks do have compliance teams to which you can report fake accounts.

From the networks I’ve talked to, they’re working on solutions to help detect and reject these sites, but it is a massive problem because they’re being generated at high volumes, and some are really hard to detect.

The spammers and scammers are getting smarter, and AI has given them a new advantage.

Partnership Matching

This is a double-edged sword. Networks have more data than any affiliate agency, and they may be best suited to try partner and program matching algorithms.

They can create a list of programs that an affiliate may want to test, or a list of partners a program manager can pay to recruit based on program goals and dimensions.

The downside is that programs spend countless hours recruiting partners for their programs. Networks doing matching and recruitment take that work and give it for free to that program’s competitors.

A second downside is that affiliates already get bombarded with program requests, and network-driven matching can cause that volume to skyrocket, making it harder to get them to open emails, including program updates and newsletters.

Once they start ignoring emails because there are too many, you may not get compliance issues fixed or run promotions that would normally have benefited both parties.

Reporting

One of the most beneficial things a network can do – though few are currently doing it at mass scale (some are starting to, and it’s looking promising) – is to use AI to create custom reports for affiliate programs. These could be charts and graphs on trends over XYZ years.

Another is a gap analysis of products that get bundled together by type of affiliate, and then which similar affiliates already in the program don’t have a specific SKU in their orders.

The manager can recommend pre-selling the SKU within the content that drives the sale, or adding that specific SKU as an upsell to any customer who came from that affiliate’s link, based on the affiliate ID passed in the URL.

Reporting can also show trends in cross-channel (SEO, email, PPC, SMS, etc.) touchpoints, how they shift seasonally and annually, and whether a given goal creates more or fewer sales for the affiliate channel or the company as a whole.

One important thing to remember is that not all affiliate networks offer true cross-channel reporting. Many only offer it once the user has clicked an affiliate link.

Final Thoughts

AI is going to be amazing and horrible for each of the three entities above that make up the affiliate marketing channel.

If used correctly, it can save time, increase efficiency, and create more meaningful strategies.

At the same time, it could result in violations of a program’s Terms of Service (TOS), steal traffic from publishers, and harm multiple types of businesses.


Ask An SEO: Why Is GA Reporting Higher Organic Traffic Than GSC? via @sejournal, @HelenPollitt1

Today’s question centers on the differences in Google Analytics 4 and Google Search Console measurement:

“I’m reaching out for help with a puzzling issue in Google Analytics 4 (GA4). We’ve experienced a sudden and unexplained surge in traffic over a four-day period, but surprisingly, Google Search Console (GSC) doesn’t show any corresponding data.

The anomaly is specific to organic search traffic, and it’s only affecting our main page. I’d greatly appreciate any insights you can offer on what might be causing this discrepancy.”

Why GA4 And GSC Report Different Traffic Numbers

It’s a very interesting and common question about data from Google Analytics 4 and Google Search Console.

They are both Google products, so you could assume their data would be consistent. However, it isn’t, and for very good reasons.

Let’s take a look at the differences between the two.

Traffic Mediums

Google Analytics measures user interactions with a digital property. It is highly customizable and can even accept data inputs.

Google Search Console provides an overview of your website’s performance in Google Search.

This means that Google Analytics 4 is measuring traffic from all types of sources, including paid search campaigns, email newsletters, display ads, and direct visits.

Google Search Console is far narrower in scope, as it only reports on Google Search traffic.

Organic Sources

Another key difference to remember is that when reporting on organic traffic, Google Analytics will look at all sources marked as “organic search,” which includes other search engines like Bing, Naver, and Yandex.

This means that unless you instruct Google Analytics 4 to filter the organic search sources to only Google, you will see vastly different numbers between the two programs.

Clicks And Sessions

The two most comparable metrics are Google Analytics 4’s “sessions” and Google Search Console’s “clicks.” However, they are not identical metrics.

A “session” in GA4 is counted when a user either opens your app in the foreground or views a page of your website. By default, a session ends after 30 minutes of inactivity, although this can be altered through your configuration of GA4.

A “click” in Google Search Console is counted when a user clicks on a link displayed in Google Search (across web, images, or video, and including News and Discover).

Reasons For Higher GSC Clicks Than GA4 Sessions

As you can imagine, these small but critical differences in the technical ways these two metrics are counted can have a significant impact on the end volumes reported.

There are other reasons that can impact the final numbers.

Typically, we see Google Search Console’s “clicks” being higher than Google Analytics’ organic “sessions” from Google.

Let’s assume a user clicks on an organic search listing on Google Search and arrives at the webpage it links to. What would be registered in different scenarios?

Cookies

This is a differentiating factor that is becoming more prominent as laws surrounding cookie policies change.

GA4 requires cookies to be accepted in order to track a user’s interaction with a website, whereas GSC doesn’t.

This means a user might click on an organic search result in Google Search, which registers as a “click” in Google Search Console, arrive on the webpage, but not accept cookies. The result: one click registered in Google Search Console, but no session registered in Google Analytics 4.

JavaScript

GA4 won’t work if JavaScript is blocked on the website, whereas GSC doesn’t rely on your site’s code to track clicks – it is based on search engine-side data, so it will continue to register clicks.

If JavaScript is blocked in some way, this would again result in a click being registered on Google Search Console, but no session being registered in Google Analytics 4.

Ad Blockers

If the user is utilizing an ad blocker, it may well suppress Google Analytics 4, preventing the session from being registered.

However, since Google Search Console is not affected by ad blockers, it will still register the click.

Tracking Code

Google Analytics 4 only tracks pages that have the GA4 tracking code installed on them.

If the URL the user clicks on from Google Search results does not contain the tracking code, Google Search Console will still register the click, but Google Analytics will not register the session.

Filters And Segments

GA4 allows filtering and segments to be set up that can discount some visits or reclassify them as coming from another source or medium.

Google Search Console does not allow this. It means that if the user clicks on a URL and displays some behavior that gets it caught in a filter, then Google Analytics may not count that session, or may reclassify it as coming from a source other than Google.

In that instance, Google Search Console would register the click, but Google Analytics 4 may not register the session, or may register it as a different source or medium.

Similarly, if your GA4 account has segments set up and these are not properly managed during the reporting process, you may find that you are only reporting on a subset of your Google organic data, even if the full data has been captured correctly by Google Analytics 4.

Why GA4 Might Report More Sessions Than GSC Clicks

In your case, you’ve mentioned that you have seen a surge in organic search traffic to your main page only. Let’s look at some of the potential reasons that might be the case.

Semantics

I want to start by looking at the technicalities. You haven’t specified what metric you are using to determine “traffic” in Google Analytics 4.

For example, if you are using “page views,” then that would not be a closely comparable metric to Google Search Console “clicks,” as there can be several page views per session.

However, if you are looking at “sessions,” that is more comparable.

Also, you haven’t specified whether you have filtered down to look at just Google as the source of the organic traffic, or if you might be including other search engines as sources as well.

If other search engines are included, you are likely seeing much higher session counts reported in Google Analytics, as Google Search Console only reports on Google clicks.

Tracking Issues

I would start by looking at the way tracking has been set up on your site. It could be that you have incorrectly set up cross-domain tracking, or there is something causing your tracking code to fire twice, only on the homepage.

This could be causing inflated sessions to be recorded in your Google Analytics 4 account.

Multiple Domains

The way you have set up your Google Analytics 4 properties may be quite different from your Google Search Console account.

In GA4, it’s possible to combine multiple domains under one property, whereas in GSC, you cannot.

So, for example, if you have a brand with multiple ccTLDs like example.com, example.fr, example.co.uk, then you will have these set out as separate properties in Google Search Console.

In Google Analytics 4, however, it’s possible to combine all these websites to show an overall brand’s website traffic.

It might not be obvious at first glance when looking at your homepage’s traffic, as you’ll likely only see one row with “/” as the reported URL.

When you add “hostname” as an additional column in those reports, you will be shown a breakdown of each ccTLD’s homepage, rather than a combined homepage row.

In this instance, it might be that you are viewing the Google Search Console account for one of your ccTLDs, e.g., example.com, whereas when you look at your Google Analytics 4 traffic, you may be viewing a row detailing the combined ccTLDs’ homepages’ traffic.
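
A quick way to confirm this is to add the hostname breakdown yourself. Here’s a minimal pandas sketch on a hypothetical GA4 export (the column names and numbers are illustrative):

```python
import pandas as pd

# Hypothetical export of a GA4 landing-page report with the
# "hostname" dimension included.
df = pd.DataFrame({
    "hostname": ["example.com", "example.fr", "example.co.uk"],
    "landing_page": ["/", "/", "/"],
    "sessions": [1200, 450, 300],
})

# Without hostname, all three homepages collapse into one "/" row.
print(df.groupby("landing_page")["sessions"].sum())   # "/" -> 1950

# With hostname, each ccTLD's homepage is reported separately.
print(df.groupby(["hostname", "landing_page"])["sessions"].sum())
```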

Length Of A Standard Session

Google Search Console tracks clicks from Google Search. It doesn’t go much beyond that initial journey from SERP to webpage. As such, it is really reporting on how users got to your webpage from an organic search.

Google Analytics 4 is looking at user behavior on your site, too. This means it will continue to track a user as they navigate around your site.

As mentioned, by default, Google Analytics 4 will only track a session for 30 minutes unless another interaction occurs.

If a user navigated to your website, landed on the homepage, and then took a phone call for an hour, they might be shown as languishing on your homepage for 30 minutes.

Then, when they come back to their computer and navigate from your homepage to another page, it will count as a second session starting.
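
To see how the inactivity timeout produces two sessions in a scenario like that, here’s a minimal Python sketch of the rule (a simplification of GA4’s real sessionization; the timestamps are illustrative):

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)  # GA4's default inactivity window

def count_sessions(timestamps, timeout=SESSION_TIMEOUT):
    """A new session starts whenever the gap since the previous
    interaction exceeds the inactivity timeout."""
    sessions, last_hit = 0, None
    for ts in sorted(timestamps):
        if last_hit is None or ts - last_hit > timeout:
            sessions += 1
        last_hit = ts
    return sessions

hits = [
    datetime(2025, 5, 1, 9, 0),   # lands on the homepage
    datetime(2025, 5, 1, 10, 5),  # returns after an hour-long phone call
]
print(count_sessions(hits))  # -> 2 sessions from one visitor
```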

It is most likely that in this scenario, the second session would be attributed to direct/none, but there may be cases where Google Analytics 4 is able to identify the previous referral source.

However, it is unlikely that this would cause the sudden spike in organic traffic that you have noticed on your homepage.

Bots Mimicking Google

It might well be that Google Analytics 4 is being forced to classify landing page traffic incorrectly as coming from an organic search source due to bot traffic spoofing the referral information of a search engine.

Google Search Console is better at filtering out this fake traffic due to the way it records interactions from Google Search to your website.

If there is a surge of bots visiting your homepage with this fake Google referrer, they may be incorrectly counted by Google Analytics 4 as genuine visitors from Google Search.

Misclassified UTMs

UTM tracking is often used within paid media campaigns to assign value to different campaigns more accurately.

It enables marketers to specify the medium, source, and campaign from which the traffic came if it clicked on their advert. However, mistakes happen, and quite often, UTMs are set up incorrectly, which alters the attribution of traffic irrevocably.

In this instance, if a member of your team was testing a new campaign, or perhaps using a UTM as part of an internal split test, they may have incorrectly specified “organic” as the medium instead of the correct value.

As such, when a user clicks on the advert or participates in the split test, their visit may be misattributed as organic instead of the correct source.

If your team is testing something and has used an incorrect UTM, this would explain a sudden surge in organic traffic to your homepage.

UTMs do not affect Google Search Console in this way, so the traffic that is misattributed in Google Analytics 4 would not register in Google Search Console as an organic click.
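
If you suspect a mis-tagged campaign, auditing your tagged URLs is straightforward. A minimal Python sketch (the URLs are hypothetical):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical campaign landing-page URLs pulled from recent tests.
urls = [
    "https://example.com/?utm_source=newsletter&utm_medium=email",
    "https://example.com/?utm_source=google&utm_medium=organic",  # suspicious
    "https://example.com/?utm_source=facebook&utm_medium=cpc",
]

# Flag any tagged URL whose utm_medium would be bucketed as organic.
for url in urls:
    medium = parse_qs(urlparse(url).query).get("utm_medium", [""])[0]
    if medium.lower() == "organic":
        print("Will inflate organic traffic:", url)
```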

In Summary

There are myriad reasons why Google Analytics 4 may report a different volume of homepage sessions than the homepage clicks Google Search Console reports.

When using these two data sources, it’s best to recognize that they report on similar but not exactly the same metrics.

It is also wise to recognize that Google Analytics 4 can be highly customized, but improper setup may lead to data discrepancies.

It is best to use these two tools in conjunction when working on SEO to give you the widest possible view of your organic search performance.


Ask An SEO: How Can I Improve The Visibility Of My Category Pages? via @sejournal, @kevgibbo

This week’s Ask an SEO question comes from a medium-sized ecommerce site manager who’s run up against a common problem:

“Our product pages rank well, but our category pages rarely appear in search results. What specific optimization strategies would you recommend for category pages to improve their visibility?”

Thanks for the question!

It’s a common issue for ecommerce site managers. You have lots of category pages that would present a good opportunity for driving traffic, but they just don’t seem to be getting visibility in the search engine results pages.

First Thoughts

If your product pages are ranking well but your category pages are struggling more in search results, it’s likely due to the greater competition for broader, middle-of-the-funnel keywords.

While product pages can capture long-tail, bottom-of-the-funnel queries, category pages often struggle with more competitive, high-traffic terms.

Here are a few key reasons your product pages might be outshining your category pages, along with some tips to give those category pages a boost:

1. Technical Accessibility

There might be incorrect indexing directives. Category pages won’t rank well if basic technical elements aren’t working correctly.

To ensure your category pages are fully crawlable and indexable by search engines, check these aspects (a quick audit sketch follows the list):

  • On-page directives: Ensure noindex tags aren’t blocking your category pages from appearing in search results.
  • Robots.txt file: Double-check that your robots.txt file isn’t unintentionally blocking important category pages.
  • Canonical tags: Confirm that canonical tags are correctly set to point to the preferred version of each page.
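
Here’s a minimal Python sketch for spot-checking those three basics on a category URL (the URL is a placeholder, and the string matching is deliberately naive – a real audit would parse the HTML and verify the canonical’s target):

```python
import requests
from urllib import robotparser
from urllib.parse import urlparse

def check_category_page(url):
    # 1. Robots.txt: is the URL crawlable at all?
    root = "{0.scheme}://{0.netloc}".format(urlparse(url))
    rp = robotparser.RobotFileParser(root + "/robots.txt")
    rp.read()
    print("Crawlable per robots.txt:", rp.can_fetch("*", url))

    # 2 & 3. On-page directives and canonical (naive string checks).
    html = requests.get(url, timeout=10).text.lower()
    print("Mentions noindex:", "noindex" in html)
    print("Has a canonical tag:", 'rel="canonical"' in html)

check_category_page("https://example.com/category/running-shoes/")
```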

2. Site Architecture And Internal Linking

It’s possible that your site architecture is designed to give more link equity to product pages rather than category pages.

To improve category page visibility, focus on linking more frequently to those category pages, especially the ones that have the ability to drive the highest amount of revenue.

This can be done through linking from other categories, blog posts, guides, and more. By ensuring category pages are linked to more often, you help search engines understand their importance and authority.

This is why conducting an opportunity analysis early in your SEO strategy is crucial. It helps identify the category pages that should receive the most internal linking support.

A final point on linking: Make sure your breadcrumbs are optimized and visible. Not only does that help visitors understand where they are on your site, but it might also encourage them to explore more of what you have to offer.

3. Issues With Faceted Navigation

Faceted navigation is an essential feature for large ecommerce websites, allowing users to filter product searches. However, if not properly managed, it can pose significant SEO challenges.

One of the primary concerns is “index bloat” – the creation of multiple, often duplicate URLs for each possible filter combination.

This can exhaust your crawl budget, and search engines can then overlook critical pages.

Also, improper implementation can result in duplicate content, cannibalize rankings for category pages, and dilute internal link equity.

To avoid this, I recommend limiting the number of indexed filter combinations at any given time – ideally no more than two.

The specific number will depend on the range of filters available, but it’s crucial to prioritize filters that align with search demand.

For example, avoid indexing a combination like “size 7, green, wide fit, running shoe” if there’s minimal search volume for it.

However, “green size 7 running shoe” could be a valuable combination to index, as it has higher search intent.
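
As a sketch of how that rule could be enforced, here’s a minimal Python example that decides whether a filtered URL should be indexable (the facet parameter names are hypothetical, and in practice you’d also weigh search demand for each combination):

```python
from urllib.parse import urlparse, parse_qs

FACETS = {"color", "size", "fit", "activity"}  # hypothetical facet params
MAX_INDEXABLE_FACETS = 2  # the rule of thumb above

def should_index(url):
    """Allow indexing only when at most two facets are active."""
    params = parse_qs(urlparse(url).query)
    active = [p for p in params if p in FACETS]
    return len(active) <= MAX_INDEXABLE_FACETS

print(should_index("https://example.com/running-shoes?color=green&size=7"))           # True
print(should_index("https://example.com/running-shoes?color=green&size=7&fit=wide"))  # False -> noindex
```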

4. Insufficient Or Low-Quality Content On Category Pages

Over the years in this industry, I’ve seen firsthand how impactful on-page copy can be for category pages. It provides extra context that helps search engines better understand the focus of your pages.

After all, search engines prioritize pages with valuable content that provides context for users.

Many category pages are nothing more than long lists of products and icons. That’s a real missed opportunity – and also makes them less likely to surface in the SERPs.

Here are a few ways to boost their chances:

Short Introductions At The Top

On many ecommerce sites, you’ll notice there’s often a short block of intro copy at the top of the page.

This doesn’t need to be more than 100 words or so and is an effective way of helping search engines understand the page’s context. Avoid fluff or boilerplate copy; it needs to be unique and meaningful.

Tip: Explain what the category is, and the broad range of products or brands you sell.

Say the category page was “running shoes.” The intro could talk about all the materials the running shoes are made from, colors available, types of runs they can be used for, and so on.

Guidance Lower Down

Further down the page, you can include additional content modules to help the customer make an informed decision.

Ecommerce stores often use things like:

  • FAQs.
  • Feature comparisons.
  • More information about your brand.
  • Information on how to choose between products.
  • Videos.
  • Delivery information.

5. Lack Of On-Page Optimization

Your on-page optimization for category pages might not be fully aligned with search intent, so it’s worth reviewing and refining it to better match what users are searching for.

Page Titles

If category pages have generic or poorly optimized page titles, search engines may struggle to understand the page’s relevance, and users won’t feel enticed to click on the result in SERPs.

When creating them:

  • Review current SERPs to see what’s working for competitors.
  • Keep titles unique for each category to avoid duplication, and aim for 50-60 characters to prevent truncation in search results.
  • Ensure your titles reflect what users are looking for – like specific product attributes (e.g., color, size) when relevant.

Meta Descriptions

A compelling meta description for a product listing page (PLP) should give users a reason to click, showcasing its offering and value.

Keep the meta description within 150-160 characters to avoid truncation, and craft it to answer potential user queries, like “best [category] for [specific need].”

Header Tags

When you’re reviewing header tags for categories, the key is to capture the essence of the entire category while speaking to the intent of shoppers browsing or filtering options.

Start with a clear, keyword-rich H1 that tells users exactly what the page is about, like “Men’s Running Shoes.”

Then use H2 tags to break things down further with subcategories or popular filters, such as “Top Rated” or “Shop by Brand.”

For product detail pages (PDPs), header tags become more specific to the individual product.

6. Low-Quality Or Missing Schema Markup

Now, we’re getting into some of the more technical tasks to improve your category pages’ rankings.

It might be that your schema markup is better for PDPs than your PLPs, or your PLPs just need some more tweaks or additions.

Here are some simple actions that can make a difference (a markup sketch follows the list):

  • Consider adding the BreadcrumbList schema to your category pages. (It helps search engines understand the page’s position within your site’s hierarchy, improving internal linking.)
  • Consider collection-level structured data if applicable.
  • Review if category pages have any missing structured data.
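
For reference, here’s a minimal Python sketch that emits BreadcrumbList markup for a category page (the names and URLs are placeholders):

```python
import json

breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Home",
         "item": "https://example.com/"},
        {"@type": "ListItem", "position": 2, "name": "Men's Shoes",
         "item": "https://example.com/mens-shoes/"},
        {"@type": "ListItem", "position": 3, "name": "Running Shoes",
         "item": "https://example.com/mens-shoes/running/"},
    ],
}

# The script tag to embed in the category page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(breadcrumbs, indent=2))
print("</script>")
```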

7. Content Freshness Signals

All too often, people create category pages, then basically forget about them.

However, regularly updating them will show that the page is actively maintained, increasing your chances of appearing in SERPs.

Keep Category Pages Dynamic

Highlight trending products, top-rated items, or seasonal goods, surfacing them at the top of your category pages.

Include Recent Reviews

Getting positive reviews for products? Insert them as content blocks within your category pages. The more recent the reviews, the better.

Refresh Copy

Trends come and go, stock gets replaced, and new products get made. Refresh your category page copy to reflect these changes.

Final Word

I hope these tips can help you get more visibility for your category pages – and complement your already successful product pages.


Ask An SEO: How Do We Shift Google From Our Old Brand Name to Our New One? via @sejournal, @MordyOberstein

The question for this edition of Ask An SEO comes from a reader who’s trying to make their rebrand stick in search:

“Our company recently went through a rebrand with a new name. We’re seeing our old brand name still dominating search results, while our new brand barely registers.

What’s the best approach to transition brand equity in search from our old name to our new one?”

Having your old brand name appear on Google can be extremely frustrating. You just launched a new brand name, spent a lot of time working on it, and here you are stuck on the search engine results page with the old name.

It’s genuinely frustrating.

There are essentially three steps here:

  1. Handle your own ecosystem.
  2. Request changes to third-party sites.
  3. Build up your new brand name so you don’t have to rely on No. 2 happening.

Aligning The Assets You Control

The first, and obvious thing to do, is ensure your new brand name appears consistently across all the assets you own.

Some of these places are entirely obvious, like your homepage. Obviously, you’re going to change how you refer to yourself on your homepage.

However, there can be a lot of nooks and crannies across your ecosystem that may still mention your brand’s former name.

This can include:

  • Title tags and meta descriptions.
  • Alt text.
  • Knowledge Base pages.
  • Structured data markup.
  • Unused social media accounts.
  • Employee bios (both on and off your site).

It’s a matter of dotting your i’s and crossing your t’s. If you’re a big brand with a broad ecosystem, this can be more complicated than it might seem.

Let’s imagine for a moment that what changed is not the main brand name but a product name or the name of a sub-brand.

There could be thousands of pages that you would never even think of that might reference the old naming.

In such a case, you should conduct an extensive audit. I recommend this in general, even if you are not a huge website – it’s easy to forget that a page referencing your old naming even exists.

This should help ensure your own brand SERP is aligned with the new naming as much as possible.

However, there are still elements even on these SERPs that will need some help, such as your Knowledge Panel. For this, we need to think beyond your owned assets.

Align Third-Party Assets

Getting others to recognize your new brand name is a little tougher than just combing through your assets to ensure alignment (which, as I said earlier, might not be as straightforward as it may seem).

Getting third parties to pick up on your branding change is incredibly important.

The underlying goal or concept is: We want people to talk about you and to mention your new brand name when they do.

Within this task, there are things that are easier to accomplish and things that are much harder.

Start with the easier things. Getting these done will help you push areas that you have less influence over.

One easy place is author bios. If you, or anyone in your company, has contributed content to a third-party site (whether it be an article, webinar, podcast, etc.), there is often a bio that will mention, if not link to, your company.

Make sure these bios are up to date and reflect the current and only the current company name.

By the way, sometimes these bios have multiple places where the brand is mentioned; make sure all instances are up to date.

For example, in my Search Engine Journal bio, my company is mentioned twice:

Screenshot from Search Engine Journal, May 2025

Getting these updated should not be hard at all.

It’s easy to miss a few wins here.

But getting these citations right can help with the Google results.

When I went around and had my current company added to all of my bios across the internet, Google’s Knowledge Panel took notice.

While my old Wix bio still often appears as the main URL in the Knowledge Panel and as a top organic result, Google started to pull in the images from my site as well:

Screenshot from search for [mordy oberstein], Google, May 2025

Notice, by the way, that because I took care of my social media, my current brand shows up as part of my LinkedIn profile – which is confusing considering what the Wix result below it says (i.e., that I work at Wix).

That’s exactly what I want. I want the person to ask, “Does he still really work at Wix?”

When Third Parties Won’t Align

What happens when your brand is listed under its previous name on some random listicle that won’t respond to any of your requests to change the brand name?

What happens if on some forum (say Reddit), there are endless references to your previous brand name that you can’t remove?

For starters, it does show the logic behind running a campaign to announce your new branding.

Often, companies will run a campaign announcing the new branding to generate buzz and interest or even to gain more conversions.

Nothing wrong with that, at all. However, even if none of that happens, it still makes sense to run a campaign when you change the name of your brand.

If only to signal that the brand name that once was, is no longer. This way, the next time someone talks about your brand on Reddit, they may stop themselves and use the new name.

If you’re lucky, when someone posts using your previous name, another user will comment that the brand name has actually been changed.

This is one less place to figure out how to go about changing how your brand is referenced and one less person who will continue to go around spreading the wrong name across the internet. That’s one less Reddit thread ranking on Google that mentions your old brand naming.

Now, let’s go back to that listicle. Your company is listed as a top 10 best whatever, and when you contact the website to update the name, they ghost you. What do you do?

Nothing.

You keep moving on. You keep doing more public appearances, writing more content, meeting more people, and generally building up your presence across the planet and the internet to the point where your new brand name is the default.

Until the point where Google’s Knowledge Graph is overwhelmed with Mordy Oberstein, founder of Unify Brand Marketing, and not Mordy Oberstein, head of SEO Brand at Wix.

Because then, that one website that hasn’t updated its content with your new naming is the one going against the grain. Now the pressure is on them to show they aren’t stale and out of date.

Set Expectations

I don’t expect this process to happen in a day, nor should you. It takes time. Think of it more as a process.

Are more and more places across the digital landscape referencing your new brand name? If yes, you’re doing great. Are more people asking if you changed your name? Also, a good sign.

As you continue to spread your new brand name across the web, Google’s own Knowledge Graph will have more signals that the name that once was has been replaced.

Once your new brand name starts taking hold, anyone who cares about the accuracy of their content will start to either make the edit or reach out to you to make the edit.

Anyone else, at this point, is just running poor content that shouldn’t be there anyway (all things being equal).


Ask An SEO: Why Didn’t My Keywords Come Back After I Changed My Page Content? via @sejournal, @rollerblader

This week’s Ask An SEO question comes from Jubi in Kerala:

“We changed our on-page content recently as keyword positions were nil. After updating the content, the keywords started appearing, but after four weeks the keywords went back to nil. Why is this so, any suggestions? [Page provided]”

Great to meet you, Jubi, and thank you for the question.

I reviewed your page, and although it is written for the user and in a conversational tone with keywords incorporated throughout, the site, overall, is likely the problem.

SEO is more than words on a page. It is also:

    • How your brand is represented by third parties.
    • The code of the site.
    • User and spider experience defined both topically and structurally.
    • The overall quality of the experience for the user, the spiders, and the algorithms.
    • Consumers not needing to do more searches as the solutions are provided by your website, or you give them the resources to implement with trusted third parties (backlinks) when you do not offer the product, service, or solution.

Changing the wording on a page can and does help, but it relies on the rest of the website, too.

I looked at your website for about five minutes, and multiple things popped out. After plugging it into an SEO tool that shows the history of the site, I have some starting points for you to help your site rank, and hopefully, this can help with your client work, too.

Focus On Your Target Audience And Region

First and foremost, your website is in U.S. English, and the language declarations are also in U.S. English. Your target audience is Kerala, India, and you offer digital marketing services in Kerala for local companies.

With a quick Google search, I checked whether American English is the common language there. It isn’t – it’s Malayalam.

If both English and Malayalam are used, create both versions on your website. More importantly, see how people search in your area.

This is important for both you as a vendor and your local SEO and marketing clients.

I’ve done this in Scandinavia, where TV commercials in Sweden are in English (or were back then), so product searches and types were done in English more than in Swedish.

By having both languages available in content and PPC campaigns, we scaled conversions and revenue compared to the Swedish-only versions in place when I started working with this brand.

If they are not searching in English as a primary language, use the language they search in as the primary and make English the backup.

Next, look at your schema. You have LocalBusiness markup, which is great, but there are other ways you can define the area you serve and what you do.

Service schema can show that you offer a service, and you can nest an areaServed property in it because you’re a local business with a specific region you serve.
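
As a rough illustration, a Service block with areaServed nested in might look like this sketch (the names, URL, and region values are placeholders to adapt):

```python
import json

service = {
    "@context": "https://schema.org",
    "@type": "Service",
    "name": "Digital Marketing Services",
    "provider": {
        "@type": "LocalBusiness",
        "name": "Example Agency",
        "url": "https://example.com/",
    },
    # Declare the specific region the local business serves.
    "areaServed": {"@type": "State", "name": "Kerala"},
}

print('<script type="application/ld+json">')
print(json.dumps(service, indent=2))
print("</script>")
```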

Clean Up Hacked Code

Your website was hacked, and the hackers filled it with tons and tons of low-value content to try and rank for brands and brand products.

These pages all 404, which is great, but they’re still being found. Serve a 410 for them instead, and make sure you block the parameter in robots.txt correctly – it looks like you’re missing an “*” on it.

You may also want to format a full robots.txt vs. using your software’s default with the one disallow line.
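
To illustrate why the “*” matters: Google treats “*” in robots.txt as a wildcard, so without it a parameter rule only matches from the start of the path. Here’s a minimal Python sketch of the matching logic (hand-rolled because the stdlib robotparser doesn’t support wildcards; the ?s= parameter is a stand-in for whatever parameter the hacked URLs use):

```python
import re
from urllib.parse import urlparse

def rule_to_regex(rule):
    # Turn a robots.txt path rule into a prefix-match regex,
    # expanding "*" the way Google does.
    return re.compile(re.escape(rule).replace(r"\*", ".*"))

disallow = rule_to_regex("/*?s=")  # block the parameter on any path

def is_blocked(url):
    parsed = urlparse(url)
    target = parsed.path + ("?" + parsed.query if parsed.query else "")
    return bool(disallow.match(target))

print(is_blocked("https://example.com/blog/?s=spam-term"))  # True  -> blocked
print(is_blocked("https://example.com/services/"))          # False -> crawlable
```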

Undo The Over-Optimization

The website follows almost every over-optimization bad practice, including on elements that should serve the end user rather than try to rank the page.

Your meta descriptions on multiple pages are just keywords with commas in between vs. a sentence or two that tells the person what they’ll find if they click through.

I wasn’t sure if I was seeing it correctly, so I did a site:yourdomain search on Google and saw the descriptions were, in fact, just keywords with commas.

Optimize meta descriptions to let the person know why they should click through to the page. I created a guide to writing local SEO titles and meta descriptions here.

There are a couple of hundred backlinks, but they’re all directories and spammy websites. Think about your local media and trade organizations in India. How can you get featured there instead?

Is there a local chamber of commerce, small business, or local business group you can work with?

What can you share about market trends that will get you on the local news or news and business sites to link to your resources? These are the backlinks that will help you.

Redo Your Blog

The blog has some topically relevant content, but the content is thin, and your guides that are supposed to answer questions start with sales pitches instead.

Sales pitches do not belong in the first paragraph or even the first five paragraphs of a blog post or guide ever.

People are there to learn. If they like what they learned, you have earned their trust. If the topic is relevant to a product or service you offer, that is when you do the sales pitch.

I clicked on two posts, and after the sales pitch, you share concepts, which is good, but there are no examples that the user can use.

The pages are missing supporting graphics and images to demonstrate concepts, information about the person who created the content, and ways to implement the solution.

One of the posts talks about slow webpage speed. Instead of giving a way to fix it or a starting point, the content just defines what it is. The person has to do another search, which means it is a bad experience.

Add a couple of starting points: removing excess files (give a couple of types), using server-side rendering (explain how this helps, with an example), plugins or tools for compressing images that don’t need to be high-resolution, etc.

Now the person has action items, and you have an opportunity to link to detailed guides off of keywords (internal links) naturally to your pages that teach this.

This adds a ton of value to the user and gives them a reason to come back to you or even hire you to do the work for them.

On multiple posts, the writer stuffs internal links off of keyword phrases that are not naturally occurring. These are in the sales pitches, the opening, and the closing of each post.

In theory, this may not be bad for SEO, but it is not helpful for the user and may send low-quality page experience signals to Google if users are bouncing.

From my experience, your content is less likely to get sourced or linked to if it is a sales pitch vs. sharing a solution, but that is just what I’ve experienced.

Instead of starting with a sales pitch or having sales pitches in every post, build an email or SMS list and use remarketing to bring them back.

If you start with a sales pitch and no actual solution, they’ll likely bounce as the page is low-quality.

Final Thoughts

Your service pages overall are not bad. It is the rest of the website.

It needs to be recoded and focused on your target audience, the over-optimizations should be undone, and your agency needs to become the go-to digital marketing agency in your region. Most importantly, the code and content need to be cleaned up.

You offer these services, but prospective clients seeing these bad practices may be turned off and cost you business.

Also, don’t forget to create a Google Business Profile; you don’t currently have one even though you have a physical location, have active clients, and offer services.

I hope this helps, and thank you for asking a really good question.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: How Do I Rebuild My Website After A Dispute With The Hosting Company? via @sejournal, @HelenPollitt1

The question today comes from Raoof, who asks:

“I completely lost my website due to financial disputes with the hosting company. I have no backup and the only thing I have left is a domain.

I am currently preparing a new website with the previous content and theme. Can I use the previous domain or not? What is your suggestion?”

This is a difficult, but not uncommon, issue to face. You invested time, money, and resources in creating your website. To lose it is highly frustrating.

From an SEO perspective, it might feel like all is lost – the topical authority, the backlinks, your high-performing content.

But don’t worry, it’s not! I’m going to take you through a few steps to recover as much of your website and previous rankings as possible.

I see no issue with reusing your old domain address for the recovered site. That is, as long as no other site was hosted on it while yours was down.

If you owned the domain name throughout this time, you should be fine to restore your site at that address.

In fact, I would highly recommend it to ensure you recapture as much of your old site’s authority as you can.

Recovering Your Assets

The first step is to recover as much of your existing website as you can. You might not have a backup of your site, but thankfully, the internet does!

Content

I would start by going to the Wayback Machine. This is essentially a non-profit archive of the internet.

It claims to have saved over 928 billion webpages. There is a high chance that some of those will be yours!

You can search for your website domain and scroll back through time to the snapshots taken of your pages. That should enable you to copy and paste some, if not all, of the copy that was on your site.

I would also suggest having a look at your analytics program to identify what your top-visited content was. This should be what you look to recover or recreate first.

Authority

The good news is that, because you still have your domain, you have the opportunity to recover the backlinks that were pointing to your pages.

It's important to host your content on the same URLs as before. That way, any links still pointing to your site from external sources will continue to work when you set the pages live again.

If you are unable to recreate the exact URL for some reason, make sure to implement a 301 redirect from the old URL to the new one to retain the value of those links.
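
If your server runs Apache, for instance, a redirect of this kind can be a one-line rule in the .htaccess file (the paths here are placeholders, not your actual URLs):

# Permanently redirect the old URL to its replacement
Redirect 301 /old-guide/ https://www.example.com/new-guide/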

Reclaim Old Backlinks

If your site went down during the hosting dispute, your webpages likely returned a 404 or other non-200 status code.

This could mean that external publications changed their links to point elsewhere so their visitors could still reach usable content.

This doesn’t mean that those links are gone forever. Evaluate which links were lost during the domain issues using a backlink analytics tool, and begin reaching out to those publications to inform them that your content is back.

It may be that they choose to link to your content again over the newer content they found.

Link Building To Help Crawling

External links aren’t just helpful for signaling relevancy and authority; they can also help to encourage the search bots to crawl the content they link to.

If your site has been offline for a while, it’s possible that the bots have reduced their frequency of crawling. New backlinks could indicate that the website is worth crawling more frequently again.

Technical

There is more to restoring your website to its former glory than just recovering the old content, of course.

A large part of what makes a website well-optimized for search engines and humans alike is its technical foundation.

Same Architecture

Where possible, try to recreate the website’s architecture.

I’ve already mentioned trying to re-use the old URLs, but also consider how and where they linked to each other.

Use the same menu structure and anchor text. This will help reinforce the relevance of the pages to each other and demonstrate that the site is the same as it was before.

Submit To Be Crawled

Once you’ve got your website back to how it was, you will want to let the search engines know to crawl it again.

Aside from encouraging crawling by getting new backlinks, as already mentioned, you can submit a request in Google Search Console and Bing Webmaster Tools for their bots to recrawl individual pages. Note that you may need to verify ownership of the domain in both tools again.

Choose some of your more important pages so that they get crawled and back into the indexes as soon as possible.

XML Sitemaps

You should also make sure you have set up XML sitemaps again for the pages that you have recovered.

Submit these to the search consoles to further inform the Google and Bing bots of your pages’ existence, so they can crawl them and see that they are live again.
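
If you need to rebuild a sitemap from scratch, the format is simple. A minimal example (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/recovered-page/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
</urlset>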

Take Note Of Any Issues Found

As the search engines begin to recrawl the site, take note of any issues Google and Bing report on through their search consoles.

There may be new issues that have crept in during the rebuild of your website that weren’t there before.

Improve

You can use this as an opportunity to evaluate what was working with your website and what wasn’t.

The temptation might be to recover and rebuild the site to reflect its former state. However, you might find that you can actually improve it instead.

What Were You Ranking For

As you review your old content’s performance, take a look at whether it ranked well before it was lost.

It may be that, instead of recovering it and uploading it exactly as it was before, you can use this as an opportunity to improve its relevancy to the search phrases users would use to land on it.

Review competitors’ content that has been flourishing while yours has been lost. Take note of what the top-ranking content contains that your recovered content doesn’t.

What’s Changed In The Industry

If your website has been down for a while during this hosting dispute, then the industry may have moved on.

Look for gaps between the content your site used to address and what users are looking for now.

Are there new trends, products, or services that are becoming popular in your industry that you have not covered with your site previously?

Protect

The most important step once you have recovered and improved your site is to reduce the risk of losing it in the future.

You will hopefully never have an issue with your hosting again, but other issues can occur that can cause your website to go offline.

Backups

First of all, take backups of your new site. Many content management systems make it easy to do this, but if yours doesn’t, or if you’ve built it yourself, consider what you can save offline.

Save Your Content

Take copies of all the written content on your site. Make sure that you save it somewhere that isn’t directly linked to your website in case you run into issues again.

Don’t forget to save copies of the images you use, especially if they are unique to your website.

Save Your Meta

Take copies of each page’s search engine optimization.

For example, download the page title and description alongside your main body content.

Mark up the headers, save the image alt text descriptions, and keep the same filenames you used on the site. This will speed up the recovery of your site in the future.
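
As a sketch, a per-page record worth saving offline might look like this (all values are placeholders):

<!-- Saved copy of the page's key SEO elements -->
<title>Example Page Title</title>
<meta name="description" content="Example meta description for this page.">
<h1>Example Main Header</h1>
<img src="original-filename.jpg" alt="Example image description">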

Save Your Schema Markup

Don’t forget to take copies of any bespoke code you used. This includes schema markup. This could save you a lot of time in the future, especially if you write your own schema rather than using plugins.

This can also help if you end up migrating from one CMS to another that doesn’t use the same schema modules.
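
For illustration, a bespoke snippet worth backing up might be as simple as this (the organization details are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Agency",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png"
}
</script>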

Resuming Your Optimization Efforts

It is horrifying to think that the website you have spent so much time on is gone for good. Thankfully, it’s probably not.

It's worth considering that there may be legal recourse available to you to aid in the recovery of your website.

Make sure to check your hosting terms of service thoroughly, as they may offer avenues you can explore to regain control of your content.

It may not be as simple as asking your hosting provider for support if you are already in a legal dispute with them, but there may be some legal options available to you.

In the future, it is important to consider the trustworthiness and levels of support provided by your hosting provider.

Look up reviews of potential hosting services before committing to them to make sure you don’t end up going through a similar struggle again.

Losing access to your website can be costly in both money and time, and it is a highly stressful situation. But follow the steps above, and you should be back to working on your website soon.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: How To Implement Faceted Navigation Without Hurting Crawl Efficiency via @sejournal, @kevgibbo

This week’s question tackles the potential SEO fallouts when implementing faceted navigation:

“How can ecommerce sites implement SEO-friendly faceted navigation without hurting crawl efficiency or creating index bloat?”

Faceted navigation is a game-changer for user experience (UX) on large ecommerce sites. It helps users quickly narrow down what they’re looking for, whether it’s a size 8 pair of red road running trainers for women, or a blue, waterproof winter hiking jacket for men.

For your customers, faceted navigation makes huge inventories feel manageable and, when done right, enhances both UX and SEO.

However, when these facets create a new URL for every possible filter combination, they can lead to significant SEO issues that harm your rankings and waste valuable crawl budget if not managed properly.

How To Spot Faceted Navigation Issues

Faceted navigation issues often fly under the radar – until they start causing real SEO damage. The good news? You don’t need to be a tech wizard to spot the early warning signs.

With the right tools and a bit of detective work, you can uncover whether filters are bloating your site, wasting crawl budget, or diluting rankings.

Here’s a step-by-step approach to auditing your site for faceted SEO issues:

1. Do A Quick “Site:” Search

Start by searching on Google with this query: site:yourdomain.com.

This will show you all the URLs Google has indexed for your site. Review the list:

  • Does the number seem higher than the total pages you want indexed?
  • Are there lots of similar URLs, like ?color=red&size=8?

If so, you may have index bloat.

2. Dig Into Google Search Console

Check Google Search Console (GSC) for a clearer picture. Look under “Coverage” to see how many pages are indexed.

Pay attention to the “Indexed, not submitted in sitemap” section for unintended filter-generated pages.

3. Understand How Facets Work On Your Site

Not all faceted navigation behaves the same. Make sure you understand how filters work on your site:

  • Are they present on category pages, search results, or blog listings?
  • How do filters stack in the URL (e.g., ?brand=ASICS&color=red)?

4. Compare Crawl Activity To Organic Visits

Some faceted pages drive traffic; others burn crawl budget without returns.

Use tools like Botify, Screaming Frog, or Ahrefs to compare Googlebot’s crawling behavior with actual organic visits.

If a page gets crawled a lot but doesn’t attract visitors, it’s a sign that it’s consuming crawl resources unnecessarily.

5. Look For Patterns In URL Data

Run a crawler to scan your site’s URLs. Check for repetitive patterns, such as endless combinations of parameters like ?price=low&sort=best-sellers. These are potential crawler traps and unnecessary variations.

6. Match Faceted Pages With Search Demand

To decide which SEO tactics to use for faceted navigation, assess the search demand for specific filters and whether unique content can be created for those variations.

Use keyword research tools like Google Keyword Planner or Ahrefs to check for user demand for specific filter combinations. For example (SV = monthly search volume):

  • White running shoes (SV 1000; index).
  • White waterproof running shoes (SV 20; index).
  • Red trail running trainers size 9 (SV 0; noindex).

This helps prioritize which facet combinations should be indexed.

If there’s enough value in targeting a specific query, such as product features, a dedicated URL may be worthwhile.

However, low-value filters like price or size should remain noindexed to avoid index bloat.

The decision should balance the effort needed to create new URLs against the potential SEO benefits.

7. Log File Analysis For Faceted URLs

Log files record every request, including those from search engine bots.

By analyzing them, you can track which URLs Googlebot is crawling and how often, helping you identify wasted crawl budget on low-value pages.

For example, if Googlebot is repeatedly crawling deep-filtered URLs like /jackets?size=large&brand=ASICS&price=100-200&page=12 with little traffic, that’s a red flag.
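
As a rough sketch, assuming an Apache or nginx access log in the common combined format (where the requested path is the seventh field), you could count Googlebot hits on color-filtered URLs like this:

# Count Googlebot requests to color-filtered URLs, most-crawled first
grep "Googlebot" access.log | grep "color=" | awk '{print $7}' | sort | uniq -c | sort -rn | head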

Key signs of inefficiency include:

  • Excessive crawling of multi-filtered or deeply paginated URLs.
  • Frequent crawling of low-value pages.
  • Googlebot getting stuck in filter loops or parameter traps.

By regularly checking your logs, you get a clear picture of Googlebot’s behavior, enabling you to optimize crawl budget and focus Googlebot’s attention on more valuable pages.

Best Practices To Control Crawl And Indexation For Faceted Navigation

Here’s how to keep things under control, so your site stays crawl-efficient and search-friendly.

1. Use Clear, User-Friendly Labels

Start with the basics: Your facet labels should be intuitive. “Blue,” “Leather,” “Under £200” – these need to make instant sense to your users.

Confusing or overly technical terms can lead to a frustrating experience and missed conversions. Not sure what resonates? Check out competitor sites and see how they’re labeling similar filters.

2. Don’t Overdo It With Facets

Just because you can add 30 different filters doesn’t mean you should. Too many options can overwhelm users and generate thousands of unnecessary URL combinations.

Stick to what genuinely helps customers narrow down their search.

3. Keep URLs Clean When Possible

If your platform allows it, use clean, readable URLs for facets like /sofas/blue rather than messy query strings like ?color[blue].

Reserve query parameters for optional filters (e.g., sort order or availability), and don’t index those.

4. Use Canonical Tags

Use canonical tags to point similar or filtered pages back to the main category/parent page. This helps consolidate link equity and avoid duplicate content issues.

Just remember, canonical tags are suggestions, not commands. Google may ignore them if your filtered pages appear too different or are heavily linked internally.

Any faceted pages you want indexed should include a self-referencing canonical; any you don't should be canonicalized to the parent page.
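
For example, on a filtered page you don't want indexed separately, the tag in the page <head> would point back to the parent category (URLs are placeholders):

<!-- On /trainers?color=blue&sort=popularity -->
<link rel="canonical" href="https://www.example.com/trainers/">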

5. Create Rules For Indexing Faceted Pages

Break your URLs into three clear groups:

  • Index (e.g., /trainers/blue/leather): Add a self-referencing canonical, keep them crawlable, and internally link to them. These pages represent valuable, unique combinations of filters (like color and material) that users may search for.
  • Noindex (e.g., /trainers/blue_black): Use a noindex meta tag to remove them from the index while still allowing crawling (see the snippet after this list). This is suitable for less useful or low-demand filter combinations (e.g., overly niche color mixes).
  • Block Crawl (e.g., filters with query parameters like /trainers?color=blue&sort=popularity): Use robots.txt, JavaScript, or parameter handling to prevent crawling entirely. These URLs are often duplicate or near-duplicate versions of indexable pages and don’t need to be crawled.
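
For the noindex group, a minimal snippet in the page <head> looks like this (the example URL comes from the list above):

<!-- On /trainers/blue_black -->
<meta name="robots" content="noindex, follow">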

6. Maintain A Consistent Facet Order

No matter the order in which users apply filters, the resulting URL should be consistent.

For example, /trainers/blue/leather and /trainers/leather/blue should result in the same URL, or else you’ll end up with duplicate content that dilutes SEO value.
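
One way to enforce this is to sort facet values before building the URL, so the order in which filters are applied never changes the result. A minimal client-side sketch (the function name and URL scheme are illustrative assumptions, not a standard API):

// Sort applied facets alphabetically so the same filters always build the same URL
function buildFacetUrl(basePath, facets) {
  return basePath + '/' + facets.slice().sort().join('/');
}

// Both calls return '/trainers/blue/leather'
buildFacetUrl('/trainers', ['blue', 'leather']);
buildFacetUrl('/trainers', ['leather', 'blue']);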

7. Use Robots.txt To Conserve Crawl Budget

One way to reduce unnecessary crawling is by blocking faceted URLs through your robots.txt file.

That said, it’s important to know that robots.txt is more of a polite request than a strict rule. Search engines like Google typically respect it, but not all bots do, and some may interpret the syntax differently.

To prevent search engines from crawling pages you don’t want indexed, it’s also smart to ensure those pages aren’t linked to internally or externally (e.g., backlinks).

If search engines find value in those pages through links, they might still crawl or index them, even with a disallow rule in place.

Here’s a basic example of how to block a faceted URL pattern using the robots.txt file. Suppose you want to stop crawlers from accessing URLs that include a color parameter:

User-agent: *
Disallow: /*color*

In this rule:

  • User-agent: * targets all bots.
  • The * wildcard means “match anything,” so this tells bots not to crawl any URL containing the word “color.”

However, if your faceted navigation requires a more nuanced approach, such as blocking most color options but allowing specific ones, you’ll need to mix Disallow and Allow rules.

For instance, to block all color parameters except for “black,” your file might include:

User-agent: *
Disallow: /*color*
Allow: /*color=black*

A word of caution: This strategy only works well if your URLs follow a consistent structure. Without clear patterns, it becomes harder to manage, and you risk accidentally blocking key pages or leaving unwanted URLs crawlable.

If you’re working with complex URLs or an inconsistent setup, consider combining this with other techniques like meta noindex tags or parameter handling in Google Search Console.

8. Be Selective With Internal Links

Internal links signal importance to search engines. So, if you link frequently to faceted URLs that are canonicalized or blocked, you’re sending mixed signals.

Consider using rel="nofollow" on links you don't want crawled – but be cautious. Google treats nofollow as a hint, not a rule, so results may vary.

Point only to canonical URLs within your website wherever possible. This includes dropping parameters and fragments from links when they are not necessary for the URLs to work.

You should also prioritize pillar pages; the more inlinks a page has, the more authoritative search engines will deem that page to be.

In 2019, Google’s John Mueller said:

“In general, we ignore everything after hash… So things like links to the site and the indexing, all of that will be based on the non hash URL. And if there are any links to the hashed URL, then we will fold up into the non hash URL.”

9. Use Analytics To Guide Facet Strategy

Track which filters users actually engage with, and which lead to conversions.

If no one ever uses the “beige” filter, it may not deserve crawlable status. Use tools like Google Analytics 4 or Hotjar to see what users care about and streamline your navigation accordingly.

10. Deal With Empty Result Pages Gracefully

When a filtered page returns no results, respond with a 404 status, unless it’s a temporary out-of-stock issue, in which case show a friendly message stating so, and return a 200.

This helps avoid wasting crawl budget on thin content.

11. Using AJAX For Facets

When you interact with a page – say, filtering a product list, selecting a color, or typing in a live search box – AJAX lets the site fetch or send data behind the scenes, so the rest of the page stays put.

It can be really effective to implement facets client-side via AJAX, which doesn’t create multiple URLs for every filter change. This reduces unnecessary load on the server and improves performance.
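
A minimal client-side sketch, assuming a hypothetical /api/products endpoint that returns a rendered HTML fragment (the element IDs and endpoint are assumptions):

// Re-render the product grid on filter change without navigating to a new URL
document.querySelector('#color-filter').addEventListener('change', async (event) => {
  const response = await fetch('/api/products?color=' + encodeURIComponent(event.target.value));
  document.querySelector('#product-grid').innerHTML = await response.text();
});

The point of the design is that filtering swaps content in place rather than generating a new crawlable URL for every combination.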

12. Handling Pagination In Faceted Navigation

Faceted navigation often leads to large sets of results, which naturally introduces pagination (e.g., ?category=shoes&page=2).

But when combined with layered filters, these paginated URLs can balloon into thousands of crawlable variations.

Left unchecked, this can create serious crawl and index bloat, wasting search engine resources on near-duplicate pages.

So, should paginated URLs be indexed? In most cases, no.

Pages beyond the first page rarely offer unique value or attract meaningful traffic, so it’s best to prevent them from being indexed while still allowing crawlers to follow links.

The standard approach here is to use noindex, follow on all pages after page 1. This ensures your deeper pagination doesn’t get indexed, but search engines can still discover products via internal links.

When it comes to canonical tags, you’ve got two options depending on the content.

If pages 2, 3, and so on are simply continuations of the same result set, it makes sense to canonicalize them to page 1. This consolidates ranking signals and avoids duplication.

However, if each paginated page features distinct content or meaningful differences, a self-referencing canonical might be the better fit.

The key is consistency – don't canonicalize page 2 to page 1 while giving page 3 a self-referencing canonical, for example.

As for rel="next" and rel="prev": while Google no longer uses these signals for indexing, they still offer UX benefits and remain valid HTML markup.

They also help communicate page flow to accessibility tools and browsers, so there’s no harm in including them.

To help control crawl depth, especially in large ecommerce sites, it’s wise to combine pagination handling with other crawl management tactics:

  • Block excessively deep pages (e.g., page=11+) in robots.txt.
  • Use internal linking to surface only the first few pages.
  • Monitor crawl activity with log files or tools like Screaming Frog.

For example, a faceted URL like /trainers?color=white&brand=asics&page=3 would typically (see the snippet after this list):

  • Canonical to /trainers?color=white&brand=asics (page 1).
  • Include noindex, follow.
  • Use rel="prev" and rel="next" where appropriate.
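
Put together, the <head> of that page 3 URL might contain (the domain is a placeholder):

<link rel="canonical" href="https://www.example.com/trainers?color=white&brand=asics">
<meta name="robots" content="noindex, follow">
<link rel="prev" href="https://www.example.com/trainers?color=white&brand=asics&page=2">
<link rel="next" href="https://www.example.com/trainers?color=white&brand=asics&page=4">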

Handling pagination well is just as important as managing the filters themselves. It’s all part of keeping your site lean, crawlable, and search-friendly.

Final Thoughts

When properly managed, faceted navigation can be an invaluable tool for improving user experience, targeting long-tail keywords, and boosting conversions.

However, without the right SEO strategy in place, it can quickly turn into a crawl efficiency nightmare that damages your rankings.

By following the best practices outlined above, you can enjoy all the benefits of faceted navigation while avoiding the common pitfalls that often trip up ecommerce sites.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: Should We Optimize For Keywords With High Search Volume Or Competition? via @sejournal, @rollerblader

In this week’s Ask An SEO, Chandrika asks:

“What are the important points to consider when doing keyword research for SEO using Google Keyword Planner? Should we focus on keywords with a monthly search volume of 500? Or, should we prioritize keywords with low or high competition?”

This is a great question, and here’s an easy answer: Don’t focus on the keyword. Focus on the solution for the user based on the intent of the keyword.

Google Keyword Planner shares the estimated search volume for a keyword, but that doesn’t mean the entire volume represents your audience. Some of them may be looking for information rather than shopping, and only a portion of them are there to be converted into revenue.

The word “bark,” for example, could be the bark on a tree or the noise a dog makes.

A search for bark on a tree could be what it looks like or feels like, whether it’s a sign the tree is healthy or not, and questions about using it to determine the age or genus of the tree.

“Bark” for a dog could refer to the specific sounds made by certain breeds, could indicate that the dog is sick, or the user is looking for ways to get a dog to stop barking or train a dog to bark on command.

If there are 500 searches, perhaps 300 are for the noise the dog makes; of those, 200 are about determining if the dog is sick or healthy, and 50 are about training your dog to bark.

If you sell books on dog training, this may not be the best phrase to go after, but it is a topic you may want to cover. This is where optimizing for the topic comes in.

The topic will encompass the “SEO keywords” and increase the potential pool of traffic based on the entity it ranks for, and the solution it provides.

Optimize For The Solution And Topic

Instead of optimizing for a keyword by stuffing it into the copy, headers, and title, optimize for the topic it relates to.

Ask yourself what the person searching for this keyword is looking for, and build a series of pages that meet these needs.

  • If it is a conversion phrase, then incorporate the questions and solutions the person has related to the product query into the product or collection page. This can be done in the copy itself or in the FAQs, if your template has them.
  • When the keyword has an informational and conversion intent, such as “micro needling,” it can be about the process and procedure, a before-and-after photo series, or someone looking to find a local med spa. This means your site should have multiple content types for the SEO keywords based on the stage of the customer’s journey, including:
    • Pages that show the before and after, and by skin type and age.
    • Blog posts and guides that cover the process and alternatives if it isn’t a match.
    • Comparisons between micro needling and similar procedures to help the person know which is better suited to their needs.
    • A direct conversion page where you can onboard the lead or take payment.

By creating guides that address the topic, your website becomes stronger for the specific phrases.

Machine learning and AI are getting better at understanding what the content solves, and they use the trustworthiness of the content and its phrasing to determine the keywords the page should rank for.

If the content is clearly showing knowledge and expertise, and the claims or solutions are backed up by proven facts, you can show up for keywords without optimizing for the phrase from Google Keyword Planner.

Once you have the content and user intent, like shopping or learning, completed text-wise, add schema.

Use article or blog post schema for informative content, depending on whether you're a news site. Use shopping schema, such as product, collection, or service, along with the area served and additional types, to help drive the intent of the page home.
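
As an illustrative sketch only (the business, service, and area are placeholders), a service page's schema might look like:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Service",
  "serviceType": "Dog training classes",
  "areaServed": "Austin, TX",
  "provider": {
    "@type": "LocalBusiness",
    "name": "Example Dog Training Co."
  }
}
</script>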

Keywords With Higher Search Volumes

Keywords with high search volumes are tempting to optimize for. However, instead of worrying about the keyword, take other keywords that are similar and are part of the solution.

Put those together into a group, and then think about how they interact to educate the person, so they have the information they need to make an informed purchase decision, whether the page is a product or a collection/category page.

Keywords and search volumes are part of topics, but you don’t focus on their volumes – focus on the solutions for the phrases.

Your goal is to create the ultimate resource for the topic, whether it’s a question, a guide, or compatibility for products and services.

When you do this, the keyword search volume may multiply exponentially, and you can optimize the same page for multiple high-volume phrases.

By maintaining a content map of your website, you may also be able to avoid creating content that cannibalizes itself.

When you know a page is dedicated to a topic and a specific intent, you have a reminder not to create another page just because you found some search volume.

Instead, incorporate the theme of that phrase, based on its search intent, into the page that already serves it.

Competition Scores Do Not Matter

Someone has to show up for the phrase, so why shouldn’t it be you?

Competition scores are scores made up by SEO tools, not used by search engines.

Search engines are concerned with providing the most accurate answer in the easiest-to-absorb format and in the fastest way possible. If you do this, you may be the site that gets the ranking and the traffic.

For highly competitive phrases where big money is being spent, you will need some authority and trust, but there’s no reason you shouldn’t create the content that can rank.

You may get lucky and take traffic from the more established sites – it happens a lot. When it does, it can attract backlinks naturally from highly authoritative sites, which helps build your site’s stability.

Another reason to create this content now is that having it in an easy-to-use and trustworthy format can help it rank once your website is strong enough. I’ve seen this happen, where multiple pages rise to the top during core updates.

If you don’t create the content because you think it’s too competitive, you won’t have the chance to rank it when core updates happen.

The last thing I’d consider when looking at keywords with 500+ monthly searches is the long tail.

Long-tail phrases can be part of the topic. When you filter a keyword research tool to only show volumes at 500+, you miss out on parts of the entity, which can include consumer questions.

Knowing what matters to the consumer or user helps to provide them with more complete solutions.

When the page answers all of their questions, they can now convert (if your funnel is good), or they may subscribe to your publication because you’re a solution provider.

We never focus on SEO keyword volume when doing research, but we love high volumes when we find them.

We look at what will benefit the person on the page and if it matches the topic of the site, products, and services.

From there, we use keywords and search volumes to set a potential goal in traffic, but we don’t stress if there is no search volume.

Google Discover data, for example, isn’t going to show up, but if the content aligns with interests and your site qualifies, you could get featured and attract a ton of new visitors.

I hope this helps answer your question.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: How To Convince C-Suite To Support Brand-Based SEO via @sejournal, @MordyOberstein

In this edition of Ask An SEO, a marketing leader reaches out with a question:

“My company put pressure on me to deliver results of more traffic to our product pages.

How can I try to convince the CMO that we should invest more in brand building that will most likely reduce traffic?”

There’s so much to chew on with this question. Before I get into the thick of it, I want to challenge the premise: “brand building that will most likely reduce traffic.”

It’s something I hear from clients often enough. It’s a premise I hear from SEOs all the time. While it may be true in this specific case, I would like to say something regardless.

I’m glad brand has entered into the SEO conversation. Long overdue.

At the same time, brand hasn’t been the forte of the search marketing industry. As a result, there’s a lot being said that, when put under scrutiny, doesn’t hold up.

I’d take a lot of the brand strategy you’re hearing from the SEO industry with a grain of salt.

Just because you target an audience doesn't mean you lose wider reach. It can happen – and very often it should happen – but it does not have to happen.

You can speak to a core audience very deeply while not losing the attention of your secondary audience. Streaming platforms do this all the time.

Apple TV has an identity around great sci-fi content, but it also speaks to a wider audience as it throws some solid comedies into the mix (at least in my opinion).

Both of these “identities” work because there is a common thread: Apple puts out higher-quality content than other platforms.

So, will you lose traffic by focusing on brand? You probably should, but that’s only because I’ve been around the proverbial SEO block a few times.

It is, however, entirely possible to do things like pivot to a new audience while retaining the old one.

Losing audience as a result of building the brand is 100% not an inherent outcome. If anything, in the long term, it’s the total opposite. Brand building is all about connecting with more audiences over time.

Let’s move on to your question and work with the premise that you will lose traffic by increasing content and audience targeting.

I’m not even going to go into the obvious point and glaring absurdity of not wanting to have a more specific focus and more refined audience targeting in favor of “traffic.”

So, we’re going to work with two premises:

  1. You will lose traffic by focusing on brand.
  2. Not getting that “traffic” is “bad” somehow.

How do you sell this to the CMO?

For The Conceptual CMO

I’m going to start at a very conceptual level that will probably not speak to your CMO, but is very important for you to understand when you make your pitch.

The web is not the web you think it is. The web was a place where Wired could write about coffee mugs and rank them because everything was on an equal footing.

It was one giant “web” that was unified, where anyone could rank for anything so long as the content was halfway decent.

That web doesn’t exist anymore.

There is no “internet.” There is the internet that talks about home goods. There is the internet that deals with technology products. There is the internet that covers sports.

On this internet, Wired isn’t relevant for coffee mugs. That’s not its sphere of influence. The web is no longer one giant unified void that algorithms sift through to pick up whatever site on whatever topics.

Think of the internet as independent spheres that sometimes move and overlap with other orbits but are generally self-contained.

If you're pitching this to a CMO, I would frame it as getting ahead of the curve. You're getting ahead of the traffic loss that has already hit so many sites and is going to hit yours eventually.

I would sell this as “being able to perform as the landscape shifts.” You have to function in alignment with the ecosystem. There’s no way around it.

If you don’t, it will all hit the fan. It’s only a question of “when.” Usually, brands will wait until it’s already too late.

Not operating within the confines of the ecosystem is like trying to row a boat on an ice skating rink using a tennis racket.

For The Pragmatic CMO

The conceptual construct I just defined above will not speak to most CMOs.

While it’s extremely likely that you, the VP of marketing, head of growth, marketing manager, etc., understand this point, most CMOs are not in touch with the ecosystem enough to be swayed by this argument.

For most CMOs, I would start with the competition. Show similar sites that have undergone traffic losses because they haven’t changed with the tides.

If you’re a HubSpot competitor, showcase all the traffic HubSpot lost. And then, translate that into all the dollars spent in time and resources trying to capture traffic, as if it were 2015.

Image from author, April 2025

Honing your audience makes it less expensive to run marketing campaigns and assets.

Don’t pay to speak to everyone. Pay to speak to the right ones.

If your marketing strategy is aimed at casting a wide net, you will inevitably either pay for content production that isn't of value or pay for raw visibility that isn't worth the cost.

You can also do the opposite. You can show competitors who have gotten ahead of the curve. That usually lights a fire under most CMOs. Seeing that the competition is getting “ahead” in whatever way is very uncomfortable for the C-level staff.

If you can show that your strategy is already being implemented by competitors, squeeze. And frame it. Frame it well: “Our competitors are starting to speak more directly to our ultimate target audience, and you can see that here, here, and here.” That will have an impact that they won’t ignore.

You have to try to concretize this as much as possible.

The problem with brand, as Alli Berry once put it to me, is that it’s the silent killer. I have witnessed this firsthand on more than one occasion with clients.

You don’t realize it’s the decline of brand efficacy until you have a real problem on your hands.

What happens is, time and time again, the decline of brand efficacy first manifests itself in whatever performance channel.

Suddenly, your social media performance or organic search performance is on the decline.

The immediate knee-jerk reaction brands have (especially as you move up the ladder) is to fix the channel.

These are the meetings where you are told to change things up and fix performance. You know, the meetings where you leave with your head shaking since it’s clear no one knows what they’re talking about.

The reason this happens is that the issue isn’t the channel. There’s nothing wrong with the social or SEO strategy per se. Rather, there’s a huge gap in the brand strategy, and it’s starting to have an impact.

The “suddenness” of a performance problem can be an external shift (a change in consumer behavior, for example) – and that definitely can and does happen.

However, from my experience, the usual culprit is the loss of brand traction.

Often, a product hits the market at the right time, in the right way, in the right place. The stars align and the brand takes off.

At a certain point, the brand hits what I tell my clients is a “maturity inflection point.” The brand can no longer ride the momentum of its product or service the same way, and brand efficacy (and marketing potency) ebbs away.

By the time this happens, most brands have a strong client base, etc., so they never look internally. Instead, they focus on the specific performance problems. Thus, the brand becomes the silent killer.

Your job is not to let this happen to you. If you’re managing a marketing team at whatever level, your job is to nip this problem in the bud.

Now, if your CMO is more reflective and so forth, then the argument I gave earlier might work.

This is not the norm, so you need to concretize the argument.

Whether it be, as I mentioned, through the competitor angle or whatever, you have to gain some perspective and then translate that perspective into practicality.

Roll With Your CMO

My last piece of advice is to know your audience. CMOs are often bold and brash (likely because they feel they have to be), so speak that language. Come with a plan that has a bit of edge and flair to it.

If that’s not your CMO, don’t. If they are more analytical by nature, show the data.

It’s just a matter of knowing your audience and what language they speak. You have to roll with where your CMO and company overall are at. Otherwise, you might have the greatest plan, but it won’t land.

Featured Image: Paulo Bobita/Search Engine Journal