Ask An SEO: Why Is GA Reporting Higher Organic Traffic Than GSC?

Today’s question centers on the differences between Google Analytics 4 and Google Search Console measurement:

“I’m reaching out for help with a puzzling issue in Google Analytics 4 (GA4). We’ve experienced a sudden and unexplained surge in traffic over a four-day period, but surprisingly, Google Search Console (GSC) doesn’t show any corresponding data.

The anomaly is specific to organic search traffic, and it’s only affecting our main page. I’d greatly appreciate any insights you can offer on what might be causing this discrepancy.”

Why GA4 And GSC Report Different Traffic Numbers

It’s a very interesting and common question about data from Google Analytics 4 and Google Search Console.

They are both Google products, so you could assume their data would be consistent. However, it isn’t, and for very good reasons.

Let’s take a look at the differences between the two.

Traffic Mediums

Google Analytics measures user interactions with a digital property. It is highly customizable and can even accept external data inputs, such as events sent via the Measurement Protocol.

Google Search Console provides an overview of your website’s performance in Google Search.

This means that Google Analytics 4 is measuring traffic from all types of sources, including paid search campaigns, email newsletters, display ads, and direct visits.

Google Search Console is far narrower in scope, as it only reports on Google Search traffic.

Organic Sources

Another key difference to remember is that when reporting on organic traffic, Google Analytics will look at all sources marked as “organic search,” which includes other search engines like Bing, Naver, and Yandex.

This means that unless you instruct Google Analytics 4 to filter the organic search sources down to Google alone, you will see vastly different numbers between the two programs. For example, in GA4’s Traffic acquisition report, you can add “Session source / medium” as a dimension and narrow it to “google / organic” for a closer comparison.

Clicks And Sessions

The two most comparable metrics are Google Analytics 4’s “sessions” and Google Search Console’s “clicks.” However, they are not identical metrics.

A “session” in GA4 begins when a user opens your app in the foreground or views a page of your website. By default, a session ends after 30 minutes of inactivity, although this timeout can be altered in your GA4 configuration.

A “click” in Google Search Console is counted when a user clicks on a link displayed in Google Search (across web, images, or video, and including News and Discover).

Reasons For Higher GSC Clicks Than GA4 Sessions

As you can imagine, these small but critical differences in the technical ways these two metrics are counted can have a significant impact on the end volumes reported.

There are other reasons that can impact the final numbers.

Typically, we see Google Search Console’s “clicks” being higher than Google Analytics’ organic “sessions” from Google.

Let’s assume a user clicks on an organic search listing on Google Search and arrives at the webpage it links to. What would be registered in different scenarios?

Cookies

This is a differentiating factor that is becoming more prominent as laws surrounding cookie policies change.

GA4 requires cookies to be accepted in order to track a user’s interaction with a website, whereas GSC doesn’t.

This means that a user might click on an organic search result in Google Search, which registers as a “click” in Google Search Console, arrive on the webpage, but not accept cookies. It means there would be one click registered in Google Search Console but no session registered in Google Analytics 4.
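
If your site uses Google’s Consent Mode, you can see this mechanic in the tag itself. Here is a minimal sketch, assuming a standard gtag.js installation; the measurement ID and the cookie-banner callback are placeholders:

<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}

  // Until the user opts in, GA4 sets no analytics cookies and records no session.
  gtag('consent', 'default', { analytics_storage: 'denied' });

  // Hypothetical cookie-banner callback – only after this runs does GA4 start tracking:
  // gtag('consent', 'update', { analytics_storage: 'granted' });
</script>

Google Search Console sits entirely outside this consent flow, which is why its clicks are unaffected.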

JavaScript

GA4 won’t work if JavaScript is blocked on the website, whereas GSC doesn’t rely on your site’s code to track clicks; it is based on search-engine-side data and will therefore continue to register clicks.

If JavaScript is blocked in some way, this would again result in a click being registered on Google Search Console, but no session being registered in Google Analytics 4.

Ad Blockers

If the user is utilizing an ad blocker, it may well suppress Google Analytics 4, preventing the session from being registered.

However, since Google Search Console is not affected by ad blockers, it will still register the click.

Tracking Code

Google Analytics 4 only tracks pages that have the GA4 tracking code installed on them.

If the URL the user clicks on from Google Search results does not contain the tracking code, Google Search Console will still register the click, but Google Analytics will not register the session.
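
For reference, the standard GA4 snippet looks something like this (G-XXXXXXXXXX is a placeholder for your measurement ID); any page missing it simply won’t record sessions:

<!-- Google tag (gtag.js) – must appear on every page you want GA4 to measure -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>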

Filters And Segments

GA4 allows filtering and segments to be set up that can discount some visits or reclassify them as coming from another source or medium.

Google Search Console does not allow this. It means that if a user’s visit exhibits behavior that gets caught in a filter, Google Analytics may not count that session, or may reclassify it as coming from a source other than Google.

In that instance, Google Search Console would register the click, but Google Analytics 4 may not register the session, or may register it as a different source or medium.

Similarly, if your GA4 account has segments set up and these are not properly managed during the reporting process, you may find that you are only reporting on a subset of your Google organic data, even if the full data has been captured correctly by Google Analytics 4.

Why GA4 Might Report More Sessions Than GSC Clicks

In your case, you’ve mentioned that you have seen a surge in organic search traffic to your main page only. Let’s look at some of the potential reasons that might be the case.

Semantics

I want to start by looking at the technicalities. You haven’t specified what metric you are using to determine “traffic” in Google Analytics 4.

For example, if you are using “page views,” then that would not be a closely comparable metric to Google Search Console “clicks,” as there can be several page views per session.

However, if you are looking at “sessions,” that is more comparable.

Also, you haven’t specified whether you have filtered down to look at just Google as the source of the organic traffic, or if you might be including other search engines as sources as well.

If you are including other search engines, you are likely seeing much higher session counts in Google Analytics, as Google Search Console only reports on Google clicks.

Tracking Issues

I would start by looking at the way tracking has been set up on your site. It could be that you have incorrectly set up cross-domain tracking, or something is causing your tracking code to fire twice, but only on the homepage.

This could be causing inflated sessions to be recorded in your Google Analytics 4 account.

Multiple Domains

The way you have set up your Google Analytics 4 properties may be quite different from your Google Search Console account.

In GA4, it’s possible to combine multiple domains under one property (as separate data streams), whereas in GSC, you cannot.

So, for example, if you have a brand with multiple ccTLDs like example.com, example.fr, example.co.uk, then you will have these set out as separate properties in Google Search Console.

In Google Analytics 4, however, it’s possible to combine all these websites to show an overall brand’s website traffic.

It might not be obvious at first glance when looking at your homepage’s traffic, as you’ll likely only see one row with “/” as the reported URL.

When you add “hostname” as an additional column in those reports, you will be shown a breakdown of each ccTLD’s homepage, rather than a combined homepage row.

In this instance, it might be that you are viewing the Google Search Console account for one of your ccTLDs, e.g., example.com, whereas when you look at your Google Analytics 4 traffic, you may be viewing a row detailing the combined ccTLDs’ homepages’ traffic.

Length Of A Standard Session

Google Search Console tracks clicks from Google Search. It doesn’t go much beyond that initial journey from SERP to webpage. As such, it is really reporting on how users got to your webpage from an organic search.

Google Analytics 4 is looking at user behavior on your site, too. This means it will continue to track a user as they navigate around your site.

As mentioned, by default, Google Analytics 4 will end a session after 30 minutes of inactivity.

If a user lands on your homepage and then takes an hour-long phone call, their session will time out while they are away.

Then, when they come back to their computer and navigate from your homepage to another page, it will count as a second session starting.

It is most likely that in this scenario, the second session would be attributed to direct/none, but there may be cases where Google Analytics 4 is able to identify the previous referral source.

However, it is unlikely that this would cause the sudden spike in organic traffic that you have noticed on your homepage.

Bots Mimicking Google

It might well be that bot traffic spoofing a search engine’s referral information is forcing Google Analytics 4 to classify landing page traffic incorrectly as organic search.

Google Search Console is better at filtering out this fake traffic due to the way it records interactions from Google Search to your website.

If there is a surge of bots visiting your homepage with this fake Google referrer, they may be incorrectly counted by Google Analytics 4 as genuine visitors from Google Search.

Misclassified UTMs

UTM tracking is often used within paid media campaigns to assign value to different campaigns more accurately.

It enables marketers to specify the medium, source, and campaign for traffic that clicked on their advert. However, mistakes happen, and quite often, UTMs are set up incorrectly, which alters the attribution of traffic irrevocably.

In this instance, if a member of your team was testing a new campaign, or perhaps using a UTM as part of an internal split test, they may have incorrectly specified “organic” as the medium instead of the correct value.

As such, when a user clicks on the advert or participates in the split test, their visit may be misattributed as organic instead of the correct source.
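
As a hypothetical illustration, the difference can come down to a single parameter value (the URLs are placeholders):

Incorrect – utm_medium=organic forces the visit into the organic search channel:
https://www.example.com/?utm_source=newsletter&utm_medium=organic&utm_campaign=spring_test

Correct – the medium reflects the actual channel:
https://www.example.com/?utm_source=newsletter&utm_medium=email&utm_campaign=spring_test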

If your team is testing something and has used an incorrect UTM, this would explain a sudden surge in organic traffic to your homepage.

UTMs do not affect Google Search Console in this way, so the traffic that is misattributed in Google Analytics 4 would not register in Google Search Console as an organic click.

In Summary

There are myriad reasons why Google Analytics 4 may report a different volume of homepage sessions than Google Search Console does homepage clicks.

When using these two data sources, it’s best to recognize that they report on similar but not exactly the same metrics.

It is also wise to recognize that Google Analytics 4 can be highly customized, but improper setup may lead to data discrepancies.

It is best to use these two tools in conjunction when working on SEO to give you the widest possible view of your organic search performance.


Ask An SEO: How Can I Improve The Visibility Of My Category Pages?

This week’s Ask an SEO question comes from a medium-sized ecommerce site manager who’s run up against a common problem:

“Our product pages rank well, but our category pages rarely appear in search results. What specific optimization strategies would you recommend for category pages to improve their visibility?”

Thanks for the question!

It’s a common issue for ecommerce site managers. You have lots of category pages that would present a good opportunity for driving traffic, but they just don’t seem to be getting visibility in the search engine results pages.

First Thoughts

If your product pages are ranking well but your category pages are struggling more in search results, it’s likely due to the greater competition for broader, middle-of-the-funnel keywords.

While product pages can capture long-tail, bottom-of-the-funnel queries, category pages often struggle with more competitive, high-traffic terms.

Here are a few key reasons your product pages might be outshining your category pages, along with some tips to give those category pages a boost:

1. Technical Accessibility

There might be incorrect indexing directives. Category pages won’t rank well if basic technical elements aren’t working correctly.

To ensure your category pages are fully crawlable and indexable by search engines, check these aspects:

  • On-page directives: Ensure noindex tags aren’t blocking your category pages from appearing in search results.
  • Robots.txt file: Double-check that your robots.txt file isn’t unintentionally blocking important category pages.
  • Canonical tags: Confirm that canonical tags are correctly set to point to the preferred version of each page.
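
If you’re unsure what to look for in the page source, these are the tags in question (the URL is a placeholder):

<!-- A directive like this tells search engines to drop the page from their index -->
<meta name="robots" content="noindex">

<!-- The canonical on a category page should point at the page itself -->
<link rel="canonical" href="https://www.example.com/running-shoes/">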

2. Site Architecture And Internal Linking

It’s possible that your site architecture is designed to give more link equity to product pages rather than category pages.

To improve category page visibility, focus on linking more frequently to those category pages, especially the ones that have the ability to drive the highest amount of revenue.

This can be done through linking from other categories, blog posts, guides, and more. By ensuring category pages are linked to more often, you help search engines understand their importance and authority.

This is why conducting an opportunity analysis early in your SEO strategy is crucial. It helps identify the category pages that should receive the most internal linking support.

A final point on linking: Make sure your breadcrumbs are optimized and visible. Not only does that help visitors understand where they are on your site, but it might also encourage them to explore more of what you have to offer.

3. Issues With Faceted Navigation

Faceted navigation is an essential feature for large ecommerce websites, allowing users to filter product searches. However, if not properly managed, it can pose significant SEO challenges.

One of the primary concerns is “index bloat” – the creation of multiple, often duplicate URLs for each possible filter combination.

This can exhaust your crawl budget, meaning search engines may overlook critical pages.

Also, improper implementation can result in duplicate content, cannibalize rankings for category pages, and dilute internal link equity.

To avoid this, I recommend limiting the number of indexed filter combinations at any given time – ideally no more than two.

The specific number will depend on the range of filters available, but it’s crucial to prioritize filters that align with search demand.

For example, avoid indexing a combination like “size 7, green, wide fit, running shoe” if there’s minimal search volume for it.

However, “green size 7 running shoe” could be a valuable combination to index, as it has higher search intent.

4. Insufficient Or Low-Quality Content On Category Pages

Over the years in this industry, I’ve seen firsthand how impactful on-page copy can be for category pages. It provides extra context that helps search engines better understand the focus of your pages.

After all, search engines prioritize pages with valuable content that provides context for users.

Many category pages are nothing more than long lists of products and icons. That’s a real missed opportunity – and also makes them less likely to surface in the SERPs.

Here are a few ways to boost their chances:

Short Introductions At The Top

On many ecommerce sites, you’ll notice there’s often a short block of intro copy at the top of the page.

This doesn’t need to be more than 100 words or so and is an effective way of helping search engines understand the page’s context. Avoid fluff or boilerplate copy; it needs to be unique and meaningful.

Tip: Explain what the category is, and the broad range of products or brands you sell.

Say the category page was “running shoes.” The intro could talk about all the materials the running shoes are made from, colors available, types of runs they can be used for, and so on.

Guidance Lower Down

Further down the page, you can include additional content modules to help the customer make an informed decision.

Ecommerce stores often use things like:

  • FAQs.
  • Feature comparisons.
  • More information about your brand.
  • Information on how to choose between products.
  • Videos.
  • Delivery information.

5. Lack Of On-Page Optimization

Your on-page optimization for category pages might not be fully aligned with search intent, so it’s worth reviewing and refining it to better match what users are searching for.

Page Titles

If category pages have generic or poorly optimized page titles, search engines may struggle to understand the page’s relevance, and users won’t feel enticed to click on the result in SERPs.

When creating them:

  • Review current SERPs to see what’s working for competitors.
  • Keep titles unique for each category to avoid duplication, and aim for 50-60 characters to prevent truncation in search results.
  • Ensure your titles reflect what users are looking for – like specific product attributes (e.g., color, size) when relevant.

Meta Descriptions

A compelling meta description for a product listing page (PLP) should give users a reason to click, showcasing its offering and value.

Keep the meta description within 150-160 characters to avoid truncation, and craft it to answer potential user queries, like “best [category] for [specific need].”

Header Tags

When you’re reviewing header tags for categories, the key is to capture the essence of the entire category while speaking to the intent of shoppers browsing or filtering options.

Start with a clear, keyword-rich H1 that tells users exactly what the page is about, like “Men’s Running Shoes.”

Then use H2 tags to break things down further with subcategories or popular filters, such as “Top Rated” or “Shop by Brand.”
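
Pulling the title, meta description, and header advice together, the key on-page elements of a category page might look like this sketch (the store name, copy, and headings are placeholders):

<title>Men's Running Shoes | ExampleStore</title>
<meta name="description" content="Shop men's running shoes for road and trail, from lightweight racers to cushioned daily trainers. Free delivery and easy returns.">

<h1>Men's Running Shoes</h1>
<h2>Top Rated</h2>
<h2>Shop by Brand</h2>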

For product detail pages (PDPs), header tags become more specific to the individual product.

6. Low-Quality Or Missing Schema Markup

Now, we’re getting into some of the more technical tasks to improve your category pages’ rankings.

It might be that your schema markup is better for PDPs than your PLPs, or your PLPs just need some more tweaks or additions.

Here are some simple actions that can make a difference:

  • Consider adding the BreadcrumbList schema to your category pages; a sketch follows this list. (It helps search engines understand the page’s position within your site’s hierarchy, improving internal linking.)
  • Consider collection-level structured data if applicable.
  • Review if category pages have any missing structured data.
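
As a sketch, BreadcrumbList markup for a category page could look like the following (names and URLs are placeholders; validate your own version with Google’s Rich Results Test):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Men's Shoes", "item": "https://www.example.com/mens-shoes/" },
    { "@type": "ListItem", "position": 3, "name": "Running Shoes", "item": "https://www.example.com/mens-shoes/running/" }
  ]
}
</script>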

7. Content Freshness Signals

All too often, people create category pages, then basically forget about them.

However, regularly updating them will show that the page is actively maintained, increasing your chances of appearing in SERPs.

Keep Category Pages Dynamic

Highlight trending products, top-rated items, or seasonal goods, surfacing them at the top of your category pages.

Include Recent Reviews

Getting positive reviews for products? Insert them as content blocks within your category pages. The more recent the reviews, the better.

Refresh Copy

Trends come and go, stock gets replaced, and new products get made. Refresh your category page copy to reflect these changes.

Final Word

I hope these tips can help you get more visibility for your category pages – and complement your already successful product pages.


Ask An SEO: How Do We Shift Google From Our Old Brand Name to Our New One?

The question for this edition of Ask An SEO comes from a reader who’s trying to make their rebrand stick in search:

“Our company recently went through a rebrand with a new name. We’re seeing our old brand name still dominating search results, while our new brand barely registers.

What’s the best approach to transition brand equity in search from our old name to our new one?”

Having your old brand name appear on Google can be extremely frustrating. You just launched a new brand name, spent a lot of time working on it, and here you are stuck on the search engine results page with the old name.

It’s genuinely frustrating.

There are essentially three steps here:

  1. Handle your own ecosystem.
  2. Request changes to third-party sites.
  3. Build up your new brand name so you don’t have to rely on No. 2 happening.

Aligning The Assets You Control

The first, and obvious thing to do, is ensure your new brand name appears consistently across all the assets you own.

Some of these places are entirely obvious, like your homepage. Obviously, you’re going to change how you refer to yourself on your homepage.

However, there can be a lot of nooks and crannies across your ecosystem that may still mention your brand’s former name.

This can include:

  • Title tags and meta descriptions.
  • Alt text.
  • Knowledge Base pages.
  • Structured data markup.
  • Unused social media accounts.
  • Employee bios (both on and off your site).

It’s a matter of dotting your i’s and crossing your t’s. If you’re a big brand with a broad ecosystem, this can be more complicated than it might seem.

Let’s imagine for a moment that what changed is not the main brand name but a product name or the name of a sub-brand.

There could be thousands of pages that you would never even think of that might reference the old naming.

In such a case, you should conduct an extensive audit. I recommend this in general, even if you are not a huge website – it’s so easy to forget a page that references your naming and that such a page even existed.

This should help ensure your own brand SERP is aligned with the new naming as much as possible.

However, there are still elements even on these SERPs that will need some help, such as your Knowledge Panel. For this, we need to think beyond your owned assets.

Align Third-Party Assets

Getting others to recognize your new brand name is a little tougher than just combing through your assets to ensure alignment (which, as I said earlier, might not be as straightforward as it may seem).

Getting third parties to pick up on your branding change is incredibly important.

The underlying goal or concept is: We want people to talk about you and to mention your new brand name when they do.

Within this task, there are things that are easier to accomplish and things that are much harder.

Start with the easier things. Getting these done will help you push areas that you have less influence over.

One easy place is author bios. If you, or anyone in your company, has contributed content to a third-party site (whether it be an article, webinar, podcast, etc.), there is often a bio that will mention, if not link to, your company.

Make sure these bios are up to date and reflect the current and only the current company name.

By the way, sometimes these bios have multiple places where the brand is mentioned; make sure all instances are up to date.

For example, in my Search Engine Journal bio, my company is mentioned twice:

Screenshot from Search Engine Journal, May 2025

Getting these updated should not be hard at all.

It’s easy to miss a few wins here.

But getting these citations right can help with the Google results.

When I went around and had my current company added to all of my bios across the internet, Google’s Knowledge Panel took notice.

While my old Wix bio still often appears as the main URL in the Knowledge Panel and as a top organic result, Google started to pull in the images from my site as well:

Screenshot from search for [mordy oberstein], Google, May 2025

Notice, by the way, that because I took care of my social media, my current brand shows up as part of my LinkedIn profile, which is confusing considering what the Wix result says below it (i.e., that I work at Wix).

That’s exactly what I want. I want the person to ask, “Does he still really work at Wix?”

When Third Parties Won’t Align

What happens when your brand is listed under its previous name on some random listicle that won’t respond to any of your requests to change the brand name?

What happens if on some forum (say Reddit), there are endless references to your previous brand name that you can’t remove?

For starters, it does show the logic behind running a campaign to announce your new branding.

Often, companies will run a campaign announcing the new branding to generate buzz and interest or even to gain more conversions.

Nothing wrong with that, at all. However, even if none of that happens, it still makes sense to run a campaign when you change the name of your brand.

If only to signal that the brand name that once was, is no longer. This way, the next time someone talks about your brand on Reddit, they may stop themselves and use the new name.

If you’re lucky, when someone posts using your previous name, another user will comment that the brand name has actually been changed.

This is one less place to figure out how to go about changing how your brand is referenced and one less person who will continue to go around spreading the wrong name across the internet. That’s one less Reddit thread ranking on Google that mentions your old brand naming.

Now, let’s go back to that listicle. Your company is listed as a top 10 best whatever, and when you contact the website to update the name, they ghost you. What do you do?

Nothing.

You keep moving on. You keep doing more public appearances, writing more content, meeting more people, and generally building up your presence across the planet and the internet to the point where your new brand name is the default.

Until the point where Google’s Knowledge Graph is overwhelmed with Mordy Oberstein, founder of Unify Brand Marketing, and not Mordy Oberstein, head of SEO Brand at Wix.

Because then, that one website that hasn’t updated its content with your new naming is the one going against the grain. Now the pressure is on them to show they aren’t stale and out of date.

Set Expectations

I don’t expect this process to happen in a day, nor should you. It takes time. Think of it more as a process.

Are more and more places across the digital landscape referencing your new brand name? If yes, you’re doing great. Are more people asking if you changed your name? Also, a good sign.

As you continue to spread your new brand name across the web, Google’s own Knowledge Graph will have more signals that the name that once was has been replaced.

Once your new brand name starts taking hold, anyone who cares about the accuracy of their content will start to either make the edit or reach out to you to make the edit.

Anyone else, at this point, is just running poor content that shouldn’t be there anyway (all things being equal).


Ask An SEO: Why Didn’t My Keywords Come Back After I Changed My Page Content?

This week’s Ask An SEO question comes from Jubi in Kerala:

“We changed our on-page content recently as keyword positions were nil. After updating the content, the keywords started appearing, but after four weeks the keywords went back to nil. Why is this so, any suggestions? [Page provided]”

Great to meet you, Jubi, and thank you for the question.

I reviewed your page, and although it is written for the user and in a conversational tone with keywords incorporated throughout, the site, overall, is likely the problem.

SEO is more than words on a page. It is also:

    • How your brand is represented by third parties.
    • The code of the site.
    • User and spider experience defined both topically and structurally.
    • The overall quality of the experience for the user, the spiders, and the algorithms.
    • Consumers not needing to do more searches because your website provides the solutions, or because you point them to trusted third-party resources (backlinks) when you do not offer the product, service, or solution.

Changing the wording on a page can and does help, but it relies on the rest of the website, too.

I looked at your website for about five minutes, and multiple things popped out. After plugging it into an SEO tool that shows the history of the site, I have some starting points for you to help your site rank, and hopefully, this can help with your client work, too.

Focus On Your Target Audience And Region

First and foremost, your website is in U.S. English, and the language declarations are also in U.S. English. Your target audience is Kerala, India, and you offer digital marketing services in Kerala for local companies.

With a quick Google search, I checked whether American English is the common language in your region. Instead, it is Malayalam.

If both English and Malayalam are used, create both versions on your website. More importantly, see how people search in your area.

This is important for both you as a vendor and your local SEO and marketing clients.

I’ve done this in Scandinavia, where TV commercials in Sweden are in English (or were back then), so product searches and types were done in English more than in Swedish.

By having both languages available in content and PPC campaigns, conversions and revenue both scaled vs. only having the Swedish versions when I started working with this brand.

If they are not searching in English as a primary language, use the language they search in as the primary and make English the backup.

Next, look at your schema. You have a local business, which is great, but there are other ways you can define the area you serve and what you do.

Service schema can show that you offer a service, and because you’re a local business serving a specific region, you can nest an areaServed property inside it.
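
As a rough sketch of that nesting (the business name and address are placeholders; validate your own version with a schema testing tool):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Service",
  "serviceType": "Digital marketing",
  "areaServed": {
    "@type": "AdministrativeArea",
    "name": "Kerala, India"
  },
  "provider": {
    "@type": "LocalBusiness",
    "name": "Example Agency",
    "address": {
      "@type": "PostalAddress",
      "addressRegion": "Kerala",
      "addressCountry": "IN"
    }
  }
}
</script>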

Clean Up Hacked Code

Your website was hacked, and the hackers filled it with tons and tons of low-value content to try and rank for brands and brand products.

These pages are all 404, which is great, but they’re still being found. Return a 410 (Gone) for them instead, and make sure you block the parameter in robots.txt correctly; it looks like you’re missing an “*” in the rule.

You may also want to write out a full robots.txt file rather than using your software’s default with its single disallow line.
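
As a sketch of the fix (“ref” stands in for whatever parameter the hacked URLs actually use):

User-agent: *
# The leading * lets the rule match the parameter on any path, not just at the root.
Disallow: /*?ref=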

Undo The Over-Optimization

The website follows almost every bad practice with over-optimization, including things that are more for an end user rather than ranking a page.

Your meta descriptions on multiple pages are just keywords with commas in between vs. a sentence or two that tells the person what they’ll find if they click through.

I wasn’t sure if I was seeing it correctly, so I did a site:yourdomain search on Google and saw the descriptions were, in fact, just keywords with commas.

Optimize meta descriptions to let the person know why they should click through to the page. I created a guide to writing local SEO titles and meta descriptions here.
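
As a hypothetical before-and-after (the copy is a placeholder):

<!-- Keywords with commas give the searcher no reason to click: -->
<meta name="description" content="digital marketing kerala, seo company kerala, ppc, smm">

<!-- A sentence or two that sells the click works far better: -->
<meta name="description" content="Grow your Kerala business with local SEO, PPC, and social media campaigns from a team that reports on real results.">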

There are a couple of hundred backlinks, but they’re all directories and spammy websites. Think about your local media and trade organizations in India. How can you get featured there instead?

Is there a local chamber of commerce, small business, or local business group you can work with?

What can you share about market trends that will get local news outlets or business sites to link to your resources? These are the backlinks that will help you.

Redo Your Blog

The blog has some topically relevant content, but the content is thin, and your guides that are supposed to answer questions start with sales pitches instead.

Sales pitches do not belong in the first paragraph, or even the first five paragraphs, of a blog post or guide.

People are there to learn. If they like what they learned, you have earned their trust. If the topic is relevant to a product or service you offer, that is when you do the sales pitch.

I clicked on two posts, and after the sales pitch, you share concepts, which is good, but there are no examples that the user can use.

The pages are missing supporting graphics and images to demonstrate concepts, information about the person who created the content, and ways to implement the solution.

One of the posts talks about slow webpage speed. Instead of giving a way to fix it or a starting point, the content just defines what it is. The person has to do another search, which means it is a bad experience.

Add in a couple of starting points, such as removing excess files (give a couple of types), using server-side rendering (explaining how it helps, with an example), or plugins and tools for compressing images that don’t need to be high-resolution.

Now the person has action items, and you have an opportunity to link to detailed guides off of keywords (internal links) naturally to your pages that teach this.

This adds a ton of value to the user and gives them a reason to come back to you or even hire you to do the work for them.

On multiple posts, the writer stuffs internal links off of keyword phrases that are not naturally occurring. These are in the sales pitches, the opening, and the closing of each post.

In theory, this may not be bad for SEO, but it is not helpful for the user and may send low-quality page experience signals to Google if users are bouncing.

From my experience, your content is less likely to get sourced or linked to if it is a sales pitch vs. sharing a solution, but that is just what I’ve experienced.

Instead of starting with a sales pitch or having sales pitches in every post, build an email or SMS list and use remarketing to bring them back.

If you start with a sales pitch and no actual solution, they’ll likely bounce as the page is low-quality.

Final Thoughts

Your service pages overall are not bad. It is the rest of the website.

It needs to be recoded and focused on your target audience, the over-optimizations should be undone, and your agency needs to become the go-to digital marketing agency in your region. Most importantly, the code and content need to be cleaned up.

You offer these services, but prospective clients seeing these bad practices may be turned off and cost you business.

Also, don’t forget to create a Google Business Profile; you don’t currently have one even though you have a physical location, have active clients, and offer services.

I hope this helps, and thank you for asking a really good question.


Ask An SEO: How Do I Rebuild My Website After A Dispute With The Hosting Company?

The question today comes from Raoof, who asks:

“I completely lost my website due to financial disputes with the hosting company. I have no backup and the only thing I have left is a domain.

I am currently preparing a new website with the previous content and theme. Can I use the previous domain or not? What is your suggestion?”

This is a difficult, but not uncommon, issue to face. You invested time, money, and resources in creating your website. To lose it is highly frustrating.

From an SEO perspective, it might feel like all is lost – the topical authority, the backlinks, your high-performing content.

But don’t worry, it’s not! I’m going to take you through a few steps to recover as much of your website and previous rankings as possible.

I see no issue with reusing your old domain address for the recovered site. That is, as long as no other site was hosted on it while yours was down.

If you owned the domain name throughout this time, you should be fine to restore your site at that address.

In fact, I would highly recommend it to ensure you recapture as much of your old site’s authority as you can.

Recovering Your Assets

The first step is to recover as much of your existing website as you can. You might not have a backup of your site, but thankfully, the internet does!

Content

I would start by going to the Wayback Machine. This is essentially a non-profit archive of the internet.

It claims to have saved over 928 billion webpages. There is a high chance that some of those will be yours!

You can search for your website domain and scroll back through time to when screenshots of your pages were taken. That should enable you to copy and paste some, if not all, of the copy that was on your site.

I would also suggest having a look at your analytics program to identify what your top-visited content was. This should be what you look to recover or recreate first.

Authority

The good news with still having your website domain is that you will still have the opportunity to recover backlinks that were pointing to your pages.

It’s important to host your content on the same URLs as it previously was. This means that if you still have links pointing to your site from external sources, they will continue to work when you set the URL live again.

If you are unable to recreate the exact URL for some reason, make sure to implement a 301 redirect from the old URL to the new one to retain the value of those links.
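
How you implement the redirect depends on your server. On Apache, for instance, a one-line rule in .htaccess might look like this (the paths are placeholders):

# Permanently redirect the old URL to its new home, passing along its link value
Redirect 301 /old-guide/ https://www.example.com/new-guide/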

Reclaim Old Backlinks

If your site went down during the hosting dispute, your webpages likely returned a 404 or another non-200 status code.

This could mean that external publications chose to change their links from pointing to your page to another so as to still enable their visitors to reach usable content.

This doesn’t mean that those links are gone forever. Evaluate which links were lost during the domain issues using a backlink analytics tool, and begin reaching out to those publications to inform them that your content is back.

It may be that they choose to link to your content again over the newer content they found.

Link Building To Help Crawling

External links aren’t just helpful for signaling relevancy and authority; they can also help to encourage the search bots to crawl the content they link to.

If your site has been offline for a while, it’s possible that the bots have reduced their frequency of crawling. New backlinks could indicate that the website is worth crawling more frequently again.

Technical

There is more to restoring your website to its former glory than just recovering the old content, of course.

A large part of what makes a website well-optimized for search engines and humans alike is its technical foundation.

Same Architecture

Where possible, try to recreate the website’s architecture.

I’ve already mentioned trying to re-use the old URLs, but also consider how and where they linked to each other.

Use the same menu structure and anchor text. This will help reinforce the relevance of the pages to each other and demonstrate that the site is the same as it was before.

Submit To Be Crawled

Once you’ve got your website back to how it was, you will want to let the search engines know to crawl it again.

Aside from encouraging crawling by getting new backlinks, as already mentioned, you can submit a request in Google Search Console and Bing Webmaster Tools for their bots to recrawl individual pages. Note that you may need to verify ownership of the domain in Google Search Console and Bing Webmaster Tools again.

Choose some of your more important pages so that they get crawled and back into the indexes as soon as possible.

XML Sitemaps

You should also make sure you have set up XML sitemaps again for the pages that you have recovered.

Submit these to the search consoles to further inform the Google and Bing bots of your pages’ existence, so they can crawl them and see that they are live again.
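
A minimal sitemap is just a list of your live URLs; a sketch with a single placeholder entry looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/recovered-page/</loc>
    <lastmod>2025-05-01</lastmod>
  </url>
</urlset>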

Take Note Of Any Issues Found

As the search engines begin to recrawl the site, take note of any issues Google and Bing report on through their search consoles.

There may be new issues that have crept in during the rebuild of your website that weren’t there before.

Improve

You can use this as an opportunity to evaluate what was working with your website and what wasn’t.

The temptation might be to recover and rebuild the site to reflect its former state. However, you might find that you can actually improve it instead.

What Were You Ranking For

As you review your old content’s performance, take a look at whether it ranked well before it was lost.

It may be that, instead of recovering it and uploading it exactly as it was before, you can use this as an opportunity to improve its relevancy to the search phrases users would use to land on it.

Review competitors’ content that has been flourishing while yours has been lost. Take note of what the top-ranking content contains that your recovered content doesn’t.

What’s Changed In The Industry

If your website has been down for a while during this hosting dispute, then the industry may have moved on.

Start to look for gaps in the content that your site used to address and what users are looking for now.

Are there new trends, products, or services that are becoming popular in your industry that you have not covered with your site previously?

Protect

The most important step once you have recovered and improved your site is to reduce the risk of losing it in the future.

You will hopefully never have an issue with your hosting again, but other issues can occur that can cause your website to go offline.

Backups

First of all, take backups of your new site. Many content management systems make it easy to do this, but if yours doesn’t, or if you’ve built it yourself, consider what you can save offline.

Save Your Content

Take copies of all the written content on your site. Make sure that you save it somewhere that isn’t directly linked to your website in case you run into issues again.

Don’t forget to save copies of the images you use, especially if they are unique to your website.

Save Your Meta

Take copies of each page’s search engine optimization.

For example, download the page title and description alongside your main body content.

Mark up the headers and save the image descriptions, and keep the filenames as you used on the site. This will speed up the recovery of your site in the future.

Save Your Schema Markup

Don’t forget to take copies of any bespoke code you used. This includes schema markup. This could save you a lot of time in the future, especially if you write your own schema rather than using plugins.

This can also help if you end up migrating from one CMS to another that doesn’t use the same schema modules.

Resuming Your Optimization Efforts

It is horrifying to think that the website you have spent so much time on is gone for good. Thankfully, it’s probably not.

It’s worth considering that there may be legal recourse available to you to aid in the recovery of your website.

Make sure to check your hosting terms of service thoroughly, as they may give avenues you can explore to regain control of your content.

It may not be as simple as asking your hosting provider for support if you are already in a legal dispute with them, but there may be some legal options available to you.

In the future, it is important to consider the trustworthiness and levels of support provided by your hosting provider.

Look up reviews of potential hosting services before committing to them to make sure you don’t end up going through a similar struggle again.

Losing access to your website can be costly in terms of money and time, and a highly stressful situation. But, follow the steps above and you should get back to working on your website.


Ask An SEO: How To Implement Faceted Navigation Without Hurting Crawl Efficiency

This week’s question tackles the potential SEO fallouts when implementing faceted navigation:

“How can ecommerce sites implement SEO-friendly faceted navigation without hurting crawl efficiency or creating index bloat?”

Faceted navigation is a game-changer for user experience (UX) on large ecommerce sites. It helps users quickly narrow down what they’re looking for, whether it’s a size 8 pair of red road running trainers for women, or a blue, waterproof winter hiking jacket for men.

For your customers, faceted navigation makes huge inventories feel manageable and, when done right, enhances both UX and SEO.

However, when these facets create a new URL for every possible filter combination, they can lead to significant SEO issues that harm your rankings, and waste valuable crawl budget if not managed properly.

How To Spot Faceted Navigation Issues

Faceted navigation issues often fly under the radar – until they start causing real SEO damage. The good news? You don’t need to be a tech wizard to spot the early warning signs.

With the right tools and a bit of detective work, you can uncover whether filters are bloating your site, wasting crawl budget, or diluting rankings.

Here’s a step-by-step approach to auditing your site for faceted SEO issues:

1. Do A Quick “Site:” Search

Start by searching on Google with this query: site:yourdomain.com.

This will show you all the URLs Google has indexed for your site. Review the list:

  • Does the number seem higher than the total pages you want indexed?
  • Are there lots of similar URLs, like ?color=red&size=8?

If so, you may have index bloat.

2. Dig Into Google Search Console

Check Google Search Console (GSC) for a clearer picture. Look under the “Pages” indexing report (formerly “Coverage”) to see how many pages are indexed.

Pay attention to the “Indexed, not submitted in sitemap” section for unintended filter-generated pages.

3. Understand How Facets Work On Your Site

Not all faceted navigation behaves the same. Make sure you understand how filters work on your site:

  • Are they present on category pages, search results, or blog listings?
  • How do filters stack in the URL (e.g., ?brand=ASICS&color=red)?

4. Compare Crawl Activity To Organic Visits

Some faceted pages drive traffic; others burn crawl budget without returns.

Use tools like Botify, Screaming Frog, or Ahrefs to compare Googlebot’s crawling behavior with actual organic visits.

If a page gets crawled a lot but doesn’t attract visitors, it’s a sign that it’s consuming crawl resources unnecessarily.

5. Look For Patterns In URL Data

Run a crawler to scan your site’s URLs. Check for repetitive patterns, such as endless combinations of parameters like ?price=low&sort=best-sellers. These are potential crawler traps and unnecessary variations.

6. Match Faceted Pages With Search Demand

To decide which SEO tactics to use for faceted navigation, assess the search demand for specific filters and whether unique content can be created for those variations.

Use keyword research tools like Google Keyword Planner or Ahrefs to check for user demand for specific filter combinations. For example:

  • White running shoes (SV 1000; index).
  • White waterproof running shoes (SV 20; index).
  • Red trail running trainers size 9 (SV 0; noindex).

This helps prioritize which facet combinations should be indexed.

If there’s enough value in targeting a specific query, such as product features, a dedicated URL may be worthwhile.

However, low-value filters like price or size should remain noindexed to avoid bloated indexing.

The decision should balance the effort needed to create new URLs against the potential SEO benefits.

7. Log File Analysis For Faceted URLs

Log files record every request, including those from search engine bots.

By analyzing them, you can track which URLs Googlebot is crawling and how often, helping you identify wasted crawl budget on low-value pages.

For example, if Googlebot is repeatedly crawling deep-filtered URLs like /jackets?size=large&brand=ASICS&price=100-200&page=12 with little traffic, that’s a red flag.

Key signs of inefficiency include:

  • Excessive crawling of multi-filtered or deeply paginated URLs.
  • Frequent crawling of low-value pages.
  • Googlebot getting stuck in filter loops or parameter traps.

By regularly checking your logs, you get a clear picture of Googlebot’s behavior, enabling you to optimize crawl budget and focus Googlebot’s attention on more valuable pages.
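
If you have shell access to your raw logs, a quick sketch like this can surface the most-crawled faceted URLs (the filename, parameter, and log format are assumptions; adapt them to your setup and verify claimed Googlebot hits via reverse DNS):

# Count Googlebot requests to color-filtered URLs, most-crawled first
grep "Googlebot" access.log | grep "color=" | awk '{print $7}' | sort | uniq -c | sort -rn | head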

Best Practices To Control Crawl And Indexation For Faceted Navigation

Here’s how to keep things under control, so your site stays crawl-efficient and search-friendly.

1. Use Clear, User-Friendly Labels

Start with the basics: Your facet labels should be intuitive. “Blue,” “Leather,” “Under £200” – these need to make instant sense to your users.

Confusing or overly technical terms can lead to a frustrating experience and missed conversions. Not sure what resonates? Check out competitor sites and see how they’re labeling similar filters.

2. Don’t Overdo It With Facets

Just because you can add 30 different filters doesn’t mean you should. Too many options can overwhelm users and generate thousands of unnecessary URL combinations.

Stick to what genuinely helps customers narrow down their search.

3. Keep URLs Clean When Possible

If your platform allows it, use clean, readable URLs for facets like /sofas/blue rather than messy query strings like ?color[blue].

Reserve query parameters for optional filters (e.g., sort order or availability), and don’t index those.

4. Use Canonical Tags

Use canonical tags to point similar or filtered pages back to the main category/parent page. This helps consolidate link equity and avoid duplicate content issues.

Just remember, canonical tags are suggestions, not commands. Google may ignore them if your filtered pages appear too different or are heavily linked internally.

Any faceted pages you want indexed should include a self-referencing canonical; any you don’t should be canonicalized to the parent page.

5. Create Rules For Indexing Faceted Pages

Break your URLs into three clear groups:

  • Index (e.g., /trainers/blue/leather): Add a self-referencing canonical, keep them crawlable, and internally link to them. These pages represent valuable, unique combinations of filters (like color and material) that users may search for.
  • Noindex (e.g., /trainers/blue_black): Use a noindex robots meta tag (see the snippet after this list) to remove them from the index while still allowing crawling. This is suitable for less useful or low-demand filter combinations (e.g., overly niche color mixes).
  • Block Crawl (e.g., filters with query parameters like /trainers?color=blue&sort=popularity): Use robots.txt, JavaScript, or parameter handling to prevent crawling entirely. These URLs are often duplicate or near-duplicate versions of indexable pages and don’t need to be crawled.
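
For the noindex group, the tag in question sits in the page head:

<!-- Keeps the page out of the index while letting bots follow its links -->
<meta name="robots" content="noindex, follow">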

6. Maintain A Consistent Facet Order

No matter the order in which users apply filters, the resulting URL should be consistent.

For example, /trainers/blue/leather and /trainers/leather/blue should result in the same URL, or else you’ll end up with duplicate content that dilutes SEO value.

7. Use Robots.txt To Conserve Crawl Budget

One way to reduce unnecessary crawling is by blocking faceted URLs through your robots.txt file.

That said, it’s important to know that robots.txt is more of a polite request than a strict rule. Search engines like Google typically respect it, but not all bots do, and some may interpret the syntax differently.

To prevent search engines from crawling pages you don’t want indexed, it’s also smart to ensure those pages aren’t linked to internally or externally (e.g., backlinks).

If search engines find value in those pages through links, they might still crawl or index them, even with a disallow rule in place.

Here’s a basic example of how to block a faceted URL pattern using the robots.txt file. Suppose you want to stop crawlers from accessing URLs that include a color parameter:

User-agent: *
Disallow: /*color*

In this rule:

  • User-agent: * targets all bots.
  • The * wildcard means “match anything,” so this tells bots not to crawl any URL containing the word “color.”

However, if your faceted navigation requires a more nuanced approach, such as blocking most color options but allowing specific ones, you’ll need to mix Disallow and Allow rules.

For instance, to block all color parameters except for “black,” your file might include:

User-agent: *
Disallow: /*color*
Allow: /*color=black*

A word of caution: This strategy only works well if your URLs follow a consistent structure. Without clear patterns, it becomes harder to manage, and you risk accidentally blocking key pages or leaving unwanted URLs crawlable.

If you’re working with complex URLs or an inconsistent setup, consider combining this with other techniques like meta noindex tags or parameter handling in Google Search Console.

8. Be Selective With Internal Links

Internal links signal importance to search engines. So, if you link frequently to faceted URLs that are canonicalized or blocked, you’re sending mixed signals.

Consider using rel="nofollow" on links you don’t want crawled – but be cautious. Google treats nofollow as a hint, not a rule, so results may vary.

Point to only canonical URLs within your website wherever possible. This includes dropping parameters and slugs from links that are not necessary for your URLs to work.

You should also prioritize pillar pages; the more inlinks a page has, the more authoritative search engines will deem that page to be.

In 2019, Google’s John Mueller said:

“In general, we ignore everything after hash… So things like links to the site and the indexing, all of that will be based on the non hash URL. And if there are any links to the hashed URL, then we will fold up into the non hash URL.”

9. Use Analytics To Guide Facet Strategy

Track which filters users actually engage with, and which lead to conversions.

If no one ever uses the “beige” filter, it may not deserve crawlable status. Use tools like Google Analytics 4 or Hotjar to see what users care about and streamline your navigation accordingly.

10. Deal With Empty Result Pages Gracefully

When a filtered page returns no results, respond with a 404 status, unless it’s a temporary out-of-stock issue, in which case show a friendly message stating so, and return a 200.

This helps avoid wasting crawl budget on thin content.

11. Using AJAX For Facets

When you interact with a page – say, filtering a product list, selecting a color, or typing in a live search box – AJAX lets the site fetch or send data behind the scenes, so the rest of the page stays put.

It can be really effective to implement facets client-side via AJAX, which doesn’t create multiple URLs for every filter change. This reduces unnecessary load on the server and improves performance.
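
As a rough sketch (the endpoint and render function are hypothetical), a client-side filter might fetch results without changing the URL at all:

async function applyColorFilter(color) {
  // Fetch the filtered product list in the background...
  const response = await fetch('/api/products?color=' + encodeURIComponent(color));
  const products = await response.json();

  // ...and re-render the grid in place, so no new crawlable URL is created.
  renderProductGrid(products); // hypothetical render helper
}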

12. Handling Pagination In Faceted Navigation

Faceted navigation often leads to large sets of results, which naturally introduces pagination (e.g., ?category=shoes&page=2).

But when combined with layered filters, these paginated URLs can balloon into thousands of crawlable variations.

Left unchecked, this can create serious crawl and index bloat, wasting search engine resources on near-duplicate pages.

So, should paginated URLs be indexed? In most cases, no.

Pages beyond the first page rarely offer unique value or attract meaningful traffic, so it’s best to prevent them from being indexed while still allowing crawlers to follow links.

The standard approach here is to use noindex, follow on all pages after page 1. This ensures your deeper pagination doesn’t get indexed, but search engines can still discover products via internal links.

When it comes to canonical tags, you’ve got two options depending on the content.

If pages 2, 3, and so on are simply continuations of the same result set, it makes sense to canonicalize them to page 1. This consolidates ranking signals and avoids duplication.

However, if each paginated page features distinct content or meaningful differences, a self-referencing canonical might be the better fit.

The key is consistency – don’t canonicalize page 2 to page 1 but page 3 to itself, for example.

As for rel="next" and rel="prev": while Google no longer uses these signals for indexing, they still offer UX benefits and remain valid HTML markup.

They also help communicate page flow to accessibility tools and browsers, so there’s no harm in including them.

To help control crawl depth, especially in large ecommerce sites, it’s wise to combine pagination handling with other crawl management tactics:

  • Block excessively deep pages (e.g., page=11+) in robots.txt.
  • Use internal linking to surface only the first few pages.
  • Monitor crawl activity with log files or tools like Screaming Frog.

For example, a faceted URL like /trainers?color=white&brand=asics&page=3 would typically:

  • Canonical to /trainers?color=white&brand=asics (page 1).
  • Include noindex, follow.
  • Use rel="prev" and rel="next" where appropriate.
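
In the page head, that combination might look like this (the URLs are illustrative):

<link rel="canonical" href="https://www.example.com/trainers?color=white&brand=asics">
<meta name="robots" content="noindex, follow">
<link rel="prev" href="https://www.example.com/trainers?color=white&brand=asics&page=2">
<link rel="next" href="https://www.example.com/trainers?color=white&brand=asics&page=4">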

Handling pagination well is just as important as managing the filters themselves. It’s all part of keeping your site lean, crawlable, and search-friendly.

Final Thoughts

When properly managed, faceted navigation can be an invaluable tool for improving user experience, targeting long-tail keywords, and boosting conversions.

However, without the right SEO strategy in place, it can quickly turn into a crawl efficiency nightmare that damages your rankings.

By following the best practices outlined above, you can enjoy all the benefits of faceted navigation while avoiding the common pitfalls that often trip up ecommerce sites.


Ask An SEO: Should We Optimize For Keywords With High Search Volume Or Competition? via @sejournal, @rollerblader

In this week’s Ask An SEO, Chandrika asks:

“What are the important points to consider when doing keyword research for SEO using Google Keyword Planner? Should we focus on keywords with a monthly search volume of 500? Or, should we prioritize keywords with low or high competition?”

This is a great question, and here’s an easy answer: Don’t focus on the keyword. Focus on the solution for the user based on the intent of the keyword.

Google Keyword Planner shares the estimated search volume for a keyword, but that doesn’t mean the entire volume represents your audience. Some of them may be looking for information rather than shopping, and only a portion of them are there to be converted into revenue.

The word “bark,” for example, could be the bark on a tree or the noise a dog makes.

A search for bark on a tree could be about what it looks or feels like, whether it’s a sign that the tree is healthy, or how to use it to determine the age or genus of the tree.

“Bark” for a dog could refer to the specific sounds made by certain breeds, could indicate that the dog is sick, or the user is looking for ways to get a dog to stop barking or train a dog to bark on command.

If there are 500 searches, perhaps 300 are for the noise a dog makes; of those, 200 are about determining whether the dog is sick or healthy, and 50 are about training a dog to bark.

If you sell books on dog training, this may not be the best phrase to go after, but it is a topic you may want to cover. This is where optimizing for the topic comes in.

The topic will encompass the “SEO keywords” and increase the potential pool of traffic based on the entity it ranks for, and the solution it provides.

Optimize For The Solution And Topic

Instead of optimizing for a keyword by stuffing it into the copy, headers, and title, optimize for the topic it relates to.

Ask yourself what the person searching for this keyword is looking for, and build a series of pages that meet these needs.

  • If it is a conversion phrase, then incorporate the questions and solutions the person has related to the product query into the product or collection page. This can be done in the copy itself or in the FAQs, if your template has them.
  • When the keyword has an informational and conversion intent, such as “micro needling,” it can be about the process and procedure, a before-and-after photo series, or someone looking to find a local med spa. This means your site should have multiple content types for the SEO keywords based on the stage of the customer’s journey, including:
    • Pages that show the before and after, and by skin type and age.
    • Blog posts and guides that cover the process and alternatives if it isn’t a match.
    • Comparisons between micro needling and similar procedures to help the person know which is better suited to their needs.
    • A direct conversion page where you can onboard the lead or take payment.

By creating guides that address the topic, your website becomes stronger for the specific phrases.

Machine learning and AI are getting better at understanding what the content solves, and they use the trustworthiness of the content and its phrasing to determine the keywords the page should rank for.

If the content is clearly showing knowledge and expertise, and the claims or solutions are backed up by proven facts, you can show up for keywords without optimizing for the phrase from Google Keyword Planner.

Once you have completed the content for its user intent, like shopping or learning, add schema.

Use Article schema for informative content: NewsArticle if you’re a news site, BlogPosting if you’re a blog. Use shopping schema, such as Product, Collection, or Service, along with areaServed and additionalType properties, to help drive the intent of the page home.
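
To make the micro needling example above concrete, here is a hedged JSON-LD sketch for a local service page; the business name, city, and URLs are hypothetical:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Service",
      "serviceType": "Microneedling",
      "additionalType": "https://en.wikipedia.org/wiki/Collagen_induction_therapy",
      "areaServed": { "@type": "City", "name": "Denver" },
      "provider": {
        "@type": "MedicalBusiness",
        "name": "Example Med Spa",
        "url": "https://www.example.com/"
      }
    }
    </script>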

Keywords With Higher Search Volumes

Keywords with high search volumes are tempting to optimize for. However, instead of worrying about the keyword, take other keywords that are similar and are part of the solution.

Put those together into a group, and then think about how they work together to educate the person, giving them the information they need to make an informed purchase decision, whether on a product page or a collection/category page.

Keywords and search volumes are part of topics, but you don’t focus on their volumes – focus on the solutions for the phrases.

Your goal is to create the ultimate resource for the topic, whether it’s a question, a guide, or compatibility for products and services.

When you do this, the keyword search volume may multiply exponentially, and you can optimize the same page for multiple high-volume phrases.

By doing this, and by keeping a content map of your website, you may also be able to avoid creating content that cannibalizes itself.

When you know a page is dedicated to a topic and specific intent, you have your reminder not to create another page just because there is a search volume you found.

Instead, try to incorporate the theme of the phrase based on the search intent into the correct page for that search volume.

Competition Scores Do Not Matter

Someone has to show up for the phrase, so why shouldn’t it be you?

Competition scores are scores made up by SEO tools, not used by search engines.

Search engines are concerned with providing the most accurate answer in the easiest-to-absorb format and in the fastest way possible. If you do this, you may be the site that gets the ranking and the traffic.

For highly competitive phrases where big money is being spent, you will need some authority and trust, but there’s no reason you shouldn’t create the content that can rank.

You may get lucky and take traffic from the more established sites – it happens a lot. When it does, it can attract backlinks naturally from highly authoritative sites, which helps build your site’s stability.

Another reason to create this content now is that having it in an easy-to-use and trustworthy format can help it rank once your website is strong enough. I’ve seen this happen, where multiple pages rise to the top during core updates.

If you don’t create the content because you think it’s too competitive, you won’t have the chance to rank it when core updates happen.

The last thing I’d consider when looking at keywords with 500+ monthly searches is the long tail.

Long-tail phrases can be part of the topic. When you filter a keyword research tool to only show volumes at 500+, you miss out on parts of the entity, which can include consumer questions.

Knowing what matters to the consumer or user helps to provide them with more complete solutions.

When the page answers all of their questions, they can now convert (if your funnel is good), or they may subscribe to your publication because you’re a solution provider.

We never focus on SEO keyword volume when doing research, but we love high volumes when we find them.

We look at what will benefit the person on the page and if it matches the topic of the site, products, and services.

From there, we use keywords and search volumes to set a potential goal in traffic, but we don’t stress if there is no search volume.

Google Discover data, for example, isn’t going to show up, but if the content aligns with interests and your site qualifies, you could get featured and attract a ton of new visitors.

I hope this helps answer your question.


Ask An SEO: How To Convince C-Suite To Support Brand-Based SEO via @sejournal, @MordyOberstein

In this edition of Ask An SEO, a marketing leader reaches out with a question:

My company put pressure on me to deliver results of more traffic to our product pages.

How can I try to convince the CMO that we should invest more in brand building that will most likely reduce traffic?

There’s so much to chew on with this question. Before I get into the thick of it, I want to challenge the premise: “brand building that will most likely reduce traffic.”

It’s something I hear from clients often enough. It’s a premise I hear from SEOs all the time. While it may be true in this specific case, I would like to say something regardless.

I’m glad brand has entered into the SEO conversation. Long overdue.

At the same time, brand hasn’t been the forte of the search marketing industry. As a result, there’s a lot being said that, when put under scrutiny, doesn’t hold up.

I’d take a lot of the brand strategy you’re hearing from the SEO industry with a grain of salt.

Just because you target an audience doesn’t mean you lose wider reach. It can happen – and very often it should happen – but it doesn’t have to happen.

You can speak to a core audience very deeply while not losing the attention of your secondary audience. Streaming platforms do this all the time.

Apple TV has an identity around great sci-fi content, but it also speaks to a wider audience as it throws some solid comedies into the mix (at least in my opinion).

Both of these “identities” work because there is a common thread: Apple puts out higher-quality content than other platforms.

So, will you lose traffic by focusing on brand? You probably should, but that’s only because I’ve been around the proverbial SEO block a few times.

It is, however, entirely possible to do things like pivot to a new audience while retaining the old one.

Losing audience as a result of building the brand is 100% not an inherent outcome. If anything, in the long term, it’s the total opposite. Brand building is all about connecting with more audiences over time.

Let’s move on to your question and work with the premise that you will lose traffic by sharpening your content and audience targeting.

I’m not even going to go into the obvious point and glaring absurdity of not wanting to have a more specific focus and more refined audience targeting in favor of “traffic.”

So, we’re going to work with two premises:

  1. You will lose traffic by focusing on brand.
  2. Not getting that “traffic” is “bad” somehow.

How do you sell this to the CMO?

For The Conceptual CMO

I’m going to start at a very conceptual level that will probably not speak to your CMO, but is very important for you to understand when you make your pitch.

The web is not the web you think it is. The web was a place where Wired could write about coffee mugs and rank them because everything was on an equal footing.

It was one giant “web” that was unified, where anyone could rank for anything so long as the content was halfway decent.

That web doesn’t exist anymore.

There is no “internet.” There is the internet that talks about home goods. There is the internet that deals with technology products. There is the internet that covers sports.

On this internet, Wired isn’t relevant for coffee mugs. That’s not its sphere of influence. The web is no longer one giant unified void that algorithms sift through to pick up whatever site on whatever topics.

Think of the internet as independent spheres that sometimes move and overlap with other orbits but are generally self-contained.

If you’re selling this vision to a CMO, I would pitch it as getting ahead of the curve. You’re getting ahead of the traffic loss that has already hit so many sites and is going to hit yours eventually.

I would sell this as “being able to perform as the landscape shifts.” You have to function in alignment with the ecosystem. There’s no way around it.

If you don’t, it will all hit the fan. It’s only a question of “when.” Usually, brands will wait until it’s already too late.

Not operating within the confines of the ecosystem is like trying to row a boat on an ice skating rink using a tennis racket.

For The Pragmatic CMO

The conceptual construct I just defined above will not speak to most CMOs.

While it’s extremely likely that you, the VP of marketing, head of growth, marketing manager, etc., understand this point, most CMOs are not in touch with the ecosystem enough to be swayed by this argument.

For most CMOs, I would start with the competition. Show similar sites that have undergone traffic losses because they haven’t changed with the tides.

If you’re a HubSpot competitor, showcase all the traffic HubSpot lost. And then, translate that into all the dollars spent in time and resources trying to capture traffic, as if it were 2015.

Image from author, April 2025

Honing your audience makes it less expensive to run marketing campaigns and produce assets.

Don’t pay to speak to everyone. Pay to speak to the right ones.

If your marketing strategy is aimed at casting a wide net, you will inevitably either pay for content production that isn’t of value or pay for raw visibility that isn’t worth the price.

You can also do the opposite. You can show competitors who have gotten ahead of the curve. That usually lights a fire under most CMOs. Seeing that the competition is getting “ahead” in whatever way is very uncomfortable for the C-level staff.

If you can show that your strategy is already being implemented by competitors, squeeze. And frame it. Frame it well: “Our competitors are starting to speak more directly to our ultimate target audience, and you can see that here, here, and here.” That will have an impact that they won’t ignore.

You have to try to concretize this as much as possible.

The problem with brand, as Alli Berry once put it to me, is that it’s the silent killer. I have witnessed this firsthand on more than one occasion with clients.

You don’t realize it’s the decline of brand efficacy until you have a real problem on your hands.

What happens is, time and time again, the decline of brand efficacy first manifests itself in whatever performance channel.

Suddenly, your social media performance or organic search performance is on the decline.

The immediate knee-jerk reaction brands have (especially as you move up the ladder) is to fix the channel.

These are the meetings where you are told to change things up and fix performance. You know, the meetings where you leave with your head shaking since it’s clear no one knows what they’re talking about.

The reason this happens is that the issue isn’t the channel. There’s nothing wrong with the social or SEO strategy per se. Rather, there’s a huge gap in the brand strategy, and it’s starting to have an impact.

The “suddenness” of a performance problem can be an external shift (a change in consumer behavior, for example) – and that definitely can and does happen.

However, from my experience, the usual culprit is the loss of brand traction.

Often, a product hits the market at the right time, in the right way, in the right place. The stars align and the brand takes off.

At a certain point, the brand hits what I tell my clients is a “maturity inflection point.” The brand can no longer ride the momentum of its product or service the same way, and brand efficacy (and marketing potency) ebbs away.

By the time this happens, most brands have a strong client base, etc., so they never look internally. Instead, they focus on the specific performance problems. Thus, the brand becomes the silent killer.

Your job is not to let this happen to you. If you’re managing a marketing team at whatever level, your job is to nip this problem in the bud.

Now, if your CMO is more reflective and so forth, then the argument I gave earlier might work.

This is not the norm, so you need to concretize the argument.

Whether it be, as I mentioned, through the competitor angle or whatever, you have to gain some perspective and then translate that perspective into practicality.

Roll With Your CMO

My last piece of advice is to know your audience. CMOs are often bold and brash (likely because they feel they have to be), so speak that language. Come with a plan that has a bit of edge and flair to it.

If that’s not your CMO, don’t. If they are more analytical by nature, show the data.

It’s just a matter of knowing your audience and what language they speak. You have to roll with where your CMO and company overall are at. Otherwise, you might have the greatest plan, but it won’t land.


Ask An SEO: My Content Can’t Be Seen When I Disable JavaScript – Will This Affect My SEO? via @sejournal, @HelenPollitt1

This week’s question comes from Thomas, who asks:

I disabled the JavaScript just to check the content of my webpage, but unfortunately I could not see any content except the banner H1 tag.

Will it hurt my SEO? If yes, what are the advisable solutions for this?

This is a great question – it’s something that all SEO professionals need to be aware of.

We spend so much time trying to create interesting, engaging content that it would be heartbreaking to think that it isn’t visible to search engines.

However, given the recent advancements in Google’s ability to render JavaScript content, is that something we still need to be concerned about?

The short answer is yes.

Why JavaScript Can Be A Problem

We know that to ingest information, Googlebot will discover a page, crawl it, parse and index it. For JavaScript, the crawler needs to “render” the code. The rendering stage is where JavaScript problems can occur.

JavaScript has to be downloaded and executed in order for the content to be parsed. This takes more resources than the bot parsing content in HTML.

As such, sometimes Google will defer the rendering stage and come back to a page to render it at a later date.

Most websites these days will use some JavaScript – that’s absolutely fine.

However, if your website requires JavaScript to load important content that is crucial to the page, then it might be a risk.

If, for some reason, a search bot does not render the JavaScript on a page, then it will not have any context as to what the page is about.

It is crucial to remember that not every search engine can render JavaScript. This is becoming increasingly important in the era of generative search engines – very few of which render JavaScript.

Diagnosing A Problem

You’ve done the right thing by starting to investigate the effect JavaScript rendering might be having on your site.

Turning off the JavaScript and seeing what content remains, and what is still interactive without it, is important.

I suggest going a step further and looking at what is available to the search bots to read on a page’s first load. This will help you identify content accessible without JavaScript rendering.

Check Google Search Console

First off, use Google Search Console’s URL Inspection tool and look at the rendered HTML. If the content is present in the rendered HTML, then Google should be able to read it.

Check Chrome Browser

You can go to “View Source” in Chrome to see what the pre-rendered HTML looks like. If the content is all there, you don’t need to worry any further.

However, if it’s not, then you can use the Developer Tools in Chrome for further diagnostics. Look in the “Elements” tab. If you can see your content, then again, you are probably OK.

Check The Robots.txt

Sometimes, developers may choose to block specific JavaScript files from being crawled by disallowing them in the robots.txt.

This isn’t necessarily an issue unless those files are needed to render important information.

It’s always worth checking your robots.txt file to see if there are any JavaScript files blocked that could prevent the bots, in particular, from accessing the content of the page.
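
For illustration, a rule like the following is the kind of thing to look for (the path is hypothetical):

    User-agent: *
    # If page content is built by scripts under /assets/js/, this rule
    # stops Googlebot from downloading them, and the rendered page may
    # come back without its main content.
    Disallow: /assets/js/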

Next Steps

JavaScript tends to worry a lot of people when it comes to SEO. It’s a significant part of the modern web, however. There’s no escaping the use of JavaScript.

We need to ensure that our websites use JavaScript in a way that still lets both popular and emerging search engines find and read our content.

You don’t need to worry, but you do need to be diligent.

If you have developer resources at hand, you can work with them to identify the most applicable solution.

Here are some checks you may want to make:

Are We Using Client-Side Rendering Or Server-Side Rendering?

Client-side rendering essentially utilizes the browser to render the JavaScript of a page.

When a page is visited, the server responds by sending the HTML code and the JavaScript files. The browser then downloads those files and generates the content from the JavaScript.

This is counter to server-side rendering, where the content is rendered by the server and then sent to the browser with the data provided.

In general, server-side rendering is easier for bots, can be a quicker experience for users, and tends to be the default recommendation for SEO.

However, it can be more costly for the websites and, therefore, isn’t always the default choice for developers.
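
As a simplified illustration, the initial HTML a bot receives differs like this (file names and content are hypothetical):

    <!-- Client-side rendering: the server sends an empty shell, and the
         content only exists after the browser runs the JavaScript bundle. -->
    <div id="app"></div>
    <script src="/bundle.js"></script>

    <!-- Server-side rendering: the same content arrives ready to parse. -->
    <div id="app">
      <h1>Waterproof Trainers</h1>
      <p>Browse our full range of waterproof trainers...</p>
    </div>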

Is Our Main Content Able To Be Rendered Without JavaScript?

The most important content on your page, the main content, needs to be parseable without JavaScript rendering.

That is always the safest way to ensure that bots can access the content.

Are We Using JavaScript Links?

A further consideration is whether your links can be crawled easily by the search bots.

It’s not always an issue to have links generated through JavaScript. However, there is a risk that bots might not be able to resolve them unless they are properly contained in an HTML <a> element with an href attribute.

Google states it “can’t reliably extract URLs from <a> elements that don’t have an href attribute or other tags that perform as links because of script events.”

Remember, though, it’s not just Google that you need to be conscious of. It’s always better to err on the side of making your links easy to follow.
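
In practice, the difference looks like this; the first pattern is reliably crawlable, the other two may not be (the handler and class names are made up for illustration):

    <!-- Crawlable: a standard anchor with a resolvable href -->
    <a href="/trainers/white">White trainers</a>

    <!-- Risky: no href, so navigation happens entirely in script -->
    <a onclick="goToCategory('white-trainers')">White trainers</a>

    <!-- Risky: not a link element at all -->
    <span class="nav-link" data-url="/trainers/white">White trainers</span>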

In Summary

It is crucial to make sure your content is accessible to bots, now and in the future.

That means that if your website relies heavily on JavaScript to load content, you may struggle to communicate that information to some search engines.

It’s true that Google is much better at rendering JavaScript-heavy sites than it used to be, but the SEO playing field is not just Google.

To make sure your website can perform well in search platforms beyond Google, you may want to change how your website renders content, making sure your main content is in HTML.


Ask An SEO: If I Am Not An SEO Expert, Is It Better For Me To Start An Agency? via @sejournal, @HelenPollitt1

Today’s question from Kazi is such an honest and important one for our industry. Kazi asks:

“I’m new to this sector. Should I start an agency as an one-man employee? I know it sounds ridiculous, but I am very much in need of work – more specifically, in need of money.

So, I think an agency will have better pull power than me working as a solo search engine optimizer because clients won’t get attracted if they see a new SEO learner…”

I understand that the crux of this question is, “Will you appear more legitimate and competitive if you are offering services as an agency rather than a solo contractor?”

However, I really want to address the more important aspect of this: Should you be offering SEO services for a fee as someone brand new to the industry?

My answer to that is no. Not only shouldn’t you be offering SEO services as an agency, but you also shouldn’t be offering them as a solo contractor if you are brand new to SEO.

Why SEO Is A Great Career Move

I completely get the appeal of this as a career move. On the face of it, SEO has no barriers to entry, which is great!

I fully recommend this industry to anyone passionate about analysis, psychology, creativity, sustainability, and technology.

You’ll hear me encourage it for people who like to problem-solve and find solutions that don’t have obvious answers.

I recommend it not only for those who have a background in creative pursuits, but also for those whose background is more tech-focused. It is a career that captures the interests of a lot of different types of people.

Not only does it give you the opportunity to earn money through skills that are in demand, but there are also no expensive overheads.

There are no formal qualifications needed and no regulatory bodies to convince. All you need is a computer, the internet, and the passion to develop your skills.

But it’s not easy.

There is a lot to learn before you can realistically start charging for your services. It is not just about the mechanics of SEO, but also how to apply them in different situations.

Risks To Clients

If you take on SEO work for a business or organization without knowing how to apply SEO concepts in practice, you could open them up to significant risk.

Learning On The Job

SEO is not a straightforward practice. There are a lot of variables and circumstances that affect what we might deem “good practice.” Because of this, you can’t apply a blanket solution to every situation you encounter.

If you are brand new to SEO, there may be nuances with your client’s industry, website, or tech stack that you aren’t aware of, which will impact how successful your strategy is.

A lot of SEO comes down to problem-solving. This is a skill that gets honed over time. As you encounter more situations and learn from what worked and what didn’t, you will become more adept at creating successful strategies.

If you are a brand-new SEO bearing full responsibility for the success of the organic performance of a client, you will come unstuck.

SEO is a great industry for learning on the job. However, if you are learning SEO from scratch, you don’t want the pressure of being the most senior SEO in the room. You will likely make mistakes, and these could be costly to your client.

May Cause Significant Traffic Loss

In some situations, a junior SEO working on a website alone could cause significant traffic loss for a client.

For example, you could accidentally set live a solution that de-indexes their whole site. You may not know how to guard against that sort of mistake. You could see your client’s organic traffic disappear in a matter of days.

These are risks that more experienced SEO professionals face as well, but after years of working in SEO, we can foresee what issues might arise from the recommendations we make.

Could Cause Financial Harm

If your SEO recommendations cause organic performance issues, your client’s revenue could be significantly affected.

If they run a business that relies on their website – and in particular, organic traffic – then your mistakes could be very costly.

People have lost jobs over organic traffic loss. Without much experience in SEO and no one more senior to help flag risks, this is something that you could easily cause.

Risks To You

Not only would charging for your services as an SEO when you’ve never done it before put your clients at risk, but it could also be harmful to you.

Significant Pressure

You will face significant pressure. You will be expected to set reasonable goals and achieve them in a timely manner.

Without much experience in SEO, that’s going to be incredibly difficult to do. You’ll either set unattainable goals that no SEO could realistically achieve, or you simply won’t have the skills to achieve more practical objectives.

With that, you will find yourself trying to appease an increasingly disgruntled client.

Any SEO professional who has worked as a freelancer or as part of an agency has had difficult conversations with a client who expected results that couldn’t be delivered.

However, an experienced SEO will be able to identify when that is likely to be the case and adapt strategy, or inform the client of more realistic timelines or goals.

They will also have ways to help the client feel like they are getting a genuine return on their investment, even if it’s not as much or as quickly as they had anticipated.

A brand-new SEO specialist simply will not know how to do that. It’s too much to expect from someone so early on in their career. You will likely feel the pressure of that.

Your Lack Of Experience Will Be Discovered Quickly

The above is really a best-case scenario – you actually make it to the point where you have convinced clients to trust you, and you are beginning to see that you can’t hold up to the promised performance standards.

Most likely, your lack of knowledge and experience will be identified more quickly. You may be working with people with more SEO knowledge than you, such as marketers or product owners.

They may not consider themselves experts in SEO, but they will assume you are if you sell it as a service. They will probably be able to identify significant gaps in your knowledge very early on in your relationship.

Not only will that likely sour the client-agency relationship, but you may also find yourself without a client pretty quickly.

Could Have Legal Ramifications

In some situations, positioning yourself as someone who can get results through offering a service – without the ability to fulfill that – could be a breach of contract.

I would be very wary of making promises to clients about your abilities unless you are upfront that you are brand new to SEO and they are among your first-ever clients.

What You Could Try Instead

So, if I’ve managed to give you pause about committing to offering SEO services as a contractor or an agency, may I suggest some other ways forward?

You can still make money through a career in SEO, even if you are beginning to learn about it.

Join A Company As An Intern

You are clearly passionate about SEO if you are already thinking about starting a business working in the industry. That passion is a great start.

Consider finding an entry-level job in the SEO field and learning on the job in a more supportive, less pressured way.

You could find an agency or in-house position that values your drive and ambition but can support you with the right resources and opportunities to learn SEO while minimizing the risk to yourself and others.

Practice On Sites That You Can Fail With And Learn

If you are struggling to find employment within SEO but want to learn it to get into a position where you can legitimately offer SEO services, you need practice.

Do not practice on sites that rely on organic traffic. Instead, consider building your own sites around subjects you are passionate about and develop your experience and confidence in SEO.

You can make mistakes, weather traffic drops, and be hit by algorithm updates – all without risking anyone’s livelihood. Through that, you will develop the skills you need to work on other people’s sites.

I would still consider graduating to other sites where the risk is low. For example, volunteering your time to work alongside other SEO professionals.

Or, you could try optimizing the site of a friend who understands you are still learning SEO and is happy for you to practice and make mistakes on their site.

Set Up Your Own Site And Monetize It

If you are determined to make money through SEO right away, then build and optimize a site that you can monetize.

This might mean an affiliate site, or perhaps you can start a business that drop-ships.

Whatever course you take, make sure that the risk is minimal, and you will not suffer if you lose traffic and revenue while perfecting your SEO skills.

Make Sure You Have The Experience Before You Go Alone

Most importantly, understand that learning SEO well takes time.

You can easily read up on SEO and have a very high theoretical knowledge of it, but you still need to put what you’ve learned into practice.

This way, you will be able to understand how to adapt strategies for different goals or how to rally when performance doesn’t go as expected.

I want to encourage you to pursue a career in SEO, but I caution you against running before you can walk.
