Ask An SEO: What Is The Threshold Between Keyword Stuffing & Being Optimized? via @sejournal, @rollerblader

In this week’s Ask An SEO, Bre asks:

“What is the threshold between keyword stuffing and being optimized? Is there a magic rule for how often to use your main keyword and related keywords in a 2,000-word page? Should the main keyword be in the Headers AND the body in the same section?”

Great question!

There is no such thing as “being optimized” when it comes to keywords and repetitions. This is similar to looking at “authority” scores for domains. The optimization scores you get are measurements based on what an SEO tool thinks gives a domain trust, not signals from the actual search engines or AI systems. The idea that a keyword needs to be repeated comes from an SEO concept called keyword density, which is a creation of SEO tools, not search engines.

Each tool has a different way of deciding whether you repeated a word or phrase enough for it to be “SEO friendly,” and because people trust the tools, they trust that this is a valid ranking factor or signal for a search engine. It is not: search engines do not pay attention to how many times a word appears on a page or in a paragraph, because rewarding repetition doesn’t produce a good experience.
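To make concrete what these tools are actually measuring, here is a minimal sketch of a keyword density calculation in Python. This is purely an illustration of the metric the tools report, not something search engines use or something you should optimize for; the sample text is made up.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percentage of words in `text` accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count exact phrase matches over a sliding window of words.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words)

text = "Blue shorts are great. Our blue shorts ship fast."
print(round(keyword_density(text, "blue shorts"), 1))  # 44.4
```

A density that high would trip most tools’ “stuffing” warnings, which is the point: the number describes the text, not how a search engine values it.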

Panda reduced the effectiveness of low-quality, keyword-stuffed content, and Google’s later advancements, BERT and MUM, allowed better understanding of context, relationships between terms, and the overall structure of a page. Google is now far better at interpreting meaning without relying on repeated exact-match keywords.

With that said, keywords are important.

Keywords help to send a signal to a search engine about the topic of the page. And they can be used in headers, within text, as internal links, within title tags, schema, and the URL structure. But worrying about using the keyword for SEO purposes can lead to trouble. So, let’s define keyword stuffing for the sake of this post.

Keyword stuffing is when you force a keyword or keyword phrase into content, headers, and URLs for the sole purpose of SEO.  

By forcing a keyword into a post, or forcing it into headers, you hurt the user experience. Although the search engine will know what you want to rank for, the language won’t feel natural. Instead of worrying about how many times you say the keyword, think about synonyms and other ways to say things that are easy to understand. Many search engines are getting better and better at understanding how topics, words, sentences, and phrases relate to one another. You don’t have to repeat the same words over and over anymore.

If you Google the word “swimsuit,” you’ll likely see it in a couple of title tags, but also see “swimwear.” Now type “bathing suits” in, you’ll likely not see it in a ton of the title tags, but the title tags will say “swimwear” and other synonyms, even though “bathing suits” is a popular name for the same product.

Now try “hairdresser near me,” and you’ll likely not see “hairdresser” in a lot of the results, but you will see “hair salon” and similar types of businesses. This is because search engines produce solutions to problems, and if they understand the page has the solution, you don’t need to keep repeating keywords.

For example, instead of saying “keyword stuffing” in this post, I could say “overusing phrases for SEO.” It means the same thing. Readers of this column would get bored pretty fast if I kept saying keyword stuffing, and by mixing it up, I can keep their interest while search engines are still able to determine the terms are one and the same. This also applies to header tags.

I don’t have any solid proof of this, but it seems to work well for our clients and the content we create, and it has worked for more than 10 years. If the main keyword phrase is in the H1 tag, whether it is a menu item or a blog post, we don’t worry about placing it in H2, H3, etc. I won’t be upset if the keyword shows up naturally, as that creates a good UX.

The theory here is that headers carry the theme and topic through the sections below them. If the top-level header has the word “blue” in it, I assume the theme “blue” carries through the page and applies to the H2 tags, as each H2 is a sub-topic of “blue.” H2s for “blue” could be “t-shirts” and “shorts.”

If this is true, by having the H1 be “blue” and the H2 be “shorts,” a search engine will know they are “blue shorts,” and I feel very confident users will too. They clicked blue or found a SERP for blue clothing, and they clicked shorts from the menu or found them from scrolling.

If you stuff “blue” into each link and header, it is annoying for the user to see it over and over. But many sites that get penalized will have “blue cargo shorts,” “blue chino shorts,” “blue workout shorts,” etc. It looks nicer to just say the styles of shorts like “cargo” or “chino,” and search engines likely already know they’re blue because you had it in the H tag one level up. You also likely have the “blue” part in breadcrumbs, site structure, product descriptions, etc.

One thing you definitely do not want to do is have a million footer links that match the navigation or are keyword-stuffed. This worked a long time ago, but now it is just spam. It doesn’t benefit the user, and it is obvious to search engines you’re doing it for SEO. Sites that stuff keywords tend to use these outdated tactics too, so I want to include it here.

I hope this helps answer your question about overusing specific topics or phrases. Doing this only makes the tool happy; it does not mean you’ll be creating a good UX for users or search engines. If you focus on writing for your consumer and incorporate a keyword or phrase naturally, you’ll likely be rewarded.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: Digital PR Or Traditional Link Building, Which Is Better? via @sejournal, @rollerblader

This week’s Ask An SEO question is:

“Should SEOs be focusing more on digital PR than traditional link building?”

Digital PR is synonymous with link building at this point, as SEOs needed a new way to package and resell the same service. Actual PR work will always be more valuable than link building because PR, whether digital or traditional, focuses on a core audience of customers and reaching specific demographics. This adds value to a business and drives revenue.

With that said, here’s how I’d define digital PR vs. link building if a client asked what the difference is.

  • Digital PR: Getting brand coverage and citations in media outlets, niche publications, trade journals, niche blogs, and websites that do not allow guest posting, paid links, or unvetted contributors with the goal of building brand awareness and driving traffic from the content.
  • Link Building: Getting links from websites as a way to try and increase SERP rankings. Traffic from the links, sales from the links, etc., are not being tracked, and the quality of the website can be questionable.

Digital PR is always going to be better than link building because you’re treating the technique as a business and not a scheme to try and game the rankings. Link building became a bad practice years ago as links became less relevant (they are still important, so I want to ensure that isn’t taken out of context), and we stopped doing traditional link building completely. Quality content attracts links naturally, including media mentions. When this happens in a natural way, the website will begin rising because it has a lot of value for users, and search engines can tell when a site is quality.

If you’re building links without evaluating the impact they have on traffic and sales, you’re likely setting your site up for failure. Getting a ton of links, just like creating content en masse with AI/LLMs or article spinners, can grow a site quickly. That URL/domain can then burn to the ground just as fast.

That’s why when we purchase a link, an advertorial, or we’re doing a partnership, we always ask ourselves the following questions:

  • Is there an active audience on this website that is also coming back to the website via branded search for information?
  • Is the audience on this website part of our customer base?
  • Will the article we’re pitching or being featured in be helpful to the user, and is our product or service something that is part of the post naturally vs. being forced?
  • Are we ok with the link being nofollow or sponsored if we’re paying for the inclusion?

If the answer is yes to these four, then we’re good to go with the link. The active audience on the website and people returning by brand name means there is an audience that trusts them for information. If the readership, visitors, or customers are similar or the same demographics as our user base, then it makes sense we’d want to be in front of them where they go for information.

We may have knowledge that is helpful to the user, but if it is not on topic within the post, there is no reason for them to come through and use our services, buy our products, or subscribe to our newsletters. Instead, we’ll wait until there is a fit, so there is a direct “link” between the content we’re contributing, or being an expert on, and our website.

For the last question, our goal is always traffic and customer acquisition, not getting a link. The website owner controls this, and if they want to follow Google’s best practices (which we obviously recommend doing), we will still be happy if they mark it as sponsored or nofollow. This is the most important of the questions. Building links to game the SERPs is a bad idea; building a brand that people search for by name will overpower any link any day of the week. This is always our goal when it comes to digital PR and link building: driving that branded search.

So, that begs the question, where do we go for digital PR?

Sources To Get Digital PR Mentions And Links

When we’re about to start a Digital PR campaign, we create lists of the following targets to reach out to.

  • Mass Media: Household names like magazines, news websites, and local media, where everyone in the area, the customers, or the country or world knows them by name. The only stipulation we apply is if they have an active category vs. only a few articles here and there. The active category means it is something interesting enough to their reader base that they’re investing in it, so our customers may be there.
  • Trade Publications: Conferences, associations, and non-profits, as well as industry insiders, will have websites and print publications that go out to members. Search Engine Journal could be considered a trade publication for the SEO and PPC industry, the same with Search Engine Roundtable and some communities like WebmasterWorld. They publish directly relevant content for search engine marketers and have active users, so if I were an SEO service provider or tool, this is where I’d be looking to get featured and, ideally, earn links.
  • Niche Sites and Bloggers: There is no shortage of niche sites and content producers out there. The trick is finding ones that do not publicly allow guest contributions, advertorials, etc., and that do not link out to non-niche websites and content. This includes sites that got hacked and had link injections. Even if their “authority” is zero, there is value if they quality control and all links and mentions are earned.
  • Influencers: Whether it is YouTube, Facebook group leaders, crawlable LinkedIn content, or other channels, getting coverage from people with subscribers and an active audience can let search engines crawl the link back to your website. It may not boost your rankings, but it drives customers to you and helps with page discoverability if the link gets crawled. LLMs are also citing their content as sources, so there could be value for AIO, too.

Link building is not dead by any means; links still matter. You just don’t need to build them anymore. Focus on quality where an active audience is and where you have a chance at getting traffic and revenue. This is what will move the needle for the long run and help you grow in SERPs that matter.

Featured Image: Paulo Bobita/Search Engine Journal

Ask an SEO: Is An XML Or HTML Sitemap Better For SEO? via @sejournal, @HelenPollitt1

In this edition of Ask An SEO, we break down a common point of confusion for site owners and technical SEOs:

Do I need both an XML sitemap and an HTML one, and which one is better to use for SEO?

It can be a bit confusing to know whether it’s better to use an XML sitemap or an HTML one for your site. In some instances, neither is needed, and in some, both are helpful. Let’s dive into what they are, what they do, and when to use them.

What Is An XML sitemap?

An XML sitemap is essentially a list of URLs for pages and files on your website that you want the search bots to be able to find and crawl. You can also use the XML sitemap to detail information about the files, like the length of run-time for the video file specified, or the publication date of an article.

It is primarily used by bots. There is little reason why you would want a human visitor to use an XML sitemap. Well, unless they are debugging an SEO issue!

What Is The XML Sitemap Used For?

The purpose of the XML sitemap is to help search bots understand which pages on your website should be crawled, as well as giving them extra information about those pages.

The XML sitemap can help bots identify pages on the site that would otherwise be difficult to find. This can be orphaned pages, those with low internal links, or even pages that have changed recently that you may want to encourage the bots to recrawl.

Best Practices For XML Sitemaps

Most search bots will understand XML sitemaps that follow the sitemaps.org protocol. This protocol defines the necessary location of the XML sitemap on a site, the schema it needs to use to be understood by bots, and how to prove ownership of domains in the case of cross-domain references.

There is typically a limit on how large an XML sitemap can be and still be parsed by search bots. When building an XML sitemap, you should ensure it is under 50 MB uncompressed and contains no more than 50,000 URLs. If your website is larger, you may need multiple XML sitemaps to cover all of the URLs. In that instance, you can use a sitemap index file to organize your sitemaps in one location.
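The splitting-plus-index approach can be sketched in a few lines of Python. This is an illustrative sketch only: the file-name pattern and URL set are hypothetical, and a real generator would also set fields like `lastmod` per the sitemaps.org protocol.

```python
# Split a large URL list into sitemap files of at most 50,000 URLs each,
# then emit a sitemap index file that points at them.
MAX_URLS = 50_000

def build_sitemaps(urls, base="https://example.com/sitemap"):
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    sitemaps = []
    for n, chunk in enumerate(chunks, start=1):
        body = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
        xml = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{body}\n</urlset>\n"
        )
        sitemaps.append((f"{base}-{n}.xml", xml))
    # The index file lists each child sitemap's location in one place.
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(f"  <sitemap><loc>{name}</loc></sitemap>" for name, _ in sitemaps)
        + "\n</sitemapindex>\n"
    )
    return sitemaps, index
```

For example, 120,000 URLs would produce three sitemap files (50,000 + 50,000 + 20,000) plus one index referencing all three.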

As the purpose of the XML sitemap is typically to help bots find your crawlable, indexable pages, you should usually ensure the URLs it contains all return 200 server response codes. In most instances, the URLs should be the canonical versions and not carry any crawl or index restrictions.

Things To Be Aware Of With XML Sitemaps

There may be good reasons to go against “best practice” for XML sitemaps. For example, if you are implementing a lot of redirects, you may wish to include the old URLs in an XML sitemap even though they will return a 301 server response code. Adding a new XML sitemap containing those altered URLs can encourage the bots to recrawl them and pick up the redirects sooner than if they were left to find them by crawling the site. This is especially the case if you have gone to the trouble of removing links to the redirected URLs on the site itself.

What Is An HTML Sitemap?

The HTML sitemap is a set of links to pages within your website. It is usually linked to from somewhere easily accessible, like the footer, so users can find it if they are specifically looking for it. It doesn’t form part of the main navigation of the site; it serves more as an accompaniment to it.

What Is An HTML Sitemap Used For?

The idea of the HTML sitemap is to serve as a catch-all for navigation. If a user is struggling to find a page on your site through your main navigation elements, or search, they can go to the HTML sitemap and find links to the most important pages on your site. If your website isn’t that large, you may be able to include links to all of the pages on your site.

The HTML sitemap pulls double duty. Not only does it work as a mega-navigation for humans, but it can also help bots find pages. As bots will follow links on a website (as long as they are followable), it can help them find pages that are otherwise not linked to, or are poorly linked to, on the site.

Best Practices For HTML Sitemaps

Unlike the XML sitemap, there is no specific format that an HTML sitemap needs to follow. As the name suggests, it tends to be a simple HTML page that contains hyperlinks to the pages you want users to find through it.

In order to make it usable for bots, too, it is important that the links are followable, i.e., they do not have a nofollow attribute on them. It is also prudent to make sure the URLs they link to aren’t disallowed in robots.txt. Links that aren’t followable won’t cause you any serious issues; they just stop the sitemap from being useful to bots.
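As a rough illustration of that check, Python’s built-in HTML parser can separate followable links from nofollow ones on a sitemap page. The markup fed in below is a made-up fragment, and a real audit would fetch the live page and also cross-reference robots.txt.

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Collect hrefs on a page, separating followable links from nofollow ones."""
    def __init__(self):
        super().__init__()
        self.followable, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href")
        if not href:
            return
        rel = (a.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rel else self.followable).append(href)

audit = NofollowAudit()
audit.feed('<a href="/shorts">Shorts</a> <a rel="nofollow" href="/tmp">Tmp</a>')
print(audit.followable, audit.nofollow)  # ['/shorts'] ['/tmp']
```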

Things To Be Aware Of With HTML Sitemaps

Most users are not going to go to the HTML sitemap as their first port of call on a site. It is important to realize that if a user is going to your HTML sitemap to find a page, it suggests that your primary navigation on the site has failed them. It really should be seen as a last resort to support navigation.

Which Is Better To Use For SEO?

So, which is more important for SEO? Well, neither; it really depends on your website and its needs.

For example, a small website with fewer than 20 pages may not have a need for either an XML sitemap or an HTML sitemap. In this instance, if all the pages are linked to well from the main navigation system, the chances are high that users and search bots alike will easily be able to find each of the site’s pages without additional help from sitemaps.

However, if your website has millions of pages, and has a main navigation system that buries links several sub-menus deep, an XML sitemap and an HTML sitemap may be useful.

They both serve different purposes and audiences.

When To Use The XML Sitemap

In practice, having an XML sitemap, or several, can help combat crawl issues. It gives a clear list of all the pages that you want a search bot to crawl and index. An XML sitemap can also be very helpful for debugging crawling issues, as when you upload it to Google Search Console, you will get an alert if there are issues with it or the URLs it contains. It can allow you to narrow in on the indexing status of URLs within the XML sitemap. This can be very useful for large websites that have millions of pages.

Essentially, there isn’t really a reason not to use an XML sitemap, apart from the time and cost of creating and maintaining them. Many content management systems will automatically generate them, which can take away some of the hassle.

Really, if you can have an XML sitemap, you might as well. If, however, it will be too costly or developer-resource intensive, it is not critical if your site is fairly small and the search engines already do a good job of crawling and indexing it.

When To Use The HTML Sitemap

The HTML sitemap is more useful when a website’s navigation isn’t very intuitive, or the search functionality isn’t comprehensive. It serves as a backstop to ensure users can find deeply buried pages. An HTML sitemap is particularly useful for larger sites that have a more complicated internal linking structure. It can also show the relationship between different pages well, depending on the structure of the sitemap. Overall, it is helpful to both users and bots, but is only really needed when the website is suffering from architectural problems or is just exceedingly large.

So, in summary, there is no right or wrong answer to which is more important. It is, however, very dependent on your website. Overall, there’s no harm in including both, but it might not be critical to do so.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: Do I Need To Rethink My Content Strategy For LLMs? via @sejournal, @MordyOberstein

For this week’s Ask An SEO, the question asked was:

“Do I need to rethink my content strategy for LLMs and how do I get started with that?”

To answer, I’m going to explain the non-linear journey down the customer journey funnel and where large language models (LLMs) show up.

From rethinking traffic expectations to conducting an audit on sentiment picked up by LLMs, I will talk about why brand identity matters in building the kind of reputation that both users and machines recognize as authoritative.

You can watch this week’s Ask An SEO video and read the full transcript below.

Editor’s note: The following transcript has been edited for clarity, brevity, and adherence to our editorial guidelines.

Don’t Rush Into Overhauling Your Strategy

Off the bat, I strongly advise not to rush into this. I know there’s an extreme amount of noise and buzz and advice out there on social media that you need to rethink your strategy because of LLMs, but this thing is very, very far from settled.

For example, or most notably, AI Mode is still not in traditional search results. When that happens, when Google moves the AI Mode tab from being a tab into the main search results, the whole ecosystem is set for another upheaval, whatever that looks like, because we don’t actually know what that will look like.

I personally think that Google’s Gemini demo (the one they did way, way back, where they showed customized results for certain types of queries with certain answer formats) might be what AI Mode ends up resembling more than what it does right now, which is purely a text-based output that sort of aligns with ChatGPT.

I think Google will differentiate those two products once it moves AI Mode over from the tab into the main search results. So, things are not settled yet. And if you think they are, they are not.

Rethinking Traffic Expectations From LLMs

The other thing I want you to rethink is the traffic expectations from LLMs.

There’s been a lot of talk about citations and traffic – citations and traffic, citations and traffic. I don’t think citations, and therefore traffic, are the main diamond within the LLM ecosystem. I believe mentions are. And I don’t think that’s anything really new, by the way.

Traditionally, the funnel has been messy, and Google’s been talking about that for a long time. Now, you have an LLM that may be a starting point or a step in that messy funnel, but I don’t believe it’s fundamentally different.

I’ll give you an example. If I’m looking for a pair of shoes, I might go to Google and search, [Are these Nike shoes any good?]. I might look at a website, then go to Amazon and look at the actual product.

Then I might go to YouTube, see a review of the product, maybe watch a different one, go back to Amazon, have a look, check Google Shopping to see if it’s cheaper there, and then head back to Amazon to buy it.

Now, you have an LLM thrown into the mix, and that’s really the main difference. Maybe now, the LLM gives me the answer. Or maybe Google gives me the answer. Then I go to Amazon, look at the product, go to Google Shopping to see if it’s cheaper, watch a YouTube review, maybe switch things up a bit, go back to ChatGPT, see if it recommends something different this time, go through the whole process, and eventually buy on Amazon. That’s just me, personally.

It’s important to realize that the paradigm has been around for a while. But if you’re thinking of LLMs as a source of traffic, I highly recommend you don’t. They are not necessarily built for that.

ChatGPT, specifically, is not built for citations or to offer traffic. It’s built to provide answers and to be interactive. You’ll notice you usually don’t get a citation in ChatGPT until the third, fourth, or fifth prompt, whatever it is.

Other LLMs, like AI Mode or Perplexity, are a little bit more citation or link-based, but still, their main commodity is the output, giving you the answer and the ability to explore further.

So, I’m a big believer that the brand mention is far more important than the actual citation, per se. Also, the citation might just be the source of information. If I’m asking, “Are Nike shoes good?” I might get a review from a third-party website, say, the CNET of shoes, and even if I click there, that’s not where I’m going to buy the actual shoe.

So, the traffic in that case isn’t even the desirable outcome for the brand. You want users to end up where they can buy the shoe, not just read a review of it.

The Importance Of Synergy And Context With Content

The next thing is the importance of synergy and context with your content. In order to be successful with LLMs, it’s not about (and I’ve heard this before from people) that the top citations are just the ones that already do well on Google. Not necessarily.

There might be a correlation, but not causation. LLMs are trying to do something different than search engines. They’re trying to synthesize the web to serve as a proxy for the entire web. So, what happens with your content across the web matters way more: How your content is talked about, where it’s talked about, who’s talking about it, and how often it’s mentioned.

That doesn’t mean what’s on your site doesn’t factor in, but it’s weighted differently than with traditional search engines. You need to give the LLM the brand context to realize that you have a digital presence in this area, that you’re someone worth mentioning or citing.

Again, I’d focus more on mentions. That’s not to say citations aren’t important (they are), but mentions tend to carry more weight in this context.

Conducting An Audit

The way to go about this, in my opinion, is to conduct an audit. You need to see how the LLM is talking about the topic.

LLMs are notoriously positive and tend to loop in tiny bits of negative sentiment within otherwise positive answers. I was looking at a recent dataset. I don’t have the formal numbers, but I can tell you they’re built to lean neutral or net positive.

For example, if I ask, “Are the Dodgers good?” an LLM (in this case, I was looking at AI Mode) will say, “Yes, the Dodgers are good…” and go on about that. If I ask, “Are the Yankees good?” and, let’s say, two or three weeks ago they weren’t doing well, it won’t say, “Yes, the Yankees are good.” It’ll say, “Well, if you look at this and you look at that, overall you might say the Yankees are good.”

Those are two very different answers. They’re both trying to be positive, but you have to read between the lines to understand how the LLM is actually perceiving the brand and what possible user hesitancies or skepticism are bound up in that. Or where are the gaps?

For instance, if I ask, “Is Gatorade a great drink?” and it answers one way, and then I ask, “Is Powerade a good drink?” and it answers slightly differently, you have to notice why that’s happening. Why does it say, “Gatorade is great,” but “Powerade is loved by many”? You have to dig in and understand the difference.

Running an audit helps you see how the LLM is treating your brand and your market. Is it consistently bringing up the same user points of skepticism or hesitation? If I ask, “What’s a good alternative to Folgers coffee?” AI Mode might say, “If you’re looking for a low-cost coffee, Folgers is an option. But if you want something that tastes better at a similar price, consider Brand X.”

That tells you something: There’s a negative sentiment around Folgers and its taste. That’s something you should be picking up on for your content and brand strategy. The only way to know that is to conduct an audit, read between the lines, and understand what the LLM is saying.
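If you collect LLM answers about your brand and its competitors (by hand or via an API), even a crude scan for hedging language can help surface the pattern of qualified praise described above. This is a deliberately simple sketch: the hedge-phrase list and the sample answers are invented, and a real audit would read the answers in full rather than lean on string matching.

```python
# Crude hedge-phrase scan over collected LLM answers (answers are invented).
HEDGES = ("might", "some say", "loved by many")

def hedging_score(answer: str) -> int:
    """Count hedge phrases; a higher score suggests the model is qualifying its praise."""
    a = answer.lower()
    return sum(a.count(h) for h in HEDGES)

answers = {
    "Gatorade": "Gatorade is great and widely considered the standard.",
    "Powerade": "Powerade is loved by many, and some say it rivals Gatorade.",
}
for brand, text in answers.items():
    print(brand, hedging_score(text))  # Gatorade 0 / Powerade 2
```

A brand whose answers consistently score higher than its competitors’ is the one whose reputation gaps are worth digging into.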

Shaping What LLMs Say About Your Brand

The way to get LLMs to say what you want about your brand is to start with a conscious point of view: What do you want LLMs to say about your brand? Which really comes down to: what do you want people to say about your brand?

And the only way to do that is to have a very strong, focused, and conscious brand identity. Who are you? What are you trying to do? Why is that meaningful? Who are you doing it for? And who is interested in you because of it?

Your brand identity is what gives your brand focus. It gives your content marketing focus, your SEO strategy focus, your audience targeting focus, and your everything focus.

Once you are clear that this is who you are, and that is not who you are, you’re not going to write content that’s misaligned with your identity and what you’re trying to do. You’re not going to dilute your brand identity by creating content that’s tangential or inconsistent.

If you want third-party sites and people around the web to pick up who you are and what you’re about, to build that presence, you need a very conscious and meaningful understanding of who you are and what you do.

That way, you know where to focus, where not to, what content to create, what not to, and how to reinforce the idea around the web that you are X and relevant for X.

It sounds simple, but developing all of that, making sure it’s aligned, and auditing all the way through to ensure it’s actually happening … that’s easier said than done.

Final Thoughts

LLMs may shift how your customers find information about your brand, but chasing citations and clicks isn’t a solid strategy.

Despite the chaos in AI and search in the age of LLMs, marketers need to stick to the fundamentals: brand identity, trust, and relevance still matter.

Focus on brand identity to build your reputation, ensuring that both users and search engines recognize your brand as an authority in your niche.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: How To Manage Stakeholders When An Algorithm Update Hits via @sejournal, @HelenPollitt1

In this edition of Ask An SEO, we address a familiar challenge for marketers:

How do you keep stakeholders from abandoning SEO when algorithm updates cause traffic drops?

This is an all-too-common issue that SEOs will encounter. They have strong plans in place, the buy-in from their leadership, and are making great strides in their organic performance.

When disaster strikes – or, more specifically, a Google algorithm update – all of that goodwill and great results are lost overnight.

What’s worse is, rather than doubling down and trying to recoup lost visibility through data-led SEO work, leadership starts questioning if there is a faster way.

Check The Cause Of The Decline In Traffic

First of all, I would say the most critical step to take when you see a drastic traffic drop is to check that it is definitely the result of an algorithm update.

It’s very easy to ascribe the blame to an update, when it could be caused by a myriad of things. The timing might be suspicious, but before anything, you need to rule out other causes.

Is It Definitely The Result Of The Algorithm Update?

This means checking if there have been any development rollouts, SEO fixes set live, or changes in the SERPs themselves recently. Make sure that the traffic loss is genuine, and not a missing Google Analytics 4 tag. Check that you aren’t seeing the same seasonal dip that you saw this time last year.
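One quick sanity check for the seasonality question is to compare the current period against the same period last year. A minimal sketch, with made-up session counts, of the year-over-year comparison:

```python
def yoy_change(current: float, last_year: float) -> float:
    """Percent change vs. the same period last year; negative means a dip."""
    return 100.0 * (current - last_year) / last_year

# Hypothetical weekly sessions: this year vs. the same week last year.
this_week, same_week_last_year = 8_200, 9_900
print(round(yoy_change(this_week, same_week_last_year), 1))  # -17.2
```

If last year showed a similar dip at the same point in the calendar, seasonality, not the algorithm update, is the more likely culprit.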

Essentially, you need to run down every other possible cause before concluding that it is definitely the result of the algorithm update.

This is important. If it’s not the algorithm update, the loss could be reversible.

Identify Exactly What Has Been Impacted

You are unlikely to have seen rankings and traffic decimated across your entire site. Instead, there are probably certain pages or topics where you have seen a decline.

Begin your investigation with an in-depth look into which areas of your site have been impacted.

Look at the webpages that were favored in place of yours. Have they got substantially different content? Are they more topically aligned to the searcher’s intent than yours? Or has the entire SERP changed to favor a different type of SERP feature, or content type?

Why Are These Specific Pages Affected?

What is the commonality between the pages on your site that have seen the rankings and traffic drops? Look for similarities in the templates used, or the technical features of the pages. Investigate if they are all suffering from slow-loading or poor-quality content. If you can spot the common thread between the affected pages, it will help you to identify what needs to be done to recover their rankings.

Is The Impact As Disastrous As It First Appears?

Also, ask yourself if the affected pages are actually important to your business. The impulse might be to remedy what’s gone wrong with them to recover their rankings, but is that the best use of your time?

Sometimes, we jump to trying to fix the impact of an algorithm update when, actually, the work would be better spent further improving the pages that are still performing well, because they are the ones that actually make money.

If the pages that have lost rankings and traffic were not high-converting ones in the first place, stop and assess. Are the issues they have symptomatic of a wider problem that might affect your revenue-driving pages? If not, maybe don’t worry too much about their visibility loss.

This is good context to have when speaking to your stakeholders about the algorithm impact. Yes, you may have seen traffic go down, but that doesn’t necessarily mean you will see a revenue loss alongside it.

Educate Stakeholders On The Fluctuations In SEO

SEO success is rarely linear. We’ve all seen the fluctuations on the Google Search Console graphs. Do your stakeholders know that, too?

Take time to educate them on how algorithm updates, seasonality, and changing user behavior can affect SEO traffic. Remind them that traffic is not the end goal of SEO; conversions are. Explain to them how algorithm updates are not the end of the world, and just mean there is room for further improvement.

The Best Time To Talk About Algorithm Updates

Of course, this is a lot easier to do before the algorithm update decimates your traffic.

Before you get to the point where panic is ensuing, make sure you have a good process in place to identify the impact of an algorithm update and explain it to your stakeholders. This means that you will take a methodical approach to diagnosing the issues, and not a reactive one.

Let your stakeholders know a reasonable timeframe for that analysis, and that they can’t expect answers on day one of the update announcement. Remind them that algorithm updates are not stable as they first begin to roll out. They can cause temporary fluctuations that may resolve on their own. You need time and space to consider the cause of, and remedies for, any suspected update-driven traffic loss.

If you have seen this type of impact before, it would be prudent to show your stakeholders where recovery has happened and how. Help them to see that now is the time for further SEO investment, not less.

Reframe The Conversation Back To Long-Term Strategy

There is a very understandable tendency for SEOs to panic in the wake of an algorithm update and try to make quick changes to revert the traffic loss. This isn’t a good idea.

Instead, you need to look at your overarching SEO strategy and locate changes that might have a positive impact over time. For example, if you know that you have a problem with low-quality and duplicate content on your site that you had intended to fix through your SEO strategy, don’t abandon that plan now. Chances are, working to improve the quality of your content on the site will help with regaining that lost traffic.

Resist The Urge To Make Impulsive Changes And Instead Be Methodical About Your Recovery Plans

Don’t throw away your existing plans. You may need to modify them to address specific areas of the site that have been impacted negatively by the update. Carry out intensive investigations into exactly what has happened and to which keywords/topics/pages on your site. Using this information, you can refine your existing strategy.

Any work that is carried out without much thought to the long-term impacts will be unlikely to stand the test of time. You may see a temporary boost, which will placate your stakeholders for a period, but that traffic growth may only be short-lived. For example, buying links to point to the areas of the site most negatively affected by the algorithm update might give you the boost in authority needed to see rankings recover. Over time, though, they are unlikely to carry the same weight, and at worst, may see you further penalized in future algorithm updates or through manual actions.

In Summary

The best time to talk to your stakeholders about the steps to resolve a negative impact from an algorithm update is before it happens. Don’t wait until disaster strikes before communicating your investigation and recovery plans. Instead, let them know ahead of time what to expect and why it isn’t worth a panicked and reactive response.

If you do find your site on the receiving end of a ferocious algorithm update, then take a deep breath. Let your analytical head prevail. Spend time assessing the breadth and depth of the damage, and formulate a plan that pays dividends for the long term, not one that just placates a worried leadership team.

SEO is about the long game. Don’t let your stakeholders lose their nerve just because an algorithm update has happened.

More Resources:


Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: Is It Better To Refresh Content Or Create New Pages? via @sejournal, @rollerblader

This week’s Ask An SEO tackles a classic content conundrum:

“Are content refreshes still an effective tactic, or is it better to create new pages altogether?”

Yes, content refreshes are still an effective tactic in cases such as:

  • Product releases where you only continue to sell the new product (new colors or sizes and other variants, but the same product).
  • Data is released and should be updated for the content to be helpful or accurate.
  • New customer or reader questions that are something readers are considering and thinking about.
  • New brands enter the space and others close down, making shopping lists non-helpful if there’s nowhere to shop.
  • New ways to present the content, such as adding bullet lists or tables, or a new video.

With that said, not every page needs to be refreshed. If there is a similar topic that will help the reader but isn’t directly related to an existing header or sub-header, refreshing the page to include the new content could take your page off-topic. This can make it somewhat irrelevant or less helpful for users, which makes it bad for SEO, too. In this case, you’ll want to create a new page.

Once you have the new page created, look for where it can tie into the page you initially wanted to refresh and add an internal link to the new page. This gives the visitor on the page the opportunity to learn more or find the alternative, and then click back to finish reading or shopping. It also helps search engines and crawlers find their way to the new content.

New pages could be a good solution for:

  • Articles and guides where you want to define a topic, strategy, or theory in more detail.
  • Ecommerce experience to bring users to a sub-collection or sub-category, or a product alternative for things that are better for specific needs like size, fit, make, or model, etc.
  • Lead gen pages where you have a few service options and want the person to find the more relevant funnel for their specific needs.

For example, a recipe site that offers a regular, gluten-free, and vegetarian option doesn’t need to stuff all three recipe versions into the main recipe page. They can use an internal link at the top of the main recipe that says, “Click here for the gluten free version,” which helps the user and lets the search engines know they have this solution, too. Clothing brands can talk about tighter or looser fits and recommend a complementary brand if a customer complains about the same thing for a specific product or brand; this can go on product or category and collection pages.

If a client asks if they should refresh or create a new page, we:

  • Recommend refreshing pages when the content begins to slip, does not recover, and is no longer as helpful as it could be, provided a refresh can keep the page on topic and provide a more accurate solution, or a better way for visitors to absorb it.
  • Add new pages when the solution a visitor needs is relevant to the page that we thought about refreshing, but is unique enough from the main topic to justify having its own page. SEO pages aren’t about the keywords; they are about the solution the page provides and how you can uncomplicate it.

Complicated pages are ones with:

  • Tons of jargon that regular consumers won’t understand without doing another search.
  • Multiple sections where the content is hard to scan through and has solutions that are difficult to find.
  • Large bulky paragraphs and no visual breaks, or short choppy paragraphs that don’t have actual solutions, just general statements.
  • Sentences that should instead be lists, headers, tables, and formatted in easier-to-absorb formats.

But knowing what you could do or try doing doesn’t mean anything if you aren’t measuring the results.

How To Measure The Effectiveness

Depending on which one you choose, you’ll have different ways to measure the effectiveness. Here are a few tests we do with clients in these same situations:

The first option is to set aside a couple of pages or topics and leave them alone as a control group. We then either expand with an equal amount of new content or refresh the same number of pages. The control group should be about as competitive to rank for as the test group, and from there, we watch over a few months to see if the test group begins climbing or gaining traffic while the control group remains the same.

The second test you can run, assuming you have a reasonably reliable rank tracking tool, is to monitor how many new keywords the content group has in the top 100 positions, top 20 positions, and top 10 positions after a couple of months. If the keywords and phrases have the same user intent as the topic (i.e., shopping vs. how to do something vs. informative and educational), then it looks like you made a good decision. On top of this, look for rich results like increases in People Also Ask and AI overview appearances. This is a sign the new content may be high quality and that you made the right decision.
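If your rank tracker can export keyword positions, tallying those position buckets is straightforward. A minimal sketch, with invented keyword data for illustration:

```python
# Hypothetical rank-tracker export: keyword -> current position.
positions = {
    "gluten free pancake recipe": 8,
    "easy pancake recipe": 15,
    "pancake recipe no eggs": 64,
    "vegan pancake recipe": 120,
}

def bucket_counts(positions):
    """Count keywords ranking in the top 10, top 20, and top 100."""
    buckets = {"top_10": 0, "top_20": 0, "top_100": 0}
    for pos in positions.values():
        if pos <= 10:
            buckets["top_10"] += 1
        if pos <= 20:
            buckets["top_20"] += 1
        if pos <= 100:
            buckets["top_100"] += 1
    return buckets

counts = bucket_counts(positions)
```

Comparing these counts month over month for the refreshed (or new) pages shows whether the change is widening the content's reach.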

Summary

I hope this helps answer your question. Refresh when the content is outdated, could be formatted better, or because it is fluffy and doesn’t provide value. Add new pages when there is a solution for a problem or an answer for a question, and it is unique enough from an existing page to justify the page’s existence. SEO keywords and search volumes do not justify this; an actual unique solution does.


Ask An SEO: What Are The Most Common Hreflang Mistakes & How Do I Audit Them? via @sejournal, @HelenPollitt1

This week’s Ask An SEO question comes from a reader facing a common challenge when setting up international websites:

“I’m expanding into international markets but I’m confused about hreflang implementation. My rankings are inconsistent across different countries, and I think users are seeing the wrong language versions. What are the most common hreflang mistakes, and how do I audit my international setup?”

This is a great question and an important one for anyone working on websites that cover multiple countries or languages.

The hreflang tag is an HTML attribute that is used to indicate to search engines what language and/or geographical targeting your webpages are intended for. It’s useful for websites that have multiple versions of a page for different languages or regions.

For example, you may have a page dedicated to selling a product to a U.S. audience, and a different one about the same product targeted at a UK audience. Although both these pages would be in English, they may have differences in the terminology used, pricing, and delivery options.

It would be important for the search engines to show the U.S. page in the SERPs for audiences in the US, and the UK page to audiences in the UK. The hreflang tag is used to help the search engines understand the international targeting of those pages.

How To Use An Hreflang Tag

The hreflang tag comprises the rel="alternate" attribute, which indicates the page is part of a set of alternates; the href attribute, which tells the search engines the URL of the alternate page; and the hreflang attribute, which details the country and/or language that page is targeted to.

It’s important to remember that hreflang tags should be:

  • Self-referencing: Each page that has an hreflang tag should also include a reference to itself as part of the hreflang implementation.
  • Bi-directional: Each page that has an hreflang tag on it should also be included in the hreflang tags of the pages it references, so Page A references itself and Page B, with Page B referencing itself and Page A.
  • Set up in either the XML sitemaps of the sites, or the HTML head or HTTP headers of the pages: Make sure that you are not only formatting your hreflang tags correctly, but placing them in the code where the search engines will look for them. This means putting them in your XML sitemaps, or in your HTML head (or in the HTTP header of documents like PDFs).

An example of hreflang implementation for the U.S. product page mentioned above would look like:



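A reconstruction for illustration (the URLs are hypothetical placeholders for your own pages):

```html
<!-- In the <head> of the U.S. page (hypothetical URL) -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/product/" />
<link rel="canonical" href="https://www.example.com/us/product/" />
```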
A hreflang example for the UK page:



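Again as a sketch with placeholder URLs:

```html
<!-- In the <head> of the UK page (hypothetical URL) -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/product/" />
<link rel="canonical" href="https://www.example.com/uk/product/" />
```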
Each page includes a self-referencing canonical tag, which hints to search engines that this is the right URL to index for its specific region.
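If you choose the XML sitemap route instead, the same relationships could be expressed like this (a sketch, again with placeholder URLs):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/us/product/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/product/" />
  </url>
  <url>
    <loc>https://www.example.com/uk/product/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/product/" />
  </url>
</urlset>
```

Note that every URL in the set repeats the full list of alternates, satisfying the self-referencing and bi-directional rules above.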

Common Mistakes

Although in theory, hreflang tags should be simple to set up, they are also easy to get wrong. It’s also important to remember that hreflang tags are considered hints, not directives. They are one signal, among several, that helps the search engines determine the relevance of the page to a particular geographic audience.

Don’t forget that for hreflang tags to work well, your site also needs to adhere to the basics of internationalization.

Missing Or Incorrect Return Tags

A common issue that can be seen with hreflang tags is that they are not formatted to reference the other pages that are, in turn, referencing them. That means, Page A needs to reference itself and Pages B and C, but Pages B and C need to reference themselves and each other as well as Page A.

For example, the implementation shown above would be broken if we missed the required return tag on the UK page that points back to the U.S. version.

Invalid Language And Country Codes

Another problem that you may see when auditing your hreflang tag setup is that the country code (in ISO 3166-1 Alpha-2 format) or language code (in ISO 639-1 format) isn’t valid. This usually means a code has been misspelled or invented, like “en-uk” instead of the correct “en-gb,” which indicates the page is targeted towards English speakers in the United Kingdom.

Hreflang Tags Conflict With Other Directives Or Commands

This issue arises when the hreflang tags contradict the canonical tags, noindex tags, or link to non-200 URLs. So, for example, on an English page for a U.S. audience, the hreflang tag might reference itself and the English UK page, but the canonical tag doesn’t point to itself; instead, it points to the English UK page. Alternatively, it might be that the English UK page doesn’t actually resolve to a 200 status URL, and instead is a 404 page. This can cause confusion for the search engines as the tags indicate conflicting information.

Similarly, if the hreflang tag includes URLs that contain a noindex tag, you will confuse the search engines further. The noindex tag is a hard-and-fast rule that the search engines will respect, whereas the hreflang tag is only a suggestion, so they will respect the noindex tag and disregard the hreflang reference to that page.

Not Including All Language Variants

A further issue may be that several pages are alternatives of one page, but the hreflang tag does not include all of them. Leaving pages out of the tag means they are not signaled as part of the hreflang set.

Incorrect Use Of “x-default”

The “x-default” is a special hreflang value that tells the search engines that this page is the default version to show when no specific language or region match is appropriate. This x-default page should be a page that is relevant to any user who is not better served by one of the other alternate pages. It is not a required part of the hreflang tag, but if it is used, it should be used correctly. That means making a page that serves as a “catch-all” page the x-default, not a highly localized page. The other rules of hreflang tags also apply here – the x-default URL should be the canonical of itself and should serve a 200 server response.
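As a sketch, an x-default entry could sit alongside the localized alternates like this (placeholder URLs; the catch-all page here is a hypothetical generic version):

```html
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/product/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/product/" />
```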

Conflicting Formats

Although it is perfectly fine to put hreflang tags in either the XML sitemap or in the head of a page, it can cause problems if they are in both locations and conflict with each other. It is a lot simpler to debug hreflang tag issues if they are only present in either the XML sitemap or in the head. It will also confuse the search engines if they are not consistent with each other.

The Issues May Not Just Be With The Hreflang Tags

The key to ensuring the search engines truly understand the intent behind your hreflang tags is that you need to make sure the structure of your website is reflective of them. This means keeping the internationalization signals consistent throughout your site.

Site Structure Doesn’t Make Sense

When internationalizing your website, whether you decide to use sub-folders, sub-domains, or separate websites for each geography or language, make sure you keep it consistent. It can help your users understand your site, but also makes it simpler for the search engines to decode.

Language Is Translated On-the-Fly Client-Side

A not-so-common, but very problematic issue with internationalization can be when pages are automatically translated. For example, when JavaScript swaps out the original text on page load with a translated version, there is a risk that the search engines may not be able to read the translated language and may only see the original language.

It all depends on the mechanism used to render the website. When client-side rendering uses a framework like React.js, it’s best practice to have translated content (alongside hreflang and canonical tags) available in the DOM of the page on first load of the site to make sure the search engines can definitely read it.

Read: Rehydration For Client-Side Or Server-Side Rendering

Webpages Are In Mixed Languages Or Poorly Translated

Sometimes there may be an issue with the translations on the site, which can mean only part of the page is translated. This is common in set-ups where the website is translated automatically. Depending on the method used to translate pages, you may find that the main content is translated, but the supplementary information, like menu labels and footers, is not translated. This can be a poor user experience and also means the search engines may consider the page to be less relevant to the target audience than pages that have been translated fully.

Similarly, if the quality of the translations is poor, then your audience may favor well-translated alternatives above your page.

Auditing International Setup

There are several ways to audit the international setup of your website, and hreflang tags in particular.

Check Google Analytics

Start by checking Google Analytics to see if users from other countries are landing on the wrong localized pages. For example, if you have a UK English page and a U.S. English page but find users from both locations are only visiting the U.S. page, you may have an issue. Use Google Search Console to see if users from the UK are being shown the UK page, or if they are only being shown the U.S. page. This will help you identify if you may have an issue with your internationalization.

Validate Tags On Key Pages Across The Whole Set

Take a sample of your key pages and check a few of the alternate pages in each set. Make sure the hreflang tags are set up correctly, that they are self-referencing, and also reference each of the alternate pages. Ensure that any URLs referenced in the hreflang tags are live URLs and are the canonicals of any set.
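The self-referencing and bi-directionality checks lend themselves to a quick script. A minimal sketch, assuming you have already crawled each page's hreflang targets into a dictionary (the URLs and helper name are invented for illustration):

```python
# Hypothetical crawl output: page URL -> URLs listed in its hreflang tags
# (including the self-reference).
hreflang_map = {
    "https://example.com/us/": {"https://example.com/us/", "https://example.com/uk/"},
    "https://example.com/uk/": {"https://example.com/uk/"},  # missing return tag
}

def missing_return_tags(hreflang_map):
    """Flag pages that are not self-referencing or lack a return tag."""
    problems = []
    for page, targets in hreflang_map.items():
        if page not in targets:
            problems.append((page, "not self-referencing"))
        for target in targets - {page}:
            # Bi-directionality: every referenced page must reference us back.
            if page not in hreflang_map.get(target, set()):
                problems.append((target, f"missing return tag to {page}"))
    return problems

problems = missing_return_tags(hreflang_map)
```

Here the UK page would be flagged because it never references the U.S. page back, which is exactly the "missing return tag" mistake described earlier.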

Review XML Sitemap

Check your XML sitemaps to see if they contain hreflang tag references. If they do, identify if you also have references within the <head> of the page. Spot-check to see if these references agree with each other or have any differences. If the hreflang tags in the XML sitemap differ from the same page’s hreflang tags in the <head>, then you will have problems.

Use Hreflang Testing Tools

There are ways to automate the testing of your hreflang tags. You can use crawling tools, which will likely highlight any issues with the setup of the hreflang tags. Once you have identified there are pages with hreflang tag issues, you can run them through dedicated hreflang checkers like Dentsu’s hreflang Tags Testing Tool or Dan Taylor and SALT Agency’s hreflangtagchecker.

Getting It Right

It is really important to get hreflang tags right on your site to avoid the search engines being confused over which version of a page to show to users in the SERPs. Users respond well to localized content, and getting the international setup of your website right is key.


Ask An SEO: High Volumes Or High Authority Evergreen Content? via @sejournal, @rollerblader

This week’s Ask an SEO question comes from an anonymous user:

“Should we still publish high volumes of content, or is it better to invest in fewer, higher-authority evergreen pieces?”

Great question! The answer is always higher-authority content, but not always evergreen if your goal is growth and sustainability. If the goal is quick traffic and a churn-and-burn model, high volume makes sense. More content does not mean more SEO. Sustainable SEO traffic via content is providing a proper user experience, which includes making sure the other topics on the site are helpful to a user.

Why High Volumes Of Content Don’t Work Long Term

The idea of creating high volumes of content to get traffic is a strategy where you focus a page on specific keywords and phrases and optimize the page for these phrases. When Google launched BERT and MUM, this strategy (which was already outdated) got its final nail in the coffin. These updates to Google’s systems looked at the associations between the words, hierarchy of the page, and the website to figure out the experience of the page vs. the specific words on the page.

By looking at what the words mean in relation to the headers, the sentences above and below, and the code of the page, like schema, SEO moved away from keywords to what the user will learn from the experience on the page. At the same time, proactive SEOs focused more heavily on vectors and entities; neither of these are new topics.

Back in the mid-2000s, article spinners helped to generate hundreds of keyword-focused pages quickly and easily. With them, you create a spintax (similar to prompts for large language models or LLMs like ChatGPT and Perplexity) with macros for words to be replaced, and the software would create “original” pieces of content. These could then be launched en masse, similar to “programmatic SEO,” which is not new and never a smart idea.
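To make the mechanics concrete, here is a minimal sketch of how a spintax template expands into multiple "original" variants. The template text is invented, and real spinners also supported nesting and word-replacement macros:

```python
import itertools
import re

def expand_spintax(template):
    """Expand a flat spintax template like "{Quick|Easy} guide"
    into every possible variant."""
    # Split on {...} groups, keeping the groups themselves.
    segments = re.split(r"(\{[^{}]*\})", template)
    choices = [
        seg[1:-1].split("|") if seg.startswith("{") else [seg]
        for seg in segments
    ]
    # Cartesian product of all choices = every "unique" article.
    return ["".join(combo) for combo in itertools.product(*choices)]

variants = expand_spintax("{Quick|Easy} guide to {keywords|rankings}")
```

Two groups of two options yield four "articles" from one template, which is why these pages could be launched en masse, and why Panda could detect the pattern.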

Google and other search engines would surface these and rank the sites until they got caught. Panda did a great job finding article spinner pages and starting to devalue and penalize sites using this technique of mass content creation.

Shortly after, website owners began using PHP with merchant data feeds to create shopping pages for specific products and product groups. This is similar to how media companies produce shopping listicles and product comparisons en masse. The content is unique and original (for that site), but is also being produced en masse, which usually means little to no value. This includes human-written content that is then used for comparisons, even when a user selects to compare the two. In this situation, you’ll want to use canonical links and meta robots properly, but that’s for a different post.

Panda and the core algorithms already had a way to detect “thin pages” from content spinning, so although these product pages worked, especially when combined with spun content or machine-created content describing the products, these sites began getting penalized and devalued.

We’re now seeing AI content being created that is technically unique and “original” via ChatGPT, Perplexity, etc., and it is working for fast traffic gains. But these same sites are getting caught, and they lose that traffic when they do. It is the same exact pattern as article spinning and PHP + data feed shopping lists and pages.

I could see an argument being made for “fan-out” queries and why having pages focused on specific keywords makes sense. Fan-out queries are AI results that automate “People Also Ask,” “things to know,” and other continuation-rich results in a single output, vs. having separate search features.

If an SEO has experience with actual SEO best practices and knows about UX, they’ll know that the fan-out query is using the context and solutions provided on the pages, not multiple pages focused on similar keywords.

This would be the equivalent of building a unique page for each People Also Ask query or adding them as FAQs on the page. This is not a good UX, and Google knows you’re spamming/overoptimizing. It may work, but when you get caught, you’re in a worse position than when you started.

Each page should have a unique solution, not a unique keyword. When the content is focused on the solution, that solution becomes the keyword phrases, and the same page can show up for multiple different phrases, including different variations in the fan-out result.

If the goal is to get traffic quickly, make money, and then abandon or sell the domain, more content is a good strategy. But you won’t have a reliable or long-term income and will always be chasing the next thing.

Evergreen And Non-Evergreen High-Quality Content

Focusing on quality content that provides value to an end user is better for long-term success than high volumes of content. The person will learn from the article, and the content tends to be trustworthy. This type of content is what gets backlinks naturally from high-authority and topically relevant websites.

More importantly, each page on the website will have a clear intent. With sites that focus on volume vs. quality, a lot of the posts and pages will look similar as they’re focused on similar keywords, and users won’t know which article provides the actual solution. This is a bad UX. Or the topics jump around, where one page is about the best perfumes and another is about harnesses for dogs. The trust in the quality of the content is diminished because the site can’t be an expert in everything. And it is clear the content is made up by machines, i.e., fake.

Not all of the content needs to be evergreen, either. Company news and consumer trends happen, and people want timely information mixed in with evergreen topics. If it is product releases, an archive and list of all releases can be helpful.

Fashion sites can easily do the trends from that season. The content is outdated when the next season starts, but the coverage of the trends is something people will look back on and source or use as a reference. This includes fashion students sourcing content for classes, designers looking for inspiration from the past, and mass media covering when things trended and need a reference point.

When evergreen content begins to slide, you can always refresh it. Look back and see what has changed or advanced since the last update, and see how you can improve on it.

  • Look for customer service questions that are not answered.
  • Add updated software features or new colors.
  • See if there are examples that could be made better or clearer.
  • If new regulations are passed at the local, state, or federal level, add these in so the content remains accurate.
  • Delete content that is outdated, or label it as no longer relevant with the reasons why.
  • Look for sections that may have seemed relevant to the topic, but actually weren’t, and remove them so the content becomes stronger.

There is no shortage of ways to refresh evergreen content and improve on it. These are the pillar pages that can bring consistent traffic over the long run and keep business strong, while the non-evergreen pages do their part, creating ebbs and flows of traffic. With some projects, we don’t produce new content for a month or two at a time because the pillar pages need to be refreshed, and the clients still do well with traffic.

Creating mass amounts of content is a good strategy for people who want to make money fast and do not plan on keeping the domain for a long time. It is good for churn-and-burn sites, domains you rent (if the owner is ok with it), and testing projects. When your goal is to build a sustainable business, high-authority content that provides value is the way to go.

You don’t need to worry about the amount of content with this strategy; you focus on the user experience. When you do this, most channels can grow, including email/SMS, social media, PR, branding, and SEO.


Ask An SEO: How Do You Prioritize Technical SEO Fixes With Limited Dev Support? via @sejournal, @HelenPollitt1

Today’s question cuts to the heart of resource management for SEO:

“How do you prioritize SEO fixes when technical debt keeps piling up and you can’t get dev resources?”

In this article, we’ll look at different prioritization methods and what you can do when you have more work than support to do it.

What Is Technical Debt?

Let’s first take a look at what we consider “technical debt” in SEO.

In development, this term refers to long-standing issues with the website that have grown due to poor management, or “quick-fixes” that have not stood the test of time.

In SEO, we tend to use it to signify any code-based issue that is fundamentally affecting optimization efforts. Typically, these are issues that cannot be fixed by the SEO function alone, but require the input of front-end or back-end development teams.

So, when the bulk of the work required to fix SEO technical debt falls to other teams, how do you make sure the most important work gets completed?

Prioritization Matrix

In order to prioritize the work, you should look at three core aspects. These are the associated risks of the work not being completed, the potential benefits if it is, and the likelihood of it being implemented.

You may even want to create a matrix that details the overall score of a technical item. Then, use that to prioritize them. Discuss each item with the stakeholders whose teams will need to be involved in its implementation.

Get a better idea of the full scope of the work. From there, you can assign a figure to each category of “risk”, “reward”, and “implementation likelihood.”

Example of a technical SEO prioritization matrix (Screenshot by author, August 2025)

Risk

Start by calculating the risk to the business if this work isn’t carried out.

Consider aspects like financial risk, i.e., “If we don’t carry out this work then our product pages will be no-indexed. Currently X% of revenue from those product pages is generated by organic traffic and therefore by not completing this work we risk $Y of revenue each year.”

It could also be a risk to the website’s performance. For example, by not fixing a Cumulative Layout Shift (CLS) issue across a group of pages, you may risk conversions as well as rankings.

Get a better idea of the level of risk associated with not fixing that technical debt. Then, assign it a score from 1 (low risk) to 5 (high risk).

Reward

In a similar way, consider the positive implications of carrying out this work. Look at how implementing these fixes could affect revenue, conversion rate, customer satisfaction, or even how it could save money.

For example, “We know that we have a lot of duplicate pages that are not generating revenue but are repeatedly crawled by search bots. We know that every time a bot crawls a page, it costs us $X in server hosting costs; therefore, if we remove those pages we can save the company $Y each year.”

Look primarily at the financial benefits of carrying out the work, but consider also some secondary benefits.

For example, will this work help users complete their goals more easily? Will it aid them in discovering new products or perhaps enjoy a better user experience?

Consider whether the work will benefit other channels beyond organic search. Your technical debt fixes may improve the landing page experience for a group of pages that are used for paid advertising campaigns as well as organic traffic. The benefit of that work may be felt by the paid media team as well as the organic search team.

Assess each of your planned tasks and assign them a value between 1 (low reward) and 5 (high reward).

Implementation Likelihood

When what you are asking for is actually an extremely involved, expensive project that the development team doesn’t have the capacity to take on, it won’t get done. This might sound obvious, but when we are trying to prioritize our technical requests, we often think about their impact on our key performance indicators (KPIs), not their strain on the development queue.

Through talking with engineering stakeholders, you may realize that some of your tasks are more complicated than you originally thought. For example, a simple editable content block being added to a page might actually require a whole content management system (CMS) to be built.

Discuss your activities with stakeholders who understand the true requirements of the work, from the teams involved to the hours of work it will take.

From there, you will have a greater understanding of how easy or quick this work will be. Then, you can assign it a score from 1 to 5 of its likelihood of being implemented (1 being highly unlikely and 5 being highly likely).

Prioritization Method

Once you have assigned a score under each of the three categories for all of the technical debt fixes that you want to have carried out, you can prioritize the work based on the sum of all three categories’ scores. The higher the score, the higher a priority that work is.
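As a rough illustration, the scoring method described above can be sketched in a few lines of Python. The task names, scores, and simple additive formula here are hypothetical examples based on the article's description, not output from any SEO tool:

```python
# A minimal sketch of the prioritization matrix described above.
# Each task gets a 1 (low) to 5 (high) score for risk, reward, and
# implementation likelihood; priority is simply the sum of the three.
# Task names and scores are hypothetical examples.

tasks = [
    {"task": "Fix noindexed product pages", "risk": 5, "reward": 5, "likelihood": 4},
    {"task": "Resolve CLS issue on category pages", "risk": 3, "reward": 4, "likelihood": 3},
    {"task": "Remove redirect chains", "risk": 2, "reward": 3, "likelihood": 5},
]

def priority_score(task):
    """Sum the three category scores; a higher total means higher priority."""
    return task["risk"] + task["reward"] + task["likelihood"]

# Sort the backlog so the highest-scoring fixes come first.
for task in sorted(tasks, key=priority_score, reverse=True):
    print(f"{priority_score(task):>2}  {task['task']}")
```

In practice, you would pull these scores from your stakeholder discussions rather than hard-coding them, but the ranking logic stays this simple: the work with the highest combined score goes to the top of the queue.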

Additional Ways To Get Dev Resources

Now, just because you have prioritized your fixes, it does not mean your development team will be keen to implement them. There may still be reasons why they are unable to carry out your requests.

Here are some additional suggestions to help you collaborate more closely with your technical team.

Discuss The Work With The Team Leader/Product Manager

The biggest hurdle you face can usually be overcome through communication. Help your development team understand your request and the benefits of carrying out these technical fixes.

Meet with the tech team lead or product/project manager to discuss the work and how it might fit into their workload.

There may be better ways to brief your technical team on the work, ones that save them “discovery” time and therefore leave more opportunity to work on your other requests.

Invest more time with the development team upfront in creating a brief for them that goes into all of the necessary detail.

Batch Issues In One Ticket

A tip for getting more of your work through the development queue is batching requests into one ticket. If you group together items that need to be worked on across the same group of pages, or template, it will mean developers can make multiple changes at once.

For example, if you want hard-coded page titles changed on your product pages, as well as their header tags and breadcrumbs added, put them all into one ticket. Instead of three separate requests for the development team to schedule in, they now have one larger ticket that can be worked on.

Show The Value Of Your Work To The Development Stakeholders

Show how your work contributes to the stakeholders’ goals. In the case of the development team, think about how your suggested fixes might benefit them. Find out what their KPIs or goals are and position your work to show its benefits to them.

For example, development teams are often tasked with monitoring and improving the performance of webpages. Part of this may be managing the budget for the server. You may be asking for a group of redirect chains to be removed, but the work isn’t getting prioritized by your development team. Demonstrate the value of removing redirect hops in reducing the load on the server, and therefore server costs.

If you can demonstrate how reducing the technical debt benefits both the SEO team and the development team, it is much more likely to get implemented.

Get Buy-In From Other Teams

On that note, look at getting buy-in from other teams for your work. When the activity you have proposed will benefit not just SEO, but also CRO or PPC, it may generate enough support to be prioritized with the development team.

Show the value of your work beyond just its SEO implications. This can add weight to your request for prioritization.

Summary: Managing Technical Debt Is More Than A To-Do List

Managing technical SEO debt is never as simple as keeping a to-do list and working through it in order. Internal resources are often limited, competing priorities will arise, and most likely, you need the help of teams with very different goals. By weighing risk, reward, and implementation likelihood, you can make more informed decisions about which fixes will have the most impact.

Just as important is how you communicate those priorities. When you position SEO requests in terms of broader business value, you increase the chances of securing development time and cross-team support.

More Resources:


Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: Should Small Brands Go All In On TikTok For Audience Growth? via @sejournal, @MordyOberstein

This week’s Ask An SEO question is about whether small brands should prioritize TikTok over Google to grow their audience:

“I keep hearing that TikTok is a better platform for small brands with an easier route to an audience. Do you think that Google is still relevant, or should I go all in on TikTok?”

The short answer to your question is that you do not want to pigeonhole your business into one channel, no matter the size. There’s also no such thing as an “easier” way. They are all hard.

I’m going to get the obvious out of the way so we can get to something beyond the usual answers to this question.

Your brand should be where your audience is.

Great, now that we didn’t spend four paragraphs saying the same thing that’s been said 100 times before, let me tell you something you want to consider beyond “be where your audience is.”

It’s Not About Channel, It’s About Traction

I have a lot of opinions here, so let me just “channel” my inner Big Lebowski and preface this with … this is just my opinion, man.

Stop thinking about channels. That’s way down the funnel (yet marketers make channels the central question all the time).

Start thinking about traction. How do you generate the most traction?

When I say “traction,” what I really mean is how to start resonating with your audience so that the “chatter” and momentum about who you are compound so that new doors of opportunity open up.

The answer to that question is not, “We will focus on TikTok.”

The answer is also not, “We will focus on Google.”

The answer is also not, “We will focus on YouTube.”

I could go on.

Now, there is another side to this: resources and operations. The question is, how do you balance traction with the amount of resources you have?

For smaller brands, I would think about: What can you do to gain traction that bigger brands have a hard time with?

For example, big brands have a very hard time with video content. They have all sorts of production standards, operations, and a litany of people who have a say, who shouldn’t even be within sniffing distance of having a say.

They can’t simply turn on their phone, record a video, and share something of value.

You can.

Does that mean you should focus on TikTok?

Nope.

It means you should think about what you can put out there that would resonate and help your audience, and does that work for the format?

If so, you may want to go with video shorts. I’m not sure why you would limit that to just TikTok.

Also, if your age demographic is not on TikTok, don’t do that. (“Being where your audience is” is a fundamental truth. Although I think the question is more about being in tune with your audience overall than “being where they are.” If you’re attuned to your audience, then you would know where they are and where to go just naturally.)

I’ll throw another example at you.

Big brands have a hard time communicating with honesty, transparency, and a basic level of authenticity. As a result, a lot of their content is “stale,” at best.

In this instance, trying to generate traction and even traffic by writing more authentic content that speaks to your audience, and not at them, seems quite reasonable.

In other words, the question is, “What resonates with your audience and what opportunities can you seize that bigger brands can’t?”

It’s a framework. It’s what resonates + what resources do you have + what vulnerabilities do the bigger brands in your vertical have that you can capitalize on.

There’s no one-size-fits-all answer to that. Forget your audience for a second, where are the vulnerabilities of the bigger brands in your space?

They might be super-focused on TikTok and have figured out all of the production hurdles I mentioned earlier, but they might not be focused on text-based content in a healthy way, if at all.

Is TikTok “easier” in that scenario?

Maybe not.

Don’t Pigeonhole Yourself

Every platform has its idiosyncrasies. One of the problems with going all-in on a platform is that your brand adopts those idiosyncrasies.

If I were all about Google traffic, my brand might sound like (as too many do) “SEO content.” Across the board. It all seeps through.

The problem with “channels,” to me, is that it produces a mindset of “optimizing” for the channel. When that happens – which it inevitably does (just look at all the SEO content on the web) – the only way out is very painful.

While you might start with the right mindset, it’s very easy to lose your brand’s actual voice along the way.

That can pigeonhole your brand’s ability to maneuver as time goes on.

For starters, one day what you had on TikTok may no longer exist (I’m just using TikTok as an example).

Your audience may evolve and grow older with you, and move to other forms of content consumption. The TikTok algorithm may gobble up your reach one day. Who knows.

What I am saying is, it is possible to wake up one day and what you had with a specific channel doesn’t exist anymore.

That’s a real problem.

That very real problem gets compounded if your overarching brand voice is impacted by your channel approach. Which it often is.

Now, you have to reinvent the wheel, so to speak.

Now, you have to adjust your channel approach (and never put all your eggs in one basket), and you have to find your actual voice again.

This whole time, you were focused on speaking to a channel and what the channel demanded (i.e., the algorithm) and not your audience.

All of this is why I recommend a “traction-focused” approach. If you’re focused on traction, then this whole time, you’ve been building yourself up to become less and less reliant on the channel.

If you’re focused on traction, which inherently focuses on resonance, people start to come to you. You become a destination that people seek out, or, at a minimum, are familiar with.

That leaves you less vulnerable to changes within a specific channel.

It also helps you perform better across other channels. When you resonate and people start to recognize you, it makes performing easier (and less costly).

Let’s play it out.

You start creating material for TikTok, but you do it with a traction, not a channel mindset.

The content you produce starts to resonate. People start talking about you, tagging you on social, mentioning you in articles, etc.

All of that would, in theory, help your web content become more visible within organic search and your brand overall more visible in large language models (LLMs), no?

Let’s play it out even more.

One day, TikTok shuts down.

Now, you have to switch channels (old TV reference).

If you focused more on traction:

  1. You should have more direct traffic or branded search traffic than you had when you started your “TikTok-ing.”
  2. You should have more cachet to rank better if you decide to create content for Google Search (just as an example).

The opposite is true as well. If Google shut down one day, and you had to move to TikTok, you would:

  1. Have more direct traffic than when you started to focus on Google.
  2. Have more cachet and awareness to start building a following on TikTok.

It’s all one song.

Changing The Channel

I feel like, and this is a bit of a controversial take (for some reason), the less you “focus” on channels, the better.

The more you see a channel as less of a strategy and more of a way to actualize the traction you’re looking to create, the better off you’ll be.

You’ll also have an easier time answering questions like, “Which channel is better?”

To reiterate:

  • Don’t lose your brand voice to any channel.
  • Build up traction (resonance) so that when a channel changes, you’re not stuck.
  • Build up traction so that you already have cachet when pivoting to the new channel.
  • It’s better to be a destination than anything.
  • All of this depends on your vertical, your resources, your competition, and most importantly, what your audience needs from you.

The moment you think beyond “channels” is the moment you start operating with a bit more clarity about channels. (It’s a kind of “there is no spoon” sort of thing.)

More Resources:


Featured Image: Paulo Bobita/Search Engine Journal