Ask An SEO: How To Manage Stakeholders When An Algorithm Update Hits via @sejournal, @HelenPollitt1

In this edition of Ask An SEO, we address a familiar challenge for marketers:

How do you keep stakeholders from abandoning SEO when algorithm updates cause traffic drops?

This is an all-too-common issue for SEOs. They have strong plans in place, buy-in from their leadership, and are making great strides in their organic performance.

When disaster strikes – or, more specifically, a Google algorithm update hits – all of that goodwill and those great results are lost overnight.

What’s worse, rather than doubling down and trying to recoup lost visibility through data-led SEO work, leadership starts questioning whether there is a faster way.

Check The Cause Of The Decline In Traffic

First of all, I would say the most critical step to take when you see a drastic traffic drop is to check that it is definitely the result of an algorithm update.

It’s very easy to blame an update when the drop could be caused by any number of things. The timing might be suspicious, but before anything, you need to rule out other causes.

Is It Definitely The Result Of The Algorithm Update?

This means checking if there have been any development rollouts, SEO fixes set live, or changes in the SERPs themselves recently. Make sure that the traffic loss is genuine, and not a missing Google Analytics 4 tag. Check that you aren’t seeing the same seasonal dip that you saw this time last year.

Essentially, you need to run down every other possible cause before concluding that it is definitely the result of the algorithm update.

This is important. If it’s not the algorithm update, the loss could be reversible.

Identify Exactly What Has Been Impacted

You are unlikely to have seen rankings and traffic decimated across your entire site. Instead, there are probably certain pages or topics that have seen a decline.

Begin your investigation with an in-depth look into which areas of your site have been impacted.

Look at the webpages that were favored in place of yours. Have they got substantially different content? Are they more topically aligned to the searcher’s intent than yours? Or has the entire SERP changed to favor a different type of SERP feature, or content type?

Why Are These Specific Pages Affected?

What is the commonality between the pages on your site that have seen the rankings and traffic drops? Look for similarities in the templates used, or the technical features of the pages. Investigate if they are all suffering from slow-loading or poor-quality content. If you can spot the common thread between the affected pages, it will help you to identify what needs to be done to recover their rankings.

Is The Impact As Disastrous As It First Appears?

Also, ask yourself if the affected pages are actually important to your business. The impulse might be to remedy what’s gone wrong with them to recover their rankings, but is that the best use of your time? Sometimes, we jump to trying to fix the impact of an algorithm update when the work would be better spent further improving the pages that are still performing well, because they are the ones that actually make money.

If the pages that have lost rankings and traffic were not high-converting ones in the first place, stop and assess. Are their issues symptomatic of a wider problem that might affect your revenue-driving pages? If not, maybe don’t worry too much about their visibility loss.

This is good context to have when speaking to your stakeholders about the algorithm impact. Yes, you may have seen traffic go down, but that doesn’t necessarily mean you will see a revenue loss alongside it.

Educate Stakeholders On The Fluctuations In SEO

SEO success is rarely linear. We’ve all seen the fluctuations on the Google Search Console graphs. Do your stakeholders know that, too?

Take time to educate them on how algorithm updates, seasonality, and changing user behavior can affect SEO traffic. Remind them that traffic is not the end goal of SEO; conversions are. Explain to them how algorithm updates are not the end of the world, and just mean there is room for further improvement.

The Best Time To Talk About Algorithm Updates

Of course, this is a lot easier to do before the algorithm update decimates your traffic.

Before you get to the point where panic is ensuing, make sure you have a good process in place to identify the impact of an algorithm update and explain it to your stakeholders. This means that you will take a methodical approach to diagnosing the issues, and not a reactive one.

Let your stakeholders know a reasonable timeframe for that analysis, and that they can’t expect answers on day one of the update announcement. Remind them that algorithm updates are not stable as they first begin to roll out; they can cause temporary fluctuations that may resolve on their own. You need time and space to consider the cause of, and remedies for, any traffic loss suspected to stem from an algorithm update.

If you have seen this type of impact before, it would be prudent to show your stakeholders where recovery has happened and how. Help them to see that now is the time for further SEO investment, not less.

Reframe The Conversation Back To Long-Term Strategy

There is a very understandable tendency for SEOs to panic in the wake of an algorithm update and try to make quick changes to revert the traffic loss. This isn’t a good idea.

Instead, you need to look at your overarching SEO strategy and locate changes that might have a positive impact over time. For example, if you know that you have a problem with low-quality and duplicate content on your site that you had intended to fix through your SEO strategy, don’t abandon that plan now. Chances are, working to improve the quality of your content on the site will help with regaining that lost traffic.

Resist The Urge To Make Impulsive Changes And Instead Be Methodical About Your Recovery Plans

Don’t throw away your existing plans. You may need to modify them to address specific areas of the site that have been impacted negatively by the update. Carry out intensive investigations into exactly what has happened and to which keywords/topics/pages on your site. Using this information, you can refine your existing strategy.

Any work that is carried out without much thought to the long-term impacts will be unlikely to stand the test of time. You may see a temporary boost, which will placate your stakeholders for a period, but that traffic growth may only be short-lived. For example, buying links to point to the areas of the site most negatively affected by the algorithm update might give you the boost in authority needed to see rankings recover. Over time, though, they are unlikely to carry the same weight, and at worst, may see you further penalized in future algorithm updates or through manual actions.

In Summary

The best time to talk to your stakeholders about the steps to resolve a negative impact from an algorithm update is before it happens. Don’t wait until disaster strikes before communicating your investigation and recovery plans. Instead, let them know ahead of time what to expect and why it isn’t worth a panicked and reactive response.

If you do find your site on the receiving end of a ferocious algorithm update, then take a deep breath. Let your analytical head prevail. Spend time assessing the breadth and depth of the damage, and formulate a plan that yields dividends for the long-term and not just to placate a worried leadership team.

SEO is about the long game. Don’t let your stakeholders lose their nerve just because an algorithm update has happened.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: Is It Better To Refresh Content Or Create New Pages? via @sejournal, @rollerblader

This week’s Ask An SEO asks a classic content conundrum:

“Are content refreshes still an effective tactic, or is it better to create new pages altogether?”

Yes, content refreshes are still an effective tactic in cases such as:

  • Product releases where you continue to sell only the new product (new colors, sizes, and other variants of the same product).
  • New data is released and should be incorporated for the content to remain helpful and accurate.
  • New customer or reader questions arise that readers are considering and thinking about.
  • New brands enter the space and others close down, making shopping lists unhelpful if there’s nowhere to shop.
  • New ways to present the content emerge, such as bullet lists, tables, or a new video.

With that said, not every page needs to be refreshed. If there is a similar topic that will help the reader but isn’t directly related to an existing header or sub-header, refreshing the page to include the new content could take your page off-topic. This can make it somewhat irrelevant or less helpful for users, which makes it bad for SEO, too. In this case, you’ll want to create a new page.

Once you have the new page created, look for where it can tie into the page you initially wanted to refresh and add an internal link to the new page. This gives the visitor on the page the opportunity to learn more or find the alternative, and then click back to finish reading or shopping. It also helps search engines and crawlers find their way to the new content.

New pages could be a good solution for:

  • Articles and guides where you want to define a topic, strategy, or theory in more detail.
  • Ecommerce experiences that bring users to a sub-collection or sub-category, or to a product alternative better suited to specific needs like size, fit, make, or model.
  • Lead gen pages where you have a few service options and want the person to find the more relevant funnel for their specific needs.

For example, a recipe site that offers a regular, gluten-free, and vegetarian option doesn’t need to stuff all three recipe versions into the main recipe page. They can use an internal link at the top of the main recipe that says, “Click here for the gluten-free version,” which helps the user and lets the search engines know they have this solution, too. Clothing brands can talk about tighter or looser fits and recommend a complementary brand if customers complain about the same thing for a specific product or brand; this can go on product, category, and collection pages.
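That kind of internal link needs nothing special. With a hypothetical URL, it is just a standard anchor near the top of the main recipe:

```html
<!-- Hypothetical internal link from the main recipe to its variant page -->
<a href="/recipes/gluten-free-banana-bread">Click here for the gluten-free version</a>
```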

If a client asks if they should refresh or create a new page, we:

  • Recommend refreshing pages when the content begins to slip, does not recover, and is no longer as helpful as it could be, provided the refresh can keep the page on topic and offer a more accurate solution or a better way for visitors to absorb it.
  • Add new pages when the solution a visitor needs is relevant to the page that we thought about refreshing, but is unique enough from the main topic to justify having its own page. SEO pages aren’t about the keywords; they are about the solution the page provides and how you can uncomplicate it.

Complicated pages are ones with:

  • Tons of jargon that regular consumers won’t understand without doing another search.
  • Multiple sections where the content is hard to scan through and has solutions that are difficult to find.
  • Large bulky paragraphs and no visual breaks, or short choppy paragraphs that don’t have actual solutions, just general statements.
  • Sentences that should instead be lists, headers, or tables, formatted in easier-to-absorb ways.

But knowing what you could do or try doing doesn’t mean anything if you aren’t measuring the results.

How To Measure The Effectiveness

Depending on which one you choose, you’ll have different ways to measure the effectiveness. Here are a few tests we do with clients in these same situations:

The first option is to set aside a couple of pages or topics as a control group and leave them alone. We then either expand with an equal amount of new content or refresh the same number of pages. The control group should be about as competitive to rank for as the test group, and from there, we watch over a few months to see if the test group begins climbing or gaining traffic while the control group remains the same.

The second test you can run, assuming you have a reasonably reliable rank tracking tool, is to monitor how many new keywords the content group has in the top 100 positions, top 20 positions, and top 10 positions after a couple of months. If the keywords and phrases have the same user intent as the topic (i.e., shopping vs. how to do something vs. informative and educational), then it looks like you made a good decision. On top of this, look for rich results, like increases in People Also Ask and AI Overview appearances; this is a sign the new content may be high quality.
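As a sketch, that keyword-bucket count is simple to automate once you can export position data from your rank tracker (the keywords and positions below are made up for illustration):

```python
# Count how many tracked keywords sit in the top 100 / 20 / 10 positions,
# so before/after snapshots of a refreshed or new content group can be compared.
positions = {
    "blue widgets": 8,
    "buy blue widgets": 18,
    "widget sizes": 45,
    "widget history": 120,
}

def bucket_counts(positions):
    # Each bucket is cumulative: a position-8 keyword counts in all three.
    return {
        "top_100": sum(1 for p in positions.values() if p <= 100),
        "top_20": sum(1 for p in positions.values() if p <= 20),
        "top_10": sum(1 for p in positions.values() if p <= 10),
    }

print(bucket_counts(positions))
```

Run the same count before the change and a couple of months after, for both the test and control groups, and compare the deltas.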

Summary

I hope this helps answer your question. Refresh when the content is outdated, could be formatted better, or because it is fluffy and doesn’t provide value. Add new pages when there is a solution for a problem or an answer for a question, and it is unique enough from an existing page to justify the page’s existence. SEO keywords and search volumes do not justify this; an actual unique solution does.


Ask An SEO: What Are The Most Common Hreflang Mistakes & How Do I Audit Them? via @sejournal, @HelenPollitt1

This week’s Ask An SEO question comes from a reader facing a common challenge when setting up international websites:

“I’m expanding into international markets but I’m confused about hreflang implementation. My rankings are inconsistent across different countries, and I think users are seeing the wrong language versions. What are the most common hreflang mistakes, and how do I audit my international setup?”

This is a great question and an important one for anyone working on websites that cover multiple countries or languages.

The hreflang tag is an HTML attribute that is used to indicate to search engines what language and/or geographical targeting your webpages are intended for. It’s useful for websites that have multiple versions of a page for different languages or regions.

For example, you may have a page dedicated to selling a product to a U.S. audience, and a different one about the same product targeted at a UK audience. Although both these pages would be in English, they may have differences in the terminology used, pricing, and delivery options.

It would be important for the search engines to show the U.S. page in the SERPs for audiences in the US, and the UK page to audiences in the UK. The hreflang tag is used to help the search engines understand the international targeting of those pages.

How To Use An Hreflang Tag

The hreflang tag comprises the rel="alternate" attribute, which indicates the page is part of a set of alternates; the href attribute, which gives the search engines the URL of the alternate page; and the hreflang attribute, which details the country and/or language that page is targeted to.

It’s important to remember that hreflang tags should be:

  • Self-referencing: Each page that has an hreflang tag should also include a reference to itself as part of the hreflang implementation.
  • Bi-directional: Each page that has an hreflang tag on it should also be included in the hreflang tags of the pages it references, so Page A references itself and Page B, with Page B referencing itself and Page A.
  • Set up in either the XML sitemaps of the sites, the HTML head of the pages, or the HTTP headers: Make sure that you are not only formatting your hreflang tags correctly, but placing them in the code where the search engines will look for them. This means putting them in your XML sitemaps, in your HTML head, or in the HTTP header of documents like PDFs.

An example of hreflang implementation for the U.S. product page mentioned above would look like:
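The original code sample is not reproduced here; a minimal sketch, assuming hypothetical example.com URLs, would be:

```html
<!-- Hypothetical tags for the U.S. product page (URLs are placeholders) -->
<link rel="canonical" href="https://www.example.com/us/product" />
<link rel="alternate" href="https://www.example.com/us/product" hreflang="en-us" />
<link rel="alternate" href="https://www.example.com/uk/product" hreflang="en-gb" />
```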



A hreflang example for the UK page:
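Again as a sketch with placeholder URLs, the UK page mirrors the set and adds its own self-reference:

```html
<!-- Hypothetical tags for the UK product page (URLs are placeholders) -->
<link rel="canonical" href="https://www.example.com/uk/product" />
<link rel="alternate" href="https://www.example.com/uk/product" hreflang="en-gb" />
<link rel="alternate" href="https://www.example.com/us/product" hreflang="en-us" />
```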



Each page includes a self-referencing canonical tag, which hints to search engines that this is the right URL to index for its specific region.

Common Mistakes

Although in theory, hreflang tags should be simple to set up, they are also easy to get wrong. It’s also important to remember that hreflang tags are considered hints, not directives. They are one signal, among several, that helps the search engines determine the relevance of the page to a particular geographic audience.

Don’t forget that for hreflang tags to work well, your site also needs to adhere to the basics of internationalization.

Missing Or Incorrect Return Tags

A common issue with hreflang tags is that they do not reference back the other pages that are, in turn, referencing them. Remember: Page A needs to reference itself and Pages B and C, while Pages B and C need to reference themselves and each other, as well as Page A.

In the example code above, this mistake would occur if we omitted the required return tag on the UK page that points back to the U.S. version.

Invalid Language And Country Codes

Another problem that you may see when auditing your hreflang setup is that the country code (in ISO 3166-1 Alpha 2 format) or language code (in ISO 639-1 format) isn’t valid. This usually means a code has been misspelled or invented, like “en-uk” instead of the correct “en-gb,” to indicate the page is targeted towards English speakers in the United Kingdom.

Hreflang Tags Conflict With Other Directives Or Commands

This issue arises when the hreflang tags contradict the canonical tags, noindex tags, or link to non-200 URLs. So, for example, on an English page for a U.S. audience, the hreflang tag might reference itself and the English UK page, but the canonical tag doesn’t point to itself; instead, it points to the English UK page. Alternatively, it might be that the English UK page doesn’t actually resolve to a 200 status URL, and instead is a 404 page. This can cause confusion for the search engines as the tags indicate conflicting information.

Similarly, if the hreflang tags include URLs that carry a noindex tag, you will confuse the search engines further. They will disregard the hreflang link to that page because noindex is a hard-and-fast rule the search engines will respect, whereas the hreflang tag is only a suggestion.

Not Including All Language Variants

A further issue arises when a page has several alternate versions but does not include all of them within its hreflang tags. Any alternates left out will not be signified as part of the hreflang set.

Incorrect Use Of “x-default”

The “x-default” is a special hreflang value that tells the search engines that this page is the default version to show when no specific language or region match is appropriate. This x-default page should be a page that is relevant to any user who is not better served by one of the other alternate pages. It is not a required part of the hreflang tag, but if it is used, it should be used correctly. That means making a page that serves as a “catch-all” page the x-default, not a highly localized page. The other rules of hreflang tags also apply here – the x-default URL should be the canonical of itself and should serve a 200 server response.
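Continuing the hypothetical example above, the x-default is just one more alternate entry, pointing at the catch-all page:

```html
<!-- Hypothetical x-default alongside the localized alternates (URL is a placeholder) -->
<link rel="alternate" href="https://www.example.com/" hreflang="x-default" />
```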

Conflicting Formats

Although it is perfectly fine to put hreflang tags in either the XML sitemap or in the head of a page, it can cause problems if they are in both locations and conflict with each other. It is a lot simpler to debug hreflang tag issues if they are only present in either the XML sitemap or in the head. It will also confuse the search engines if they are not consistent with each other.

The Issues May Not Just Be With The Hreflang Tags

The key to ensuring the search engines truly understand the intent behind your hreflang tags is that you need to make sure the structure of your website is reflective of them. This means keeping the internationalization signals consistent throughout your site.

Site Structure Doesn’t Make Sense

When internationalizing your website, whether you use sub-folders, sub-domains, or separate websites for each geography or language, make sure you keep it consistent. Consistency helps your users understand your site and makes it simpler for the search engines to decode.

Language Is Translated On-the-Fly Client-Side

A not-so-common, but very problematic issue with internationalization can be when pages are automatically translated. For example, when JavaScript swaps out the original text on page load with a translated version, there is a risk that the search engines may not be able to read the translated language and may only see the original language.

It all depends on the mechanism used to render the website. When client-side rendering uses a framework like React.js, it’s best practice to have translated content (alongside hreflang and canonical tags) available in the DOM of the page on first load of the site to make sure the search engines can definitely read it.

Read: Rehydration For Client-Side Or Server-Side Rendering

Webpages Are In Mixed Languages Or Poorly Translated

Sometimes there may be an issue with the translations on the site, which can mean only part of the page is translated. This is common in set-ups where the website is translated automatically. Depending on the method used to translate pages, you may find that the main content is translated, but the supplementary information, like menu labels and footers, is not translated. This can be a poor user experience and also means the search engines may consider the page to be less relevant to the target audience than pages that have been translated fully.

Similarly, if the quality of the translations is poor, then your audience may favor well-translated alternatives above your page.

Auditing International Setup

There are several ways to audit the international setup of your website, and hreflang tags in particular.

Check Google Analytics

Start by checking Google Analytics to see if users from other countries are landing on the wrong localized pages. For example, if you have a UK English page and a U.S. English page but find users from both locations are only visiting the U.S. page, you may have an issue. Use Google Search Console to see if users from the UK are being shown the UK page, or if they are only being shown the U.S. page. This will help you identify if you may have an issue with your internationalization.

Validate Tags On Key Pages Across The Whole Set

Take a sample of your key pages and check a few of the alternate pages in each set. Make sure the hreflang tags are set up correctly, that they are self-referencing, and that they reference each of the alternate pages. Ensure that any URLs referenced in the hreflang tags are live and are the canonical versions within their sets.

Review XML Sitemap

Check your XML sitemaps to see if they contain hreflang references. If they do, identify whether you also have references within the <head> of the page. Spot-check to see if these references agree with each other or have any differences. If the XML sitemap’s hreflang tags differ from the same page’s hreflang tags in the <head>, you will have problems.
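In the sitemap, hreflang annotations use xhtml:link elements. A sketch for the hypothetical U.S. page (the xhtml namespace must be declared on the urlset element, and the URLs are placeholders):

```xml
<!-- Requires xmlns:xhtml="http://www.w3.org/1999/xhtml" on the <urlset> -->
<url>
  <loc>https://www.example.com/us/product</loc>
  <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product" />
  <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/product" />
</url>
```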

Use Hreflang Testing Tools

There are ways to automate the testing of your hreflang tags. You can use crawling tools, which will likely highlight any issues with the setup of the hreflang tags. Once you have identified there are pages with hreflang tag issues, you can run them through dedicated hreflang checkers like Dentsu’s hreflang Tags Testing Tool or Dan Taylor and SALT Agency’s hreflangtagchecker.

Getting It Right

It is really important to get hreflang tags right on your site to avoid the search engines being confused over which version of a page to show to users in the SERPs. Users respond well to localized content, and getting the international setup of your website right is key.


Ask An SEO: High Volumes Or High Authority Evergreen Content? via @sejournal, @rollerblader

This week’s Ask an SEO question comes from an anonymous user:

“Should we still publish high volumes of content, or is it better to invest in fewer, higher-authority evergreen pieces?”

Great question! The answer is almost always higher-authority content, though not always evergreen, if your goal is growth and sustainability. If the goal is quick traffic and a churn-and-burn model, high volume makes sense. More content does not mean more SEO traffic. Sustainable SEO traffic via content comes from providing a proper user experience, which includes making sure the other topics on the site are helpful to a user.

Why High Volumes Of Content Don’t Work Long Term

The idea of creating high volumes of content to get traffic is a strategy where you focus a page on specific keywords and phrases and optimize the page for those phrases. When Google launched BERT and MUM, this strategy (which was already outdated) got the final nail in its coffin. These updates to Google’s systems looked at the associations between the words, the hierarchy of the page, and the website to figure out the experience of the page vs. the specific words on the page.

By looking at what the words mean in relation to the headers, the sentences above and below, and the code of the page, like schema, SEO moved away from keywords to what the user will learn from the experience on the page. At the same time, proactive SEOs focused more heavily on vectors and entities; neither of these are new topics.

Back in the mid-2000s, article spinners helped to generate hundreds of keyword-focused pages quickly and easily. With them, you create a spintax (similar to prompts for large language models or LLMs like ChatGPT and Perplexity) with macros for words to be replaced, and the software would create “original” pieces of content. These could then be launched en masse, similar to “programmatic SEO,” which is not new and never a smart idea.

Google and other search engines would surface these and rank the sites until they got caught. Panda did a great job finding article spinner pages and starting to devalue and penalize sites using this technique of mass content creation.

Shortly after, website owners began using PHP with merchant data feeds to create shopping pages for specific products and product groups. This is similar to how media companies produce shopping listicles and product comparisons en masse. The content is unique and original (for that site), but is also being produced en masse, which usually means little to no value. This includes human-written content that is then used for comparisons, even when a user selects to compare the two. In this situation, you’ll want to use canonical links and meta robots properly, but that’s for a different post.

Panda and the core algorithms already had a way to detect “thin pages” from content spinning, so although these product pages worked, especially when combined with spun content or machine-created content describing the products, these sites began getting penalized and devalued.

We’re now seeing AI content being created that is technically unique and “original” via ChatGPT, Perplexity, etc., and it is working for fast traffic gains. But these same sites are getting caught, and they lose that traffic when they do. It is the same exact pattern as article spinning and PHP + data feed shopping lists and pages.

I could see an argument being made for “fan-out” queries and why having pages focused on specific keywords makes sense. Fan-out queries are AI results that automate “People Also Ask,” “things to know,” and other continuation-rich results in a single output, vs. having separate search features.

If an SEO has experience with actual SEO best practices and knows about UX, they’ll know that the fan-out query is using the context and solutions provided on the pages, not multiple pages focused on similar keywords.

This would be the equivalent of building a unique page for each People Also Ask query or adding them as FAQs on the page. This is not a good UX, and Google knows you’re spamming/overoptimizing. It may work, but when you get caught, you’re in a worse position than when you started.

Each page should have a unique solution, not a unique keyword. When the content is focused on the solution, that solution becomes the keyword phrases, and the same page can show up for multiple different phrases, including different variations in the fan-out result.

If the goal is to get traffic, make money quickly, and then abandon or sell the domain, more content is a good strategy. But you won’t have a reliable or long-term income and will always be chasing the next thing.

Evergreen And Non-Evergreen High-Quality Content

Focusing on quality content that provides value to an end user is better for long-term success than high volumes of content. The person will learn from the article, and the content tends to be trustworthy. This type of content is what gets backlinks naturally from high-authority and topically relevant websites.

More importantly, each page on the website will have a clear intent. With sites that focus on volume vs. quality, a lot of the posts and pages will look similar as they’re focused on similar keywords, and users won’t know which article provides the actual solution. This is a bad UX. Or the topics jump around, where one page is about the best perfumes and another is about harnesses for dogs. The trust in the quality of the content is diminished because the site can’t be an expert in everything. And it is clear the content is made up by machines, i.e., fake.

Not all of the content needs to be evergreen, either. Company news and consumer trends happen, and people want timely information mixed in with evergreen topics. For product releases, an archive listing all releases can be helpful.

Fashion sites can easily do the trends from that season. The content is outdated when the next season starts, but the coverage of the trends is something people will look back on and source or use as a reference. This includes fashion students sourcing content for classes, designers looking for inspiration from the past, and mass media covering when things trended and need a reference point.

When evergreen content begins to slide, you can always refresh it. Look back and see what has changed or advanced since the last update, and see how you can improve on it.

  • Look for customer service questions that are not answered.
  • Add updated software features or new colors.
  • See if there are examples that could be made better or clearer.
  • If new regulations are passed at the local, state, or federal level, add these in so the content is accurate.
  • Delete content that is outdated, or label it as no longer relevant with the reasons why.
  • Look for sections that may have seemed relevant to the topic, but actually weren’t, and remove them so the content becomes stronger.

There is no shortage of ways to refresh evergreen content and improve on it. These are the pillar pages that can bring consistent traffic over the long run and keep business strong, while the non-evergreen pages do their part, creating ebbs and flows of traffic. With some projects, we don’t produce new content for a month or two at a time because the pillar pages need to be refreshed, and the clients still do well with traffic.

Creating mass amounts of content is a good strategy for people who want to make money fast and do not plan on keeping the domain for a long time. It is good for churn-and-burn sites, domains you rent (if the owner is ok with it), and testing projects. When your goal is to build a sustainable business, high-authority content that provides value is the way to go.

You don’t need to worry about the amount of content with this strategy; you focus on the user experience. When you do this, most channels can grow, including email/SMS, social media, PR, branding, and SEO.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: How Do You Prioritize Technical SEO Fixes With Limited Dev Support? via @sejournal, @HelenPollitt1

Today’s question cuts to the heart of resource management for SEO:

“How do you prioritize SEO fixes when technical debt keeps piling up and you can’t get dev resources?”

In this article, we’ll look at different prioritization methods and what you can do when you have more work than support to do it.

What Is Technical Debt?

Let’s first take a look at what we consider “technical debt” in SEO.

In development, this term refers to long-standing issues with the website that have grown due to poor management, or “quick-fixes” that have not stood the test of time.

In SEO, we tend to use it to signify any code-based issue that is fundamentally affecting optimization efforts. Typically, these are issues that cannot be fixed by the SEO function alone, but require the input of front or back-end development teams.

So, when the bulk of the work required to fix SEO technical debt falls to other teams, how do you make sure the most important work gets completed?

Prioritization Matrix

In order to prioritize the work, you should look at three core aspects. These are the associated risks of the work not being completed, the potential benefits if it is, and the likelihood of it being implemented.

You may even want to create a matrix that details the overall score of a technical item. Then, use that to prioritize them. Discuss each item with the stakeholders whose teams will need to be involved in its implementation.

Get a better idea of the full scope of the work. From there, you can assign a figure to each category of “risk”, “reward”, and “implementation likelihood.”

Example of a technical SEO prioritization matrix (Screenshot by author, August 2025)

Risk

Start by calculating the risk to the business if this work isn’t carried out.

Consider aspects like financial risk, i.e., “If we don’t carry out this work then our product pages will be no-indexed. Currently X% of revenue from those product pages is generated by organic traffic and therefore by not completing this work we risk $Y of revenue each year.”

It could also be a risk to the website’s performance. For example, by not fixing a Cumulative Layout Shift (CLS) issue across a group of pages, you may risk conversions as well as rankings.

Get a better idea of the level of risk associated with not fixing that technical debt. Then, assign it a score from 1 (low risk) to 5 (high risk).
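To make the financial framing concrete, here is a minimal sketch of that revenue-at-risk calculation. All figures are hypothetical, purely for illustration.

```python
# Hypothetical revenue-at-risk estimate for a fix that prevents
# product pages from being no-indexed. All figures are illustrative.
annual_product_revenue = 2_000_000   # total revenue from product pages ($)
organic_share = 0.35                 # share of that revenue driven by organic traffic

revenue_at_risk = annual_product_revenue * organic_share
print(f"Estimated organic revenue at risk: ${revenue_at_risk:,.0f}/year")
```

A number like this, even as a rough estimate, is far more persuasive to leadership than "this is an important fix."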

Reward

In a similar way, consider the positive implications of carrying out this work. Look at how implementing these fixes could affect revenue, conversion rate, customer satisfaction, or even how it could save money.

For example, “We know that we have a lot of duplicate pages that are not generating revenue but are repeatedly crawled by search bots. We know that every time a bot crawls a page, it costs us $X in server hosting costs; therefore, if we remove those pages we can save the company $Y each year.”

Look primarily at the financial benefits of carrying out the work, but consider also some secondary benefits.

For example, will this work help users complete their goals more easily? Will it aid them in discovering new products or perhaps enjoy a better user experience?

Consider whether the work will benefit other channels beyond organic search. Your technical debt fixes may improve the landing page experience for a group of pages that are used for paid advertising campaigns as well as organic traffic. The benefit of that work may be felt by the paid media team as well as the organic search team.

Assess each of your planned tasks and assign them a value between 1 (low reward) and 5 (high reward).

Implementation Likelihood

When what you are asking for is actually an extremely involved, expensive project that the development team doesn’t have the capacity to take on, it won’t get done. This might sound obvious, but often when we are trying to prioritize our technical requests, we think about their impact on our key performance indicators (KPIs), not their strain on the development queue.

Through talking with engineering stakeholders, you may realize that some of your tasks are more complicated than you originally thought. For example, a simple editable content block being added to a page might actually require a whole content management system (CMS) to be built.

Discuss your activities with stakeholders who understand the true requirements of the work, from the teams involved to the hours of work it will take.

From there, you will have a greater understanding of how easy or quick this work will be. Then, you can assign it a score from 1 to 5 of its likelihood of being implemented (1 being highly unlikely and 5 being highly likely).

Prioritization Method

Once you have assigned a score under each of the three categories for all of the technical debt fixes that you want to have carried out, you can prioritize the work based on the sum of all three categories’ scores. The higher the score, the higher a priority that work is.
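The scoring approach above can be sketched in a few lines. The task names and scores here are hypothetical; only the method (sum the three 1-5 scores, then sort descending) comes from the matrix described above.

```python
# Minimal sketch of the risk/reward/likelihood prioritization matrix.
# Tasks and scores are hypothetical examples.
tasks = {
    "Fix no-indexed product pages": {"risk": 5, "reward": 5, "likelihood": 3},
    "Resolve CLS issue on category pages": {"risk": 3, "reward": 4, "likelihood": 4},
    "Remove redirect chains": {"risk": 2, "reward": 3, "likelihood": 5},
}

# Priority is the sum of the three scores; a higher sum means higher priority.
prioritized = sorted(
    tasks.items(),
    key=lambda item: sum(item[1].values()),
    reverse=True,
)

for name, scores in prioritized:
    print(f"{sum(scores.values()):>2}  {name}")
```

A spreadsheet works just as well; the point is that the ranking falls out of the three scores rather than gut feeling.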

Additional Ways To Get Dev Resources

Now, just because you have prioritized your fixes, it does not mean your development team will be keen to implement them. There may still be reasons why they are unable to carry out your requests.

Here are some additional suggestions to help you collaborate more closely with your technical team.

Discuss The Work With The Team Leader/Product Manager

The biggest hurdle you may need to overcome can usually be resolved through communication. Help your development team understand your request and the benefits of carrying out these technical fixes.

Meet with the tech team lead or product/project manager to discuss the work and how it might fit into their workload.

There may be better ways for you to brief your technical team on the work that save them “discovery” time and therefore give them more opportunity to work on your other requests.

Invest more time with the development team upfront in creating a brief for them that goes into all of the necessary detail.

Batch Issues In One Ticket

A tip for getting more of your work through the development queue is batching requests into one ticket. If you group together items that need to be worked on across the same group of pages, or template, it will mean developers can make multiple changes at once.

For example, if you want hard-coded page titles changed on your product pages, as well as their header tags and breadcrumbs added, put them all into one ticket. Instead of three separate requests for the development team to schedule in, they now have one larger ticket that can be worked on.

Show The Value Of Your Work To The Development Stakeholders

Show how your work supports the stakeholders’ goals. In the instance of the development team, think about how your suggested fixes might benefit them. Find out what their KPIs or goals are and position your work to show the benefits to them.

For example, development teams are often tasked with monitoring and improving the performance of webpages. Part of this may be managing the budget for the server. You may be asking for a group of redirect chains to be removed, but the work isn’t getting prioritized by your development team. Demonstrate the value of removing redirect hops in reducing the load on the server, and therefore server costs.

If you can demonstrate how reducing the technical debt benefits both the SEO team and the development team, it is much more likely to get implemented.

Get Buy-In From Other Teams

On that note, look at getting buy-in from other teams for your work. When the activity you have proposed will not just benefit SEO, but also CRO, or PPC, then it may generate enough support to have it prioritized with the development team.

Show the value of your work beyond just its SEO implications. This can add weight to your request for prioritization.

Summary: Managing Technical Debt Is More Than A To-Do List

Managing technical SEO debt is never as simple as keeping a to-do list and working through it in order. Internal resources are often limited, competing priorities will arise, and most likely, you need the help of teams with very different goals. By weighing risk, reward, and implementation likelihood, you can make more informed decisions about which fixes will have the most impact.

Just as important is how you communicate those priorities. When you position SEO requests in terms of broader business value, you increase the chances of securing development time and cross-team support.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: Should Small Brands Go All In On TikTok For Audience Growth? via @sejournal, @MordyOberstein

This week’s Ask An SEO question is about whether small brands should prioritize TikTok over Google to grow their audience:

“I keep hearing that TikTok is a better platform for small brands with an easier route to an audience. Do you think that Google is still relevant, or should I go all in on TikTok?”

The short answer to your question is that you do not want to pigeonhole your business into one channel, no matter the size. There’s also no such thing as an “easier” way. They are all hard.

I’m going to get the obvious out of the way so we can get to something beyond the usual answers to this question.

Your brand should be where your audience is.

Great, now that we didn’t spend four paragraphs saying the same thing that’s been said 100 times before, let me tell you something you want to consider beyond “be where your audience is.”

It’s Not About Channel, It’s About Traction

I have a lot of opinions here, so let me just “channel” my inner Big Lebowski and preface this with … this is just my opinion, man.

Stop thinking about channels. That’s way down the funnel (yet marketers make channels the seminal question all the time).

Start thinking about traction. How do you generate the most traction?

When I say “traction,” what I really mean is how to start resonating with your audience so that the “chatter” and momentum about who you are compound so that new doors of opportunity open up.

The answer to that question is not, “We will focus on TikTok.”

The answer is also not, “We will focus on Google.”

The answer is also not, “We will focus on YouTube.”

I could go on.

Now, there is another side to this: resources and operations. The question is, how do you balance traction with the amount of resources you have?

For smaller brands, I would think about: What can you do to gain traction that bigger brands have a hard time with?

For example, big brands have a very hard time with video content. They have all sorts of production standards, operations, and a litany of people who have a say, who shouldn’t even be within sniffing distance of having a say.

They can’t simply turn on their phone, record a video, and share something of value.

You can.

Does that mean you should focus on TikTok?

Nope.

It means you should think about what you can put out there that would resonate and help your audience, and does that work for the format?

If so, you may want to go with video shorts. I’m not sure why you would limit that to just TikTok.

Also, if your age demographic is not on TikTok, don’t do that. (“Being where your audience is” is a fundamental truth. Although I think the question is more about being in tune with your audience overall than “being where they are.” If you’re attuned to your audience, then you would know where they are and where to go just naturally.)

I’ll throw another example at you.

Big brands have a hard time communicating with honesty, transparency, and a basic level of authenticity. As a result, a lot of their content is “stale,” at best.

In this instance, trying to generate traction and even traffic by writing more authentic content that speaks to your audience, and not at them, seems quite reasonable.

In other words, the question is, “What resonates with your audience and what opportunities can you seize that bigger brands can’t?”

It’s a framework. It’s what resonates + what resources do you have + what vulnerabilities do the bigger brands in your vertical have that you can capitalize on.

There’s no one-size-fits-all answer to that. Forget your audience for a second: Where are the vulnerabilities of the bigger brands in your space?

They might be super-focused on TikTok and have figured out all of the production hurdles I mentioned earlier, but they might not be focused on text-based content in a healthy way, if at all.

Is TikTok “easier” in that scenario?

Maybe not.

Don’t Pigeonhole Yourself

Every platform has its idiosyncrasies. One of the problems with going all-in on a platform is that your brand adopts those idiosyncrasies.

If I were all about Google traffic, my brand might sound like (as too many do) “SEO content.” Across the board. It all seeps through.

The problem with “channels” to me is that it produces a mindset of “optimizing” for the channel. When that happens – which inevitably it does (just look at all the SEO content on the web) – the only way out is very painful.

While you might start with the right mindset, it’s very easy to lose your brand’s actual voice along the way.

That can pigeonhole your brand’s ability to maneuver as time goes on.

For starters, one day what you had on TikTok may no longer exist (I’m just using TikTok as an example).

Your audience may evolve and grow older with you, and move to other forms of content consumption. The TikTok algorithm may gobble up your reach one day. Who knows.

What I am saying is, it is possible to wake up one day and what you had with a specific channel doesn’t exist anymore.

That’s a real problem.

That very real problem gets compounded if your overarching brand voice is impacted by your channel approach. Which it often is.

Now, you have to reinvent the wheel, so to speak.

Now, you have to adjust your channel approach (and never leave all your eggs in one basket), and you have to find your actual voice again.

This whole time, you were focused on speaking to a channel and what the channel demanded (i.e., the algorithm) and not your audience.

All of this is why I recommend a “traction-focused” approach. If you’re focused on traction, then this whole time, you’ve been building yourself up to become less and less reliant on the channel.

If you’re focused on traction, which inherently focuses on resonance, people start to come to you. You become a destination that people seek out, or, at a minimum, are familiar with.

That leaves you less vulnerable to changes within a specific channel.

It also helps you perform better across other channels. When you resonate and people start to recognize you, it makes performing easier (and less costly).

Let’s play it out.

You start creating material for TikTok, but you do it with a traction, not a channel mindset.

The content you produce starts to resonate. People start talking about you, tagging you on social, mentioning you in articles, etc.

All of that would, in theory, help your web content become more visible within organic search and your brand overall more visible in large language models (LLMs), no?

Let’s play it out even more.

One day, TikTok shuts down.

Now, you have to switch channels (old TV reference).

If you focused more on traction:

  1. You should have more direct traffic or branded search traffic than you had when you started your “TikTok-ing.”
  2. You should have more cachet to rank better if you decide to create content for Google Search (just as an example).

The opposite is true as well. If Google shut down one day, and you had to move to TikTok, you would:

  1. Have more direct traffic than when you started to focus on Google.
  2. Have more cachet and awareness to start building a following on TikTok.

It’s all one song.

Changing The Channel

I feel like, and this is a bit of a controversial take (for some reason), the less you “focus” on channels, the better.

The more you see a channel as less of a strategy and more of a way to actualize the traction you’re looking to create, the better off you’ll be.

You’ll also have an easier time answering questions like “Which channel is better?”.

To reiterate:

  • Don’t lose your brand voice to any channel.
  • Build up traction (resonance) so that when a channel changes, you’re not stuck.
  • Build up traction so that you already have cachet when pivoting to the new channel.
  • It’s better to be a destination than anything.
  • All of this depends on your vertical, your resources, your competition, and most importantly, what your audience needs from you.

The moment you think beyond “channels” is the moment you start operating with a bit more clarity about channels. (It’s a kind of “there is no spoon” sort of thing.)

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: The Metrics That Matter For Content Strategies via @sejournal, @rollerblader

This week’s Ask an SEO question comes from Zahara:

“What metrics should small businesses actually care about when measuring content strategy success? Everyone talks about different KPIs, but I need to know which ones truly matter for growth.”

The metrics to measure for growth with a content strategy change by company and industry, and the type of business you run.

Publishers, for example, make their money by selling ad impressions and repeat content to consumers.

Ecommerce stores rely heavily on direct conversions and subscribers, while service-based and SaaS companies need leads and to scale remarketing groups.

There’s no shortage of ways to twist data, but there are certain key performance indicators (KPIs) and conversion items I measure based on what the goal of the client is, their current and future marketing capabilities as they grow or shrink, and things that I like to use as a measure of success when talking to the C-suite vs. day-to-day workers.

Here are some of the metrics or KPIs I measure from content marketing campaigns, and when I apply them to different clients.

Email And SMS Opt-ins

These are the unsung heroes of the marketing world. They’re people with enough of an interest in your company that they want to get marketing messages from you.

They sign up from blog content, whitepapers, and all other channels. Yet, most companies segment them without considering where the opt-in originated from.

The metrics here are:

  • Number of opt-ins.
  • Dollars in sales.
  • Average Order Value (AOV).
  • Lifetime Value (LTV) of the customer by content type and by article (if you get granular).

By tracking how many email and SMS opt-ins you get from content, and then the conversions and LTV metrics, you can tie revenue directly to the type of content on your site and how valuable each customer is based on the type of content you produce.
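As a rough sketch of that attribution, here is how you might roll opt-ins and revenue up by the content type that generated each subscriber. The records, content types, and dollar amounts are invented for illustration.

```python
# Hypothetical sketch: tie opt-ins and revenue back to the content type
# that generated the subscriber. All records and figures are invented.
from collections import defaultdict

subscribers = [
    {"content_type": "comparison guide", "revenue": 450.0},
    {"content_type": "comparison guide", "revenue": 0.0},
    {"content_type": "recipe", "revenue": 120.0},
    {"content_type": "recipe", "revenue": 80.0},
]

by_type = defaultdict(lambda: {"opt_ins": 0, "revenue": 0.0})
for sub in subscribers:
    bucket = by_type[sub["content_type"]]
    bucket["opt_ins"] += 1
    bucket["revenue"] += sub["revenue"]

for content_type, stats in by_type.items():
    # Average revenue per subscriber, a simple stand-in for LTV.
    ltv = stats["revenue"] / stats["opt_ins"]
    print(f"{content_type}: {stats['opt_ins']} opt-ins, avg LTV ${ltv:.2f}")
```

Even this simple grouping answers the question most email teams can’t: which content types produce subscribers who actually spend.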

A comparison guide between two compatible electronic accessories for a camera may bring a photographer in; they liked the content, so they subscribed.

Six months later, they need to replace their computer. There’s a new version of editing software, so they get your message saying there is a sale, and this conversion happened because of your comparison content.

The email team would not have had the opt-in if you hadn’t created your guide.

The same can be said for companies that sell cookware.

The recipes you produce for their cooking blog or the recipe books you use as a lead gen get the SMS and email opt-in, so that when you’re having a sale or deal, the SMS and email teams have active lists to drive revenue.

Without your content, the customers would not be on your list, and the email or SMS team would not be able to do their jobs.

YOY Traffic Increases With Conversions

The next metric we track from content marketing is the total traffic increase year-over-year.

Showing an increase in non-branded and branded traffic displays:

  • More impressions are being made that build brand awareness if the topics are relevant to your business.
  • An increase in website visits, which can result in opt-ins for email and SMS, PPC, and social media to build remarketing lists.
  • Direct conversions if you’re tracking clicks on internal links, banner ads, and other calls to action.
  • Increases in branded search.

One metric we track for some of our clients is whether people who arrive via non-branded search come back to visit the site again for more content and to shop.

One of our current clients requires seven website visits before a conversion happens, and as we show up for high-volume “newbie” phrases, we notice an increase in branded search.

We then tracked the pathways for the users who came back for more research questions, and when they eventually converted.

The finance team was then able to calculate the value of the cold topics. On top of that, we learned where people who had never heard of the company before were entering at a mid-funnel stage.

By creating copy at this touch point, we have been able to reduce the seven visits to four or five in some cases.

The biggest benefit here was the branded search building. As branded searches increased, the site started to appear for high-volume and topically relevant product and shopping queries.

Examples (not from this client) could be a funny t-shirt company that now shows up for “t-shirts” and “graphic t-shirts” vs. only specific ones like “vintage 90’s cartoon t-shirts,” which has a lower search volume and is less competitive.

Direct Conversions

One of the easiest content KPIs to measure is direct conversions.

These could be completed sales, completed form fills with or without identifiable and financial information (credit cards or social security numbers), and sign-ups for petitions, non-profits, and parties or events.

The reason this is the easiest content KPI is because you can track the conversion from a piece of content, and the system records it on the thank you or confirmation page.

Page Views Per Visit

Publishers need page views to make money, and analytics packages make it easy to monitor how many page views each topic and content type gets on average.

By using internal links, an educational series, and content that makes sense as a follow-up read, you can measure how the content you’re creating increases page views per visit, so you can increase your company’s overall revenue.

This also helps you find opportunities to promote similar articles, add better internal links, and create more guides when you notice people leaving to do another search and then coming back to finish the article because there weren’t enough examples on your site.

Repeat Visitors

These are people who come back for more content, whether it is a direct type-in, a new non-branded phrase from a different keyword in search results because they enjoyed your previous content, or from a different marketing team sharing content that is interesting to the audience.

By seeing which visitors come back from what efforts, you can better segment who gets what type of content and the types of content that move the needle.

  • Publishers can segment lists based on interests and email or send SMS messages as new content is created.
  • Retailers can email deals and specials based on what customers engage with.
  • Lead generation companies can fine-tune their sales funnels by showing relevant content within the customer’s need, want, and use cohorts.
  • Branding teams can keep good associations with the company to current customers as a way to keep them subscribing, paying, and sharing the good their companies are doing.

Final Thoughts

There is no shortage of KPIs you can track from content marketing. It’s a matter of matching them to the people you report to.

HR may want more job applicants, while the sales team wants leads. Marketing and advertising want direct conversions and subscriber list growth, while the C-suite wants to know market share and reach.

As a content marketer, you can fine-tune your tracking and reporting to meet each stakeholder’s needs and become the star of the company by keeping everyone informed on how your efforts are growing their parts of the company, and that is how we decide which KPIs to monitor and report on, based on the client.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: Why Aren’t My Pages Getting Indexed? via @sejournal, @HelenPollitt1

This week’s question comes from Xaris, who asks:

“Why, even though I have correctly composed and linked the sitemap to a client’s website, and I have checked everything, am I having indexing problems with some articles, not all of them, even after repeated requests to Google and Google Search Console. What could be the problem? I can’t figure it out.”

This is far from a unique problem; we’ve all experienced it! “I’ve done everything I can think of, but Google still isn’t indexing my pages.”

Is It Definitely Not Indexed?

The very first aspect to check is if the page is truly not indexed, or simply isn’t ranking well.

It could be that the page appears not indexed because you can’t find it for what you consider the relevant keywords. However, that doesn’t mean it’s not indexed.

For the purposes of this question, I’m going to give you advice on how to deal with both circumstances.

What Could Be The Issue?

There are many reasons that a page might not be indexed by Google, or might not rank well. Let’s discuss the main ones.

Technical Issue

There are technical reasons, both mistakes and conscious decisions, that could be stopping Googlebot from reaching your page and indexing it.

Bots Blocked In Robots.txt

Google needs to be able to reach a page’s content if it is to understand the value of the page and ultimately serve it as a search result for relevant queries.

If Googlebot is blocked from visiting these pages via the robots.txt, that could explain why it isn’t indexing them.

It can technically still index a page that it can’t access, but it will not be able to determine the content of the page and therefore will have to use external signals like backlinks to determine its relevancy.

Even if Google knows a page exists via the sitemap, being unable to crawl it makes the page unlikely to rank.
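If you want to spot-check this yourself, Python’s standard-library robots.txt parser can tell you whether a given rule set blocks Googlebot from a URL. The rules and URLs below are examples, not a recommended configuration.

```python
# Sketch: check whether robots.txt rules block Googlebot from a URL,
# using Python's standard-library parser. Rules below are examples only.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Disallow: /checkout/

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/checkout/cart"))  # blocked
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))      # allowed
```

Running your sitemap URLs through a check like this quickly surfaces pages you are asking Google to index while simultaneously blocking it from crawling them.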

Page Can’t Be Rendered

In a similar way, if the bot can crawl the page but it can’t render the content, it might choose not to index it. It will certainly be unlikely to rank the page well as it won’t be able to read the content of the page.

Page Has A No-Index Tag

An obvious, but often overlooked, issue is that a noindex tag has been applied to the page. This will literally instruct Googlebot not to index the page.

This is a directive, that is, something Googlebot is committed to enacting.
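A quick way to spot-check a page for this is to scan its HTML for a robots noindex meta tag. The sketch below uses only the standard library, and the sample HTML is hypothetical; note that noindex can also be delivered via the X-Robots-Tag HTTP header, which this sketch does not check.

```python
# Sketch: detect a robots noindex meta tag in a page's HTML using the
# standard library. The sample HTML is hypothetical.
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # A noindex directive can target all bots ("robots") or
        # Google specifically ("googlebot").
        if (tag == "meta"
                and (attrs.get("name") or "").lower() in ("robots", "googlebot")
                and "noindex" in (attrs.get("content") or "").lower()):
            self.noindex = True

html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
detector = NoindexDetector()
detector.feed(html)
print("noindex present:", detector.noindex)
```

Noindex tags often slip through when staging settings are deployed to production, so a check like this is worth automating.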

Server-Level Bot Blocking

There could be an issue at your server level that is preventing Googlebot from crawling your webpage.

There may well have been rules set at your server or CDN level that are preventing Googlebot from crawling your site again and discovering these new pages.

It is something that can be quite a common issue when teams that aren’t well-versed in SEO are responsible for the technical maintenance of a website.

Non-200 Server Response Codes

The pages you have added to the sitemap may well be returning a server status code that confuses Googlebot.

For example, if a page is returning a 4XX code, despite you being able to see the content on the page, Googlebot may decide it isn’t a live page and will not index it.
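As a sketch of how you might triage sitemap URLs by status code, the helper below classifies codes using standard HTTP semantics; in a real audit you would fetch each URL (e.g., with urllib.request) and pass in its actual response code.

```python
# Sketch: classify server status codes the way you might triage sitemap
# URLs. Ranges follow standard HTTP semantics; this is illustrative only.
def triage_status(code: int) -> str:
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect - update the sitemap entry"
    if 400 <= code < 500:
        return "client error - page may be dropped from the index"
    return "server error - fix before requesting recrawl"

for code in (200, 301, 404, 503):
    print(code, triage_status(code))
```

Anything other than a clean 200 on a sitemap URL is worth investigating before you blame indexing problems on Google.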

Slow Loading Page

It could be that your webpages are loading very slowly. As a result, the perception of their quality may be diminished.

It could also be that they take so long to load that bots, forced to prioritize which pages they crawl, never get around to your newer pages.

Page Quality

There are also issues with the content of the website itself that could be preventing a page from being indexed.

Low Internal Links Suggesting Low-Value Page

One of the ways Google will determine if a page is worth ranking highly is through the internal links pointing to it. The links between pages on your website can both signify the content of the page being linked to, but also whether the page is an important part of your site. A page that has few internal links may not seem valuable enough to rank well.

Pages Don’t Add Value

One of the main reasons why a page isn’t indexed by Google is that it isn’t perceived as of high enough quality.

Google will not crawl and index every page that it could. Google will prioritize unique, engaging content.

If your pages are thin, or do not really add value to the internet, they may not be indexed even though they technically could be.

They Are Duplicates Or Near Duplicates

In a similar way, if Google perceives your pages to be exact or very near duplicate versions of existing pages, it may well not index your new ones.

Even if you have signaled that the page is unique by including it in your XML sitemap, and using a self-referencing canonical tag, Google will still make its own assessment as to whether a page is worth indexing.

Manual Action

There is also the possibility that your webpage has been subject to a manual action, and that’s why Google is not indexing it.

For example, if the pages that you are trying to get Google to index are what it considers “thin affiliate pages,” you may not be able to rank them due to a manual penalty.

Manual actions are relatively rare and usually affect broader site areas, but it’s worth checking Search Console’s Manual Actions report to rule this out.

Identify The Issue

Knowing what could be the cause of your issue is only half the battle. Let’s look at how you could potentially narrow down the problem and then how you could fix it.

Check Bing Webmaster Tools

My first suggestion is to check if your page is indexed in Bing.

You may not be focusing much on Bing in your SEO strategy, but it is a quick way to determine whether this is a Google-focused issue, like a manual action or poor rankings, rather than something on your site that is preventing the page from being indexed.

Go to Bing Webmaster Tools and enter the page in its URL Inspection tool. From here, you will see if Bing is indexing the page or not. If it is, then you know this is something that is only affecting Google.

Check Google Search Console’s “Page” Report

Next, go to Google Search Console. Inspect the page and see if it is genuinely marked as not indexed. If it isn’t indexed, Google should give an explanation as to why.

For example, it could be that the page is:

Excluded By “Noindex”

If Google detects a noindex tag on the page, it will not index it. Under the URL Inspection tool results, it will tell you that “page is not indexed: Excluded by ‘noindex’ tag.”

If this is the result you are getting for your pages, your next step will be to remove the noindex tag and resubmit the page to be crawled by Googlebot.

Discovered – Currently Not Indexed

The inspection tool might tell you the “page is not indexed: Currently not indexed.”

If that is the case, you know for certain that it is an indexing issue, and not a problem with poor rankings, that is causing your page not to appear in Google Search.

Google explains that a URL appearing as “Discovered – currently not indexed” is:

“The page was found by Google, but not crawled yet. Typically, Google wanted to crawl the URL but this was expected to overload the site; therefore Google rescheduled the crawl. This is why the last crawl date is empty on the report.”

If you are seeing this status, there is a high chance that Google has looked at other pages on your website and deemed them not worth adding to the index, and as such, is not spending resources crawling these other pages it is aware of because it expects them to be of similarly low quality.

To fix this issue, you need to signify a page’s quality and relevance to Googlebot. It is time to take a critical look at your website and identify if there are reasons why Google may consider your pages to be low quality.

For further details on how to improve a page, read my earlier article: “Why Are My Pages Discovered But Not Indexed?”

Crawled – Currently Not Indexed

If your inspected page returns a status of “Crawled – currently not indexed,” this means that Google is aware of the page, has crawled it, but doesn’t see value in adding it to the index.

If you are getting this status, you are best off looking for ways to improve the page’s quality.

Duplicate, Google Chose Different Canonical Than User

You may see an alert for the page you have inspected, which tells you this page is a “Duplicate, Google chose different canonical than user.”

What this means is that Google sees the URL as a close duplicate of an existing page, and it is choosing that other page to display in the SERPs instead of the inspected page, despite you having correctly set a canonical tag.

The way to encourage Google to display both pages in the SERPs is to make sure each is unique and has sufficient content to be useful to readers.

Essentially, you need to give Google a reason to index both pages.

Fixing The Issues

Although your pages may not be indexed for any of several reasons, the fixes are all pretty similar.
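As a rough illustration, the statuses discussed above can be triaged into first-pass fix categories. The strings matched here mirror the ones covered in this article and are examples, not an exhaustive list of everything Search Console can report.

```python
def triage_index_status(coverage_state: str) -> str:
    """Map a Search Console coverage status to a first-pass fix category.
    The matched phrases mirror the statuses discussed in the article."""
    state = coverage_state.lower()
    if "noindex" in state:
        return "technical: remove the noindex tag and request recrawling"
    if "discovered" in state:
        return "quality: Google is deprioritizing crawling; improve site-wide quality signals"
    if "crawled" in state:
        return "quality: improve this page's content and uniqueness"
    if "duplicate" in state:
        return "content: differentiate the page from its near-duplicate"
    return "indexed or unknown: investigate further"
```

At scale, the same triage could be fed by the URL Inspection API rather than checking pages one at a time in the Search Console interface.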

It is likely that there is either a technical issue with the site, like an errant canonical tag or a robots.txt block, that is preventing correct crawling and indexing of the page.

Or, there is an issue with the quality of the page, which is causing Google not to see it as valuable enough to index.

Start by reviewing the potential technical causes. This will quickly tell you whether it is a “quick” fix that you or your developers can make.

Once you have ruled out the technical issues, you are most likely looking at quality problems.

Depending on what you now think is causing the page to not appear in the SERPs, it may be that the page itself has quality issues, or a larger part of your website does.

If it is the former, consider E-E-A-T, uniqueness of the page in the scope of the internet, and how you can signify the page’s importance, such as through relevant backlinks.

If it is the latter, you may wish to run a content audit to help you narrow down ways to improve the overall perception of quality across your website.

Summary

There will be a bit of investigation needed to identify if your page is truly not indexed, or if Google is just choosing not to rank it highly for queries you feel are relevant.

Once you have identified that, you can begin closing in on whether it is a technical or quality issue that is affecting your pages.

This is a frustrating issue to have, but the fixes are quite logical, and the investigation should hopefully reveal more ways to improve the crawling and indexing of your site.

Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: How Can We Recover A 30% Drop In Organic Traffic From A Site Migration? via @sejournal, @kevgibbo

This week’s Ask An SEO question comes from an ecommerce business that followed best practices, but still lost traffic after migrating to a new platform.

“We recently migrated our ecommerce store to a new platform, and despite following all the recommended SEO practices, our organic traffic dropped by 30%.

What recovery strategies should we prioritize, and how long should we expect before seeing improvements?”

This is a common frustration many ecommerce businesses face after a platform migration.

But why does it happen, and more importantly, how can you recover lost traffic? Let’s dive into the likely causes of this issue and explore the most effective strategies to get your organic traffic back on track.

Why Organic Traffic Can Drop Post-Migration

Understanding why this happens is key to finding a solution. Without pinpointing the root cause, any recovery efforts can feel like a shot in the dark – and that’s the last thing you want.

Tracking Issues

After a migration, it’s surprisingly common for something small to go wrong with your analytics setup. Maybe the Google Analytics 4 tag wasn’t added correctly. Maybe your Google Search Console property wasn’t verified.

Even a tiny mistake – like a misconfigured setting or a missing bit of code – can make it look like traffic has fallen off a cliff, when really it’s just not being tracked properly.

The upside? These problems are usually quick to spot and easy to fix. It’s always a good first step before diving into deeper SEO troubleshooting because the issue might not be your traffic at all, just your data.
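One quick sanity check is confirming the GA4 snippet is actually present in your page source. The sketch below assumes the standard gtag.js loader; the pattern is illustrative rather than an official specification.

```python
import re

# GA4 measurement IDs look like "G-" followed by alphanumerics; this
# pattern targets the standard gtag.js loader script and is illustrative.
GA4_ID = re.compile(r"gtag/js\?id=(G-[A-Z0-9]+)")

def find_ga4_ids(html: str):
    """Return GA4 measurement IDs referenced by gtag.js script tags."""
    return GA4_ID.findall(html)
```

An empty result on a template that should be tagged is a strong hint that your “traffic drop” is really a measurement gap. Remember to also check tag-manager-injected setups, which this simple source check won’t catch.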

Technical Issues

If your tracking is working as it should and the traffic drop is real, the next step is to check for technical SEO problems on your new site – and this almost always starts with redirects.

During a migration, especially if URLs have changed, redirects are crucial. One missing or incorrect 301 redirect can break the connection between your old and new pages, making Google think important content has disappeared.

That can quickly tank rankings and traffic. Make sure all old URLs point to the right new ones, that you’re using proper 301 (not 302) redirects, and that there are no long redirect chains slowing things down.
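To make that concrete, here is a small sketch that walks a redirect map (as a crawler would assemble it) and flags temporary redirects and chains. The `{url: (status, target)}` map shape is an assumption for illustration, not the output format of any particular tool.

```python
def audit_redirects(redirect_map, start_url, max_hops=5):
    """Follow a redirect map of {url: (status_code, target_url)} from
    start_url, returning the final URL and a list of issues found."""
    issues = []
    hops = 0
    url = start_url
    while url in redirect_map and hops < max_hops:
        status, target = redirect_map[url]
        if status == 302:
            # Temporary redirects may not pass signals the way 301s do.
            issues.append(f"302 (temporary) redirect at {url}; use a 301")
        url = target
        hops += 1
    if hops >= max_hops:
        issues.append("possible redirect loop or very long chain")
    elif hops > 1:
        issues.append(f"redirect chain of {hops} hops; point {start_url} straight at {url}")
    return url, issues
```

Running every old URL from your pre-migration crawl through a check like this surfaces chains and 302s in one pass, rather than spot-checking pages by hand.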

Other common technical pitfalls? Broken or removed internal links, staging URLs accidentally left in canonical tags, or no-index rules carried over from development.

Any of these can stop Google from crawling or indexing your site properly, and if that happens, your content won’t show up in search at all.

It’s also worth checking your XML sitemap and robots.txt file. Make sure your sitemap is up to date and submitted in Google Search Console, and that robots.txt isn’t blocking important sections of your site.
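Python’s standard library can handle the robots.txt side of this check. The rules and URLs below are illustrative; in practice, you would fetch your live robots.txt and test your real key sections.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules; fetch and parse your live file in practice.
robots_txt = """\
User-agent: *
Disallow: /staging/
Disallow: /checkout/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Confirm important sections are crawlable and dev/staging paths are not.
for path in ["/products/blue-widget", "/staging/products/blue-widget"]:
    allowed = parser.can_fetch("Googlebot", f"https://example.com{path}")
    print(path, "->", "crawlable" if allowed else "BLOCKED")
```

A migration that copies a staging robots.txt to production is a classic way to block the whole site, so checking a handful of representative paths like this is cheap insurance.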

On-Page Content

In some cases, ranking drops can be caused by changes to page content itself. Even small changes like missing H1s, altered metadata, or content now rendered in JavaScript can have a big impact.

So, you will need to double-check the content on your pages to identify if anything has changed.

But, don’t forget that Google will need time to reindex and trust your new setup, especially if you didn’t submit an updated sitemap or if backlinks still point to old URLs.

Although you may see some big changes in your SEO performance initially, monitor it for a little while to see if things settle back down on their own.

Steps To Recover Your Organic Traffic

Crawl Your Site: Look For Redirect Problems And Broken Links

The first thing you should do is crawl your site. Tools like Screaming Frog or Sitebulb are perfect for this.

Crawling your site helps you identify technical issues such as broken redirects, incorrect or missing 301 redirects, and redirect chains.

During a migration, URL changes are common, and improper redirects can create huge SEO setbacks.

If you have old URLs pointing to pages that no longer exist or haven’t been properly redirected, Google might struggle to index your site correctly, impacting traffic and rankings.

Another key issue to spot during crawling is orphaned pages. These are pages that exist on your site but have no internal links pointing to them.

Without internal links, Google may have a harder time finding and indexing these pages, which can hurt your rankings.
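Given a crawl’s internal link graph, orphans fall out of a simple set difference. The `{page: [internally linked pages]}` graph shape here is hypothetical; most crawlers can export something equivalent.

```python
def find_orphans(all_pages, link_graph):
    """Pages with no inbound internal links, given a graph of
    {page: [pages it links to]}. The homepage needs no inlinks,
    so it is excluded."""
    linked = {target for targets in link_graph.values() for target in targets}
    return sorted(set(all_pages) - linked - {"/"})
```

Feed `all_pages` from your sitemap or database export rather than the crawl itself, since a crawler that only follows links cannot, by definition, discover orphans on its own.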

Fix Redirect Problems Immediately

Once you’ve identified any issues with redirects during the crawl, fixing them should be your priority.

Redirects are crucial for preserving SEO value during a migration. If your URLs have changed, check that all old URLs are properly redirected to their new counterparts using 301 redirects.

Ensure there are no redirect chains, as these can slow down page load times and confuse search engines.

Even if you think you’ve set up redirects, it’s worth doing a detailed check. Missing or incorrect redirects are one of the top causes of traffic loss after a migration.

Remember, each redirect is a connection that ensures the SEO equity of your old pages gets passed on to your new ones.

Tackle Potential On-Page Issues

If you’ve ruled out any major technical errors, focus on the content itself.

Compare your post-migration content with the version you had before the migration. Did anything change that might have negatively affected your rankings?

Ensure that all your pages are optimized for the target keywords, including title tags, meta descriptions, header tags (especially H1s), and body content.
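A before/after comparison of these elements can be scripted. The regex-based extraction below is a sketch; a real audit would use an HTML parser and also compare the rendered (post-JavaScript) output.

```python
import re

def extract_onpage(html: str) -> dict:
    """Pull the title, first H1, and meta description with simple regexes.
    A production audit would use a real HTML parser; this is a sketch."""
    def first(pattern):
        match = re.search(pattern, html, re.IGNORECASE | re.DOTALL)
        return match.group(1).strip() if match else None
    return {
        "title": first(r"<title[^>]*>(.*?)</title>"),
        "h1": first(r"<h1[^>]*>(.*?)</h1>"),
        "meta_description": first(
            r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)["\']'
        ),
    }

def onpage_diff(old_html: str, new_html: str) -> dict:
    """Return the fields that changed between the pre- and
    post-migration versions of a page."""
    old, new = extract_onpage(old_html), extract_onpage(new_html)
    return {k: (old[k], new[k]) for k in old if old[k] != new[k]}
```

Run this over archived copies of your old pages (or a pre-migration crawl export) against the live equivalents, and investigate any page where the diff is non-empty.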

It’s also worth revisiting your product pages to ensure they meet Google’s standards for quality content. This might involve adding more detailed product descriptions, improving product images, or enhancing user-generated content such as reviews.

Update Your XML Sitemap And Google Search Console

Once your on-page content is reviewed and technical issues addressed, the next step is to update your XML sitemap to reflect the new URLs, if applicable.

Submit the updated sitemap to Google Search Console so Google can easily find and crawl your pages. This also helps Google understand that you’ve made changes to your site’s structure and allows it to index the new pages more quickly.
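If you need to regenerate the sitemap from your list of new URLs, a minimal version looks like this (a real sitemap may also carry optional fields such as `lastmod`):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap for the post-migration URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")
```

Only include final, canonical, indexable URLs; listing redirecting or noindexed pages in the sitemap sends Google mixed signals right when you want the clearest ones possible.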

Don’t forget to monitor Google Search Console closely. Regularly check for crawl errors and use the URL Inspection Tool to request indexing for important pages that may not have been crawled yet.

How Long Does SEO Recovery Take?

Recovery isn’t an instant process, unfortunately. Typically, sites begin to see improvements within four to 12 weeks, but several factors can influence the recovery timeline.

If your migration involved significant changes, like a new domain or a complete overhaul of your site structure, Google may treat your site as if it were brand new.

In this case, it can take longer for Google to rebuild trust and restore organic visibility. Sites with many pages may also experience slower recovery times, as Google has to crawl and reindex more content.

The content on your site can also affect recovery time. If important pages were altered or lost valuable content during the migration, it might take longer for Google to recognize the changes and rank your pages again.

→ Read more: How Long Should An SEO Migration Take? [Study Updated]

Long-Term Lessons & Preventative Measures

A smooth migration doesn’t start on launch day; it starts way before. SEO needs to slot into your QA and development process from the beginning.

That means making sure things like redirects, content structure, and crawlability are all working before you go live, ideally in a proper staging environment.

Issues can still happen when you go live, though, so remember to crawl your old site before launch. That way, you can run side-by-side audits of your old and new sites and catch issues early.
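That side-by-side audit boils down to checking that every old URL is either kept or redirected. A minimal sketch, assuming you have the old crawl, the new crawl, and your redirect map as simple collections:

```python
def migration_gaps(old_urls, new_urls, redirect_map):
    """Old URLs that neither exist on the new site nor appear
    in the redirect map - the pages most likely to lose traffic."""
    covered = set(new_urls) | set(redirect_map)
    return sorted(set(old_urls) - covered)
```

Any URL this returns is a page Google knew about that now leads nowhere, so it belongs at the top of your launch-day checklist.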

It’s also a smart idea to have a rollback plan just in case. That means having backups and knowing what to do if something goes wrong.

Final Thoughts

Recovering from an SEO drop after a migration can be frustrating, but unfortunately, it’s often part of the process.

By focusing on the right technical checks, reviewing your on-page content, and giving search engines time to recrawl and reindex your site, you can get things back on track.

Keep a close eye on your data, be patient, and use this as an opportunity to strengthen your site’s overall SEO health.


Ask An SEO: Why Your Content Is No Longer Good Or Helpful via @sejournal, @rollerblader

This week’s Ask An SEO question is from someone who would like to know why their content is no longer “good” or “helpful.”

The person is curious why they no longer rank for “their keywords” when other pages don’t have original photos or content that proves they have real experience.

The person would like to remain anonymous, so I’m respecting their request.

This is a long post. The top section covers how we think about content and why it should rank. The second section is where you will find ways to implement it across multiple niches, from travel to food, roofing, and more.

How We Go About Creating Content

Their argument for why their content should rank is based on a concept called E-E-A-T.

Experience, Expertise, Authoritativeness, and Trustworthiness is not a ranking factor or signal; it is a trust builder for readers.

When done well, it can cause your content to get citations and backlinks naturally.

Your goal as a creator should be to show expertise and experience through original thoughts that only a person with first-hand knowledge knows.

That is how E-E-A-T works for SEO. There is no score or metric, and E-E-A-T is not a factor for ranking a page or website; it is an SEO concept.

I reviewed the website from the question submission, and it was similar to the slew of sites submitted for audits when the helpful content update killed niche sites. My feedback was the same.

The content is not original or unique, and it is not helpful. The person was just lucky they had the traffic for as long as they did.

Yes, the information given was from their personal experience, but it could just as easily have been generated by a large language model (LLM). And although the images were original, they were not specific to the topic or entity, and they did not help the user toward a complete solution.

I live in Washington, D.C., and take photos when I go running most days, but I’m not a professional, and I do not sell my shots. It is a hobby. I could technically write a DC photography blog post or guide and try to rank it.

In order to do so, I not only need to share original photos, but I need to share original thoughts and things that will help someone wanting to take photos of DC, a complete solution. “The complete solution” part is what a lot of the sites I audited that got wiped out were missing.

The first step in providing this solution is to look for what people are asking and find a way to present it.

A question I get from friends on Facebook and sometimes on Instagram when I post a photo is “how do you get the lighting,” or “what do you use to edit people out.”

These become two topics that can be blog posts or YouTube videos on their own, and tips that only I would know because I’m the person taking these photos over and over with the same results.

This is where I can display E-E-A-T, provide the same information as everyone else, and then provide the information only I would know as the creator.

Instead of saying “take these photos at golden hour” and showing myself at golden hour, I should show what it looks like before, during, and after.

This helps the person know what to look for light-wise, so they can snap their shot around the same time I do pre-processing. But that isn’t enough. I need to go five steps further. I’ll use taking a photo of the monuments on the mall as the example.

First, I’d share that I don’t edit the people out, then write a section of the article dedicated to “taking photos of XYZ monument without people.”

In this section, I’ll give the instructions on how I calculate when fewer people will be around, the light is still diffused, and you get the glow of golden hour.

If you’re curious, the trick for this is showing up about 10 minutes after peak golden hour and facing east in the morning. People start to leave, and you get a clean shot with the sunrise.

Second, I need to add a tip that only someone with a lot of experience and time spent on the craft would know. This could be looking for rainy mornings. There are few to no people out, and when you combine the weather factor and being just past golden hour, you are likely to get a photo with little to no people in it.

Third is to add a tip only I would know. This can be: “This does not work for blue hour or sundown because people crowd before the sun sets, and it is dark afterwards, eliminating your light source.”

But that doesn’t fully provide a solution. I need to give an alternative, as they still want a people-free photo. The opportunity here is to give three or four other monuments with examples of why they are better than the XYZ monument for sunset, including getting a photo without anyone in it.

Sounds like a lot of work, right? It is, but only if you’re not an expert in your field. Sites that did not go this far got wiped out. For our clients, we go even further. Here’s how.

Something I have not seen on photography sites is time tracking to let someone know when to show up, specifically.

For the DC monuments, I tracked the time it takes for people to leave after the sun rises during cherry blossoms, so I can get a photo as the sky changes and the trees are in bloom.

After keeping my spreadsheet for a few years, I knew how many days in advance and after peak bloom, people show up and leave. I also learned when they’d be dispersed enough from my favorite shots and angles.

It wasn’t perfect, but the spreadsheet did the job, and I get my favorite photos each year.

For this theoretical article, I could post the spreadsheet to help others, and that is something unique to my site and may get backlinks from travel guides, photographers, and DC tourism companies.

I do this for other locations as well. If you’re a creator in the hobbyist, travel, or food space – blogger, nomad, or otherwise – you should be going into this level of detail.

For New Orleans, and when I was in St. Maarten last January, I used live streaming cameras from specific locations to track when to show up to take my photos.

For sunrise, I looked at when people left the beach by tracking movement. The beach cams showed sunrise, but the Bourbon Street cams did not, so I used the time when fewer people passed per minute.

Applying This In The Real World And To Multiple Niches

By being able to see sunrise and sunset, I knew which beaches and angles to go to.

I was also able to figure out when I’d still have the right lighting with fewer chances of people being there by tracking when people show up and leave, and how many people are in each spot by day.

For my photos in the French Quarter, I used the Bourbon Street live cams.

You can see when the streets are less full, when lights go on and off, and capture the mood and setting you want, whether it is Jackson Square, Bourbon St., Canal Street, etc.

I personally like street lamps being lit in my photos, so that’s why I tracked their on and off times.

Now it is time to present the content. Written text is only a portion; the instructions can be presented in:

  • An ordered list.
  • A spreadsheet that can be downloaded or accessed online with instructions on it.
  • Videos sharing the steps visually so the person can follow along.
  • A table that lists the steps and what to do with notes, featuring alternatives and examples.
  • Infographics that walk through the steps and include reminders and visuals.

This is pretty specific, but it applies to the work we do for clients.

If you write about outdoor sports like hiking or snowboarding, you can do this for specific slopes or trails that are always trafficked, and the person wants to enjoy it without congestion.

I was able to apply this to the Velocicoaster at Universal Orlando and not have to wait in a huge line.

The same applies to shopping at fashion and food sites: When are the slower times, and how can shoppers verify them?

Google Business Profiles sometimes list heavy and slow traffic hours for stores, restaurants, and entertainment venues by day and hour.

In the case studies I share on my blog, I say we don’t build backlinks anymore, and that is true.

By thinking about how and why our data, skill sets, products, services, etc. are unique and how we can apply our knowledge, the content starts to rank and people cite and source us.

We get the backlinks naturally, and the clients grow.

If you’re a contract attorney, you know the trends that could signal shifts in the markets and how businesses are growing, what their concerns are, and the direction things are heading.

Publish the number and types of contracts as data points, without revealing any client information, and share how it correlates with or goes against what traditional media and social media are saying.

Home builders, contractors, and interior designers know what is about to be popular as demand starts to spike, whether people are downsizing or looking for more space and luxury, and how it compares to previous years. This can get B2B and B2C traffic and backlinks.

  • B2C comes from potential customers that want to know what to buy or what is on trend, or to see what was popular five years ago and if it is making a comeback.
  • B2B wants to know what materials, colors, and other items they should plan to order and stock as the demand will be coming.

By creating renderings and solutions for both, you can collect leads and hopefully convert them, whether the lead is a bride, a family planning a religious event like a Bat Mitzvah or Quinceañera, or a homeowner needing a kitchen renovation or roof repair.

If you’re a retailer or affiliate, optimize your product pages for these before the demand starts vs. having to optimize and compete with the companies and vendors already ranking.

You have the advantage, as you know what will be in demand months in advance.

Travel sites can go five steps further than saying, “Here is a wheelchair-friendly entrance.”

Share where the nearest bathrooms to that entrance are and the easiest pathway through the museum that does not require stairs.

You can also share when they’re likely to be less crowded, so you don’t have to fight through crowds to see the exhibits or wait for elevators.

And make sure to post photos of how to find them, not just you at the location.  This is how you help the reader and create rank-worthy content.

The same goes for castles in Europe, and beaches or temples in Asia. Help people with more than just saying a place is accessible; give them the resources that only a person with real experience would know.

Show images of what to look for, not just the signature photo from the space. If you don’t share where to take that photo from, the person has to do more searching.

Anyone can show a photo saying they’ve been somewhere. That does not show E-E-A-T, and neither does saying it is “reviewed by” if the content does not have unique and original thoughts by the experts.

Take your content five to 10 times further and make sure the person does not have to do another search after or while performing a task.

This is how you create content that ranks and gets backlinks.

The content in these cases is helpful, as long as you use proper formatting with it so users can thumb through with ease.
