Google’s Martin Splitt: JavaScript-Loaded Images Can Be Indexed via @sejournal, @MattGSouthern

Google’s Developer Advocate Martin Splitt recently debunked a common SEO myth. He confirmed that images loaded with JavaScript can be indexed by Google when set up correctly.

Splitt shared these insights during the SEO for Paws Conference, a live-streamed fundraiser by Anton Shulke.

Here’s how to avoid common image indexing issues when loading images with JavaScript.

JavaScript Image Loading Isn’t the Problem

When asked about images loaded by JavaScript, Splitt clarified that the method is not to blame for indexing issues.

Splitt explains:

“JavaScript to load images is fine. A purely JavaScript image loading solution can absolutely get your images indexed.”

This clears up a common worry among SEO pros: when images don’t appear in Google Images, the cause is something other than the use of JavaScript itself.

The Real Culprits Behind Unindexed Images

Splitt explained that something else is usually wrong if JavaScript-loaded images don’t appear in search results.

He pointed to a few common issues:

  • Sitemap Problems: Sometimes, key images are missing from XML sitemaps.
  • HTTP Headers: Some image files may have headers that stop them from being indexed.
  • Rendered HTML Issues: If images don’t appear in the rendered HTML (the version Google sees after JavaScript runs), they won’t get indexed.

Debugging JavaScript Image Indexing Issues

Splitt offers a simple process to spot problems. Start by checking if images appear in the rendered HTML using tools like Search Console’s URL Inspection tool.

Splitt explains:

“You would have to check: is the rendered HTML containing the images? If it is, fantastic. If it’s not, then something else is off.”

Since Google indexes the rendered HTML, any image missing from it won’t be found by Googlebot.

See Splitt’s full talk on JavaScript SEO in the video below:

Common JavaScript Image Loading Techniques & Their SEO Impact

There are several ways to load images with JavaScript. Some common methods include:

  • Lazy Loading: Loads images only when needed.
  • Progressive Loading: Shows a low-quality image first, then upgrades to a high-quality one.
  • Infinite Scroll Loading: Loads images as users continue to scroll.
  • Background Image Insertion: Adds images through CSS backgrounds.

If they are set up properly, all these methods can work with Google’s indexing. Each may need its own checks to ensure everything is working as expected.
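Here’s what “set up properly” can look like in practice. This is a minimal, illustrative sketch (ours, not Splitt’s) of an IntersectionObserver-based lazy loader; the data-src attribute is a convention assumed for the demo. Because each image exists as a real <img> element in the markup, the real URL ends up in the rendered HTML that Google indexes.

```javascript
// Illustrative only: a lazy loader that stays indexable. Each image is a real
// <img> element in the markup (with alt text); the script only promotes the
// data-src attribute into src as images approach the viewport. Google's
// renderer is known to handle IntersectionObserver-based lazy loading,
// unlike fragile scroll-event hacks.
document.addEventListener('DOMContentLoaded', () => {
  const observer = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        const img = entry.target;
        img.src = img.dataset.src; // the real URL lands in the rendered HTML
        obs.unobserve(img);
      }
    }
  });

  document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
});
```

Where custom scripting isn’t required, the native loading="lazy" attribute (covered in the best practices below) does the same job with no JavaScript at all.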

Best Practices for SEO-Friendly JavaScript Image Loading

Even though JavaScript-loaded images can be indexed, following these best practices can help avoid issues:

  • Verify with the URL Inspection Tool: Ensure images appear in the rendered HTML.
  • Update Your XML Sitemaps: Include key images with proper tags (see the sketch after this list).
  • Use Alt Text: Provide clear alt text for images loaded via JavaScript.
  • Use Native Lazy Loading: Add the loading="lazy" attribute where it makes sense.
  • Check Robots.txt: Ensure you are not blocking JavaScript resources that load images.
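To illustrate the sitemap item above, here’s roughly what an entry looks like using Google’s image sitemap extension; the URLs are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/gallery</loc>
    <!-- List the key images on this page, even if JavaScript loads them -->
    <image:image>
      <image:loc>https://example.com/images/product-photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```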

What This Means for SEO Professionals

Instead of avoiding JavaScript, verify that images are loaded correctly and appear in the rendered HTML.

As websites rely more on JavaScript, understanding these details is key. SEO professionals who learn to troubleshoot and optimize JavaScript-based image loading will be better prepared to support their clients’ visibility in search results.

Looking Ahead

This clarification is timely. Many modern sites built with frameworks like React, Vue, or Angular load images using JavaScript instead of traditional HTML <img> tags.

Splitt’s insights help dispel the myth that JavaScript harms image indexing. Developers can now focus on performance without worrying about SEO penalties.


Featured Image: Alicia97/Shutterstock

An AI-Powered Workflow To Solve Content Cannibalization via @sejournal, @Kevin_Indig

Your site likely suffers from at least some content cannibalization, and you might not even realize it.

Cannibalization hurts organic traffic and revenue: The impact can stretch from key pages not ranking to algorithm issues due to low domain quality.

However, cannibalization is tricky to detect, can change over time, and exists on a spectrum.

It’s the “microplastics of SEO.”

In this Memo, I’ll show you:

  1. How to identify and fix content cannibalization reliably.
  2. How to automate content cannibalization detection.
  3. An automated workflow you can try out right now: The Cannibalization Detector, my new keyword cannibalization tool.

I could never have done this without Nicole Guercia from AirOps. I designed the concept and stress-tested the automated workflow, but Nicole built the whole thing.

How To Think About Content Cannibalization The Right Way

Before jumping into the workflow, we must clarify a few guiding principles about content cannibalization that are often misunderstood.

The biggest misconception about cannibalization is that it happens on the keyword level.

It’s actually happening on the user intent level.

We all need to stop thinking about this concept as keyword cannibalization and instead as content cannibalization based on user intent.

With this in mind, cannibalization…

  • Is a moving target: When Google updates its understanding of intent during a core update, suddenly two pages can compete with each other that previously didn’t.
  • Exists on a spectrum: A page can compete with another page or several pages, with an intent overlap from 10% to 100%. It’s hard to say exactly how much overlap is fine without looking at outcomes and context.
  • Doesn’t stop at rankings: Looking for two pages that are getting a “substantial” amount of impressions or rankings for the same keyword(s) can help you spot cannibalization, but it is not a very accurate method. It’s not enough proof.
  • Needs regular check-ups: You need to check your site for cannibalization regularly and treat your content library as a “living” ecosystem.
  • Can be sneaky: Many cases are not clear-cut. For example, international content cannibalization is not obvious. A /en directory to address all English-speaking countries can compete with a /en-us directory for the U.S. market.
Image Credit: Kevin Indig

Different types of sites have fundamentally different weaknesses for cannibalization.

My model for site types is the integrator vs. aggregator model. Online retailers and other marketplaces face fundamentally different cases of cannibalization than SaaS or D2C companies.

Integrators cannibalize between pages. Aggregators cannibalize between page types.

  • With aggregators, cannibalization often happens when two page types are too similar. For example, you can have two page types that could or could not compete with each other: “points of interest in {city}” and “things to do in {city}”.
  • With integrators, cannibalization often happens when companies publish new content without maintenance and a plan for the existing content. A big part of the issue is that, past a certain number of articles (I’ve found the tipping point to be around 250), it becomes harder to keep an overview of what you have and what keywords/intent each piece targets.

How To Spot Content Cannibalization

An example of content cannibalization (Image Credit: Kevin Indig)

Content cannibalization can have one or more of the following symptoms:

  • “URL flickering”: meaning at least two URLs alternate in ranking for one or more keywords.
  • A page loses traffic and/or ranking positions after another one goes live.
  • A new page hits a ranking plateau for its main keyword and cannot break into the top 3 positions.
  • Google doesn’t index a new page or pages within the same page type.
  • Exact duplicate titles appear in Google’s search index.
  • Google reports “crawled, not indexed” or “discovered, not indexed” for URLs that don’t have thin content or technical issues.

Since Google doesn’t give us a clear signal for cannibalization, the best way to measure similarity between two or more pages is cosine similarity between their tokenized embeddings (I know, it’s a mouthful).

But this is what it means: Basically, you compare how similar two pages are by turning their text into numbers and seeing how closely those numbers point in the same direction.

Think about it like a chocolate cookie recipe:

  • Tokenization = Break down each recipe (e.g., page content) into ingredients: flour, sugar, chocolate chips, etc.
  • Embeddings = Convert each ingredient into numbers, like how much of each ingredient is used and how important each one is to the recipe’s identity.
  • Cosine Similarity = Compare the recipes mathematically. This gives you a number between 0 and 1. A score of 1 means the recipes are identical, while 0 means they’re completely different.
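If you want to see the math behind the recipe analogy, here’s a minimal JavaScript sketch. The three-dimensional vectors are made up for the demo; real embedding models produce vectors with hundreds of dimensions.

```javascript
// Cosine similarity: how closely two vectors point in the same direction.
function cosineSimilarity(a, b) {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Two "recipes" sharing most ingredients in similar proportions score near 1.
const pageA = [0.9, 0.1, 0.4];
const pageB = [0.8, 0.2, 0.5];
console.log(cosineSimilarity(pageA, pageB).toFixed(2)); // ≈ 0.98
```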

Follow this process to scan your site and find cannibalization candidates:

  • Crawl: Scrape your site with a tool like Screaming Frog (optionally, exclude pages that have no SEO purpose) to extract the URL and meta title of each page.
  • Tokenization: Turn words in both the URL and title into pieces of words that are easier to work with. These are your tokens.
  • Embeddings: Turn the tokens into numbers to do “word math.”
  • Similarity: Calculate the cosine similarity between all URLs and meta titles.

Ideally, this gives you a shortlist of URLs and titles that are too similar.

In the next step, you can apply the following process to make sure they truly cannibalize each other:

  • Extract content: Clearly isolate the main content (exclude navigation, footer, ads, etc.). Maybe clean up certain elements, like stop words.
  • Chunking or tokenization: Either split content into meaningful chunks (sentences or paragraphs) or tokenize directly. I prefer the latter.
  • Embeddings: Embed the tokens.
  • Entities: Extract named entities from the tokens and weigh them higher in embeddings. In essence, you check which embeddings are “known things” and give them more power in your analysis.
  • Aggregation of embeddings: Aggregate token/chunk embeddings with weighted averaging (e.g., TF-IDF) or attention-weighted pooling.
  • Cosine similarity: Calculate cosine similarity between resulting embeddings.

You can use my app script if you’d like to try it out in Google Sheets (but I have a better alternative for you in a moment).

About cosine similarity: It’s not perfect, but good enough.

Yes, you can fine-tune embedding models for specific topics.

And yes, you can use advanced embedding models like sentence transformers on top, but this simplified process is usually sufficient. No need to make an astrophysics project out of it.

How To Fix Cannibalization

Once you’ve identified cannibalization, you should take action.

But don’t forget to adjust your long-term approach to content creation and governance. If you don’t, all this work to find and fix cannibalization is going to be a waste.

Solving Cannibalization In The Short Term

The short-term action you should take depends on the degree of cannibalization and how quickly you can act.

“Degree” means how similar the content across two or more pages is, expressed in cosine or content similarity.

Though not an exact science, in my experience, a cosine similarity higher than 0.7 is classified as “high”, while it’s “low” below a value of 0.5.

4 ways to fix cannibalization (Image Credit: Kevin Indig)

What to do if the pages have a high degree of similarity:

  • Canonicalize or noindex the page when cannibalization happens due to technical issues like parameter URLs, or if the cannibalizing page is irrelevant for SEO, like paid landing pages. In this case, canonicalize the parameter URL to the non-parameter URL (or noindex the paid landing page).
  • Consolidate with another page when it’s not a technical issue. Consolidation means combining the content and redirecting the URLs. I suggest taking the older page and/or the worse-performing page and redirecting to a new, better page. Then, transfer any useful content to the new variant.

What to do if the pages have a low degree of similarity:

  • Noindex or remove (status code: 410) when you don’t have the capacity or ability to make content changes.
  • Disambiguate the intent focus of the content if you have the capacity, and if the overlap is not too strong. In essence, you want to differentiate the parts of the pages that are too similar.
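For reference, here’s what the markup side of these fixes looks like in a hedged sketch; example.com stands in for your own URLs, and the redirect and 410 cases live in your server configuration rather than in the HTML:

```html
<!-- High similarity, technical cause (e.g., a parameter URL):
     point the duplicate at the canonical version -->
<link rel="canonical" href="https://example.com/guide" />

<!-- Cannibalizing page that's irrelevant for SEO (e.g., a paid landing page):
     keep it out of the index entirely -->
<meta name="robots" content="noindex" />

<!-- Consolidation is a 301 redirect and removal a 410 status code,
     both configured at the server level, not in the page markup -->
```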

Solving Cannibalization In The Long Term

It’s critical to take long-term action to adjust your strategy or production process because content cannibalization is a symptom of a bigger issue, not a root cause.

(Unless we’re talking about Google changing its understanding of intent during a core algorithm update, and that has nothing to do with you or your team.)

The most critical long-term changes you need to make are:

  1. Create a content roadmap: SEO Integrators should maintain a living spreadsheet or database with all SEO-relevant URLs and their main target keywords and intent to tighten editorial oversight. Whoever is in charge of the content roadmap needs to ensure there is no overlap between articles and other page types. Writers need to have a clear target intent for new and existing content.
  2. Develop clear site architecture: The counterpart of a content roadmap for SEO Aggregators is a site architecture map, which is simply an overview of the different page types and the intent each targets. It’s critical to anchor each intent with example keywords that you verify on a regular basis (“Are we still ranking well for those keywords?”) to check it against Google’s understanding and against competitors.

The last question is: “How do I know when content cannibalization is fixed?”

The answer is when the symptoms mentioned in the previous chapter go away:

  • Indexing issues resolve.
  • URL flickering goes away.
  • No duplicate titles appear in Google’s search index.
  • “Crawled, not indexed” or “discovered, not indexed” issues decrease.
  • Rankings stabilize and break through a plateau (if the page has no other apparent issues).

And, after working with my clients under this manual framework for years, I decided it’s time to automate it.

Introducing: A Fully Automated Cannibalization Detector

Together with Nicole, I used AirOps to build a fully automated AI workflow that goes through 37 steps to detect cannibalization within minutes.

It performs a thorough analysis of content cannibalization by examining keyword rankings, content similarity, and historical data.

Below, I’ll break down the most important steps that it automates on your behalf:

1. Initial URL Processing

The workflow extracts and normalizes the domain and brand name from the input URL.

This foundational step establishes the target website’s identity and creates the baseline for all subsequent analysis.

Image Credit: Kevin Indig

2. Target Content Analysis

To ensure that the system has quality source material to analyze and compare against competitors, Step 2 involves:

  • Scraping the page.
  • Validating and analyzing the HTML structure for main content extraction.
  • Cleaning the article content and generating target embeddings.
Image Credit: Kevin Indig

3. Keyword Analysis

Step 3 reveals the target URL’s search visibility and potential vulnerabilities by:

  • Analyzing ranking keywords through Semrush data.
  • Filtering branded versus non-branded terms.
  • Identifying SERP overlap with competing URLs.
  • Conducting historical ranking analysis.
  • Determining page value based on multiple metrics.
  • Analyzing position differential changes over time.
Image Credit: Kevin Indig

4. Competing Content Analysis (Iteration Over Competing URLs)

Step 4 gathers additional context for cannibalization by iteratively processing each competing URL in the search results through the previous steps.

Image Credit: Kevin Indig

5. Final Report Generation

In the final step, the workflow cleans up the data and generates an actionable report.

Image Credit: Kevin Indig

Try The Automated Content Cannibalization Detector

Image Credit: Kevin Indig

Try the Cannibalization Detector and check out an example report.

A few things to note:

  1. This is an early version. We’re planning to optimize and improve it over time.
  2. The workflow can time out due to a high number of requests. We intentionally limit usage so as not to get overwhelmed by API calls (they cost money). We’ll monitor usage and might temporarily raise the limit, which means if your first attempt isn’t successful, try again in a few minutes. It might just be a temporary spike in usage.
  3. I’m an advisor to AirOps but was neither paid nor incentivized in any other way to build this workflow.

Please leave your feedback in the comments.

We’d love to hear how we can take the Cannibalization Detector to the next level!

Boost your skills with Growth Memo’s weekly expert insights. Subscribe for free!


Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: If I Am Not An SEO Expert, Is It Better For Me To Start An Agency? via @sejournal, @HelenPollitt1

Today’s question from Kazi is such an honest and important one for our industry. Kazi asks:

“I’m new to this sector. Should I start an agency as a one-man operation? I know it sounds ridiculous, but I am very much in need of work – more specifically, in need of money.

So, I think an agency will have better pull power than me working as a solo search engine optimizer because clients won’t get attracted if they see a new SEO learner…”

I understand that the crux of this question is, “Will you appear more legitimate and competitive if you are offering services as an agency rather than a solo contractor?”

However, I really want to address the more important aspect of this: Should you be offering SEO services for a fee as someone brand new to the industry?

My answer to that is no. Not only shouldn’t you be offering SEO services as an agency, but you also shouldn’t be offering them as a solo contractor if you are brand new to SEO.

Why SEO Is A Great Career Move

I completely get the appeal of this as a career move. On the face of it, SEO has no barriers to entry, which is great!

I fully recommend this industry to anyone passionate about analysis, psychology, creativity, sustainability, and technology.

You’ll hear me encourage it for people who like to problem-solve and find solutions that don’t have obvious answers.

I recommend it not only for those who have a background in creative pursuits, but also for those whose background is more tech-focused. It is a career that captures the interests of a lot of different types of people.

Not only does it give you the opportunity to earn money through skills that are in demand, but there are also no expensive overheads.

There are no formal qualifications needed and no regulatory bodies to convince. All you need is a computer, the internet, and the passion to develop your skills.

But it’s not easy.

There is a lot to learn before you can realistically start charging for your services. It is not just about the mechanics of SEO, but also how to apply them in different situations.

Risks To Clients

If you take on SEO work for a business or organization without knowing how to apply SEO concepts in practice, you could open them up to significant risk.

Learning On The Job

SEO is not a straightforward practice. There are a lot of variables and circumstances that affect what we might deem “good practice.” Because of this, you can’t apply a blanket solution to every situation you encounter.

If you are brand new to SEO, there may be nuances with your client’s industry, website, or tech stack that you aren’t aware of, which will impact how successful your strategy is.

A lot of SEO comes down to problem-solving. This is a skill that gets honed over time. As you encounter more situations and learn from what worked and what didn’t, you will become more adept at creating successful strategies.

If you are a brand-new SEO bearing full responsibility for the success of the organic performance of a client, you will come unstuck.

SEO is a great industry for learning on the job. However, if you are learning SEO from scratch, you don’t want the pressure of being the most senior SEO in the room. You will likely make mistakes, and these could be costly to your client.

May Cause Significant Traffic Loss

In some situations, a junior SEO working on a website alone could cause significant traffic loss for a client.

For example, you could accidentally set live a solution that de-indexes their whole site. You may not know how to guard against that sort of mistake. You could see your client’s organic traffic disappear in a matter of days.

These are risks that more experienced SEO professionals face as well, but after years of working in SEO, we can foresee what issues might arise from the recommendations we make.

Could Cause Financial Harm

If your SEO recommendations cause organic performance issues, your client’s revenue could be significantly affected.

If they run a business that relies on their website – and in particular, organic traffic – then your mistakes could be very costly.

People have lost jobs over organic traffic loss. Without much experience in SEO and no one more senior to help flag risks, this is something that you could easily cause.

Risks To You

Not only would charging for your services as an SEO when you’ve never done it before put your clients at risk, but it could also be harmful to you.

Significant Pressure

You will face significant pressure. You will be expected to set reasonable goals and achieve them in a timely manner.

Without much experience in SEO, that’s going to be incredibly difficult to do. You’ll either set unattainable goals that no SEO could realistically achieve, or you simply won’t have the skills to achieve more practical objectives.

With that, you will find yourself trying to appease an increasingly disgruntled client.

Any SEO professional who has worked as a freelancer or as part of an agency has had difficult conversations with clients who expected results that could not be delivered.

However, an experienced SEO will be able to identify when that is likely to be the case and adapt strategy, or inform the client of more realistic timelines or goals.

They will also have ways to help the client feel like they are getting a genuine return on their investment, even if it’s not as much or as quickly as they had anticipated.

A brand-new SEO specialist simply will not know how to do that. It’s too much to expect from someone so early on in their career. You will likely feel the pressure of that.

Your Lack Of Experience Will Be Discovered Quickly

The above is really a best-case scenario – you actually make it to the point where you have convinced clients to trust you, and you are beginning to see that you can’t hold up to the promised performance standards.

Most likely, your lack of knowledge and experience will be identified more quickly. You may be working with people with more SEO knowledge than you, such as marketers or product owners.

They may not consider themselves experts in SEO, but they will assume you are if you sell it as a service. They will probably be able to identify significant gaps in your knowledge very early on in your relationship.

Not only will that likely sour the client-agency relationship, but you may also find yourself without a client pretty quickly.

Could Have Legal Ramifications

In some situations, positioning yourself as someone who can get results through offering a service – without the ability to fulfill that – could be a breach of contract.

I would be very wary of making promises to clients about your abilities unless you are upfront that you are brand new to SEO and they are among your first-ever clients.

What You Could Try Instead

So, if I’ve managed to give you pause about committing to offering SEO services as a contractor or an agency, may I suggest some other ways forward?

You can still make money through a career in SEO, even if you are beginning to learn about it.

Join A Company As An Intern

You are clearly passionate about SEO if you are already thinking about starting a business working in the industry. That passion is a great start.

Consider finding an entry-level job in the SEO field and learning on the job in a more supportive, less pressured way.

You could find an agency or in-house position that values your drive and ambition but can support you with the right resources and opportunities to learn SEO while minimizing the risk to yourself and others.

Practice On Sites That You Can Fail With And Learn

If you are struggling to find employment within SEO but want to learn it to get into a position where you can legitimately offer SEO services, you need practice.

Do not practice on sites that rely on organic traffic. Instead, consider building your own sites around subjects you are passionate about and develop your experience and confidence in SEO.

You can make mistakes, weather traffic drops, and be hit by algorithm updates – all without risking anyone’s livelihood. Through that, you will develop the skills you need to work on other people’s sites.

I would still consider graduating to other sites where the risk is low. For example, volunteering your time to work alongside other SEO professionals.

Or, you could try optimizing the site of a friend who understands you are still learning SEO and is happy for you to practice and make mistakes on their site.

Set Up Your Own Site And Monetize It

If you are determined to make money through SEO right away, then build and optimize a site that you can monetize.

This might mean an affiliate site, or perhaps you can start a business that drop-ships.

Whatever course you take, make sure that the risk is minimal, and you will not suffer if you lose traffic and revenue while perfecting your SEO skills.

Make Sure You Have The Experience Before You Go Alone

Most importantly, understand that learning SEO well takes time.

You can easily read up on SEO and have a very high theoretical knowledge of it, but you still need to put what you’ve learned into practice.

This way, you will be able to understand how to adapt strategies for different goals or how to rally when performance doesn’t go as expected.

I want to encourage you to pursue a career in SEO, but I caution you against running before you can walk.


Featured Image: Paulo Bobita/Search Engine Journal

Stop Guessing. Start Converting: The Key To Smarter Lead Generation In 2025 via @sejournal, @Juxtacognition

Marketers have always relied on data to fine-tune their strategies.

But for years, that data has been based on assumptions made from broad industry benchmarks, competitor insights, and vague trends.

The result?

Marginal conversion improvements and a constant struggle to truly connect with the audience.

Better Leads → More Sales In 2025: How To Analyze Leads To Improve Marketing Performance from CallRail shows you how to transform customer conversations into actionable insights that improve your marketing performance at every stage of the funnel.

And you can download your copy right now.

What If You Could Stop Guessing & Start Marketing With Precision?

The secret lies in the conversations your customers are already having with you.

Every call, chat, and interaction holds valuable insights and the exact words, concerns, and motivations that drive your audience to take action.

Yet, most businesses overlook this goldmine of data.

Instead, they continue to make decisions based on surface-level analytics like click-through rates, page views, and lead form submissions.

And while that strategy works, it’s only part of the story. Why?

Image by Paulo Bobita/Search Engine Journal, March 2025

While numbers show what’s happening, your customers’ conversations can tell you why a customer did or didn’t convert and what he or she was thinking.

By leveraging these conversations and other first-party conversational data, you can unlock real insights, refine your messaging, and optimize your marketing, sales, customer service, and more.

Without the guesswork.

The Power Of Listening To Your Customers

Better Leads → More Sales In 2025: How To Analyze Leads To Improve Marketing Performance shows you how to make the most of those insights by providing practical, step-by-step strategies for improving lead quality, increasing conversions, and refining your marketing approach.

You’ll learn how to craft messaging that resonates with your audience, optimize funnels to reflect actual user behavior, and uncover friction points before they impact conversions.

From improving marketing copy to boosting customer retention and increasing Customer Lifetime Value (CLV), the strategies outlined are practical and results-driven.

You’ll be able to move away from guesswork with confidence and build marketing campaigns that feel relevant, personal, and persuasive, leading to higher-quality leads and more sales.

The Future Of Marketing Is Data-Driven And Customer-Focused

Brands that win in 2025 won’t be those with the biggest ad budgets.

They’ll be the ones that listen.

When you understand your customers’ frustrations, their needs, and the exact words they use to describe their problems, you can craft campaigns that feel personal, relevant, and persuasive.

This isn’t just about getting more leads.

It’s about getting the right leads and turning them into loyal customers.

What You’ll Learn In This Ebook:

  • Uncover Customer Insights: Learn how to extract powerful insights from customer conversations, sentiment analysis, and first-hand interactions.
  • Improve Marketing Messaging: Use the language your audience naturally speaks to create high-converting ads, landing pages, and content.
  • Optimize Your Lead Generation Funnels & Customer Journeys: Build a pipeline that reflects real customer behavior. Not assumptions.
  • Reduce Friction & Increase Conversions: Identify barriers before they impact your bottom line.
  • Increase CLV & Customer Lifespan: Find upsell opportunities and improve customer retention using call and chat transcript analysis.

Why This Matters

  • Marketing is evolving. Customers expect brands to understand them and not just sell to them.
  • Data beats guesswork. First-party conversational data gives you direct access to what your customers truly care about.
  • Better insights = higher conversions. When your message aligns with customer needs, engagement and sales increase.

Want to put these insights to work for your business?

Download your free copy today and start turning customer conversations into your most powerful marketing asset.


Featured Image: Paulo Bobita/Search Engine Journal

Data Shows Google AIO Is Citing Deeper Into Websites via @sejournal, @martinibuster

New data from BrightEdge that was shared with Search Engine Journal shows that Google’s AI Overviews (AIO) in March 2025 have rapidly expanded in size and are shifting how traffic is distributed across search results. The analysis suggests that deep, specialized content is likelier to be cited than homepages.

This shows that AIO is becoming more precise about the answers it is giving, aligning with the concepts of Predictive Summaries and Grounding Links that Google recently shared with marketers at Search Central NYC.

Google has traditionally favored showing precise answers to questions, something that the home pages of websites generally cannot do. It makes sense that BrightEdge’s data reflects the kind of precise linking that Predictive Summaries and Grounding Links display in AIO.

Google expanded the pixel height of AIO by 18.26% in the first two weeks of March. Some may rightly note that this reduces the visibility of outbound links in the organic search results, but it’s important to put that into context: Google’s AIO also contains outbound links, and those links are highly precise and contextually relevant.

The expansion of AIO size was not across the board. Industry-specific increases in AI Overview size:

  • Travel: +39.49%
  • B2B Tech: +37.13%
  • Education: +35.49%
  • Finance: +32.89%

Strategic Response for SEOs And Publishers

BrightEdge suggests that publishers and SEOs monitor performance metrics to track changes in traffic, impressions, CTRs, and clicks to evaluate how AI Overviews may be influencing traffic trends. Above all, try to identify sales or revenue trends, because those, not traffic, are the most important metrics.

Although it may be useful to create citable content, Google is generally summarizing content and then linking to where users can read more. Now more than ever, it’s important to be aware of user trends in your industry and to anticipate them, including the context of a user’s search. Jono Alderson recently suggested targeting users at the very early stage of the consumer journey in order to get ahead of AI-based citations.

Importance of In-Depth, Specialized Content

Google AIO shows a citation preference for deep-linked content, meaning pages two or more clicks from the home page (2+ deep). 82.5% of clicks went to 2+ deep pages, while home pages accounted for less than 0.5% of all clicks.

86% of cited pages ranked for only one keyword, often a high-volume one. This represents an opportunity to win traffic from high-volume keywords: the median search volume for citation-triggering keywords was 15,300 monthly searches, and 19% of those keywords exceed 100,000 monthly searches.

Implications For Technical SEO And Content Optimization

BrightEdge suggests that full site indexing is critical for AI Overviews, ensuring that every page is available to be cited as a potential source. Even older and otherwise overlooked content may gain value, especially if it’s reviewed and updated so that it reflects the most current information and is suitable to be cited.

Google has been citing deeper content for many years, and the age of home page primacy has long been over, except in local search. With home pages accounting for only half a percent of clicks from AIO, optimizing inner pages is now more important than ever.

Takeaways:

The following are the top takeaways from the data:

  • Google’s AI Overviews are rapidly expanding in visual size on the SERP
  • Industries like Travel, B2B Tech, Education, and Finance are experiencing the fastest AI Overview growth
  • Deeper, more specific content is overwhelmingly favored for AI citations over homepages
  • Pages cited in AI Overviews often surface for just one keyword—frequently high-volume
  • Technical SEO and full-site indexing are now essential for brand visibility in AI-driven search

Google’s AI Overviews are not just expanding in size; they are improving the contextual relevance of outbound links to websites. Optimizing for AIO should include keeping older content fresh and up to date so it stays relevant and useful.

BrightEdge data shared with Search Engine Journal has not been published but a monthly updated guide to AIO is available.

Featured Image by Shutterstock/B..Robinson

Google On Scaled Content: “It’s Going To Be An Issue” via @sejournal, @martinibuster

Google’s John Mueller and Danny Sullivan discussed why AI-generated content is problematic, citing the newly updated quality rater guidelines and sharing examples of how AI can be used in a positive way that adds value.

Danny Sullivan, known as Google Search Liaison, spoke about the topic in more detail, providing an example of what a high quality use of AI generated content is to serve as a contrast to what isn’t a good use of it.

Update To The Quality Rater Guidelines

The quality rater guidelines (QRG) are a handbook created by Google to guide the third-party quality raters who evaluate tests of changes to Google’s search results. The QRG was recently updated and now includes guidance about AI-generated content, folded into a section about content created with little effort or originality.

Mueller discussed AI generated content in the context of scaled content abuse, noting that the quality raters are taught to rate that kind of content as low quality.

The new section of the QRG advises the raters:

“The lowest rating applies if all or almost all of the MC on the page (including text, images, audio, videos, etc) is copied, paraphrased, embedded, auto or AI generated, or reposted from other sources with little to no effort, little to no originality, and little to no added value for visitors to the website. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.”

Doesn’t Matter How It’s Scaled: It’s Going To Be An Issue

Danny Sullivan, known as Google Search Liaison, started his part of the discussion by saying that to Google, AI generated content is no different than scaled content tactics from the past, comparing it to the spam tactics of 2005 when Google used statistical analysis and other methods to catch scaled content. He also emphasized that it doesn’t matter how the content was scaled.

According to my notes, here’s a paraphrase of what he said:

“The key things are, large amounts of unoriginal content and also no matter how it’s created.

Because like, ‘What are you going to do about AI? How are you going to deal with all the AI explosion? AI can generate thousands of pages?’

Well 2005 just called, it’d like to explain to you how human beings can generate thousands of pages overnight that look like they’re human generated because they weren’t human generated and etcetera, etcetera, etcetera.

If you’ve been in the SEO space for a long time, you well understand that scaled content is not a new type of thing. So we wanted to really stress: we don’t really care how you’re doing this scaled content, whether it’s AI, automation, or human beings. It’s going to be an issue.

So those are things that you should consider if you’re wondering about the scaled content abuse policy and you want to avoid being caught by it.”

How To Use AI In A Way That Adds Value

A helpful part of Danny’s session is that he offered an example of a positive use of AI, citing how retailers offer a summary of actual user reviews that conveys overall user sentiment about a product without shoppers having to read each review. This is an example of AI providing added value, as opposed to being the entire main content.

This is from my notes of what he said:

“When I go to Amazon, I skip down to the reviews and the reviews have a little AI-generated thing at the top that tells me what the users generally think, and I’m like, this is really helpful.

And the thing that’s really helpful to me about it is, it’s AI applied to original content, the reviews, to give me a summary. That was added value for me and unique value for me. I liked it.”

As Long As It’s High Quality….

Danny next discussed how Google tried to put out a detailed policy about AI-generated content, but he said it was misconstrued by parts of the SEO community to mean that AI-generated content is fine as long as it is quality content.

In my 25 years of SEO experience, let me tell you: whenever an SEO tells you that a tactic is fine “as long as it’s quality,” run. The “as long as it’s quality” excuse has been used to justify low-quality SEO practices like reciprocal links, directory links, paid links, and guest posts – if it’s not already an SEO joke, it should be.

Danny continued:

“And then people’s differentiation of what’s quality is all messed up. And they say, ‘Google doesn’t care if it’s AI!’ And that is not really what we said.

We didn’t say that.”

Don’t Mislead Yourself About Quality Of Scaled Content

Danny advised that anyone using artificially generated content should think about two things to use as tests for whether it’s a good idea:

  1. The motivation for mass-generated content.
  2. The unoriginality of the scaled content.

Traffic Motivated Content

The motivation shouldn’t be that it will bring more traffic. The motivation should be that there’s a value-add for site visitors.

This is how Danny Sullivan explained it, according to my notes:

“Any method that you undertake to mass generate content, you should be carefully thinking about it. There’s all sorts of programmatic things, maybe they’re useful. Maybe they’re not. But you should think about it.

And the thing to especially think about is if you’re primarily going into it to game search traffic.

Like, if the primary intent of the content was, ‘I’m going to get that traffic’ and not, ‘some user actually expected it’ if they ever came to my website directly. That’s one of the many things you can use to try to determine it.”

Originality Of Scaled Content

SEOs who praise their AI-generated content lose their enthusiasm when the content covers a topic they’re actually expert in, and they will concede that it’s not as smart as they are. What’s going on is that if you are not an expert, you lack the expertise to judge the credibility of the AI-generated content.

AI is trained to crank out the next likeliest word in a series of words, a level of unoriginality so extreme that only a computer can accomplish it.

Sullivan next offered a critique of the originality of AI-generated content:

“The other thing is, is it unoriginal?

If you are just using the tool saying, ‘Write me 100 pages on the 100 different topics that I got because I ran some tool that pulled all the People Also Asked questions off of Google and I don’t know anything about those things and they don’t have any original content or any value. I just kind of think it’d be nice to get that traffic.’

You probably don’t have anything original.

You’re not necessarily offering anything with really unique value with it there.

A lot of AI tools, or other tools, are very like human beings, because they’ve read a lot of human writing. They write really nice generic things that read very well, as if they are quality and answer what I’m kind of looking for, but they’re not necessarily providing value.

And sometimes people’s ideas of quality differ, but that’s not the key point when it comes to the policy we have, especially because these days some people would tell you that it’s quality.”

Takeaways:

  • Google doesn’t “care how you’re doing this scaled content, whether it’s AI, automation, or human beings. It’s going to be an issue.”
  • The QRG explicitly includes AI-generated content in its criteria for ‘Lowest’ quality ratings, signaling that this is something Google is concerned about.
  • Ask if the motivation for using AI-generated content is primarily to drive search traffic or to help users
  • Originality and value-add are important qualities of content to consider

Social media optimization with Yoast SEO

Are you tired of your social media efforts not achieving the results you hoped for? It might be time to scale up your social media optimization. Your content might be good, but various enhancements could make it stand out. For instance, your content needs proper metadata for X, Facebook, and the like to appear properly on each platform. Yoast SEO can help you do this quickly.

Sharing your freshly written (or optimized) content on social media is important. It helps you stay in touch with your audience and update them on news about your business and related topics. But to get their attention, you need to optimize your social media posts before you share them.

In this article, we’ll explain how you can optimize your posts for Facebook and X, and how our plugin can help you with that! Lastly, we’ll briefly discuss Pinterest and the use of Rich Pins.

Social media optimization is about improving how you use social media platforms to build your online presence. You do this not only by creating and sharing content for every platform you’d like to be active on but also by optimizing that content in such a way that you get traffic to your site. The goal is to build strong connections with your audience and to keep them engaged.

Social media optimization starts with well-optimized, highly relevant content that grabs attention. For most platforms, images and video are best suited for this. You can test various formats and ideas to see what your audience prefers. You can use any of the social media analytics tools to do this. Also, find the best times to publish your content to get the best engagement. Your posts should also have metadata for specific platforms like X Cards or OpenGraph for Facebook to help these platforms understand your content.

After posting, remember to engage with your audience. Respond to comments, participate in discussions, and listen to what people say about you and your content. Track your best-performing posts and use data to improve your content to stay relevant and engaging.

Promoting your content on various platforms makes sense in most cases. Remember to share your articles, videos, and other content on whatever social media network makes sense for you and your audience. Read this article if you don’t know where to begin with your social media strategy.

Facebook and other social media

Years ago, Facebook introduced OpenGraph to determine which elements of your page you want to show when someone shares that page. Several social networks and search engines use Facebook’s OpenGraph, but the main reason for adding it is for Facebook itself. Facebook’s OpenGraph support is continuously evolving, but the basics are simple. With a few pieces of metadata, you declare:

  • What type of content is this?
  • What’s the locale?
  • What’s the canonical URL of the page?
  • What’s the name of the site and the title of the page?
  • What’s the page about?
  • Which image/images should be shown when this post or page is shared on Facebook?
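Mapped to markup, those questions correspond roughly to the following OpenGraph tags (all values here are placeholders):

```html
<meta property="og:type" content="article" />
<meta property="og:locale" content="en_US" />
<meta property="og:url" content="https://example.com/post" />
<meta property="og:site_name" content="Example Site" />
<meta property="og:title" content="Example post title" />
<meta property="og:description" content="What the page is about." />
<meta property="og:image" content="https://example.com/social-image.jpg" />
```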

Social media preview in Yoast SEO

When you use Yoast SEO, most of the values above are filled out automatically based on your post’s data. It uses your site’s locale, the site name, the SEO title, the canonical URL, the meta description value, etc., to fill out most of the required OpenGraph tags. You can see what your post will look like when you click on ‘Social media appearance’ in the Yoast SEO sidebar:

The ‘Social media appearance’ button in the sidebar opens the modal for this feature.

This preview tab allows you to edit how your Facebook post is shown when shared. Our plugin lets you change your social image, title, and description in your preview. This makes your social media optimization much quicker and easier, as you won’t have to leave your post to make these changes.

Make more impact on social media with Yoast SEO Premium!

Get Yoast SEO Premium today and make it quick and easy to manage how your social media snippets look.

If you use the options for social media optimization in Yoast SEO, your Facebook post could look like this when you share the URL of a post or page:

Example of a Facebook post as seen on Yoast’s profile

So what do you need to do?

  1. First, go to Yoast SEO → Settings → Site representation, and fill in your social media accounts.
  2. Afterward, go to Yoast SEO → Settings → Social sharing, and make sure OpenGraph is enabled.
  3. Then, set a good default image under the site basics settings. This image is used when you have a post or page that does not contain an image. It’s important to set this image to ensure that every post or page has an image when shared. Facebook is forgiving when uploading images, but 1200px by 630px should work well.
  4. Lastly, follow the steps in this article to go to your personal WordPress profile and add a link to your Facebook profile, if you want to associate your Facebook profile with your content. If you do, be sure to also enable the ‘Follow’ functionality on Facebook.

You can complete all of these steps in a few minutes. After that, Yoast SEO takes all of the work out of your hands. However, it is important to remember that Facebook sometimes doesn’t immediately pick up changes. So, if you want to “debug” how Facebook perceives your page, enter your URL in the Facebook Sharing Debugger and click the Debug button. If the preview that you see there isn’t the latest version, you can try the Scrape again button. But remember that it can take a while for Facebook to see your changes.

OpenGraph for Video Content

If you have video content, you must do more work unless you use our Video SEO plugin. This plugin handles all the needed metadata and lets you share your videos on Facebook.

X

X’s functionality is quite similar to Facebook’s; it’s called X Cards. X “falls back” on Facebook OpenGraph for several of these values, so we don’t have to include everything. But it still is quite a bit. We’re talking about:

  • the type of content/type of card
  • an image
  • a description
  • the X account of the site/publisher
  • the X account of the author
  • the “name” for the domain to show in an X card
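In markup, an X Card adds tags like these on top of the OpenGraph ones. Note that the tags still use the twitter: prefix; the handles below are placeholders:

```html
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:site" content="@examplesite" />
<meta name="twitter:creator" content="@exampleauthor" />
<!-- Title, description, and image fall back to the OpenGraph tags above -->
```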

X preview in Yoast SEO

As you might have seen in Yoast SEO, optimizing your X listings is also an option. Simply click that tab to preview how your page appears when it gets shared to X. By default, the plugin uses the title, description, and image you entered in the search appearance preview. Of course, this tab allows you to change these for your X post.

Here’s an example of what your post could look like with all the required metadata our plugin helps you add:

An example of a post on Yoast’s X profile

So what do you need to do?

Ensure X card metadata is enabled by going to Yoast SEO → Settings → Site features → Social sharing and activating the X feature. This leaves a couple of values for you to fill out in the settings, which you can do using this guide on activating X Cards in Yoast SEO.

Do you spend a lot of time tweaking the preview appearance of each page or post? You’ll be glad to know that Yoast SEO Premium also offers a very helpful feature: the ability to set default templates for your social snippets. With this powerful feature, you can design the ideal social appearance for all your content and feel certain that the output will always look great to whoever is sharing it.

Use variables to set up templates to optimize your social media postings

What about Pinterest?

Pinterest’s Rich Pins allow for OpenGraph markup as well. Add variables like product name, availability, price, and currency to your page to create a rich pin. As this is mainly interesting for products, we decided to add functionalities to create rich pins to our Yoast WooCommerce SEO plugin.

Read more: How to promote your products and earn money on Pinterest »

Conclusion on social media optimization

So, go ahead and use Yoast SEO to optimize your social media. It isn’t very hard; it just takes a few minutes of your time, and you will reap the rewards immediately. As these social networks add new features, we’ll keep our plugin and this article up-to-date. So, be sure to update the Yoast SEO plugin regularly.

Keep reading: Social Media Strategy: where to begin? »

The Download: generative AI therapy, and the future of 23andMe’s genetic data

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

The first trial of generative AI therapy shows it might help with depression

The first clinical trial of a generative AI therapy bot suggests it was as effective as human therapy for people with depression, anxiety, or risk for developing eating disorders. Even so, it doesn’t give a go-ahead to the dozens of companies hyping such technologies while operating in a regulatory gray area. Read the full story.

—James O’Donnell

How a bankruptcy judge can stop a genetic privacy disaster

—Keith Porcaro

The fate of 15 million people’s genetic data rests in the hands of a bankruptcy judge now that 23andMe has filed for bankruptcy. But there’s still a small chance of writing a better ending for users—and it’s a simple fix. Read the full story.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Meet the online activists fighting back against ICE raids
Their networks are warning migrants about ICE officer hotspots in major cities. (WP $)
+ Noncitizens are growing increasingly anxious. (NPR)

2 US health experts were ordered to bury a measles forecast
The assessment warned the risk of catching the virus was high in areas with lower vaccination rates. (ProPublica)
+ The former US covid chief has called the outbreak wholly preventable. (Politico)
+ How measuring vaccine hesitancy could help health professionals tackle it. (MIT Technology Review)

3 Donald Trump is confident a TikTok deal is forthcoming
Ahead of the impending deadline on Saturday. (Reuters)

4 China’s efforts to clean up air pollution are accelerating global warming
Its dirty air had been inadvertently cooling the planet. (New Scientist $)
+ Who’s to blame for climate change? It’s surprisingly complicated. (MIT Technology Review)

5 Brands are spending small amounts on X to appease Elon Musk
They’re doing what they can to avoid triggering a public fallout with the billionaire. (FT $)
+ Musk’s X has a new owner—it’s, err, Musk’s xAI. (CNBC)

6 Campaigners are calling to pause a mental health inpatient monitoring system
The Oxevision system remotely tracks patients’ breathing and heart rates. (The Guardian)
+ This AI-powered “black box” could make surgery safer. (MIT Technology Review)

7 The US and China are locked in a race to produce the first useful humanoid robot
The first to succeed will dominate the future of many labor-intensive industries. (WSJ $)
+ Beijing is treating humanoid robots as a major future industry. (WP $)

8 Data center operators are inking solar power deals
It’s a proven, clean technology that is relatively low-cost. (TechCrunch)
+ The cost of AI services is dropping. (The Information $)
+ Why the US is still trying to make mirror-magnified solar energy work. (MIT Technology Review)

9 H&M plans to create digital replicas of its models
Which means the retailer could outsource entire photoshoots to AI. (NYT $)
+ The metaverse fashion stylists are here. (MIT Technology Review)

10 What it’s like to drive a Tesla Cybertruck in Washington DC
Expect a whole lot of abuse. (The Atlantic $)
+ Protestors are gathering at Tesla showrooms across America. (Insider $)

Quote of the day

“Viruses don’t need a passport.”

—Dr William Schaffner, an infectious disease expert at Vanderbilt University Medical Center, warns CNN that the US measles outbreak could spread widely to other countries.

The big story

Marseille’s battle against the surveillance state

June 2022

Across the world, video cameras have become an accepted feature of urban life. Many cities in China now have dense networks of them, and London and New Delhi aren’t far behind. Now France is playing catch-up.

Concerns have been raised throughout the country. But the surveillance rollout has met special resistance in Marseille, France’s second-biggest city.

It’s unsurprising, perhaps, that activists are fighting back against the cameras, highlighting the surveillance system’s overreach and underperformance. But are they succeeding? Read the full story.

—Fleur Macdonald

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ The online pocket computer museum is exceptionally charming.
+ There’s an entirely new cat color emerging, and scientists have finally worked out why.
+ Experiencing Bluesky ‘skeets’ posted in real time is a seriously trippy business.
+ Never underestimate the power of a good deed.

Google’s Martin Splitt Reveals 3 JavaScript SEO Mistakes & Fixes via @sejournal, @MattGSouthern

Google’s Martin Splitt recently shared insights on how JavaScript mistakes can hurt a website’s search performance.

His talk comes as Google Search Advocate John Mueller also urges SEO pros to learn more about modern client-side technologies.

Mistake 1: Rendered HTML vs. Source HTML

During the SEO for Paws Conference, a live-streamed fundraiser by Anton Shulke, Splitt drew attention to a trend he’s noticing.

Many SEO professionals still focus on the website’s original source code even though Google uses the rendered HTML for indexing. Rendered HTML is what you see after JavaScript has finished running.

Splitt explains:

“A lot of people are still looking at view source. That is not what we use for indexing. We use the rendered HTML.”

This is important because JavaScript can change pages by removing or adding content. Understanding this can help explain some SEO issues.
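Search Console’s URL Inspection tool is the authoritative view, but you can approximate the comparison locally by rendering a page in a headless browser and inspecting the post-JavaScript DOM. A minimal sketch using Puppeteer (our example, not a Google tool; example.com is a placeholder):

```javascript
// Compare "view source" with the rendered HTML that exists after JavaScript runs.
const puppeteer = require('puppeteer');

async function getRenderedHtml(url) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' }); // wait for JS to settle
  const rendered = await page.content(); // the post-JavaScript DOM
  await browser.close();
  return rendered;
}

getRenderedHtml('https://example.com').then((html) => {
  // Content present here but missing from "view source" was added by JavaScript.
  console.log(html.includes('<img') ? 'images in rendered HTML' : 'images missing');
});
```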

Mistake 2: Error Pages Being Indexed

Splitt pointed out a common error with single-page applications and JavaScript-heavy sites: they often return a 200 OK status for error pages.

This happens because the server sends a 200 response before the JavaScript checks if the page exists.

Splitt explains:

“Instead of responding with 404, it just responds with 200 … always showing a page based on the JavaScript execution.”

When error pages get a 200 code, Google indexes them like normal pages, hurting your SEO.

Splitt advises checking server settings to handle errors properly, even when using client-side rendering.
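In practice, “handling errors properly” means the server checks whether the requested resource exists before answering with a 200. A hedged sketch using Express, where findProduct() and renderApp() are stand-ins for your own code:

```javascript
const express = require('express');
const app = express();

app.get('/products/:id', async (req, res) => {
  const product = await findProduct(req.params.id); // hypothetical data lookup
  if (!product) {
    // A real 404 instead of a 200 shell that JavaScript later fills with an
    // error message, so Google doesn't index the error page as normal content.
    return res.status(404).send('Not found');
  }
  res.status(200).send(renderApp(product)); // hypothetical rendering step
});

app.listen(3000);
```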

Mistake 3: Geolocation Request Issue

Another problem arises when sites ask users for location or other permissions.

Splitt says Googlebot will always refuse the request if a site relies on geolocation (or similar requests) without a backup plan.

Splitt explains:

“Googlebot does not say yes on that popup. It says no on all these requests … so if you request geolocation, Googlebot says no.”

Without alternative content, the page can appear blank to Googlebot, meaning nothing gets indexed. This can turn into a grave SEO mistake.
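The straightforward fix is to give the error callback real work to do, rendering sensible default content when the permission request is denied, as it always is for Googlebot. A minimal sketch, where renderStores() stands in for your own rendering logic:

```javascript
function showNearbyStores() {
  navigator.geolocation.getCurrentPosition(
    // User granted permission: personalize by location.
    (position) => renderStores(position.coords),
    // Denied (including by Googlebot): fall back to a default location
    // so the page still renders indexable content.
    () => renderStores({ latitude: 40.7, longitude: -74.0 })
  );
}
```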

How to Debug JavaScript for SEO

Splitt shared a few steps to help diagnose and fix JavaScript issues:

  1. Start with Search Console: Use the URL Inspection tool to view the rendered HTML.
  2. Check the Content: Verify if the expected content is there.
  3. Review HTTP Codes: Look at the status codes in the “More info” > “Resources” section.
  4. Use Developer Tools: Open your browser’s developer tools. Check the “initiator” column in the Network tab to see which JavaScript added specific content.

Splitt adds:

“The initiator is what loaded it. If it’s injected by JavaScript, you can see which part of the code did it.”

Following these steps can help you find the problem areas and work with your developers to fix them.

See Splitt’s full talk in the recording below:

A Shift in SEO Skills

Splitt’s advice fits with Mueller’s call for SEOs to broaden their skill set.

Mueller recently suggested that SEO professionals learn about client-side frameworks, responsive design, and AI tools.

Mueller stated:

“If you work in SEO, consider where your work currently fits in … if your focus was ‘SEO at server level,’ consider that the slice has shrunken.”

Modern JavaScript techniques create new challenges that old SEO methods cannot solve alone. Splitt’s real-world examples show why understanding these modern web practices is now critical.

What This Means For SEO Professionals

Both Google Advocates point to a clear trend: SEO now requires more technical skills. As companies look for professionals who can blend SEO and web development, the demand for these modern skills is growing.

To keep up, SEO pros should:

  • Learn How JavaScript Affects Indexing: Know the difference between source and rendered HTML.
  • Master Developer Tools: Use tools like Search Console and browser developer tools to spot issues.
  • Collaborate with Developers: Work together to build sites that serve users and search engines well.
  • Broaden Your Skillset: Add client-side techniques to your traditional SEO toolkit.

Looking Ahead

As the web evolves, so must the skills of SEO professionals. However, leveling up your knowledge doesn’t have to be intimidating.

This fresh look at JavaScript’s role in SEO shows that even simple changes can have a big impact.


Featured Image: BestForBest/Shutterstock

Search Volume Tools for Marketers

Understanding keyword trends is key for organic and paid search performance.

Search terms rise and fall in popularity. The movements impact decisions such as naming products, launching marketing campaigns, and developing content.

Here are four tools to identify year-over-year keyword trends.

SEOmonitor

SEOmonitor is a rank tracking platform that pulls search volume from Google Ads’ Keyword Planner.

The YoY search trends (expressed as a positive or negative percentage) compare the keyword volumes from the last month to the same month of the previous year. If the volume exceeds +200%, the trend appears as a multiplier.

Pricing starts at €99 ($107) per month with a free trial.

Screenshot of an SEOmonitor comparison chart.

SEOmonitor compares last month’s keyword volumes to those of the same month in the previous year. Click image to enlarge.

Semrush

Semrush, a multi-feature search platform, provides comprehensive keyword volume info, including keyword magic (for discovery), competitive analysis, and rank tracking.

Semrush says it relies on multiple sources for its search volume reports — Keyword Planner and “clickstream data acquired from reliable sources.” It’s the only tool on this list with a proprietary metric beyond Google-supplied metrics.

Semrush retains historical search volume data for many years, handy for analyzing long-term keyword trends. I’ve pulled high-volume keyword stats as far back as 2012, for example.

Pricing for Semrush starts at $140 per month with a free trial.

Semrush retains historical search volume data for years, such as this example for “ecommerce” in 2012. Click image to enlarge.

Advanced Web Ranking

Advanced Web Ranking is a rank-tracking platform that reports the average search volume for any query based on Keyword Planner data.

In the ranking tracking chart, the percentage displayed alongside Search Volume shows the YoY movements — queries from the latest full month compared to the same month last year.

Prices for Advanced Web Ranking start at $99 per month with a free trial.

Advanced Web Ranking reports average YoY search volume based on Google’s Keyword Planner for any query. Click image to enlarge.

Glimpse (+Google Trends)

Glimpse offers a Chrome extension that enhances Google Trends reports with search volume info and movements in the past month, quarter, and year.

Glimpse’s Chrome extension enhances Google Trends reports with monthly, quarterly, and annual search volume, such as this example for “ecommerce.” Click image to enlarge.

The tool also provides keyword seasonality analysis, such as this assessment for “ecommerce”:

Interest in “ecommerce” remains fairly stable throughout the year, with slight increases around mid-summer in July, suggesting a potential increase in online shopping activities or preparations for upcoming sales events. However, the trend demonstrates minor declines as the year progresses towards the holiday season, particularly in October, November, and December, possibly due to increasing consumer focus on traditional retail or other seasonal activities during the holiday period.

Google Trends reports only popular keywords, not niche or long-tail queries. Thus Glimpse is helpful mostly for generic competitive terms.

Glimpse’s Chrome extension is free for 10 monthly searches. Paid plans with unlimited searches start at $49 per month.