Ask An SEO: How To Implement Faceted Navigation Without Hurting Crawl Efficiency via @sejournal, @kevgibbo

This week’s question tackles the potential SEO fallouts when implementing faceted navigation:

“How can ecommerce sites implement SEO-friendly faceted navigation without hurting crawl efficiency or creating index bloat?”

Faceted navigation is a game-changer for user experience (UX) on large ecommerce sites. It helps users quickly narrow down what they’re looking for, whether it’s a size 8 pair of red road running trainers for women, or a blue, waterproof winter hiking jacket for men.

For your customers, faceted navigation makes huge inventories feel manageable and, when done right, enhances both UX and SEO.

However, when these facets create a new URL for every possible filter combination, they can lead to significant SEO issues that harm your rankings, and waste valuable crawl budget if not managed properly.

How To Spot Faceted Navigation Issues

Faceted navigation issues often fly under the radar – until they start causing real SEO damage. The good news? You don’t need to be a tech wizard to spot the early warning signs.

With the right tools and a bit of detective work, you can uncover whether filters are bloating your site, wasting crawl budget, or diluting rankings.

Here’s a step-by-step approach to auditing your site for faceted SEO issues:

1. Do A Quick “Site:” Search

Start by searching on Google with this query: site:yourdomain.com.

This will show you all the URLs Google has indexed for your site. Review the list:

  • Does the number seem higher than the total pages you want indexed?
  • Are there lots of similar URLs, like ?color=red&size=8?

If so, you may have index bloat.

2. Dig Into Google Search Console

Check Google Search Console (GSC) for a clearer picture. Look under the “Pages” indexing report (formerly “Coverage”) to see how many pages are indexed.

Pay attention to the “Indexed, not submitted in sitemap” section for unintended filter-generated pages.

3. Understand How Facets Work On Your Site

Not all faceted navigation behaves the same. Make sure you understand how filters work on your site:

  • Are they present on category pages, search results, or blog listings?
  • How do filters stack in the URL (e.g., ?brand=ASICS&color=red)?

4. Compare Crawl Activity To Organic Visits

Some faceted pages drive traffic; others burn crawl budget without returns.

Use tools like Botify, Screaming Frog, or Ahrefs to compare Googlebot’s crawling behavior with actual organic visits.

If a page gets crawled a lot but doesn’t attract visitors, it’s a sign that it’s consuming crawl resources unnecessarily.
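The comparison above can be sketched in a few lines. This is a minimal, hypothetical example: in practice, the crawl counts would come from a log analyzer export (Botify, Screaming Frog) and the visit counts from your analytics platform; the data below is invented for illustration.

```python
# Flag URLs that Googlebot crawls heavily but that earn no organic
# visits -- candidates for wasted crawl budget.

def find_crawl_waste(crawl_counts, organic_visits, min_crawls=10):
    """Return URLs crawled at least `min_crawls` times with zero visits."""
    return sorted(
        url for url, crawls in crawl_counts.items()
        if crawls >= min_crawls and organic_visits.get(url, 0) == 0
    )

# Illustrative data only.
crawl_counts = {
    "/trainers": 120,
    "/trainers?color=red&size=8": 45,   # heavily crawled facet
    "/trainers?sort=best-sellers": 30,  # sort parameter, no demand
}
organic_visits = {"/trainers": 900, "/trainers?color=red&size=8": 0}

wasted = find_crawl_waste(crawl_counts, organic_visits)
print(wasted)
```

Any URL surfaced by a check like this is worth reviewing for noindexing, canonicalization, or crawl blocking.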

5. Look For Patterns In URL Data

Run a crawler to scan your site’s URLs. Check for repetitive patterns, such as endless combinations of parameters like ?price=low&sort=best-sellers. These are potential crawler traps and unnecessary variations.
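One quick way to surface those patterns is to tally which query parameters appear across your crawled URLs. A small sketch, using an invented URL list where a real audit would feed in the full crawler export:

```python
# Count how often each query parameter appears across crawled URLs.
# Parameters that show up across many URLs are crawler-trap candidates.
from collections import Counter
from urllib.parse import urlparse, parse_qs

urls = [
    "/trainers?price=low&sort=best-sellers",
    "/trainers?price=high&sort=best-sellers",
    "/trainers?color=red&price=low",
    "/jackets?sort=best-sellers",
]

param_counts = Counter()
for url in urls:
    for param in parse_qs(urlparse(url).query):
        param_counts[param] += 1

print(param_counts.most_common())
```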

6. Match Faceted Pages With Search Demand

To decide which SEO tactics to use for faceted navigation, assess the search demand for specific filters and whether unique content can be created for those variations.

Use keyword research tools like Google Keyword Planner or Ahrefs to check for user demand for specific filter combinations. For example (SV = monthly search volume):

  • White running shoes (SV 1000; index).
  • White waterproof running shoes (SV 20; index).
  • Red trail running trainers size 9 (SV 0; noindex).

This helps prioritize which facet combinations should be indexed.

If there’s enough value in targeting a specific query, such as product features, a dedicated URL may be worthwhile.

However, low-value filters like price or size should remain no-indexed to avoid bloated indexing.

The decision should balance the effort needed to create new URLs against the potential SEO benefits.

7. Log File Analysis For Faceted URLs

Log files record every request, including those from search engine bots.

By analyzing them, you can track which URLs Googlebot is crawling and how often, helping you identify wasted crawl budget on low-value pages.

For example, if Googlebot is repeatedly crawling deep-filtered URLs like /jackets?size=large&brand=ASICS&price=100-200&page=12 with little traffic, that’s a red flag.

Key signs of inefficiency include:

  • Excessive crawling of multi-filtered or deeply paginated URLs.
  • Frequent crawling of low-value pages.
  • Googlebot getting stuck in filter loops or parameter traps.

By regularly checking your logs, you get a clear picture of Googlebot’s behavior, enabling you to optimize crawl budget and focus Googlebot’s attention on more valuable pages.
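As a rough sketch of that log analysis, the snippet below counts Googlebot requests to parameterized (faceted) URLs. The log lines are fabricated examples in a simplified Apache combined-log style; real logs would be read from a file and the bot verified by reverse DNS rather than user-agent string alone.

```python
# Count Googlebot requests to parameterized URLs from raw log lines.
import re
from collections import Counter

log_lines = [
    '66.249.66.1 - - [01/May/2025] "GET /jackets?size=large&page=12 HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/May/2025] "GET /jackets HTTP/1.1" 200 900 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [01/May/2025] "GET /jackets?size=large&page=12 HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]

faceted_hits = Counter()
for line in log_lines:
    match = re.search(r'"GET (\S+) HTTP', line)
    # Only count Googlebot hits on URLs that carry query parameters.
    if match and "Googlebot" in line and "?" in match.group(1):
        faceted_hits[match.group(1)] += 1

print(faceted_hits)
```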

Best Practices To Control Crawl And Indexation For Faceted Navigation

Here’s how to keep things under control, so your site stays crawl-efficient and search-friendly.

1. Use Clear, User-Friendly Labels

Start with the basics: Your facet labels should be intuitive. “Blue,” “Leather,” “Under £200” – these need to make instant sense to your users.

Confusing or overly technical terms can lead to a frustrating experience and missed conversions. Not sure what resonates? Check out competitor sites and see how they’re labeling similar filters.

2. Don’t Overdo It With Facets

Just because you can add 30 different filters doesn’t mean you should. Too many options can overwhelm users and generate thousands of unnecessary URL combinations.

Stick to what genuinely helps customers narrow down their search.

3. Keep URLs Clean When Possible

If your platform allows it, use clean, readable URLs for facets like /sofas/blue rather than messy query strings like ?color[blue].

Reserve query parameters for optional filters (e.g., sort order or availability), and don’t index those.

4. Use Canonical Tags

Use canonical tags to point similar or filtered pages back to the main category/parent page. This helps consolidate link equity and avoid duplicate content issues.

Just remember, canonical tags are suggestions, not commands. Google may ignore them if your filtered pages appear too different or are heavily linked internally.

Faceted pages you want indexed should include a self-referencing canonical; those you don’t should canonicalize to the parent page.
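That rule can be expressed as a small function. A hedged sketch: the indexable set here is a placeholder you would populate from keyword demand data, and the parent-page logic assumes clean path-based facet URLs.

```python
# Decide the canonical target for a faceted URL: self-referencing for
# indexable facets, otherwise the parent category page.
from urllib.parse import urlparse

INDEXABLE_FACETS = {"/trainers/blue/leather"}  # assumed, from demand data

def canonical_for(url):
    path = urlparse(url).path
    if path in INDEXABLE_FACETS:
        return path  # self-referencing canonical
    # Fall back to the top-level category page.
    return "/" + path.strip("/").split("/")[0]

print(canonical_for("/trainers/blue/leather"))  # self-canonical
print(canonical_for("/trainers/blue_black"))    # canonicalizes to /trainers
```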

5. Create Rules For Indexing Faceted Pages

Break your URLs into three clear groups:

  • Index (e.g., /trainers/blue/leather): Add a self-referencing canonical, keep them crawlable, and internally link to them. These pages represent valuable, unique combinations of filters (like color and material) that users may search for.
  • Noindex (e.g., /trainers/blue_black): Use a noindex meta robots tag to remove them from the index while still allowing crawling. This is suitable for less useful or low-demand filter combinations (e.g., overly niche color mixes).
  • Block Crawl (e.g., filters with query parameters like /trainers?color=blue&sort=popularity): Use robots.txt, JavaScript, or parameter handling to prevent crawling entirely. These URLs are often duplicate or near-duplicate versions of indexable pages and don’t need to be crawled.
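The three groups above can be captured in a simple classifier. This is an illustrative sketch only: the rules assume parameterized URLs are always blocked and the indexable set is a stand-in for your own demand analysis.

```python
# Classify faceted URLs into index / noindex / block-crawl groups.
INDEXABLE = {"/trainers/blue/leather"}  # assumed, from keyword research

def classify(url):
    if "?" in url:
        return "block-crawl"  # parameterized filter: keep out of the crawl
    if url in INDEXABLE:
        return "index"        # valuable combination: self-canonical, linked
    return "noindex"          # low-demand combination: noindex, crawlable

print(classify("/trainers/blue/leather"))                # index
print(classify("/trainers/blue_black"))                  # noindex
print(classify("/trainers?color=blue&sort=popularity"))  # block-crawl
```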

6. Maintain A Consistent Facet Order

No matter the order in which users apply filters, the resulting URL should be consistent.

For example, /trainers/blue/leather and /trainers/leather/blue should result in the same URL, or else you’ll end up with duplicate content that dilutes SEO value.
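One common way to enforce that consistency is to sort facet segments alphabetically before building the URL, so both application orders normalize to the same address. A minimal sketch (function name is illustrative):

```python
# Normalize facet order so filter application order never creates
# duplicate URLs.
def normalize_facet_url(category, facets):
    return "/" + "/".join([category] + sorted(facets))

a = normalize_facet_url("trainers", ["blue", "leather"])
b = normalize_facet_url("trainers", ["leather", "blue"])
print(a, a == b)  # both orders yield the same URL
```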

7. Use Robots.txt To Conserve Crawl Budget

One way to reduce unnecessary crawling is by blocking faceted URLs through your robots.txt file.

That said, it’s important to know that robots.txt is more of a polite request than a strict rule. Search engines like Google typically respect it, but not all bots do, and some may interpret the syntax differently.

To prevent search engines from crawling pages you don’t want indexed, it’s also smart to ensure those pages aren’t linked to internally or externally (e.g., backlinks).

If search engines find value in those pages through links, they might still crawl or index them, even with a disallow rule in place.

Here’s a basic example of how to block a faceted URL pattern using the robots.txt file. Suppose you want to stop crawlers from accessing URLs that include a color parameter:

User-agent: *
Disallow: /*color*

In this rule:

  • User-agent: * targets all bots.
  • The * wildcard means “match anything,” so this tells bots not to crawl any URL containing the word “color.”

However, if your faceted navigation requires a more nuanced approach, such as blocking most color options but allowing specific ones, you’ll need to mix Disallow and Allow rules.

For instance, to block all color parameters except for “black,” your file might include:

User-agent: *
Disallow: /*color*
Allow: /*color=black*

A word of caution: This strategy only works well if your URLs follow a consistent structure. Without clear patterns, it becomes harder to manage, and you risk accidentally blocking key pages or leaving unwanted URLs crawlable.

If you’re working with complex URLs or an inconsistent setup, consider combining this with other techniques like meta noindex tags or parameter handling in Google Search Console.
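Before shipping wildcard rules like these, it helps to sanity-check which URLs they would block. The snippet below is a rough approximation, not a full robots.txt parser: it mimics Google's behavior of resolving Allow/Disallow conflicts by the most specific (longest) matching rule, using fnmatch-style wildcards.

```python
# Approximate which paths the Disallow/Allow wildcard rules would block.
from fnmatch import fnmatch

RULES = [("Disallow", "/*color*"), ("Allow", "/*color=black*")]

def is_blocked(path):
    best = None
    for verb, pattern in RULES:
        # Keep the longest (most specific) matching rule, as Google does.
        if fnmatch(path, pattern) and (best is None or len(pattern) > len(best[1])):
            best = (verb, pattern)
    return best is not None and best[0] == "Disallow"

print(is_blocked("/trainers?color=blue"))              # blocked
print(is_blocked("/trainers?color=black&sort=cheap"))  # allowed
print(is_blocked("/trainers"))                         # allowed
```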

8. Be Selective With Internal Links

Internal links signal importance to search engines. So, if you link frequently to faceted URLs that are canonicalized or blocked, you’re sending mixed signals.

Consider using rel=”nofollow” on links you don’t want crawled – but be cautious. Google treats nofollow as a hint, not a rule, so results may vary.

Point to only canonical URLs within your website wherever possible. This includes dropping parameters and slugs from links that are not necessary for your URLs to work.

You should also prioritize pillar pages; the more inlinks a page has, the more authoritative search engines will deem that page to be.

In 2019, Google’s John Mueller said:

“In general, we ignore everything after hash… So things like links to the site and the indexing, all of that will be based on the non hash URL. And if there are any links to the hashed URL, then we will fold up into the non hash URL.”

9. Use Analytics To Guide Facet Strategy

Track which filters users actually engage with, and which lead to conversions.

If no one ever uses the “beige” filter, it may not deserve crawlable status. Use tools like Google Analytics 4 or Hotjar to see what users care about and streamline your navigation accordingly.

10. Deal With Empty Result Pages Gracefully

When a filtered page returns no results, respond with a 404 status. If it’s a temporary out-of-stock issue, show a friendly message saying so and return a 200 instead.

This helps avoid wasting crawl budget on thin content.
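The status logic above reduces to a tiny decision, sketched here with hypothetical function and parameter names:

```python
# Decide the HTTP status for a filtered results page: 404 for genuinely
# empty facets, 200 when the emptiness is a temporary stock issue.
def status_for_filtered_page(result_count, temporarily_out_of_stock=False):
    if result_count > 0 or temporarily_out_of_stock:
        return 200
    return 404

print(status_for_filtered_page(0))        # hard-empty facet -> 404
print(status_for_filtered_page(0, True))  # temporary out-of-stock -> 200
```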

11. Using AJAX For Facets

When you interact with a page – say, filtering a product list, selecting a color, or typing in a live search box – AJAX lets the site fetch or send data behind the scenes, so the rest of the page stays put.

It can be really effective to implement facets client-side via AJAX, which doesn’t create multiple URLs for every filter change. This reduces unnecessary load on the server and improves performance.

12. Handling Pagination In Faceted Navigation

Faceted navigation often leads to large sets of results, which naturally introduces pagination (e.g., ?category=shoes&page=2).

But when combined with layered filters, these paginated URLs can balloon into thousands of crawlable variations.

Left unchecked, this can create serious crawl and index bloat, wasting search engine resources on near-duplicate pages.

So, should paginated URLs be indexed? In most cases, no.

Pages beyond the first page rarely offer unique value or attract meaningful traffic, so it’s best to prevent them from being indexed while still allowing crawlers to follow links.

The standard approach here is to use noindex, follow on all pages after page 1. This ensures your deeper pagination doesn’t get indexed, but search engines can still discover products via internal links.

When it comes to canonical tags, you’ve got two options depending on the content.

If pages 2, 3, and so on are simply continuations of the same result set, it makes sense to canonicalize them to page 1. This consolidates ranking signals and avoids duplication.

However, if each paginated page features distinct content or meaningful differences, a self-referencing canonical might be the better fit.

The key is consistency – don’t canonicalize page 2 to page 1 while page 3 points to itself, for example.

As for rel=”next” and rel=”prev”: while Google no longer uses these signals for indexing, they still offer UX benefits and remain valid HTML markup.

They also help communicate page flow to accessibility tools and browsers, so there’s no harm in including them.

To help control crawl depth, especially in large ecommerce sites, it’s wise to combine pagination handling with other crawl management tactics:

  • Block excessively deep pages (e.g., page=11+) in robots.txt.
  • Use internal linking to surface only the first few pages.
  • Monitor crawl activity with log files or tools like Screaming Frog.

For example, a faceted URL like /trainers?color=white&brand=asics&page=3 would typically:

  • Canonical to /trainers?color=white&brand=asics (page 1).
  • Include noindex, follow.
  • Use rel=”prev” and rel=”next” where appropriate.
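Those three rules can be generated programmatically. A hedged sketch under the assumptions above (canonical to page 1, noindex/follow beyond page 1, rel=”prev” pointing back); the helper name is hypothetical:

```python
# Derive pagination-handling tags for a faceted URL.
from urllib.parse import urlparse, parse_qs, urlencode

def pagination_tags(url):
    parts = urlparse(url)
    params = {k: v[0] for k, v in parse_qs(parts.query).items()}
    page = int(params.pop("page", 1))

    # Page 1 of the facet set is the canonical target.
    base = parts.path + ("?" + urlencode(params) if params else "")
    tags = {"canonical": base}

    if page > 1:
        tags["robots"] = "noindex, follow"
        # rel="prev" points at the preceding page (page 1 carries no param).
        prev = params | {"page": page - 1} if page > 2 else params
        tags["prev"] = parts.path + ("?" + urlencode(prev) if prev else "")
    return tags

print(pagination_tags("/trainers?color=white&brand=asics&page=3"))
```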

Handling pagination well is just as important as managing the filters themselves. It’s all part of keeping your site lean, crawlable, and search-friendly.

Final Thoughts

When properly managed, faceted navigation can be an invaluable tool for improving user experience, targeting long-tail keywords, and boosting conversions.

However, without the right SEO strategy in place, it can quickly turn into a crawl efficiency nightmare that damages your rankings.

By following the best practices outlined above, you can enjoy all the benefits of faceted navigation while avoiding the common pitfalls that often trip up ecommerce sites.


Featured Image: Paulo Bobita/Search Engine Journal

Bryan Johnson wants to start a new religion in which “the body is God”

Bryan Johnson is on a mission to not die. The 47-year-old multimillionaire has already applied his slogan “Don’t Die” to events, merchandise, and a Netflix documentary. Now he’s founding a Don’t Die religion.

Johnson, who famously spends millions of dollars on scans, tests, supplements, and a lifestyle routine designed to slow or reverse the aging process, has enjoyed extensive media coverage, and a huge social media following. For many people, he has become the face of the longevity field.

I sat down with Johnson at an event for people interested in longevity in Berkeley, California, in late April. We spoke on the sidelines after lunch (conference plastic-lidded container meal for me; what seemed to be a plastic-free, compostable box of chicken and vegetables for him), and he sat with an impeccable posture, his expression neutral. 

Earlier that morning, Johnson, in worn trainers and the kind of hoodie that is almost certainly deceptively expensive, had told the audience about what he saw as the end of humanity. Specifically, he was worried about AI—that we face an “event horizon,” a point at which superintelligent AI escapes human understanding and control. He had come to Berkeley to persuade people who are interested in longevity to focus their efforts on AI. 

It is this particular concern that ultimately underpins his Don’t Die mission. First, humans must embrace the Don’t Die ideology. Then we must ensure AI is aligned with preserving human existence. Were it not for AI, he says, he wouldn’t be doing any of his anti-death activities and regimens. “I am convinced that we are at an existential moment as a species,” says Johnson, who was raised Mormon but has since left the church. Solving aging will take decades, he says—we’ll survive that long only if we make sure that AI is aligned with human survival. 

The following Q&A has been lightly edited for length and clarity.

Why are you creating a new religion?

We’re in this new phase where [because of advances in AI] we’re trying to reimagine what it means to be human. It requires imagination and creativity and open-mindedness, and that’s a big ask. Approaching that conversation as a community, or a lifestyle, doesn’t carry enough weight or power. Religions have proven, over the past several thousand years, to be the most efficacious form to organize human efforts. It’s just a tried-and-true methodology. 

How do you go about founding a new religion?

It’s a good question. If you look at historical [examples], Buddha went through his own self-exploratory process and came up with a framework. And Muhammad had a story. Jesus had an origin story … You might even say Satoshi [Nakamoto, the mysterious creator of bitcoin] is like [the founder of] a modern-day religion, [launched] with the white paper. Adam Smith launched capitalism with his book. The question is: What is a modern-day religion, and how does it convince? It’s an open question for me. I don’t know yet.

Your goal is to align AI with Don’t Die—or, in other words, ensure that AI models prioritize and protect human life. How will you do that?

I’m talking to a lot of AI researchers about this. Communities of AIs could be instilled with values of conflict resolution that do not end in the death of a human. Or an AI. Or the planet.

Would you say that Don’t Die is “your” religion?

No, I think it’s humanity’s religion. It’s different from other religions, which are very founder-centric. I think this is going to be decentralized, and it will be something that everybody can make their own.

So there’s no God?

We’re playing with the idea that the body is God. We’ve been experimenting with this format of a Don’t Die fam, where eight to 12 people get together on a weekly basis. It’s patterned off of other groups like Alcoholics Anonymous. We structure an opening ritual. We have a mantra. And then there’s a part where people apologize to their body for something they’ve done that has inflicted harm upon themselves. 

It’s reframing our relationship to body and to mind. It is also a way for people to have deep friendships, to explore emotionally vulnerable topics, and to support each other in health practices.

What we’re really trying to say is: Existence is the virtue. Existence is the objective. If someone believes in God, that’s fine. People can be Christian and do this; they can be Muslim and do this. Don’t Die is a “yes, and” to all groups.

So it’s a different way of thinking about religion?

Yeah. Right now, religion doesn’t hold the highest status in society. A lot of people look down on it in some way. I think as AI progresses, it’s going to create additional questions on who we are: What is our identity? What do we believe about our existence in the future? People are going to want some kind of framework that helps them make sense of the moment. So I think there’s going to be a shift toward religion in the coming years. People might say that [founding a religion now] is kind of a weird move, and that [religion] turns people off. But I think that’s fine. I think we’re ahead.

Does the religion incorporate, or make reference to, AI in any way?

Yeah. AI is going to be omnipresent. And this is why we’ve been contemplating “the body is God.” Over the past couple of years … I’ve been testing the hypothesis that if I get a whole bunch of data about my body, and I give it to an algorithm, and feed that algorithm updates with scientific evidence, then it would eventually do a better job than a doctor. So I gave myself over to an algorithm. 

It really is in my best interest to let it tell me what to eat, tell me when to sleep and exercise, because it would do a better job of making me happy. Instead of my mind haphazardly deciding what it wants to eat based on how it feels in the moment, the body is elevated to a position of authority. AI is going to be omnipresent and built into our everyday activities. Just like it autocompletes our texts, it will be able to autocomplete our thoughts.

Might some people interpret that as AI being God?

Potentially. I would be hesitant to try to define [someone else’s] God. The thing we want to align upon is that none of us want to die right now. We’re attempting to make Don’t Die the world’s most influential ideology in the next 18 months.

Get Your Products on ChatGPT Shopping

ChatGPT now recommends products directly in search results. The feature has no ads (so far), so any ecommerce business can presumably gain visibility for free.

The recommendations follow from relevant prompts with clear shopping intent. ChatGPT provides info on each product it recommends, including (i) customer ratings from multiple sources, (ii) pricing from multiple sellers, and (iii) explanations of its selections.

Here’s how to expose your products to ChatGPT.

A prompt on ChatGPT for “drip coffee makers” produced multiple recommendations with images, sources, ratings, and a “Top Picks Explained” section.

Get in the Database

In a post titled “Help ChatGPT discover your products,” OpenAI states it is considering a product feed submission feature (likely similar to Google Shopping) and provides a form to receive notifications when it’s live.

Ecommerce businesses should sign up now to become early adopters — before other (bigger) brands.

The post also reminds merchants to ensure OpenAI’s crawler can access their sites. Some content management systems and plugins block OpenAI bots by default. Check your robots.txt file to ensure it doesn’t block “OAI-SearchBot.” Log file analyzers from Screaming Frog and others can confirm the bot is crawling your site.

Knowatoa’s free “AI Search Console” discloses which generative AI bots have access.

Use Schema Markup

ChatGPT supports structured data markup from Schema.org, so including detailed product schema will likely increase your chances of being recommended.

In a related post, ChatGPT states it will recommend products based in part on the user’s prompt history. Using its memory feature, ChatGPT might know, for example, the searcher’s color and style preferences and will recommend products accordingly.

Searchers’ purchase intent prompts typically seek specific features. A detailed product schema with many facets (colors, sizes, dimensions, warranty, styles, compatibility, etcetera) will increase the likelihood of a recommendation.
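For illustration, here is a faceted Product JSON-LD snippet built with Python's json module. All field values are invented for the example; the full Product vocabulary is defined at Schema.org.

```python
# Build a Product JSON-LD payload with facet-level detail
# (color, size, material) for embedding in a product page.
import json

product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Running Trainer",   # illustrative values throughout
    "color": "White",
    "size": "9",
    "material": "Mesh",
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "GBP",
        "availability": "https://schema.org/InStock",
    },
}

json_ld = json.dumps(product_schema, indent=2)
print(json_ld)  # embed inside <script type="application/ld+json">
```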

Monitor External Product Reviews

In the same related post, ChatGPT states it relies on multiple public sources when displaying product reviews and ratings. While testing, I’ve seen ratings from Target, Amazon, Wired, Business Insider, and others.

Keep an eye on ratings and reviews to encourage ChatGPT’s recommendations. I’ve yet to see a ChatGPT recommendation with bad reviews. Some platforms (Better Business Bureau) often lean towards negative reviews, while others (Facebook) are typically positive.

Shopper Approved can help improve your product ratings by adding positive reviews from your site to external platforms as needed.

Google’s Walled Garden: Users Make 10 Clicks Before Leaving via @sejournal, @MattGSouthern

New data shows Google keeps users on its site longer. Visitors now make 10 clicks on Google’s site before leaving for another website.

This finding comes from a 13-month study comparing Google and ChatGPT traffic patterns.

Google Keeps Users In, ChatGPT Sends Them Out

Tyler Einberger of Momentic analyzed Similarweb data showing that Google’s “pages per visit” metric has climbed to 10 as of March, a big jump from before.

Image Credit: Momentic.

What does this mean? Users spend more clicks on Google’s search results than on other websites.

The report explains:

“Increasing ‘Pages per Visit’ for Google.com is an indicator that users are spending more clicks within Google’s search results (SERPs). Since most SERP interactions—like interacting with SERP features, paging, refining searches, or clicking images—change the URL but keep visitors on Google’s domain.”

Google still sends the most overall traffic to external websites.

Google generated 175.5 million outgoing visits in March compared to ChatGPT’s 57.6 million. This represents a 66.4% increase for Google compared to last year.

The Efficiency Gap

ChatGPT is more efficient at sending people to other websites.

The numbers tell the story:

  • ChatGPT generates 1.4 external website visits per user
  • Google produces just 0.6 visits per user

This means ChatGPT users are 2.3 times more likely to visit external websites than Google users, even though Google’s audience is about 6.8 times larger.
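The efficiency ratio follows directly from the per-user figures reported above (the 2.3x is the rounded quotient):

```python
# Reproduce the report's per-user referral efficiency comparison.
chatgpt_visits_per_user = 1.4  # external visits per ChatGPT user
google_visits_per_user = 0.6   # external visits per Google user

ratio = chatgpt_visits_per_user / google_visits_per_user
print(round(ratio, 1))  # ~2.3x more external visits per ChatGPT user
```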

The SERP Retention Strategy

Google’s increasing in-platform clicks match its strategy of expanding search features. These features provide immediate answers without requiring users to visit other websites.

Google is succeeding at two goals:

  1. Remaining the web’s primary traffic source
  2. Keeping users on Google’s properties longer

While Google sent more outgoing traffic in early 2025, its audience barely grew. This shows a complex relationship between keeping users and referring them elsewhere.

What This Means

For SEO pros and marketers, this trend creates new challenges and opportunities:

  • With users spending more time on Google’s interfaces, capturing attention in the first screen view matters more than ever.
  • Focus on appearing in featured snippets, knowledge panels, and other SERP elements to maintain visibility as traditional organic clicks become harder to get.
  • Consider ChatGPT and other AI platforms as additional traffic sources since they refer more visitors per user.
  • Users now interact with multiple SERP features before clicking a website, requiring better attribution models and content strategies.

The Broader AI Search Market

While Google and ChatGPT lead the conversation, other AI search platforms are growing fast.

Perplexity grew 110.7% month-over-month in March. Grok grew 48.1% and Claude grew 23%.

These newer platforms could change current traffic patterns as they gain users, though the report doesn’t analyze their referral efficiency in detail.

Google remains the biggest traffic source overall. However, its growing “walled garden” approach means marketers should watch these trends and diversify where their traffic comes from.


Featured Image: Here Now/Shutterstock

Google’s Updated Raters Guidelines Target Fake EEAT Content via @sejournal, @martinibuster

A major update to Google’s Search Quality Raters Guidelines (QRG) clarifies and expands on multiple forms of deception that Google wants its quality raters to identify. This change continues the trend of refining the guidelines so that quality raters become better at spotting increasingly granular forms of quality issues.

TL;DR

Authenticity should be the core principle for any SEO and content strategy.

Quality Guidelines Section 4.5.3

Section 4.5.3 has essentially been rewritten to be clearer and easier to understand, but most importantly it has been expanded to cover more kinds of deception. One can speculate that quality raters were overlooking certain kinds of website deception and that these changes address that shortcoming. This could also signal that Google’s algorithms may in the near future become more adept at spotting the described kinds of deception.

The change in the heading of section 4.5.3 reflects the scope of the changes, with greater detail over the original version.

The section title changed from this:

4.5.3 Deceptive Page Purpose and Deceptive MC Design

To this:

“4.5.3 Deceptive Page Purpose, Deceptive Information about the Website, Deceptive Design”

The entire section was lightly rewritten and reorganized for greater clarity. It’s not necessarily a new policy but rather a more detailed and nuanced version of it, with a few parts that are brand new.

Deceptive Purpose

The following is a new paragraph about deceptive purpose:

“Deceptive purpose:

● A webpage with deliberately inaccurate information to promote products in order to make money from clicks on monetized links. Examples include a product recommendation page on a website falsely impersonating a celebrity blog, or a product recommendation based on a false claim of personal, independent testing when no such testing was conducted.”

Google very likely has algorithmic signals and processes to detect and remove sites with these kinds of deceptive content. While one wouldn’t expect that a little faking would be enough to result in a sudden drop in rankings, why take the chance? It’s always the safest approach to focus on authenticity.

To be clear, the focus of this section isn’t just about putting fake information on a website but rather it’s about deceptive purpose. The opposite of a deceptive purpose is a purpose rooted in authenticity, with authentic intent.

Deceptive EEAT Content

There is now a brand new section that is about fake EEAT (Expertise, Experience, Authoritativeness, and Trustworthiness) content on a website. A lot of SEOs talk about adding EEAT to their web pages but the fact is that EEAT is not something that one adds to a website. EEAT is a quality of a website that’s inherent in the overall experience of researching a site, learning about a site, and in the process of consuming the content, which can result in signals that site visitors may generate about a website.

Here’s the guidance about fake EEAT content:

“● A webpage or website with deceptive business information. For example, a website may claim to have a physical “brick and mortar” store but in fact only exists online. While there is nothing wrong with being an online business, claiming to have a physical “brick and mortar” (e.g. fake photo, fake physical store address) is deceptive.

● A webpage or website with “fake” owner or content creator profiles. For example, AI generated content with made up “author” profiles (AI generated images or deceptive creator descriptions) in order to make it appear that the content is written by people.

● Factually inaccurate and deceptive information about creator expertise. For example, an author or creator profile inaccurately claims to have credentials or expertise (e.g. the content creator claims falsely to be a medical professional) to make the content appear more trustworthy than it is.”

Deceptive Content, Buttons, And Links

The new quality raters guidelines also go after sites that use deceptive practices to get users to take actions they didn’t intend. This is an extreme level of deception that shouldn’t be a concern for any normal site.

The following are additions to the section about deceptive design:

“● Pages with deceptively designed buttons or links. For example, buttons or links on pop ups, interstitials or on the page are designed to look like they do one thing (such as close a pop up) but in fact have a different result which most people would not expect, e.g. download an app.

● Pages with a misleading title or a title that has nothing to do with the content on the page. People who come to the page expecting content related to the title will feel tricked or deceived.”

Takeaways

There are three important takeaways from the updates to section 4.5.3 of Google’s Search Quality Raters Guidelines:

1. Expanded Definition Of Deceptive Purpose

  • Section 4.5.3 now explicitly includes new examples of deceptive page intent, such as fake endorsements or falsified product testing.
  • The revision emphasizes that deceptive purpose goes beyond misinformation—it includes misleading motivations behind the content.

2. Focus On Deceptive EEAT Content

A new subsection addresses deceptive representations of EEAT, including:

  • Fake business details (e.g., pretending to have a physical store).
  • Made-up author profiles or AI-generated personas.
  • False claims of creator expertise, such as unearned professional credentials.

3. Deceptive Design and UI Practices

The raters guidelines call attention to manipulative interface elements, such as:

  • Buttons that pretend to close popups but trigger downloads instead.
  • Misleading page titles that don’t match the content.

Google’s January 2025 update to the Search Quality Raters Guidelines significantly expands how raters should identify deceptive web content. The update clarifies deceptive practices involving page purpose, false EEAT (Expertise, Experience, Authoritativeness, Trustworthiness) content, and misleading design elements. Its purpose is to help raters better recognize manipulation that could mislead users or inflate rankings, and to flag the kinds of low quality Google is focusing on.

Featured Image by Shutterstock/ArtFamily

Breaking Into New Markets With PPC: Key Considerations

Google Ads dominates the global PPC market with advertising revenue surpassing $265 billion in 2024.

Paid search is self-serve and fast to deploy. But as simple as it appears, the reality is much more complex.

The perceived ease of activation paints a picture that this channel is a silver bullet when brands look to enter new markets.

It’s as easy as piecing together an automated campaign such as Performance Max, changing the target location, and letting Google do the hard work, isn’t it?

This couldn’t be further from the truth, and having managed paid search for over 15 years, I can vouch for this firsthand.

PPC can play a key role in market expansion, but it’s not a market entry strategy, though it is often confused with one.

This post explores how paid search fits into a go-to-market strategy and what brands need to consider when launching in a new market, from media modeling to localization, brand building, and more.

Full-Funnel Media Planning Is Essential

Roughly speaking, paid search can be deployed in a new market within a short time frame.

Brands can liaise with in-house teams or PPC agencies, start the ball rolling, and then activate campaigns in a new locale in a fraction of the time it would take to even begin planning a full-funnel strategy.

For example, say you’re a U.S.-based ecommerce brand that sells luxury skincare and wants to break into the UK, without any brand demand in this market.

You build out Google Ads search and shopping campaigns and enter auctions for a wealth of generic queries, such as:

Screenshot from search for [buy luxury skincare], Google, April 2025

You’ll drive traffic to the site for relevant queries and might start to build momentum with sales. However, if the campaigns were to be paused, so would the entire presence of your brand in this market.

With full-funnel media buying, brands look at the full customer journey, with each brand allocating its budget differently across the stages (lower, middle, and upper funnel, for example).

McKinsey defines full-funnel marketing as “an approach that combines the power of both brand building and performance marketing through linked teams, measurement systems, and key performance indicators (KPIs).”

Outside of the context of launching into a new market, this approach to media buying is essential, and PPC sits within the mix of lower, middle, and upper funnel advertising strategies.

The split of the budget across the funnel will vary by brand. Les Binet and Peter Field argue that the most effective strategies adopt a 60/40 split of long-term brand building and short-term activation.

When you’re launching into a new market, the split could look a whole lot different, as you’ll need to build brand awareness from scratch. Over time, the balance will shift toward performance-based campaigns as part of a wider media mix.

Have A Robust Measurement Strategy In Place

Take the example of a U.S.-based luxury skincare brand expanding into the UK.

After the initial test period, simply looking at PPC performance through engagement or sales metrics isn’t enough to determine whether the expansion succeeded.

PPC campaigns influence more than just immediate clicks and conversions.

Depending on the strategy, they can contribute to brand awareness, drive offline actions, and more.

For instance, a search ad might not result in an immediate online purchase but could lead a customer to visit a physical store or make a purchase at a later time. When the only presence in a new market is via paid search, conversion rates could be considerably lower than those in established markets.

Taking this into account, a brand can’t expect to answer “how did the market expansion go?” based on a narrow sample of data from one channel, especially when that channel isn’t part of a broader go-to-market media strategy.

It’s crucial to measure PPC’s impact beyond platform-specific metrics by taking a holistic approach to measurement.

One tactic to use is Media Mix Modelling (MMM). This allows marketers to capture these indirect effects, ensuring a more accurate assessment of PPC’s role in the overall marketing strategy.

MMM is used by 53% of U.S. marketers, and 30% believe it is the best model for identifying drivers of business value. Because it doesn’t rely on user-level data, it is well suited to assessing the impact of paid media on the bottom line.
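To make the modeling idea concrete, here is a minimal sketch of the kind of regression that underlies MMM. All figures are synthetic and invented for illustration; production models add adstock, saturation curves, and seasonality on top of this basic attribution step.

```python
import numpy as np

# Synthetic weekly data: spend per channel and resulting sales.
# True (normally unknown) contributions: search drives $2.10 per $1 spent,
# social $0.80, on top of a $5,000 weekly baseline. All values invented.
rng = np.random.default_rng(42)
weeks = 104
search_spend = rng.uniform(1_000, 5_000, weeks)
social_spend = rng.uniform(500, 3_000, weeks)
sales = 5_000 + 2.1 * search_spend + 0.8 * social_spend + rng.normal(0, 200, weeks)

# Fit sales ~ intercept + search + social by ordinary least squares.
X = np.column_stack([np.ones(weeks), search_spend, social_spend])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
intercept, search_roi, social_roi = coef

# The fitted coefficients recover each channel's contribution per dollar.
print(f"baseline ≈ {intercept:,.0f}, search ≈ {search_roi:.2f}, social ≈ {social_roi:.2f}")
```

Because the model works on aggregate weekly spend and sales, it needs no user-level tracking, which is exactly why MMM survives cookie deprecation.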

Whether it’s a simple PPC activation or a full-funnel go-to-market media strategy, having a framework for measuring performance holistically is key, as this lays the groundwork for understanding the successes and failures when expanding into a new market.

Research Market-Specific Nuances And Adapt

When entering a new market, it’s not just your media plan that needs to adapt; it’s also your understanding of the consumer.

Even in an increasingly connected world, buying behaviors remain deeply influenced by local culture, habits, and expectations.

Studies have shown that organizations with high cultural intelligence see a 30% increase in their market penetration compared to their competitors.

Brands must consider:

Cultural Differences

Nearly 75% of UK shoppers say their purchasing is influenced by local culture, yet 75% of consumers in India feel that global brands offer better quality products compared to the local market.

Understanding what problems users prioritize, what features matter, and how consumers approach purchases is essential when piecing together a PPC plan and the backbone for a full go-to-market strategy.

A one-size-fits-all approach won’t cut it, and even though there may be search demand for the products/services you sell, this doesn’t mean you can simply activate and watch the sales roll in (in most cases).

This is both a strategic and tactical consideration, from the first day of planning which markets you are going to target, to the types of phrases used within your ad copy.

Longer Consideration And Research

One-third of consumers globally spend more time researching purchase decisions online than ever before.

When you layer in the nuances of a brand entering a new market, a robust go-to-market strategy becomes crucial, rather than a simple PPC activation.

With the consideration process being longer than ever, brands need to understand and adapt to market-specific purchasing behavior, and this should run through everything involved within digital.

This runs from the messaging used in ads to forecasting purchase paths, so you can determine when an expected return on ad spend can be accurately reported.

Local Digital Ecosystems

Digital behavior differs massively between markets. Assuming that one country will respond the same to your PPC campaigns as another is short-sighted.

Take China, for example. Google and Meta are blocked, and brands will have to look for alternative routes for activating PPC, such as Baidu.

Running search ads follows a similar blueprint, but the research, planning, build, etc., will require a bespoke approach.

Another consideration is payment methods.

India, for example, favors wallets like PayTM, while 15% of the entire Klarna market resides in Germany.

These factors play a key role in building a thorough digital expansion plan that incorporates PPC; without them, brands will be left scratching their heads over why PPC metrics look the way they do.

The Key To Making PPC Work In New Markets Isn’t PPC

Launching PPC in a new market might seem straightforward. From a resource perspective, it doesn’t demand a great deal of time to get up and running.

This is where Google Ads shines, as brands can enter a new market with just a few clicks and begin driving traffic.

However, driving traffic is only part of the bigger picture in digital market expansion, and there’s a wealth of factors to consider to give brands the best chance at success.

Factors such as:

  • Delivery fees, tariffs, shipping timelines.
  • Localized assets, website, currency.
  • Contact preferences, customer service, localized support.
  • Pricing and returns policies.
  • Trust signals, local reviews, social media presence.

These factors aren’t as easy to measure as PPC, but they are arguably more important than PPC itself.

A recent survey found that trust emerges as the most critical factor in purchasing decisions when consumers consider buying from a new brand.

Consumers place significant importance on elements such as star ratings, the number of reviews, and the credibility of those reviews.

PPC can (and will) exist in isolation for many brands, and even the most well-built, researched, and curated campaigns can fall short when activating in a new market.

To stand the best chance of success, brands must consider the full digital ecosystem: how they apportion budget across the funnel, how they display shipping fees on their site, and how best to build trust signals from launch.

PPC can drive visibility and traffic, but it’s everything around it that matters most, and the brands that consider and act on all of these factors are the ones that succeed.


Featured Image: insta_photos/Shutterstock

B2B Buyer Behavior Has Changed: Proven Strategies For Sustainable Relationships via @sejournal, @alexanderkesler

The reality of working in B2B today is that tried-and-tested tactics are no longer as effective for engaging buyers.

Buyers are independent and defensive; they prefer to research on their own and reach out to sales only when they are absolutely ready to do so.

Part of the reason is the increase in the size of buying groups, which now average approximately 11 individuals.

These buying groups have their own complex purchasing processes – 70% of which take place in the dark and often anonymously before they reach out to sales.

When these buyers are ready, 84% of the deals go to the vendor on their day-one list.

This new buying behavior, where self-discovery predominates, highlights the urgency of evolving old tactics and embracing buyer-led strategies that meet buyers on their own terms, where they are.

This includes moving away from traditional lead generation in favor of evergreen, always-on buyer enablement practices based on buyer intelligence.

In this article, I will share five buyer enablement strategies based on successful demand programs that elevated our clients’ engagement metrics and resulted in sustainable buyer relationships.

5 Proven Demand Strategies To Enable Buyers

1. Identify Your Buyers Precisely

Identifying your buyer starts with recognizing that most B2B buying groups are made up of multiple stakeholders – as many as 15 individuals or more – each with distinct concerns and decision-making power, according to our own Q4 2024 market research.

To effectively engage these groups, you need detailed buying group personas that define key revenue leaders and influencers.

Referencing existing buyer personas at large accounts is a helpful starting point, while first-party data and other sources like client relationship management (CRM) insights, client interviews, website behavior analytics, and industry reports will provide you with a more comprehensive view of their needs, goals, and preferences.

Identifying intent signals is equally important for modern account-based marketing (ABM) strategies. These behavioral cues suggest when a prospect or buying group is actively researching solutions and may be in-market to buy.

Key buying group intent signals include:

  • Multiple website visitors from the same organization.
  • Consumption of solution-specific content.
  • Engagement across multiple channels.
  • Webinar or event attendance.
  • And many more.
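As a toy illustration, the signals listed above can feed a simple account-level intent score. The signal names, weights, and the example account below are all invented for the sketch; real ABM platforms tune weights empirically against closed-won data.

```python
# Hypothetical weights for a few of the intent signals described above.
SIGNAL_WEIGHTS = {
    "site_visitors_same_org": 3,   # multiple visitors from one organization
    "solution_content_views": 2,   # consumption of solution-specific content
    "multichannel_engagement": 2,  # engagement across multiple channels
    "event_attendance": 1,         # webinar or event attendance
}

def account_intent_score(signals: dict) -> int:
    """Sum the weighted counts of observed signals for one account."""
    return sum(SIGNAL_WEIGHTS.get(name, 0) * count for name, count in signals.items())

# An invented example account showing recent activity.
acme = {"site_visitors_same_org": 4, "solution_content_views": 2, "event_attendance": 1}
print(account_intent_score(acme))  # 4*3 + 2*2 + 1*1 = 17
```

Scoring accounts this way makes it easy to rank which buying groups look in-market and to trigger campaign adjustments when a score crosses a threshold.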

Predictive tools, competitive research, and technographic data can further enrich these insights.

By integrating real-time intent signals into your ABM programs, you can dynamically adjust campaigns to align with the evolving needs of buying groups and their individual members.

Rather than relying solely on static personas, dynamic ABM leverages intent data to better understand each persona’s immediate pain points, preferred channels, and buying triggers.

This approach ensures that you reach the right personas, within the right accounts, with the right message at the right moment, ultimately driving greater engagement, pipeline velocity, and revenue impact.

2. Be A Partner To Your Buyers

We are seeing a trend of more and more buyers (as many as 58% according to our Q4 2024 market research) seeking the proven expertise of consultants and subject matter experts to inform and de-risk their decisions. This trend is closely associated with the increase in the length of buying cycles.

To compete effectively, marketers ought to adjust their engagement to offer integration and consultation, effectively establishing their brand as a committed partner that supports its buyers and meets their needs. This, in turn, translates to demand.

This brand-to-demand-to-revenue strategy relies on early engagement to establish trust and build a brand presence before buyers are actively looking.

The brand experience you present should carry across multiple touchpoints to engage all stakeholders within the buying group, emphasizing your position as a trusted source.

Examples of early brand-to-demand tactics include:

  • Identifying the most suitable channels for generating awareness of your unique value proposition (UVP).
  • Sponsoring events to engage qualified prospects with just-in-time information.
  • Adopting a multithreaded nurturing approach.
  • Promoting curated content hubs to facilitate buyer research and self-discovery, customized per persona.
  • Offering webinars, workshops, or roundtables.
  • Providing buying group influencers with data to make an internal case, such as product specification sheets, decks, and guides.

Establishing a vendor-partner relationship is critically important when nurturing defensive buyers because it reduces risk, builds trust, and enhances buyer confidence – all of which are top priorities for today’s cautious, efficiency-driven buying groups.

This also helps buyers feel reassured that they are investing in solutions that are well-supported, credible, and aligned with other trusted vendors in their partner ecosystem​.

3. Focus On Buyer-Led Content

Nurturing and enabling buying groups calls for buyer-led content strategies that speak directly to active buyers, buying groups, and accounts.

Creating content hubs to enable buyer research is one of the most powerful moves you can make today, especially for large buying groups in complex industries.

Content hubs designed to support in-depth research empower buyers to make informed decisions with more confidence.

These content libraries also establish your brand as a reputable, expert source of information.

It is important to align content hubs to individual buying journey stages so buyers can self-serve information.

Below is an example of what that looks like:

  • Awareness stage: industry reports, tactical guides, thought leadership focused on how-tos.
  • Consideration stage: product overviews, case studies, webcasts.
  • Decision stage: demo request, pricing sheet, product implementation roadmap.

Additionally, content can be personalized by role:

  • CMO: trend reports.
  • Procurement: service-level agreement (SLA) terms.
  • Finance: return on investment (ROI) guide.
  • IT: security threat insights, compliance checklist.

Micro-targeting buyers with relevant, behavior-driven content is great for creating seamless, value-added experiences that drive preference and loyalty, ultimately enabling buying groups to confidently choose your solution.

To achieve competitive displacement, ensure your content is visible and discoverable throughout the buyer’s journey.

This can be achieved through SEO, AI/Generative Engine Optimization (AEO/GEO), and by leveraging omnichannel engagement tied to real-time behavior to deliver personalized messaging that resonates at every touchpoint.

You can also use intent-driven targeting to engage accounts already researching similar topics​.

4. Continuously Optimize With Demand Intelligence

Nurturing and enabling buying groups is not a one-time effort, but an ongoing optimization process.

Success depends on continuously updating intelligence across buyers, buying groups, and accounts to stay aligned with evolving needs and behaviors.

Measuring the impact of your buyer enablement tactics can be done in several ways:

  • Content engagement metrics, such as views, downloads, and time on page.
  • Buying group activation metrics, such as the number of roles and stakeholders engaged.
  • Deal acceleration metrics, such as sales cycle length and conversion rate.
  • Feedback loops with both buyers and sales teams.

By measuring and optimizing at every stage, organizations can identify what is working, adjust strategies in real time, and eliminate inefficiencies.

Essentially, buyers can successfully self-educate without talking to sales, which aligns with modern B2B buyer expectations.

Continuous optimization also turns good programs into great ones, driving stronger results and building lasting buyer trust.

This approach not only strengthens engagement, but also ensures that every marketing dollar is working harder, helping you maximize the impact of your spend.

5. Enable Sales Teams For Success

To truly nurture and enable buying groups, organizations must equip their sales teams with the right strategies, including the tools, insights, and approaches that support more informed and impactful outreach to buyers.

This starts with personalized messaging tailored to each account and buying group, and continues with follow-up that reflects the needs and behaviors of individual buying group members.

A key part of this approach is helping sales teams engage buyers at the right pace.

Acting as trusted consultants, sales teams can guide buyers through their journey with a buyer-centric mindset that clearly communicates the unique value of your organization.

Achieving this requires strong alignment and collaboration across marketing, sales, and the broader organization – everyone must rally behind a shared North Star focused on enabling the buyer first and foremost.

Best practices for achieving organizational alignment include:

  • Collaborating to define the ideal buying experience.
  • Setting joint revenue and buyer engagement targets.
  • Regularly refreshing sales outreach strategies.
  • Detailed mapping of each buying group and member, including pain points and interests.
  • Activating multi-threaded account strategies.

By prioritizing buyer enablement and supporting sales teams with the right approach, organizations create a seamless experience that builds trust, accelerates deal cycles, and drives better long-term outcomes.

The Importance Of Evolving Alongside Your Buyers

Buyer purchasing behavior has changed unequivocally. Buying groups are expected to remain cautious, with buyer journeys expected to lengthen even more.

To thrive in this environment, organizations must prioritize brand-to-demand and value-focused solutions, using their expertise to solve buying group challenges – all the while enabling self-service options whenever possible.

It is time to meet buyers where they are, which means evolving the playbook and closing the door to old ways of thinking.

Key Takeaways

  • Buying groups and long sales cycles are the new reality: Today’s buyers prefer to self-educate, making it essential to shift toward buyer-led strategies that meet them where they are – on their terms. This requires moving beyond traditional lead generation and embracing buyer enablement approaches powered by real intent signals.
  • Buyers and how they buy have changed: B2B purchases now involve up to 15 stakeholders or more, each with their own priorities and influence. To engage them effectively, build detailed buying group personas and identify intent signals that reveal when prospects are actively exploring solutions.
  • Buyer enablement does not end; it evolves: By optimizing at every stage, organizations can fine-tune strategies in real time, enabling better and consistent outcomes.


Featured Image: Alphavector/Shutterstock

Google Shares Insight About Time-Based Search Operators via @sejournal, @martinibuster

Google’s Search Liaison explained limitations in Google’s ability to return web pages from a prior date, also noting that date-based advanced search operators are still in beta. He provided one method for doing this but omitted discussing an older, simpler method that almost accomplishes the same thing.

How To Find Articles By Older Published Date

The person asking the question understood how to find articles published within the previous year, month, or 24 hours, but didn’t know how to find articles published before a specific date.

The post on Bluesky asked:

“Is there a way to search for articles OLDER than a certain date?

I know advanced search can guarantee in the past year, month, 24h, but I want to specifically be able to find articles published BEFORE X historical event happened, and I can’t find a way to filter. Help?”

Search Liaison shared this method, though it can be difficult to memorize if you’re a busy person:

“We have before: and after: operators that are still in beta. You must provide year-month-day dates or only a year. You can combine both. For example:

[avengers endgame before:2019]
[avengers endgame after:2019-04-01]
[avengers endgame after:2019-03-01 before:2019-03-05]”
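If you run date-bounded searches often, a small helper can compose these operators for you. This is a hypothetical convenience function, not anything Google provides; it simply builds query strings in the year or year-month-day formats the operators expect.

```python
from datetime import date

def time_bounded_query(terms: str, after=None, before=None) -> str:
    """Compose a Google query string using the beta before:/after: operators.

    `after` and `before` each accept a bare year (int) or a datetime.date,
    matching the two formats the operators support.
    """
    def fmt(value):
        if isinstance(value, date):
            return value.isoformat()  # year-month-day, e.g. 2019-04-01
        return str(int(value))        # bare year, e.g. 2019

    parts = [terms]
    if after is not None:
        parts.append(f"after:{fmt(after)}")
    if before is not None:
        parts.append(f"before:{fmt(before)}")
    return " ".join(parts)

print(time_bounded_query("avengers endgame",
                         after=date(2019, 3, 1), before=date(2019, 3, 5)))
# avengers endgame after:2019-03-01 before:2019-03-05
```

Paste the resulting string straight into the Google search box; combining both operators bounds the results to the window between the two dates.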

Another Way To Do Time-Based Search

In my opinion it’s a lot easier to just use Google’s search tools:

Tools > Any Time > Custom Range

From there you just set whatever time range you want; nothing to memorize. However, you can’t search only before a certain date: you have to set both a starting date and an ending date.

Caveat About Time-Based Search

Search Liaison shared an interesting insight about how the advanced search operators for time work:

“Just keep in mind it can be difficult for us to know the exact date of a document for a variety of reasons. There’s no standard way that all site owners use to indicate a publishing or republishing date. Some provide no dates at all on web pages. Some might not indicate if an older page is updated.”

Takeaways:

The time-based advanced search operators are still in beta, which means Google is testing how many people find them useful. Google might remove the operators in the future if they prove unpopular or not useful.

The other takeaway is that it’s hard for Google to know the exact date that a document is published.

Read the discussion on Bluesky.

Sell on Amazon without GTINs or UPCs

Amazon’s marketplace requires some form of a product identifier for nearly every category. Whether it’s baby toys or industrial bolts, sellers need a Universal Product Code (UPC) or, similarly, a Global Trade Item Number (GTIN).

Yet some items have no such identifier and require an “exemption” from Amazon.

GTIN

GS1, a global standards organization, developed the GTIN to identify products. The identifier encompasses several worldwide formats, including the familiar UPC (GTIN-12) in the United States and Canada. In Europe, merchants use the European Article Number (EAN), technically GTIN-13.

The GTIN standards identify any trade item (product or service) priced, ordered, or invoiced. Hence GTINs help Amazon and the merchants selling in its marketplace keep track of inventory and manage orders.

  • GTIN (Global Trade Item Number): 8, 12, 13, or 14 digits; used globally.
  • UPC (Universal Product Code): 12 digits; U.S. and Canada.
  • EAN (European Article Number): usually 13 digits (sometimes 8); Europe.
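Whatever the length, every GTIN ends in a check digit computed by the same GS1 algorithm: working from the rightmost payload digit leftward, digits are weighted 3, 1, 3, 1, …, and the check digit brings the weighted sum up to a multiple of 10. A minimal sketch:

```python
def gtin_check_digit(payload: str) -> int:
    """Compute the GS1 check digit for a GTIN payload (all digits but the last).

    The same rule covers GTIN-8, UPC (GTIN-12), EAN (GTIN-13), and GTIN-14:
    weight digits 3, 1, 3, 1, ... from the right, then round the sum up to
    the next multiple of 10.
    """
    total = sum(
        int(d) * (3 if i % 2 == 0 else 1)
        for i, d in enumerate(reversed(payload))
    )
    return (10 - total % 10) % 10

def is_valid_gtin(code: str) -> bool:
    """True if the final digit of `code` matches the computed check digit."""
    return code.isdigit() and gtin_check_digit(code[:-1]) == int(code[-1])

print(is_valid_gtin("036000291452"))  # a well-known example UPC -> True
```

This is handy for validating identifiers before uploading listings, since a mistyped barcode number will almost always fail the check.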

Identifying Products

For resellers offering products from prominent consumer brands, listing a product on Amazon is as simple as looking up the GTIN or copying the UPC number from the item’s barcode.

There are, however, many products that don’t have an obvious barcode or identifier. Here are some examples.

  • Print-on-demand products. T-shirts are perhaps the best example. Services such as Printful and Gooten are excellent sources of quality shirts with crisp printing, but they don’t have UPCs.
  • Private label items. Some small brands, including in-house manufacturers, may not have a GTIN.
  • Handmade merchandise. Artisans creating anything from leather belts to home-sown bibs may not have UPCs.
  • Parts or accessories. Replacement parts, components, and accessories may not come with individual GTINs. An example is a generic phone case.
  • Bundled items or multipacks. Most major brands have a GTIN-14 or similar for multipacks, but some smaller companies with bundled products do not, despite Amazon’s guidelines.

For those and other numberless products, Amazon marketplace sellers have two options: obtain a GTIN or get an exemption from Amazon.

GTIN Exemption

Amazon often awards exemptions for categories with private-label, handmade, or similar products.

The process is straightforward. Create a listing in Amazon Seller Central, select “I don’t have a product ID,” and click “Next.” An “Apply” button will appear for products requiring a GTIN exemption.

If the product category is restricted, the seller will need double approval: an authorization to sell and a separate GTIN exemption.

Brands with existing GTINs cannot apply for an exemption for other like-branded items.

Exemption approval typically requires excellent documentation, as follows.

  • Clear photographs. Amazon wants detailed product and packaging photos, not illustrations or markups. These pictures should show the item and packaging from each side, but will not appear in product listings.
  • No GTIN. There cannot be (i) any barcode or identifier on the item or package or (ii) a GTIN associated with the product. Amazon has an exhaustive list of registered GTINs and will check.
  • Consistent branding. The brand name on the GTIN exemption form must exactly match the packaging and product. And the packaging must permanently show the brand name or logo.
  • Proper category. Ensure the product category for the GTIN exemption is correct. An error will almost guarantee a rejection.

According to its support, Amazon usually approves or rejects an exemption request within two days. Merchants who have obtained an exemption often recommend waiting an additional day afterward before building those product listings, allowing Amazon’s systems to reflect the status.

Amazon’s notification of denied GTIN exemptions may state why. For example, the product and packaging photographs might not demonstrate the absence of a GTIN or may belong to a brand requiring a UPC.

Sellers with denied exemptions can reapply after addressing the issues behind the initial rejection.

Finally, if all else fails, a company can purchase a UPC for about $30, plus an annual fee.

The US has approved CRISPR pigs for food

Most pigs in the US are confined to factory farms where they can be afflicted by a nasty respiratory virus that kills piglets. The illness is called porcine reproductive and respiratory syndrome, or PRRS.

A few years ago, a British company called Genus set out to design pigs immune to this germ using CRISPR gene editing. Not only did it succeed, but its pigs are now poised to enter the food chain following approval of the animals this week by the U.S. Food and Drug Administration.

The pigs will join a very short list of gene-modified animals that you can eat. It’s a short list because such animals are expensive to create, face regulatory barriers, and don’t always pay off. For instance, the US took about 20 years to approve a transgenic salmon with an extra gene that let it grow faster. But by early this year its creator, AquaBounty, had sold off all its fish farms and had only four employees—none of them selling fish.

Regulations have eased since then, especially around gene editing, which tinkers with an animal’s own DNA rather than adding to it from another species, as is the case with the salmon and many GMO crops.

What’s certain is that the pig project was technically impressive and scientifically clever. Genus edited pig embryos to remove the receptor that the PRRS virus uses to enter cells. No receptor means no infection.

According to Matt Culbertson, chief operating officer of the Pig Improvement Company, a Genus subsidiary, the pigs appear entirely immune to more than 99% of the known versions of the PRRS virus, although there is one rare subtype that may break through the protection.

This project is scientifically similar to the work that led to the infamous CRISPR babies born in China in 2018. In that case a scientist named He Jiankui edited twin girls to be resistant to HIV, also by trying to remove a receptor gene when they were just embryos in a dish.

That experiment on humans was widely decried as misguided. But pigs are a different story. The ethical concerns about experimenting are less serious, and the benefits of changing the genomes can be measured in dollars and cents. It’s going to save a lot of money if pigs are immune to the PRRS virus, which spreads quite easily, causing losses of $300 million a year or more in the US alone.

Globally, people get animal protein mostly from chickens, with pigs and cattle in second and third place. A 2023 report estimated that pigs account for 34% of all meat that’s eaten. Of the billion pigs in the world, about half are in China; the US comes in a distant second, with 80 million.

Recently, there’s been a lot of fairly silly news about genetically modified animals. A company called Colossal Biosciences used gene editing to modify wolves in ways it claimed made them resemble an extinct species, the dire wolf. And then there’s the L.A. Project, an effort run by biohackers who say they’ll make glow-in-the-dark rabbits and have a stretch goal of creating a horse with a horn—that’s right, a unicorn.

Both those projects are more about showmanship than usefulness. But they’re demonstrations of the growing power scientists have to modify mammals, thanks principally to new gene-editing tools combined with DNA sequencing that lets them peer into animals’ DNA.

Stopping viruses is a much better use of CRISPR. And research is ongoing to make pigs—as well as other livestock—invulnerable to other infections, including African swine fever and influenza. While PRRS doesn’t infect humans, pig and bird flus can. But if herds and flocks could be changed to resist those infections, that could cut the chances of the type of spillover that can occasionally cause dangerous pandemics.  

There’s a chance the Genus pigs could turn out to be the most financially valuable genetically modified animal ever created—the first CRISPR hit product to reach the food system. After the approval, the company’s stock value jumped up by a couple of hundred million dollars on the London Stock Exchange.

But there is still a way to go before gene-edited bacon appears on shelves in the US. Before it makes its sales pitch to pig farms, Genus says, it needs to also gain approval in Mexico, Canada, Japan, and China, which are big export markets for American pork.

Culbertson says gene-edited pork could appear in the US market sometime next year. He says the company does not think pork chops or other meat will need to carry any label identifying it as bioengineered. “We aren’t aware of any labelling requirement,” Culbertson says.

This article is from The Checkup, MIT Technology Review’s weekly health and biotech newsletter. To receive it in your inbox every Thursday, sign up here.