The Facts About Trust Change Everything About Link Building via @sejournal, @martinibuster

Trust is commonly understood to be a standalone quality that is passed between sites regardless of link neighborhood or topical vertical. What I’m going to demonstrate is that “trust” is not a thing that trickles down from a trusted site to another site. The implication for link building is that many may have been focusing on the wrong thing.

Six years ago I was the first person to write about link distance ranking algorithms, which are a way to create a map of the Internet that begins with a group of sites judged to be trustworthy. These sites are called the seed set. The seed set links to other sites, which in turn link to ever-increasing groups of other sites. The sites closer to the original seed set tend to be trustworthy websites. The sites furthest from the seed set tend not to be trustworthy.

Google still counts links as part of the ranking process, so it’s likely that there continues to be a seed set of sites considered trustworthy, and that the further away a site is linked from those seeds, the likelier it is to be considered spam.

Circling back to the idea of trust as a ranking-related factor: trust is not a thing that is passed from one site to another. Trust, in this context, is not even part of the conversation. Sites are judged trustworthy by the link distance between the site in question and the original seed set. So you see, there is no trust conveyed from one site to another.
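To make the seed-set idea concrete, the distance calculation can be sketched as a breadth-first search over a link graph. This is a toy illustration with an invented graph, not Google’s actual implementation:

```python
from collections import deque

def seed_distance(link_graph, seed_set):
    """Breadth-first search: number of link hops from the nearest seed site."""
    distance = {site: 0 for site in seed_set}
    queue = deque(seed_set)
    while queue:
        site = queue.popleft()
        for linked_site in link_graph.get(site, []):
            if linked_site not in distance:  # first visit = shortest hop count
                distance[linked_site] = distance[site] + 1
                queue.append(linked_site)
    return distance  # sites absent from the result are unreachable from the seeds

# Hypothetical link graph: seed -> trusted hub -> niche site -> link farm
graph = {
    "seed.example": ["hub.example"],
    "hub.example": ["niche.example"],
    "niche.example": ["farm.example"],
}
print(seed_distance(graph, {"seed.example"}))
# {'seed.example': 0, 'hub.example': 1, 'niche.example': 2, 'farm.example': 3}
```

Note that nothing "flows" from site to site here; each site simply gets a hop count, which is the whole point of the distance paradigm.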

Trustworthiness is even part of the E-E-A-T standard for what constitutes a quality website, but that describes a quality of the site itself. Trust should never be considered a thing that is passed from one site to another, because that kind of passed-along trust does not exist.

The takeaway is that link building decisions based on the idea of trust propagated through links are built on an outdated premise. What matters is whether a site sits close to trusted seed sites within the same topical neighborhood, not whether it receives a link from a widely recognized or authoritative domain. This insight transforms link evaluation into a relevance problem rather than a reputation problem. This insight should encourage site owners to focus on earning links that reinforce topical alignment instead of chasing links that appear impressive but have little, if any, ranking value.

Why Third Party Authority Metrics Are Inaccurate

The second thing about the link distance ranking algorithms that I think is quite cool and elegant is that websites naturally coalesce around each other according to their topics. Some topics are highly linked and some, like various business association verticals, are not well linked at all. The consequence is that those poorly linked sites that are nevertheless close to the original seed set do not acquire much “link equity” because their link neighborhoods are so small.

What that means is that a low-linked vertical can be part of the original seed set and still display low third-party authority scores. The implication is that third-party link metrics that measure how many inbound links a site has fail. They fail because third-party authority metrics follow the old and outdated PageRank scoring method that counts the number of inbound links a site has. PageRank was created around 1998 and is so old that the patent on it has expired.

The seed set paradigm does not measure inbound links. It measures the distance from sites that are judged to be trustworthy. That has nothing to do with how many links those seed set sites have and everything to do with them being trustworthy, which is a subjective judgment.

That’s why I say that third-party link authority metrics are outdated. They don’t follow the seed set paradigm, they follow the old and outdated PageRank paradigm.
The insight to take away from this is that many highly trustworthy sites are being overlooked for link building purposes because link builders are judging the quality of a site by outdated metrics that incorrectly devalue sites in verticals that aren’t well linked but are actually very close to the trustworthy seed set.

The Importance Of Link Neighborhoods

Let’s circle back to the observation that websites tend to naturally link to other sites that are on the same topic. What’s interesting about this is that the seed sets can be chosen according to topic verticals. Some verticals have a lot of inbound links, and some verticals are in their own little corner of the Internet and aren’t linked to from outside of their clique.

A link distance ranking algorithm can thus be used to calculate relevance according to whatever neighborhood a site is located in. Majestic does something like that with its Trust Flow and Topical Trust Flow metrics, which actually start with trusted seed sites. Topical Trust Flow breaks that score down into specific topic categories, showing how relevant a website is for a given topic.
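A topic-aware score of this kind can be approximated by restricting the breadth-first distance search to sites that share a topic label, so distance is only measured inside the topical neighborhood. The graph and topic labels below are invented for illustration; this is not Majestic’s actual algorithm:

```python
from collections import deque

def topical_seed_distance(link_graph, topics, seeds, topic):
    """BFS from a topic's seed sites, following only links between same-topic sites."""
    in_topic = {site for site, t in topics.items() if t == topic}
    start = [s for s in seeds if s in in_topic]
    dist = {s: 0 for s in start}
    queue = deque(start)
    while queue:
        site = queue.popleft()
        for nxt in link_graph.get(site, []):
            if nxt in in_topic and nxt not in dist:  # off-topic links are ignored
                dist[nxt] = dist[site] + 1
                queue.append(nxt)
    return dist

# A finance seed links to a finance blog and to an unrelated crafts blog.
graph = {"finance-seed.example": ["credit-blog.example", "crafts.example"]}
topics = {"finance-seed.example": "finance",
          "credit-blog.example": "finance",
          "crafts.example": "crafts"}
print(topical_seed_distance(graph, topics, ["finance-seed.example"], "finance"))
# {'finance-seed.example': 0, 'credit-blog.example': 1}
```

The off-topic crafts site gets no score at all, even though a seed links to it, which mirrors the article’s point that topical context, not the link itself, is what counts.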

My point isn’t that you should use that metric, although I think it’s the best one available today. The point is that there is no context for thinking about trustworthiness as something that spreads from link to link.

Once you can think of links in the paradigm of distance within a topic category, it becomes easier to understand why a link from a university website or some other so-called “high trust” site isn’t necessarily that good or useful. I know for certain that there was a time before distance ranking when the topic of the site didn’t matter; now it matters very much, and it has for a long time.

The takeaways here are:

  1. It is counterproductive to go after so-called “high trust” links from verticals that are well outside of the topic of the website you’re trying to get a link to.
  2. It’s more important to get links from sites that are in the right topic, from a context that exactly matches the topic, or from a website that’s in an adjacent topical category.

For example, a site like The Washington Post is not a part of the Credit Repair niche. Any “trust” that may be calculated from a Washington Post link to a Credit Repair site will likely be dampened to zero. Of course it will. Remember, seed set trust distance is calculated within groups within a niche. There is no trust passed from one link to another. It is only the distance that is counted.

Logically, it makes sense to assume that there will be no validating effect between irrelevant sites for the purposes of the seed set trust calculations.

Takeaways

  • Trust is not something that’s passed by links
    Link distance ranking algorithms do not deal with “trust.” They only measure how close a site is to a trusted seed set within a topic.
  • Link distance matters more than link volume
    Ranking systems based on link distance assess proximity to trusted seed sites, not how many inbound links a site has.
  • Topic-based link neighborhoods shape relevance
    Websites naturally cluster by topic, and link value is likely evaluated within those topical clusters rather than across the entire web. A non-relevant link may still carry some small value, but by and large irrelevant links stopped working almost twenty years ago.
  • Third-party authority metrics are misaligned with modern link ranking systems
    Some third-party metrics rely on outdated PageRank-style link counting and fail to account for seed set distance and topical context.
  • Low-link verticals are undervalued by SEOs
    Entire niches that are lightly linked can still sit close to trusted seed sets, yet appear weak in third-party metrics, causing them to be overlooked by link builders.
  • Relevance outweighs perceived link strength
    Links from well-known but topically irrelevant sites likely contribute little or nothing compared to links from closely related or adjacent topic sites.

Modern link evaluation is about topical proximity, not “trust” or raw link counts. Search systems measure how close a site is to trusted seed sites within its own topic neighborhood, which means relevant links from smaller, niche sites can matter more than links from famous but unrelated domains.

This knowledge should enable smarter link building by focusing efforts on contextually relevant websites that may actually strengthen relevance and rankings, instead of chasing outdated link authority scores that no longer reflect how search works.

Featured Image by Shutterstock/Kues

What are permalinks? How to optimize them for SEO

If you’re planning to build a website or publish a blog post, you’ve probably heard the word permalink pop up. But what is a permalink, really? In simple words, it’s the permanent link to a page on your website, like the official street address of your house. No matter how many times you update your content, this link remains the same and tells people (and Google) exactly where that page is located. In this blog, we’ll break down what a permalink is, why it matters, how to pick the right permalink structure, and how Yoast SEO helps you manage everything easily.

Quick note: If your website is already established, changing existing permalinks can cause broken links and SEO issues. Don’t worry, we’ll show you how to do it safely later in this guide.

Table of contents

Key takeaways

  • A permalink is a permanent link to a specific page on your website, ensuring a stable URL even when content updates occur
  • Choosing the right permalink structure improves SEO by creating clean, readable, and memorable URLs
  • Changing existing permalinks can result in broken links; always set up redirects when updating them to maintain SEO value
  • Best practices include keeping permalinks short, using hyphens, and avoiding dates unless necessary for clarity
  • Tools like Yoast SEO help manage permalinks effectively and prevent 404 errors during changes

Before we go any deeper, let’s start with the basics: what is a permalink? A permalink (short for “permanent link”) is the stable URL that points to a specific page or post on your website. Think of it as the forever address of a piece of content. Even if you update the page, the permalink remains the same, ensuring that people and search engines can always find it.

💡 Fun fact:

Permalinks gained popularity around the early 2000s, when bloggers sought clean, permanent URLs instead of long, messy links filled with numbers and symbols. The idea quickly spread across blogging platforms, and that’s how permalinks became a standard part of the web.

A permalink is usually made up of two main parts:

  • Your domain name (like yourdomain.com)
  • The slug, which is the last part of the URL, tells people what the page is about

So a clean permalink might look like:

https://yourdomain.com/sponsored-tweets-guide

And it will always lead to that exact guide.

Compare that to a messy, auto-generated URL like:

https://yourdomain.com/post-id?=5726fjwenfkd

The first one is easier to read, easier to remember, more suitable for sharing, and more beneficial for SEO. That’s precisely why understanding what a permalink in WordPress is matters so much, especially as your site grows and you want your content to be easy to find.

Permalinks can also include categories or subfolders depending on your structure. For example:

https://yourdomain.com/blog/best-yoga-poses

No matter how your website changes on the backend, a permalink should always point to the same page. However, if you ever update your URL structure or change the slug, you’ll need to set up redirects; otherwise, the original permalink won’t work. We will discuss it further later in this blog post.

Clear, simple permalinks make your content easier to read, index, and trust. That’s why choosing a proper permalink structure early on matters so much.

Now that you know what a permalink is, let’s quickly break down what it’s actually made of. A permalink may seem simple on the outside, but each part of it has a specific purpose. Think of it like a small puzzle; every piece helps your browser understand exactly where to take you.

Let’s use a sample URL to make things easy:

https://www.example.com/blog/my-first-post

Here’s a brief overview of each element in a permalink:

Protocol (or scheme): This is the beginning of every URL, the http:// or https:// part. It tells your browser how to connect to a website.

http:// is the old, not-so-secure version
https:// is the secure, modern version that protects your data

Today, https:// is a must, especially if you care about trust, SEO, and safety.

Subdomain: This is the little prefix that comes before your main domain. In our example, it’s www. You can also use subdomains like shop.example.com or blog.example.com when you want to separate different sections of your website.

Root domain (or hostname): This is your main website address, the part you buy, like example.com.

It has two pieces:

→ The name you choose (example)
→ The extension that follows (.com, .org, .net, etc.)

Together, they form the foundation of your website’s identity.

Path (or slug): Everything that comes after the domain is usually the part WordPress users think of as the permalink.

In our example: /blog/my-first-post

→ The path (/blog/) shows the section or folder
→ The slug (my-first-post) is the unique part that describes the page

In WordPress, you can easily edit the slug for every post or page to make your permalink clean and SEO-friendly.

Parameters and anchors (optional extras): These parts don’t appear in every permalink, but when they do, they provide additional information.

Parameters come after a question mark, like ?page=2 or tracking codes like ?utm_source=email

Anchors are denoted by a hash (#), such as #comments, and direct the visitor to a specific section on the same page
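The anatomy above can be verified programmatically. Python’s standard urllib.parse splits a URL into exactly these pieces (the URL itself is just the running example):

```python
from urllib.parse import urlparse

url = "https://www.example.com/blog/my-first-post?page=2#comments"
parts = urlparse(url)

print(parts.scheme)    # https                -> protocol
print(parts.netloc)    # www.example.com      -> subdomain + root domain
print(parts.path)      # /blog/my-first-post  -> path, ending in the slug
print(parts.query)     # page=2               -> parameters
print(parts.fragment)  # comments             -> anchor
```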

You may wonder how a URL differs from a permalink. They look similar, they point to web pages, and they both live in your browser’s address bar, so what sets them apart? The easiest way to understand it is this: every permalink is a URL, but not every URL is a permalink.

Must read: Best practices for SEO-friendly URLs

Static URLs vs. dynamic URLs

To understand the difference better, URLs can be split into two types:

Static URLs:

These remain the same and always direct you to the same page. Example: yourdomain.com/blog/how-to-bake-sourdough. This is a static URL, and yes, it’s also a permalink.

Dynamic URLs:

These change depending on user actions and typically include additional parameters, such as ?page=2 or ?color=blue. Example: yourdomain.com/products/shirt?color=blue&size=large. Dynamic URLs are not considered permalinks because search engines treat each version with different parameters as a separate page.

Not every URL qualifies as a permalink. Here’s why:

Dynamic URLs containing parameters

These URLs load content, but the added parameters make them temporary and subject to change. For example, yourdomain.com/blog?page=2 and yourdomain.com/best-yoga-poses?source=email.

Static URLs that don’t point to a specific page

These are still URLs, but they direct you to the homepage, root domain, or a general section, rather than a specific piece of content. So they aren’t considered permalinks. For example, yourdomain.com, www.yourdomain.com, and shop.yourdomain.com.
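Those two rules, no parameters and a path that points at specific content, can be captured in a small helper. The function and its name are a simplified sketch of the distinction, not an official definition:

```python
from urllib.parse import urlparse

def looks_like_permalink(url):
    """True if the URL is static (no parameters) and points at specific content."""
    p = urlparse(url)
    has_params = bool(p.query)                  # e.g. ?page=2 or ?color=blue
    has_content_path = p.path.strip("/") != ""  # root/homepage URLs don't qualify
    return not has_params and has_content_path

print(looks_like_permalink("https://yourdomain.com/blog/how-to-bake-sourdough"))  # True
print(looks_like_permalink("https://yourdomain.com/products/shirt?color=blue"))   # False
print(looks_like_permalink("https://yourdomain.com"))                             # False
```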

Now that you know what a permalink is and how it’s built, let’s talk about why it actually matters. Many people think permalinks are just tiny technical settings inside a content management system (CMS), but they play a much bigger role in how your website looks, feels, and performs on the SERPs.

Do check out: Features of the Google Search Engine Results Page (SERP)

Search engines, such as Google, pay close attention to your URLs. A clean permalink clearly indicates what your page is about, making it easier for your content to rank high on Google.

For example: yourdomain.com/blog/what-is-a-permalink vs. yourdomain.com/?p=123. The first one clearly explains the topic. The second one tells Google nothing.

A strong permalink structure helps with:

  • Keywords: If your slug includes your main keyword, Google gets instant context
  • Crawlability: Clear folders, such as /services/web-design/, help search engines understand your site’s hierarchy
  • Link equity: People are far more likely to link to clean URLs than long, confusing ones. More links = stronger SEO

Permalinks aren’t just for search engines; they also affect how real people feel when they visit your website.

Clean, readable URLs make your site look professional and trustworthy. When someone sees a link like /contact-us/ or /shop/, they instantly know where they’re going. However, when they encounter something like /c/post?id=72, it appears suspicious and difficult to understand.

Good permalinks help with:

  • Clarity: Users can guess the topic from just the URL
  • Confidence: A neat URL feels more trustworthy than a random string of numbers
  • Memorability: Simple slugs are easy to remember or type again later

In short, clean permalinks create a smoother, friendlier experience for every visitor.

Your permalink structure is basically the map of your website. It shows how your content fits together, and it helps both users and search engines move around your site easily.

For example, a URL like: yourdomain.com/services/web-design/.

Immediately tells someone:

  • They’re in the Services section
  • They’re looking at the Web Design page

This clear parent–child relationship makes your site feel more organized. And when your site structure is clean, Google can crawl and understand your content much faster.

Choosing the right permalink structure early on keeps your website simple, safe, and easy to manage as it grows.

When you publish a new page or post in WordPress, the platform automatically creates a permalink for you. The problem? The default permalink isn’t great for SEO or user experience. The good news is that WordPress makes it super easy to change your permalink settings and choose a structure that works better for your website.

Changing your WordPress permalink structure is a simple three-step process. Here’s how you can do it:

Step 1: Log in to your WordPress dashboard and look for the Settings option on the left-hand menu

Step 2: Click on the ‘Permalinks’ option to open the permalink settings page

Step 3: Select your desired permalink format as per your needs

WordPress permalink options

Recommendations for each WordPress configuration

Each structure has its own purpose, so the best one for you depends on your SEO goals and the kind of content you publish. Here’s a quick and friendly breakdown:

Permalink settings and recommendations:

Day and name (https://example.com/2025/10/27/sample-post/)
Good for news-heavy sites. Ideal for publishers who post multiple updates daily. However, for most websites, adding the full date makes your content appear outdated too quickly.

Month and name (https://example.com/2025/10/sample-post/)
The verdict is the same as above. Slightly shorter, but it still adds a timestamp that most businesses might not need.

Numeric (https://example.com/archives/123)
Skip this one. Just another version of an unclear, non-descriptive link. It provides readers and search engines with no indication of what the page is about.

Post name (https://example.com/sample-post/)
The best option for 99% of websites. Clean. Short. Easy to read. Keyword-friendly. Perfect for SEO. If you’ve ever searched for ‘what is a permalink in WordPress’ or ‘what is a WordPress permalink’, this is the recommended format.

Custom structure
This allows you to create your own format using tags like /%category%/, /%postname%/, or /%author%/.
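To see how a custom structure expands into a final URL, here is a toy version of the tag substitution WordPress performs internally. The helper function and post data are invented for illustration:

```python
def expand_permalink(structure, post):
    """Replace WordPress-style %tags% in a custom permalink structure."""
    url = structure
    for tag, value in post.items():
        url = url.replace(f"%{tag}%", value)
    return url

post = {"category": "recipes", "postname": "chocolate-cake", "author": "jane"}
print(expand_permalink("/%category%/%postname%/", post))
# /recipes/chocolate-cake/
```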

Although we refer to them as permalinks or permanent links, there may be instances when you need to update them. Maybe a page title has changed, maybe you’re fixing your site structure, or maybe you’re cleaning up old URLs. Whatever the reason, changing a permalink isn’t something you should do casually; one wrong move can lead to 404 errors.

So before we talk about how to change a permalink in WordPress, it’s important to understand when you should change it, why it matters, and what the possible impact might be.

Even though permalinks are meant to stay the same, there are situations where changing them makes sense. Here are the most common use cases where updating a permalink is not only acceptable, but sometimes necessary.

When you’re switching away from default “plain” URLs

If your site started with WordPress’s default “plain” URLs (like /?p=123), you’ll quickly realize they don’t describe the content, which makes ranking harder. Switching to a clean, keyword-rich structure helps search engines better understand your page.

Example:

Before: yourdomain.com/?p=245
After: yourdomain.com/how-to-start-a-blog

If your permalink doesn’t reflect your topic, fixing it may be a smart move.

When you’re improving user experience

Sometimes, older permalink formats are confusing or too long. Updating them to something short and clear makes URLs easier for people to read, remember, and share.

Users are much happier to click something like yourdomain.com/blue-dress instead of yourdomain.com/products/?id=blue&ref=123.

Clean permalinks help establish trust, which in turn leads to more clicks.

Also read: SEO Basics: What are user signals?

When your content feels “dated” because of the URL

If your permalink includes the year or full date (like news-style URLs), users may assume the content is outdated, even if the blog post remains relevant. Switching to a timeless structure can improve click-through rates; for example, use yourdomain.com/best-seo-tips/ instead of yourdomain.com/best-seo-tips-2025/.

When you’re rebranding or restructuring your site

If you are rebuilding your website, changing domain names, or reorganizing categories, ensure that your permalinks align with your new structure. This keeps your content consistent and prevents confusion.

When you’re moving to HTTPS

Switching from HTTP to HTTPS is a major security upgrade, and it affects your permalinks. It requires redirects to make sure your old links still work.

Also read: HTTP status codes and what they mean for SEO

When you inherit or audit an old website

If the previous owner used messy or unclear permalinks, updating them can help you improve SEO, rebuild trust, and create a more organised structure.

Changing permalinks without a plan can cause serious problems, especially for SEO. Since permalinks function like permanent addresses, updating them incorrectly can break links throughout your entire website.

Here’s what can go wrong:

  • You may trigger 404 errors: This happens when the old URL no longer exists, and you haven’t added a redirect. Too many 404s hurt both user experience and SEO
  • You can lose rankings: If you change a permalink without a 301 redirect, Google treats the new URL as a brand-new page, causing drops in traffic and lost link value
  • Internal links can break: Any links inside your own site that point to the old URL will stop working unless they’re updated or redirected
  • External links stop sending traffic: Backlinks from other websites, emails, or social posts will lead to broken pages if redirects aren’t in place

Also read: Clean up your bad backlinks

If you ever need to update a permalink, you shouldn’t jump straight in. There’s a simple three-step process that keeps your website safe, your rankings stable, and your visitors away from 404 errors. Think of it as your mini checklist for making permalink changes the right way.

Step 1: Back up your website (just to be safe)

Before touching your URL structure, always create a full backup. If anything goes wrong, you can restore your site in seconds, rather than trying to fix broken links one by one.

Step 2: Set up a 301 redirect for the old URL

This is the most important step. A 301 redirect informs Google and your visitors that your page has been permanently moved. It redirects everyone to the new permalink and retains nearly all of your SEO value.

| Without a 301 redirect | With a 301 redirect |
| --- | --- |
| Your old link becomes a 404 | Your traffic stays safe |
| Your new link loses rankings | Your SEO strength moves with the new URL |
| Backlinks pointing to the old URL lose their power | Google updates its index to the new permalink over time |
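Conceptually, a redirect layer is just a lookup table from retired paths to their replacements, answered with a 301 status code. The sketch below illustrates that idea in plain Python with made-up paths; it is not how the Yoast plugin is implemented:

```python
# Hypothetical map of retired permalinks to their replacements
REDIRECTS = {
    "/best-seo-tips-2025/": "/best-seo-tips/",
    "/?p=245": "/how-to-start-a-blog/",
}

def resolve(path):
    """Return (status, location) the way a web server's redirect rules would."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]   # permanent redirect: SEO value follows
    return 200, path                  # no rule: serve the page as requested

print(resolve("/best-seo-tips-2025/"))  # (301, '/best-seo-tips/')
print(resolve("/contact-us/"))          # (200, '/contact-us/')
```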

You can set up redirects manually, but this usually requires knowledge of databases or cPanel. Thankfully, WordPress plugins make it easy, and this is where Yoast SEO Premium becomes incredibly helpful.

The plugin’s redirect manager feature automatically creates a 301 redirect every time you change a URL or move/delete a page. So even if you forget to set up a redirect, Yoast handles it for you and protects your SEO behind the scenes.

Here’s how the Yoast SEO Premium plugin takes the stress out of the process:

  • Automatically creates redirects when you change or delete a URL
  • Prevents 404 errors by forwarding visitors to the correct page
  • Lets you choose the right redirect type (301, 302, 307, 410, etc.)
  • Organises all redirects in one clean dashboard
  • Supports advanced options like REGEX redirects and import/export

With Yoast SEO Premium, you don’t have to remember any of these steps. You change the permalink, and the plugin handles the redirect instantly, keeping your SEO, structure, and user experience intact.

A smarter analysis in Yoast SEO Premium

Yoast SEO Premium has a smart content analysis that helps you take your content to the next level!

Once your redirects are ready, you can safely update your permalink in the WordPress editor or change the global permalink settings. At this point, you won’t break any links because your redirects are already in place.

A well-structured permalink saves you from future headaches. Here are the best practices to follow every time you create a new permalink on your WordPress site.

Keep your permalinks short and simple

A slug should not look like a full sentence. It should act like an address that clearly tells users and search engines what the page is about. Shorter permalinks are easier to read, share, and understand. For example, /improve-seo-2025/

Use your target keyword naturally

Your main keyword should appear in the slug, but only once. This helps search engines identify the topic without making your URL look spammy. For example, /what-is-a-permalink/.

Use hyphens to separate words

Hyphens are the correct standard for URLs. Search engines read them as natural separators between words.

Avoid:

  • Underscores (my_post)
  • Spaces (my%20post)
  • Words combined without separation (mypost)

Always write them like this: yourdomain.com/chocolate-cake

Use dates with care

As mentioned earlier, dates can be particularly helpful for news sites or when covering strictly time-sensitive topics. However, for most blogs and business websites, dates in permalinks can make your content appear outdated, even when it remains relevant. Therefore, when possible, choose evergreen URLs, such as /best-yoga-poses/.

Use lowercase letters in all URLs

Since URLs can be case-sensitive, uppercase and lowercase versions of the same slug may be treated as distinct pages. This can cause duplicate content issues. Always stick to lowercase, such as: yourdomain.com/types-of-tea.

Encode special characters and emojis

If your slug includes accented characters (ä, å, ö) or emojis (which you should avoid using), they must be encoded using UTF-8. Without encoding, browsers may break the URL or cause crawl errors.

Encoding ensures the permalink displays correctly and remains accessible to search engines.
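The slug rules above, lowercase letters, hyphens as separators, and no unencoded special characters, can be rolled into one small helper. This is a generic sketch, not the slugifier WordPress itself uses:

```python
import re
import unicodedata

def slugify(title):
    """Turn a post title into a clean, lowercase, hyphen-separated slug."""
    # Fold accented characters to their closest ASCII equivalents (ä -> a)
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode("ascii")
    text = text.lower()
    text = re.sub(r"[^a-z0-9]+", "-", text)  # spaces, underscores, symbols -> hyphen
    return text.strip("-")                   # no leading or trailing hyphens

print(slugify("What is a Permalink?"))    # what-is-a-permalink
print(slugify("Bäst Yoga_Poses  2025!"))  # bast-yoga-poses-2025
```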

Be consistent with your URL structure

Whether you are using /blog/%postname%/ or placing all service pages under /services/%postname%/, choose one approach and follow it for the entire site; consistency improves navigation, user experience, and SEO.

Your permalink structure may appear simple on the surface, but it silently supports the way users and search engines interact with your site. A clear and consistent URL helps readers understand your content, builds trust, reduces confusion, and prevents the need for constant changes later. It also provides search engines with a clear path to follow, which enhances your overall SEO performance.

By keeping permalinks short, descriptive, and keyword-focused, avoiding unnecessary stop words, choosing hyphens, maintaining consistency in formats, and using lowercase letters, you create a structure designed to last. These small choices make your URLs easy to share, maintain, and understand by Google.

With the right permalink strategy and the help of tools like Yoast SEO for managing redirects and site structure, you can build a solid foundation that supports your content for years to come.

Your URLs are not just technical details. They are part of your site’s identity. Treat them with care, and they will continue to guide search engines and users to the right place every time.

Google-Engaged Audience: Worry-Free Remarketing, Or A Waste Of Money?

Are you tired of remarketing headaches in your Google Ads account? In a time when we’re all facing increasing privacy restrictions, browser setting changes, and complex tracking setups, building reliable audiences can feel overwhelmingly difficult. What you may not know is that Google quietly launched a new type of “Your data segment” called the Google-engaged audience last year – and it’s still so underrated.

Available to every Google Ads account, this segment represents an elegant solution to a complicated problem. But for advanced Google Ads specialists who typically demand granular control and deep data insights, the simplicity of this audience raises a pivotal question: Is this worry-free segment a reliable source of high-quality traffic? Or will the Google-engaged audience potentially waste your time and budget?

In this article, I’ll share exactly what the Google-engaged audience is and how it works, original data comparing the Google-engaged audience to other website-based remarketing solutions, and when this segment may (or may not) make sense for your Google Ads strategy.

What Is The Google-Engaged Audience?

The Google-engaged audience is the newest type of “Your data segment” available in Google Ads. I love using and recommending this audience segment because it elegantly solves many of the complex implementation issues associated with traditional remarketing solutions.

Here’s how it works: Every Google Ads account is automatically populated with one Google-engaged audience segment. You can find yours under Tools > Shared Library > Audience Manager > Your data segments. Critically, the Google-engaged audience requires no Google tag, no account linking, and no data uploads.

Instead, this segment populates whenever a user clicks through to your website from a Google property, such as a Google Search result.

Why The Google-Engaged Audience Is So Powerful

The Google-engaged audience is helpful for small business owners because they don’t need to install the Google tag, connect Google Analytics, or sync their CRM with Google in order to start remarketing. It’s just there, there’s just one, it just works.

But small business owners aren’t the only ones who should be looking into using this audience type. Since users join this list when they click to your website from a Google-owned property, Google knows exactly who these users are (most are signed in to Google). Google has a first-party relationship with these users.

Because Google handles user consent and tracking within its own ecosystem, and “captures” those users for you before they leave the Google ecosystem, you get a high-quality audience that is generally more reliable and robust than third-party solutions, which suffer from challenges around browser settings, privacy controls, and consent management frameworks.

In short, it’s an easy-to-use, high-quality audience of people who visited your website from Google.

Where The Google-Engaged Audience Falls Short

Despite its clear benefits in data quality and ease of implementation, the Google-engaged audience does have some limitations that may make it unsatisfying for you to use.

The first constraint is the obvious one: This audience segment only tracks people who click to your website from Google-owned properties. This means that your Google-engaged audience will not capture everyone who visits your website from other sources, such as:

  • Direct traffic.
  • Social media traffic.
  • Non-Google paid ads (Meta, TikTok, etc.).
  • Email traffic.
  • Any other non-Google source.

If a significant portion of your website traffic is not coming from Google, then your Google-engaged audience may not be as useful for your campaigns.

Next, the Google-engaged audience is not compatible with the Google Display Network (GDN). This is because the GDN is mostly made up of non-Google-owned properties, so Google doesn’t have as robust audience data about those users. This means that you can’t use this audience in a standard Display campaign, and you can’t use it on Display inventory within other campaign types, such as Search, Demand Gen, or Video campaigns. Keep this in mind if a significant portion of your Google Ads investment is going towards the GDN.

Finally, while the simplicity of one single Google-engaged audience may be welcomed by small business owners, it doesn’t afford the granularity that large advertisers may crave. Since every account receives only one Google-engaged audience segment, there is no built-in mechanism to create specific segments based on when the user visited, what pages they visited, what actions they took, etc., unlike the granular options available with tag-based lists or Google Analytics lists.

How Does The Google-Engaged Audience List Size Compare To Other Types Of Remarketing Lists?

To provide a data-driven perspective on the usefulness of the Google-engaged audience, I conducted an original study comparing the size of the Google-engaged audience to two other types of “Your data segments” across a dozen advertisers: the “All Visitors” list from the Google Ads tag, and the standard “All Users” list from Google Analytics 4 (GA4). When comparing all three lists for the same advertiser, which list was the largest? Which was the smallest? How did this vary across Google’s inventory?

Google-Engaged Audiences Are Generally Larger Than Google Tag-Based Audiences

In my study, I found that the tag-based All Visitors list was usually significantly smaller than the Google-engaged audience, across all eligible inventory.

On average, the Google tag-based remarketing audience was:

  • 62% smaller than the Google-engaged audience for Search inventory.
  • 61% smaller on YouTube inventory.
  • 90% smaller on Gmail inventory.

The takeaway: If you’re relying exclusively on the Google tag for your remarketing, you are likely missing out on a lot of users. This issue is likely exacerbated if you are not using data-preserving solutions like enhanced conversions or Consent Mode.

Google-Engaged Audiences Are Generally Smaller Than Google Analytics Audiences

My study found that the Google-engaged audience was smaller than the Google Analytics default “All Users” list on Search and YouTube, but not on Gmail.

On average, the GA4 “All Users” audience was:

  • 28% larger than the Google-engaged audience for Search inventory.
  • 46% larger on YouTube inventory.
  • 10% smaller on Gmail inventory.

This is more in line with what I would have expected, since GA4 captures audiences from all sources, not just from Google-owned properties. In fact, I would have expected the difference to be even larger, and was surprised by how robust the Google-engaged audience is on Gmail inventory.

Remember, the “size by inventory” looks at how many active matched records Google can find on Search, on YouTube, on Gmail, and on the Display network. While Google seems to match users quite nicely in Search and YouTube, it seems more difficult for the system to match users on Gmail – unless, of course, they’re coming from the Google-engaged audience, where Google already knows exactly who they are. I call this the “Gmail dropoff.”
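
The reported averages can be turned into relative sizes with simple arithmetic. Here is a quick sketch in Python, normalizing the Google-engaged audience to 1.0 for each inventory type (the percentages are the study’s averages across a dozen advertisers, not universal constants):

```python
# Relative audience sizes implied by the averages above, with the
# Google-engaged audience normalized to 1.0 for each inventory type.
# "X% smaller" -> multiply by (1 - X/100); "X% larger" -> (1 + X/100).
relative_size = {
    # inventory: (Google tag "All Visitors", GA4 "All Users")
    "Search":  (1 - 0.62, 1 + 0.28),
    "YouTube": (1 - 0.61, 1 + 0.46),
    "Gmail":   (1 - 0.90, 1 - 0.10),
}

for inventory, (tag, ga4) in relative_size.items():
    print(f"{inventory}: engaged=1.00, tag={tag:.2f}, GA4={ga4:.2f}")
```

Read this way, the “Gmail dropoff” stands out immediately: the tag-based list retains only about a tenth of the engaged audience’s matched size on Gmail, and even the GA4 list falls below it there.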

The takeaway: The Google Analytics audience makes a good default for website-based remarketing. If you are running a Demand Gen campaign, however, and are explicitly looking to remarket to users via Gmail, consider adding the Google-engaged audience to your audience targeting alongside your Google Analytics audience.

Is The Google-Engaged Audience A Waste Of Money?

Absolutely not! I’ve seen dozens of my Google Ads coaching clients see great results when targeting the Google-engaged audience, specifically in Demand Gen campaigns where the focus is on Google-owned inventory. I’ve seen this audience segment be especially useful for freelancers and agencies working with local service providers, since they can just check a box and get remarketing live without having to worry about tags or integrations.

You can also consider targeting, observing, or excluding your Google-engaged audience in Search and Shopping campaigns, as either a complement or replacement for how you would use a website-based remarketing list in your strategy.

I would not, however, recommend using the Google-engaged audience in your Performance Max audience signals, or as the seed list for a Lookalike, as it is too broad to be useful in those scenarios. In fact, I don’t recommend using any website-based remarketing in these scenarios; in my opinion, for an audience signal or seed list, you should only use an actual customer list.

To conclude, the Google-engaged audience is a clear example of worry-free remarketing. It is built on a durable foundation of Google’s own first-party data, bypassing the technical headaches and privacy challenges associated with traditional tag-based remarketing. It is especially useful for small business owners, but can also be helpful for all practitioners running Demand Gen campaigns due to its advantages on Gmail inventory. When in doubt, layer the Google-engaged audience alongside your existing Google tag or Google Analytics-based website remarketing segments in your Search, Shopping, Demand Gen, or Video campaigns.

Featured Image: Roman Samborskyi/Shutterstock

Improve Any Link Building Strategy With One Small Change via @sejournal, @martinibuster

Link building outreach is not just blasting out emails. There’s also a conversation that happens when someone emails you back with a skeptical question. The following are tactics to use for overcoming skeptical responses.

In my opinion, it’s always a positive sign when someone responds to an email, even if they’re skeptical. I consider nearly all email responses to be indicators that a link is waiting to happen. This is why a good strategy that anticipates common questions will help you convert skeptical responses into links.

Many responses tend to be questions. What they are asking, between the lines, is for you to help them overcome their suspicions. Anytime you receive a skeptical response, try to view it as them asking you, “Help me understand that you are legitimate and represent a legitimate website that we should be linking to.”

The question is asked between the lines, and the answer should be delivered the same way. Ninety-nine percent of the time, a between-the-lines question should not be answered directly; the best way to address the underlying concern is to answer it in the same register you received it, between the lines.

Common (and sometimes weird) questions that I used to get include:

  • Who are you?
  • Who do you work for?
  • How did you get my email address?

Before I discuss how I address those questions, I want to mention something important that I do not do: I do not try to actively convert the respondent in the first reply. When I respond to their reply to my outreach, I never ask them to link to the site.

The question of linking is already hanging in the air; it is the reason they emailed you, so there is no need to bring it up. If you ask them again to link to your site, you will tilt them back toward suspicion and raise the odds of losing the link.

In trout fishing, the successful angler crouches so that the trout does not see you. The successful angler may even wear clothing that helps them blend into the background. The best anglers imitate the crane, a fish-eating bird that stands perfectly still, imperceptibly inching closer to its prey. This is done to avoid being noticed. Your response should imitate the crane or the camouflaged angler. You should put yourself into the mindset of anything but a marketer asking for a link.

Your response must not be an immediate request for a link, because in my opinion that will simply lose the link. So don’t do it just yet.

Tribal Affinity

One approach that I used successfully is what I called the Tribal Affinity approach. For a construction/home/real estate campaign, I approached it with the mindset of a homeowner. I wouldn’t say that I’m a homeowner (even though I was); I would simply think in terms of what I would say as a homeowner contacting a company to suggest a real estate or home repair type of link. In the broken link or suggest-a-link strategy, I would say that the three links I am suggesting for their links page have been useful to me.

Be A Mirror

A useful variation on tribal affinity is to mirror the person I’m reaching out to, assuming their mindset. If they are a toy collector, then your mindset can also be that of a toy collector. If the outreach target is a club member, then your mindset can be that of an enthusiast of whatever the club is about. I never claim membership in any particular organization, club, or association; I limit my affinity to mirroring the same shared mindset as the person I’m contacting.

Assume The Mindset

Another approach is to assume the mindset of someone who happened upon the links page and noticed a broken link or a missing good-quality link. When you are in that mindset, the text of your email will read more naturally.

Thus, when someone challenges me by asking how I found their site or who I’m working for, my response is to stick to my homeowner mindset and answer accordingly.

What’s going on is that they’re not actually asking how you found their site. What they’re asking, between the lines, is whether you’re a marketer of some kind. You can go ahead and say yes, you are. Or you can respond between the lines and say that you’re just a homeowner. Up to you.

There are many variations to this approach. The important points are:

  • Responses that challenge you are not necessarily hostile but are often link conversions waiting to happen.
  • Never respond to a response by asking for a link.
  • Put yourself into the right mindset. Thinking like a marketer will usually lead to a conversion-dampening response.
  • Put yourself into the mindset that mirrors the person you outreach to.

Get into the mindset that gives you a plausible reason for finding their site and the best words for asking for a link will write themselves.

Featured Image by Shutterstock/Luis Molinero

Questions The CEO Should Be Asking About Their Website (But Rarely Does) via @sejournal, @billhunt

Few CEOs ever ask hard questions about their company website. They’ll sign off on multimillion-dollar redesigns, approve ad budgets, and endorse “digital transformation” plans, but rarely ask how much enterprise value their digital infrastructure is actually creating.

That’s a problem, because the website is no longer a marketing artifact. It’s the factory floor of digital value creation. Every lead, sale, customer interaction, and data signal runs through it. When the site performs well, it compounds growth. When it underperforms, it silently leaks shareholder value.

Executives don’t need to understand HTML or crawl budgets, but they do need to ask sharper questions: the kind that expose hidden risk, surface inefficiencies, and align digital investments with measurable business outcomes. In the age of AI-driven search, where visibility and trust are determined algorithmically, these questions aren’t optional. They’re fiduciary.

Why CEOs Must Ask – Even If SEOs Believe It Is “Beneath” Them

There’s a persistent misconception in digital circles: that CEOs shouldn’t concern themselves with SEO, site performance, or technical issues. “That’s marketing’s job,” people say. But the truth is, these issues directly affect the metrics that boards and investors care about most – operating margin, revenue growth, capital efficiency, and risk mitigation.

When a website is treated as an expense line rather than a capital asset, accountability disappears. Teams chase traffic over value, marketing spend rises to offset organic losses, and executives are left with fragmented data that hides the real cost of inefficiency.

A CEO’s job isn’t to approve color palettes or keyword lists. It’s to ensure the digital infrastructure is producing measurable returns on invested capital just as they would for a factory, logistics system, or data center.

The Cost Of Not Asking

Every company has a “digital balance sheet,” even if it’s never been documented. Behind every campaign and click lies a network of dependencies, from page speed and content accuracy to structured data, discoverability, and cross-market alignment. When those systems falter, the losses are invisible but compounding:

  • Organic visibility declines, forcing paid media spend to rise.
  • Technical debt accumulates, slowing innovation.
  • AI search engines misattribute content or cite competitors instead.
  • Global teams duplicate content, fragmenting authority and wasting budget.

In one multinational I audited, over $5 million per month in paid search spend was compensating for lost organic traffic caused by broken hreflang tags and indexation gaps.

A similar disconnect played out publicly when the CMO of a major retail brand was asked during an earnings call about their online holiday strategy. He confidently declared, “As the largest reseller in our category, we’ll dominate the season online.” Within seconds, a reporter searched the category term, and the brand didn’t appear on page one. The CMO was stunned. He had assumed offline dominance guaranteed online visibility. It didn’t.

That thirty-second fact-check illustrated a billion-dollar truth: market leadership offline doesn’t ensure findability online. Without the right questions and governance, digital equity erodes silently until someone outside the company exposes it.

No CEO would tolerate that level of inefficiency in their supply chain. Yet it happens online every day, unnoticed, because few know which questions to ask.

The 10 Questions Every CEO Should Be Asking

These questions aren’t tactical; they’re financial. They surface whether the digital system that represents your brand to the world is operating efficiently, effectively, and in alignment with corporate goals.

  1. Are we treating the website as a capital asset or a cost center?
    Why it matters: Capital assets require lifecycle planning, maintenance, and reinvestment.
    Executive red flag: Budgets are reset annually with no cumulative accountability.

  2. What’s our digital yield – the value per visit or per impression?
    Why it matters: Links traffic and investment to tangible business outcomes.
    Executive red flag: Traffic grows, revenue stays flat.

  3. Where are we leaking value?
    Why it matters: Surfaces inefficiencies across SEO, paid, content, and conversion funnels.
    Executive red flag: Paid media dependency rises while organic visibility declines.

  4. How fast can we diagnose and fix a problem?
    Why it matters: Measures organizational agility and governance maturity.
    Executive red flag: Issues discovered only after quarterly reports.

  5. Do we have digital “command and control”?
    Why it matters: Reveals whether teams, agencies, and regions share accountability.
    Executive red flag: Multiple CMSs, duplicated content, and conflicting data.

  6. How does our web performance translate to shareholder metrics?
    Why it matters: Connects digital KPIs to ROIC and margin.
    Executive red flag: Dashboards report sessions, not value.

  7. Who owns web effectiveness?
    Why it matters: Ownership drives accountability and resourcing.
    Executive red flag: Everyone claims a piece; no one owns the outcome.

  8. Are we findable, understandable, and trusted by both humans and machines?
    Why it matters: Future-proofs the brand in AI-driven search.
    Executive red flag: Generative engines cite competitors, not us.

  9. How resilient is our digital ecosystem?
    Why it matters: Tests readiness for migrations, rebrands, and AI shifts.
    Executive red flag: Every platform change causes a traffic cliff.

  10. What are we learning from our data that informs decisions?
    Why it matters: Turns analytics into strategy, not hindsight.
    Executive red flag: Insights exist but never reach decision-makers.
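
Question 2’s “digital yield” is simple division: value attributed to the site over visits. A hypothetical sketch, where every figure is invented purely for illustration:

```python
# Hypothetical "digital yield" calculation: value created per visit.
# All figures below are invented for illustration.
monthly_visits = 1_200_000
attributed_revenue = 3_600_000   # revenue influenced by the site, in dollars
pipeline_value = 900_000         # qualified-lead pipeline value, in dollars

digital_yield = (attributed_revenue + pipeline_value) / monthly_visits
print(f"Digital yield: ${digital_yield:.2f} per visit")
```

Tracked quarter over quarter, a flat or falling yield while traffic grows is exactly the executive red flag this question is meant to surface.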

Each question reframes a “marketing” issue as a governance issue. When CEOs ask these questions, they encourage teams to think systemically, connecting content, code, and conversion as interdependent components of a single digital value chain.

From Questions To Action: Building A Culture Of Digital Accountability

Asking the right questions isn’t micromanagement – it’s leadership through intent.

When a CEO defines the Commander’s Intent for digital, it brings clarity of purpose, alignment of teams, and shared metrics, and it changes how the organization approaches the web. Instead of chasing redesigns or vanity KPIs, teams operate with a shared understanding:

“Our website’s job is to create enterprise value – measurable, sustainable, and scalable.”

That intent cascades into structure:

  • Visibility: Reporting evolves from traffic to contribution value.
  • Speed: Teams track time-to-detect and time-to-resolve issues.
  • Alignment: Marketing, IT, and product teams operate under a unified governance framework.

This is where the Web Effectiveness Score or Digital Value Creation Framework bridges web metrics (load time, index coverage) to enterprise KPIs (ROIC, margin, growth). Once that link is visible, executives start managing digital performance as a financial asset because it is.

The CEO’s Digital Playbook

CEOs who ask these questions consistently outperform those who don’t – not because they know more about SEO, but because they lead with system awareness. When they do:

  1. Wasted Spend Decreases.
    Duplicative content, overlapping agencies, and redundant tools are identified and rationalized.

  2. Visibility and Trust Increase.
    Content becomes findable, structured, and cited by both search engines and generative AI.

  3. Risk Declines.
    Technical debt, migration shocks, and compliance failures are detected early.

  4. Innovation Accelerates.
    Modular systems and shared data layers enable faster experimentation.

  5. Enterprise Value Compounds.
    Web performance improvements flow into revenue growth and cost efficiency.

This is the same logic CFOs apply to physical assets. The only difference is that digital assets rarely appear on the balance sheet, so their underperformance remains invisible until a crisis.

Why Now: The AI Search Inflection Point

The rise of generative search makes these questions urgent. Search is no longer a static list of links; it’s a recommendation system. AI engines evaluate authority, trust, and structured data across the web to synthesize answers.

If your website isn’t structured, trusted, and machine-readable, your company risks digital disintermediation and being invisible in the ecosystems that shape decisions. For CEOs, that’s not a marketing problem; it’s an enterprise risk.

As AI systems determine which brands get cited and recommended, your digital infrastructure becomes the new supply chain for relevance and reputation.

Final Thought

The CEOs who win the next decade won’t outspend their competitors – they’ll out-align them. They’ll treat digital infrastructure with the same financial discipline as physical assets, measure contribution instead of activity, and lead teams to think in systems rather than silos.

Every boardroom already measures financial capital. It’s time to start measuring digital capital, and your website is where it compounds.

In the AI era, your website isn’t just how people find you.
It’s how machines define you.

Featured Image: Master1305/Shutterstock

Five Ways To Boost Traffic To Informational Sites via @sejournal, @martinibuster

Informational sites can easily decline into a crisis of search visibility. No site is immune. Here are five ways to manage content to maintain steady traffic, increase the ability to adapt to changing audiences, and make confident choices that help the site maintain growth momentum over time.

1. Create A Mix Of Content Types

Publishers are in a constant race to publish what’s latest because being first to publish can be a source of massive traffic. The problem is that a publication built on current events faces sustainability risks:

  • Current events quickly become stale and no longer relevant to an audience.
  • Unforeseen events like an industry strike, accidents, world events, and pandemics can disrupt interest in a topic.

The focus then is to identify content topics that are reliably relevant to the website’s current audience. This kind of content is called evergreen content, and it can form a safety net of reliable traffic that can sustain the business during slow cycles.

An example of the mixed approach to content that comes to mind is how the New York Times has a standalone recipes section on a subdomain of the main website. It also has a category-based section dedicated to gadget reviews called The Wirecutter.

Another example is the entertainment niche, where sites publish not just industry news but also interviews with stars and essays about popular movies. Music websites publish the latest news, but also content built on snippets from interviews in which famous musicians make interesting statements about songs, inspirations, and cultural observations.

Rolling Stone magazine publishes content about music but also about current events like politics that align with their reader interests.

All three of those examples expand into adjacent topics in order to attract steady, reliable traffic.

2. Evergreen Content Also Needs Current Event Topics

Conversely, evergreen topics can generate new audience reach and growth by expanding to cover current events. Content sites about recipes, gardening, home repairs, DIY, crafts, parenting, personal finance, and fitness are all examples of topics that feature evergreen content and can also expand to cover current events. The flow of traffic derived from trending topics is an excellent source of devoted readers who return to read evergreen content and end up recommending the site to friends for both current events and evergreen topics.

Current events can be related to products and even to statements by famous people. If you enjoy creating content or making discoveries, then you’ll enjoy the challenge of discovering new sources of trending topics.

If you don’t already have a mix of evergreen and ephemeral content, then I would encourage you to seek opportunities to focus on those kinds of articles. They can help sustain traffic levels while feeding growth and life into the website.

3. Beware Of Old Content

Google evaluates the total content of a website in order to generate a quality score. Google is vague about these whole-site evaluations. We only know that they do it and that a good evaluation can have a positive effect on traffic.

However, what happens when the site becomes top-heavy with old, stale content that’s no longer relevant to site visitors? This can become a drag on a website. There are multiple ways of handling this situation.

Content that is completely out of date, of no interest to anyone, and therefore no longer useful should be removed. The criterion for judging content is usefulness, not age. The reason to prune such content is that a whole-site evaluation may conclude that most of the website is composed of unhelpful, outdated web pages, which could be a drag on site performance.

There’s nothing inherently wrong with old content as long as it’s useful. For example, the New York Times keeps old movie reviews in archives that are organized by year, month, day, category, and article title.

The URL slug for the movie review of E.T. looks like this:  /1982/06/11/movies/et-fantasy-from-spielberg.html

Screenshot Of Archived Article

Take Decisive Steps

  • Useful historical content can be archived.
  • Older content that is out of date can be rehabilitated.
  • Content that’s out of date and has been superseded by new content can be redirected with a 301 response code to the new content.
  • Content that is out of date and objectively useless should be removed from the website and allowed to show a 404 response code.

4. Topic Interest

Something that can cause traffic to decline on an informational site is waning interest. Technological innovation can cause the popularity of another product to decline, dragging website traffic along with it. For example, I consulted for a website that reported its traffic was declining. The site still ranked for its keywords, but a quick look at Google Trends showed that interest in the website topic was declining. This was several months after the introduction of the iPhone, which negatively impacted a broad category of products that the website was centered on.

Always keep track of how interested your audience is in your topic. Follow influencers in your niche topic on social media to gauge what they are talking about and whether there are any shifts in the conversation that indicate waning interest or growing interest in a related topic.

Always try out new subcategories of your topic that cross over with your readership to see if there is an audience there that can be cultivated.

Another nuance to consider is the difference between temporary dips in interest and long-term structural decline. Some topics experience predictable cycles driven by seasons, economic conditions, or news coverage, while others face permanent erosion as user needs change or alternatives emerge. Misreading a cyclical slowdown as a permanent decline can lead to unnecessary pivots, while ignoring structural shifts can leave a site over-invested in content that no longer aligns with how people search, buy, or learn.

Monitoring topic interest is less about reacting to short-term fluctuations and more about keeping aware of topical interest and trends. By monitoring audience behavior, tracking broader trends, and experimenting at the edges of the core topic, an informational site can adjust gradually rather than being forced into abrupt changes after traffic has already declined. This ongoing attention helps ensure that content decisions remain grounded in how interest evolves over time.

5. Differentiate

Something that happens to a lot of informational websites is that competitors in a topic tend to cover the exact same stories and even have similar styles of photos, about pages, and bios.

B2B software sites have images of people around a laptop, images of a serious professional, and people gesturing at a computer or a whiteboard.

Recipe sites feature the Flat Lay (food photographed from above), the Ingredient Still Life portrait, and action shots of ingredients grated, sprinkled, or caught in midair.

Websites tend to converge into homogeneity in the images they use and the kind of content that’s shared, based on the idea that if it’s working for competitors, then it may be a good approach. But sometimes it’s best to step out of the pack and do things differently.

Evolve your images so that they stand out or catch the eye, try a different way of communicating your content, identify the common concept that everyone uses, and see if there’s an alternate approach that makes your site more authentic.

For example, a recipe site can show photographic bloopers or discuss what can go wrong and how to fix or avoid it. Being real is authentic. So why not show what underbaked looks like? Instagram and Pinterest are traffic drivers, but does that mean all images must be impossibly perfect? Maybe people might respond to the opposite of homogeneity and fake perfection.

The thing that’s almost always missing from product reviews is photos of the testers actually using the products. Is it because the reviews are fake? Hm… Show images of the products with proof that they’ve been used.

Takeaways

  • Sustainable traffic can be cultivated with a mix of evergreen and timely content. Find the balance that works for your website.
  • Evergreen content performs best when it is periodically refreshed with up-to-date details.
  • Outdated content that lacks utility or meaning in people’s lives can quietly grow to suppress site-wide performance. Old pages should be reviewed for usefulness and then archived, updated, redirected, or removed.
  • Audience interest in a topic can decline even if rankings remain stable. Monitoring search demand and cultural shifts helps publishers know when it’s time to expand into adjacent topics before traffic erosion becomes severe.
  • Differentiation matters as much as coverage. Sites that mirror competitors in visuals, formats, and voice risk blending into sameness, while original presentation and proof of authentic experience build trust and attention.

Search visibility declines are rarely caused by a single technical flaw or an isolated content mistake; they come from gradual misalignment between what a site publishes and what audiences continue to value. Sites that rely too heavily on fleeting interest, allow outdated material to accumulate, or follow competitors into visual and editorial homogeneity risk signaling mediocrity rather than relevance. Sustained performance depends on actively managing content: balancing evergreen coverage with current events, pruning what’s no longer useful, and making deliberate choices that distinguish the site as useful, authentic, and credible.

Featured Image by Shutterstock/Sergey Nivens

Quantum navigation could solve the military’s GPS jamming problem

In late September, a Spanish military plane carrying the country’s defense minister to a base in Lithuania was reportedly the subject of a kind of attack—not by a rocket or anti-aircraft rounds, but by radio transmissions that jammed its GPS system. 

The flight landed safely, but it was one of thousands that have been affected by a far-reaching Russian campaign of GPS interference since the 2022 invasion of Ukraine. The growing inconvenience to air traffic and risk of a real disaster have highlighted the vulnerability of GPS and focused attention on more secure ways for planes to navigate the gauntlet of jamming and spoofing, the term for tricking a GPS receiver into thinking it’s somewhere else. 

US military contractors are rolling out new GPS satellites that use stronger, cleverer signals, and engineers are working on providing better navigation information based on other sources, like cellular transmissions and visual data. 

But another approach that’s emerging from labs is quantum navigation: exploiting the quantum nature of light and atoms to build ultra-sensitive sensors that can allow vehicles to navigate independently, without depending on satellites. As GPS interference becomes more of a problem, research on quantum navigation is leaping ahead, with many researchers and companies now rushing to test new devices and techniques. In recent months, the US’s Defense Advanced Research Projects Agency (DARPA) and its Defense Innovation Unit have announced new grants to test the technology on military vehicles and prepare for operational deployment. 

Tracking changes

Perhaps the most obvious way to navigate is to know where you started and then track where you go by recording the speed, direction, and duration of travel. But while this approach, known in the field as inertial navigation, is conceptually simple, it’s difficult to do well; tiny uncertainties in any of those measurements compound over time and lead to big errors later on. Douglas Paul, the principal investigator of the UK’s Hub for Quantum Enabled Position, Navigation & Timing (QEPNT), says that existing specialized inertial-navigation devices might be off by 20 kilometers after 100 hours of travel. Meanwhile, the cheap sensors commonly used in smartphones produce more than twice that level of uncertainty after just one hour.

“If you’re guiding a missile that flies for one minute, that might be good enough,” he says. “If you’re in an airliner, that’s definitely not good enough.” 
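As a rough illustration of how those tiny measurement errors compound, here is a minimal Python sketch of dead-reckoning drift, assuming nothing more than a constant accelerometer bias. The bias value is illustrative, not taken from any real device.

```python
# Toy dead-reckoning model: a constant accelerometer bias b, integrated
# twice, produces a position error of 0.5 * b * t^2 -- so tiny errors
# compound quadratically with time. The bias below is an illustrative
# navigation-grade figure, not a spec from any specific device.
G = 9.81  # m/s^2

def drift_error_m(bias_m_s2: float, seconds: float) -> float:
    """Position error (metres) from a constant accelerometer bias."""
    return 0.5 * bias_m_s2 * seconds ** 2

bias = 0.05e-6 * G  # a 0.05-microgravity bias, ~4.9e-7 m/s^2

for hours in (1, 10, 100):
    err_km = drift_error_m(bias, hours * 3600) / 1000
    print(f"{hours:>3} h -> ~{err_km:.1f} km of drift")
```

With this assumed bias, the error is a few metres after an hour but roughly 30 kilometers after 100 hours, the same order of magnitude as the figure Paul cites for specialized devices.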

A more accurate version of inertial navigation instead uses sensors that rely on the quantum behavior of subatomic particles to more accurately measure acceleration, direction, and time.

Several companies, like the US-based Infleqtion, are developing quantum gyroscopes, which track a vehicle’s bearing, and quantum accelerometers, which can reveal how far it’s traveled. Infleqtion’s sensors are based on a technique called atom interferometry: A beam of rubidium atoms is zapped with precise laser pulses, which split the atoms into two separate paths. Later, other laser pulses recombine the atoms, and they’re measured with a detector. If the vehicle has turned or accelerated while the atoms are in motion, the two paths will be slightly out of phase in a way the detector can interpret. 
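The sensitivity of this scheme can be sketched with the standard textbook phase formula for light-pulse atom interferometers. The parameter values below are illustrative assumptions, not Infleqtion's actual device specifications.

```python
import math

# Standard textbook scaling for a light-pulse atom interferometer:
# the phase difference between the two atomic paths is
#     delta_phi = k_eff * a * T**2
# where k_eff is the effective laser wavevector, a the acceleration,
# and T the time between laser pulses. Values are illustrative only.

WAVELENGTH = 780e-9                      # rubidium D2 line, metres
K_EFF = 2 * (2 * math.pi / WAVELENGTH)   # two-photon Raman transition

def phase_shift(accel_m_s2: float, pulse_sep_s: float) -> float:
    """Interferometer phase (radians) for a given acceleration."""
    return K_EFF * accel_m_s2 * pulse_sep_s ** 2

# Even a micro-g acceleration over a 10 ms pulse separation
# produces a measurable fraction of a radian:
print(phase_shift(9.81e-6, 10e-3))  # ~0.016 rad
```

The quadratic dependence on pulse separation is why the technique is so sensitive, and also why longer measurement intervals (and the dead times between pulses) matter so much.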

Last year the company trialed these inertial sensors on a customized plane flying at a British military testing site. In October of this year, Infleqtion ran its first real-world test of a new generation of inertial sensors that use a steady stream of atoms instead of pulses, allowing for continuous navigation and avoiding long dead times.

A view of Infleqtion’s atomic clock Tiqker.
COURTESY INFLEQTION

Infleqtion also has an atomic clock, called Tiqker, that can help determine how far a vehicle has traveled. It is a kind of optical clock that uses infrared lasers tuned to a specific frequency to excite electrons in rubidium, which then release photons at a consistent, known rate. The device “will lose one second every 2 million years or so,” says Max Perez, who oversees the project, and it fits in a standard electronics equipment rack. It has passed tests on flights in the UK, on US Army ground vehicles in New Mexico, and, in late October, on a drone submarine.

“Tiqker operated happily through these conditions, which is unheard-of for previous generations of optical clocks,” says Perez. Eventually the company hopes to make the unit smaller and more rugged by switching to lasers generated by microchips. 
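For a sense of scale, the quoted stability ("one second every 2 million years or so") converts into the fractional frequency error metrologists usually quote:

```python
# Back-of-the-envelope conversion of the quoted clock stability into
# a fractional frequency error. Figures are rough, matching the
# approximate claim in the text.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7

def fractional_error(seconds_lost: float, years: float) -> float:
    """Fraction of a second lost per second of elapsed time."""
    return seconds_lost / (years * SECONDS_PER_YEAR)

print(f"{fractional_error(1, 2e6):.1e}")  # ~1.6e-14
```

That is roughly one part in 10^14, a useful benchmark when comparing clock generations.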

Magnetic fields

Vehicles deprived of satellite-based navigation are not entirely on their own; they can get useful clues from magnetic and gravitational fields that surround the planet. These fields vary slightly depending on the location, and the variations, or anomalies, are recorded in various maps. By precisely measuring the local magnetic or gravitational field and comparing those values with anomaly maps, quantum navigation systems can track the location of a vehicle. 
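The map-matching idea can be sketched in a few lines. Real systems fuse a whole track of measurements with an inertial estimate; this single-measurement lookup against a hypothetical anomaly map only shows the core comparison.

```python
# Schematic sketch of magnetic map-matching: compare a measured field
# value against a stored anomaly map and pick the best-fitting location.
# The map below is entirely hypothetical.

# Hypothetical 1-D anomaly map: position along a route (km) -> anomaly (nT)
ANOMALY_MAP = {0: 12.0, 10: -3.5, 20: 8.2, 30: -1.1, 40: 15.4}

def locate(measured_nt: float) -> int:
    """Return the map position whose recorded anomaly is closest."""
    return min(ANOMALY_MAP, key=lambda pos: abs(ANOMALY_MAP[pos] - measured_nt))

print(locate(8.0))  # -> 20
```

In practice the anomalies are small and noisy, which is why sensor precision and noise filtering (discussed below) dominate the engineering effort.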

Allison Kealy, a navigation researcher at Swinburne University in Australia, is working on the hardware needed for this approach. Her team uses a material called nitrogen-vacancy diamond. In NV diamonds, one carbon atom in the lattice is replaced with a nitrogen atom, and one neighboring carbon atom is removed entirely. The quantum state of the electrons at the NV defect is very sensitive to magnetic fields. Carefully stimulating the electrons and watching the light they emit offers a way to precisely measure the strength of the field at the diamond’s location, making it possible to infer where it’s situated on the globe. 

Kealy says these quantum magnetometers have a few big advantages over traditional ones, including the fact that they measure the direction of the Earth’s magnetic field in addition to its strength. That additional information could make it easier to determine location. 

The technology is far from commercial deployment, but Kealy and several colleagues successfully tested their magnetometer in a set of flights in Australia late last year, and they plan to run more trials this year and next. “This is where it gets exciting, as we transition from theoretical models and controlled experiments to on-the-ground, operational systems,” she says. “This is a major step forward.” 

Delicate systems

Other teams, like Q-CTRL, an Australian quantum technology company, are focusing on using software to build robust systems from noisy quantum sensors. Quantum navigation involves taking those delicate sensors, honed in the placid conditions of a laboratory, and putting them in vehicles that make sharp turns, bounce with turbulence, and bob with waves, all of which interfere with the sensors’ functioning. Even the vehicles themselves present problems for magnetometers, especially “the fact that the airplane is made of metal, with all this wiring,” says Michael Biercuk, the CEO of Q-CTRL. “Usually there’s 100 to 1,000 times more noise than signal.” 

After Q-CTRL engineers ran trials of their magnetic navigation system in a specially outfitted Cessna last year, they used machine learning to go through the data and try to sift out the signal from all the noise. Eventually they found they could track the plane’s location up to 94 times as accurately as a strategic-grade conventional inertial navigation system could, according to Biercuk. They announced their findings in a non-peer-reviewed paper last spring. 

In August Q-CTRL received two contracts from DARPA to develop its “software-ruggedized” mag-nav product, named Ironstone Opal, for defense applications. The company is also testing the technology with commercial partners, including the defense contractors Northrop Grumman and Lockheed Martin, as well as the aerospace manufacturer Airbus. 

An illustration showing the placement of Q-CTRL’s Ironstone Opal in a drone.
COURTESY Q-CTRL

“Northrop Grumman is working with Q-CTRL to develop a magnetic navigation system that can withstand the physical demands of the real world,” says Michael S. Larsen, a quantum systems architect at the company. “Technology like magnetic navigation and other quantum sensors will unlock capabilities to provide guidance even in GPS-denied or -degraded environments.”

Now Q-CTRL is working on putting Ironstone Opal into a smaller, more rugged container appropriate for deployment; currently, “it looks like a science experiment because it is a science experiment,” says Biercuk. He anticipates delivering the first commercial units next year. 

Sensor fusion

Even as quantum navigation emerges as a legitimate alternative to satellite-based navigation, the satellites themselves are improving. Modern GPS III satellites include new civilian signals called L1C and L5, which should be more accurate and harder to jam and spoof than current signals. Both are scheduled to be fully operational later this decade. 

US and allied military users are due to get access to far hardier GPS tools, including M-code, a new form of GPS signal that is rolling out now, and Regional Military Protection, a focused GPS beam that will be restricted to small geographic areas. The latter will start to become available when the GPS IIIF generation of satellites is in orbit, with the first scheduled to go up in 2027. A Lockheed Martin spokesperson says new GPS satellites with M-code are eight times as powerful as previous ones, while the GPS IIIF model will be 60 times as strong.

Other plans involve using navigation satellites in low Earth orbit—the zone inhabited by SpaceX’s internet-providing Starlink constellation—rather than the medium Earth orbit used by GPS. Since objects in LEO are closer to Earth, their signals are stronger, which makes them harder to jam and spoof. LEO satellites also transit the sky more quickly, which makes them harder still to spoof and helps GPS receivers get a lock on their position faster. “This really helps for signal convergence,” says Lotfi Massarweh, a satellite navigation researcher at Delft University of Technology, in the Netherlands. “They can get a good position in just a few minutes. So that is a huge leap.”
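The strength advantage follows from simple inverse-square geometry. The altitudes below are approximate, and real received power also depends on transmit power and antenna gain, so treat this as a first-order sketch.

```python
# Rough inverse-square comparison of satellite signal strength at the
# receiver. Altitudes are approximate; received power also depends on
# transmit power and antennas. This only shows why LEO signals are
# inherently stronger per watt transmitted.
GPS_ALT_KM = 20_200   # medium Earth orbit, approximate
LEO_ALT_KM = 550      # typical Starlink altitude, approximate

ratio = (GPS_ALT_KM / LEO_ALT_KM) ** 2
print(f"~{ratio:.0f}x stronger signal from LEO, all else equal")
```

A signal that arrives orders of magnitude stronger is correspondingly harder to drown out with a jammer of a given power.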

Ultimately, says Massarweh, navigation will depend not on satellites, quantum sensors, or any other single technology alone, but on the combination of all of them. “You need to think always in terms of sensor fusion,” he says. 

The navigation resources that a vehicle draws on will change according to its environment—whether it’s an airliner, a submarine, or an autonomous car in an urban canyon. But quantum navigation will be one important resource. He says, “If quantum technology really delivers what we see in the literature—if it’s stable over one week rather than tens of minutes—at that point it is a complete game changer.”
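A minimal sketch of what sensor fusion means in practice is inverse-variance weighting, the core idea behind the Kalman-style filters used to blend navigation sources. The values here are purely illustrative.

```python
# Minimal illustration of sensor fusion: combine two independent
# position estimates by weighting each with the inverse of its
# variance, so the more trustworthy sensor dominates the result.
def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> float:
    """Inverse-variance weighted combination of two estimates."""
    w_a = 1 / var_a
    w_b = 1 / var_b
    return (w_a * est_a + w_b * est_b) / (w_a + w_b)

# A noisy GPS fix (variance 25 m^2) and a tighter inertial estimate
# (variance 4 m^2): the fused result leans toward the better sensor.
print(fuse(100.0, 25.0, 104.0, 4.0))  # ~103.4
```

The same weighting logic extends naturally to more sensors, and the weights shift as each source degrades or recovers, which is what lets a vehicle ride out GPS denial on its other inputs.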

Why it’s time to reset our expectations for AI

Can I ask you a question: How do you feel about AI right now? Are you still excited? When you hear that OpenAI or Google just dropped a new model, do you still get that buzz? Or has the shine come off it, maybe just a teeny bit? Come on, you can be honest with me.

Truly, I feel kind of stupid even asking the question, like a spoiled brat who has too many toys at Christmas. AI is mind-blowing. It’s one of the most important technologies to have emerged in decades (despite all its many, many drawbacks and flaws and, well, issues).

At the same time I can’t help feeling a little bit: Is that it?

If you feel the same way, there’s good reason for it: The hype we have been sold for the past few years has been overwhelming. We were told that AI would solve climate change. That it would reach human-level intelligence. That it would mean we no longer had to work!

Instead we got AI slop, chatbot psychosis, and tools that urgently prompt you to write better email newsletters. Maybe we got what we deserved. Or maybe we need to reevaluate what AI is for.

That’s the reality at the heart of a new series of stories, published today, called Hype Correction. We accept that AI is still the hottest ticket in town, but it’s time to reset our expectations.

As my colleague Will Douglas Heaven puts it in the package’s intro essay, “You can’t help but wonder: When the wow factor is gone, what’s left? How will we view this technology a year or five from now? Will we think it was worth the colossal costs, both financial and environmental?” 

Elsewhere in the package, James O’Donnell looks at Sam Altman, the ultimate AI hype man, through the medium of his own words. And Alex Heath explains the AI bubble, laying out for us what it all means and what we should look out for.

Michelle Kim analyzes one of the biggest claims in the AI hype cycle: that AI would completely eliminate the need for certain classes of jobs. If ChatGPT can pass the bar, surely that means it will replace lawyers? Well, not yet, and maybe not ever. 

Similarly, Edd Gent tackles the big question around AI coding. Is it as good as it sounds? Turns out the jury is still out. And elsewhere David Rotman looks at the real-world work that needs to be done before AI materials discovery has its breakthrough ChatGPT moment.

Meanwhile, Garrison Lovely spends time with some of the biggest names in the AI safety world and asks: Are the doomers still okay? I mean, now that people are feeling a bit less scared about their impending demise at the hands of superintelligent AI? And Margaret Mitchell reminds us that hype around generative AI can blind us to the AI breakthroughs we should really celebrate.

Let’s remember: AI was here before ChatGPT and it will be here after. This hype cycle has been wild, and we don’t know what its lasting impact will be. But AI isn’t going anywhere. We shouldn’t be so surprised that those dreams we were sold haven’t come true—yet.

The more likely story is that the real winners, the killer apps, are still to come. And a lot of money is being bet on that prospect. So yes: The hype could never sustain itself over the long term. Where we’re at now is maybe the start of a post-hype phase. In an ideal world, this hype correction will reset expectations. 

Let’s all catch our breath, shall we?

This story first appeared in The Algorithm, our weekly free newsletter all about AI. Sign up to read past editions here.

The Download: why 2025 has been the year of AI hype correction, and fighting GPS jamming

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

The great AI hype correction of 2025

Some disillusionment was inevitable. When OpenAI released a free web app called ChatGPT in late 2022, it changed the course of an entire industry—and several world economies. Millions of people started talking to their computers, and their computers started talking back. We were enchanted, and we expected more.

Well, 2025 has been a year of reckoning. For a start, the heads of the top AI companies made promises they couldn’t keep. At the same time, updates to the core technology are no longer the step changes they once were.

To be clear, the last few years have been filled with genuine “Wow” moments. But this remarkable technology is only a few years old, and in many ways it is still experimental. Its successes come with big caveats. Read the full story to learn more about why we may need to readjust our expectations.

—Will Douglas Heaven

This story is part of our new Hype Correction package, a collection of stories designed to help you reset your expectations about what AI makes possible—and what it doesn’t. Check out the rest of the package here, and you can read more about why it’s time to reset our expectations for AI in the latest edition of the Algorithm, our weekly AI newsletter. Sign up here to make sure you receive future editions straight to your inbox.

Quantum navigation could solve the military’s GPS jamming problem

Since the 2022 invasion of Ukraine, thousands of flights have been affected by a far-reaching Russian campaign of GPS interference, which uses radio transmissions to jam and disrupt satellite navigation signals.

The growing inconvenience to air traffic and risk of a real disaster have highlighted the vulnerability of GPS and focused attention on more secure ways for planes to navigate the gauntlet of jamming and spoofing, the term for tricking a GPS receiver into thinking it’s somewhere else.

One approach that’s emerging from labs is quantum navigation: exploiting the quantum nature of light and atoms to build ultra-sensitive sensors that can allow vehicles to navigate independently, without depending on satellites. Read the full story.

—Amos Zeeberg

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 The Trump administration has launched its US Tech Force program
In a bid to lure engineers away from Big Tech roles and straight into modernizing the government. (The Verge)
+ So, essentially replacing the IT workers that DOGE got rid of, then. (The Register)

2 Lawmakers are investigating how AI data centers affect electricity costs
They want to get to the bottom of whether it’s being passed onto consumers. (NYT $)
+ Calculating AI’s water usage is far from straightforward, too. (Wired $)
+ AI is changing the grid. Could it help more than it harms? (MIT Technology Review)

3 Ford isn’t making a large all-electric truck after all
After the US government’s support for EVs plummeted. (Wired $)
+ Instead, the F-150 Lightning pickup will be reborn as a plug-in hybrid. (The Information $)
+ Why Americans may be finally ready to embrace smaller cars. (Fast Company $)
+ The US could really use an affordable electric truck. (MIT Technology Review)

4 PayPal wants to become a bank in the US
The Trump administration is very friendly to non-traditional financial companies, after all. (FT $)
+ It’s been a good year for the crypto industry when it comes to banking. (Economist $)

5 A tech trade deal between the US and UK has been put on ice
America isn’t happy with the lack of progress Britain has made, apparently. (NYT $)
+ It’s a major setback in relations between the pair. (The Guardian)

6 Why does no one want to make the cure for dengue?
A new antiviral pill appears to prevent infection—but its development has been abandoned. (Vox)

7 The majority of the world’s glaciers are forecast to disappear by 2100
At a rate of around 3,000 per year. (New Scientist $)
+ Inside a new quest to save the “doomsday glacier”. (MIT Technology Review)

8 Hollywood is split over AI
While some filmmakers love it, actors are horrified by its inexorable rise. (Bloomberg $)

9 Corporate America is obsessed with hiring storytellers
It’s essentially a rehashed media relations manager role overhauled for the AI age. (WSJ $)

10 The concept of hacking existed before the internet
Just ask this bunch of teenage geeks. (IEEE Spectrum)

Quote of the day

“So the federal government deleted 18F, which was doing great work modernizing the government, and then replaced it with a clone? What is the point of all this?”

—Eugene Vinitsky, an assistant professor at New York University, takes aim at the US government’s decision to launch a new team to overhaul its approach to technology in a post on Bluesky.

One more thing

How DeepSeek became a fortune teller for China’s youth

As DeepSeek has emerged as a homegrown challenger to OpenAI, young people across the country have started using AI to revive fortune-telling practices that have deep roots in Chinese culture.

Across Chinese social media, users are sharing AI-generated readings, experimenting with fortune-telling prompt engineering, and revisiting ancient spiritual texts—all with the help of DeepSeek.

The surge in AI fortune-telling comes during a time of pervasive anxiety and pessimism in Chinese society. And as spiritual practices remain largely underground because of government restrictions, computers and phone screens are helping younger people to gain a sense of control over their lives. Read the full story.

—Caiwen Chen

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ Chess has been online as far back as the 1800s (no, really!) ♟
+ Jane Austen was born 250 years ago today. How well do you know her writing? ($)
+ Rob Reiner, your work will live on forever.
+ I enjoyed this comprehensive guide to absolutely everything you could ever want to know about New England’s extensive seafood offerings.

Creating psychological safety in the AI era

Rolling out enterprise-grade AI means climbing two steep cliffs at once. First, understanding and implementing the tech itself. And second, creating the cultural conditions where employees can maximize its value. While the technical hurdles are significant, the human element can be even more consequential; fear and ambiguity can stall the momentum of even the most promising initiatives.

Psychological safety—feeling free to express opinions and take calculated risks without worrying about career repercussions—is essential for successful AI adoption. In psychologically safe workspaces, employees are empowered to challenge assumptions and raise concerns about new tools without fear of reprisal. This is nothing short of a necessity when introducing a nascent and profoundly powerful technology that still lacks established best practices.

“Psychological safety is mandatory in this new era of AI,” says Rafee Tarafdar, executive vice president and chief technology officer at Infosys. “The tech itself is evolving so fast—companies have to experiment, and some things will fail. There needs to be a safety net.”

To gauge how psychological safety influences success with enterprise-level AI, MIT Technology Review Insights conducted a survey of 500 business leaders. The findings reveal high self-reported levels of psychological safety, but also suggest that fear still has a foothold. Anecdotally, industry experts highlight a reason for the disconnect between rhetoric and reality: while organizations may promote a “safe to experiment” message publicly, deeper cultural undercurrents can counteract that intent.

Building psychological safety requires a coordinated, systems-level approach, and human resources (HR) alone cannot deliver such transformation. Instead, enterprises must deeply embed psychological safety into their collaboration processes.

Key findings for this report include:

  • Companies with experiment-friendly cultures have greater success with AI projects. The majority of executives surveyed (83%) believe a company culture that prioritizes psychological safety measurably improves the success of AI initiatives. Four in five leaders agree that organizations fostering such safety are more successful at adopting AI, and 84% have observed connections between psychological safety and tangible AI outcomes.
  • Psychological barriers are proving to be greater obstacles to enterprise AI adoption than technological challenges. Encouragingly, nearly three-quarters (73%) of respondents indicated they feel safe to provide honest feedback and express opinions freely in their workplace. Still, a significant share (22%) admit they’ve hesitated to lead an AI project because they might be blamed if it misfires.
  • Achieving psychological safety is a moving target for many organizations. Fewer than half of leaders (39%) rate their organization’s current level of psychological safety as “very high.” Another 48% report a “moderate” degree of it. This may mean that some enterprises are pursuing AI adoption on cultural foundations that are not yet fully stable.

Download the report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff. It was researched, designed, and written by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.