Microsoft Explains How Duplicate Content Affects AI Search Visibility via @sejournal, @MattGSouthern

Microsoft has shared new guidance on duplicate content that’s aimed at AI-powered search.

The post on the Bing Webmaster Blog discusses which URL serves as the “source page” for AI answers when several similar URLs exist.

Microsoft describes how “near-duplicate” pages can end up grouped together for AI systems, and how that grouping can influence which URL gets pulled into AI summaries.

How AI Systems Handle Duplicates

Fabrice Canel and Krishna Madhavan, Principal Product Managers at Microsoft AI, wrote:

“LLMs group near-duplicate URLs into a single cluster and then choose one page to represent the set. If the differences between pages are minimal, the model may select a version that is outdated or not the one you intended to highlight.”

If multiple pages are interchangeable, the representative page might be an older campaign URL, a parameter version, or a regional page you didn’t mean to promote.

Microsoft also notes that many LLM experiences are grounded in search indexes. If the index is muddied by duplicates, that same ambiguity can show up downstream in AI answers.

How Duplicates Can Reduce AI Visibility

Microsoft lays out several ways duplication can get in the way.

One is intent clarity. If multiple pages cover the same topic with nearly identical copy, titles, and metadata, it’s harder to tell which URL best fits a query. Even when the “right” page is indexed, the signals are split across lookalikes.

Another is representation. If the pages are clustered, you’re effectively competing with yourself for which version stands in for the group.

Microsoft also draws a line between real page differentiation and cosmetic variants. A set of pages can make sense when each one satisfies a distinct need. But when pages differ only by minor edits, they may not carry enough unique signals for AI systems to treat them as separate candidates.

Finally, Microsoft links duplication to update lag. If crawlers spend time revisiting redundant URLs, changes to the page you actually care about can take longer to show up in systems that rely on fresh index signals.

Categories Of Duplicate Content Microsoft Highlights

The guidance calls out a few repeat offenders.

Syndication is one. When the same article appears across sites, identical copies can make it harder to identify the original. Microsoft recommends asking partners to use canonical tags that point to the original URL and to use excerpts instead of full reprints when possible.

Campaign pages are another. If you’re spinning up multiple versions targeting the same intent and differing only slightly, Microsoft recommends choosing a primary page that collects links and engagement, then using canonical tags for the variants and consolidating older pages that no longer serve a distinct purpose.

Localization comes up in the same way. Nearly identical regional pages can look like duplicates unless they include meaningful differences. Microsoft suggests localizing with changes that actually matter, such as terminology, examples, regulations, or product details.

Then there are technical duplicates. The guidance lists common causes such as URL parameters, HTTP and HTTPS versions, uppercase and lowercase URLs, trailing slashes, printer-friendly versions, and publicly accessible staging pages.
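Most of these technical variants can be collapsed with a URL normalization step. The sketch below is a rough illustration only; the specific rules (forcing HTTPS, lowercasing the host, stripping trailing slashes and tracking parameters like `utm_source`) are assumptions, and the right policy depends on your site:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Assumed policy: force HTTPS, lowercase the host, drop trailing slashes,
# and strip tracking parameters. Adjust to match your own site's rules.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def normalize(url: str) -> str:
    parts = urlsplit(url)
    scheme = "https"                          # HTTP and HTTPS variants collapse
    host = parts.netloc.lower()               # hostnames are case-insensitive
    path = parts.path.rstrip("/") or "/"      # trailing-slash policy (assumed)
    query = urlencode(
        [(k, v) for k, v in parse_qsl(parts.query)
         if k.lower() not in TRACKING_PARAMS]
    )
    return urlunsplit((scheme, host, path, query, ""))

variants = [
    "http://Example.com/shoes/",
    "https://example.com/shoes?utm_source=mail",
    "https://example.com/shoes",
]
canonical = {normalize(u) for u in variants}
print(canonical)  # all three variants collapse to a single URL
```

Applying the same normalizer server-side (via redirects) and in your internal links keeps the lookalike URLs from ever splitting signals.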

The Role Of IndexNow

Microsoft points to IndexNow as a way to shorten the cleanup cycle after consolidating URLs.

When you merge pages, change canonicals, or remove duplicates, IndexNow can help participating search engines discover those changes sooner. Microsoft links that faster discovery to fewer outdated URLs lingering in results, and fewer cases where an older duplicate becomes the page that’s used in AI answers.
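IndexNow is an open protocol, and a submission is just an HTTP POST with a JSON body. The sketch below only builds such a payload without sending it; the host, key, and URLs are placeholders, and a real key must be hosted as a text file at the keyLocation address so search engines can verify ownership:

```python
import json

# Placeholder host, key, and URLs; a real key is a file you host
# at the keyLocation address so search engines can verify ownership.
payload = {
    "host": "www.example.com",
    "key": "your-indexnow-key",
    "keyLocation": "https://www.example.com/your-indexnow-key.txt",
    "urlList": [
        "https://www.example.com/consolidated-page",
        "https://www.example.com/retired-campaign-page",
    ],
}

# A real submission POSTs this body to https://api.indexnow.org/indexnow
# with a Content-Type: application/json header; no request is sent here.
body = json.dumps(payload)
print(body)
```

Submitting both the surviving page and the retired duplicates lets participating engines recrawl the whole cluster at once.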

Microsoft’s Core Principle

Canel and Madhavan wrote:

“When you reduce overlapping pages and allow one authoritative version to carry your signals, search engines can more confidently understand your intent and choose the right URL to represent your content.”

The message is consolidation first, technical signals second. Canonicals, redirects, hreflang, and IndexNow help, but they work best when you’re not maintaining a long tail of near-identical pages.

Why This Matters

Duplicate content isn’t a penalty by itself. The downside is weaker visibility when signals are diluted and intent is unclear.

Syndicated articles can keep outranking the original if canonicals are missing or inconsistent. Campaign variants can cannibalize each other if the “differences” are mostly cosmetic. Regional pages can blend together if they don’t clearly serve different needs.

Routine audits can help you catch overlap early. Microsoft points to Bing Webmaster Tools as a way to spot patterns such as identical titles and other duplication indicators.

Looking Ahead

As AI answers become a more common entry point, the “which URL represents this topic” problem becomes harder to ignore.

Cleaning up near-duplicates can influence which version of your content gets surfaced when an AI system needs a single page to ground an answer.

What is a redirect? Types, how to set them up, and impact on SEO 

Ever clicked a link and landed on a “Page Not Found” error? Redirects prevent that. They send visitors and search engines to the right page automatically. Redirects are crucial for both SEO and user experience. For SEO, they preserve link equity and keep your rankings intact. They also enhance the user experience; no one likes dead ends. 

Key takeaways

  • A redirect automatically sends users and search engines from one URL to another, preventing errors like ‘Page Not Found.’
  • Redirects are crucial for SEO and user experience, preserving link equity and maintaining rankings.
  • Different types of redirects exist: 301 for permanent moves and 302 for temporary ones.
  • Avoid client-side redirects, such as meta refresh or JavaScript, as they can harm SEO.
  • Use Yoast SEO Premium to easily set up and manage redirects on your site.

What is a redirect? 

A redirect is a method that automatically sends users and search engines from one URL to another. For example, if you delete a page, a redirect can send visitors to a new or related page instead of a 404 error. 

How redirects work

  1. A user or search engine requests a URL (e.g., yoursite.com/page-old).
  2. The server responds with a redirect instruction.
  3. The browser or search engine follows the redirect to the new URL (e.g., yoursite.com/page-new).

Redirects can point to any URL, even on a different domain. 
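The flow above can be demonstrated end to end with Python’s standard library: a throwaway local server answers /page-old with a 301 and a Location header, and urllib follows it automatically, just as a browser would (the paths are illustrative):

```python
import http.server
import threading
import urllib.request

# Throwaway server: /page-old answers with a 301 and a Location header,
# /page-new serves the actual content.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/page-old":
            self.send_response(301)
            self.send_header("Location", "/page-new")
            self.end_headers()
        else:
            content = b"new page"
            self.send_response(200)
            self.send_header("Content-Length", str(len(content)))
            self.end_headers()
            self.wfile.write(content)

    def log_message(self, *args):  # silence request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# urllib follows the 301 transparently, just like a browser would.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/page-old")
final_url, body = resp.url, resp.read()
print(final_url)  # ends in /page-new
print(body)       # content served by the new URL
server.shutdown()
```

Note that the client never sees an error: the redirect instruction and the hop to the new URL happen inside the request.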

Why redirects matter 

Redirects keep your website running smoothly. Without them, visitors hit dead ends, links break, and search engines get lost. They’re not just technical fixes, because they protect your traffic, preserve rankings, and make sure users land where they’re supposed to. Whether you’re moving a page, fixing a typo in a URL, or removing old content, redirects make sure that nothing gets left behind. 

When to use a redirect 

Use redirects in these scenarios: 

  1. Deleted pages: Redirect to a similar page to preserve traffic. 
  2. Domain changes: Redirect the old domain to the new one. 
  3. HTTP→HTTPS: Redirect insecure URLs to secure ones. 
  4. URL restructuring: Redirect old URLs to new ones (e.g., /blog/post → /articles/post). 
  5. Temporary changes: Use a 302 for A/B tests or maintenance pages. 

Types of redirects 

There are various types of redirects, each serving a distinct purpose. Some are permanent, some are temporary, and some you should avoid altogether. Here’s what you need to know to pick the right one. 

Not all redirects work the same way. A 301 redirect tells search engines a page has moved permanently, while a 302 redirect signals a temporary change. Client-side redirects, like meta refresh or JavaScript, exist because they’re sometimes the only option on restrictive hosting platforms or static sites, but they often create more problems than they solve. Below, we break down each type, explain when to use it, and discuss its implications for your SEO. 

Redirect types at a glance 

| Redirect type | Use case | When to use | Browser impact | SEO impact | SEO risk |
|---|---|---|---|---|---|
| 301 | Permanent move | Deleted pages, domain changes, HTTP→HTTPS | Cached forever | Passes (almost) all link equity | None if used correctly |
| 302 | Temporary move | A/B testing, maintenance pages | Not cached | May not pass link equity | Can dilute SEO if used long-term |
| 307 | Temporary move (strict) | API calls, temporary content shifts | Not cached | Search engines may ignore | High if misused |
| 308 | Permanent move (strict) | Rare; use 301 instead | Cached forever | Passes link equity | None |
| Meta refresh | Client-side redirect | Avoid where possible | Slow, not cached | Unreliable | High (hurts UX/SEO) |
| JavaScript | Client-side redirect | Avoid where possible | Slow, not cached | Unreliable | High (hurts UX/SEO) |

301 redirects: Permanent moves 

A 301 redirect tells browsers and search engines that a page has moved permanently. Use it when: 

  • You delete a page and want to send visitors to a similar one.
  • You change your domain name.
  • You switch from HTTP to HTTPS.

SEO impact: 301 redirects pass virtually all link equity to the new URL. But never redirect to irrelevant pages, as this can confuse users and hurt SEO. For example, redirecting a deleted blog post about “best running shoes” to your homepage, instead of to a similar post about running gear, wastes link equity and frustrates visitors. 

Example HTTP header

HTTP/1.1 301 Moved Permanently 
Location: https://example.com/new-page

302 redirects: Temporary moves 

A 302 redirect tells browsers and search engines that a move is temporary. Use it for: 

  • A/B testing different versions of a page.
  • Temporary promotions or sales pages.
  • Maintenance pages.

SEO impact: 302 redirects typically don’t pass ranking power like 301s. Google treats them as temporary, so they may not preserve SEO value. For permanent moves, always use a 301 to ensure link equity transfers smoothly. 

Examples of when to use a 301 and 302 redirect:  

Example 1: Temporary out-of-stock product (302): An online store redirects example.com/red-sneakers to example.com/blue-sneakers while red sneakers are restocked. A 302 redirect keeps the original URL alive for future use. 

Example 2: A permanent domain change (301): A company moves from old-site.com to new-site.com. A 301 redirect makes sure visitors and search engines land on the new domain while preserving SEO rankings. 

307 and 308 redirects: Strict rules 

These redirects follow HTTP rules more strictly than 301 or 302: 

  1. Same method: If a browser sends a POST request, the redirect must also use POST. 
  2. Caching: 
    • 307: Never cached (temporary). 
    • 308: Always cached (permanent). 

When to use them

  • 307: For temporary redirects where you must keep the same HTTP method (e.g., forms or API calls). 
  • 308: Almost never; use a 301 instead. 

For most sites: Stick with 301 (permanent) or 302 (temporary). These are for specific technical cases only. 

What to know about client-side redirects:

Client-side redirects, such as meta refresh or JavaScript, execute within the browser instead of on the server. They’re rarely the right choice, but here’s why you might encounter them: 

  • Meta refresh: An HTML tag that redirects after a delay (e.g., “You’ll be redirected in 5 seconds…”).
  • JavaScript redirects: Code that changes the URL after the page loads.

Why should you avoid them? 

  • Slow: The browser must load the page first, then redirect.
  • Unreliable: Search engines may ignore them, hurting SEO.
  • Bad UX: Users see a flash of the original page before redirecting.
  • Security risks: JavaScript redirects can be exploited for phishing. 

When they’re used (despite the risks): 

  • Shared hosting with no server access. 
  • Legacy systems or static HTML sites.
  • Ad tracking or A/B testing tools.

Stick with server-side redirects (301/302) whenever possible. If you must use a client-side redirect, test it thoroughly and monitor for SEO issues. 

How redirects impact SEO 

Redirects do more than just send users to a new URL. They shape how search engines crawl, index, and rank your site. A well-planned redirect preserves traffic and rankings. A sloppy one can break both. Here’s what you need to know about their impact. 

Ranking power 

301 redirects pass most of the link equity from the old URL to the new one. This helps maintain your rankings. 302 redirects may not pass ranking power, especially if used long-term. 

Crawl budget 

Too many redirects can slow down how quickly search engines crawl your site. Avoid redirect chains (A→B→C) to save crawl budget. 

User experience 

Redirects prevent 404 errors and keep users engaged. A smooth redirect experience can reduce bounce rates. 

Common redirect mistakes 

Redirects seem simple, but small errors can cause big problems. Here are the most common mistakes and how to avoid them. 

Redirect chains 

A redirect chain happens when one URL redirects to another, which redirects to another, and so on. For example:  

  • old-page → new-page → updated-page → final-page

Why it’s bad

  • Slows down the user experience. 
  • Wastes crawl budget, as search engines may stop following the chain before reaching the final URL. 
  • Dilutes ranking power with each hop. 

How to fix it

  • Map old URLs directly to their final destination. 
  • Use tools like Screaming Frog to find and fix chains. 
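Mapping old URLs to their final destination is easier once you can see every hop. Below is a minimal Python tracer, demonstrated against a toy local server with a three-hop chain (the page names are illustrative); it follows redirects one at a time and the hop limit also flags possible loops:

```python
import http.client
import http.server
import threading
from urllib.parse import urljoin, urlsplit

# Toy chain for demonstration: each hop issues a 301 to the next URL.
CHAIN = {
    "/old-page": "/new-page",
    "/new-page": "/updated-page",
    "/updated-page": "/final-page",
}

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in CHAIN:
            self.send_response(301)
            self.send_header("Location", CHAIN[self.path])
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()

    def log_message(self, *args):
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

def trace_redirects(url, max_hops=10):
    """Follow redirects one hop at a time; return every URL visited."""
    hops = [url]
    for _ in range(max_hops):
        parts = urlsplit(hops[-1])
        conn = http.client.HTTPConnection(parts.netloc)
        conn.request("GET", parts.path or "/")
        resp = conn.getresponse()
        location = resp.getheader("Location")
        conn.close()
        if resp.status not in (301, 302, 307, 308) or not location:
            return hops, resp.status
        # Resolve relative Location headers against the current URL.
        hops.append(urljoin(hops[-1], location))
    raise RuntimeError("hop limit reached: possible redirect loop")

hops, status = trace_redirects(f"http://127.0.0.1:{port}/old-page")
print(f"{len(hops) - 1} hops, final status {status}")
print(" -> ".join(urlsplit(h).path for h in hops))
```

Here the tracer reports three hops from old-page to final-page; mapping old-page straight to final-page would cut that to one.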

Redirect loops 

A redirect loop sends users and search engines in circles. For example:  

  • page-A → page-B → page-A → page-B...

Why it’s bad

  • Users see an error page (e.g., “Too many redirects”). 
  • Search engines can’t access the content, so it won’t rank. 

How to fix it

  • Check your redirect rules for conflicts. 
  • Test redirects with a tool like Redirect Path (Chrome extension) or curl -v in the terminal. 

Using 302s for permanent moves 

A 302 redirect is meant for temporary changes, but many sites use it for permanent moves. For example: 

  • Redirecting old-product to new-product with a 302 and leaving it for years. 

Why it’s bad

  • Search engines may not pass link equity to the new URL. 
  • The old URL might stay in search results longer than intended. 

How to fix it

  • Use a 301 for permanent moves. 
  • If you accidentally used a 302, switch it to a 301 as soon as possible. 

Redirecting to irrelevant pages 

Redirecting a page to unrelated content confuses users and search engines. For example: 

  • Redirecting a blog post about “best running shoes” to the homepage or a page about “kitchen appliances”. 

Why it’s bad

  • Users land on content they didn’t expect, increasing bounce rates. 
  • Search engines may ignore the redirect or penalize it for being manipulative. 
  • Wastes ranking power that could have been passed to a relevant page. 

How to fix it

  • Always redirect to the most relevant page available. 
  • If no relevant page exists, let the old URL return a 404 or 410 error instead. 

Ignoring internal links after redirects 

After setting up a redirect, many sites forget to update internal links. For example: 

  • Redirecting old-page to new-page but keeping links to old-page in the site’s navigation or blog posts. 

Why it’s bad

  • Internal links to the old URL force users and search engines through the redirect, slowing down the experience. 
  • Wastes crawl budget and dilutes ranking power. 

How to fix it

  • Update all internal links to point directly to the new URL. 
  • Use a tool like Screaming Frog to find and fix outdated links. 

Not testing redirects 

Assuming redirects work without testing can lead to surprises. For example: 

  • Setting up a redirect but not checking if it sends users to the right place. 
  • Missing errors like 404s or redirect loops. 

Why it’s bad

  • Broken redirects frustrate users and hurt SEO. 
  • Search engines may drop pages from the index if they can’t access them. 

How to fix it

  • Test every redirect manually or with a tool. 
  • Check Google Search Console for crawl errors after implementing redirects. 

Redirecting everything to the homepage 

When a page is deleted, some sites redirect all traffic to the homepage. For example: 

  • Redirecting old-blog-post to example.com instead of a relevant blog post. 

Why it’s bad

  • Confuses users who expected specific content. 
  • Search engines may see this as a “soft 404” and ignore the redirect. 
  • Wastes ranking power that could have been passed to a relevant page. 

How to fix it

  • Redirect to the most relevant page available. 
  • If no relevant page exists, return a 404 or 410 error. 

Forgetting to update sitemaps 

After setting up redirects, many sites forget to update their XML sitemaps. For example: 

  • Keeping the old URL in the sitemap while redirecting it to a new URL. 

Why it’s bad

  • Sends mixed signals to search engines. 
  • Wastes crawl budget on outdated URLs. 

How to fix it

  • Remove old URLs from the sitemap. 
  • Add the new URLs to help search engines discover them faster. 

Using redirects for thin or duplicate content 

Some sites use redirects to hide thin or duplicate content. For example, redirecting multiple low-quality pages to a single high-quality page to “clean up” the site. 

Why it’s bad

  • Search engines may see this as manipulative. 
  • Doesn’t address the root problem, which is low-quality content. 

How to fix it

  • Improve or consolidate content instead of redirecting. 
  • Use canonical tags if duplicate content is unavoidable. 

Not monitoring redirects over time 

Redirects aren’t a set-it-and-forget-it task. For example: 

  • Setting up a redirect and never checking if it’s still needed or working. 

Why it’s bad

  • Redirects can break over time (e.g., due to site updates or server changes). 
  • Unnecessary redirects waste crawl budget. 

How to fix it

  • Audit redirects regularly (e.g., every 6 months). 
  • Remove redirects that are no longer needed. 

How to set up a redirect 

Setting up redirects isn’t complicated, but the steps vary depending on your platform. Below, you’ll find straightforward instructions for the most common setups, whether you’re using WordPress, Apache, Nginx, or Cloudflare.  

Pick the method that matches your setup and follow along. If you’re unsure which to use, start with the platform you’re most comfortable with. 

WordPress (using Yoast SEO Premium) 

Yoast SEO Premium makes it easy to set up redirects, especially when you delete or move content. Here’s how to do it: 

Option 1: Manual redirects 

  1. Go to Yoast SEO → Redirects in your WordPress dashboard. 
  2. Enter the old URL (the one you want to redirect from). 
  3. Enter the new URL (the one you want to redirect to). 
  4. Select the redirect type: 
    • 301 (Permanent): For deleted or permanently moved pages. 
    • 302 (Found): For short-term changes. 
  5. Click Add Redirect.
Manually redirecting a URL in Yoast’s redirect manager

Option 2: Automatic redirects when deleting content 

Yoast SEO can create redirects automatically when you delete a post or page. Here’s how: 

  1. Go to Posts or Pages in your WordPress dashboard. 
  2. Find the post or page you want to delete and click Trash. 
  3. Yoast SEO will show a pop-up asking what you’d like to do with the deleted content. You’ll see two options: 
    • Redirect to another URL: Enter a new URL to send visitors to. 
    • Return a 410 Content Deleted header: Inform search engines that the page is permanently deleted and should be removed from their index. 
  4. Select your preferred option and confirm. 

This feature saves time and ensures visitors land on the right page. No manual setup required. 

Need help with redirects? Try Yoast SEO Premium

No code, no hassle. Just smarter redirects and many other invaluable tools.

Apache (.htaccess file) 

Apache uses the .htaccess file to manage redirects. If your site runs on Apache, this is the simplest way to set them up. Make sure the file is located in the root directory of your site. 

Add these lines to your .htaccess file: 

# 301 Redirect 
Redirect 301 /old-page.html /new-page.html
# 302 Redirect 
Redirect 302 /temporary-page.html /new-page.html

Nginx (server config) 

Nginx handles redirects in the server configuration file. If your site runs on Nginx, add these rules to your server block and then reload the service to apply the changes. 

Add this to your server configuration: 

# 301 Redirect 
server { 
    listen 80; 
    server_name example.com; 
    return 301 https://example.com$request_uri; 
}
# 302 Redirect 
server { 
    listen 80; 
    server_name example.com; 
    location = /old-page { 
        return 302 /new-page; 
    } 
}

Cloudflare (page rules) 

Cloudflare allows you to set up redirects without modifying server files. Create a page rule to forward traffic from one URL to another, without requiring any coding. Simply enter the old and new URLs, select the redirect type, and click Save. 

  1. Go to Rules → Page Rules. 
  2. Enter the old URL (e.g., example.com/old-page). 
  3. Select Forwarding URL and choose 301 or 302. 
  4. Enter the new URL (e.g., https://example.com/new-page). 

Troubleshooting redirects 

Redirects don’t always work as expected. A typo, a cached page, or a conflicting rule can break them, or worse, create loops that frustrate users and search engines. Below are the most common issues and how to fix them.  

If something’s not working, start with the basics: check for errors, test thoroughly, and clear your cache. The solutions are usually simpler than they seem. 

Why isn’t my redirect working? 

  • Check for typos: Ensure the URLs are correct. 
  • Clear your cache: Browsers cache 301 redirects aggressively. 
  • Test with curl: Run curl -v http://yoursite.com/old-url to see the HTTP headers. 
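The same header check can be scripted in Python: a custom opener that refuses to follow redirects exposes the raw status code and Location header, much like `curl -v`. It's demonstrated here against a throwaway local server standing in for yoursite.com (the /old-url path is illustrative):

```python
import http.server
import threading
import urllib.error
import urllib.request

# Throwaway server standing in for yoursite.com: /old-url answers 302.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(302)
        self.send_header("Location", "/new-url")
        self.end_headers()

    def log_message(self, *args):
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None stops urllib from following the redirect; the raw
    # response then surfaces as an HTTPError carrying status and headers.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)
try:
    opener.open(f"http://127.0.0.1:{port}/old-url")
    status, location = None, None  # no redirect happened
except urllib.error.HTTPError as err:
    status, location = err.code, err.headers.get("Location")

print(status, "->", location)
server.shutdown()
```

If the printed status or Location isn’t what you expect, the redirect rule itself is wrong; if this looks right but the browser still misbehaves, suspect the browser cache.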

Can redirects hurt SEO? 

Yes, if you: 

  • Create redirect chains (A→B→C) 
  • Use 302s for permanent moves 
  • Redirect to irrelevant pages 

How do I find broken redirects? 

  • Use Google Search Console → Coverage report. 
  • Use Screaming Frog to crawl your site for 404s and redirects. 

What’s the difference between a 301 and 308 redirect? 

  • 301: Most common for permanent moves. Broad browser support. 
  • 308: Strict permanent redirect. Rarely used. Same SEO impact as 301. 

What is a proxy redirect? 

A proxy redirect keeps the URL the same in the browser but fetches content from a different location. Used for load balancing or A/B testing. Avoid for SEO, as search engines may not follow them. 

Conclusion about redirects

Redirects are a simple but powerful tool. A redirect automatically sends users and search engines from one URL to another, keeping your site running smoothly and preserving SEO value and ranking power. Remember: 

  • Use 301 redirects for permanent moves. 
  • Use 302 redirects for temporary changes. 
  • Avoid client-side redirects, such as meta refresh or JavaScript. 

Need help? Try Yoast SEO Premium’s redirect manager.  

PPC Pulse: More Apple Search Inventory, Exact Match Limits In AI Overviews via @sejournal, @brookeosmundson

In this week’s PPC Pulse: updates include an inventory expansion for Apple Ads, and Google confirms that exact match keywords are not eligible to serve ads in AI Overviews.

Apple announced additional ad placements coming to App Store search results in early 2026.

Google confirmed that exact match keywords cannot serve in AI Overviews, even when identical broad match keywords exist in an account.

Both updates reinforce an ongoing shift. Search inventory is growing across new surfaces, but the level of control advertisers once relied on is changing.

Read on for more details and why they matter for advertisers.

Apple Search Ads Will Add New Search Placements In 2026

Apple officially announced that it will introduce additional ads within App Store Search Results starting in 2026. Today, advertisers can appear only in the top position. Beginning next year, ads will also show further down the results page across more queries, expanding total available inventory.

In its email announcement, Apple shared several supporting data points:

  • Nearly 65% of App Store downloads occur directly after a search.
  • The App Store sees 800 million weekly visitors.
  • More than 85% of visitors download at least one app during their visit.
  • Current Search Results ads see 60% or higher conversion rates at the top of results.
Screenshot taken via email by author, December 2025

Per the announcement, advertisers will not need to adjust campaigns to qualify for the new placements. Apple noted that ads will be automatically eligible and cannot be targeted or bid separately by position. The format and billing model will remain the same.

Expanding On An Already Big Year For Apple

Apple has consistently rolled out upgrades and expansions throughout 2025, including:

  • Custom Product Page expansion (March 2025): Apple expanded testing capabilities by allowing more CPP variants tied to specific keywords, improving message alignment.
  • Reporting enhancements (June 2025): Apple introduced clearer diagnostics around impression share, keyword performance, and CPP impact. These updates made it easier to identify friction points in search campaigns.
  • Creative refinements for Today Tab and Search Tab (August 2025): Apple improved visual consistency and added support for higher-funnel experimentation, hinting at broader expansion across App Store surfaces.

These updates all point toward a more robust Apple Ads marketing platform, making the 2026 inventory expansion feel like a natural progression.

Why This Matters For Advertisers

More placements signal higher reach, but also more variability. Top-position performance is unlikely to change, but additional placements may bring new traffic patterns as more users scroll past the first result.

Advertisers should expect incremental installs paired with slightly wider performance swings.

This also means that metadata, product page quality, and CPP strategy will influence performance more than before, since every placement will rely on the same creative foundation.

Read More: An In-Depth Guide To Apple Search Ads

Google Confirms Exact Match Keywords Not Eligible For AI Overviews

A few questions came in to Google Ads Liaison, Ginny Marvin, this week on X (Twitter) regarding the eligibility of exact match keywords for ads in AI Overviews.

Marvin confirmed via a thread on X (Twitter) that exact match keywords are not eligible to serve ads inside Google’s AI Overviews. This clarification explains a pattern many advertisers have seen over the past year: even if an account contains the same query in both exact and broad match, only broad match can enter AI Overview auctions.

Screenshot taken by author, December 2025

The update circulated quickly after Arpan Banerjee shared it on LinkedIn, giving the topic more visibility among PPC practitioners.

Screenshot taken by author, December 2025

This means advertisers may see broad match triggering queries that they assumed would be handled by exact match. It also means AI Overview impressions are routed through a different layer of Google’s system with its own eligibility rules. Since Google does not provide separate AI Overview reporting, changes in performance may not be clearly attributed to this shift.

Why This Matters For Advertisers

This update makes it clear that match types do not operate the same way inside AI-driven surfaces.

The long-standing assumption that exact match provides clean, isolated coverage does not apply within AI Overviews. Broad match becomes the only entry point, which could influence spend allocation, campaign structure, query mapping, and performance diagnostics.

Advertisers should expect shifts in query distribution on terms where they rely heavily on exact match control.

Read More: AI-Enhanced Keyword Selection In PPC

This Week’s Theme: Search Control Looks Different Than It Used To

Both updates highlight a similar pattern. Platforms are expanding search inventory, but advertisers have less control over how placements are allocated.

Apple is opening new ad positions without letting advertisers bid separately for them. Google is routing some search coverage through AI Overviews, where exact match does not participate. In both cases, the legacy structure of “keyword plus bid plus placement” is giving way to a more interpretive system.

This does not mean advertisers lose influence. It means influence shifts to metadata quality, creative alignment, first-party data, and smart segmentation. Both updates remind advertisers to stay flexible because new surfaces will continue to emerge.


Featured Image: Pixel-Shot/Shutterstock

SEO Pulse: AI Mode Hits 75M Users, Gemini 3 Flash Launches via @sejournal, @MattGSouthern

In this week’s Pulse: updates include AI Mode’s growth and missing features, what Google’s latest model brings to search, and what drives citations across different AI experiences.

Google’s Nick Fox confirmed that AI Mode has reached 75 million daily active users, but the personal context features promised at I/O are still in internal testing.

Google launched Gemini 3 Flash with improved speed and performance. Ahrefs research showed AI Mode and AI Overviews cite different URLs.

Here’s what matters for you that happened this week.

Google’s AI Mode Hits 75M Daily Users, But Personal Context Still Delayed

Google’s Nick Fox confirmed AI Mode has grown to 75 million daily active users worldwide, but acknowledged personal context features announced at I/O seven months ago remain in internal testing.

Key Facts:

In an interview on the AI Inside podcast, Fox said personal context features that would connect AI Mode to Gmail and other Google apps are “still to come” with no public timeline.

AI Mode queries run two to three times longer than traditional searches. Google rolled out a preferred sources feature globally and announced improvements to links within AI experiences.

Why This Matters

The personal context delay affects how you should think about AI Mode optimization. If you’ve been preparing for a world where AI Mode knows users’ email confirmations and calendar entries, that world isn’t arriving soon. Currently, users manually add context to longer queries.

That changes what you prioritize. Content still needs to answer the longer, more specific questions users are asking. But the automated personalization layer that might have made some informational queries feel self-contained inside Google’s interface isn’t active yet.

The 75 million daily active user figure matters for traffic planning. AI Mode is no longer a small experiment. It’s a significant channel that’s still evolving. The query length data (two to three times longer than traditional searches) suggests users are having conversations rather than making quick lookups, which affects what content formats and depth work best.

What People Are Saying

AI Inside shared additional highlights on LinkedIn:

“Nick Fox suggests that optimizing for Google’s AI experiences mirrors the approach for traditional search: building a great site with great content

… focus on building for users and creating content that resonates with human readers.”

Read our full coverage: Google’s AI Mode Personal Context Features “Still To Come”

Google Launches Gemini 3 Flash With Faster Performance

Google launched Gemini 3 Flash, its latest AI model focused on speed and efficiency, and immediately shipped it in search products.

Key Facts:

Gemini 3 Flash delivers improved performance across benchmarks while maintaining faster response times than previous models. It’s now the default model in the Gemini app and in AI Mode for Search.

Why SEOs Should Pay Attention

Google’s shipping speed for Gemini 3 Flash suggests how AI model updates might flow into search products going forward. Rather than waiting months between model releases and search integration, you’re now dealing with immediate deployment of new models that can change how AI features behave.

Faster performance matters for user experience in AI Mode and AI Overviews, where latency affects whether people keep using those features or switch to traditional results. Faster models also make longer multi-turn interactions more practical, potentially leading to more search sessions.

What People Are Saying

Robby Stein, SVP of Product for Google Search, posted about the rollout on LinkedIn:

“3 Flash brings the incredible reasoning capabilities of Gemini 3 Pro, at the speed you expect of Search. So AI Mode better interprets your toughest, multi-layered questions – considering each of your constraints or requirements – and provides a visually digestible response along with helpful links to dive deeper on the web.”

Rhiannon Bell, VP of User Experience for Google Search, noted that this update brings Gemini 3 Pro to more users and highlighted 3 Pro’s ability to redesign search results:

“My team is constantly thinking about what “helpful” design means, and Gemini 3 Pro is allowing us to fundamentally re-architect what a helpful Search response looks like.”

Hema Budaraju, vice president of product management for Search at Google, highlighted the “speed and smarts”:

“As product builders, we often need to balance speed and smarts. Today, we’re bringing that even closer together: Gemini 3 Flash is rolling out globally in Search as the new default model for AI Mode… We’re also putting our Pro models in more hands. Gemini 3 Pro is now available to everyone in the U.S.”

Read our full coverage: Google Gemini 3 Flash Becomes Default In Gemini App & AI Mode

AI Mode & AI Overviews Cite Same URLs Only 13.7% Of The Time

Ahrefs analyzed 730,000 query pairs and found that AI Mode and AI Overviews reach semantically similar conclusions 86% of the time but cite the same specific URLs just 13.7% of the time.

Key Facts:

Ahrefs compared AI Mode and AI Overview responses across identical queries. While both experiences frequently agree on general information, they’re pulling that information from different sources.

Why SEOs Should Pay Attention

You’re dealing with a split optimization target. Getting cited in AI Overviews doesn’t automatically get you cited in AI Mode, even when both systems are answering the same query with similar information. These are two separate citation engines, not one system with different interfaces.

If you track which AI experience appears for your target queries, you can focus citation efforts accordingly. For queries where AI Mode dominates, publishing frequency and content freshness may matter more. For queries where AI Overviews appear, authority signals and deep resource coverage may matter more.

The 13.7% overlap suggests many sites will see uneven results across surfaces. You might do well in one experience without automatically carrying that visibility into the other.
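Ahrefs hasn’t published its exact methodology, but if you track AI citations for your own queries, a rough overlap calculation might look like the sketch below. All query strings and URLs here are hypothetical placeholders.

```python
def citation_overlap(ai_mode_citations, ai_overview_citations):
    """Share of queries where the two surfaces cite at least one common URL.

    Both arguments map query -> set of cited URLs. Only queries present
    in both datasets are compared.
    """
    shared_queries = ai_mode_citations.keys() & ai_overview_citations.keys()
    if not shared_queries:
        return 0.0
    matches = sum(
        1 for q in shared_queries
        if ai_mode_citations[q] & ai_overview_citations[q]  # any URL in common?
    )
    return matches / len(shared_queries)

# Hypothetical tracked data for two queries:
ai_mode = {
    "best crm software": {"example.com/crm", "blog.example.org/top-crms"},
    "what is inp": {"web.dev-example.com/inp"},
}
ai_overviews = {
    "best crm software": {"example.com/crm"},
    "what is inp": {"another-site.com/inp-guide"},
}

print(citation_overlap(ai_mode, ai_overviews))  # 0.5: the surfaces share a URL on 1 of 2 queries
```

Run at scale across your tracked query set, a number like Ahrefs’ 13.7% would mean the two experiences rarely agree on sources even when they agree on answers.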

What People Are Saying

Despina Gavoyannis, senior SEO specialist at Ahrefs, summarized the results on LinkedIn:

“Only 13.7% citation overlap … 86% semantic similarity … In short, 9 out of 10 times, AI Mode and AI Overviews agreed on what to say; they just said it differently and cited different sources.”

Read our full coverage: Google AI Mode & AI Overviews Cite Different URLs, Per Ahrefs Report

Theme Of The Week: AI Search In Practice, Not Theory

Each story this week shows AI search moving from promise to operational reality.

AI Mode’s 75 million daily users and the immediate Gemini 3 Flash deployment reveal that Google’s AI features are production systems at scale, not experiments. The personal context delay shows the gap between what was announced and what’s shipping. The citation study quantifies how differently these systems work despite appearing similar.

For you, this week is about treating AI search as current infrastructure rather than future speculation. Optimize for how AI Mode and AI Overviews work today: longer manual queries without personal context, immediate model updates that can change behavior, and separate optimization targets for each experience.

The features Google promised at I/O aren’t here yet, but 75 million people are using what is here.


Featured Image: Pixel-Shot/Shutterstock

Search & Social: How To Engineer Cross-Channel Synergy via @sejournal, @rio_seo

When your search and social strategies are intertwined, they work together like a well-oiled machine, and your search visibility can multiply.

For years, SEO and social media teams more often than not operated in silos, rarely engaging with each other and never working in tandem. SEO focused on optimizing for the latest Google algorithm update while social media teams worked earnestly to respond to brand mentions.

Today, these functions must move from parallel paths to genuine collaboration. Audience engagement on social platforms can influence how search engines interpret trust, authority, and relevance.

Google’s Helpful Content evolution highlighted social platforms in the search engine results pages (SERPs). Discussion forums like Reddit and Quora often surface answers to queries at the top of the SERPs, especially answers that have plenty of comments and upvotes.

Modern marketing means SEO and social go hand in hand, building unified systems that maximize cross-channel amplification. Together, these once-divergent roles work toward the same goals: helping your business rank higher, improving brand recognition, and building a consistent story across every touchpoint.

Why Search And Social Belong Together

Search and social belong together. They aren’t pursuing divergent tactics; they’re working in unison to compound your marketing and SEO efforts. The marriage of the two improves the customer experience from the first search to the reviews that aid the decision-making phase of the sales journey.

Here’s what that synergy might look like in practice.

1. Social Creates The Spark Of Discovery

A decade ago, traditional blue links reigned supreme. Today, social media is “top of the funnel” for organic search. According to GWI, nearly half (46%) of Gen Z turns to social media first when conducting product research. Not Google. But many of those users will later turn to search to validate and compare what they discovered on social media.

Social media content shouldn’t just be entertaining or chasing the latest viral trend. It must answer questions your customers are asking. Smart marketing leaders analyze trending social conversations to discover the right queries and phrases people are using related to their products or services. They’re then working with SEO teams to optimize for those terms in the form of visual and written content, as well as back-end optimizations.

Knowing that social sentiment is often the early determinant of rising search demand, it’s crucial for CMOs, SEOs, and social marketers alike to watch for engagement spikes around an emerging topic and create high-quality content quickly in order to turn buzz into business.

2. Search Anchors And Sustains The Momentum

Social engagement is fast and fickle. What’s trending one day is quickly forgotten the next. Search visibility, on the other hand, is a slow process that doesn’t happen overnight. Together, they create the right balance of speed and longevity. A social post may receive thousands of comments in a matter of hours, but an optimized landing page built on that same topic can rank and drive sales for years to come.

Consider Gong, which generates roughly 2.2 million visits a month from organic traffic, according to SimilarWeb. The revenue intelligence platform invests heavily in growing its LinkedIn presence. At the bottom of Gong’s blog posts, readers aren’t asked to navigate to a demo or a related post; they’re invited to follow Gong on LinkedIn, and those efforts are paying off.

Gong has 315,000 followers on LinkedIn. Its competitor, Chorus, has about a third as many. Additionally, Gong shares about 10-15 posts on its company page per week. That velocity has paid off: many of its posts receive thousands of interactions and hundreds of comments. This type of momentum is what Google favors and pays attention to, making Gong’s content more likely to be highlighted in the SERPs.

3. Shared Data Creates Precision

When SEO and social data remain separated, it’s impossible to see the bigger picture and extract key takeaways. Integrating both data sets helps marketing leaders identify what’s working and what isn’t. It showcases what content is delivering return on investment and which should be repurposed. It identifies patterns such as posts that earn high engagement but low search volume or blog posts that earn clicks but fail to be shared on social.

By cross-referencing these insights, teams gain a 360° view of their performance. That level of insight fuels smarter creative, better results, and higher ROI.

How To Engineer Cross-Channel Synergy

Bridging the gap between SEO and social teams requires work. When two teams are accustomed to working independently, structure and strategy must come into play. Below are six tactics to ensure cross-team synergy is as seamless as possible.

1. Share Objectives

Merge SEO and social teams with intent, aligning on KPIs to ensure everyone is working towards the same goal. Creating joint goals, such as brand visibility, intent coverage, and more, helps teams come together to maximize organizational success.

For example, both SEOs and social marketers should work towards visibility, tracking growth of branded keywords, hashtags, and mentions (both on social and search). Joint goals motivate teams to work closely together, turning to one another to pave the path towards success. This shared measurement philosophy removes team rivalry and breeds co-creators of growth.

2. Plan Content Around Signals

Building content around internal agendas rarely works well. Cross-channel listening opens the door to conversations content marketers often aren’t part of. Social media marketers use social listening to detect emotional signals (what people care about now), while SEOs measure search data to discern what users will look for next. Merging the two enables content marketers to create click-worthy, relevant content that meets audiences exactly where interest turns into action.

Forecasting content identifies future search demand by tracking early-stage social conversations, leading to a strategy that stays well ahead of your competitors.

3. Implement A Content Relay System

Top-performing brands treat search and social as relay partners. They work together for the greater good of the organization and embrace the team player ideology. Here’s how the content relay model works when implemented right:

  1. Social Spark: Social media teams create a thought leadership thread, poll, or conversation starter in hopes of attracting interest and engagement.
  2. Search Foundation: Based on the responses, social hands off those insights to content to produce a more detailed blog or landing page. SEO helps optimize the content to improve the chances of appearing in the SERPs.
  3. Social Reinforcement: Once the piece has been optimized for search, hand it back to social to share with audience-driven context (you asked, we answered/analyzed).
  4. Search Reinforcement: Embed high-performing social content (such as quotes, videos, or user-generated content) into pages for richer signals. Use structured data to tell search engines what the content is and how to index it.

Every piece of content fuels another, creating a loop of engagement, validation, and authority that compounds across platforms and extends the content’s lifetime.
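The “Search Reinforcement” step leans on structured data. A minimal sketch of what that markup could look like, here as Article markup with an embedded social video as a VideoObject. The Schema.org types and properties are real, but every name, date, and URL below is a placeholder; the generated JSON would go inside a `<script type="application/ld+json">` tag on the page.

```python
import json

# Hypothetical Article markup that embeds a repurposed social video.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "You Asked, We Answered: Top Questions From Our LinkedIn Poll",
    "author": {"@type": "Organization", "name": "Example Brand"},
    "datePublished": "2025-11-01",
    # The high-performing social clip embedded in the page:
    "video": {
        "@type": "VideoObject",
        "name": "Poll results walkthrough",
        "uploadDate": "2025-11-01",
        "thumbnailUrl": "https://example.com/thumb.jpg",
        "contentUrl": "https://example.com/poll-video.mp4",
    },
}

print(json.dumps(article_jsonld, indent=2))
```

Telling search engines explicitly what the embedded content is, rather than leaving them to infer it, is what gives the relay’s final leg its “richer signals.”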

4. Pair AI With Human Expertise

AI isn’t a replacement for human creativity and expertise. It’s merely an aid to help power smarter business decisions. In the case of social media and search, AI-powered tools can be used to help analyze language consistency and detect sentiment shifts. For example, if users are consistently complaining about long wait times at your fast-food chain in Memphis, TN, AI can flag this as an issue that needs to be resolved before your reputation and bottom line suffer.

Similarly, AI can also identify when your top-performing social post is driving branded search volume or when a keyword starts trending related to your products or services in user-generated content. Intelligent automation enables your team to be notified in real time, allowing you to strike while the iron is hot.

5. Align Leadership And Cultural Change

Marketing leaders must create environments where SEOs and social media team members understand why and how they’re working together. This might include:

  • Hosting bi-weekly meetings to get both teams up to speed on shared goals and priorities.
  • Creating “bridge roles” like Audience Insights Manager.
  • Recognizing shared wins (e.g., content that ranked and went viral on TikTok).
  • Providing transparency into what both teams are working on and towards.
  • Holding in-person team-building events so both teams can connect outside of work.

A good company culture that fosters collaboration is imperative for team building, employee retention, and business success. When collaboration feels like extra work or leaves one team in the dark, performance and employee satisfaction suffer.

6. Embrace An Ecosystem Mentality

Once marketing leaders align data, culture, and goals, the organization’s ecosystem begins to operate like a living, breathing organism. Search informs social, social accelerates search, and together they improve the longevity of your business. In return, your business becomes more resilient to Google’s constant algorithm evolution, and strategy shifts from siloed, stagnant results to seamless execution.

A Real-World Case: Social And Search Synergy In Action

When I worked with a leading fast-casual Mexican restaurant, the business had inconsistent reviews across its hundreds of locations. We centralized customer feedback and identified common complaints and praise, which led to a revamped online reputation.

Within just two months, according to our agency internal rating metrics, the chain’s average star rating rose from 4.2 to 4.4, five-star reviews increased by 32%, and no one-star reviews were left during that time period. Positive feedback trends emerged almost immediately, signaling local teams were acting on customer feedback faster and more diligently.

The ripple effects reached both search and social ecosystems as improved reviews and higher star ratings typically lead to a boost in visibility in Google Search and Maps. Simultaneously, the same credibility fueled social proof across the brand’s social platforms, where patrons frequently leave both positive and negative feedback.

Search visibility was boosted by review quality, and social visibility was enhanced by customer advocacy. Together, they created a unified trust signal that influenced consumer behavior across every touchpoint. That is the power of marrying search and social: a blissful union that drives favorable outcomes like visibility that converts.

Future-Facing: The Algorithmic Convergence Of Search And Social

We are now in an era where search and social converge effortlessly. TikTok is an influential discovery engine, while Google’s prominent AI Overviews pull in content that resembles social threads. Social content and discussion forums are now indexed prominently in the SERPs.

SEO should maintain semantic and emotional consistency at every step of the digital buyer’s journey, across all channels.

Marketing executives should ask themselves the following:

  • How do we establish a unified signal map? How does your audience move from discovery to intent? Which social triggers lead to which search behaviors?
  • How can we centralize our listening structure? Does our social listening platform allow us to integrate with our search analytics technology?
  • How can we create rapid-response workflows to capitalize on trending topics before our competitors do?
  • Do we need to reevaluate our reporting cadence? How do we move from channel-based reports to intent-based dashboards that track trending topics across platforms?
  • Are we relying too heavily on AI? Do we use human judgment to craft narratives that align with our brand’s voice and ethics?

Search and social are no longer divergent roles that never speak to one another. They’re an integral effort that plays for the same team and can amplify one another to create something bigger and better than either could solo.


Featured Image: SvetaZi/Shutterstock

Sam Altman Explains OpenAI’s Bet On Profitability via @sejournal, @martinibuster

In an interview with the Big Technology Podcast, Sam Altman seemed to struggle answering the tough questions about OpenAI’s path to profitability.

At about the 36-minute mark, the interviewer asked the big question about revenue and spending. Sam Altman said OpenAI’s losses are tied to continued increases in training costs while revenue is growing. He said the company would be profitable much earlier if it were not continuing to grow its training spend so aggressively.

Altman said concern about OpenAI’s spending would be reasonable only if the company reached a point where it had large amounts of computing it could not monetize profitably.

The interviewer asked:

“Let’s, let’s talk about numbers since you brought it up. Revenue’s growing, compute spend is growing, but compute spend still outpaces revenue growth. I think the numbers that have been reported are OpenAI is supposed to lose something like 120 billion between now and 2028, 29, where you’re going to become profitable.

So talk a little bit about like, how does that change? Where does the turn happen?”

Sam Altman responded:

“I mean, as revenue grows and as inference becomes a larger and larger part of the fleet, it eventually subsumes the training expense. So that’s the plan. Spend a lot of money training, but make more and more.

If we weren’t continuing to grow our training costs by so much, we would be profitable way, way earlier. But the bet we’re making is to invest very aggressively in training these big models.”

At this point, the interviewer pressed Altman harder about the path to profitability, this time citing the $1.4 trillion in spending commitments versus $20 billion in revenue. This was not a softball question.

The interviewer pushed back:

“I think it would be great just to lay it out for everyone once and for all how those numbers are gonna work.”

Sam Altman’s first attempt at an answer came out as something of a word salad:

“It’s very hard to like really, I find that one thing I certainly can’t do it and very few people I’ve ever met can do it.

You know, you can like, you have good intuition for a lot of mathematical things in your head, but exponential growth is usually very hard for people to do a good quick mental framework on.

Like for whatever reason, there were a lot of things that evolution needed us to be able to do well with math in our heads. Modeling exponential growth doesn’t seem to be one of them.”

Altman then regained his footing with a more coherent answer:

“The thing we believe is that we can stay on a very steep growth curve of revenue for quite a while. And everything we see right now continues to indicate that we cannot do it if we don’t have the compute.

Again, we’re so compute constrained, and it hits the revenue line so hard that I think if we get to a point where we have like a lot of compute sitting around that we can’t monetize on a profitable per unit of compute basis, it’d be very reasonable to say, okay, this is like a little, how’s this all going to work?

But we’ve penciled this out a bunch of ways. We will of course also get more efficient on like a flops per dollar basis, as you know, all of the work we’ve been doing to make compute cheaper comes to pass.

But we see this consumer growth, we see this enterprise growth. There’s a whole bunch of new kinds of businesses that, that we haven’t even launched yet, but will. But compute is really the lifeblood that enables all of this.

We have always been in a compute deficit. It has always constrained what we’re able to do.

I unfortunately think that will always be the case, but I wish it were less the case, and I’d like to get it to be less of the case over time, because I think there’s so many great products and services that we can deliver, and it’ll be a great business.”

The interviewer then sought to clarify the answer, asking:

“And then your expectation is through things like this enterprise push, through things like people being willing to pay for ChatGPT through the API, OpenAI will be able to grow revenue enough to pay for it with revenue.”

Sam Altman responded:

“Yeah, that is the plan.”

Altman’s comments define a specific threshold for evaluating whether OpenAI’s spending is a problem. He points to unused or unmonetizable computing power as the point at which concern would be justified, rather than current losses or large capital commitments.

In his explanation, the limiting factor is not willingness to pay, but how much computing capacity OpenAI can bring online and use. The follow-up question makes that explicit, and Altman’s confirmation makes clear that the company is relying on revenue growth from consumer use, enterprise adoption, and additional products to cover its costs over time.

Altman’s path to profitability rests on a simple bet: that OpenAI can keep finding buyers for its computing as fast as it can build it. Eventually, that bet either keeps winning or the chips run out.

Watch the interview starting at about the 36-minute mark:

Featured Image/Screenshot

Core Web Vitals Champ: Open Source Versus Proprietary Platforms via @sejournal, @martinibuster

The Core Web Vitals Technology Report by the open source HTTPArchive community ranks content management systems by how well they perform on Google’s Core Web Vitals (CWV). The November 2025 data shows a significant gap between platforms: 84.87% of sites on the highest-ranked CMS passed CWV, while only 46.28% passed on the lowest-ranked CMS.

What’s of interest this month is that the top three Core Web Vitals champs are all closed source proprietary platforms while the open source systems were at the bottom of the pack.

Importance Of Core Web Vitals

Core Web Vitals (CWV) are metrics created by Google to measure how fast, stable, and responsive a website feels to users. Websites that load quickly and respond smoothly keep visitors engaged and tend to perform better in terms of sales, reads, and ad impressions, while sites that fall short frustrate users, increase bounce rates, and underperform on business goals. CWV scores reflect the quality of the user experience and how a site performs under real-world conditions.

How the Data Is Collected

The CWV Technology Report combines two public datasets.

  • The Chrome UX Report (CrUX) uses data from Chrome users who opt in to share performance statistics as they browse. This reflects how real users experience websites.
  • The HTTP Archive runs lab-based tests that analyze how sites are built and whether they follow performance best practices.

Together, these two datasets power the report I generated, which provides a snapshot of how each content management system performs on Core Web Vitals.
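A site “passes” CWV when the 75th-percentile field value of each metric falls in Google’s published “good” ranges: LCP at or under 2.5 seconds, INP at or under 200 ms, and CLS at or under 0.1. A minimal sketch of that pass/fail logic; the example field values are hypothetical:

```python
# Google's published "good" thresholds for the three Core Web Vitals.
LCP_GOOD_MS = 2500   # Largest Contentful Paint, milliseconds
INP_GOOD_MS = 200    # Interaction to Next Paint, milliseconds
CLS_GOOD = 0.1       # Cumulative Layout Shift, unitless

def passes_cwv(p75_lcp_ms, p75_inp_ms, p75_cls):
    """True when all three 75th-percentile field metrics are 'good'.

    Real assessments use CrUX field data at the 75th percentile;
    a single metric in the 'needs improvement' or 'poor' range fails the site.
    """
    return (
        p75_lcp_ms <= LCP_GOOD_MS
        and p75_inp_ms <= INP_GOOD_MS
        and p75_cls <= CLS_GOOD
    )

# Hypothetical 75th-percentile field values for two sites:
print(passes_cwv(2100, 180, 0.05))  # True: all three metrics are "good"
print(passes_cwv(3200, 180, 0.05))  # False: LCP exceeds 2.5 seconds
```

The report’s per-CMS percentages are, in effect, the share of each platform’s sites for which this check returns true.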

Ranking By November 2025 CWV Score

Duda Is The Number One Ranked Core Web Vitals Champ

Duda ranked first in November 2025, with 84.87% of sites built on the platform delivering a passing Core Web Vitals score. It was the only platform in this comparison where more than four out of five sites achieved a good CWV score. Duda has consistently ranked #1 for Core Web Vitals for several years now.

Wix Ranked #2

Wix ranked second, with 74.86% of sites passing CWV. While it trailed Duda by ten percentage points, Wix was just about four percentage points ahead of the third place CMS in this comparison.

Squarespace Ranked #3

Squarespace ranked third, at 70.39%. Its CWV pass rate placed it closer to Wix than to Drupal, maintaining a clear position in the top three ranked publishing platforms.

Drupal Ranked #4

Drupal ranked fourth, with 63.27% of sites passing CWV. That score put Drupal in the middle of the comparison, below the three proprietary site builders. It’s a curious pattern, because the bottom three CMSes in this comparison are all open source platforms.

Joomla Ranked #5

Joomla ranked fifth, at 56.92%. While more than half of Joomla sites passed CWV, the platform remained well behind the top performers.

WordPress Ranked Last At #6

WordPress ranked last, with 46.28% of sites passing Core Web Vitals. Fewer than half of WordPress sites met the CWV thresholds in this snapshot. What’s notable about WordPress’s poor showing is that it lags fifth-place Joomla by about ten percentage points. So not only is WordPress last in this comparison, it’s decisively last.

Why the Numbers Matter

Core Web Vitals scores translate into measurable differences in how users experience websites. Platforms at the top of the ranking deliver faster and more stable experiences across a larger share of sites, while platforms at the bottom expose a greater number of users to slower and less responsive pages. The gap between Duda and WordPress in the November 2025 comparison was 38.59 percentage points, nearly 40.

While an argument can be made that the WordPress ecosystem of plugins and themes may be to blame for the low CWV scores, the fact remains that WordPress is dead last in this comparison. Perhaps WordPress needs to become more proactive about theme and plugin performance, such as by establishing standards they must meet to earn a performance certification. That might push plugin and theme makers to prioritize performance.

Do Content Management Systems Matter For Ranking?

I have mentioned this before and will repeat it this month. There have been discussions and debates about whether the choice of content management system affects search rankings. Some argue that plugins and flexibility make WordPress easier to rank in Google. But the fact is that private platforms like Duda, Wix, and Squarespace have all focused on providing competitive SEO functionalities that automate a wide range of technical SEO tasks.

Some people insist that Core Web Vitals make a significant contribution to their rankings and I believe them. But in general, the fact is that CWV performance is a minor ranking factor.

Nevertheless, performance still matters for outcomes that are immediate and measurable, such as user experience and conversions, which means that the November 2025 HTTPArchive Technology Report should not be ignored.

The HTTPArchive report is available here, but it will soon be replaced. I’ve tried the new report and, unless I missed something, it lacks a way to constrain results by date.

Featured Image by Shutterstock/Red Fox studio

Google Says Ranking Systems Reward Content Made For Humans via @sejournal, @martinibuster

Google’s Danny Sullivan discussed SEO and AI, observing that Google’s ranking systems are tuned for one thing, regardless of whether it’s classic search or AI search. What he talked about was optimizing for people, which is something I suspect the search marketing industry will increasingly be talking about.

Nothing New You Need To Be Doing For AI Search

The first thing Danny Sullivan discussed was that, despite the new search experiences powered by AI, there isn’t anything new that publishers need to be doing.

John Mueller asked:

“So everything kind of around AI, or is this really a new thing? It feels like these fads come and go. Is AI in fad? How do you think?”

Danny Sullivan responded:

“Oh gosh, my favorite thing is that we should be calling it LMNOPEO because there’s just so many acronyms for it. It’s GEO for generative engine optimization or AEO for answer engine optimization and AIEO. I don’t know. There’s so many different names for it.

I used to write about SEO and search. I did that for like 20 years. And part of me is just so relieved. I don’t have to do that aspect of it anymore to try to keep up with everything that people are wondering about.

And on the other hand, you still have to kind of keep up on it because we still try to explain to people what’s going on. And I think the good news is like, There’s not a lot you actually really need to be worrying about.

It’s understandable. I think people keep having these questions, right? I mean, you see search formats changing, you see all sorts of things happening and you wonder, well, is there something new I should be doing? Totally get that.

And remember, we, John and I and others, we all came together because we had this blog post we did in May, which we’ll drop a link to or we’ll point you to somehow to it, but it was… we were getting asked again and again, well, what should we be doing? What should we be thinking about?

And we all put our heads together and we talked with the engineers and everything else. So we came up with nothing really that different.”

Google’s Systems Are Tuned To Rank Human Optimized Content

Danny Sullivan next turned to what Google’s systems are designed to rank: content that satisfies humans. Robby Stein, currently Vice President of Product for Google Search, recently discussed the signals Google uses to identify helpful content, explaining how human feedback helps ranking systems understand what helpful content looks like.

While Danny didn’t get into the exact details of the helpfulness signals the way Stein did, his comments confirmed Stein’s underlying point: Google’s systems are tuned to identify content that satisfies humans.

Danny continued explaining what SEOs and creators should know about Google’s ranking systems. He began by acknowledging that it’s reasonable that people see a different search experience and conclude that they must be doing something different.

He explained:

“…I think people really see stuff and they think they want to be doing something different. …It is the natural reaction you have, but we talk about sort of this North Star or the point that you should be heading to.”

Next, he explained that all of Google’s ranking systems are engineered to rank content made for humans, specifically calling out content created for search engines as an example of what not to do.

Danny continued his answer:

“And when it comes to all of our ranking systems, it’s about how are we trying to reward content that we think is great for people, that it was written for human beings in mind, not written for search algorithms, not written for LLMs, not written for LMNO, PEO, whatever you want to call it.

It’s that everything we do and all the things that we tailor and all the things that we try to improve, it’s all about how do we reward content that human beings find satisfying and say, that was what I was looking for, that’s what I needed. So if all of our systems are lining up with that, it’s that thing about you’re going to be ahead of it if you’re already doing that.

To whereas the more you’re trying to… Optimize or GEO or whatever you think it is for a specific kind of system, the more you’re potentially going to get away from the main goal, especially if those systems improve and get better, then you’re kind of having to shift and play a lot of catch up.

So, you know, we’re going to talk about some of that stuff here with the big caveat, we’re only talking about Google, right? That’s who we work for. So we don’t say what, anybody else’s AI search, chat search, whatever you want to kind of deal with and kind of go with it from there. But we’ll talk about how we look at things and how it works.”

What Danny is saying is clear: Google is tuned to rank content that’s written for humans, and optimizing for specific LLMs sets up a situation that could backfire.

Why Optimizing For LLMs Is Misguided

Although Danny didn’t mention it, this is the right moment to point out that OpenAI, Perplexity, and Claude combined refer less than 1% of traffic to websites. It’s clearly a mistake to optimize content for LLMs at the risk of losing significant traffic from search engines.

Content that is genuinely satisfying to people remains aligned with what Google’s systems are built to reward.

Why SEOs Don’t Believe Google

Google’s insistence that its algorithms are tuned toward user satisfaction is not new. The company has been saying it for over two decades, and for much of that time it was taken as a given that Google was overstating its technology. That is no longer the case.

Arguably, since at least the 2018 Medic broad core update, Google has made genuine strides toward delivering search results shaped by user behavior signals, which guide its machines toward understanding what kinds of content people like, along with AI and neural networks that are better able to match content to a search query.

If there is any doubt about this, check out the interview with Robbie Stein, where he explains exactly how human feedback, in aggregate, influences the search results.

Is Human Optimized Content The New SEO?

So now we are at a point where links are no longer the top ranking criterion. Google’s systems can understand queries and content and match one to the other. User behavior data, which has been part of Google’s algorithms since at least 2004, plays a strong role in helping Google understand what kinds of content satisfy users.

It may be well past time for SEOs and creators to let go of the old SEO playbooks and start focusing on optimizing their websites for humans.

Featured Image by Shutterstock/Bas Nastassia

Can AI really help us discover new materials?

Judging from headlines and social media posts in recent years, one might reasonably assume that AI is going to fix the power grid, cure the world’s diseases, and finish my holiday shopping for me. But maybe there’s just a whole lot of hype floating around out there.

This week, we published a new package called Hype Correction. The collection of stories takes a look at how the world is starting to reckon with the reality of what AI can do, and what’s just fluff.

One of my favorite stories in that package comes from my colleague David Rotman, who took a hard look at AI for materials research. AI could transform the process of discovering new materials—innovation that could be especially useful in the world of climate tech, which needs new batteries, semiconductors, magnets, and more. 

But the field still needs to prove it can make materials that are actually novel and useful. Can AI really supercharge materials research? What could that look like?

For researchers hoping to find new ways to power the world (or cure disease or achieve any number of other big, important goals), a new material could change everything.

The problem is, inventing materials is difficult and slow. Just look at plastic—the first totally synthetic plastic was invented in 1907, but it took until roughly the 1950s for companies to produce the wide range we’re familiar with today. (And of course, though it is incredibly useful, plastic also causes no shortage of complications for society.)

In recent decades, materials science has fallen a bit flat—David has been covering this field for nearly 40 years, and as he puts it, there have been just a few major commercial breakthroughs in that time. (Lithium-ion batteries are one.)

Could AI change everything? The prospect is a tantalizing one, and companies are racing to test it out.

Lila Sciences, based in Cambridge, Massachusetts, is working on using AI models to uncover new materials. The company can not only train an AI model on all the latest scientific literature, but also plug it into an automated lab, so it can learn from experimental data. The goal is to speed up the iterative process of inventing and testing new materials and look at research in ways that humans might miss.

At an MIT Technology Review event earlier this year, I got to listen to David interview Rafael Gómez-Bombarelli, one of Lila’s cofounders. As he described what the company is working on, Gómez-Bombarelli acknowledged that AI materials discovery hasn’t seen its big breakthrough moment. Yet.

Gómez-Bombarelli described how models Lila has trained are providing insights that are “as deep [as] or deeper than our domain scientists would have.” In the future, AI could “think” in ways that depart from how human scientists approach a problem, he added: “There will be a need to translate scientific reasoning by AI to the way we think about the world.”

It’s exciting to see this sort of optimism in materials research, but there’s still a long and winding road before we can satisfyingly say that AI has transformed the field. One major difficulty is that it’s one thing to take suggestions from a model about new experimental methods or new potential structures. It’s quite another to actually make a material and show that it’s novel and useful.

You might remember that a couple of years ago, Google’s DeepMind announced it had used AI to predict the structures of “millions of new materials” and had made hundreds of them in the lab.

But as David notes in his story, after that announcement, some materials scientists pointed out that some of the supposedly novel materials were basically slightly different versions of known ones. Others couldn’t even physically exist in normal conditions (the simulations were done at ultra-low temperatures, where atoms don’t move around much).

It’s possible that AI could give materials discovery a much-needed jolt and usher in a new age that brings superconductors and batteries and magnets we’ve never seen before. But for now, I’m calling hype. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

The 8 worst technology flops of 2025

Welcome to our annual list of the worst, least successful, and simply dumbest technologies of the year.

This year, politics was a recurring theme. Donald Trump swept back into office and used his executive pen to reshape the fortunes of entire sectors, from renewables to cryptocurrency. The wrecking-ball act began even before his inauguration, when the president-elect marketed his own memecoin, $TRUMP, in a shameless act of merchandising that, of course, we honor on this year’s worst tech list.

We like to think there’s a lesson in every technological misadventure. But when technology becomes dependent on political power, sometimes the takeaway is simpler: it would have been better to stay away.

That was a conclusion Elon Musk drew from his sojourn as instigator of DOGE, the insurgent cost-cutting initiative that took a chainsaw to federal agencies. The public protested. Teslas were set alight, and drivers of his hyped Cybertruck discovered that instead of a thumbs-up, they were getting the middle finger.

On reflection, Musk said he wouldn’t do it again. “Instead of doing DOGE, I would have, basically … worked on my companies,” he told an interviewer this month. “And they wouldn’t have been burning the cars.”

Regrets—2025 had a few. Here are some of the more notable ones.

NEO, the home robot

1X TECH

Imagine a metal butler that fills your dishwasher and opens the door. It’s a dream straight out of science fiction. And it’s going to remain there—at least for a while.

That was the hilarious, and deflating, takeaway from the first reviews of NEO, a 66-pound humanoid robot whose maker claims it will “handle any of your chores reliably” when it ships next year.

But as a reporter for the Wall Street Journal learned, NEO took two minutes to fold a sweater and couldn’t crack a walnut. Not only that, but the robot was teleoperated the entire time by a person wearing a VR visor.

Still interested? NEO is available for preorder for $20,000 from startup 1X.

More: I Tried the Robot That’s Coming to Live With You. It’s Still Part Human (WSJ), The World’s Stupidest Robot Maid (The Daily Show), Why the humanoid workforce is running late (MIT Technology Review), NEO The Home Robot | Order Today (1X Corp.)

Sycophantic AI

It’s been said that San Francisco is the kind of place where no one will tell you if you have a bad idea. And its biggest product in a decade—ChatGPT—often behaves exactly that way.

This year, OpenAI released an especially sycophantic update that told users their mundane queries were brilliantly incisive. This electronic yes-man routine isn’t an accident; it’s a product strategy. Plenty of people like the flattery.

But it’s disingenuous and dangerous, too. Chatbots have shown a willingness to indulge users’ delusions and worst impulses, up to and including suicide.

In April, OpenAI acknowledged the issue when the company dialed back a model update whose ultra-agreeable personality, it said, had the side effect of “validating doubts, fueling anger, urging impulsive actions, or reinforcing negative emotions.”

Don’t you dare agree the problem is solved. This month, when I fed ChatGPT one of my dumbest ideas, its response began: “I love this concept.”

More: What OpenAI Did When ChatGPT Users Lost Touch With Reality (New York Times), Sycophantic AI Decreases Prosocial Intentions and Promotes Dependence (arXiv), Expanding on what we missed with sycophancy (OpenAI)

The company that cried “dire wolf”

Two dire wolves are seen at 3 months old.

COLOSSAL BIOSCIENCES

When you tell a lie, tell it big. Make it frolic and give it pointy ears. And make it white. Very white.

That’s what the Texas biotech concern Colossal Biosciences did when it unveiled three snow-white animals that it claimed were actual dire wolves, which went extinct more than 10 millennia ago.

To be sure, these genetically modified gray wolves were impressive feats of engineering. They’d been made white via a genetic mutation and even had some bits and bobs of DNA copied over from old dire wolf bones. But they “are not dire wolves,” according to canine specialists at the International Union for Conservation of Nature.

Colossal’s promotional blitz could hurt actual endangered species. Presenting de-extinction as “a ready-to-use conservation solution,” said the IUCN, “risks diverting attention from the more urgent need of ensuring functioning and healthy ecosystems.”

In a statement, Colossal said that sentiment analysis of online activity shows 98% agreement with its furry claims. “They’re dire wolves, end of story,” it said.

More: Game of Clones: Colossal’s new wolves are cute, but are they dire? (MIT Technology Review), Conservation perspectives on gene editing in wild canids (IUCN), A statement from Colossal’s Chief Science Officer, Dr. Beth Shapiro (Reddit)

mRNA political purge

MITTR | GETTY IMAGES

Save the world, and this is the thanks you get?

During the covid-19 pandemic, the US bet big on mRNA vaccines—and the new technology delivered in record time. 

But now that America’s top health agencies are led by the antivax wackadoodle Robert F. Kennedy Jr., “mRNA” has become a political slur.

In August, Kennedy abruptly canceled hundreds of millions in contracts for next-generation vaccines. And shot maker Moderna—once America’s champion—has seen its stock slide by more than 90% since its covid peak.

The purge targeting a key molecule of life (our bodies are full of mRNA) isn’t just bizarre. It could slow down other mRNA-based medicine, like cancer treatments and gene editing for rare diseases.

In August, a trade group fought back, saying: “Kennedy’s unscientific and misguided vilification of mRNA technology and cancellation of grants is the epitome of cutting off your nose to spite your face.”

More: HHS Winds Down mRNA Vaccine Development (US Department of Health and Human Services), Cancelling mRNA studies is the highest irresponsibility (Nature), How Moderna, the company that helped save the world, unraveled (Stat News)

Greenlandic Wikipedia

WIKIPEDIA

Wikipedia has editions in 340 languages. But as of this year, there’s one less: Wikipedia in Greenlandic is no more.

Only around 60,000 people speak the Inuit language. And very few of them, it seems, ever cared much about the online encyclopedia. As a result, many of the entries were machine translations riddled with errors and nonsense.

Perhaps a website no one visits shouldn’t be a problem. But its existence created the risk of a linguistic “doom spiral” for the endangered language. That could happen if new AI models were trained on the corrupted Wikipedia articles.

In September, administrators voted to close Greenlandic Wikipedia, citing possible “harm to the Greenlandic language.”

More: Can AI Help Revitalize Indigenous Languages? (Smithsonian), How AI and Wikipedia have sent vulnerable languages into a doom spiral (MIT Technology Review), Closure of Greenlandic Wikipedia (Wikimedia)

Tesla Cybertruck

ADOBE STOCK

There’s a reason we’re late to the hate-fest around Elon Musk’s Cybertruck: 12 months ago, the polemical polygon was the #1-selling electric pickup in the US.

So maybe it would end up a hit.

Nope. Tesla is likely to sell only around 20,000 trucks this year, about half last year’s total. And a big part of the problem is that the entire EV pickup category is struggling. Just this month, Ford decided to scrap its own EV truck, the F-150 Lightning. 

With unsold inventory building, Musk has started selling Cybertrucks as fleet vehicles to his other enterprises, like SpaceX.

More: Elon’s Edsel: Tesla Cybertruck Is The Auto Industry’s Biggest Flop In Decades (Forbes), Why Tesla Cybertrucks Aren’t Selling (CNBC), Ford scraps fully-electric F-150 Lightning as mounting losses and falling demand hits EV plans (AP)

Presidential shitcoin

VIA GETTRUMPMEMES.COM

Donald Trump launched a digital currency called $TRUMP just days before his 2025 inauguration, accompanied by a logo showing his fist-pumping “Fight, fight, fight” pose.

This was a memecoin, or shitcoin, not real money. Memecoins are more like merchandise—collectibles designed to be bought and sold, usually for a loss. Indeed, they’ve been likened to a consensual scam in which a coin’s issuer can make a bundle while buyers take losses.

The White House says there’s nothing amiss. “The American public believe[s] it’s absurd for anyone to insinuate that this president is profiting off of the presidency,” said spokeswoman Karoline Leavitt in May.

More: Donald and Melania Trump’s Terrible, Tacky, Seemingly Legal Memecoin Adventure (Bloomberg), A crypto mogul who invested millions into Trump coins is getting a reprieve (CNN), How the Trump companies made $1 bn from crypto (Financial Times), Staff Statement on Meme Coins (SEC)

“Carbon-neutral” Apple Watch

APPLE

In 2023, Apple announced its “first-ever carbon-neutral product,” a watch with “zero” net emissions. It would get there using recycled materials and renewable energy, and by preserving forests or planting vast stretches of eucalyptus trees.

Critics say it’s greenwashing. This year, lawyers filed suit in California against Apple for deceptive advertising, and in Germany, a court ruled that the company can’t advertise products as carbon neutral because the “supposed storage of CO2 in commercial eucalyptus plantations” isn’t a sure thing.

Apple’s marketing team relented. Packaging for its newest watches doesn’t say “carbon neutral.” But Apple believes the legal nitpicking is counterproductive, arguing that it can only “discourage the kind of credible corporate climate action the world needs.”

More: Inside the controversial tree farms powering Apple’s carbon neutral goal (MIT Technology Review), Apple Watch not a ‘CO2-neutral product,’ German court finds (Reuters), Apple 2030: Our ambition to become carbon neutral (Apple)