This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.
China figured out how to sell EVs. Now it has to bury their batteries.
In the past decade, China has seen an EV boom, thanks in part to government support. Buying an electric car has gone from a novel decision to a routine one; by late 2025, nearly 60% of new cars sold were electric or plug-in hybrids.
But as the batteries in China’s first wave of EVs reach the end of their useful life, early owners are starting to retire their cars, and the country is now under pressure to figure out what to do with those aging components.
The issue is putting strain on China’s still-developing battery recycling industry and has given rise to a gray market that often cuts corners on safety and environmental standards. National regulators and commercial players are also stepping in, but so far these efforts have struggled to keep pace with the flood of batteries coming off the road. Read the full story.
—Caiwei Chen
The AI doomers feel undeterred
It’s a weird time to be an AI doomer. This small but influential community believes, in the simplest terms, that AI could get so good it could be bad—very, very bad—for humanity.
The doomer crowd has had some notable successes over the past several years, including helping to shape AI policy coming from the Biden administration. But a number of developments over the past six months have put them on the back foot. Talk of an AI bubble has overwhelmed the discourse as tech companies continue to invest in multiple Manhattan Projects’ worth of data centers without any certainty that future demand will match what they’re building.
So where does this leave the doomers? We decided to ask some of the movement’s biggest names to see if the recent setbacks and general vibe shift had altered their views. See what they had to say in our story.
—Garrison Lovely
This story is part of our new Hype Correction package, a collection of stories designed to help you reset your expectations about what AI makes possible—and what it doesn’t. Check out the rest of the package.
Take our quiz on the year in health and biotechnology
In just a couple of weeks, we’ll be bidding farewell to 2025. And what a year it has been! Artificial intelligence is being incorporated into more aspects of our lives, weight-loss drugs have expanded in scope, and there have been some real “omg” biotech stories from the fields of gene therapy, IVF, neurotech, and more.
This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.
The must-reads
I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.
1 TikTok has signed a deal to sell its US unit
Its new owner will be a joint venture controlled by American investors including Oracle. (Axios)
+ But the platform is adamant that its Chinese owner will retain its core US business. (FT $)
+ The deal is slated to close on January 22 next year. (Bloomberg $)
+ It means TikTok will sidestep a US ban—at least for now. (The Guardian)
2 A tip on Reddit helped to end the hunt for the Brown University shooter
The suspect, who has been found dead, is also suspected of killing an MIT professor. (NYT $)
+ The shooter’s motivation is still unclear, police say. (WP $)
3 Tech leaders are among those captured in newly released Epstein photos
Bill Gates and Google’s Sergey Brin are both in the pictures. (FT $)
+ They’ve been pulled from a tranche of more than 95,000. (Wired $)
4 A Starlink satellite appears to have exploded
And it’s now falling back to Earth. (The Verge)
+ On the ground in Ukraine’s largest Starlink repair shop. (MIT Technology Review)
5 YouTube has shut down two major channels that share fake movie trailers
Screen Culture and KH Studio uploaded AI-generated mock trailers with over a billion views. (Deadline)
+ Google is treading a thin line between embracing and shunning generative AI. (Ars Technica)
6 Trump is cracking down on investment in Chinese tech firms
Lawmakers are increasingly worried that US money is bolstering the country’s surveillance state. (WSJ $)
+ Meanwhile, China is working on boosting its chip output. (FT $)
7 ICE has paid an AI agent company to track down targets
It claims to be able to rapidly trace a target’s online network. (404 Media)
8 America wants to return to the Moon by 2028
And to build some nuclear reactors while it’s up there. (Ars Technica)
+ Southeast Asia seeks its place in space. (MIT Technology Review)
9 Actors in the UK are refusing to be scanned for AI
They’re reportedly routinely pressured to consent to creating digital likenesses of themselves. (The Guardian)
+ How Meta and AI companies recruited striking actors to train AI. (MIT Technology Review)
10 Indian tutors are explaining how to use AI over WhatsApp
Lessons are cheap and personalized—but the teachers aren’t always credible. (Rest of World)
+ How Indian health-care workers use WhatsApp to save pregnant women. (MIT Technology Review)
Quote of the day
“Trump wants to hand over even more control of what you watch to his billionaire buddies. Americans deserve to know if the president struck another backdoor deal for this billionaire takeover of TikTok.”
—Democratic senator Elizabeth Warren questions the terms of the deal TikTok has struck to continue operating in the US, in a post on Bluesky.
One more thing
Synthesia’s AI clones are more expressive than ever. Soon they’ll be able to talk back.
—Rhiannon Williams
Earlier this summer, I visited the AI company Synthesia to create a hyperrealistic AI-generated avatar of me. The company’s avatars are a decent barometer of just how dizzying progress has been in AI over the past few years, so I was curious just how accurately its latest AI model, introduced last month, could replicate me.
I found my avatar as unnerving as it is technically impressive. It’s slick enough to pass as a high-definition recording of a chirpy corporate speech, and if you didn’t know me, you’d probably think that’s exactly what it was.
My avatar shows how it’s becoming ever-harder to distinguish the artificial from the real. And before long, these avatars will even be able to talk back to us. But how much better can they get? And what might interacting with AI clones do to us? Read the full story.
We can still have nice things
A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)
+ You can keep your beef tallow—here are the food trends that need to remain firmly in 2025.
+ The Library of Congress has some lovely images of winter that are completely free to use.
+ If you’ve got a last-minute Christmas work party tonight, don’t make these Secret Santa mistakes.
+ Did you realize Billie Eilish’s smash hit Birds of a Feather has the same chord progression as Wham!’s Last Christmas? They sound surprisingly good mashed together.
I talk a lot on the podcast about business, growth, and solving problems, but at some point it’s worth stepping back to ask why we’re doing any of this in the first place.
This recap is about Beardbrand (my company) and our 2025 performance: What worked, what didn’t, what was painful, and what made it all worth it.
It’s also a reminder to take stock of your own priorities — how you’re allocating your time, energy, and attention — and whether they align with the life you’re trying to build.
The Good
Longtime listeners know that 2023 and 2024 were extremely challenging for me personally and for Beardbrand. We lost a lot of money in 2023, and a smaller but still meaningful amount in 2024. The good news is that in 2025, we became profitable again.
Looking back, our conservative financial strategy before things turned bad helped us survive. It allowed us to withstand rapid market changes and support our staff for as long as possible. That discipline helped us weather the storm.
From a growth standpoint, subscriptions have been a major win. At our lowest point, we had roughly 1,500 subscriptions. We made a focused effort to rebuild, and recently we surpassed 11,000 active subscriptions. Hitting 10,000-plus gives us predictable revenue and long-term stability. Churn has remained low, and we’re still adding members weekly, which is encouraging.
Another big win was finding the right fulfillment partner. After two moves — including one near our manufacturer that didn’t work out — we landed on a small Austin-based provider. The staff offers white-glove service, takes responsibility when issues arise, and aligns with the customer experience we want to deliver. Plus, being local helps. We can visit, meet the team, and fine-tune packaging and shipping costs.
Manufacturing has also improved. Finding the right manufacturing partner is a Goldilocks problem — not too big, not too small, just right. One of our supplier-partners discovered us through this podcast. They’ve allowed us to keep inventory lean, place smaller, more frequent orders, and maintain quality. That’s reduced customer complaints, lowered stress, and helped us avoid unsellable inventory — a major contributor to losses in prior years.
Engagement with customers has improved as we let them vote on which limited-edition fragrance would become permanent.
Another win — we subleased our oversized office, a costly remnant from when our team size was at its peak, easing a significant financial burden until the lease ends in 2026.
The Bad
The biggest hurdle is that the beard care industry has shifted from a blue to a red ocean. A blue ocean is wide open — lots of opportunity, little competition. Today, beard care feels saturated and stagnant.
I see this in search data. Terms like “how to grow a beard,” “beard oil,” and “beard balm” are flat or declining. Meanwhile, other personal care categories such as shampoo, bar soap, and cologne continue to grow. When I look at Beardbrand and our top competitors, we’re all flat or down.
One way to resume growth is with organic content. We’ve had content hits and misses, but we haven’t reliably delivered the quality and volume I want. If we fix it, we can deepen relationships with our audience and stand out again.
Paid media has also been frustrating. Like many brands, we haven’t cracked Meta at scale. We’ll find an ad that works, get excited, then watch it fall flat days later. We’ve hovered around $30,000 a month in spend without breaking through. We recently started integrating more data-driven decision-making.
I expected revenue to grow in 2025 after fixing problems from 2023 and 2024. That didn’t happen. We likely won’t beat last year’s numbers, which forced us to make painful staffing cuts — letting go of two long-tenured, incredible team members. That was one of the hardest decisions I’ve had to make.
Amazon sales have also regressed. We’ve worked with the same agency for three years, and while they’ve done good work, it feels like we’ve plateaued. We’re planning to switch partners.
The Ugly
Overall, 2025 was fairly stress-free, which I’ll gladly take. The biggest issue was that we got sued again. This one came from a patent troll.
Patent lawsuits are very different from the Americans with Disabilities Act lawsuit, which we chose to fight. We had invested heavily in making our site accessible for people with disabilities, including those with vision impairments, and ultimately, we were able to get that case dismissed.
Patent cases are another story. The financial risk of fighting is much higher. Defending the ADA lawsuit cost roughly the same as a settlement. Given where Beardbrand was after multiple years of losses, I swallowed my pride and settled.
What made the decision easier is that, once settled, a patent holder cannot sue again for the same alleged infringement. Another party would need to hold the same patent, which is unlikely. I feel at peace with the choice. The direct-to-consumer community on X was also incredibly helpful, connecting us with a great attorney, which made the process smoother.
Hopefully, that’s the last lawsuit for a while. We’re doing everything we can to protect ourselves — updated privacy policies, cookie consent for pixel tracking in applicable states, and ongoing ADA audits.
Personal Wins and Losses
One of my goals for 2026 is to return to a “profit first” mindset — building a business that’s profitable while also supporting my personal life. Over the past few years, I’ve pulled from savings to maintain our standard of living. I’m grateful I had that cushion, but I don’t want it to be the norm.
The highlight of 2025 was a trip to Japan with my 12-year-old daughter. Travel is something we both love, and it gave us a shared experience during a fleeting stage of life. This trip felt meaningful for her and me as she grows into her own independence. I’m incredibly pleased we did it.
Health-wise, it’s been a good year. I’m rowing again, lifting consistently, and I avoided major injuries. My wife and kids have been healthy, which I never take for granted.
I’m also profoundly grateful for my friends — in Austin, online, and the broader D2C community — who’ve helped me navigate challenging moments.
There was a personal loss, however. My wife and I transferred our final IVF embryo, and it wasn’t successful. That chapter is now closed after more than a decade of infertility and loss. I share this because many are going through similar struggles. You’re not alone.
Microsoft has shared new guidance on duplicate content that’s aimed at AI-powered search.
The post on the Bing Webmaster Blog discusses which URL serves as the “source page” for AI answers when several similar URLs exist.
Microsoft describes how “near-duplicate” pages can end up grouped together for AI systems, and how that grouping can influence which URL gets pulled into AI summaries.
How AI Systems Handle Duplicates
Fabrice Canel and Krishna Madhavan, Principal Product Managers at Microsoft AI, wrote:
“LLMs group near-duplicate URLs into a single cluster and then choose one page to represent the set. If the differences between pages are minimal, the model may select a version that is outdated or not the one you intended to highlight.”
If multiple pages are interchangeable, the representative page might be an older campaign URL, a parameter version, or a regional page you didn’t mean to promote.
Microsoft also notes that many LLM experiences are grounded in search indexes. If the index is muddied by duplicates, that same ambiguity can show up downstream in AI answers.
How Duplicates Can Reduce AI Visibility
Microsoft lays out several ways duplication can get in the way.
One is intent clarity. If multiple pages cover the same topic with nearly identical copy, titles, and metadata, it’s harder to tell which URL best fits a query. Even when the “right” page is indexed, the signals are split across lookalikes.
Another is representation. If the pages are clustered, you’re effectively competing with yourself for which version stands in for the group.
Microsoft also draws a line between real page differentiation and cosmetic variants. A set of pages can make sense when each one satisfies a distinct need. But when pages differ only by minor edits, they may not carry enough unique signals for AI systems to treat them as separate candidates.
Finally, Microsoft links duplication to update lag. If crawlers spend time revisiting redundant URLs, changes to the page you actually care about can take longer to show up in systems that rely on fresh index signals.
Categories Of Duplicate Content Microsoft Highlights
The guidance calls out a few repeat offenders.
Syndication is one. When the same article appears across sites, identical copies can make it harder to identify the original. Microsoft recommends asking partners to use canonical tags that point to the original URL and to use excerpts instead of full reprints when possible.
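A canonical tag is a single line in the syndicated page’s head. As a sketch (the URL here is a placeholder, not from Microsoft’s post), a partner reprinting an article would point it back at the original:

```html
<!-- On the syndicated copy, in the <head>, pointing back to the original -->
<link rel="canonical" href="https://original-publisher.com/original-article" />
```

With this in place, search engines can attribute the content’s signals to the original URL rather than the reprint.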
Campaign pages are another. If you’re spinning up multiple versions targeting the same intent and differing only slightly, Microsoft recommends choosing a primary page that collects links and engagement, then using canonical tags for the variants and consolidating older pages that no longer serve a distinct purpose.
Localization comes up in the same way. Nearly identical regional pages can look like duplicates unless they include meaningful differences. Microsoft suggests localizing with changes that actually matter, such as terminology, examples, regulations, or product details.
Then there are technical duplicates. The guidance lists common causes such as URL parameters, HTTP and HTTPS versions, uppercase and lowercase URLs, trailing slashes, printer-friendly versions, and publicly accessible staging pages.
The Role Of IndexNow
Microsoft points to IndexNow as a way to shorten the cleanup cycle after consolidating URLs.
When you merge pages, change canonicals, or remove duplicates, IndexNow can help participating search engines discover those changes sooner. Microsoft links that faster discovery to fewer outdated URLs lingering in results, and fewer cases where an older duplicate becomes the page that’s used in AI answers.
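The IndexNow protocol itself is a simple HTTP POST of a JSON payload. As a sketch (the host, key, and URL are placeholders; a real key is a text file you host on your own domain per the protocol), a submission can be built like this:

```python
# Sketch: building an IndexNow submission after consolidating URLs.
# "example.com" and the key below are placeholders, not real credentials.
import json
import urllib.request

def build_indexnow_request(host, key, urls):
    """Build the JSON POST request described by the IndexNow protocol."""
    payload = {
        "host": host,
        "key": key,
        "urlList": urls,  # the URLs you changed, merged, or removed
    }
    return urllib.request.Request(
        "https://api.indexnow.org/indexnow",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )

req = build_indexnow_request(
    host="example.com",
    key="0123456789abcdef",  # placeholder key
    urls=["https://example.com/primary-page"],
)
# urllib.request.urlopen(req) would actually submit it; we only build it here.
print(req.get_full_url())
```

Submitting the surviving canonical URL (and removing merged duplicates from your sitemap) is what lets participating engines pick up the consolidation sooner.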
Microsoft’s Core Principle
Canel and Madhavan wrote:
“When you reduce overlapping pages and allow one authoritative version to carry your signals, search engines can more confidently understand your intent and choose the right URL to represent your content.”
The message is consolidation first, technical signals second. Canonicals, redirects, hreflang, and IndexNow help, but they work best when you’re not maintaining a long tail of near-identical pages.
Why This Matters
Duplicate content isn’t a penalty by itself. The downside is weaker visibility when signals are diluted, and intent is unclear.
Syndicated articles can keep outranking the original if canonicals are missing or inconsistent. Campaign variants can cannibalize each other if the “differences” are mostly cosmetic. Regional pages can blend together if they don’t clearly serve different needs.
Routine audits can help you catch overlap early. Microsoft points to Bing Webmaster Tools as a way to spot patterns such as identical titles and other duplication indicators.
Looking Ahead
As AI answers become a more common entry point, the “which URL represents this topic” problem becomes harder to ignore.
Cleaning up near-duplicates can influence which version of your content gets surfaced when an AI system needs a single page to ground an answer.
Ever clicked a link and landed on a “Page Not Found” error? Redirects prevent that. They send visitors and search engines to the right page automatically. Redirects are crucial for both SEO and user experience. For SEO, they preserve link equity and keep your rankings intact. They also enhance the user experience; no one likes dead ends.
Table of contents
Key takeaways
A redirect automatically sends users and search engines from one URL to another, preventing errors like ‘Page Not Found.’
Redirects are crucial for SEO and user experience, preserving link equity and maintaining rankings.
Different types of redirects exist: 301 for permanent moves and 302 for temporary ones.
Avoid client-side redirects, such as meta refresh or JavaScript, as they can harm SEO.
Use Yoast SEO Premium to easily set up and manage redirects on your site.
What is a redirect?
A redirect is a method that automatically sends users and search engines from one URL to another. For example, if you delete a page, a redirect can send visitors to a new or related page instead of a 404 error.
How redirects work
A user or search engine requests a URL (e.g., yoursite.com/page-old).
The server responds with a redirect instruction.
The browser or search engine follows the redirect to the new URL (e.g., yoursite.com/page-new).
Redirects can point to any URL, even on a different domain.
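The three steps above can be demonstrated end to end with a short, self-contained script; a toy local server stands in for yoursite.com here (this example is illustrative and not part of the original article):

```python
# A minimal end-to-end demonstration of a 301 redirect, using only the
# Python standard library. A tiny local server plays the role of the site.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/page-old":
            # Step 2: the server responds with a redirect instruction
            self.send_response(301)
            self.send_header("Location", "/page-new")
            self.end_headers()
        else:
            # Step 3: the client arrives at the new URL
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"new page content")

    def log_message(self, *args):
        pass  # keep the demo output quiet

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Step 1: request the old URL; urllib follows the 301 automatically,
# just like a browser or a search engine crawler would.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/page-old")
final_url = resp.url
body = resp.read()
print(final_url)  # ends with /page-new
print(body)       # b'new page content'
server.shutdown()
```

The client never sees an error page: it asks for the old URL, gets the redirect instruction, and transparently ends up on the new one.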
Why redirects matter
Redirects keep your website running smoothly. Without them, visitors hit dead ends, links break, and search engines get lost. They’re not just technical fixes, because they protect your traffic, preserve rankings, and make sure users land where they’re supposed to. Whether you’re moving a page, fixing a typo in a URL, or removing old content, redirects make sure that nothing gets left behind.
When to use a redirect
Use redirects in these scenarios:
Deleted pages: Redirect to a similar page to preserve traffic.
Domain changes: Redirect the old domain to the new one.
HTTP→HTTPS: Redirect insecure URLs to secure ones.
URL restructuring: Redirect old URLs to new ones (e.g., /blog/post → /articles/post).
Temporary changes: Use a 302 for A/B tests or maintenance pages.
Types of redirects
There are various types of redirects, each serving a distinct purpose. Some are permanent, some are temporary, and some you should avoid altogether. Here’s what you need to know to pick the right one.
Not all redirects work the same way. A 301 redirect tells search engines a page has moved permanently, while a 302 redirect signals a temporary change. Client-side redirects, like meta refresh or JavaScript, exist because they’re sometimes the only option on restrictive hosting platforms or static sites, but they often create more problems than they solve. Below, we break down each type, explain when to use it, and discuss its implications for your SEO.
Redirect types at a glance
| Redirect type | Use case | When to use | Browser impact | SEO impact | SEO risk |
|---|---|---|---|---|---|
| 301 | Permanent move | Deleted pages, domain changes, HTTP→HTTPS | Cached forever | Passes (almost) all link equity | None if used correctly |
| 302 | Temporary move | A/B testing, maintenance pages | Not cached | May not pass link equity | Can dilute SEO if used long-term |
| 307 | Temporary move (strict) | API calls, temporary content shifts | Not cached | Search engines may ignore | High if misused |
| 308 | Permanent move (strict) | Rare; use 301 instead | Cached forever | Passes link equity | None |
| Meta refresh | Client-side redirect | Avoid where possible | Slow, not cached | Unreliable | High (hurts UX/SEO) |
| JavaScript | Client-side redirect | Avoid where possible | Slow, not cached | Unreliable | High (hurts UX/SEO) |
301 redirects: Permanent moves
A 301 redirect tells browsers and search engines that a page has moved permanently. Use it when:
You delete a page and want to send visitors to a similar one.
You change your domain name.
You switch from HTTP to HTTPS.
SEO impact: 301 redirects pass virtually all link equity to the new URL. But never redirect to irrelevant pages, as this can confuse users and hurt SEO. For example, redirecting a deleted blog post about “best running shoes” to your homepage instead of a similar post about running gear wastes link equity and frustrates visitors.
302 redirects: Temporary moves
A 302 redirect tells browsers and search engines that a move is temporary. Use it for:
A/B testing different versions of a page.
Temporary promotions or sales pages.
Maintenance pages.
SEO impact: 302 redirects typically don’t pass ranking power like 301s. Google treats them as temporary, so they may not preserve SEO value. For permanent moves, always use a 301 to ensure link equity transfers smoothly.
Examples of when to use a 301 and 302 redirect:
Example 1: Temporary out-of-stock product (302): An online store redirects example.com/red-sneakers to example.com/blue-sneakers while red sneakers are restocked. A 302 redirect keeps the original URL alive for future use.
Example 2: A permanent domain change (301): A company moves from old-site.com to new-site.com. A 301 redirect makes sure visitors and search engines land on the new domain while preserving SEO rankings.
307 and 308 redirects: Strict rules
These redirects follow HTTP rules more strictly than 301 or 302:
Same method: If a browser sends a POST request, the redirect must also use POST.
Caching:
307: Never cached (temporary).
308: Always cached (permanent).
When to use them:
307: For temporary redirects where you must keep the same HTTP method (e.g., forms or API calls).
308: Almost never, use a 301 instead.
For most sites: Stick with 301 (permanent) or 302 (temporary). These are for specific technical cases only.
What to know about client-side redirects:
Client-side redirects, such as meta refresh or JavaScript, execute within the browser instead of on the server. They’re rarely the right choice, but here’s why you might encounter them:
Meta refresh: An HTML tag that redirects after a delay (e.g., “You’ll be redirected in 5 seconds…”).
JavaScript redirects: Code that changes the URL after the page loads.
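For reference, these client-side techniques look like this in a page’s source (the URLs are placeholders). They’re shown here so you can recognize them, not as a recommendation:

```html
<!-- Meta refresh: redirects 5 seconds after the page loads -->
<meta http-equiv="refresh" content="5; url=https://example.com/new-page">

<!-- JavaScript redirect: runs only once the script executes -->
<script>
  window.location.replace("https://example.com/new-page");
</script>
```

In both cases the browser must download and start rendering the old page before the redirect happens, which is exactly why they’re slower and less reliable than a server-side 301 or 302.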
Why should you avoid them?
Slow: The browser must load the page first, then redirect.
Unreliable: Search engines may ignore them, hurting SEO.
Bad UX: Users see a flash of the original page before redirecting.
Security risks: JavaScript redirects can be exploited for phishing.
When they’re used (despite the risks):
Shared hosting with no server access.
Legacy systems or static HTML sites.
Ad tracking or A/B testing tools.
Stick with server-side redirects (301/302) whenever possible. If you must use a client-side redirect, test it thoroughly and monitor for SEO issues.
How redirects impact SEO
Redirects do more than just send users to a new URL. They shape how search engines crawl, index, and rank your site. A well-planned redirect preserves traffic and rankings. A sloppy one can break both. Here’s what you need to know about their impact.
Ranking power
301 redirects pass most of the link equity from the old URL to the new one. This helps maintain your rankings. 302 redirects may not pass ranking power, especially if used long-term.
Crawl budget
Too many redirects can slow down how quickly search engines crawl your site. Avoid redirect chains (A→B→C) to save crawl budget.
User experience
Redirects prevent 404 errors and keep users engaged. A smooth redirect experience can reduce bounce rates.
Common redirect mistakes
Redirects seem simple, but small errors can cause big problems. Here are the most common mistakes and how to avoid them.
Redirect chains
A redirect chain happens when one URL redirects to another, which redirects to another, and so on. For example:
old-page → new-page → updated-page → final-page
Why it’s bad:
Slows down the user experience.
Wastes crawl budget, as search engines may stop following the chain before reaching the final URL.
Redirecting everything to the homepage
When a page is deleted, some sites redirect all traffic to the homepage. For example:
Redirecting old-blog-post to example.com instead of a relevant blog post.
Why it’s bad:
Confuses users who expected specific content.
Search engines may see this as a “soft 404” and ignore the redirect.
Wastes ranking power that could have been passed to a relevant page.
How to fix it:
Redirect to the most relevant page available.
If no relevant page exists, return a 404 or 410 error.
Forgetting to update sitemaps
After setting up redirects, many sites forget to update their XML sitemaps. For example:
Keeping the old URL in the sitemap while redirecting it to a new URL.
Why it’s bad:
Sends mixed signals to search engines.
Wastes crawl budget on outdated URLs.
How to fix it:
Remove old URLs from the sitemap.
Add the new URLs to help search engines discover them faster.
Using redirects for thin or duplicate content
Some sites use redirects to hide thin or duplicate content. For example, redirecting multiple low-quality pages to a single high-quality page to “clean up” the site.
Why it’s bad:
Search engines may see this as manipulative.
Doesn’t address the root problem, which is low-quality content.
How to fix it:
Improve or consolidate content instead of redirecting.
Use canonical tags if duplicate content is unavoidable.
Not monitoring redirects over time
Redirects aren’t a set-it-and-forget-it task. For example:
Setting up a redirect and never checking if it’s still needed or working.
Why it’s bad:
Redirects can break over time (e.g., due to site updates or server changes).
Unnecessary redirects waste crawl budget.
How to fix it:
Audit redirects regularly (e.g., every 6 months).
Remove redirects that are no longer needed.
How to set up a redirect
Setting up redirects isn’t complicated, but the steps vary depending on your platform. Below, you’ll find straightforward instructions for the most common setups, whether you’re using WordPress, Apache, Nginx, or Cloudflare.
Pick the method that matches your setup and follow along. If you’re unsure which to use, start with the platform you’re most comfortable with.
WordPress (using Yoast SEO Premium)
Yoast SEO Premium makes it easy to set up redirects, especially when you delete or move content. Here’s how to do it:
Option 1: Manual redirects
Go to Yoast SEO → Redirects in your WordPress dashboard.
Enter the old URL (the one you want to redirect from).
Enter the new URL (the one you want to redirect to).
Select the redirect type:
301 (Permanent): For deleted or permanently moved pages.
302 (Found): For short-term changes.
Click Add Redirect.
Manually redirecting a URL in Yoast’s redirect manager
Option 2: Automatic redirects when deleting content
Yoast SEO can create redirects automatically when you delete a post or page. Here’s how:
Go to Posts or Pages in your WordPress dashboard.
Find the post or page you want to delete and click Trash.
Yoast SEO will show a pop-up asking what you’d like to do with the deleted content. You’ll see two options:
Redirect to another URL: Enter a new URL to send visitors to.
Return a 410 Content Deleted header: Inform search engines that the page is permanently deleted and should be removed from their index.
Select your preferred option and confirm.
This feature saves time and ensures visitors land on the right page. No manual setup required.
Need help with redirects? Try Yoast SEO Premium
No code, no hassle. Just smarter redirects and many other invaluable tools.
Apache (.htaccess file)
Apache uses the .htaccess file to manage redirects. If your site runs on Apache, this is the simplest way to set them up. Add the rules below to your .htaccess file, ensuring it is located in the root directory of your site.
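The original article’s snippet isn’t reproduced here; as a sketch of what such rules can look like (the paths and domains are placeholders), typical .htaccess patterns are:

```apache
# Permanently redirect a single moved page (mod_alias)
Redirect 301 /old-page /new-page

# Redirect an entire old domain to a new one (mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^old-site\.com$ [NC]
RewriteRule ^(.*)$ https://new-site.com/$1 [R=301,L]

# Force HTTPS for all requests
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

The mod_rewrite rules require the module to be enabled on your server; if a rule misfires, check for conflicting rules earlier in the file, since Apache processes them in order.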
Nginx (server configuration)
Nginx handles redirects in the server configuration file. If your site runs on Nginx, add these rules to your server block and then reload the service to apply the changes.
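As a sketch of what those rules can look like (domains and paths are placeholders, not from the original article):

```nginx
# Inside your existing server block: permanently redirect a single page
location = /old-page {
    return 301 /new-page;
}

# A separate server block: redirect an entire old domain to the new one
server {
    listen 80;
    server_name old-site.com;
    return 301 https://new-site.com$request_uri;
}
```

After editing, `nginx -t` checks the configuration syntax and `nginx -s reload` applies the change without downtime.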
Cloudflare (page rules)
Cloudflare allows you to set up redirects without modifying server files. Create a page rule to forward traffic from one URL to another, without requiring any coding. Simply enter the old and new URLs, select the redirect type, and click Save.
Troubleshooting common redirect issues
Redirects don’t always work as expected. A typo, a cached page, or a conflicting rule can break them, or worse, create loops that frustrate users and search engines. Below are the most common issues and how to fix them.
If something’s not working, start with the basics: check for errors, test thoroughly, and clear your cache. The solutions are usually simpler than they seem.
Why isn’t my redirect working?
Check for typos: Ensure the URLs are correct.
Clear your cache: Browsers cache 301 redirects aggressively.
Use Screaming Frog to crawl your site for 404s and redirects.
What’s the difference between a 301 and 308 redirect?
301: Most common for permanent moves. Broad browser support.
308: Strict permanent redirect. Rarely used. Same SEO impact as 301.
What is a proxy redirect?
A proxy redirect keeps the URL the same in the browser but fetches content from a different location. Used for load balancing or A/B testing. Avoid for SEO, as search engines may not follow them.
Conclusion about redirects
Redirects are a simple but powerful tool. A redirect automatically sends users and search engines from one URL to another. As a result, they keep your site running smoothly and preserve SEO value and ranking power. Remember:
Use 301 redirects for permanent moves.
Use 302 redirects for temporary changes.
Avoid client-side redirects, such as meta refresh or JavaScript.
Edwin is an experienced strategic content specialist. Before joining Yoast, he worked for a top-tier web design magazine, where he developed a keen understanding of how to create great content.
In this week’s PPC Pulse: an inventory expansion for Apple Ads, and Google’s confirmation that exact match keywords are not eligible to serve ads in AI Overviews.
Apple announced additional ad placements coming to App Store search results in early 2026.
Google confirmed that exact match keywords cannot serve in AI Overviews, even when identical broad match keywords exist in an account.
Both updates reinforce an ongoing shift. Search inventory is growing across new surfaces, but the level of control advertisers once relied on is changing.
Read on for more details and why they matter for advertisers.
Apple Search Ads Will Add New Search Placements In 2026
Apple officially announced that it will introduce additional ads within App Store Search Results starting in 2026. Today, advertisers can appear only in the top position. Beginning next year, ads will also show further down the results page across more queries, expanding total available inventory.
In its email announcement, Apple shared several supporting data points:
Nearly 65% of App Store downloads occur directly after a search.
More than 85% of visitors download at least one app during their visit.
Current Search Results ads see 60% or higher conversion rates at the top of results.
Screenshot taken via email by author, December 2025
Per the announcement, advertisers will not need to adjust campaigns to qualify for the new placements. Apple noted that ads will be automatically eligible and cannot be targeted or bid separately by position. The format and billing model will remain the same.
Expanding On An Already Big Year For Apple
Apple has consistently rolled out upgrades and expansions throughout 2025, including:
Custom Product Page expansion (March 2025): Apple expanded testing capabilities by allowing more CPP variants tied to specific keywords, improving message alignment.
Reporting enhancements (June 2025): Apple introduced clearer diagnostics around impression share, keyword performance, and CPP impact. These updates made it easier to identify friction points in search campaigns.
Creative refinements for Today Tab and Search Tab (August 2025): Apple improved visual consistency and added support for higher-funnel experimentation, hinting at broader expansion across App Store surfaces.
These updates all point toward a more robust Apple Ads marketing platform, making the 2026 inventory expansion feel like a natural progression.
Why This Matters For Advertisers
More placements signal higher reach, but also more variability. Top-position performance is unlikely to change, but additional placements may bring new traffic patterns as more users scroll past the first result.
Advertisers should expect incremental installs paired with slightly wider performance swings.
This also means that metadata, product page quality, and CPP strategy will influence performance more than before, since every placement will rely on the same creative foundation.
Google Confirms Exact Match Keywords Not Eligible For AI Overviews
A few questions came in to Google Ads Liaison, Ginny Marvin, this week on X (Twitter) regarding the eligibility of exact match keywords for ads in AI Overviews.
Marvin confirmed via a thread on X (Twitter) that exact match keywords are not eligible to serve ads inside Google’s AI Overviews. This clarification explains a pattern many advertisers have seen over the last year. Even if an account contains the same query in both exact and broad match, only broad match can enter AI Overview auctions.
Screenshot taken by author, December 2025
The update circulated quickly after Arpan Banerjee shared it on LinkedIn, giving the topic more visibility among PPC practitioners.
Screenshot taken by author, December 2025
This means advertisers may see broad match triggering queries that they assumed would be handled by exact match. It also means AI Overview impressions are routed through a different layer of Google’s system with its own eligibility rules. Since Google does not provide separate AI Overview reporting, changes in performance may not be clearly attributed to this shift.
Why This Matters For Advertisers
This update makes it clear that match types do not operate the same way inside AI-driven surfaces.
The long-standing assumption that exact match provides clean, isolated coverage does not apply within AI Overviews. Broad match becomes the only entry point, which could influence spend allocation, campaign structure, query mapping, and performance diagnostics.
Advertisers should expect shifts in query distribution on terms where they rely heavily on exact match control.
This Week’s Theme: Search Control Looks Different Than It Used To
Both updates highlight a similar pattern. Platforms are expanding search inventory, but advertisers have less control over how placements are allocated.
Apple is opening new ad positions without letting advertisers bid separately for them. Google is routing some search coverage through AI Overviews, where exact match does not participate. In both cases, the legacy structure of “keyword plus bid plus placement” is giving way to a more interpretive system.
This does not mean advertisers lose influence. It means influence shifts to metadata quality, creative alignment, first-party data, and smart segmentation. Both updates remind advertisers to stay flexible because new surfaces will continue to emerge.
In this week’s Pulse: updates include AI Mode’s growth and missing features, what Google’s latest model brings to search, and what drives citations across different AI experiences.
Google’s Nick Fox confirmed that AI Mode has reached 75 million daily active users, but the personal context features promised at I/O are still in internal testing.
Google launched Gemini 3 Flash with improved speed and performance. Ahrefs research showed AI Mode and AI Overviews cite different URLs.
Here’s what matters for you that happened this week.
Google’s AI Mode Hits 75M Daily Users, But Personal Context Still Delayed
Google’s Nick Fox confirmed AI Mode has grown to 75 million daily active users worldwide, but acknowledged personal context features announced at I/O seven months ago remain in internal testing.
Key Facts:
In an interview on the AI Inside podcast, Fox said personal context features that would connect AI Mode to Gmail and other Google apps are “still to come” with no public timeline.
AI Mode queries run two to three times longer than traditional searches. Google rolled out a preferred sources feature globally and announced improvements to links within AI experiences.
Why This Matters
The personal context delay affects how you should think about AI Mode optimization. If you’ve been preparing for a world where AI Mode knows users’ email confirmations and calendar entries, that world isn’t arriving soon. Currently, users manually add context to longer queries.
That changes what you prioritize. Content still needs to answer the longer, more specific questions users are asking. But the automated personalization layer that might have made some informational queries feel self-contained inside Google’s interface isn’t active yet.
The 75 million daily active user figure matters for traffic planning. AI Mode is no longer a small experiment. It’s a significant channel that’s still evolving. The query length data (two to three times longer than traditional searches) suggests users are having conversations rather than making quick lookups, which affects what content formats and depth work best.
What People Are Saying
AI Inside shared additional highlights on LinkedIn:
“Nick Fox suggests that optimizing for Google’s AI experiences mirrors the approach for traditional search: building a great site with great content
… focus on building for users and creating content that resonates with human readers.”
Google Launches Gemini 3 Flash With Faster Performance
Google launched Gemini 3 Flash, its latest AI model focused on speed and efficiency, and immediately shipped it in search products.
Key Facts:
Gemini 3 Flash delivers improved performance across benchmarks while maintaining faster response times than previous models. It’s now the default model in the Gemini app and in AI Mode for Search.
Why SEOs Should Pay Attention
Google’s shipping speed for Gemini 3 Flash suggests how AI model updates might flow into search products going forward. Rather than waiting months between model releases and search integration, you’re now dealing with immediate deployment of new models that can change how AI features behave.
Faster performance matters for user experience in AI Mode and AI Overviews, where latency affects whether people continue using it or switch to traditional results. Faster models make longer multi-turn interactions more practical, potentially leading to more search sessions.
“3 Flash brings the incredible reasoning capabilities of Gemini 3 Pro, at the speed you expect of Search. So AI Mode better interprets your toughest, multi-layered questions – considering each of your constraints or requirements – and provides a visually digestible response along with helpful links to dive deeper on the web.”
Rhiannon Bell, VP of user experience for Google Search, noted that this update brings Gemini 3 Pro to more users. Bell highlights the ability of 3 Pro to redesign search results:
“My team is constantly thinking about what “helpful” design means, and Gemini 3 Pro is allowing us to fundamentally re-architect what a helpful Search response looks like.”
Hema Budaraju, vice president of product management for Search at Google, highlighted the “speed and smarts”:
“As product builders, we often need to balance speed and smarts. Today, we’re bringing that even closer together: Gemini 3 Flash is rolling out globally in Search as the new default model for AI Mode… We’re also putting our Pro models in more hands. Gemini 3 Pro is now available to everyone in the U.S.”
AI Mode & AI Overviews Cite Same URLs Only 13.7% Of The Time
Ahrefs analyzed 730,000 query pairs and found AI Mode and AI Overviews reach semantically similar conclusions 86% of the time, but cite the same specific URLs just 13.7% of the time.
Key Facts:
Ahrefs compared AI Mode and AI Overview responses across identical queries. While both experiences frequently agree on general information, they’re pulling that information from different sources.
Why SEOs Should Pay Attention
You’re dealing with a split optimization target. Getting cited in AI Overviews doesn’t automatically get you cited in AI Mode, even when both systems are answering the same query with similar information. These are two separate citation engines, not one system with different interfaces.
If you track which AI experience appears for your target queries, you can focus citation efforts accordingly. For queries where AI Mode dominates, publishing frequency and content freshness may matter more. For queries where AI Overviews appear, authority signals and deep resource coverage may matter more.
The 13.7% overlap suggests many sites will see uneven results across surfaces. You might do well in one experience without automatically carrying that visibility into the other.
“Only 13.7% citation overlap … 86% semantic similarity … In short, 9 out of 10 times, AI Mode and AI Overviews agreed on what to say; they just said it differently and cited different sources.”
Theme Of The Week: AI Search In Practice, Not Theory
Each story this week shows AI search moving from promise to operational reality.
AI Mode’s 75 million daily users and immediate Gemini 3 Flash deployment reveal Google’s AI features are production systems at scale, not experimental labs. The personal context delay shows the gap between what was announced and what’s shipping. The citation study quantifies how these systems work differently despite appearing similar.
For you, this week’s about treating AI search as current infrastructure rather than future speculation. Optimize for how AI Mode and AI Overviews work today: longer manual queries without personal context, immediate model updates that can change behavior, and separate optimization targets for each experience.
The features Google promised at I/O aren’t here yet, but 75 million people are using what is here.
When your search and social strategies are intertwined, they work together like a well-oiled machine, and your search visibility can multiply.
For years, SEO and social media teams more often than not operated in silos, rarely engaging with each other and never working in tandem. SEO focused on optimizing for the latest Google algorithm update while social media teams worked earnestly to respond to brand mentions.
Today, these functions must move from parallel paths to transparent collaboration. Audience engagement on social platforms can influence how search engines interpret trust, authority, and relevance.
Google’s Helpful Content evolution highlighted social platforms in the search engine results pages (SERPs). Discussion forums like Reddit and Quora often surface answers to queries at the top of the SERPs, especially answers that have plenty of comments and upvotes.
Modern marketing means SEO and social go hand in hand, with unified systems that maximize cross-channel amplification. Together, these once-divergent roles work toward the same goals: helping your business rank higher, improving brand recognition, and building a consistent story across every single touchpoint.
Why Search And Social Belong Together
Search and social belong together. They aren’t pursuing divergent tactics; they’re working in unison to compound your marketing and SEO efforts. The marriage of the two improves the customer experience, from the first search to reading reviews during the decision-making phase of the sales journey.
Here’s what that synergy might look like in practice.
1. Social Creates The Spark Of Discovery
A decade ago, traditional blue links reigned supreme. Social media today is “top of the funnel” for organic search. According to GWI, nearly half (46%) of Gen Z turns to social media first when conducting product research. Not Google. But, many of those users will later turn to search to validate and compare what they discovered on social media.
Social media content shouldn’t just be entertaining or chasing the latest viral trend. It must answer questions your customers are asking. Smart marketing leaders analyze trending social conversations to discover the right queries and phrases people are using related to their products or services. They’re then working with SEO teams to optimize for those terms in the form of visual and written content, as well as back-end optimizations.
Knowing that social sentiment is often the early determinant of rising search demand, it’s crucial for CMOs, SEOs, and social marketers alike to watch for engagement spikes around an emerging topic and create high-quality content quickly in order to turn buzz into business.
2. Search Anchors And Sustains The Momentum
Social engagement is fast and fickle. What’s trending one day is quickly forgotten the next. Search visibility, on the other hand, is a slow process that doesn’t happen overnight. Together, they create the right balance of speed and longevity. A social post may receive thousands of comments in a matter of hours, but an optimized landing page built on that same topic can rank and drive sales for years to come.
Consider Gong, which generates roughly 2.2 million visits a month from organic traffic, according to SimilarWeb. The revenue intelligence platform invests effort into growing its LinkedIn presence. At the bottom of Gong’s blog posts, readers aren’t asked to navigate to a demo or a related blog post; they’re invited to follow Gong on LinkedIn, and the effort is paying off.
Gong has 315,000 followers on LinkedIn. Its competitor, Chorus, meanwhile, has about a third of that following. Additionally, Gong shares about 10-15 posts on its company page per week. The velocity has paid off: many of its posts receive thousands of interactions and hundreds of comments. This is the type of momentum Google pays attention to, making Gong’s content more likely to be highlighted in the SERPs.
3. Shared Data Creates Precision
When SEO and social data remain separated, it’s impossible to see the bigger picture and extract key takeaways. Integrating both data sets helps marketing leaders identify what’s working and what isn’t. It showcases what content is delivering return on investment and which should be repurposed. It identifies patterns such as posts that earn high engagement but low search volume or blog posts that earn clicks but fail to be shared on social.
By cross-referencing these insights, teams gain a 360° view of their performance. That level of insight fuels smarter creative, better results, and higher ROI.
How To Engineer Cross-Channel Synergy
Bridging the gap between SEO and social teams requires work. When two teams are accustomed to working independently, structure and strategy must come into play. Below are six tactics to make cross-team synergy as seamless as possible.
1. Share Objectives
Merge SEO and social teams with intent, aligning on KPIs to ensure everyone is working towards the same goal. Creating joint goals, such as brand visibility, intent coverage, and more, helps teams come together to maximize organizational success.
For example, both SEOs and social marketers should work towards visibility, tracking growth of branded keywords, hashtags, and mentions (both on social and search). Joint goals motivate teams to work closely together, turning to one another to pave the path towards success. This shared measurement philosophy removes team rivalry and breeds co-creators of growth.
2. Plan Content Around Signals
Building content around internal agendas rarely works well. Cross-channel listening opens the door to conversations content marketers often aren’t part of. Social media marketers leverage social listening to detect emotional signals (what people care about now), while SEOs measure search data to discern what users will look for next. Merging the two enables content marketers to create click-worthy, relevant content that meets audiences exactly where interest turns into action.
Forecasting content identifies future search demand by tracking early-stage social conversations, leading to a strategy that stays well ahead of your competitors.
3. Implement A Content Relay System
Top-performing brands treat search and social as relay partners. They work together for the greater good of the organization and embrace the team player ideology. Here’s how the content relay model works when implemented right:
Social Spark: Social media teams create a thought leadership thread, poll, or conversation starter in hopes of attracting interest and engagement.
Search Foundation: Based on the responses, social hands off those insights to content to produce a more detailed blog or landing page. SEO helps optimize the content to improve the chances of appearing in the SERPs.
Social Reinforcement: Once the piece has been optimized for search, share it on social with audience-driven context (“you asked, we answered/analyzed”).
Search Reinforcement: Embed high-performing social content (such as quotes, videos, or user-generated content) into pages for richer signals. Use structured data to tell search engines what the content is and how to index it.
Every piece of content fuels another, creating a loop of engagement, validation, and authority that compounds across platforms and extends the content’s lifetime.
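For the search-reinforcement step, structured data might be added as a small JSON-LD block in the page head. A minimal sketch for an embedded video (every name, URL, and date here is a placeholder, not real markup from any brand):

```json
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "You asked, we answered: top customer questions",
  "description": "Highlights from our social Q&A, embedded in the full guide.",
  "thumbnailUrl": "https://www.example.com/images/qa-thumbnail.jpg",
  "uploadDate": "2025-11-01",
  "contentUrl": "https://www.example.com/videos/qa-highlights.mp4"
}
```

Similar schema.org types exist for other embedded content, such as SocialMediaPosting, so search engines can understand what the embedded material is and index it accordingly.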
4. Pair AI With Human Expertise
AI isn’t a replacement for human creativity and expertise. It’s merely an aid to help power smarter business decisions. In the case of social media and search, AI-powered tools can be used to help analyze language consistency and detect sentiment shifts. For example, if users are consistently complaining about long wait times at your fast-food chain in Memphis, TN, AI can flag this as an issue that needs to be resolved before your reputation and bottom line suffer.
Similarly, AI can also identify when your top-performing social post is driving branded search volume or when a keyword starts trending related to your products or services in user-generated content. Intelligent automation enables your team to be notified in real time, allowing you to strike while the iron is hot.
5. Align Leadership And Cultural Change
Marketing leaders must create environments where SEOs and social media team members understand why and how they’re working together. This might include:
Hosting bi-weekly meetings to keep both teams up to speed on shared goals and priorities.
Creating “bridge roles” like Audience Insights Manager.
Recognizing shared wins (e.g., content that ranked and went viral on TikTok).
Providing transparency into what both teams are working on and toward.
Hosting in-person team-building events so both teams can connect outside of work.
A good company culture that fosters collaboration is imperative for team building, employee retention, and business success. When collaboration feels like extra work or leaves one team in the dark, performance and employee satisfaction suffer.
6. Embrace An Ecosystem Mentality
Once marketing leaders align data, culture, and goals, your organization’s ecosystem begins to operate like a living, breathing organism. Search informs social, social accelerates search, and together they improve the longevity of your business. In return, your business becomes more resilient to Google’s constant algorithm evolution. Siloed strategy starts to shift from stagnant results to seamless execution.
A Real-World Case: Social And Search Synergy In Action
When I worked with a leading fast-casual Mexican restaurant, the business had seen inconsistent reviews across its hundreds of locations. We centralized customer feedback and identified common complaints and praise, which led to a revamped online reputation.
Within just two months, according to our agency internal rating metrics, the chain’s average star rating rose from 4.2 to 4.4, five-star reviews increased by 32%, and no one-star reviews were left during that time period. Positive feedback trends emerged almost immediately, signaling local teams were acting on customer feedback faster and more diligently.
The ripple effects reached both search and social ecosystems as improved reviews and higher star ratings typically lead to a boost in visibility in Google Search and Maps. Simultaneously, the same credibility fueled social proof across the brand’s social platforms, where patrons frequently leave both positive and negative feedback.
Search visibility was boosted due to review quality, and social visibility was also enhanced because of customer advocacy. Together, they created a unified trust signal that influenced consumer behavior across every touchpoint. That represents the power of marrying search and social; a blissful union that drives favorable outcomes like visibility that converts.
Future-Facing: The Algorithmic Convergence Of Search And Social
We are now in an era where search and social converge effortlessly. TikTok is an influential discovery engine, while Google’s prominent AI Overviews pull in content that resembles social threads. Social content and discussion forums are now indexed prominently in the SERPs.
SEO should maintain semantic and emotional consistency at every step of discovery across the digital buyer’s journey, on every channel.
Marketing executives should ask themselves the following:
How do we establish a unified signal map? How does your audience move from discovery to intent? Which social triggers lead to which search behaviors?
How can we centralize our listening structure? Does our social listening platform allow us to integrate with our search analytics technology?
How can we create rapid-response workflows to capitalize on trending topics before our competitors do?
Do we need to reevaluate our reporting cadence? How do we move from channel-based reports to intent-based dashboards that track trending topics across platforms?
Are we relying too heavily on AI? Do we use human judgment to craft narratives that align with our brand’s voice and ethics?
Search and social are no longer divergent roles that never speak to one another. They’re an integrated effort playing for the same team, amplifying one another to create something bigger and better than either could achieve solo.
In an interview with the Big Technology Podcast, Sam Altman seemed to struggle answering the tough questions about OpenAI’s path to profitability.
At about the 36-minute mark, the interviewer asked the big question about revenues and spending. Sam Altman said OpenAI’s losses are tied to continued increases in training costs while revenue is growing. He said the company would be profitable much earlier if it were not continuing to grow its training spend so aggressively.
Altman said concern about OpenAI’s spending would be reasonable only if the company reached a point where it had large amounts of computing it could not monetize profitably.
The interviewer asked:
“Let’s, let’s talk about numbers since you brought it up. Revenue’s growing, compute spend is growing, but compute spend still outpaces revenue growth. I think the numbers that have been reported are OpenAI is supposed to lose something like 120 billion between now and 2028, 29, where you’re going to become profitable.
So talk a little bit about like, how does that change? Where does the turn happen?”
Sam Altman responded:
“I mean, as revenue grows and as inference becomes a larger and larger part of the fleet, it eventually subsumes the training expense. So that’s the plan. Spend a lot of money training, but make more and more.
If we weren’t continuing to grow our training costs by so much, we would be profitable way, way earlier. But the bet we’re making is to invest very aggressively in training these big models.”
At this point the interviewer pressed Altman harder about the path to profitability, this time citing $1.4 trillion in spending commitments versus roughly $20 billion in revenue. This was not a softball question.
The interviewer pushed back:
“I think it would be great just to lay it out for everyone once and for all how those numbers are gonna work.”
Sam Altman’s first attempt to answer seemed to stumble in a word salad kind of way:
“It’s very hard to like really, I find that one thing I certainly can’t do it and very few people I’ve ever met can do it.
You know, you can like, you have good intuition for a lot of mathematical things in your head, but exponential growth is usually very hard for people to do a good quick mental framework on.
Like for whatever reason, there were a lot of things that evolution needed us to be able to do well with math in our heads. Modeling exponential growth doesn’t seem to be one of them.”
Altman then regained his footing with a more coherent answer:
“The thing we believe is that we can stay on a very steep growth curve of revenue for quite a while. And everything we see right now continues to indicate that we cannot do it if we don’t have the compute.
Again, we’re so compute constrained, and it hits the revenue line so hard that I think if we get to a point where we have like a lot of compute sitting around that we can’t monetize on a profitable per unit of compute basis, it’d be very reasonable to say, okay, this is like a little, how’s this all going to work?
But we’ve penciled this out a bunch of ways. We will of course also get more efficient on like a flops per dollar basis, as you know, all of the work we’ve been doing to make compute cheaper comes to pass.
But we see this consumer growth, we see this enterprise growth. There’s a whole bunch of new kinds of businesses that, that we haven’t even launched yet, but will. But compute is really the lifeblood that enables all of this.
We have always been in a compute deficit. It has always constrained what we’re able to do.
I unfortunately think that will always be the case, but I wish it were less the case, and I’d like to get it to be less of the case over time, because I think there’s so many great products and services that we can deliver, and it’ll be a great business.”
The interviewer then sought to clarify the answer, asking:
“And then your expectation is through things like this enterprise push, through things like people being willing to pay for ChatGPT through the API, OpenAI will be able to grow revenue enough to pay for it with revenue.”
Sam Altman responded:
“Yeah, that is the plan.”
Altman’s comments define a specific threshold for evaluating whether OpenAI’s spending is a problem. He points to unused or unmonetizable computing power as the point at which concern would be justified, rather than current losses or large capital commitments.
In his explanation, the limiting factor is not willingness to pay, but how much computing capacity OpenAI can bring online and use. The follow-up question makes that explicit, and Altman’s confirmation makes clear that the company is relying on revenue growth from consumer use, enterprise adoption, and additional products to cover its costs over time.
Altman’s path to profitability rests on a simple bet: that OpenAI can keep finding buyers for its computing as fast as it can build it. Eventually, that bet either keeps winning or the chips run out.
Watch the interview starting at about the 36-minute mark:
The Core Web Vitals Technology Report, produced by the open source HTTP Archive community, ranks content management systems by how well they perform on Google’s Core Web Vitals (CWV). The November 2025 data shows a significant gap between platforms: 84.87% of sites on the highest-ranked CMS passed CWV, versus 46.28% on the lowest-ranked CMS.
What’s of interest this month is that the top three Core Web Vitals champs are all closed source proprietary platforms while the open source systems were at the bottom of the pack.
Importance Of Core Web Vitals
Core Web Vitals (CWV) are metrics created by Google to measure how fast, stable, and responsive a website feels to users. Websites that load quickly and respond smoothly keep visitors engaged and tend to perform better in terms of sales, reads, and ad impressions, while sites that fall short frustrate users, increase bounce rates, and perform less well for business goals. CWV scores reflect the quality of the user experience and how a site performs under real-world conditions.
How the Data Is Collected
The CWV Technology Report combines two public datasets.
The Chrome UX Report (CrUX) uses data from Chrome users who opt in to share performance statistics as they browse. This reflects how real users experience websites. The HTTP Archive runs lab-based tests that analyze how sites are built and whether they follow performance best practices.
Together, these two datasets form the basis of the report I generated: a snapshot of how each content management system performs on Core Web Vitals.
Ranking By November 2025 CWV Score
Duda Is The Number One Ranked Core Web Vitals Champ
Duda ranked first in November 2025, with 84.87% of sites built on the platform delivering a passing Core Web Vitals score. It was the only platform in this comparison where more than four out of five sites achieved a good CWV score. Duda has consistently ranked #1 for Core Web Vitals for several years now.
Wix Ranked #2
Wix ranked second, with 74.86% of sites passing CWV. While it trailed Duda by ten percentage points, Wix was just about four percentage points ahead of the third place CMS in this comparison.
Squarespace Ranked #3
Squarespace ranked third, at 70.39%. Its CWV pass rate placed it closer to Wix than to Drupal, maintaining a clear position in the top three ranked publishing platforms.
Drupal Ranked #4
Drupal ranked fourth, with 63.27% of sites passing CWV. That score put Drupal in the middle of the comparison, below the three proprietary site builders. This is a curious situation because the bottom three CMSs in this comparison are all open source platforms.
Joomla Ranked #5
Joomla ranked fifth, at 56.92%. While more than half of Joomla sites passed CWV, the platform remained well behind the top performers.
WordPress Ranked Last At #6
WordPress ranked last, with 46.28% of sites passing Core Web Vitals. Fewer than half of WordPress sites met the CWV thresholds in this snapshot. What’s notable about WordPress’s poor ranking is that it lags behind fifth-place Joomla by more than ten percentage points. So not only is WordPress ranked last in this comparison, it’s decisively last.
Why the Numbers Matter
Core Web Vitals scores translate into measurable differences in how users experience websites. Platforms at the top of the ranking deliver faster and more stable experiences across a larger share of sites, while platforms at the bottom expose a greater number of users to slower and less responsive pages. The gap between Duda and WordPress in the November 2025 comparison was nearly 40 percentage points (38.59, to be exact).
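That gap can be verified directly from the pass rates quoted above; the snippet below just recomputes the ranking and the top-to-bottom spread from the November 2025 figures.

```python
# November 2025 CWV pass rates (percent of sites passing), as quoted above.
pass_rates = {
    "Duda": 84.87,
    "Wix": 74.86,
    "Squarespace": 70.39,
    "Drupal": 63.27,
    "Joomla": 56.92,
    "WordPress": 46.28,
}

# Sort platforms from highest to lowest pass rate.
ranking = sorted(pass_rates.items(), key=lambda kv: kv[1], reverse=True)

# Spread between first place and last place, in percentage points.
gap = round(ranking[0][1] - ranking[-1][1], 2)
print(ranking[0][0], ranking[-1][0], gap)  # Duda WordPress 38.59
```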
While an argument can be made that the WordPress ecosystem of plugins and themes may be to blame for the low CWV scores, the fact remains that WordPress is dead last in this comparison. Perhaps WordPress needs to become more proactive about how themes and plugins perform, such as establishing standards they must meet in order to earn a performance certification. That might push plugin and theme makers to prioritize performance.
Do Content Management Systems Matter For Ranking?
I have mentioned this before and will repeat it this month. There have been discussions and debates about whether the choice of content management system affects search rankings. Some argue that plugins and flexibility make WordPress easier to rank in Google. But the fact is that private platforms like Duda, Wix, and Squarespace have all focused on providing competitive SEO functionalities that automate a wide range of technical SEO tasks.
Some people insist that Core Web Vitals make a significant contribution to their rankings and I believe them. But in general, the fact is that CWV performance is a minor ranking factor.
Nevertheless, performance still matters for outcomes that are immediate and measurable, such as user experience and conversions, which means that the November 2025 HTTPArchive Technology Report should not be ignored.
The HTTPArchive report is available here, but it will soon be replaced. I’ve tried the new report and, unless I missed something, it lacks a way to constrain the report by date.
Google’s Danny Sullivan discussed SEO and AI, observing that Google’s ranking systems are tuned for one thing, regardless of whether it’s classic search or AI search: optimizing for people. That’s something I suspect the search marketing industry will increasingly be talking about.
Nothing New You Need To Be Doing For AI Search
The first thing Danny Sullivan discussed was that, despite new search experiences powered by AI, there isn’t anything new that publishers need to be doing.
John Mueller asked:
“So everything kind of around AI, or is this really a new thing? It feels like these fads come and go. Is AI in fad? How do you think?”
Danny Sullivan responded:
“Oh gosh, my favorite thing is that we should be calling it LMNOPEO because there’s just so many acronyms for it. It’s GEO for generative engine optimization or AEO for answer engine optimization and AIEO. I don’t know. There’s so many different names for it.
I used to write about SEO and search. I did that for like 20 years. And part of me is just so relieved. I don’t have to do that aspect of it anymore to try to keep up with everything that people are wondering about.
And on the other hand, you still have to kind of keep up on it because we still try to explain to people what’s going on. And I think the good news is like, There’s not a lot you actually really need to be worrying about.
It’s understandable. I think people keep having these questions, right? I mean, you see search formats changing, you see all sorts of things happening and you wonder, well, is there something new I should be doing? Totally get that.
And remember, we, John and I and others, we all came together because we had this blog post we did in May, which we’ll drop a link to or we’ll point you to somehow to it, but it was… we were getting asked again and again, well, what should we be doing? What should we be thinking about?
And we all put our heads together and we talked with the engineers and everything else. So we came up with nothing really that different.”
Google’s Systems Are Tuned To Rank Human Optimized Content
Danny Sullivan next turned to what Google’s systems are designed to rank, which is content that satisfies humans. Robbie Stein, currently Vice President of Product for Google Search, recently discussed the signals Google uses to identify helpful content, including how human feedback, in aggregate, helps ranking systems understand what helpful content looks like.
While Danny didn’t get into exact details about the helpfulness signals the way Stein did, Danny’s comments confirmed the underlying point that Robbie Stein was making about how their systems are tuned to identify content that satisfies humans.
Danny continued explaining what SEOs and creators should know about Google’s ranking systems. He began by acknowledging that it’s reasonable that people see a different search experience and conclude that they must be doing something different.
He explained:
“…I think people really see stuff and they think they want to be doing something different. …It is the natural reaction you have, but we talk about sort of this North Star or the point that you should be heading to.”
Next he explained how all of Google’s ranking systems are engineered to rank content that was made for humans, and specifically called out content created for search engines as an example of what not to do.
Danny continued his answer:
“And when it comes to all of our ranking systems, it’s about how are we trying to reward content that we think is great for people, that it was written for human beings in mind, not written for search algorithms, not written for LLMs, not written for LMNO, PEO, whatever you want to call it.
It’s that everything we do and all the things that we tailor and all the things that we try to improve, it’s all about how do we reward content that human beings find satisfying and say, that was what I was looking for, that’s what I needed. So if all of our systems are lining up with that, it’s that thing about you’re going to be ahead of it if you’re already doing that.
To whereas the more you’re trying to… Optimize or GEO or whatever you think it is for a specific kind of system, the more you’re potentially going to get away from the main goal, especially if those systems improve and get better, then you’re kind of having to shift and play a lot of catch up.
So, you know, we’re going to talk about some of that stuff here with the big caveat, we’re only talking about Google, right? That’s who we work for. So we don’t say what, anybody else’s AI search, chat search, whatever you want to kind of deal with and kind of go with it from there. But we’ll talk about how we look at things and how it works.”
What Danny is clearly saying is that Google is tuned to rank content that’s written for humans and that optimizing for specific LLMs sets up a situation where it could backfire.
Why Optimizing For LLMs Is Misguided
Although Danny didn’t mention it, this is the right moment to point out that OpenAI, Perplexity, and Claude together refer less than 1% of overall website traffic. So it’s clearly a mistake to optimize content for LLMs at the risk of losing significant traffic from search engines.
Content that is genuinely satisfying to people remains aligned with what Google’s systems are built to reward.
Why SEOs Don’t Believe Google
Google’s insistence that its algorithms are tuned toward user satisfaction is not new. Google has been saying it for over two decades, and for years it was a given that the company was overstating its technology. That is no longer the case.
Arguably, since at least 2018’s Medic broad core update, Google has been making genuine strides toward delivering search results shaped by user behavior signals, which guide Google’s machines toward understanding what kind of content people like, plus AI and neural networks that are better able to match content to a search query.
If there is any doubt about this, check out the interview with Robbie Stein, where he explains exactly how human feedback, in aggregate, influences the search results.
Is Human Optimized Content The New SEO?
So now we are at a point where links are no longer the top ranking criterion. Google’s systems can understand queries and content and match one to the other. User behavior data, which has been a part of Google’s algorithms since at least 2004, plays a strong role in helping Google understand what kinds of content satisfy users.
It may be well past time for SEOs and creators to let go of the old SEO playbooks and start focusing on optimizing their websites for humans.