Google Hit By EU Probe Into Unfair Use Of Online Content via @sejournal, @martinibuster

The European Commission has launched an antitrust inquiry into Google to determine whether the company has violated EU competition rules, partly focusing on whether Google has used creator and publisher content in ways that leave publishers unable to refuse such use without risking their search traffic. It is also looking into whether Google is granting itself privileged access to YouTube content for AI in a way that leaves competitors at a disadvantage.

How Google’s Terms May Pressure Publishers and Creators

The Commission is focusing on how publisher content is used by AI Overviews and AI Mode to generate answers, without compensating publishers or giving them a way to opt out of having their content used to generate summaries.

They write:

“The Commission will investigate to what extent the generation of AI Overviews and AI Mode by Google is based on web publishers’ content without appropriate compensation for that, and without the possibility for publishers to refuse without losing access to Google Search. Indeed, many publishers depend on Google Search for user traffic, and they do not want to risk losing access to it.”

This raises concerns that Google may be using publisher content in its AI products without offering a workable opt-out, leaving publishers who rely on Search traffic with little choice but to accept this use.

Use of YouTube Content to Train Google’s AI Models

The Commission is also examining Google’s use of YouTube videos and other creator content for training its generative AI models. According to the announcement, creators “have an obligation to grant Google permission to use their data for different purposes, including for training generative AI models,” and cannot upload content while withholding that permission. Google provides no payment for this use while blocking rival AI developers from training on YouTube content under YouTube’s policies.

This mix of mandatory access for Google, limits on competitors, and no payment for creators underpins the Commission’s concern that Google may be giving itself preferred access to YouTube content in a way that may harm the wider AI market.

The Commission has notified Google that it has opened an investigation into whether the company has breached EU competition rules prohibiting the abuse of a dominant position.

Featured Image by Shutterstock/Mo Arbid

The Download: a peek at AI’s future

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

The State of AI: A vision of the world in 2030  

There are huge gulfs of opinion when it comes to predicting the near-future impacts of generative AI. In one camp there are those who predict that over the next decade the impact of AI will exceed that of the Industrial Revolution—a 150-year period of economic and social upheaval so great that we still live in the world it wrought. 

At the other end of the scale we have team ‘Normal Technology’: experts who push back not only on these sorts of predictions but on their foundational worldview. That’s not how technology works, they argue.

Advances at the cutting edge may come thick and fast, but change across the wider economy, and society as a whole, moves at human speed. Widespread adoption of new technologies can be slow; acceptance slower. AI will be no different. What should we make of these extremes? 

Read the full conversation between MIT Technology Review’s senior AI editor Will Douglas Heaven and Tim Bradshaw, FT global tech correspondent, about where AI will go next, and what our world will look like in the next five years.

This is the final edition of The State of AI, a collaboration between the Financial Times and MIT Technology Review. Read the rest of the series, and if you want to keep up-to-date with what’s going on in the world of AI, sign up to receive our free Algorithm newsletter every Monday.

How AI is changing the economy

There’s a lot at stake when it comes to understanding how AI is changing the economy at large. What’s the right outlook to have? Join Mat Honan, editor in chief, David Rotman, editor at large, and Richard Waters, FT columnist, at 1pm ET today to hear them discuss what’s happening across industries and the market. Sign up now to be part of this exclusive subscriber-only event.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Trump says he’ll sign an order blocking states from regulating AI
But he’s facing a lot of pushback, including from members of his own party. (CNN)
+ The whole debacle can be traced back to congressional inaction. (Semafor)

2 Google’s new smart glasses are getting rave reviews 👓
You’ll be able to get your hands on a pair in 2026. Watch out, Apple and Meta. (Tech Radar)

3 Trump gave the go-ahead for Nvidia to sell powerful AI chips to China
The US gets a 25% cut of the sales—but what does it lose longer-term? (WP $)
+ And how much could China stand to gain? (NYT $)
+ How a top Chinese AI model overcame US sanctions. (MIT Technology Review)

4 America’s data center backlash is here
Republican and Democrat alike, local residents are sick of rapidly rising power bills. (Vox $)
+ More than 200 environmental groups are demanding a US-wide moratorium on new data centers. (The Guardian)
+ The data center boom in the desert. (MIT Technology Review)

5 A quarter of teens are turning to AI chatbots for mental health support
Given the lack of real-world help, can you really blame them? (The Guardian)
+ Therapists are secretly using ChatGPT. Clients are triggered. (MIT Technology Review)

6 ICEBlock is suing the US government over its App Store removal 
Its creator is arguing that the Department of Justice’s demands to Apple violated his First Amendment rights. (404 Media)
+ It’s one of a number of ICE-tracking initiatives to be pulled by tech platforms this year. (MIT Technology Review)

7 This band quit Spotify, but it’s been replaced by AI knockoffs
The platform seems to be struggling against the tide of slop. (Futurism)
+ AI is coming for music, too. (MIT Technology Review)

8 Think you’re immune to online ads? Think again
If you’re scrolling on social media, you’re being sold to. Relentlessly. (The Verge $)

9 People really do not like Microsoft Copilot
It’s like Clippy all over again, except it’s even less avoidable. (Quartz $)

10 The longest solar eclipse for 100 years is coming
And we’ll only have to wait until 2027 to see it! (Wired $)

Quote of the day

“Governments and MPs are shooting themselves in the foot by pandering to tech giants, because that just tells young people that they don’t care about our future.”

—Adele Zeynep Walton, founding member of online safety campaign group Ctrl+Alt+Reclaim, tells The Guardian why young activists are taking matters into their own hands. 

One more thing


Inside the long quest to advance Chinese writing technology

Every second of every day, someone is typing in Chinese. Though the mechanics look a little different from typing in English—people usually type the pronunciation of a character and then pick it out of a selection that pops up, autocomplete-style—it’s hard to think of anything more quotidian. The software that allows this exists beneath the awareness of pretty much everyone who uses it. It’s just there.

What’s largely been forgotten is that a large cast of eccentrics and linguists, engineers and polymaths, spent much of the 20th century torturing themselves over how Chinese was ever going to move away from the ink brush to any other medium. Read the full story.

—Veronique Greenwood

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ Pantone chose a ‘calming’ shade of white for its Color of 2026… and people are fuming. 
+ Ozempic needles on the Christmas tree, anyone? Here’s why we’re going crazy for weird baubles. 
+ Can relate to this baby seal for instinctively heading to the nearest pub.
+ Thrilled to see One Battle After Another get so many Golden Globes nominations.

Long-Tail SEO in an AI World

In 2006, Wired magazine editor Chris Anderson famously described the availability of niche products online as the “long tail.” Search optimizers adopted the term, calling queries of three words or more “long-tail keywords.”

Optimizing for long-tail searches has multiple benefits. Consumers searching on extended keywords tend to know what they want, and longer queries typically have less keyword competition. Yet the biggest benefit could now be AI visibility: Generative AI platforms such as ChatGPT fan out using multiword queries to answer user prompts.

Long-Tail Queries

A seed term plus modifiers

Any long-tail query consists of a seed term and one or more modifiers. For example, “shoes” is a seed term, and potential modifiers are:

  • “for women,”
  • “red,”
  • “near me,”
  • “on sale.”

Combining the seed term and modifiers — “red shoes for women,” “on sale near me” — yields narrow queries that describe searchers’ needs, such as gender, color, location, and price.

Modifiers reflect the searcher’s intent and stage in a buying journey, from exploration to purchase. Thus, keyword research is the process of extending a core term with modifiers to optimize a site for buying journeys.

The more modifiers, the more specific the intent and, typically, the lesser the volume and clicks. Conversely, more modifiers improve the likelihood of conversions, provided the content of the landing page follows closely from that phrase. A query of “red shoes for women” should link to a page with women wearing red shoes.
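The seed-plus-modifiers process above can be sketched in a few lines of code. This is an illustrative sketch only: the seed term and modifier groups are hypothetical examples, not output from any keyword tool.

```python
from itertools import product

# Hypothetical seed term and modifier groups (illustrative examples only)
seed = "shoes"
modifiers = {
    "description": ["red", "leather"],
    "audience": ["for women", "for men"],
    "intent": ["on sale", "near me"],
}

def long_tail_queries(seed, groups):
    """Combine the seed with one modifier drawn from each group."""
    queries = []
    for combo in product(*groups.values()):
        # Place the descriptive modifier before the seed, the rest after,
        # e.g. "red shoes for women on sale"
        desc, *rest = combo
        queries.append(" ".join([desc, seed, *rest]))
    return queries

for query in long_tail_queries(seed, modifiers)[:3]:
    print(query)
```

Each added modifier group multiplies the number of candidate queries, which is exactly why long-tail lists grow so fast and why grouping them by modifier type (as described below) keeps the research manageable.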

Types of modifiers

A core term can have many modifiers, such as:

  • Location,
  • Description (“red”),
  • Price (typically from searchers eager to buy),
  • Brand,
  • Age and gender,
  • Questions (“how to clean shoes”).

Long-Tail Opportunities

Keyword research tools

Grouping keywords by modifier type can reveal your audience’s search patterns. Keyword research tools such as Semrush and others can filter lists by modifiers to reveal the most popular.

Screenshot of Semrush's Keyword Magic Tool

Semrush’s Keyword Magic Tool reveals the most popular modifiers for “shoes.”

Adjust Semrush’s “Advanced filters” to see queries that contain more words.

Screenshot of Semrush's Advanced filters.

“Advanced filters” reveal queries that contain more words.

Search Console

Regular expressions (regex) in Search Console can identify longer queries, such as fan-out searches from ChatGPT and other genAI platforms. In Search Console, go to “Performance,” click “Add filter,” choose “Query,” and “Custom (regex).”

Then type:

([^" "]*\s){10,}?

This regex filters queries to those with more than 10 words. Change “10” to “5” or “25” to find queries longer than 5 or 25 words, respectively.
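To see how the pattern behaves, here is a quick check in Python. Search Console uses RE2 syntax, but this particular pattern works identically in Python's `re` module: it requires at least ten whitespace-terminated tokens, i.e., eleven or more words. The sample queries are made up for illustration.

```python
import re

# The Search Console filter pattern: at least 10 repetitions of
# "some non-space characters followed by whitespace"
pattern = re.compile(r'([^" "]*\s){10,}?')

short_query = "best red shoes for women"  # 5 words, 4 spaces: no match
long_query = (
    "what are the best waterproof running shoes "
    "for women with flat feet"
)  # 12 words, 11 spaces: matches

assert pattern.search(short_query) is None
assert pattern.search(long_query) is not None
```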

Screenshot of the regex dialog in Search Console

Regex in Search Console can identify longer queries, such as fan-out searches from ChatGPT and other genAI platforms.

Keyword Dos and Don’ts

Search engines no longer match queries to exact word strings on web pages, focusing instead on the searcher’s intent or meaning. Hence a query for “red shoes for women” could produce an organic listing for “maroon slippers for busy moms.”

Keyword optimization circa 2025 reflects this evolution.

  • Avoid stuffing a page with keywords. Instead, enrich content with synonyms and related phrases.
  • Don’t create a page with variations of a single keyword. Group pages by modifiers and optimize for the entire group.
  • Include the main keyword in the page title and the H1 heading. Google could use either of those to create the all-important search snippet.
  • Assign products to only one category. Don’t confuse Google (and genAI platforms) by creating multiple categories for the same item to target different keywords.
  • Search Google (and genAI platforms) for your target query and study the results. Are there other opportunities, such as images and videos?
  • Don’t force an exact match keyword if it’s awkward or grammatically incorrect. Ask yourself, “How would I search for this item?” In other words, write for people, not search engines.
Google Confirms Smaller Core Updates Happen Continuously via @sejournal, @MattGSouthern

Google updated its core updates documentation to say smaller core updates happen on an ongoing basis, so sites can improve without waiting for named updates.

  • Google explicitly confirms it makes “smaller core updates” beyond the named updates announced several times per year.
  • Sites that improve their content can see ranking gains without waiting for the next major core update to roll out.
  • The documentation change addresses whether recovery between named updates is possible.
YouTube AI Enforcement Questioned As Channels Get Restored via @sejournal, @MattGSouthern

YouTube creators are raising concerns about the platform’s AI-driven moderation system. Multiple accounts describe sudden channel terminations for “spam, deceptive practices and scams,” followed by rapid appeal rejections with templated responses.

In some cases, channels have been restored only after the creator generated attention on X or Reddit. YouTube’s message to creators states the company has “not identified any widespread issues” with channel terminations and says only “a small percentage” of enforcement actions are reversed.

There’s a gap between YouTube’s position and creator experiences that’s driving a debate.

What Creators Are Reporting

The pattern appearing across X and Reddit threads follows a similar sequence.

Channels receive termination notices citing “spam, deceptive practices and scams.” Appeals get rejected within hours, sometimes minutes, with generic language. When channels are restored, creators say they receive no explanation of what triggered the ban or how to prevent future issues.

One documented case comes from YouTube creator “Chase Car,” who runs an EV news channel. In a detailed post on r/YouTubeCreators, they describe a sequence where their channel was demonetized by an automated system, cleared by a human reviewer, then terminated months later for spam.

The creator says they escalated the case to an EU-certified dispute body under the Digital Services Act. According to their account, the decision found the termination “was not rightful.” As of their most recent update, YouTube had not acted on the ruling.

Channels Restored After Public Attention

A subset of terminated channels have been reinstated after their cases gained visibility on social media.

Film analysis channel Final Verdict shared a thread documenting a sudden spam-related termination and later reinstatement after posts on X gained traction.

True crime channel The Dark Archive had their channel removed and later restored after tagging TeamYouTube publicly.

Streamer ProkoTV said their channel was restricted from live streaming after a spam warning. TeamYouTube later acknowledged an error and restored access.

These reversals confirm that some enforcement actions are incorrect by YouTube’s own standards. They also suggest that escalation on X can function as a parallel appeal route.

YouTube Acknowledges Some Errors

In a few cases, YouTube or its representatives have publicly admitted mistakes.

Dexerto reported on a creator whose 100,000-plus subscriber channel was banned over a comment they wrote on a different account at age 13. YouTube eventually apologized, telling the creator the ban “was a mistake on our end.”

Tech YouTuber Enderman, with 350,000 subscribers, said an automated system shut down their channel after linking it to an unrelated banned account. Dexerto highlighted the case after it spread on X.

YouTube’s Official Position

YouTube frames its enforcement differently than creators describe.

The company’s spam, deceptive practices, and scams policy explains why it takes action on fraud, impersonation, fake engagement, and misleading metadata. The policy notes that YouTube may act at the channel level if an account exists “primarily” to violate rules.

In a FAQ post, YouTube says the “vast majority” of terminations are upheld on appeal. The company says it’s “confident” in its processes while acknowledging “a handful” of incorrect terminations that were later reversed.

YouTube also offers a “Second Chances” pilot program that allows some creators to start new channels if they meet specific criteria and were terminated more than a year ago. The program doesn’t restore lost videos or subscribers.

YouTube’s CEO recently indicated the company plans to expand AI moderation tools. In an interview with Time, he said YouTube will proceed with expanded AI enforcement despite creator concerns.

Why This Matters

If you rely on YouTube as a core channel, these accounts raise practical concerns. A channel termination removes your entire presence, including subscribers and revenue potential. When appeals feel automated, you have limited visibility into what triggered the enforcement.

The Chase Car timeline shows an AI system can overturn a positive human verdict months later. Creators without large followings may have fewer options for escalation if formal appeals fail.

Looking Ahead

The EU’s Digital Services Act gives European users access to certified dispute bodies for moderation decisions. The Chase Car case could test how platforms respond to unfavorable rulings under that system.

YouTube says its appeals process is the correct channel for enforcement disputes. The company has not announced changes to its moderation approach in response to creator complaints.

Monitor YouTube’s official help community for any updates to appeal procedures or policy clarifications.


Featured Image: T. Schneider/Shutterstock

So You Want To Paywall?

There are three inevitabilities in life. Death, taxes, and big tech companies dumping on the little guy. As zero-click searches reach an all-time high and content is stolen and repurposed for the gain of the almighty tech loser, there’s only one viable solution.

To paywall.

To create a value exchange that reduces reliance on third-party platforms. To become as self-sufficient as possible. Like an off-grid cabin or your mum’s basement, a paywall gives you a sense of security you just cannot put a price on.

As we’re all finding, any kind of reliance on these guys doesn’t put us in a good position. They do not want to send us traffic.

TL;DR

  1. Subscriber revenue is intrinsically more valuable to a business because it is predictable. Subscription and advertiser revenue are not created equal.
  2. Don’t paywall everything. Use dynamic/metered paywalls and leave high-reach, generally lower-quality platforms like Google Discover free for email signups.
  3. Subscription success relies on your USP – whether that’s exclusive data, deep, niche insights, or a certain vibe – you have to stand out.
  4. The customer experience and understanding of your audience matter. Create habit-forming connections and products. Become an essential part of their life.

But What About Our Traffic?

Your traffic will decline. But guess what? You’re already hemorrhaging clicks and have been for some time. And traffic doesn’t pay the bills.

Two comparable pages, one with a paywall, one without (Image Credit: Harry Clarkson-Bennett)

The only way to sustain rankings over time is with high-quality engagement data. Navboost stores and uses 13 months of data to identify good vs bad clicks, click quality, the last longest click, and on-page interactions to establish the most relevant content. All at a query level.

Paywalls are not your friend when it comes to user engagement. Not for the masses. But for a small cohort of people who like you enough to pay, your engagement data will be excellent.

In an ultra-personalized world, you will still do well with the people who really matter.

We have data that pretty perfectly highlights the impact of a paywall on rankings. Over the course of three to four months in traditional search, your rankings start to steadily drop before settling into severe mediocrity. You’ve got to fight for every click. With great content, marketing, savviness. Everything.

We have used an image manager to try and generate a free-to-air badge. It rarely shows up unless there’s no featured image, but the idea is excellent (Image Credit: Harry Clarkson-Bennett)

In Google Discover – a highly personalized, click- and engagement-driven platform – this is even more pronounced. While Discover’s clickless traffic is lower quality, a small cohort of highly engaged users will develop over time, and you can target them with a paywall.

Unpaywall for the masses, build your owned channels, and paywall for the highly engaged. The platform will take care of the personalization for you.

So, maximize your value exchange with ads and email signups for most users, but don’t neglect those with a high return rate.

There’s some psychology involved in all of this. When a brand becomes widely known for paywalling, I suspect the likelihood of a click goes down as users know what to expect. Or maybe what not to expect.

This likely perpetuates over time, so you should clarify what articles are free to air.
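One way to clarify which articles are free in a machine-readable way is Google's paywalled-content structured data, which uses the schema.org `isAccessibleForFree` property. The sketch below builds the JSON-LD for a free and a gated article; the headlines and CSS selector are hypothetical placeholders.

```python
import json

# Hypothetical free article: explicitly flagged as not paywalled
free_article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "The 100 Best Albums of All Time",
    "isAccessibleForFree": True,
}

# Hypothetical gated article: the hasPart/WebPageElement block tells
# crawlers which section is intentionally hidden behind the paywall
paywalled_article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Album Review: In Depth",
    "isAccessibleForFree": False,
    "hasPart": {
        "@type": "WebPageElement",
        "isAccessibleForFree": False,
        "cssSelector": ".paywalled-content",
    },
}

print(json.dumps(free_article, indent=2))
```

Marking gated content this way also helps distinguish a legitimate paywall from cloaking, since the crawler and the user are told the same thing about what is hidden.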

Is Our Content Good Enough?

To nail SEO bingo, it depends. It depends on what your value is in the market. There is a lot of free stuff out there already. But broadly rubbish. So as long as the bar keeps dropping, we’ll all be fine.

I am old-ish. I like words. Writing great content isn’t easy and is usurped in many cases by richer, more visually striking content. Content that satisfies all types of users: scanners, deep readers, listeners, and get-the-answer-and-go-ers.

In some ways, you can satisfy all types of users more effectively than ever. I think you have to hit three of the four Es of content creation. Make it resonate, be consistent, and understand your audience. Whatever you create stands a chance.

But that doesn’t mean creating great stuff is any easier. If you work for a traditional publisher, the chances are you’ve brought a spoon to a gun fight. The war for attention is being fought on all fronts, and straight words are losing.

Fortunately, not every subscription model relies on the quality of the prose. It might be that you have unique data, granular insights into a specific market, or are just a bloody good laugh.

Subscriptions come in all shapes and sizes.

Ultimately, it comes down to your market, marketing, positioning and your USP. You have to know and speak to your audience and you have to stand out. As Barry would say, if you’re forgettable, you’re doomed.

How Do We Know If People Will Pay?

When it comes to paying for news, some markets are far more “advanced” than others. The Scandinavian market is light-years ahead of almost everyone else when it comes to paying for news. You have to do your research to understand:

  • How many people currently pay for news?
  • What demographic of person pays?
  • How saturated is the market already?
  • What is your niche?

Where your audience are matters a lot (Image Credit: Harry Clarkson-Bennett)

While it doesn’t align perfectly, it’s not surprising that those most likely to pay for news have higher income levels. Higher disposable income tends to create an environment where people buy more stuff.

Shocking, I know.

But that doesn’t tell the whole story. Norwegian news outlets have (apparently) a long history of trust with their audience and have never had access to free multi-day newspapers. Ditto other Scandinavian countries. In an age of rubbish and spin, trust and E-E-A-T are more important than ever.

And while the UK sits in a pretty shocking-looking position, almost 24 million of us pay for a BBC license fee. That is, in essence, paying for news. Insert joke about BBC bias and woke cultural agendas here.

Cultural and societal factors really matter. As does your understanding of the market.

It is important to note that, according to Richard Reeves (AOP Director), subscriptions have overtaken display advertising as the core source of digital revenue.

“Most heartening is what this represents as the wider information ecosystem fractures: audiences recognise the value of professional journalism and are willing to pay for it.”

In an era of slop, paying for something good is not a bad thing.

Macro And Micro Factors Are Influential

You can only control what you can control. But you shouldn’t dismiss the wider climate.

In the UK and arguably globally, there is a cost-of-living crisis. Globally, there have been a number of very significant geopolitical issues that affect the wider economy. Money doesn’t go as far as it once did, and most subscriptions are a luxury purchase.

Is a £20 or £30 monthly subscription more valuable than a £10 Netflix one? Or Spotify? These are questions you need to ask. Why would someone subscribe and stick around?

How far your money goes has been declining for some time… (Image Credit: Harry Clarkson-Bennett)

And we aren’t just competing with other publishers. While screen time and content consumption are at an all-time high, video consumption and the creator economy are booming.

It is quite literally a near half-a-trillion-dollar market.

These are not strengths for traditional publishers. While there have been some very good success stories in recent times (see Wired turning their journalists into individual subscription machines), legacy publishers need to adapt.

So your pricing strategy, customer service, and overall experience are hugely important. You are almost certainly going to be a nice-to-have. So make sure your customer journey and path to conversion are premium, and your audience feel listened to.

The standard customer experience (Image Credit: Harry Clarkson-Bennett)

You need to speak to your audience. You don’t have to go into this blind. Forging real connections with people is not impossible and making them feel listened to will go a long way.

You can try to figure out what they really value, how much they’re willing to spend and what’s stopping them.

Should I Paywall Everything?

No. Content is designed to do different things, and not everything is a premium product. Whatever journalists will tell you. If you shut down your site entirely, you become too closed off an ecosystem in my opinion.

  • Commercial Content: If you have affiliate-led content, paywalling is a questionable decision. It may not be wrong per se, but think about whether the pros outweigh the cons. Typically, it’s a good gateway drug for the rest of your content. And makes some money.
  • Content You Can Get Elsewhere: Evergreen content of a comparable quality to what already exists in the wider corpus is not a profitable opportunity. I’d argue that leaving this free-to-air has more pros than cons. You can always unpaywall the 100 best albums of all time, but gate the richer, individual album reviews.
  • Lower-Quality Platforms: A user that comes from a platform like Discover is far less likely to convert than someone who comes from organic search. So think about the role each platform plays in your content access ecosystem.
  • Paywall Vs. Newsletter signup: It is far easier to convert people to a paying subscriber from a newsletter database than from an on-page paywall. And the user journey is far less interrupted. Building an owned channel is never a bad thing, so think about how engaged users are and whether an email would be a more effective starting point.

The Type Of Paywall Matters (Now More Than Ever)

LLMs do not respect paywalls. As it turns out, neither does Google.

I, for one, am stunned.

As of just a few months ago, the search giant asked that publishers with paywalls change the way they block content to help Google out. The lighter touch paywall solution (a JavaScript-based one) includes the full content in the server response.

“…Some JavaScript paywall solutions include the full content in the server response, then use JavaScript to hide it until subscription status is confirmed.

This isn’t a reliable way to limit access to the content. Make sure your paywall only provides the full content once the subscription status is confirmed.”

According to Google, they are struggling to determine the difference. So the problem is on us, not them. They (and I strongly suspect other LLMs) are ingesting this content and training their models on us whether we like it or not.

For those of you who haven’t heard of Common Crawl, it stores a corpus of open web data accessible to “researchers.” By researchers, we now mean tech bros who don’t want to pay for, surprisingly, anything.

According to their CEO:

“If you didn’t want your content on the internet, you shouldn’t have put your content on the internet.”

It doesn’t stop there either. Even if you block all non-whitelisted bots from accessing your site at a CDN level, you may have syndication partnerships in place. If so, it’s likely your content is making it out into the wider world.

The internet is not exactly a leakproof vessel. If you’re setting one up now, try to implement a server-side option.
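The difference between the two approaches can be shown in a minimal sketch. In a client-side JavaScript paywall, the full article body is in the server response and merely hidden; in the server-side version below, the gated body never leaves the server unless the subscription check passes. The function names and session check here are hypothetical placeholders, not any vendor's API.

```python
# Hypothetical article split into a free teaser and a gated body
ARTICLE = {
    "teaser": "First two paragraphs, free to everyone...",
    "body": "The rest of the article, for subscribers only...",
}

def is_subscriber(session: dict) -> bool:
    # Placeholder check: in practice, validate a signed session
    # cookie or token before trusting the flag
    return session.get("subscribed", False)

def render_article(session: dict) -> str:
    # Server-side gating: the response only ever contains what this
    # user is allowed to read, so bots scraping the HTML get the
    # teaser, not the full body hidden behind CSS/JS.
    if is_subscriber(session):
        return ARTICLE["teaser"] + ARTICLE["body"]
    return ARTICLE["teaser"] + "\n[Subscribe to keep reading]"

print(render_article({"subscribed": False}))
```

The trade-off is that server-side gating requires rendering (or at least assembling) pages per-request rather than serving one cached page to everyone, which is why the JavaScript shortcut became so common in the first place.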

What Is The Right Paywall For Me?

I have written about the types of paywall available to you and the pros and cons of each. Generally, I think a metered or dynamic paywall is the best option for most publishers. At the very least, a freemium model. Something that gives people enough to draw them in.

And you can’t exactly draw them in if you just hard paywall everything.

You have to think of this as a full-blown marketing strategy. You need to know where people come from. How much of your content they have consumed. Whether it’s better to show them a newsletter signup as opposed to a paywall.

It is absolutely worth knowing that over time, a strong email database will convert far more effectively than a hard paywall.

So encouraging free signups and taking a longer-term view to conversions (you’ll need a good customer journey here) may be far more effective.

How Can I Set One Up?

There are a number of paywall management options out there for publishers. Leaky Paywall, Zephr, Piano. There are plenty.

The best ones integrate with your existing tech stacks, have excellent personalization and customization options, deploy ad-blocking strategies, and run flexible gating strategies.

Larger publishers tend to go with enterprise-level options with deep analytics and CRM integrations. Smaller publishers can work with lighter touch, cheaper operators. You really just need to scope out what will work best for you.

Particularly when it comes to monthly costs and revenue share options.

How Can I Map The Impact?

You’ll need to establish a few key things:

  • The average drop in traffic you expect to see.
  • The subsequent loss of existing revenue (probably ad-related, but there may be some knock-on wider commercial impact).
  • The average value of a subscription (and the expected conversion rate).
  • Your customer LTV.

Focusing on customer LTV shifts marketing from chasing traffic to building profitable, loyal audience relationships. It forces businesses to understand that not all audiences or subscriptions are created equal.
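The four inputs above are enough for a back-of-envelope model. Every figure in this sketch is a made-up assumption to illustrate the arithmetic, not a benchmark for any real publisher.

```python
# Back-of-envelope paywall model; all numbers are hypothetical assumptions
monthly_visits = 1_000_000
traffic_drop = 0.30          # expected traffic decline after gating
ad_rpm = 12.0                # ad revenue per 1,000 pageviews, in GBP

sub_price = 8.0              # monthly subscription price, in GBP
conversion_rate = 0.005      # share of remaining visitors who subscribe
avg_months_retained = 14     # average retention, drives customer LTV

# Ad revenue lost to the traffic decline
lost_ad_revenue = monthly_visits * traffic_drop * ad_rpm / 1000

# New subscriber revenue and lifetime value per subscriber
remaining_visits = monthly_visits * (1 - traffic_drop)
new_subs = remaining_visits * conversion_rate
monthly_sub_revenue = new_subs * sub_price
ltv_per_sub = sub_price * avg_months_retained

print(f"Lost ad revenue/month:      £{lost_ad_revenue:,.0f}")
print(f"New subscribers/month:      {new_subs:,.0f}")
print(f"Subscription revenue/month: £{monthly_sub_revenue:,.0f}")
print(f"LTV per subscriber:         £{ltv_per_sub:,.0f}")
```

Even this crude version makes the trade explicit: the conversion rate and retention assumptions dominate the outcome, which is why the audience-quality points below matter more than the headline traffic numbers.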

You generate more subs through paid media because the net is larger. But lots slip through the net. So you need a quality product (in both a product and marketing sense) alongside UX and customer service that reduces friction.

Search and owned channels are smaller, but far more likely to pay because they have taken an action to find you. In some cases, they actually want you in their inbox. The quality is higher, but the overall returns are lower.

So you just can’t treat everybody the same.

Closing Thoughts

Subscriber revenue is so valuable because it’s predictable. Subscription business models have boomed for that very reason. A pound of subscriber revenue is far more valuable than almost anything else, and it should be the focus of your business.

But that doesn’t mean you put all your eggs in one basket. You can have multiple subscription types on your website, and that can help you become habitual with all types of users. But you need to add value to their lives every day.

Puzzles, recipes, short and long-form videos, et al.

Businesses make money in many ways. A diverse business is resilient. Resilient to macro and micro factors that will decimate some publishers over the next few years. So talk to your audience, trial new ways of adding value, and commit when one works. Become habitual.

And, shock horror, people want to belong to something. So while the digital experience is crucial, making an effort to connect with people IRL matters.

More Resources:


This post was originally published on Leadership in SEO.


Featured Image: beast01/Shutterstock

Ask An SEO: Digital PR Or Traditional Link Building, Which Is Better? via @sejournal, @rollerblader

This week’s Ask An SEO question is:

“Should SEOs be focusing more on digital PR than traditional link building?”

Digital PR is synonymous with link building at this point, as SEOs needed a new way to package and resell the same service. Actual PR work will always be more valuable than link building because PR, whether digital or traditional, focuses on a core audience of customers and on reaching specific demographics. This adds value to a business and drives revenue.

With that said, here’s how I’d define digital PR vs. link building if a client asked what the difference is.

  • Digital PR: Getting brand coverage and citations in media outlets, niche publications, trade journals, niche blogs, and websites that do not allow guest posting, paid links, or unvetted contributors with the goal of building brand awareness and driving traffic from the content.
  • Link Building: Getting links from websites as a way to try and increase SERP rankings. Traffic from the links, sales from the links, etc., are not being tracked, and the quality of the website can be questionable.

Digital PR is always going to be better than link building because you’re treating the technique as a business, not a scheme to game the rankings. Link building became a bad practice years ago as links became less relevant. They are still important, so I want to ensure that isn’t taken out of context to mean we stopped doing link building completely. Quality content attracts links naturally, including media mentions. When this happens in a natural way, the website will begin rising, because the site has a lot of value for users, and search engines can tell when a site is quality.

If you’re building links without evaluating their impact on traffic and sales, you’re likely setting your site up for failure. Getting a ton of links, just like creating content en masse with AI/LLMs or article spinners, can grow a site quickly. That URL/domain can then burn to the ground just as fast.

That’s why when we purchase a link, an advertorial, or we’re doing a partnership, we always ask ourselves the following questions:

  • Is there an active audience on this website that is also coming back to the website via branded search for information?
  • Is the audience on this website part of our customer base?
  • Will the article we’re pitching or being featured in be helpful to the user, and is our product or service something that is part of the post naturally vs. being forced?
  • Are we ok with the link being nofollow or sponsored if we’re paying for the inclusion?

If the answer is yes to these four, then we’re good to go with the link. The active audience on the website and people returning by brand name means there is an audience that trusts them for information. If the readership, visitors, or customers are similar or the same demographics as our user base, then it makes sense we’d want to be in front of them where they go for information.

We may have knowledge that is helpful to the user, but if it is not on topic within the post, there is no reason for them to come through and use our services, buy our products, or subscribe to our newsletters. Instead, we’ll wait until there is a fit, so there is a direct “link” between the content we’re contributing, or being an expert on, and our website.

For the last question, our goal is always traffic and customer acquisition, not getting a link. The website owner controls this, and if they want to follow Google’s best practices (which we obviously recommend doing), we will still be happy if they mark it as sponsored or nofollow. This is the most important of the questions. Building links to game the SERPs is a bad idea; building a brand that people search for by name will overpower any link any day of the week. This is always our goal when it comes to Digital PR and link building. Driving that branded search.

So, that raises the question: Where do we go for digital PR?

Sources To Get Digital PR Mentions And Links

When we’re about to start a Digital PR campaign, we create lists of the following targets to reach out to.

  • Mass Media: Household names like magazines, news websites, and local media, where everyone in the area, the customers, or the country or world knows them by name. The only stipulation we apply is whether they have an active category vs. only a few articles here and there. An active category means the topic is interesting enough to their reader base that they’re investing in it, so our customers may be there.
  • Trade Publications: Conferences, associations, non-profits, and industry insiders will have websites and print publications that go out to members. Search Engine Journal could be considered a trade publication for the SEO and PPC industry, as could SEO Roundtable and some communities like Webmaster World. They publish directly relevant content for search engine marketers and have active users, so if I were an SEO service provider or tool, this is where I’d be looking to get featured and, ideally, earn links.
  • Niche Sites and Bloggers: There is no shortage of niche sites and content producers out there. The trick is finding ones that do not publicly allow guest contributions, advertorials, etc., and that do not link out to non-niche websites and content. This includes sites that got hacked and had link injections. Even if their “authority” is zero, there is value if they exercise quality control and all links and mentions are earned.
  • Influencers: Whether it is YouTube, Facebook group leaders, LinkedIn that is crawlable, or other channels, getting coverage from people with subscribers and an active audience can let search engines crawl the link back to your website. It may not boost your rankings, but it drives customers to you and helps with page discoverability if the link gets crawled. LLMs are also citing their content as sources, so there could be value for AIO, too.

Link building is not dead by any means; links still matter. You just don’t need to build them anymore. Focus on quality where an active audience is and where you have a chance at getting traffic and revenue. This is what will move the needle for the long run and help you grow in SERPs that matter.

More Resources:


Featured Image: Paulo Bobita/Search Engine Journal

Why Is Organic Traffic Down? Here’s How To Segment The Data via @sejournal, @torylynne

As an SEO, there are few things that stoke panic like seeing a considerable decline in organic traffic. People are going to expect answers, if they don’t already.

Getting to those answers isn’t always straightforward or simple, because SEO is neither of those things.

The success of an SEO investigation hinges on the ability to dig into the data, identify where exactly the performance decline is happening, and connect the dots to why it’s happening.

It’s a little bit like an actual investigation: Before you can catch the culprit or understand the motive, you have to gather evidence. In an SEO investigation, that’s a matter of segmenting data.

In this article, I’ll share some different ways to slice and dice performance data for valuable evidence that can help further your investigation.

Using Data To Confirm There’s An SEO Issue

Just because organic traffic is down doesn’t inherently mean that it’s an SEO problem.

So, before we dissect data to narrow down problem areas, the first thing we need to do is determine whether there’s actually an SEO issue at play.

After all, it could be something else altogether, in which case we’re wasting resources chasing a problem that doesn’t exist.

Is This A Tracking Issue?

In many cases, what looks like a big traffic drop is just an issue with tracking on the site.

To determine whether tracking is functioning correctly, there are a couple of things we need to look for in the data.

The first is consistent drops across channels.

Zoom out of organic search and see what’s happening in other sources and channels.

If you’re seeing meaningful drops across email, paid, etc., that are consistent with organic search, then it’s more than likely that tracking isn’t working correctly.

The other thing we’re looking for here is inconsistencies between internal data and Google Search Console.

Of course, there’s always a bit of inconsistency between first-party data and GSC-reported organic traffic. But if those differences are significantly more pronounced for the time period in question, that hints at a tracking problem.

Is This A Brand Issue?

Organic search traffic from Google falls into two primary camps:

  • Brand traffic: Traffic driven by user queries that include the brand name.
  • Non-brand traffic: Traffic driven by brand-agnostic user queries.

Non-brand traffic is directly affected by SEO work, whereas brand traffic is mostly shaped by the work that happens in other channels.

When a user includes the brand in their search, they’re already brand-aware. They’re a return user or they’ve encountered the brand through marketing efforts in channels like PR, paid social, etc.

When marketing efforts in other channels are scaled back, the brand reaches fewer users. Since fewer people see the brand, fewer people search for it.

Or, if customers sour on the brand, there are fewer people using search to come back to the site.

Either way, it’s not an SEO problem. But in order to confirm that, we need to filter the data down.

Go to Performance in Google Search Console and exclude any queries that include your brand. Then compare the data against a previous period – usually YoY if you need to account for seasonality. Do the same for queries that don’t include the brand name.

If non-brand traffic has stayed consistent, while brand traffic has dropped, then this is a brand issue.

filtering queries using regex in Google Search Console
Screenshot from Google Search Console, November 2025

Tip: Account for users misspelling your brand name by filtering queries with fragments. For example, at Gray Dot Co, we get a lot of brand searches for things like “Gray Company” and “Grey Dot Company.” By using the simple regex “gray|grey,” I can catch brand search activity that would otherwise fall through the cracks.
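If you’d rather do this classification outside GSC (for example, on an exported query list), the same pattern works in a short script. A minimal sketch using Python’s standard re module and the “gray|grey” example above; the query strings are invented for illustration:

```python
import re

# Brand pattern mirroring the "gray|grey" example; extend it with any
# other misspellings you see in your own query data.
BRAND_PATTERN = re.compile(r"gray|grey", re.IGNORECASE)

def split_brand_queries(queries):
    """Split a list of exported GSC queries into brand and non-brand buckets."""
    brand, non_brand = [], []
    for q in queries:
        (brand if BRAND_PATTERN.search(q) else non_brand).append(q)
    return brand, non_brand

queries = ["gray dot co", "grey dot company", "seo audit checklist"]
brand, non_brand = split_brand_queries(queries)
print(brand)      # brand-aware searches, including the misspelling
print(non_brand)  # everything else
```

With the two buckets separated, you can trend each one YoY independently, which is exactly the brand vs. non-brand comparison described above.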

Is It Seasonal Demand?

The most obvious example of seasonal demand is holiday shopping on ecommerce sites.

Think about something like jewelry. Most people don’t buy jewelry every day; they buy it for special occasions. We can confirm that seasonality by looking at Google Trends.

Zooming out to the past five years of interest in “jewelry,” it clearly peaks in November and December.

Google Trends graph for interest in jewelry over the past five years
Screenshot from Google Trends, November 2025

As a site that sells jewelry, of course, traffic in Q1 is going to be down from Q4.

I use a pretty extreme example here to make my point, but in reality, seasonality is widespread and often more subtle. It impacts businesses where you might not expect much seasonality at all.

The best way to understand its impact is to look at organic search data year-over-year. Do the peaks and valleys follow the same patterns?

If so, then we need to compare data YoY to get a true sense of whether there’s a potential SEO problem.

Is It Industry Demand?

SEOs need to keep tabs on not just what’s happening internally, but also what’s going on externally. A big piece of that is checking the pulse of organic demand for the topics and products that are central to the brand.

Products fall out of vogue, technologies become obsolete, and consumer behavior changes – that’s just the reality of business. When there are fewer potential customers in the landscape, there are fewer clicks to win, and fewer sessions to drive.

Take cameras, for instance. As the cameras on our phones got more sophisticated, digital cameras became less popular. And as they became less popular, searches for cameras dwindled.

Now, they’re making a comeback with younger generations. More people searching, more traffic to win.

npr article headline why gen z loves the digital compact cameras that millennials used to covet
Screenshot from npr.com, November 2025

You can see all of this at play in the search landscape by turning to Google Trends. The downtrend in interest caused by advances in technology, AND the uptrend boosted by shifts in societal trends.

Google Trends graph showing search interest in cameras since 2004
Screenshot from Google Trends, November 2025

When there are drops in industry, product, or topic demand within the landscape, we need to ask ourselves whether the brand’s organic traffic loss is proportional to the overall loss in demand.

Is Paid Search Cannibalizing Organic Search?

Even if a URL on the site ranks well in organic results, ads are still higher on the SERP. So, if a site is running an ad for the same query it already ranks for, then the ad is going to get more clicks by nature.

When businesses give their PPC budgets a boost, there’s potential for this to happen across multiple, key SERPs.

Let’s say a site drives a significant chunk of its organic traffic from four or five product landing pages. If the brand introduces ads to those SERPs, clicks that used to go to the organic result start going to the ad.

That can have a significant impact on organic traffic numbers. But search users are still getting to the same URLs using the same queries.

To confirm, pull sessions by landing pages from both sources. Then, compare the data from before the paid search changes to the period following the change.

If major landing pages consistently show a positive delta that cancels out the negative delta in organic search, you’re not losing organic traffic; you’re lending it.
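The before/after delta comparison can be sketched in a few lines. This is a hedged illustration: the page name, session counts, and dictionary shape are all invented, and in practice you’d feed in exports from your analytics tool.

```python
# Hypothetical cannibalization check: for each landing page, does the paid
# gain roughly offset the organic loss? All figures are made up.

def cannibalization_report(before, after):
    """before/after: {page: {"organic": sessions, "paid": sessions}}"""
    report = {}
    for page in before:
        organic_delta = after[page]["organic"] - before[page]["organic"]
        paid_delta = after[page]["paid"] - before[page]["paid"]
        report[page] = {
            "organic_delta": organic_delta,
            "paid_delta": paid_delta,
            "net": organic_delta + paid_delta,
        }
    return report

before = {"/product-a": {"organic": 5000, "paid": 200}}
after = {"/product-a": {"organic": 3200, "paid": 2100}}
print(cannibalization_report(before, after))
# /product-a: organic -1800, paid +1900, net +100 -> traffic shifted, not lost
```

A near-zero (or positive) net delta on the affected pages is the signal that you’re looking at cannibalization rather than a genuine SEO loss.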

YoY comparison of sessions by landing page for paid search and organic search in GA4
Screenshot from Google Analytics, November 2025

Segmenting Data To Find SEO Issues

Once we have confirmation that the organic traffic declines point to an SEO issue, we can start zooming in.

Segmenting data in different ways helps pinpoint problem areas and find patterns. Only then can we trace those issues to the cause and craft a strategy for recovery.

URL

Most SEOs are going to filter their organic traffic down by URL. It lets us see which pages are struggling and analyze those pages for potential improvements.

It also helps find patterns across pages that make it easier to isolate the cause of more widespread issues. For example, if the site is losing traffic across its product listing pages, it could signal that there’s a problem with the template for that page.

But segmenting by URL also helps us answer a very important question when we pair it with conversion data.

Do We Really Care About This Traffic?

Clicks are only helpful if they help drive business-valuable user interactions like conversions or ad views. For some sites, like online publications, traffic is valuable in and of itself because users coming to the site are going to see ads. The site still makes money.

But for brands looking to drive conversions, it could just be empty traffic if it’s not helping drive that primary key performance indicator (KPI).

A top-of-funnel blog post might drive a lot of traffic because it ranks for very high-volume keywords. If that same blog post is a top traffic-driving organic landing page, a slip in rankings means a considerable organic traffic drop.

But the users entering those high-volume keywords might not be very qualified potential customers.

Looking at conversions by landing page can help brands understand whether the traffic loss is ultimately hurting the bottom line.

The best way to understand is to turn to attribution.

First-touch attribution quantifies an organic landing page’s value in terms of the conversions it helps drive down the line. For most businesses, someone isn’t likely to convert the first time they visit the site. They usually come back and purchase.

Last-touch attribution, by contrast, shows the organic landing pages that people come to when they’re ready to make a purchase. Both are valuable!

Query

Filtering performance by query can help understand which terms or topic areas to focus improvements on. That’s not new news.

Sometimes, it’s as easy as doing a period-over-period comparison in GSC, ordering by clicks lost, and looking for obvious patterns, i.e., are the queries with the most decline just subtle variants of one another?

If there aren’t obvious patterns and the queries in decline are more widespread, that’s where topic clustering can come into the mix.

Topic Clustering With AI

Using AI for topic clustering helps quickly identify any potential relationships between queries that are seeing performance dips.

Go to GSC and filter performance by query, looking for any YoY declines in clicks and average position.

YoY comparison in Google Search Console for clicks and average position by query
Screenshot from Google Search Console, November 2025

Then export this list of queries and use your favorite ML script to group the keywords into topic clusters.
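If you want a dependency-free starting point, here is a minimal sketch that groups queries by shared tokens (Jaccard similarity). A real workflow would typically use embeddings or a clustering library, and the sample queries and threshold below are invented for illustration:

```python
# Minimal, dependency-free query clustering by token overlap.
# A production version would use embeddings; this just shows the idea.

def jaccard(a, b):
    """Token-set Jaccard similarity between two query strings."""
    a, b = set(a.split()), set(b.split())
    return len(a & b) / len(a | b)

def cluster_queries(queries, threshold=0.3):
    """Greedily assign each query to the first cluster it resembles."""
    clusters = []
    for q in queries:
        for cluster in clusters:
            if jaccard(q, cluster[0]) >= threshold:
                cluster.append(q)
                break
        else:  # no existing cluster matched
            clusters.append([q])
    return clusters

queries = [
    "best running shoes",
    "best running shoes for women",
    "trail running shoes review",
    "how to clean suede boots",
]
for c in cluster_queries(queries):
    print(c)
```

Here the three shoe-related queries land in one cluster and the unrelated query in another, which is exactly the kind of grouping that surfaces a topic area in decline.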

The resulting list of semantic groupings can provide an idea of topics where a site’s authority is slipping in search.

In turn, it helps narrow the area of focus for content improvements and other optimizations to potentially build authority for the topics or products in question.

Identifying User Intent

When users search using specific terms, the type of content they’re looking for – and their objective – differs based on the query. These user expectations can be broken out into four high-level categories:

  • Informational (top of funnel): Users are looking for answers to questions, explanations, or general knowledge about topics, products, concepts, or events.
  • Commercial (middle of funnel): Users are interested in comparing products, reading reviews, and gathering information before making a purchase decision.
  • Transactional (bottom of funnel): Users are looking to perform a specific action, such as making a purchase, signing up for a service, or downloading a file.
  • Navigational: Brand-familiar users are using the search engine as a shortcut to find a specific website or webpage.

By segmenting queries by user intent, we can identify the user objectives where the site, or specific pages, are falling short. It gives us a lens into the performance decline, making it easier to identify possible causes from a user experience perspective.

If the majority of queries losing clicks and positions are informational, it could signal shortcomings in the site’s blog content. If the queries are consistently commercial, it might call for an investigation into how the site approaches product detail and/or listing pages.

GSC doesn’t provide user intent in its reporting, so this is where a third-party SEO tool can come into play. If you have position tracking set up and GSC connected, you can use the tool’s rankings report to identify queries in decline and their user intent.

If not, you can still get the data you need by using a mix of GSC and a tool like Ahrefs.

Device

This view of performance data is pretty simple, but it’s equally easy to overlook!

When the large majority of performance declines are attributed to ONLY desktop or mobile, device data helps identify potential tech or UX issues within the mobile or desktop experience.

The important thing to remember is that any declines need to be considered proportionally. Take the metrics for the site below…

YoY comparison in Google Search Console of clicks by device type
Screenshot from Google Search Console, November 2025

At first glance, the data makes it look like there might be an issue with the desktop experience. But we need to look at things in terms of percentages.

Desktop: (1 – 648/1545) × 100 ≈ 58% decline

Mobile: (1 – 149/316) × 100 ≈ 53% decline

While desktop shows a much larger decline in terms of click count, the percentage of decline YoY is fairly similar across both desktop and mobile. So we’re probably not looking for anything device-specific in this scenario.
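If you’re running this proportional check across several devices or segments, the math is easy to script. A minimal sketch using the click counts from the screenshot discussed above:

```python
# Proportional decline check: raw click deltas can mislead, so compare
# percentage declines instead. Figures are from the example above.

def pct_decline(before, after):
    """Percentage decline from `before` to `after` clicks."""
    return (1 - after / before) * 100

print(round(pct_decline(1545, 648)))  # desktop decline
print(round(pct_decline(316, 149)))   # mobile decline
```

Because the two percentages come out within a few points of each other, the drop is roughly proportional across devices, so a device-specific issue is unlikely.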

Search Appearance

Rich results and SERP features are an opportunity to stand out on the SERP and drive more traffic through enhanced results. Using the search appearance filter in Google Search Console, you can see traffic from different types of rich results and SERP features:

  • Forums.
  • AMP Top Story (AMP page + Article markup).
  • Education Q&A.
  • FAQ.
  • Job Listing.
  • Job Details.
  • Merchant Listing.
  • Product Snippet.
  • Q&A.
  • Review Snippet.
  • Recipe Gallery.
  • Video.

This is the full list of possible features with rich results (courtesy of SchemaApp), though you’ll only see filters for search appearances where the domain is currently positioned.

In most cases, Google is able to generate these types of results because there is structured data on pages. The notable exceptions are Q&A, translated results, and video.

So when there are significant traffic drops coming from a specific type of search appearance, it signals that there’s potentially a problem with the structured data that enables that search feature.

YoY comparison in Google Search Console for search appearance
Screenshot from Google Search Console, November 2025

You can investigate structured data issues in the Enhancements reports in GSC. The exception is product snippets, which nest under the Shopping menu. Either way, the reports only show up in your left-hand nav if Google is aware of relevant data on the site.

For example, the product snippets report shows why some snippets are invalid, as well as ways to potentially improve valid results.

Product snippets report in Google Search Console
Screenshot from Google Search Console, November 2025

This context is valuable as you begin to investigate the technical causes of traffic drops from specific search features. In this case, it’s clear that Google is able to crawl and utilize product schema on most pages – but there are some opportunities to improve that schema with additional data.

Featured Snippets

When featured snippets originally came on the scene, they marked a major change to the SERP structure that dealt a serious hit to traditional organic results.

Today, AI Overviews are doing the same. In fact, research from Seer shows that CTR has dropped 61% for queries that now include an AI overview (21% of searches). And that impact is outsized for informational queries.

In cases where rankings have remained relatively static, but traffic is dropping, there’s good reason to investigate whether this type of SERP change is a driver of loss.

While Google Search Console doesn’t report on featured snippets (or related features like People Also Ask questions) and AI Overviews, third-party tools do.

In the third-party tool Semrush, you can use the Domain Overview report to check for featured snippet availability across keywords where the site ranks.

filtering to keyword with available AI overviews in the Semrush Domain Overview report
Screenshot from Semrush, November 2025

Do the keywords where you’re losing traffic have AI overviews? If you’re not cited, it’s time to start thinking about how you’re going to win that placement.

Search Type

Search type is another way to filter GSC data when you’re seeing traffic declines despite healthy, consistent rankings.

After all, web search is just one prong of Google Search. Think about it: How often do you use Google Image search? At least in my case, that’s fairly often.

Filter performance data by each of these search types to understand which one(s) are having the biggest impact on performance decline. Then use that insight to start connecting the dots to the cause.

filtering to Google image search performance in Google Search Console
Screenshot from Google Search Console, November 2025

Images are a great example. One simple line in the robots.txt can block Google from crawling a subfolder that hosts multitudes of images. As those images disappear from image search results, any clicks from those results disappear in tandem.
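As a purely hypothetical illustration (the path below is made up), a single rule like this in robots.txt would quietly remove every image in that subfolder from Google Image search over time:

```
# Hypothetical robots.txt fragment – blocks crawling of an image subfolder,
# which eventually drops those images from image search results.
User-agent: *
Disallow: /assets/images/
```

A change like this often ships as part of an unrelated dev ticket, which is why the image-search segment is worth checking explicitly.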

We don’t know to look for this issue until we slice the data accordingly!

Geography

If the business operates physically in specific cities and states, then it likely already has geo-specific performance tracking set up through a tool.

But online-only businesses shouldn’t dismiss geographic data either – even at the city/state level! Declines are still a trigger to check geo-specific performance data.

Country

Just because the brand only sells and operates in one country doesn’t mean that’s where all the domain’s traffic is coming from. Drilling down by country in GSC allows you to see whether declines are coming from the country the brand is focused on or, potentially, another country altogether.

performance by country in Google Search Console
Screenshot from Google Search Console, November 2025

If it’s another country, it’s time to decide whether that matters. If the site is a publisher, it probably cares more about that traffic than an ecommerce brand that’s more focused on purchases in its country of operation.

Localization

When tools report positions at the country level, ranking shifts in specific markets fly under the radar. It certainly happens, and major markets can have a major traffic impact!

Tools like BrightLocal, Whitespark, and Semrush let you analyze SERP rankings one level deeper than GSC, providing data down to the city.

You can check for ranking discrepancies across cities by sampling a handful of the keywords with the greatest declines in clicks.

If I’m an SEO at the University of Phoenix, which is an online university, I’m probably pretty excited about ranking #1 in the United States for “online business degree.”

top five serp results for online business degree in the United States
Screenshot from Semrush, November 2025

But if I drill down further, I might be a little distraught to find that the domain isn’t in the top five SERP results for users in Denver, CO…

top five serp results for online business degree in Denver, Colorado
Screenshot from Semrush, November 2025

…or Raleigh, North Carolina.

top five serp results for online business degree in Raleigh, North Carolina
Screenshot from Semrush, November 2025

Catch Issues Faster By Leveraging AI For Data Analysis

Data segmentation is an important piece of any traffic drop investigation, because humans can see patterns in data that bots don’t.

However, the opposite is true too. With anomaly detection tooling, you get the best of both worlds.

When combined with monitoring and alert notifications, anomaly detection makes it possible to find and fix issues faster. Plus, it enables you to find data patterns in any after-the-impact investigations.

All of this helps ensure that your analysis is comprehensive, and might even point out gaps for further investigation.

This Colab tool from Sam Torres can help get your site set up!

Congrats, You’re Close To Closing This Case

As Sherlock Holmes would say about an investigation, “It is a capital mistake to theorize before one has data.” With the right data in hand, the culprits start to reveal themselves.

Data segmentation empowers SEOs to uncover leads that point to possible causes. By narrowing it down based on the evidence, we ensure more accuracy, less work, faster answers, and quicker recovery.

And while leadership might not love a traffic drop, they’re sure to love that.

More Resources:


Featured Image: Vanz Studio/Shutterstock

4 technologies that didn’t make our 2026 breakthroughs list

If you’re a longtime reader, you probably know that our newsroom selects 10 breakthroughs every year that we think will define the future. This group exercise is mostly fun and always engrossing, but at times it can also be quite difficult. 

We collectively pitch dozens of ideas, and the editors meticulously review and debate the merits of each. We agonize over which ones might make the broadest impact, whether one is too similar to something we’ve featured in the past, and how confident we are that a recent advance will actually translate into long-term success. There is plenty of lively discussion along the way.  

The 2026 list will come out on January 12—so stay tuned. In the meantime, I wanted to share some of the technologies from this year’s reject pile, as a window into our decision-making process. 

These four technologies won’t be on our 2026 list of breakthroughs, but all were closely considered, and we think they’re worth knowing about. 

Male contraceptives 

There are several new treatments in the pipeline for men who are sexually active and wish to prevent pregnancy—potentially providing them with an alternative to condoms or vasectomies. 

Two of those treatments are now being tested in clinical trials by a company called Contraline. One is a gel that men would rub on their shoulder or upper arm once a day to suppress sperm production, and the other is a device designed to block sperm during ejaculation. (Kevin Eisenfrats, Contraline’s CEO, was recently named to our Innovators Under 35 list). A once-a-day pill is also in early-stage trials with the firm YourChoice Therapeutics. 

Though it’s exciting to see this progress, it will still take several years for any of these treatments to make their way through clinical trials—assuming all goes well.

World models 

World models have become the hot new thing in AI in recent months. Though they’re difficult to define, these models are generally trained on videos or spatial data and aim to produce 3D virtual worlds from simple prompts. They reflect fundamental principles, like gravity, that govern our actual world. The results could be used in game design or to make robots more capable by helping them understand their physical surroundings. 

Despite some disagreements on exactly what constitutes a world model, the idea is certainly gaining momentum. Renowned AI researchers including Yann LeCun and Fei-Fei Li have launched companies to develop them, and Li’s startup World Labs released its first version last month. And Google made a huge splash with the release of its Genie 3 world model earlier this year. 

Though these models are shaping up to be an exciting new frontier for AI in the year ahead, it seemed premature to deem them a breakthrough. But definitely watch this space. 

Proof of personhood 

Thanks to AI, it’s getting harder to know who and what is real online. It’s now possible to make hyperrealistic digital avatars of yourself or someone you know based on very little training data, using equipment many people have at home. And AI agents are being set loose across the internet to take action on people’s behalf. 

All of this is creating more interest in what are known as personhood credentials, which could offer a way to verify that you are, in fact, a real human when you do something important online. 

For example, we’ve reported on efforts by OpenAI, Microsoft, Harvard, and MIT to create a digital token that would serve this purpose. To get it, you’d first go to a government office or other organization and show identification. Then it’d be installed on your device and whenever you wanted to, say, log into your bank account, cryptographic protocols would verify that the token was authentic—confirming that you are the person you claim to be. 

Whether or not this particular approach catches on, many of us in the newsroom agree that the future internet will need something along these lines. Right now, though, many competing identity verification projects are in various stages of development. One is World ID by Sam Altman’s startup Tools for Humanity, which uses a twist on biometrics. 

If these efforts reach critical mass—or if one emerges as the clear winner, perhaps by becoming a universal standard or being integrated into a major platform—we’ll know it’s time to revisit the idea.  

The world’s oldest baby

In July, senior reporter Jessica Hamzelou broke the news of a record-setting baby. The infant developed from an embryo that had been sitting in storage for more than 30 years, earning him the bizarre honorific of “oldest baby.” 

This odd new record was made possible in part by advances in IVF, including safer methods of thawing frozen embryos. But perhaps the greater enabler has been the rise of “embryo adoption” agencies that pair donors with hopeful parents. People who work with these agencies are sometimes more willing to make use of decades-old embryos. 

This practice could help find a home for some of the millions of leftover embryos that remain frozen in storage banks today. But since this recent achievement was brought about by changing norms as much as by any sudden technological improvements, this record didn’t quite meet our definition of a breakthrough—though it’s impressive nonetheless.

The Download: four (still) big breakthroughs, and how our bodies fare in extreme heat

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

4 technologies that didn’t make our 2026 breakthroughs list

If you’re a longtime reader, you probably know that our newsroom selects 10 breakthroughs every year that we think will define the future. This group exercise is mostly fun and always engrossing, with plenty of lively discussion along the way, but at times it can also be quite difficult.  

The 2026 list will come out on January 12—so stay tuned. In the meantime, we wanted to share some of the technologies from this year’s reject pile, as a window into our decision-making process. These four technologies won’t be on our 2026 list of breakthroughs, but all were closely considered, and we think they’re worth knowing about. Read the full story to learn what they are.

MIT Technology Review Narrated: The quest to find out how our bodies react to extreme temperatures 

Scientists hope to prevent deaths from climate change, but heat and cold are more complicated than we thought. Researchers around the world are revising rules about when extremes veer from uncomfortable to deadly. Their findings change how we should think about the limits of hot and cold—and how to survive in a new world. 

This is our latest story to be turned into a MIT Technology Review Narrated podcast, which we’re publishing each week on Spotify and Apple Podcasts. Just navigate to MIT Technology Review Narrated on either platform, and follow us to get all our new content as it’s released.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 A CDC panel voted to recommend delaying the hepatitis B vaccine for babies
Overturning a 30-year policy that has contributed to a huge decline in infections. (STAT)
+ Why childhood vaccines are a public health success story. (MIT Technology Review)

2 Critical climate risks are growing across the Arab region 
Drought is the most immediate problem countries are having to grapple with. (Ars Technica)
+ Why Tehran is running out of water. (Wired $)

3 Netflix is buying Warner Bros for $83 billion 
If approved, it’ll be one of the most significant mergers in Hollywood history. (NBC)
+ Trump says the deal “could be a problem” due to Netflix’s already huge market share. (BBC)

4 The EU is fining X $140 million 
For failing to comply with its new Digital Services Act. (NPR)
+ Elon Musk is now calling for the entire EU to be abolished. (CNBC)
+ X also hit back by deleting the European Commission’s account. (Engadget)

5 AI slop is ruining Reddit
Moderators are getting tired of fighting the rising tide of nonsense. (Wired $)
+ How AI and Wikipedia have sent vulnerable languages into a doom spiral. (MIT Technology Review)

6 Scientists have deeply mixed feelings about AI tools
They can boost researchers’ productivity, but some worry about the consequences of relying on them. (Nature $)
+ ‘AI slop’ is undermining trust in papers presented at computer science gatherings. (The Guardian)
+ Meet the researcher hosting a scientific conference by and for AI. (MIT Technology Review)

7 Australia is about to ban under-16s from social media
It’s due to come into effect in two days—but teens are already trying to maneuver around it. (New Scientist $)

8 AI is enshittifying the way we write 🖊🤖
And most people haven’t even noticed. (NYT $)
+ AI can make you more creative—but it has limits. (MIT Technology Review)

9 Tech founders are taking etiquette lessons
The goal is to make them better at pretending to be normal. (WP $)

10 Are we getting stupider? 
It might feel that way sometimes, but there’s little solid evidence to support it. (New Yorker $)

Quote of the day

“It’s hard to be Jensen day to day. It’s almost nightmarish. He’s constantly paranoid about competition. He’s constantly paranoid about people taking Nvidia down.” 

—Stephen Witt, author of ‘The Thinking Machine’, a book about Nvidia’s rise, tells the Financial Times what it’s like to be its founder and chief executive, Jensen Huang.

One more thing

[Image: a fleet of ships at sea. Courtesy of Oceanbird]

How wind tech could help decarbonize cargo shipping

Inhabitants of the Marshall Islands—a chain of coral atolls in the center of the Pacific Ocean—rely on sea transportation for almost everything. For millennia they sailed largely in canoes, but much of their seafaring movement today involves big, bulky, diesel-fueled cargo ships that are heavy polluters.

They’re not alone. Cargo shipping is responsible for about 3% of the world’s annual greenhouse-­gas emissions, and that figure is currently on track to rise to 10% by 2050.

The islands have been disproportionately experiencing the consequences of human-made climate change: warming waters, more frequent extreme weather, and rising sea levels. Now their residents are exploring a surprisingly traditional method of decarbonizing their fleets. Read the full story.

—Sofia Quaglia

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ Small daily habits can help build a life you enjoy.  
+ Using an air fryer to make an epic grilled cheese sandwich? OK, I’m listening.
+ I’m sorry but AI does NOT get to ruin em dashes for the rest of us. 
+ Daniel Clarke’s art is full of life and color. Check it out!