These technologies could help put a stop to animal testing

Earlier this week, the UK’s science minister announced an ambitious plan: to phase out animal testing.

Testing potential skin irritants on animals will be stopped by the end of next year, according to a strategy released on Tuesday. By 2027, researchers are “expected to end” tests of the strength of Botox on mice. And drug tests in dogs and nonhuman primates will be reduced by 2030. 

The news follows similar moves by other countries. In April, the US Food and Drug Administration announced a plan to replace animal testing for monoclonal antibody therapies with “more effective, human-relevant models.” And, following a workshop in June 2024, the European Commission also began working on a “road map” to phase out animal testing for chemical safety assessments.

Animal welfare groups have been campaigning for commitments like these for decades. But a lack of alternatives has made it difficult to put a stop to animal testing. Advances in medical science and biotechnology are changing that.

Animals have been used in scientific research for thousands of years. Animal experimentation has led to many important discoveries about how the brains and bodies of animals work. And because regulators require drugs to be first tested in research animals, it has played an important role in the creation of medicines and devices for both humans and other animals.

Today, countries like the UK and the US regulate animal research and require scientists to hold multiple licenses and adhere to rules on animal housing and care. Still, millions of animals are used annually in research. Plenty of scientists don’t want to take part in animal testing. And some question whether animal research is justifiable—especially considering that around 95% of treatments that look promising in animals don’t make it to market.

In recent decades, we’ve seen dramatic advances in technologies that offer new ways to model the human body and test the effects of potential therapies, without experimenting on humans or other animals.

Take “organs on chips,” for example. Researchers have been creating miniature versions of human organs inside tiny plastic cases. These systems are designed to contain the same mix of cells you’d find in a full-grown organ and receive a supply of nutrients that keeps them alive.

Today, multiple teams have created models of livers, intestines, hearts, kidneys and even the brain. And they are already being used in research. Heart chips have been sent into space to observe how they respond to low gravity. The FDA used lung chips to assess covid-19 vaccines. Gut chips are being used to study the effects of radiation.

Some researchers are even working to connect multiple chips to create a “body on a chip”—although this has been in the works for over a decade and no one has quite managed it yet.

In the same vein, others have been working on creating model versions of organs—and even embryos—in the lab. By growing groups of cells into tiny 3D structures, scientists can study how organs develop and work, and even test drugs on them. They can even be personalized—if you take cells from someone, you should be able to model that person’s specific organs. Some researchers have even been able to create organoids of developing fetuses.

The UK government strategy mentions the promise of artificial intelligence, too. Many scientists have been quick to adopt AI as a tool to help them make sense of vast databases, and to find connections between genes, proteins and disease, for example. Others are using AI to design all-new drugs.

Those new drugs could potentially be tested on virtual humans. Not flesh-and-blood people, but digital reconstructions that live in a computer. Biomedical engineers have already created digital twins of organs. In ongoing trials, digital hearts are being used to guide surgeons on how—and where—to operate on real hearts.

When I spoke to Natalia Trayanova, the biomedical engineering professor behind this trial, she told me that her model could recommend regions of heart tissue to be burned off as part of treatment for atrial fibrillation. Her tool would normally suggest two or three regions but occasionally would recommend many more. “They just have to trust us,” she told me.

It is unlikely that we’ll completely phase out animal testing by 2030. The UK government acknowledges that animal testing is still required by lots of regulators, including the FDA, the European Medicines Agency, and the World Health Organization. And while alternatives to animal testing have come a long way, none of them perfectly capture how a living body will respond to a treatment.

At least not yet. Given all the progress that has been made in recent years, it’s not too hard to imagine a future without animal testing.

This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.

The Download: how AI really works, and phasing out animal testing

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

OpenAI’s new LLM exposes the secrets of how AI really works

The news: ChatGPT maker OpenAI has built an experimental large language model that is far easier to understand than typical models.

Why it matters: It’s a big deal, because today’s LLMs are black boxes: Nobody fully understands how they do what they do. Building a model that is more transparent sheds light on how LLMs work in general, helping researchers figure out why models hallucinate, why they go off the rails, and just how far we should trust them with critical tasks. Read the full story.

—Will Douglas Heaven

Google DeepMind is using Gemini to train agents inside Goat Simulator 3

Google DeepMind has built a new video-game-playing agent called SIMA 2 that can navigate and solve problems in 3D virtual worlds. The company claims it’s a big step toward more general-purpose agents and better real-world robots.   

The company first demoed SIMA (which stands for “scalable instructable multiworld agent”) last year. But this new version has been built on top of Gemini, the firm’s flagship large language model, which gives the agent a huge boost in capability. Read the full story.

—Will Douglas Heaven

These technologies could help put a stop to animal testing

Earlier this week, the UK’s science minister announced an ambitious plan: to phase out animal testing.

Testing potential skin irritants on animals will be stopped by the end of next year. By 2027, researchers are “expected to end” tests of the strength of Botox on mice. And drug tests in dogs and nonhuman primates will be reduced by 2030.

It’s good news for activists and scientists who don’t want to test on animals. And it’s timely too: In recent decades, we’ve seen dramatic advances in technologies that offer new ways to model the human body and test the effects of potential therapies, without experimenting on animals. Read the full story.

—Jessica Hamzelou

This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Chinese hackers used Anthropic’s AI to conduct an espionage campaign   
It automated a number of attacks on corporations and governments in September. (WSJ $)
+ The AI was able to handle the majority of the hacking workload itself. (NYT $)
+ Cyberattacks by AI agents are coming. (MIT Technology Review)

2 Blue Origin successfully launched and landed its New Glenn rocket
It managed to deploy two NASA satellites into space without a hitch. (CNN)
+ The New Glenn is the company’s largest reusable rocket. (FT $)
+ The launch had been delayed twice before. (WP $)

3 Brace yourself for flu season
It started five weeks earlier than usual in the UK, and the US is next. (Ars Technica)
+ Here’s why we don’t have a cold vaccine. Yet. (MIT Technology Review)

4 Google is hosting a Border Protection facial recognition app    
The app alerts officials whether to contact ICE about identified immigrants. (404 Media)
+ Another effort to track ICE raids was just taken offline. (MIT Technology Review)

5 OpenAI is trialling group chats in ChatGPT
It’d essentially make AI a participant in a conversation of up to 20 people. (Engadget)

6 A TikTok stunt sparked debate over how charitable America’s churches really are
Content creator Nikalie Monroe asked churches for help feeding her baby. Very few stepped up. (WP $)

7 Indian startups are attempting to tackle air pollution
But their solutions are far beyond the means of the average Indian household. (NYT $)
+ OpenAI is huge in India. Its models are steeped in caste bias. (MIT Technology Review)

8 An AI tool could help reduce wasted efforts to transplant organs
It predicts how likely the would-be recipient is to die during the brief transplantation window. (The Guardian)
+ Putin says organ transplants could grant immortality. Not quite. (MIT Technology Review)

9 3D-printing isn’t making prosthetics more affordable
It turns out that plastic prostheses are often really uncomfortable. (IEEE Spectrum)
+ These prosthetics break the mold with third thumbs, spikes, and superhero skins. (MIT Technology Review)

10 What happens when relationships with AI fall apart
Can you really file for divorce from an LLM? (Wired $)
+ It’s surprisingly easy to stumble into a relationship with an AI chatbot. (MIT Technology Review)

Quote of the day

“It’s a funky time.”

—Aileen Lee, founder and managing partner of Cowboy Ventures, tells TechCrunch the AI boom has torn up the traditional investment rulebook.

One more thing

Restoring an ancient lake from the rubble of an unfinished airport in Mexico City

Weeks after Mexican President Andrés Manuel López Obrador took office in 2018, he controversially canceled ambitious plans to build an airport on the deserted site of the former Lake Texcoco—despite the fact it was already around a third complete.

Instead, he tasked Iñaki Echeverria, a Mexican architect and landscape designer, with turning it into a vast urban park, an artificial wetland that aims to transform the future of the entire Valley region.

But as López Obrador’s presidential term nears its end, the plans for Lake Texcoco’s rebirth could yet vanish. Read the full story.

—Matthew Ponsford

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ Maybe Gen Z is onto something when it comes to vibe dating.
+ Trust AC/DC to give the fans what they want, performing Jailbreak for the first time since 1991.
+ Nieves González, the artist behind Lily Allen’s new album cover, has an eye for detail.
+ Here’s what AI determines is a catchy tune.

Natural Toothpaste Propels Wellnesse.com

Seth Spears is a Colorado-based entrepreneur who once taught consumers how to make their own non-toxic personal care products. He says customers valued the results but not the actual production process. “They kept asking us for ready-made versions,” he told me.

So he launched Wellnesse, a direct-to-consumer brand producing all-natural self-care goods, in 2020. Toothpaste quickly became the dominant item.

In our recent conversation, Seth shared the origins of Wellnesse, the demand for holistic oral care, marketing challenges, and more.

Our entire audio is embedded below. The transcript is edited for length and clarity.

Eric Bandholz: Who are you, and what do you do?

Seth Spears: I’m the founder and chief visionary officer of Wellnesse, a B Corporation that produces all-natural personal care products. Our flagship item is a mint-flavored whitening toothpaste, made without toxic ingredients such as fluoride, glycerin, or sodium lauryl sulfate. We believe what goes in or on your mouth affects your entire body, so our focus is on safe, effective alternatives that outperform conventional options.

Our toothpaste’s key ingredient is micro hydroxyapatite, a naturally occurring mineral that makes up your teeth and bones. Unlike fluoride, it helps remineralize and repair enamel, filling soft spots and even reversing minor cavities. We’ve received hundreds of testimonials from customers who’ve seen major improvements in oral health.

We also use extracts from neem, a tree native to India, for whitening, and green tea extract for overall gum and tooth health — ingredients that work synergistically for stronger, cleaner teeth. Many customers with sensitive teeth, often longtime Sensodyne users, report reduced sensitivity and better results after switching to our toothpaste.

Before Wellnesse, I co-founded Wellness Media, a health education company that taught people how to make their own personal care products. Our audience loved the results but didn’t want the hassle of making them, so they kept asking us to sell ready-made versions. As an entrepreneur, I recognized repeated demand as a business opportunity.

We launched Wellnesse in 2020 as a natural personal care brand, starting with toothpaste, shampoo, conditioner, and deodorant. While we still offer all those, oral care quickly became our most successful category and is now our primary focus.

Bandholz: Many consumers are rethinking fluoride and turning to holistic dentistry.

Spears: We work closely with holistic and biological dentists through an advisory board that reviews the latest science on safe, effective oral care. These practitioners reject outdated methods such as routine drilling and fluoride use, instead emphasizing the role of diet, supplements, and the mouth’s natural microorganisms.

We partner with influencers and communities that value non-toxic living. Our customers aren’t looking for the cheapest option; they want products that align with a clean, health-conscious lifestyle. They’ve often dealt with dental or health issues and are now seeking a more advanced, fluoride-free option.

As awareness grows around the connection between lifestyle and oral health, holistic dentistry continues to gain momentum. Consumers are questioning ingredients and demanding transparency.

Bandholz: So you’re growing through these practitioners. How do you find them?

Spears: There’s a strong network of holistic and biological dentists with their own organizations and conferences. We’ve sponsored several of those events in recent years to build relationships and raise awareness of our products.

Many connections also happen organically. When customers mention their holistic dentist, we often ask for introductions. Sometimes those dentists reach out after patients recommend us.

We maintain both affiliate and wholesale programs. Some dentists stock our products, while others prefer to promote them. We provide samples for dentists to share with patients so they can experience the benefits firsthand. This multichannel approach keeps our partnerships authentic.

Bandholz: What marketing tactic is working best in 2025?

Spears: Growth has slowed in 2025. It’s been a challenging year. Meta remains our primary customer-acquisition channel, but performance has declined compared to previous years. We’re still bringing in new customers there, but it’s taking more testing and creativity to find what resonates.

Our most effective Meta approach has been an “us versus them” comparison, showcasing our clean, natural ingredients side by side with those in major brands. It highlights how our formulas are safer and more effective without being confrontational. We avoid targeting specific corporations directly. Procter & Gamble and similar enterprise brands have deep pockets and legal teams, and we’re not looking for that kind of fight.

We’re experimenting with Reddit ads, especially in health and oral care subreddits, as well as some campaigns on X. However, the results have been weaker on those channels. We’re now in full testing mode, trying different angles and messaging. We often focus on ingredient quality, but we also use influencer-style videos featuring real customers.

We had a strong email list (from my Wellness Media company) built through educational content — podcasts, blogs, and tutorials focused on health, vitality, and natural living. We regularly sent newsletters featuring recipes and DIY personal care guides, which helped us cultivate a loyal, informed audience.

When we launched Wellnesse, that list gave us a ready-made customer base. Many of those subscribers prioritized holistic health, and several became affiliates.

The landscape has undergone significant changes since then. Traditional affiliate marketing, based on content sites and email lists, has largely shifted toward influencer marketing on social media. Today’s promotions rely on selfie-style videos and personal testimonials, which feel more authentic to audiences. To me, this trend is too self-focused — but it’s undeniably where attention and conversions are happening.

An agency manages our ad strategy, so my focus is on broader direction and messaging rather than daily campaign tweaks. Overall, there’s no single breakthrough channel at the moment. It’s about constant experimentation and adapting to the changing ad landscape.

Bandholz: I heard that once enamel is gone, you can’t rebuild it. Is that true?

Spears: Not entirely. Teeth consist of hydroxyapatite, so when toothpaste contains that mineral, its tiny particles can penetrate crevices and help remineralize enamel. But oral health isn’t just about brushing; it’s also heavily influenced by diet and mouth acidity.

If you’re consuming a lot of processed or sugary foods or drinking soda, your mouth becomes more acidic, which can lead to cavities. Brushing helps, but it can’t fully offset a poor diet. A nutrient-dense, low-sugar diet rich in protein and vegetables supports stronger teeth and overall health.

I prefer a paleo-style diet — lean meats, fruits, vegetables, nuts — but there’s no one-size-fits-all approach. Everyone’s body chemistry is different. Getting blood work and allergy testing can help you understand your individual needs and optimize both oral and full-body health.

Bandholz: Where can people follow you, reach out to you, buy your products?

Spears: Our site is Wellnesse.com. My personal website is Sethspears.com. We’re on Instagram and Facebook. Find me on LinkedIn.

ChatGPT Outage Affects APIs And File Uploads via @sejournal, @martinibuster

OpenAI is experiencing a widespread outage affecting two systems: its APIs and ChatGPT. The outage has been ongoing for at least half an hour as of publication.

ChatGPT API Jobs Stuck Outage

The first issue is that Batch API jobs get stuck in the finalizing state. OpenAI monitors twelve API components for uptime, and it’s the Batch component that’s experiencing “degraded” performance. The issue has been ongoing since 3:54 PM.

According to OpenAI:

“Subset of Batch API jobs stuck in finalizing state”

ChatGPT Uploads Outage

The other error pertains to ChatGPT file uploads, which are failing. This is described as a partial outage.

OpenAI’s official explanation:

“File uploads to ChatGPT conversations are failing for some users, giving an error message indicating the file has expired.”

This issue has been ongoing since 3:53 PM.

Screenshot of OpenAI Uploads Outage

Data: Translated Sites See 327% More Visibility in AI Overviews

This post was sponsored by Weglot. The opinions expressed in this article are the sponsor’s own.

When Google’s AI Overviews launched in 2024, dozens of questions quickly surfaced among SEO professionals, one being: if AI now curates and summarizes search results, how do websites earn visibility, especially across languages?

Weglot recently conducted a data-driven study, analyzing 1.3 million citations across Google AI Overviews and ChatGPT to determine whether LLMs that cite content in one language also cite it in others.

The result: translated websites saw up to 327% more visibility in AI Overviews than untranslated ones, a clear signal that international SEO is becoming inseparable from AI search.

What’s more, websites with another language available were also more likely to be cited in AI Overviews, regardless of the language the search was made in.

This shift is redefining the rules of visibility. AI Overviews and large language models (LLMs) now mediate how information is discovered. Instead of ranking pages, they “cite” sources in generated responses.

But with that shift comes a new risk: if your website isn’t available in the user’s search language, does AI simply overlook it, or worse, send users to Google Translate’s proxy page instead?

The risk with Google’s Translate proxy is that while it does the translation work for you, you have no control over the translations of your content. Worse still, you don’t get any of the traffic benefits, as users are not directed to your site.

The Study

Here’s how the research worked. To understand how translation affects AI visibility, Weglot focused the research on Spanish-language websites across two markets: Spain and Mexico.

The study was then split into two phases. Phase one focused on websites that weren’t translated, and therefore only displayed the language intended for their market, in this case, Spanish.

In that phase, Weglot looked at 153 websites without English translations: 98 from Spain and 55 from Mexico. Weglot deliberately selected high-traffic sites that offered no English versions.

Phase two involved a comparison group of 83 Spanish and Mexican sites with versions in both Spanish and English. This allowed Weglot to directly compare the performance of translated versus untranslated content.

In total, this generated 22,854 queries in phase one and 12,138 in phase two. The methodology converted the top 50 non-branded keywords of each site into queries that users would likely search, and then these were translated between the Spanish and English versions.

In total, 1.3 million citations were analyzed.

The Key Results

Untranslated Sites Have Very Low AI Search Visibility

The findings show that untranslated websites experience a substantial drop in visibility for searches conducted in non-available languages, despite maintaining strong visibility in the current available language.

Diving deeper into this, untranslated sites essentially lose massive visibility. From the study, even when these Spanish websites performed well in Spanish searches, the sites virtually disappeared in English searches.

Looking at this data further within Google AI Overviews:

  • The sample size of 98 untranslated sites from Spain had 17,094 citations for Spanish queries vs 2,810 citations for the equivalent search in English, a 431% gap in visibility.
  • Taking a look at untranslated sites in Mexico, the study identified a similar pattern: 12,038 citations for Spanish queries vs 3,450 citations for English, a 213% gap in visibility when searching in English.

Even ChatGPT, though slightly more balanced, still favored translated sites, with sites from Spain receiving 3.5% fewer citations in English and Mexican sites receiving 4.9% fewer.

Image created by Weglot, November 2025

Translated Sites Have 327% More AI Search Visibility

But what happens when you do translate your site?

Bringing in the comparison group of Spanish websites that also have an English version, we can see that translated sites dramatically close the visibility gap and that having a second language transformed visibility within Google AI Overviews.

Google AI Overviews:

  • Translated sites in Spain saw 10,046 citations vs 8,048 in English, showcasing only a 22% gap.
  • Translated sites in Mexico showed 5,527 citations for Spanish queries and 3,325 citations for English, and a difference of 59%.

Overall, translated sites achieved 327% more visibility than untranslated ones and earned 24% more total citations per query.

When looking at ChatGPT, the bias almost vanished. Translated sites saw near-equal citations in both languages.

Image created by Weglot, November 2025

Next Steps: Translate Your Site To Boost Global Visibility In AI SERPs

Translation does more than boost visibility: it multiplies it.

Not only does having multiple languages on your site ensure it gets picked up for searches in those languages, it also adds to the overall visibility of your site.

The study found that translated sites perform better across all metrics. The data shows that translated sites received 24% more citations per prompt than untranslated sites.

Looking at this by language, translation resulted in a 33% increase in English citations and a 16% increase in Spanish citations per query.

Weglot’s findings indicate that translation acts as a signal of authority and reliability for AIOs and ChatGPT, boosting citation performance across all languages, not only the ones the content is translated into.

Image created by Weglot, November 2025

AI Search Rewards Translated Content as a Visibility Signal

Traditional international SEO has long focused on hreflang tags and localized keywords. But in the age of AI search, translation itself becomes a visibility signal:

  1. Language alignment: AI engines prioritize content matching the query’s language.
  2. Authority building: Translated content attracts engagement across markets, improving perceived reliability.
  3. Traffic control: Proper translations prevent Google Translate proxies from intercepting clicks.
  4. Semantic reach: Multilingual content broadens your surface area for AI training and citation.

Put simply: If your content isn’t in the language of the question, it’s unlikely it will be in the answer either.

The Business Impact

The consequences aren’t theoretical. One case in Weglot’s dataset, a major Spanish book retailer selling English-language titles worldwide without an English version of its site, shows the impact.

When English speakers searched for relevant books:

  • The site appeared 64% less often in Google AI Overviews and ChatGPT.
  • In 36% of the cases where it did appear, the link pointed to Google Translate’s proxy, not the retailer’s own domain.

Despite offering exactly what English users wanted, the business lost visibility, traffic, and ultimately, sales.

The Bigger Picture: AI Search Is Redefining SEO and Translation Is Now a Growth Strategy

The implications reach far beyond Spain or Mexico, or even the Spanish language.

As AI search evolves, the SEO playbook is expanding. Ranking isn’t just about “position one” anymore; it’s about being cited, summarized, and surfaced by machines trained on multilingual web content.

Weglot’s findings point to a future where translation is both an SEO and an AI strategy and not a localization afterthought.

With Google AIOs now live in multiple languages and ChatGPT integrating real-time web data, multilingual visibility has become an equity issue: sites optimized for one language risk being invisible in another.

Image created by Weglot, November 2025

Final Takeaway: Untranslated Sites Are Invisible in AI Search

The evidence is clear: Untranslated = unseen. Website translation is now one of the biggest levers for AIO visibility.

As AI continues to shape how search engines understand relevance, translation isn’t just about accessibility; it’s how your brand gets recognized by algorithms and audiences alike.

For the easiest way to translate a website, start your free trial now!

Plus, enjoy a 15% discount for 12 months on public plans by using the promo code SEARCH15 on a paid plan purchase.

Image Credits

Featured Image: Image by Weglot. Used with permission.

In-Post Images: Image by Weglot. Used with permission.

SEO Pulse: AI Shopping, GPT-5.1 & EU Pressure On Google via @sejournal, @MattGSouthern

This week’s news in SEO brings changes and questions about control.

Google’s shopping AI moved from showing where to buy to completing purchases directly. Google added structured data for merchant shipping policies. OpenAI released GPT-5.1 with personality controls.

And the EU opened an investigation into Google’s site reputation abuse enforcement. Raising the question, should one gatekeeper control how independent media funds online journalism?

Here’s what you need to know for this week in SEO.

Google’s Shopping AI Completes Transactions Without Your Site

Google rolled out Gemini-powered shopping features that find products, compare prices, and handle checkout across multiple retailers.

AI Mode in Search can now automate checkout on participating merchants’ sites using your saved Google Pay details, so you don’t have to manually fill payment and shipping forms.

For full details, see our coverage: Google Adds AI Shopping Tools Across Search, Gemini.

Key Facts: AI Mode shopping is launching in Search. Agentic checkout works with select retailers. An AI calling feature can confirm stock, price, and availability with local stores. All features are U.S.-only and gradually rolling out.

Why SEOs Should Pay Attention

Google’s moving from showing where to buy things to completing transactions for you. This changes what “search” means for ecommerce sites.

When AI Mode handles checkout across retailers, your website becomes optional infrastructure. Users never see your brand presentation, never encounter your upsells, never make decisions on your pages. Google’s AI extracts the transaction with your site reduced to inventory management.

The local business calling feature shows where this goes. If Gemini calls five restaurants to check availability, users never see your website, reviews, or menu.

The impact goes beyond rankings to the transaction itself. Your SEO strategy is optimized for driving traffic where users make decisions. Google’s building an environment where AI makes those decisions using your business as a data source, not a destination.

Google Adds Structured Data For Merchant Shipping Policies

Google launched support for merchant shipping policy structured data, letting ecommerce sites describe shipping costs, delivery times, and regional availability so they can surface directly in search results.

The markup can appear alongside your products and in relevant knowledge panels for eligible merchants.

You can get the implementation details in our article: Google Launches Structured Data For Merchant Shipping Policies.

Key Facts: Shipping policy structured data appears with eligible merchant listings, with no geographic limits. It supports flat rate, free, and calculated shipping, including delivery times and regional restrictions. Best used with Product structured data for search results. Validation requires Rich Results Test or URL Inspection, as no specific Search Console report exists.

Why SEOs Should Pay Attention

Shipping costs affect purchase decisions before users reach your site. Displaying this information in search results answers a primary objection at the discovery stage.

The markup lets you differentiate on shipping when competitors don’t show it. If you offer free shipping or faster delivery, you can now surface that advantage in search results rather than hoping users click through to find out.

Implementation is straightforward if you already use Product markup. Add the shipping policy structured data to your existing schema and specify rates, zones, and delivery times. This is one of the more actionable structured data updates Google’s released this year.
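For reference, here is a minimal, hedged sketch of the long-established product-level shipping details pattern (OfferShippingDetails nested in an Offer), which is the markup the article suggests pairing with Product structured data. All values are illustrative, and the exact fields for the newer merchant-level shipping policy feature should be checked against Google’s documentation.

```python
# Hedged sketch: product-level shipping details (OfferShippingDetails on an
# Offer) paired with Product markup. Values are illustrative; check Google's
# docs for the exact fields of the newer merchant-level shipping policy markup.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example widget",
    "offers": {
        "@type": "Offer",
        "price": "24.99",
        "priceCurrency": "USD",
        "shippingDetails": {
            "@type": "OfferShippingDetails",
            "shippingRate": {"@type": "MonetaryAmount", "value": "0", "currency": "USD"},
            "shippingDestination": {"@type": "DefinedRegion", "addressCountry": "US"},
            "deliveryTime": {
                "@type": "ShippingDeliveryTime",
                "handlingTime": {"@type": "QuantitativeValue", "minValue": 0, "maxValue": 1, "unitCode": "DAY"},
                "transitTime": {"@type": "QuantitativeValue", "minValue": 1, "maxValue": 5, "unitCode": "DAY"},
            },
        },
    },
}

# Emit JSON-LD ready to drop into a <script type="application/ld+json"> tag,
# then validate it with the Rich Results Test as noted above.
print(json.dumps(product, indent=2))
```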

OpenAI Releases GPT-5.1 With User-Controlled Personality

OpenAI shipped GPT-5.1 models with customizable personality controls and improved instruction following. Users can now adjust ChatGPT’s tone through preset styles or granular characteristic tuning instead of the previous one-size-fits-all approach.

We covered the release here: OpenAI Releases GPT-5.1 With Improved Instruction Following.

Key Facts: GPT-5.1 is now available first to paid users (Pro, Plus, Go, Business), with free access following. Adaptive reasoning optimizes processing time based on query complexity. Legacy GPT-5 models stay available for three months.

Why SEOs Should Pay Attention

You can now customize ChatGPT’s output style to match your needs, rather than editing around the default tone.

The adaptive reasoning means faster responses on simple queries without sacrificing quality on complex requests.

The three-month legacy model availability gives you time to test whether GPT-5.1 performs better for your specific use cases before GPT-5 sunsets.

EU Challenges Google’s Parasite SEO Crackdown

The European Commission opened a Digital Markets Act investigation into whether Google’s site reputation abuse enforcement discriminates against news publishers.

Google published a defense within hours, calling the investigation “misguided” and arguing it protects users from spam. We dug into both sides: Google Defends Parasite SEO Crackdown As EU Opens Investigation.

Key Facts: Publisher groups in France, Germany, Italy, Poland, Spain, and other EU countries report significant traffic drops. DMA violations can incur fines up to 10% of global revenue, rising to 20% for repeat violations.

Why SEOs Should Pay Attention

The investigation exposes the tension publishers face. Google’s definition of “spam” now includes their revenue model.

Publishers aren’t defending payday loan scams on university domains. They’re arguing that sponsored content with editorial oversight shouldn’t be treated the same as affiliate coupon pages designed purely for ranking manipulation.

Google’s position treats business arrangements as ranking signals rather than judging content quality.

If the EU forces exemptions for “legitimate” sponsored content, every spammer will dress up their tactics with editorial oversight theater. The policy only works if it draws lines. But publishers aren’t wrong to question whether one gatekeeper should control how independent media funds journalism.

This Week In SEO: The Balance Of Power Is Shifting

The news from this week tells a bigger story: Search engines are no longer just organizing the web; they’re absorbing it.

The theme of the week? Power and control.

Google’s AI is deciding what users buy and what content deserves to be seen. OpenAI is letting creators shape the AI’s voice for the first time. And regulators are finally asking who gets to define “fair” visibility online.

As AI reshapes discovery, SEOs face the challenge of staying visible in a world where the search interface itself has become the destination.



Featured Image: Pixel-Shot/Shutterstock

PPC Pulse: PMax Expands, Clarity Now Mandatory & AI Max Data Debate via @sejournal, @brookeosmundson

This week, the paid media world focused less on new tools and more on what’s changing beneath the surface.

Google expanded Performance Max into a new channel and offered long-awaited reporting visibility. Microsoft took a firm stance on brand safety by requiring Clarity across its publisher network. And one viral LinkedIn post questioned the effectiveness of Google’s newest “AI-powered” campaign model.

Each of these stories points to the same theme: Platforms are redefining what control and accountability mean for advertisers.

Performance Max Expands To Waze And Adds Channel Reporting

Google confirmed two changes for Performance Max campaigns.

The first notable update is that for PMax campaigns using “Store Visits” as a campaign goal, your business can now appear in Waze ads inventory as a “Promoted Places in Navigation” pin.

This update is for all advertisers in the United States, and no additional setup is required.

The second update is that Google rolled out Channel Reporting for all PMax campaigns. While this has been rolling out for a few months now, not every advertiser had this available.

Why Advertisers Should Pay Attention

Local intent now includes the navigation moment. If your brand depends on foot traffic, showing up while someone is driving near a location adds a fresh, real-world touchpoint.

The channel reporting update matters just as much. It helps shift PMax conversations from “trust the system” to “here’s where the system actually worked.”

In my opinion, this is progress on transparency and reach. It also adds variables you’ll be asked to explain.

The win isn’t “more placements.” The win is being able to connect surfaces to outcomes with fewer leaps of faith.

Microsoft Clarity Now Mandatory For Third-Party Publishers

Microsoft Ads Liaison, Navah Hopkins, shared an important announcement for all 3P publishers on Microsoft:

Screenshot taken by author, November 2025

In her post, she mentions that any pages receiving Microsoft Ads clicks need to have Microsoft Clarity enabled.

Her post got attention from the PPC industry, and she clarified in the comments that an official announcement from Microsoft would be coming out shortly. All Microsoft Ads partners have already been notified via email.

The post also sparked some questions and potential confusion about whether Microsoft Ads clicks would not be billed if pages weren’t running Clarity.

Andy Hawes asked:

Thanks for this Navah Hopkins, but when you say “Any Microsoft Advertising clicks that do not have Clarity will be filtered out and result in nonbillable impressions/clicks.” Are you suggesting that if you don’t run clarity then you’re Microsoft Ads won’t cost anything? I’m assuming that is not the case? So could you explain that part please?

Hopkins clarified during the exchange:

Screenshot taken by author, November 2025

Why Advertisers Should Pay Attention

Microsoft seems to be taking a quality stance, not just making a tracking footnote.

Based on the conversation on LinkedIn, Microsoft is tying billable media to verifiable on-site experience. In theory, that should reduce questionable placements and give brands greater confidence that their ads appear in environments that meet baseline standards.

I see this as Microsoft trading raw reach for higher trust. Advertisers should expect fewer gray-area placements and stronger conversations with brand-safety teams.

It also nudges the market toward a new normal where “transparency” includes a window into on-site behavior, not just a placement report.

The Industry Reacts To AI Max Performance Data

AI Max was another hot topic on LinkedIn this past week.

Xavier Mantica shared four months of results comparing AI Max to traditional match types.

Screenshot taken by author, November 2025

His data showed AI Max at $100.37 per conversion versus $43.97-$61.65 for most non-AI setups (and $97.67 for phrase close variants). His view: AI Max behaves like broad match with a new label, expanding beyond intended relevance and driving up cost.

As of this writing, the post has 991 engagements with over 170 comments from the PPC industry.

How Advertisers Are Reacting

Looking at the comments, it appears that many PPC pros agree that AI Max isn’t living up to the hype that Google made it out to be when originally announced.

Collin Slatterly, Founder of Taikun, expressed skeptical optimism, not dismissing AI Max entirely but suggesting it may just not be ready to reach its full potential yet:

Give it a year, and it’ll probably be ready to deploy. Feels like PMax all over again.

One of the top comments on Xavier’s post came from Mike Ryan, who agreed after analyzing 250 campaigns of his own:

Screenshot taken by author, November 2025

Others in the comments had the opposite take from Xavier’s. Denis Capko replied, stating:

Screenshot taken by author, November 2025

Why Advertisers Should Pay Attention

This debate goes beyond one account. It reflects a wider tension between volume and control.

“AI increases conversions” is only persuasive if cost, relevance, and repeatability hold up under scrutiny.

While the comments skewed negative on AI Max, to me it feels more like growing pains than failure.

Automation continues to move faster than the frameworks we use to evaluate it, and advertisers are still learning how to guide it effectively.

When data quality, conversion accuracy, and negative signals are strong, AI Max can deliver meaningful scale. But without clear visibility into how the system interprets intent, results can vary widely.

Posts like Xavier’s highlight the need for transparency as much as performance. Google also benefits from that same openness: It builds trust, helps advertisers use automation more responsibly, and ultimately makes the technology stronger for everyone.

Theme Of The Week: Accountability

The updates and discussions this past week all share one thread: accountability.

Google is expanding where automation can go, Microsoft is tightening the standards for who gets to monetize it, and advertisers are rethinking how much control they’re willing to trade for convenience.

As platforms lean further into automation, the real advantage won’t come from who adopts it first. It will come from who understands it best.

Are you confident in what your automation is doing, or just comfortable letting it run?



Featured Image: Roman Samborskyi/Shutterstock

How To Make Search Console Work Harder For You

TL;DR

  1. Search Console has some pretty severe limitations when it comes to storage, anonymized and incomplete data, and API limits.
  2. You can bypass a lot of these limitations and make GSC work much harder for you by setting up far more properties at a subfolder level.
  3. You can have up to 1,000 properties in your Search Console account. Don’t stop with one domain-level property.
  4. All of this allows for far richer indexation, query, and page-level analysis. All for free. Particularly if you make use of the API’s 2,000-URLs-per-day indexing cap, which applies per property.
Image Credit: Harry Clarkson-Bennett

Now, this is mainly applicable to enterprise sites. Sites with a deep subfolder structure and a rich history of publishing a lot of content. Technically, this isn’t publisher-specific. If you work for an ecommerce brand, this should be incredibly useful, too.

I and it love all big and clunky sites equally.

What Is A Search Console Property?

A Search Console Property is a domain, subfolder, or subdomain variation of a website you can prove that you own.

You can set up domain-level or URL-prefix-level properties (Image Credit: Harry Clarkson-Bennett)

If you just set up a domain-level property, you still get access to all the good stuff GSC offers. Click and impression data, indexation analysis, and the crawl stats report (only available in domain-level properties), to name a few. But you’re hampered by some pretty severe limitations:

  • 1,000 rows of query and page-level data.
  • 2,000 URL API limit for indexation level analysis each day.
  • Sampled keyword data (and privacy masking).
  • Missing data (in some cases, 70% or more).
  • 16 months of data.

While the 16-month limit and sampled keyword data require you to export your data to BigQuery (or use one of the tools below), you can massively improve your GSC experience by making better use of properties.

There are a number of verification methods available – DNS verification, HTML tag or file upload, Google Analytics tracking code. Once you have set up and verified a domain-level property, you’re free to add any child-level property. Subdomains or subfolders alike.

The crawl stats report can be an absolute goldmine, particularly for large sites (not this one!) (Image Credit: Harry Clarkson-Bennett)

The crawl stats report can be extremely useful for debugging issues like spikes in parameter URLs or from naughty subdomains. Particularly on large sites where departments do things you and I don’t find out about until it’s too late.

But by breaking down changes at a host, file type, and response code level, you can stop things at the source. Easily identify issues affecting your crawl budget before you want to hit someone over the head with their approach to internal linking and parameter URLs.

Usually, anyway. Sometimes people just need a good clump. Metaphorically speaking, of course.

Subdomains are usually seen as separate entities with their own crawl budget. However, this isn’t always the case. According to John Mueller, it is possible that Google may group your subdomains together for crawl budget purposes.

According to Gary Illyes, crawl budget is typically set by host name. So subdomains should have their own crawl budget if the host name is separate from the main domain.

How Can I Identify The Right Properties?

As an SEO, it’s your job to know the website better than anybody else. In most cases, that isn’t too hard because you work with digital ignoramuses. Usually, you can just find this data in GSC. But larger sites need a little more love.

Crawl your site using Screaming Frog, Sitebulb, the artist formerly known as Deepcrawl, and build out a picture of your site structure if you don’t already know. Add the most valuable properties first (revenue first, traffic second) and work from there.
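Once you have that list, you can register the subfolder properties in bulk rather than clicking through the UI. The snippet below is a hypothetical sketch using the Search Console API’s sites.add method via the google-api-python-client library; it assumes a service account that has already been granted access to the verified domain property, and the credentials file name, domain, and subfolder list are illustrative.

```python
# Hypothetical sketch: register subfolder (URL-prefix) properties in bulk via
# the Search Console API. Assumes a service account already added as a user
# on the verified domain property; file name and subfolders are illustrative.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters"],
)
gsc = build("searchconsole", "v1", credentials=creds)

# Subfolders identified from your crawl, most valuable first (illustrative).
subfolders = ["/news/", "/sport/", "/reviews/", "/travel/"]

for path in subfolders:
    prop = f"https://www.example.com{path}"
    # sites().add() registers the URL-prefix property; verification is
    # typically inherited from the verified parent domain property.
    gsc.sites().add(siteUrl=prop).execute()
    print("Added property:", prop)

# Confirm what this account can now see.
for site in gsc.sites().list().execute().get("siteEntry", []):
    print(site["siteUrl"], "-", site["permissionLevel"])
```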

Some Alternatives To GSC

Before going any further, it would be remiss of me not to mention some excellent alternatives to GSC. Alternatives that completely remove these limitations for you.

SEO Stack

SEO Stack is a fantastic tool that removes all query limits and has an in-built MCP-style setup where you can really talk to your data. For example: show me content that has always performed well in September, or identify pages with a healthy query counting profile.

Daniel has been very vocal about query counting, and it’s a fantastic way to understand the direction of travel your site or content is taking in search. Queries climbing into the top 3 or top 10 positions – good. Queries dropping out of those positions and growing further down the rankings – bad.

SEO Gets

SEO Gets is a more budget-friendly alternative to SEO Stack (which in itself isn’t that expensive). SEO Gets also removes the standard row limitations associated with Search Console and makes content analysis much more efficient.

Growing and decaying pages and queries in SEO Gets are super useful (Image Credit: Harry Clarkson-Bennett)

Create keyword and page groups for query counting and click and impression analysis at a content cluster level. SEO Gets has arguably the best free version of any tool on the market.

Indexing Insight

Indexing Insight – Adam Gent’s ultra-detailed indexation analysis tool – is a lifesaver for large, sprawling websites. 2,000 URLs per day just doesn’t cut the mustard for enterprise sites. But by cleverly taking the multi-property approach, you can leverage 2,000 URLs per property.

With some excellent visualizations and datapoints (did you know if a URL hasn’t been crawled for 130 days, it drops out of the index?), you need a solution like this. Particularly on legacy and enterprise sites.

Remove the indexation limits of 2,000 URLs per day with the API and the 1,000 row URL limit (Image Credit: Harry Clarkson-Bennett)

All of these tools instantly improve your Search Console experience.

Benefits Of A Multi-Property Approach

Arguably, the most effective way of getting around some of the aforementioned issues is to scale the number of properties you own. For two main reasons – it’s free and it gets around core API limitations.

Everyone likes free stuff. I once walked past a newsagent doing an opening day promotion where they were giving away tins of chopped tomatoes. Which was bizarre. What was more bizarre was that there was a queue. A queue I ended up joining.

Spaghetti Bolognese has never tasted so sweet.

Granular Indexation Tracking

Arguably, one of Search Console’s best but most limiting features is its indexation analysis. Understanding the differences between Crawled – Currently Not Indexed and Discovered – Currently Not Indexed can help you make smart decisions that improve the efficiency of your site. Significantly improving your crawl budget and internal linking strategies.

Image Credit: Harry Clarkson-Bennett

Pages that sit in the Crawled – Currently Not Indexed pipeline may not require any immediate action. The page has been crawled, but hasn’t been deemed fit for Google’s index. This could signify page quality issues, so worth ensuring your content is adding value and your internal linking prioritizes important pages.

Discovered – Currently Not Indexed is slightly different. It means that Google has found the URL, but hasn’t yet crawled it. It could be that your content output isn’t quite on par with Google’s perceived value of your site. Or that your internal linking structure needs to better prioritize important content. Or some kind of server or technical issue.

All of this requires at least a rudimentary understanding of how Google’s indexation pipeline works. It is not a binary approach. Gary Illyes said Google has a tiered system of indexation. Content that needs to be served more frequently is stored in a better-quality, more expensive system. Less valuable content is stored in a less expensive system.

How Google crawling and rendering system works (Image Credit: Harry Clarkson-Bennett)

Less monkey see, monkey do; more monkey see, monkey make decision based on the site’s value, crawl budget, efficiency, server load, and use of JavaScript.

The tiered approach to indexation prioritizes the perceived value and raw HTML of a page. JavaScript is queued because it is so much more resource-intensive. Hence why SEOs bang on about having your content rendered on the server side.

Adam has a very good guide to the types of not indexed pages in GSC and what they mean here.

Worth noting the page indexation tool isn’t completely up to date. I believe it’s updated a couple of times a week. But I can’t remember where I got that information, so don’t hold me to that…

If you’re a big news publisher you’ll see lots of your newsier content in the Crawled – Currently Not Indexed category. But when you inspect the URL (which you absolutely should do) it might be indexed. There is a delay.

Indexing API Scalability

When you start working on larger websites – and I am talking about websites where subfolders have well over 500,000 pages – the API’s 2,000 URL limitation becomes apparent. You just cannot effectively identify pages that drop in and out of the “Why Pages Aren’t Indexed?” section.

Not great, have seen worse (Image Credit: Harry Clarkson-Bennett)

But when you set up multiple properties, you can scale effectively.

The 2,000 limit only applies at a property level. So if you set up a domain-level property alongside 20 other properties (at the subfolder level), you can leverage up to 42,000 URLs per day. The more you do, the better.

And the API does have some other benefits, too.

But it doesn’t guarantee indexing. It is a request, not a command.

To set it up, you need to enable the API in Google Cloud Console. You can follow this semi-helpful quickstart guide. It is not fun. It is a pain in the arse. But it is worth it. Then you’ll need a Python script to send API requests and to monitor API quotas and responses (2xx, 3xx, 4xx, etc.).
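As a starting point, something along these lines could work. It is a hypothetical sketch that calls the URL Inspection API (urlInspection.index.inspect) through the google-api-python-client library and routes each URL to its most specific subfolder property, so every property’s daily quota gets used. The credentials file, property names, and URLs are all illustrative.

```python
# Hypothetical sketch: spread URL Inspection API calls across subfolder-level
# properties so each property's ~2,000 requests/day quota is used, rather than
# pushing everything through one property. Names and URLs are illustrative.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
gsc = build("searchconsole", "v1", credentials=creds)

properties = [
    "https://www.example.com/news/",
    "https://www.example.com/sport/",
    "https://www.example.com/",  # fallback URL-prefix property
]

def pick_property(url: str) -> str:
    """Return the most specific (longest) property that prefixes the URL."""
    return max((p for p in properties if url.startswith(p)), key=len)

urls_to_check = [
    "https://www.example.com/news/some-article/",
    "https://www.example.com/sport/match-report/",
]

for url in urls_to_check:
    body = {"inspectionUrl": url, "siteUrl": pick_property(url)}
    result = gsc.urlInspection().index().inspect(body=body).execute()
    status = result["inspectionResult"]["indexStatusResult"]
    # coverageState mirrors the GSC report, e.g. "Submitted and indexed"
    # or "Crawled - currently not indexed".
    print(url, "->", status.get("coverageState"),
          "| last crawl:", status.get("lastCrawlTime"))
```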

If you want to get fancy, you can combine it with your publishing data to figure out exactly how long pages in specific sections take to get indexed. And you should always want to get fancy.

This is a really good signal as to what your most important subfolders are in Google’s eyes, too. Performant vs. under-performing categories.

Granular Click And Impression Data

An essential for large sites. Not only does the default Search Console store just 1,000 rows of query and URL data, it only keeps that data for 16 months. While that sounds like a long time, fast forward a year or two, and you will wish you had started storing the data in BigQuery.

Particularly when it comes to looking at YoY click behavior and event planning. The teeth grinding alone will pay for your dentist’s annual trip to Aruba.

But far and away the easiest way to see search data at a more granular level is to create more GSC properties. Each property still has the same query and URL limits, but because you have multiple properties instead of one, those limits become far less restrictive.
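If you do want to pull that granular data programmatically, the hypothetical sketch below queries the Search Analytics API (searchanalytics.query) per property with query and page dimensions, paging through results 25,000 rows at a time, which is how you get past the 1,000-row view in the UI. The credentials file, property list, and date range are illustrative.

```python
# Hypothetical sketch: pull query + page data from each subfolder property via
# the Search Analytics API, paginating 25,000 rows at a time. Properties and
# dates are illustrative.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
gsc = build("searchconsole", "v1", credentials=creds)

properties = ["https://www.example.com/news/", "https://www.example.com/sport/"]

def fetch_rows(prop: str, start_date: str, end_date: str):
    """Yield every query/page row for one property, one API page at a time."""
    start_row = 0
    while True:
        body = {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["query", "page"],
            "rowLimit": 25000,
            "startRow": start_row,
        }
        resp = gsc.searchanalytics().query(siteUrl=prop, body=body).execute()
        rows = resp.get("rows", [])
        if not rows:
            break
        yield from rows
        start_row += len(rows)

for prop in properties:
    for row in fetch_rows(prop, "2025-10-01", "2025-10-31"):
        query, page = row["keys"]
        print(prop, query, page, row["clicks"], row["impressions"])
```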

What About Sitemaps?

Not directly related to GSC indexation, but a point of note. Sitemaps are not a particularly strong tool in your arsenal when it comes to encouraging indexing of content. The indexation of content is driven by how “helpful” it is to users.

Now, it would be remiss of me not to highlight that news sitemaps are slightly different. When speed to publish and indexation are so important, you want to highlight your freshest articles in a ratified place.

Ultimately, it comes down to Navboost. Good vs. bad clicks and the last longest click. Or in more of a news sense, Glue – a huge table of user interactions, designed to rank fresh content in real-time and keep the index dynamic. Indexation is driven by your content being valuable enough to users for Google to continue to store in its index.

Glue emphasizes immediate interaction signals like hovers and swipes for more instant feedback (Image Credit: Harry Clarkson-Bennett)

Thanks to decades of experience (and confirmation via the DoJ trial and the Google Leak), we know that your site’s authority (Q*), impact over time, and internal linking structure all play a key role. But once it’s indexed, it’s all about user engagement. Sitemap or no sitemap, you can’t force people to love your beige, miserable content.

And Sitemap Indexes?

Most larger sites use sitemap indexes. Essentially, a sitemap of sitemaps to manage larger websites that exceed the 50,000-URL-per-sitemap limit. When you upload the sitemap index to Search Console, don’t stop there. Upload every individual sitemap in your sitemap index.

This gives you access to indexation at a sitemap level in the page indexing or sitemaps report. Something that is much harder to manage when you have millions of pages in a sitemap index.
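Submitting dozens of child sitemaps by hand is tedious, so here is a hypothetical sketch that reads a sitemap index, submits each child sitemap via the Search Console API’s sitemaps.submit method, and then lists what the property reports per sitemap. The sitemap index URL, property, and credentials file are illustrative.

```python
# Hypothetical sketch: read a sitemap index, submit every child sitemap to
# Search Console individually, then list what GSC reports per sitemap.
# The index URL, property, and credentials file are illustrative.
import urllib.request
import xml.etree.ElementTree as ET

from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters"],
)
gsc = build("searchconsole", "v1", credentials=creds)

SITE = "https://www.example.com/"
SITEMAP_INDEX = "https://www.example.com/sitemap_index.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_INDEX) as resp:
    root = ET.fromstring(resp.read())

# <sitemap><loc>...</loc></sitemap> entries in the index.
child_sitemaps = [loc.text.strip() for loc in root.findall(".//sm:sitemap/sm:loc", NS)]

for sitemap_url in child_sitemaps:
    gsc.sitemaps().submit(siteUrl=SITE, feedpath=sitemap_url).execute()
    print("Submitted:", sitemap_url)

# Check what Search Console now reports per sitemap.
for entry in gsc.sitemaps().list(siteUrl=SITE).execute().get("sitemap", []):
    print(entry["path"], "| pending:", entry.get("isPending"))
```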

Seeing data at a sitemap level gives more granular indexation data in GSC (Image Credit: Harry Clarkson-Bennett)

Take the same approach with sitemaps as we have discussed with properties. More is generally better.

Worth knowing that each document is also given a DocID. The DocID stores signals to score the page’s popularity: user clicks, its quality and authoritativeness, crawl data, and a spam score, among others.

Anything classified as crucial to ranking a page is stored and used for indexation and ranking purposes.

What Should I Do Next?

  1. Assess your current GSC setup – is it working hard enough for you?
  2. Do you have access to a domain-level property and a crawl stats report?
  3. Have you already broken your site down into “properties” in GSC?
  4. If not, crawl your site and establish the subfolders you want to add.
  5. Review your sitemap setup. Do you just have a sitemap index? Have you added the individual sitemaps to GSC, too?
  6. Consider connecting your data to BigQuery and storing more than 16 months of it.
  7. Consider connecting to the API via Google Cloud Console.
  8. Review the above tools and see if they’d add value.

Ultimately, Search Console is very useful. But it has significant limitations, and to be fair, it is free. Other tools have surpassed it in many ways. But if nothing else, you should make it work as hard as possible.



This post was originally published on Leadership in SEO.


Featured Image: N Universe/Shutterstock

Is Google About To Go Full AI Mode? via @sejournal, @wburton27

AI search is rapidly changing the way people discover content and engage with brands. Logan Kilpatrick, Google’s lead product manager for AI products, suggested in an X (formerly Twitter) post that “AI Mode” will become the default for Google Search “soon,” and later clarified his statements. But if that does happen, what could it mean for the SEO industry, especially with over 100 million monthly active users already searching in AI Mode, according to Google?

Screenshot from X (Twitter), November 2025

Let’s explore some possibilities. But before we do, let’s distinguish between AI Mode and AI Overviews, as there is a clear difference between the two.

AI Overviews Vs. AI Mode

AI Overviews are short, AI-generated summaries that appear above traditional search results for some queries, helping users find information quickly. AIOs provide quick, concise answers and save users time by reducing the need to click on links, which in turn reduces clicks and traffic to brands.

AI Mode is a more advanced, interactive search experience that might replace the standard search results page in the future. It handles complex, multi-step, or open-ended questions with a more comprehensive, conversational AI-powered response that you can follow up on to learn more. Google added “AI Mode” to its search page earlier this year, looking to keep its millions of users from drifting away to other AI models.

What Could Potentially Happen If AI Mode Becomes Default?

If Google decides to switch to AI Mode by default, brands will almost certainly see a decrease in organic traffic, since users will get direct answers to their queries and won’t need to click through to websites; they will find what they need right in AI Mode. We are already seeing this trend with AI Overviews, but if AI Mode becomes the default, clicks will fall even further.

Brands May Rely More On Paid Ads For Visibility

Currently, the way AI Mode is designed, there are no ads and no way for Google to monetize the interface, but that is all about to change, and change extremely fast. Google’s head of Search, Liz Reid, shared a look into how the company is navigating its transition into the AI era – and how it’s thinking about keeping its multibillion-dollar ad business alive. In 2024, Google made $264.59 billion in ad revenue, according to Statista, and that figure has been growing year over year.

Screenshot from Statista, November 2025

Google is beginning to roll out ads in AI Mode, but the effort is in its infancy. Google is looking into showing ads when they’re high-quality and relevant. Queries in AI Mode are 2x to 3x longer than they are in main search, which means Google can serve better-targeted, higher-quality ads, according to Liz Reid. Brands that can afford to be visible in AI Mode’s paid results will benefit from that visibility, but brands that focus only on traditional SEO tactics and strategies could be left behind.

Google has also added advertisements to AI Overviews, increasing Search ad sales, so we can expect the same from AI Mode.

A Potential Shift In Visibility And Discovery

AI search is pushing us away from traditional SEO metrics, such as keyword rankings and click-through rates, toward brand visibility and relevance. Your brand should be cited as the authoritative source for AI answers; if it is not visible as the answer, you will lose even more clicks.

Measurement

Tracking the customer journey may become harder because users interact within the AI interface rather than on your brand’s website. Traditional analytics will provide fewer insights, forcing brands to develop new metrics focused on AI citations, brand mentions, and local visibility. We are already seeing this with the emergence of AI tools, from traditional players like Semrush and Ahrefs to new AI players like PeecAI and Profound, to name a few.

Loss Of Control Over Brand Narrative

Since AI Overviews pull information from various online sources to build a brand’s presence, a brand with a weak brand strategy and inconsistent, outdated, or poorly managed information across the web (reviews, social signals, local listings, and so on) risks being represented inaccurately by AI.

What Could Potentially Happen To Google Chrome?

If Google does go to full AI Mode by default, Chrome could potentially undergo a major transformation, with deep integration of Gemini and other AI capabilities turning the browser from a passive tool into a proactive, intelligent assistant. According to eMarketer, Gemini is growing its user base faster than ChatGPT.

Screenshot from eMarketer, November 2025

OpenAI has already launched its AI browser, ChatGPT Atlas, which is currently only available on macOS and is challenging Google Chrome.

Screenshot from ChatGPT Atlas, November 2025

If Google Does Make AI Mode Default, What Can We Do?

  • Experiment with paid ads in AI Mode when they become available: put aside some budget and test their impact and return on investment (ROI).
  • Focus on making sure conversion funnels and processes are easy and provide a good user experience.
  • Be present and have great content everywhere your audience is. Your brand needs a strong presence across Reddit, Quora, YouTube, OpenAI, Perplexity, and other places where end users look for information about your brand. For example, Apple is exploring alternative search options for Safari, which could end its partnership with Google; whether Google keeps that relationship or Apple turns to a provider like OpenAI could push more traffic and users toward OpenAI or another large language model (LLM).
  • Continue to optimize for AIO by creating high-quality, authoritative content that directly answers user questions, is well structured, and is easy for AI to understand (see the sketch after this list). This involves creating new content and refreshing your old content with up-to-date research, original information, and different perspectives.
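One common way to make a page’s key facts easier for machines, including AI systems, to parse is schema.org structured data. This is a hedged illustration rather than guidance from the author: the sketch below simply emits a basic Article JSON-LD block, and the headline, names, dates, and URL are placeholders.

```python
# Sketch: generate a basic schema.org Article JSON-LD block.
import json

article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "What AI Mode Could Mean For Search",      # placeholder
    "author": {"@type": "Person", "name": "Jane Doe"},      # placeholder
    "publisher": {"@type": "Organization", "name": "Example Brand"},
    "datePublished": "2025-11-01",
    "dateModified": "2025-11-20",
    "mainEntityOfPage": "https://www.example.com/ai-mode-guide",
}

# The output would be placed in a <script type="application/ld+json"> tag on the page.
print(json.dumps(article_jsonld, indent=2))
```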

Wrapping Up

The shift toward AI-powered search isn’t hypothetical anymore; it’s actually here and moving fast. With AI Overviews and AI Mode gaining traction among more than 100 million monthly users, Google is positioning itself for a future where conversational, answer-focused experiences may replace traditional search results.

If AI Mode becomes the default search for Google, it won’t just change how users search; it will fundamentally reshape how brands earn visibility, traffic, and trust online.

For brands, publishers, and SEOs, this transition presents both risks and opportunities. Organic traffic will almost certainly decline as more answers stay within Google’s ecosystem. Paid visibility in AI results will grow rapidly, favoring brands with budgets and adaptable strategies. And success will depend less on ranking for keywords and more on becoming a trusted source that AI cites, references, or recommends across platforms.

This era will demand a new kind of optimization centered on brand authority, AI citations, structured data, user trust signals, and multi-platform presence.

No one has a crystal ball showing what a full AI Mode future looks like, but brands that adapt early will be the market share leaders, and those that wait will lose visibility, traffic, and relevance.



Featured Image: Collagery/Shutterstock

Google is still aiming for its “moonshot” 2030 energy goals

Last week, we hosted EmTech MIT, MIT Technology Review’s annual flagship conference in Cambridge, Massachusetts. Over the course of three days of main-stage sessions, I learned about innovations in AI, biotech, and robotics. 

But as you might imagine, some of this climate reporter’s favorite moments came in the climate sessions. I was listening especially closely to my colleague James Temple’s discussion with Lucia Tian, head of advanced energy technologies at Google. 

They spoke about the tech giant’s growing energy demand and the sorts of technologies the company is looking to in order to help meet it. In case you weren’t able to join us, let’s dig into that session and consider how the company is thinking about energy in the face of AI’s rapid rise.

I’ve been closely following Google’s work in energy this year. Like the rest of the tech industry, the company is seeing ballooning electricity demand in its data centers. That could get in the way of a major goal that Google has been talking about for years. 

See, back in 2020, the company announced an ambitious target: by 2030, it aimed to run on carbon-free energy 24-7. Basically, that means Google would purchase enough carbon-free electricity on the grids where it operates to meet its entire electricity demand, and the purchases would be matched hour by hour, so the electricity would have to be generated at the times the company was actually using it. (For more on the nuances of Big Tech’s renewable-energy pledges, check out James’s piece from last year.)
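As a rough illustration of why hourly matching is harder than simple annual matching, here is a tiny sketch of an hourly carbon-free energy score in the spirit of the goal described above. The numbers are invented, and this is a simplification for intuition, not Google’s actual accounting methodology.

```python
# Toy example: hourly carbon-free energy (CFE) matching vs. annual matching.
# All numbers are invented for illustration.
hourly_demand_mwh = [90, 100, 110, 120]   # data center load, hour by hour
hourly_cfe_mwh    = [150, 80, 50, 140]    # matched carbon-free generation in the same hours

# Annual-style matching: total carbon-free purchases cover total demand.
annual_match = sum(hourly_cfe_mwh) / sum(hourly_demand_mwh)

# Hourly ("24-7") matching: surplus in one hour can't offset a shortfall in another.
hourly_match = sum(min(d, c) for d, c in zip(hourly_demand_mwh, hourly_cfe_mwh)) / sum(hourly_demand_mwh)

print(f"Annual-style match: {annual_match:.0%}")  # 100% in this toy example
print(f"Hourly 24-7 match:  {hourly_match:.0%}")  # lower, because of the shortfall hours
```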

Google’s is an ambitious goal, and on stage, Tian said that the company is still aiming for it but acknowledged that it’s looking tough with the rise of AI. 

“It was always a moonshot,” she said. “It’s something very, very hard to achieve, and it’s only harder in the face of this growth. But our perspective is, if we don’t move in that direction, we’ll never get there.”

Google’s total electricity demand more than doubled from 2020 to 2024, according to its latest Environmental Report. As for that goal of 24-7 carbon-free energy? The company is basically treading water. While it was at 67% for its data centers in 2020, last year it came in at 66%. 

Not going backwards is something of an accomplishment, given the rapid growth in electricity demand. But it still leaves the company some distance away from its finish line.

To close the gap, Google has been signing what feels like constant deals in the energy space. Two recent announcements that Tian talked about on stage were a project involving carbon capture and storage at a natural-gas plant in Illinois and plans to reopen a shuttered nuclear power plant in Iowa. 

Let’s start with carbon capture. Google signed an agreement to purchase most of the electricity from a new natural-gas plant, which will capture and store about 90% of its carbon dioxide emissions. 

That announcement was controversial, with critics arguing that carbon capture keeps fossil-fuel infrastructure online longer and still releases greenhouse gases and other pollutants into the atmosphere. 

One question that James raised on stage: Why build a new natural-gas plant rather than add equipment to an already existing facility? Tacking on equipment to an operational plant would mean cutting emissions from the status quo, rather than adding entirely new fossil-fuel infrastructure. 

The company did consider many existing plants, Tian said. But, as she put it, “Retrofits aren’t going to make sense everywhere.” Space can be limited at existing plants, for example, and many may not have the right geology to store carbon dioxide underground. 

“We wanted to lead with a project that could prove this technology at scale,” Tian said. This site has an operational Class VI well, the type used for permanent sequestration, she added, and it also doesn’t require a big pipeline buildout. 

Tian also touched on the company’s recent announcement that it’s collaborating with NextEra Energy to reopen Duane Arnold Energy Center, a nuclear power plant in Iowa. The company will purchase electricity from that plant, which is scheduled to reopen in 2029. 

As I covered in a story earlier this year, Duane Arnold was basically the final option in the US for companies looking to reopen shuttered nuclear power plants. “Just a few years back, we were still closing down nuclear plants in this country,” Tian said on stage. 

While each reopening will look a little different, Tian highlighted the groups working to restart the Palisades plant in Michigan, which was the first reopening to be announced, last spring. “They’re the real heroes of the story,” she said.

I’m always interested to get a peek behind the curtain at how Big Tech is thinking about energy. I’m skeptical but certainly interested to see how Google’s, and the rest of the industry’s, goals shape up over the next few years. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.