Just announced at Google Marketing Live, Google is launching Smart Bidding Exploration, a new opt-in feature designed to help advertisers capture more conversions from their existing campaigns.
This update marks one of the most significant changes to Google Ads bidding in over a decade.
This isn’t a cosmetic update or a tweak to an existing bidding model.
It’s a fundamental shift in how Google allows advertisers to find value in queries they’ve likely been overlooking.
If you’re focused on maximizing ROAS or sticking tightly to past performance data, this is one update worth paying attention to.
How Does Smart Bidding Exploration Work?
Smart Bidding Exploration works within the bounds of your existing campaign structure.
It doesn’t expand your audience targeting or broaden your keyword strategy (no pun intended).
Instead, it allows the bidding algorithm to more aggressively pursue opportunities you were eligible for, particularly on Broad match and Dynamic Search Ads (DSA) campaigns.
But there is a catch: you’ll need to allow some flexibility in your ROAS targets to use it.
Advertisers can opt into Smart Bidding Exploration by giving Google permission to bid below their typical ROAS threshold, generally in the 10-30% range.
That means Google may raise your bids on certain queries if its AI systems determine those queries could convert at a healthy volume and cost.
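To make the math concrete, here is a minimal sketch of how that flexibility translates into an exploration floor; the target ROAS and the 20% flexibility figure are hypothetical examples, not Google defaults.

```python
# Illustrative only: how a ROAS flexibility allowance translates into an
# exploration bid floor. The numbers are hypothetical, not Google defaults.
target_roas = 4.0       # a 400% target ROAS
flexibility = 0.20      # opting in to 20% flexibility (within the 10-30% range)

exploration_floor = target_roas * (1 - flexibility)
print(f"Normal bidding aims for at least {target_roas:.0%} ROAS")
print(f"Exploration may accept queries down to roughly {exploration_floor:.0%} ROAS")
```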
Smart Bidding Exploration is a different approach from simply adjusting ROAS targets across the board at the campaign level. In fact, constantly adjusting ROAS targets could cause more volatility in performance instead of improving it.
Instead, Smart Bidding Exploration fine-tunes bidding for queries that would otherwise be filtered out.
What You Can Expect From Reporting
While advertisers won’t see a detailed breakout of every new search query due to privacy thresholds, Google is giving visibility into the impact of Smart Bidding Exploration through the Bid Strategy report.
You’ll be able to track:
The number of unique search categories generating impressions and conversions
How much traffic came from these categories
The volume of new conversions compared to your baseline
While the reporting is currently aggregated, Google says more granular visibility is on the roadmap.
The feature is also compatible with Drafts & Experiments, so you can run clean A/B tests to isolate results.
Support for Portfolio Bid Strategies is included at launch, and SA360 support is expected soon.
Why Should Advertisers Test This?
For marketers managing Search campaigns that have stalled in growth or seem overly narrow in scope, this could be a solid way to unlock additional conversions.
The feature offers a way to capture more conversions without blowing up campaign structure or budget.
Additionally, this feature is not changing your audience targeting. That’s an important distinction.
For example, Optimized Targeting on Display or Demand Gen actively expands who sees your ads.
Smart Bidding Exploration doesn’t do that. It keeps your targeting exactly as is, but unlocks the potential to show up for queries you likely wouldn’t have competed for before, all within your existing targeting.
If you’re running campaigns that are too tightly bound by a strict ROAS target, you may be unintentionally capping performance.
Smart Bidding Exploration is a way to loosen those constraints just enough to let Google’s AI find opportunities you didn’t realize were there.
What This Signals From Google
Smart Bidding Exploration is more than just a new feature toggle.
It’s a fundamental shift in how we think about conversion opportunity within Google Ads.
Marketers are often pushed to optimize for what they already know works, especially under pressure to hit ROAS or CPA goals. But that approach can keep you from capturing the full value of the market.
With Smart Bidding Exploration, Google seems to be nudging advertisers to stop optimizing for comfort and start optimizing for growth.
During its annual Google Marketing Live event for advertisers, Google announced upgrades to its AI measurement tools, making access easier for small brands.
These updates, shared ahead of time with Search Engine Journal during an exclusive preview event, showcase Google’s continued investment in giving advertisers of all sizes better visibility into performance, incrementality, and return on ad spend.
Here’s what’s coming for marketers, and why you should pay attention.
Incrementality Testing: Becoming More Accessible
Measurement has always been a pain point for marketers. We spend time and budget driving performance, but often struggle to prove what’s truly moving the needle.
Historically, incrementality testing in Google Ads was only feasible for high-spending accounts, requiring at least $100K in budget to run.
That changed today, as Google is lowering the spend requirement to just $5,000 per incrementality test.
That lowered threshold opens the door for many mid-market (and even smaller) advertisers to start running controlled tests that measure the true lift driven by their ads, rather than just counting conversions that likely would have happened anyway.
In addition to the lower threshold, Google is rolling out a new Bayesian-based methodology that increases the chances of getting conclusive results.
Tests can now run as short as 7 days or up to 56, with 28 days considered the current best practice.
With this update, marketers no longer have to rely on directional data or last-click attribution.
They’ll be able to isolate the impact of their Google Ads campaigns and adjust budgets or creative with more confidence.
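As a rough illustration of what an incrementality test measures, here is a minimal holdout-style lift calculation; this is a generic sketch with made-up numbers, not Google’s Bayesian methodology.

```python
# A generic holdout-based lift calculation (illustrative only; Google's actual
# incrementality methodology is Bayesian and more sophisticated than this).
exposed_conversions = 1200   # conversions from the group that saw the ads
holdout_conversions = 950    # conversions from a comparable holdout group

incremental_conversions = exposed_conversions - holdout_conversions
relative_lift = incremental_conversions / holdout_conversions

print(f"Incremental conversions: {incremental_conversions}")
print(f"Relative lift: {relative_lift:.1%}")  # the lift the ads truly drove
```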
Cross-Channel Measurement Is Getting Smarter Inside Google Analytics
Another big enhancement is happening within Google Analytics.
Marketers will soon be able to see more comprehensive cross-channel performance (including impressions) across Google properties and other platforms.
The aim is to help teams better map the full customer journey and more accurately calculate ROI.
While not all of this is live just yet, Google says deeper insights are on the way in the coming months.
This should be particularly useful for brands running Performance Max or upper-funnel campaigns across multiple surfaces.
Visibility into pre-click data has historically been limited, so any lift in impression-level reporting across channels is a step forward.
Data Manager: A Central Tool For First-Party Data Activation
Google is also introducing Data Manager as a centralized tool to help marketers collect, store, and activate their first-party data, with all the existing privacy protections baked in.
With the rise of privacy regulations and cookie deprecation looming, brands have been scrambling to figure out how to make better use of their owned data.
Data Manager acts as a one-stop shop, using confidential computing to ensure sensitive data stays protected and is only used for authorized purposes.
Marketers can expect upcoming features like data strength recommendations, which will help identify gaps in your data strategy and offer actionable ways to improve it.
To streamline things further, Google is also launching a new Data Manager API. This update consolidates multiple APIs into a single schema, helping developers connect audience and conversion data more easily across Google Ads, GA4, and GMP.
This might not be something every marketer will use directly, but it has major implications for teams that rely on agency or partner integrations to power their campaigns.
It reduces the technical lift required to activate more first-party data signals across platforms.
Why Marketers Should Pay Attention
One of the most notable parts of this update is who these tools are built for.
In the past, many of Google’s advanced measurement tools were only accessible to advertisers with deep pockets and large internal data teams.
That left small-to-mid-sized businesses at a disadvantage when it came to proving performance or scaling their investment.
These new AI measurement tools show a clear move toward making enterprise-grade measurement more attainable for all.
For marketers under pressure to drive measurable results without doubling spend, that’s welcome news.
We’re also seeing Google start to shift more clearly toward cross-channel, privacy-safe measurement with a bigger emphasis on first-party data.
Even with Google’s changes of direction on third-party cookie deprecation (and its reversal of that decision), these tools look like solid building blocks that marketers can use as privacy regulations continue to evolve around the world.
Looking Ahead
The latest updates from Google Ads mark a meaningful shift toward making AI-powered measurement smarter, faster, and more accessible.
From more affordable incrementality testing to a consolidated way to activate your first-party data, these tools promise better insights without the enterprise-level budget.
Marketers still need to approach these tools with a critical eye. AI-powered doesn’t mean hands-off.
You’ll want to validate the data assumptions being used, and stay involved in shaping your own measurement strategy.
Now, it feels like marketers with modest budgets aren’t stuck on the sidelines.
Which of these new measurement tools are you looking forward to trying within your accounts?
Today’s Memo is an updated version of my previous guides on topical authority, one that takes the Google leaks, documents revealed in Google lawsuits, my recent UX study of AIOs, and the latest shifts in the search landscape into account.
I think this is one of those concepts that can fly under the radar in the AI and Search conversation, but it’s actually important.
I’ll cover:
The idea behind topical authority and why you should pay attention to it.
How to measure topical authority.
What internal Google documents and leaks say about topical authority.
How Google and LLMs could understand topical authority.
What concrete levers you should pull to build topical authority.
I would argue that, along with brand authority, topical authority matters more now than ever.
But before we dig in, we have to address the reality of our current search situation:
You and your team have likely poured countless hours into classic SEO plays, content clusters, and link‑building, only to watch your organic clicks plateau (or even dip) as AIOs claim more SERP real estate.
Heck, countless sites have been losing organic traffic since late 2023 due to meager topical authority.
Meanwhile, stakeholders crave confidence that your AI era playbook is working.
Topical authority is a critical concept for both the old and new SEO era.
In fact, a recent Graphite study found that pages with high topical authority gain traffic 57% faster than those with low authority – proof that “covering your bases” can still pay dividends in speedy visibility gains. And the study showed that topical authority can increase the percentage of pages that get visibility in the first three weeks.1
I’m working on a workflow for paid subscribers that makes tracking topic‑level gains easier. The anticipated launch date of the workflow is in June. Upgrade to paid so you don’t miss it.
I used to dismiss topical authority as an SEO ghost concept. You know, one of those buzz‑terms people use to justify link‑building or content‑depth plays.
But I realized back in 2022 that I was wrong: It’s far from a ghost.
In fact, internal document leaks and public signals from Google show that topical relevance, i.e., how completely a site covers related entities and questions, is a real and important factor in ranking.
And in today’s era of AIOs and LLM‑powered snippets, brand authority (a close cousin of topical authority) can be the difference between earning the click or being buried beneath an AI summary.
How Is The SEO Community Defining Topical Authority Post-AIOs?
The idea behind topical authority is that by covering all aspects of a topic (well), sites get a ranking boost because Google sees them as an authority in the topic space.
On the other end of the spectrum would be sites that only touch the surface of a topic.
Here’s how the SEO community has defined topical authority over time:
Topical Authority is a way of balancing the PageRank for finding more authoritative sources with the information on the sources.
Topical authority can be described as “depth of expertise.” It’s achieved by consistently writing original high-quality, comprehensive content that covers the topic.
Topical authority is a perceived authority over a niche or broad idea set, as opposed to authority over a singular idea or term.
Topical authority is one of the ways Google measures “quality” as a ranking factor – along with page authority and domain authority.
Based on that, here’s how I see topical authority (a.k.a. topical relevance) showing up in SERPs today. It includes:
Depth of expertise: Consistently publishing original, high‑quality content that covers all facets of a topic.
Entity coverage: Matching your content’s scope against Google’s own understanding of entity relationships – i.e., how well you hit the concepts Google expects for a given topic.
Backlink and mention signals: Earning links and web mentions from other trusted sources that reinforce your authority within that topic space. Think quality mentions over quantity here.
Final answers: How often your site provides the final answer (think completes the user journey) for searchers with a specific problem in a specific topic.
Semantic proximity matters, too. It’s not just about the volume of topic coverage, but about meaningfully addressing subtopics and related questions across your topics – think token overlap or topic‑model similarity between your pages and “ideal” topic coverage.
And information gain comes into play here also: What new, non-consensus information are you adding to the targeted topic?
Our SEO team brings the concept of topical authority to me as an argument to invest more resources in content, backlinking, and digital PR, but they can’t really back up the concept.
I’ve read a ton of articles about topical authority and have had more conversations about it than I can count. This is how I make sense of the idea:
Google rewards sites that cover a topic in-depth.
It does so by comparing how well the site covers relevant entities with Google’s own understanding of entity relationships.
Google matches its own understanding with other factors like the site’s backlink profile and mentions on the web, user behavior, and brand combination searches (brand + generic keyword).
However, here’s the proof that it’s not a ghost concept and that it does matter for earning organic visibility:
Leaked Google documents: The Google ranking factors leak verified the use of site‑level quality and “domain authority” signals, suggesting it uses whitelists of trusted sources for sensitive topics such as health or finance.
News topic authority signals: Google’s May 2023 Search Central post on “Understanding News Topic Authority” describes how it gauges a publication’s expertise across specialized verticals like finance, politics, and health.2
To better surface relevant, expert, and knowledgeable content in Google Search and News, Google developed a system called topic authority that helps determine which expert sources are helpful to someone’s newsy query in certain specialized topic areas, such as health, politics, or finance.2
Yandex leaked documents: Similar to Google, leaked Yandex materials indicate they factor in topic‑graph coverage when ranking news and content hubs (i.e., how many semantically related subtopics a site authoritatively addresses).
Google documents revealed in lawsuits: As reported by Danny Goodwin over at Search Engine Land, the trial exhibits released in the Department of Justice’s case against Google contain additional verification of the existence and importance of “topicality.” Key components include the ABC signals: Anchors (A), links from a source page to a target page; Body (B), terms in the document; and Clicks (C), how long a user stayed on a linked page before returning to the SERP.
Together, the guidance from Google and leak confirmations make it very clear: Topical authority matters … even if sometimes it goes by a different name.
It isn’t just SEO folklore; it’s a (kind of) measurable signal of how comprehensively and credibly your site covers a topic, which is more important than ever in an AIO-saturated SERP.
Even though 15% of daily Google searches are new, websites cannot get more traffic than there are searches. That means the traffic from keywords within a topic is also limited by the number of searches.
In plain words:
The easiest way to measure topical authority is the share of traffic a site gets from a topic. I call this Topic Share, similar to market share or share of voice.
This is a very practical approach because it factors in the following:
Rank, driven by backlinks, content depth/quality, and user experience.
The easiest way to find an entity is by looking at whether Google shows a Knowledge Panel for it in the search results or not.
Next month, paid subscribers will get my topical authority workflow. Don’t miss out. Upgrade here.
In theory, the roughly 29,000 keywords in the “ecommerce” topic reflect 100% Topic Share. If a domain magically ranked No. 1 for all of them, it would have 100% Topic Share, which is practically impossible.
As a result, we need to use Topic Share comparatively, meaning in comparison with other sites.
For “ecommerce,” I calculated Topic Share based on the top 3,000 keywords by search volume. Shopify is leading with 11% Topic Share, closely followed by Bigcommerce with 10% and Nerdwallet with 3%.
Here’s another example with a smaller topic.
“Spend analysis” had 142 keywords in Ahrefs when I first used this example. Following the same process, jaggaer.com has the highest Topic Share at 15%, followed by Sievo at 13% and Tipalti at 7%.
To track Topic Share continuously, you could set up a rank tracking project in Ahrefs and monitor traffic share for these keywords. However, for large topics, this might not be cost-efficient.
And if you wanted to do this for multiple topics, you would quickly get into the 100,000s of keywords to track.
The best solution I see is running this analysis once a month and tracking changes manually. (It’s not efficient but practical.)
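If you’d rather script that monthly snapshot than rebuild it by hand, a minimal sketch could look like this; it assumes you’ve exported the topic’s keywords with estimated traffic per ranking domain, and the file and column names are hypothetical placeholders.

```python
import csv
from collections import defaultdict

# Minimal Topic Share sketch. It assumes a keyword-level export for the topic
# (e.g., from a rank tracker) with one row per keyword/domain pair; the file
# name and the "domain" / "estimated_traffic" columns are hypothetical.
traffic_by_domain = defaultdict(float)

with open("ecommerce_topic_keywords.csv", newline="") as f:
    for row in csv.DictReader(f):
        traffic_by_domain[row["domain"]] += float(row["estimated_traffic"])

total_traffic = sum(traffic_by_domain.values()) or 1.0  # avoid divide-by-zero

# Topic Share = a domain's share of all estimated traffic within the topic.
for domain, traffic in sorted(traffic_by_domain.items(),
                              key=lambda kv: kv[1], reverse=True)[:10]:
    print(f"{domain}: {traffic / total_traffic:.1%} Topic Share")
```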
Example: “Contract Lifecycle Management”
Another example is the topic “contract lifecycle management,” which has ~480 keywords.
Icertis and Contractworks are leading the topic, followed by Gartner, Docusign, Salesforce, and Ironclad.
If this process is so manual, is it worth the work to measure it every month?
In some cases, yes. If you need to demonstrate to your stakeholders in a practical way whether or not resources and investment into building topical authority are working, then you should measure it.
And what if you need to prove to stakeholders that you need to invest in topic X instead of topic Y for quicker SEO gains?
By scoring how well you currently cover each subtopic, you can identify the core topics Google already finds you an authority in.
Because putting resources in that specific topic will likely move the needle most and could have the quickest SEO ROI.
If you’re in a major growth push into a new topic area (based on a new service, product feature, etc.), it’s valuable to track and measure topical authority to understand how you’re progressing in Topic Share, based on who your competitors are, and what it takes to develop topical authority in your niche.
But if you commit to monitoring it over time, you can also correlate your topic share to your tracking for AIO and LLM visibility.
Find out what topics overlap and why. Discover what topics Google finds you an authority in, while LLMs don’t.
1. Content Breadth & Depth
Essentially, how many pages (quantity) or target queries/subtopics does your site have within a topic, and how good are they (quality)?
This is your content library’s comprehensiveness and utility. Thoroughly explore every facet of your target topic: definitions, use cases, common questions, and related subtopics.
Comprehensive, well‑structured content shows both users and search engines that you’re the go‑to resource on your targeted topics and that you’re adding to the overall topical conversation, rather than a site that only skims the surface.
Use entity‑based tactics or AI‑powered similarity scores to ensure you’re covering the concepts and questions Google associates with your topic.
2. Smart Internal Linking
Internal links are signals for the relationship between articles about a topic.
Optimizing the anchor text, context, and number of internal links sends stronger signals to Google and helps users find what they’re looking for.
3. Topically Relevant Backlinks And Mentions
Backlinks provide another confidence layer for Google that your content is good and relevant for a specific topic.
Aim for backlinks and mentions from trusted sites in adjacent categories.
Getting mentioned or linked in the Wall Street Journal’s retail section (www.wsj.com/business/retail) is more valuable for Shopify than Salesforce, for example.
4. Prune Content
I did a deep dive on IBM and Progressive, two organizations that are winning the SEO game in competitive topics. Both sites went through massive pruning efforts to improve domain authority.
And in SEOzempic, I showcased where DoorDash actually lost organic traffic by multiplying pages. Topical authority is all about hyperfocusing on the topics that are most relevant to your business, not having the most pages.
All of these businesses saw their organic traffic roar after pruning topically irrelevant content – in some cases, even high-quality content that just wasn’t a good fit for the domain (like Progressive’s agent pages).
Retrieval-augmented generation (RAG) – the grounding mechanism behind OpenAI’s, Google’s, Meta’s and others’ LLMs – explicitly ranks external documents for authority before passing them to the model to ground its answer.
Their technical notes stress pulling “current and authoritative sources” to reduce hallucination.
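Here is a stripped-down, purely conceptual sketch of that retrieval step; the fields, scoring blend, and weights are hypothetical and not any vendor’s actual pipeline.

```python
from dataclasses import dataclass

# Conceptual sketch of the retrieval step in a RAG pipeline: candidate
# documents are scored for relevance *and* source authority before the
# top few are passed to the model as grounding context. The fields and
# weights are hypothetical, not any vendor's actual implementation.
@dataclass
class Doc:
    url: str
    relevance: float   # 0-1 semantic similarity to the query
    authority: float   # 0-1 topical-authority / trust score for the source

def rank_for_grounding(docs: list[Doc], k: int = 3) -> list[Doc]:
    # Blend relevance with authority so thin-but-relevant pages lose out
    # to comprehensive, trusted sources.
    return sorted(docs,
                  key=lambda d: 0.6 * d.relevance + 0.4 * d.authority,
                  reverse=True)[:k]

candidates = [
    Doc("https://example.com/comprehensive-guide", relevance=0.82, authority=0.90),
    Doc("https://example.com/thin-listicle", relevance=0.88, authority=0.20),
]
grounding_docs = rank_for_grounding(candidates)  # what the LLM gets to cite
print([d.url for d in grounding_docs])
```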
But it’s not just AIOs. ChatGPT and other LLMs also reward comprehensive content: the top 10% of most visible content matches the ideal profile of high authority.
Paid subscribers: I’m releasing a topical authority workflow for you soon (anticipated next month!). Not a paid subscriber yet? Don’t miss this! Upgrade here.
Topical Authority Predictions For The Future Of SEO
As we’ve seen in the example of HubSpot and other sites, straying too far from your core topics is a serious SEO risk.
More context: https://surferseo.com/blog/hubspot-traffic-drop/
I call this “overclustering.” Essentially, overclustering is when you stretch into tangential topics and subtopics, building clusters unrelated to your core offerings that can dilute your brand. The next core update could cut a significant chunk of your traffic.
However, major authoritative brands will continue to dominate, despite the fact that their brand isn’t an authority on every topic – and possibly even for niche queries – due to entrenched domain-level trust. The most prominent examples are Forbes and LinkedIn.
A hidden opportunity exists in AI Overview citations, which sometimes surface smaller sites with strong topical authority on a very specific subtopic or a piece of content with a unique perspective, making it crucial to maintain deep coverage in your niche to get “picked” by AIO algorithms.
Human signals rebound: As AI content saturates the web, Google may place renewed emphasis on behavioral metrics (CTR, dwell time, return visits) to distinguish genuinely authoritative sources from AI‑built noise.
From the usability study I published last week, we know that humans prefer answers from other humans as a way to balance AI answers.
How To Approach Topical Authority As An SEO In A Volatile Search Landscape
I think, at the core, there are two questions you need to ask about your brand:
1. Credibility: Are we “credible” enough to target this topic? Do we already have enough depth, expertise, and context?
2. Growth: What’s our roadmap for expanding that authority over time, especially as AI‑generated content and LLM snippets flood the SERP?
As a new way of searching takes over (and as AI continues to flood the web with consensus content), search engines will lean harder on authentic, useful, and authoritative sources.
True topical authority isn’t about checking boxes. It’s about earning the perception of being an authority in a space from humans and algorithms alike.
Remember when “Just Google it” was the solution to all your search needs? Unfortunately, those days are changing fast.
While Google remains the king of search, the ground beneath its feet is shifting as brands, marketers, and end users are noticing that there are some new sheriffs in town.
A search strategy that puts all your eggs in the Google basket might not be a wise move anymore.
Today’s search landscape isn’t just about algorithm updates and being visible on Google. It’s about recognizing that your audience exists across multiple touchpoints: from traditional search engines like Bing and Google to AI chatbots, and from social platforms to specialized marketplaces like Amazon.
The businesses that thrive won’t be the ones waiting to see what happens; they will be the pioneers already establishing a strong presence across this expanding universe of search.
Google Still Dominates, But It Is Being Challenged
Google is facing some competition.
StatCounter shows Google’s global search share dropped below 90% and remained there throughout the last quarter of 2024, marking the first such decline in nearly a decade.
This shift coincides with significant legal headwinds.
In 2025, Google faces multiple antitrust challenges, with a judge recently finding that Google has a monopoly in search and has acted to maintain it.
These legal troubles might cause Google to change its business practices and may have an impact on its market dominance, allowing other social and AI platforms to capture more of Google’s market share.
This does not mean that Google is going down; it just signifies that Google is no longer the only game in town, and therefore relying on Google only could be increasingly risky.
For example, if you’re an ecommerce retailer that generates 60-80% of your traffic from Google and your site experiences a temporary drop in visibility during a core update because of AI-generated content, you would be in big trouble.
If your marketing strategy does not have any alternative traffic sources, your revenue could potentially decrease by 40% or more in a matter of weeks.
Meanwhile, if your competitors have diversified their digital presence across multiple platforms, including AI shopping assistants and social commerce channels, they might experience only minor fluctuations in their traffic and sales.
It’s An Omnichannel World
Your audience does not think in terms of platforms; they think in terms of their needs.
For example, a user might ask ChatGPT for information on sustainable materials, browse Instagram for some home design inspiration, check Amazon for product comparisons, and then Google specific brands before making a purchase.
This changing customer journey means that businesses must be acutely aware of where their traffic originates and how much traffic comes from various sources.
The days of relying on and checking only your Google Analytics for Google traffic are over.
In order to succeed, you must have a holistic view of your visibility across the entire digital landscape.
For example, my friend Claudia has an outdated kitchen and is looking to get a new one after 20 years of living in her home with her family.
Here is what Claudia’s journey looked like in this new ecosystem:
Claudia started out by going to ChatGPT and typing in “best kitchen design brands,” and found some information mentioning several designer brands.
Since the intent behind kitchen design is image-based, Claudia then searches on Pinterest for visual inspiration, and saves some images from the designer brands that she found in ChatGPT.
Claudia then looks to Reddit to gather feedback about specific brands and learn from others’ experiences.
She checks YouTube for installation tutorials but decides she needs a professional.
Claudia then Googles local contractors with high ratings and reviews, contacts one of them, and gets a quote.
Now, if you’re a business that is only focused on Google, guess what? You would not win Claudia and other clients like her, because you would miss multiple touchpoints in their user journey as they search across different channels and platforms. You must have content that reinforces your brand at every stage.
Don’t Fall Behind
The time is now to adopt an omnichannel strategy, stay ahead of trends, experiment with different platforms, and maintain a strong presence on established channels like Google so you won’t be left behind.
Imagine if the following scenarios were to occur; what would happen to your business?
A loss of 30% of your traffic overnight.
You don’t know where your customers are spending time before they make purchase decisions.
You’re not visible on ChatGPT, Bing, YouTube, Reddit, etc.
One of the brands I consulted with in the financial industry noticed that retirement planning questions were being asked on AI platforms as well as in Google.
We created a comprehensive, citation-rich content strategy that got them mentioned in some major financial publications.
When users searched for retirement planning in ChatGPT, their brand was mentioned as a source, which drove leads and conversions.
AI Works Differently Than Traditional Search
AI chatbots like ChatGPT don’t work like Google’s algorithm. They don’t rank websites; instead, they gather information and identify authoritative sources.
If you want to be visible in ChatGPT, then you need to change your approach.
Being a recognized name in your industry increases your chances of being mentioned.
Being featured across multiple platforms strengthens your authority and increases your visibility in AI chatbots.
Getting referenced by other respected sources helps build trust.
Have clear, conversational, and structured content that AI chatbots can reference and find.
Build trust and credibility through positive reviews and ratings.
ChatGPT and other AI chatbots and platforms look more broadly at the digital ecosystem and get information from Quora, Reddit, social media, forum conversations, and reviews.
AI chatbots also understand long-tail queries in a more nuanced way than traditional search.
It’s All About Balance
People are not running for the hills and abandoning Google.
Google remains the king and is likely to retain its market share leadership for the foreseeable future, but things are changing.
To succeed today, you must implement an omnichannel SEO strategy, maintain a strong Google presence, and be where your audience is.
Continue to:
Wrapping Up
Search engines like Google will continue to evolve, alongside ChatGPT, social platforms, and other search technologies that are expected to emerge in the coming years.
But, the days of relying only on Google as your primary digital marketing channel are behind us.
Brands that are discoverable, credible, and helpful will be successful, wherever their audience seeks information.
Brands that win in 2025 won’t be asking “How do we rank better on Google?” but rather “How do we ensure we are visible on every channel and have content that resonates and answers all our customers’ questions?”
This shift in perspective, from platform-centric to audience-centric, is the true key to sustainable digital success.
The twin forces of disrupted attribution and changing user behavior are reshaping how audiences discover brands.
Google’s mass rollout of AI Overviews and its experimental AI Mode are not surface-level UX tweaks; they represent a fundamental transformation of the search experience – one that compresses the journey from query to answer.
PPC is now a more competitive, constrained, and less predictable environment.
If Google is effectively skipping traditional landing pages in certain query classes by serving direct answers, the margin for interrupting or influencing a user shrinks dramatically.
If you are not building a brand that people proactively seek out – or that AI systems actively reference – you are playing an increasingly expensive, inefficient game.
Brand Advertising Isn’t Brand Bidding
First, let’s define the terms clearly, as this distinction is often misunderstood in performance marketing circles.
Brand advertising refers to any paid activity designed to build awareness, familiarity, and positive association with your brand.
The primary objective isn’t immediate conversion; it’s to create demand and a pipeline that your lower-funnel activities can later capture.
By contrast, brand bidding occurs when someone already knows your brand and actively searches for it.
Bidding on your own brand terms in Google Ads or Bing ensures you’re visible when that existing demand materialises – but it’s harvesting, not creating demand.
Brand advertising builds the mental availability that ensures your brand is considered when a user enters a buying journey. Brand bidding simply captures people who were already predisposed to choose you.
Both are important, but confusing the two leads to systemic underinvestment in activities that generate future growth.
In longer buying cycles, particularly in B2B, high-ticket B2C, and considered-purchase categories, persistent brand presence is critical.
Furthermore, research points to the fact that if you’re not already on someone’s shortlist before they start looking for a solution, you’re unlikely to be chosen over the brands that are.
Image from author (research by Google x Bain Consulting), April 2025
When the balance between brand and performance activity is right, each amplifies the other, creating what is called the Multiplier Effect, a virtuous cycle where brand-driven demand lowers cost-per-acquisition (CPA), improves Quality Scores, and enhances overall media efficiency.
The Advertising ‘Doom Loop’
Despite its proven impact, brand advertising remains chronically underfunded in performance-led organisations. Why?
In part, because it doesn’t fit neatly into short-term attribution models. Brand activity often influences outcomes weeks or months later, in ways that are difficult to measure through traditional last-click frameworks.
This measurement gap creates what WARC calls the “Advertising Doom Loop.” Here’s how it unfolds:
Advertisers focus disproportionately on easily measurable performance channels, such as paid search.
Brand-building budgets are cut because they lack immediate, attributable return on investment (ROI) in platforms like Google Analytics 4.
As brand equity erodes, acquisition costs rise and conversion rates fall.
To compensate, advertisers double down on short-term tactics, further starving brand investment.
The cycle repeats, gradually eroding long-term growth potential.
This loop is not theoretical. It’s been observed repeatedly across sectors and is backed by large-scale research studies and documented in a recent WARC study.
The brands that escape the doom loop understand that marketing is interconnected.
Short-term sales activation delivers immediate returns, but brand building provides compound growth over time, lowering customer acquisition costs (CACs), increasing customer loyalty, and insulating against category volatility.
Ignoring brand advertising might look efficient quarter-to-quarter, but over a multi-year horizon, it is a recipe for brand decline.
Why Brand Interest Is Your Most Defensible Asset
In a world of AI-curated answers and zero-click behavior, one channel remains relatively stable: branded search interest.
When a user types your name, your product, or your branded category term into Google, you control the narrative. These searches are:
Cheaper than competitive generic terms.
Higher converting, often by a factor of 2x or more.
Less vulnerable to displacement by AI Overviews, which, as of current observation, still reference brand entities prominently.
At Hallam, we’ve seen this play out across multiple paid search accounts.
Brands with stronger brand search volumes and higher unaided awareness consistently achieve lower CPAs, better Quality Scores, and more efficient media performance across both search and display.
The impact of running brand advertising campaigns on search demand and clicks for one of our clients (Image from author, April 2025)
This shows the compounding value that brand equity brings to lower-funnel paid media campaigns.
Measurement Solutions
One of the biggest challenges performance marketers face today is how to measure the impact of brand campaigns.
Marketers must treat brand search volume, direct traffic trends, and assisted conversions as leading indicators of paid media effectiveness.
If your top-of-funnel strategy includes YouTube, connected TV, or programmatic display, shifts in these upstream metrics are early signals of success, even before conversions materialize.
For example, metrics that directly track interest in your brand, such as share of search, have been proven to be leading indicators of market share.
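Share of search itself is simple to calculate, as in this minimal sketch; the brand names and monthly volumes are made up.

```python
# Share of search: your brand's search volume as a share of total branded
# search volume in the category. Brand names and monthly volumes are made up.
monthly_brand_searches = {
    "your_brand": 18_000,
    "competitor_a": 42_000,
    "competitor_b": 30_000,
}

total = sum(monthly_brand_searches.values())
for brand, volume in monthly_brand_searches.items():
    print(f"{brand}: {volume / total:.1%} share of search")
```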
Moreover, investment in econometric modeling, brand uplift studies, and incrementality testing will become critical tools for understanding the true impact of marketing spend and providing a holistic view of performance as we move into the future.
When And How To Get Started
If paid search is becoming more competitive and less reliable for visibility, the logical response is to rebalance your media mix, and that starts with brand.
1. Run Paid Media To Uplift Brand Search Volume
Don’t just optimize for direct conversions. Optimize for subsequent branded search. YouTube, connected TV, and upper-funnel Meta campaigns can all drive brand interest that pays off later through more efficient search activity.
Tracking this means looking beyond last-click. Use view-through conversions, uplift studies, and brand search volume trends to measure the impact.
2. Invest In Non-Google Surfaces
A diversified paid media strategy is no longer a nice-to-have; it’s essential. That includes:
YouTube Shorts and creator content to build brand relevance.
Programmatic display and native ads on publisher sites to support discoverability.
Paid partnerships and sponsorships that build reputation across the web.
These touchpoints feed awareness, and could also contribute to the knowledge graph and large language models (LLMs).
3. Align PPC With SEO To Influence AI Outputs
Yes, SEO still plays a role, but performance marketers should work alongside organic teams to ensure:
Branded pages are structured correctly for AI inclusion.
Top-performing PPC assets (e.g., headlines, product descriptions) are reflected in organic content.
Messaging consistency across paid and organic channels supports brand memorability.
Final Thoughts
Clicks are now harder to win. Impressions are becoming more expensive. And digital attribution data is increasingly unreliable.
In this environment, the brands that thrive will be the ones that people search for by name, that AI references unprompted, and that exist in the user’s mind long before they type anything at all.
That doesn’t happen by accident. It happens when paid media stops acting like a demand-harvesting function and starts behaving like a brand growth engine.
Search marketer Michael Bonfils recently discussed how AI is disrupting search marketing and shared insights into what he feels is an appropriate response to one of the most difficult search environments he’s seen in his thirty years of experience.
Michael Bonfils (LinkedIn profile) has worked in digital marketing since virtually the dawn of it all, well before Google even existed. He’s a leading international digital marketer with experience across every aspect of digital marketing, from on-page SEO to digital advertising. Michael joined Gianluca Fiorelli (LinkedIn profile) on the Advanced Web Ranking podcast and shared his insights on the challenges AI is bringing to digital marketing and novel ideas for how to navigate them.
Brutal Environment For Digital Marketing
Gianluca mentioned there’s a perception gap with AI where on one side are marketers who are heralding the end of SEO and PPC and on the other side are the “AI bros” who cheerlead that everything is going to become even better, with better leads from ChatGPT, etc.
He shook his head and said:
“It’s neither going to be a disaster and it’s neither going to be an AI paradise.”
Gianluca asked him what trends he’s seeing. Michael responded that click volume has gone down since the introduction of AI. He said that during other periods when volume was down, click-through rates went up, as during the pandemic. But that’s not happening now: Click-through rates are down, volume is down, and cost per clicks are at historic highs.
Michael observed,
“But now, …the level we’re at now is the worst time since 2019 during the pandemic and prior to that it was never that bad.
…If you want throw the CPC factor in, the CPC’s are historically higher than they have been for years. So now we’ve got this perfect problem, click through rates down, volume down, CPC’s up. What does that mean? ROI is getting hit and clients are leaning on organic to try to make up for whatever shortfall there is and they can’t find it, they can’t find the traffic.
So to answer your question, …now that we’re going into Europe with AI overviews, are they impacting things? One hundred percent. And they’ll continue to change. “
Later on they discussed how a lot of what Google is doing is reactionary, a response to external pressures from companies like Perplexity AI and OpenAI, and the search industry is caught in the middle of it.
AI Overviews Lead To Loss Of Strategic Data
Michael Bonfils discussed how AI Overviews lead to zero-click behavior, and while most SEOs stop right there, he points out that this situation affects the data available to marketers and, as a consequence, impacts content strategy.
The thousands of sprawling acres in rural northeast Louisiana had gone unwanted for nearly two decades. Louisiana authorities bought the land in Richland Parish in 2006 to promote economic development in one of the poorest regions in the state. For years, they marketed the former agricultural fields as the Franklin Farm mega site, first to auto manufacturers (no takers) and after that to other industries that might want to occupy more than a thousand acres just off the interstate.
This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.
So it’s no wonder that state and local politicians were exuberant when Meta showed up. In December, the company announced plans to build a massive $10 billion data center for training its artificial-intelligence models at the site, with operations to begin in 2028. “A game changer,” declared Governor Jeff Landry, citing 5,000 construction jobs and 500 jobs at the data center that are expected to be created and calling it the largest private capital investment in the state’s history. From a rural backwater to the heart of the booming AI revolution!
The AI data center also promises to transform the state’s energy future. Stretching in length for more than a mile, it will be Meta’s largest in the world, and it will have an enormous appetite for electricity, requiring two gigawatts for computation alone (the electricity for cooling and other building needs will add to that). When it’s up and running, it will be the equivalent of suddenly adding a decent-size city to the region’s grid—one that never sleeps and needs a steady, uninterrupted flow of electricity.
To power the data center, Entergy aims to spend $3.2 billion to build three large natural-gas power plants with a total capacity of 2.3 gigawatts and upgrade the grid to accommodate the huge jump in anticipated demand. In its filing to the state’s power regulatory agency, Entergy acknowledged that natural-gas plants “emit significant amounts of CO2” but said the energy source was the only affordable choice given the need to quickly meet the 24-7 electricity demand from the huge data center.
Meta said it will work with Entergy to eventually bring online at least 1.5 gigawatts of new renewables, including solar, but that it had not yet decided which specific projects to fund or when those investments will be made. Meanwhile, the new natural-gas plants, which are scheduled to be up and running starting in 2028 and will have a typical lifetime of around 30 years, will further lock in the state’s commitment to the fossil fuel.
The development has sparked interest from the US Congress; last week, Sheldon Whitehouse, the ranking member of the Senate Committee on Environment and Public Works issued a letter to Meta that called out the company’s plan to power its data center with “new and unabated natural gas generation” and said its promises to offset the resulting emissions “by funding carbon capture and a solar project are vague and offer little reassurance.”
The choice of natural gas as the go-to solution to meet the growing demand for power from AI is not unique to Louisiana. The fossil fuel is already the country’s chief source of electricity generation, and large natural-gas plants are being built around the country to feed electricity to new and planned AI data centers. While some climate advocates have hoped that cleaner renewable power would soon overtake it, the booming power demand from data centers is all but wiping out any prospect that the US will wean itself off natural gas anytime soon.
The reality on the ground is that natural gas is “the default” to meet the exploding power demand from AI data centers, says David Victor, a political scientist at the University of California, San Diego, and co-director of its Deep Decarbonization Project. “The natural-gas plant is the thing that you know how to build, you know what it’s going to cost (more or less), and you know how to scale it and get it approved,” says Victor. “Even for [AI] companies that want to have low emissions profiles and who are big pushers of low or zero carbon, they won’t have a choice but to use gas.”
The preference for natural gas is particularly pronounced in the American South, where plans for multiple large gas-fired plants are in the works in states such as Virginia, North Carolina, South Carolina, and Georgia. Utilities in those states alone are planning some 20 gigawatts of new natural-gas power plants over the next 15 years, according to a recent report. And much of the new demand—particularly in Virginia, South Carolina, and Georgia—is coming from data centers; in those three states, data centers account for around 65% to 85% of projected load growth.
“It’s a long-term commitment in absolutely the wrong direction,” says Greg Buppert, a senior attorney at the Southern Environmental Law Center in Charlottesville, Virginia. If all the proposed gas plants get built in the South over the next 15 years, he says, “we’ll just have to accept that we won’t meet emissions reduction goals.”
But even as it looks more and more likely that natural gas will remain a sizable part of our energy future, questions abound over just what its continued dominance will look like.
For one thing, no one is sure exactly how much electricity AI data centers will need in the future and how large an appetite companies will have for natural gas. Demand for AI could fizzle. Or AI companies could make a concerted effort to shift to renewable energy or nuclear power. Such possibilities mean that the US could be on a path to overbuild natural-gas capacity, which would leave regions saddled with unneeded and polluting fossil-fuel dinosaurs—and residents footing soaring electricity bills to pay off today’s investments.
The good news is that such risks could likely be managed over the next few years, if—and it’s a big if—AI companies are more transparent about how flexible they can be in their seemingly insatiable energy demands.
Around 2016, natural gas overtook coal as the main fuel for electricity generation in the US. And today—despite the rapid rise of solar and wind power, and well-deserved enthusiasm for the falling price of such renewables—natural gas is still king, accounting for around 40% of electricity generated in the US. In Louisiana, which is also a big producer, that share is some 72%, according to a recent audit.
Natural gas burns much cleaner than coal, producing roughly half as much carbon dioxide. In the early days of the gas revolution, many environmental activists and progressive politicians touted it as a valuable “bridge” to renewables and other sources of clean energy. And by some calculations, natural gas has fulfilled that promise. The power sector has been one of the few success stories in lowering US emissions, thanks to its use of natural gas as a replacement for coal.
But natural gas still produces a lot of carbon dioxide when it is burned in conventionally equipped power plants. And fracking causes local air and water pollution. Perhaps most worrisome, drilling and pipelines are releasing substantial amounts of methane, the main ingredient in natural gas, both accidentally and by intentional venting. Methane is a far more potent greenhouse gas than carbon dioxide, and the emissions are a growing concern to climate scientists, albeit one that’s difficult to quantify.
Still, carbon emissions from the power sector will likely continue to drop as coal is further squeezed out and more renewables get built, according to the Rhodium Group, a research consultancy. But Rhodium also projects that if electricity demand from data centers remains high and natural-gas prices low, the fossil fuel will remain the dominant source of power generation at least through 2035 and the transition to cleaner electricity will be much delayed. Rhodium estimates that the continued reign of natural gas will lead to an additional 278 million metric tons of annual US carbon emissions by 2035 (roughly equivalent to the emissions from a large US state such as Florida), relative to a future in which the use of fossil fuel gradually winds down.
Our addiction to natural gas, however, doesn’t have to be a total climate disaster, at least over the longer term. Large AI companies could use their vast leverage to insist that utilities install carbon capture and sequestration (CCS) at power plants and use natural gas sourced with limited methane emissions.
Entergy, for one, says its new gas turbines will be able to incorporate CCS through future upgrades. And Meta says it will help to fund the installation of CCS equipment at one of Entergy’s existing natural-gas power plants in southern Louisiana to help prove out the technology.
But the transition to clean natural gas is a hope that will take decades to realize. Meanwhile, utilities across the country are facing a more imminent and practical challenge: how to meet the sudden demand for gigawatts more power in the next few years without inadvertently building far too much capacity. For many, adding more natural-gas power plants might seem like the safe bet. But what if the explosion in AI demand doesn’t show up?
Times of stress
AI companies tout the need for massive, power-hungry data centers. But estimates for just how much energy it will actually take to train and run AI models vary wildly. And the technology keeps changing, sometimes seemingly overnight. DeepSeek, the new Chinese model that debuted in January, may or may not signal a future of new energy-efficient AI, but it certainly raises the possibility that such advances are possible. Maybe we will find ways to use far more energy-efficient hardware. Or maybe the AI revolution will peter out and many of the massive data centers that companies think they’ll need will never get built. There are already signs that too many have been constructed in China and clues that it might be beginning to happen in the US.
Despite the uncertainty, power providers have the task of drawing up long-term plans for investments to accommodate projected demand. Too little capacity and their customers face blackouts; too much and those customers face outsize electricity bills to fund investments in unneeded power.
There could be a way to lessen the risk of overbuilding natural-gas power, however. Plenty of power is available on average around the country and on most regional grids. Most utilities typically use only about 53% of their available capacity on average during the year, according to a Duke study. The problem is that utilities must be prepared for the few hours when demand spikes—say, because of severe winter weather or a summer heat wave.
The soaring demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of what Tyler Norris, a fellow at Duke’s Nicholas School of the Environment, and his colleagues call “headroom,” to meet any spikes in demand. But after analyzing data from power systems across the country, Norris and his coauthors found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity.
Even a moderate level of flexibility would make a huge difference. The Duke researchers estimate that if data centers cut their electricity use by roughly half for just a few hours during the year, it will allow utilities to handle some additional 76 gigawatts of new demand. That means power providers could effectively absorb the 65 or so additional gigawatts that, according to some predictions, data centers will likely need by 2029.
“The prevailing assumption is that data centers are 100% inflexible,” says Norris. That is, that they need to run at full power all the time. But Norris says AI data centers, particularly ones that are training large foundation models (such as Meta’s facility in Richland Parish), can avoid running at full capacity or shift their computation loads to other data centers around the country—or even ramp up their own backup power—during times when a grid is under stress.
The increased flexibility could allow companies to get AI data centers up and running faster, without waiting for new power plants and upgrades to transmission lines—which can take years to get approved and built. It could also, Norris noted in testimony to the US Congress in early March, provide at least a short-term reprieve on the rush to build more natural-gas power, buying time for utilities to develop and plan for cleaner technologies such as advanced nuclear and enhanced geothermal. It could, he testified, prevent “a hasty overbuild of natural-gas infrastructure.”
AI companies have expressed some interest in their ability to shift around demand for power. But there are still plenty of technology questions around how to make it happen. Late last year, EPRI (the Electric Power Research Institute), a nonprofit R&D group, started a three-year collaboration with power providers, grid operators, and AI companies including Meta and Google, to figure it out. “The potential is very large,” says David Porter, the EPRI vice president who runs the project, but we must show it works “beyond just something on a piece of paper or a computer screen.”
Porter estimates that there are typically 80 to 90 hours a year when a local grid is under stress and it would help for a data center to reduce its energy use. But, he says, AI data centers still need to figure out how to throttle back at those times, and grid operators need to learn how to suddenly subtract and then add back hundreds of megawatts of electricity without disrupting their systems. “There’s still a lot of work to be done so that it’s seamless for the continuous operation of the data centers and seamless for the continuous operation of the grid,” he says.
Footing the bill
Ultimately, getting AI data centers to be more flexible in their power demands will require more than a technological fix. It will require a shift in how AI companies work with utilities and local communities, providing them with more information and insights into actual electricity needs. And it will take aggressive regulators to make sure utilities are rigorously evaluating the power requirements of data centers rather than just reflexively building more natural-gas plants.
“The most important climate policymakers in the country right now are not in Washington. They’re in state capitals, and these are public utility commissioners,” says Costa Samaras, the director of Carnegie Mellon University’s Scott Institute for Energy Innovation.
In Louisiana, those policymakers are the elected officials at the Louisiana Public Service Commission, who are expected to rule later this year on Entergy’s proposed new gas plants and grid upgrades. The LPSC commissioners will decide whether Entergy’s arguments about the huge energy requirements of Meta’s data center, and its need for full 24/7 power, leave no alternative to natural gas.
In the application it filed last fall with LPSC, Entergy said natural-gas power was essential for it to meet demand “throughout the day and night.” Teaming up solar power with battery storage could work “in theory” but would be “prohibitively costly.” Entergy also ruled out nuclear, saying it would take too long and cost too much.
Others are not satisfied with the utility’s judgment. In February, the New Orleans–based Alliance for Affordable Energy and the Union of Concerned Scientists filed a motion with the Louisiana regulators arguing that Entergy did not do a rigorous market evaluation of its options, as required by the commission’s rules. Part of the problem, the groups said, is that Entergy relied on “unsubstantiated assertions” from Meta on its load needs and timeline.
“Entergy is saying [Meta] needs around-the-clock power,” says Paul Arbaje, an analyst for the climate and energy program at the Union of Concerned Scientists. “But we’re just being asked to take [Entergy’s] word for it. Regulators need to be asking tough questions and not just assume that these data centers need to be operated at essentially full capacity all the time.” And, he suggests, if the utility had “started to poke holes at the assumptions that are sometimes taken as a given,” it “would have found other cleaner options.”
In an email response to MIT Technology Review, Entergy said that it has discussed the operational aspects of the facility with Meta, but “as with all customers, Entergy Louisiana will not discuss sensitive matters on behalf of their customers.” In a letter filed with the state’s regulators in early April, Meta said Entergy’s understanding of its energy needs is, in fact, accurate.
The February motion also raised concerns over who will end up paying for the new gas plants. Entergy says Meta has signed a 15-year supply contract for the electricity that is meant to help cover the costs of building and running the power plants, but the utility didn’t respond to requests from MIT Technology Review for further details of the deal, including what happens if Meta wants to terminate the contract early.
Meta referred MIT Technology Review’s questions about the contract to Entergy but says its policy is to cover the full cost that utilities incur to serve its data centers, including grid upgrades. It also says it is spending over $200 million to support the Richland Parish data centers with new infrastructure, including roads and water systems.
Not everyone is convinced. The Alliance for Affordable Energy, which works on behalf of Louisiana residents, says that the large investments in new gas turbines could mean future rate hikes, in a state where residents already have high electricity bills and suffer from one of the country’s most unreliable grids. Of special concern is what happens after the 15 years are up.
“Our biggest long-term concern is that in 15 years, residential ratepayers [and] small businesses in Louisiana will be left holding the bag for three large gas generators,” says Logan Burke, the alliance’s executive director.
Indeed, consumers across the country have good reasons to fear that their electricity bills will go up as utilities look to meet the increased demand from AI data centers by building new generation capacity. In a paper posted in March, researchers at Harvard Law School argued that utilities “are now forcing the public to pay for infrastructure designed to supply a handful of exceedingly wealthy corporations.”
The Harvard authors write, “Utilities tell [public utility commissions] what they want to hear: that the deals for Big Tech isolate data center energy costs from other ratepayers’ bills and won’t increase consumers’ power prices.” But the complexity of the utilities’ payment data and lack of transparency in the accounting, they say, make verifying this claim “all but impossible.”
The boom in AI data centers is making Big Tech a player in our energy infrastructure and electricity future in a way unimaginable just a few years ago. At their best, AI companies could greatly facilitate the move to cleaner energy by acting as reliable and well-paying customers that provide funding that utilities can use to invest in a more robust and flexible electricity grid. This change can happen without burdening other electricity customers with additional risks and costs. But it will take AI companies committed to that vision. And it will take state regulators who ask tough questions and don’t get carried away by the potential investments being dangled by AI companies.
Huge new AI data centers like the one in Richland Parish could in fact be a major economic boon, providing new jobs, but residents deserve transparency and input into the negotiations. This is, after all, public infrastructure. Meta may come and go, but Louisiana’s residents will have to live with, and possibly pay for, the changes in the decades to come.
With seemingly no limit to the demand for artificial intelligence, everyone in the energy, AI, and climate fields is justifiably worried. Will there be enough clean electricity to power AI and enough water to cool the data centers that support this technology? These are important questions with serious implications for communities, the economy, and the environment.
But the question of AI’s energy use points to even bigger issues about what we need to do to address climate change over the next several decades. If we can’t work out how to handle this, we won’t be able to handle the broader electrification of the economy, and the climate risks we face will increase.
Innovation in IT got us to this point. Graphics processing units (GPUs) that power the computing behind AI have fallen in cost by 99% since 2006. There was similar concern about the energy use of data centers in the early 2010s, with wild projections of growth in electricity demand. But gains in computing power and energy efficiency not only proved these projections wrong but enabled a 550% increase in global computing capability from 2010 to 2018 with only minimal increases in energy use.
In the late 2010s, however, the trends that had saved us began to break. As the accuracy of AI models dramatically improved, the electricity needed for data centers also started increasing faster; they now account for 4.4% of total demand, up from 1.9% in 2018. Data centers consume more than 10% of the electricity supply in six US states. In Virginia, which has emerged as a hub of data center activity, that figure is 25%.
Projections about the future demand for energy to power AI are uncertain and range widely, but in one study, Lawrence Berkeley National Laboratory estimated that data centers could represent 6% to 12% of total US electricity use by 2028. Communities and companies will notice this type of rapid growth in electricity demand. It will put pressure on energy prices and on ecosystems. The projections have resulted in calls to build lots of new fossil-fired power plants or bring older ones out of retirement. In many parts of the US, the demand will likely result in a surge of natural-gas-powered plants.
It’s a daunting situation. Yet when we zoom out, the projected electricity use from AI is still pretty small. The US generated about 4,300 billion kilowatt-hours last year. We’ll likely need another 1,000 billion to 1,200 billion or more in the next decade—a 24% to 29% increase. Almost half the additional electricity demand will be from electrified vehicles. Another 30% is expected to be from electrified technologies in buildings and industry. Innovation in vehicle and building electrification also advanced in the last decade, and this shift will be good news for the climate, for communities, and for energy costs.
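Taking those round numbers at face value, the implied split of the new demand is easy to lay out. The following is a back-of-the-envelope restatement of the shares quoted above, not an independent forecast:

```python
# Back-of-the-envelope split of projected additional US electricity demand,
# using the round figures quoted above (illustrative restatement only).
baseline_twh = 4300                  # approximate US generation last year, in TWh

for added_twh in (1000, 1200):       # projected additional annual demand in a decade
    evs = 0.5 * added_twh            # "almost half" from electrified vehicles
    buildings = 0.3 * added_twh      # ~30% from electrified buildings and industry
    other = added_twh - evs - buildings  # remainder, including data centers
    print(f"+{added_twh} TWh: EVs ~{evs:.0f}, buildings/industry ~{buildings:.0f}, "
          f"other (incl. data centers) ~{other:.0f} TWh, "
          f"i.e. ~{100 * other / baseline_twh:.0f}% of today's generation")
```

On those figures, the residual slice that covers data centers and other new loads works out to roughly 200 to 240 terawatt-hours a year, on the order of 5% of current US generation, which is why the AI share looks small next to electrified transport and buildings.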
We also need to understand what the energy consumption and carbon emissions associated with AI are buying us. While the impacts from producing semiconductors and powering AI data centers are important, they are likely small compared with the positive or negative effects AI may have on applications such as the electricity grid, the transportation system, buildings and factories, or consumer behavior. Companies could use AI to develop new materials or batteries that would better integrate renewable energy into the grid. But they could also use AI to make it easier to find more fossil fuels. The claims about potential benefits for the climate are exciting, but they need to be continuously verified and will need support to be realized.
This isn’t the first time we’ve faced challenges coping with growth in electricity demand. In the 1960s, US electricity demand was growing at more than 7% per year. In the 1970s that growth was nearly 5%, and in the 1980s and 1990s it was more than 2% per year. Then, starting in 2005, we basically had a decade and a half of flat electricity growth. Most projections for the next decade put our expected growth in electricity demand at around 2% again—but this time we’ll have to do things differently.
To manage these new energy demands, we need a “Grid New Deal” that leverages public and private capital to rebuild the electricity system for AI with enough capacity and intelligence for decarbonization. New clean energy supplies, investment in transmission and distribution, and strategies for virtual demand management can cut emissions, lower prices, and increase resilience. Data centers bringing clean electricity and distribution system upgrades could be given a fast lane to connect to the grid. Infrastructure banks could fund new transmission lines or pay to upgrade existing ones. Direct investment or tax incentives could encourage clean computing standards, workforce development in the clean energy sector, and open data transparency from data center operators about their energy use so that communities can understand and measure the impacts.
In 2022, the White House released a Blueprint for an AI Bill of Rights that provided principles to protect the public’s rights, opportunities, and access to critical resources from being restricted by AI systems. To the AI Bill of Rights, we humbly offer a climate amendment, because ethical AI must be climate-safe AI. It’s a starting point to ensure that the growth of AI works for everyone: that it doesn’t raise people’s energy bills, that it adds more clean power to the grid than it uses, that it increases investment in the power system’s infrastructure, and that it benefits communities while driving innovation.
By grounding the conversation about AI and energy in context about what is needed to tackle climate change, we can deliver better outcomes for communities, ecosystems, and the economy. The growth of electricity demand for AI and data centers is a test case for how society will respond to the demands and challenges of broader electrification. If we get this wrong, the likelihood of meeting our climate targets will be extremely low. This is what we mean when we say the energy and climate impacts from data centers are small, but they are also huge.
Costa Samaras is the Trustee Professor of Civil and Environmental Engineering and director of the Scott Institute for Energy Innovation at Carnegie Mellon University.
Emma Strubell is the Raj Reddy Assistant Professor in the Language Technologies Institute in the School of Computer Science at Carnegie Mellon University.
Ramayya Krishnan is dean of the Heinz College of Information Systems and Public Policy and the William W. and Ruth F. Cooper Professor of Management Science and Information Systems at Carnegie Mellon University.
In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city.
Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities within the Tahoe Reno Industrial Center, a business park bigger than the city of Detroit.
Meanwhile, Microsoft acquired more than 225 acres of undeveloped property within the center and an even larger plot in nearby Silver Springs, Nevada. Apple is expanding its data center, located just across the Truckee River from the industrial park. OpenAI has said it’s considering building a data center in Nevada as well.
The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert—just far enough away from Nevada’s communities to elude wide notice and, some fear, adequate scrutiny.
Switch, a data center company based in Las Vegas, says the full build-out of its campus at the Tahoe Reno Industrial Center could exceed seven million square feet.
The full scale and potential environmental impacts of the developments aren’t known, because the footprint, energy needs, and water requirements are often closely guarded corporate secrets. Most of the companies didn’t respond to inquiries from MIT Technology Review, or declined to provide additional information about the projects.
But there’s “a whole lot of construction going on,” says Kris Thompson, who served as the longtime project manager for the industrial center before stepping down late last year. “The last number I heard was 13 million square feet under construction right now, which is massive.”
Indeed, it’s the equivalent of almost five Empire State Buildings laid out flat. In addition, public filings from NV Energy, the state’s near-monopoly utility, reveal that a dozen data-center projects, mostly in this area, have requested nearly six gigawatts of electricity capacity within the next decade.
That would make the greater Reno area—the biggest little city in the world—one of the largest data-center markets around the globe.
It would also require expanding the state’s power sector by about 40%, all for a single industry in an explosive growth stage that may, or may not, prove sustainable. The energy needs, in turn, suggest those projects could consume billions of gallons of water per year, according to an analysis conducted for this story.
Construction crews are busy building data centers throughout the Tahoe Reno Industrial Center.
The build-out of a dense cluster of energy- and water-hungry data centers in a small stretch of the nation’s driest state, where climate change is driving up temperatures faster than anywhere else in the country, has begun to raise alarms among water experts, environmental groups, and residents. That includes members of the Pyramid Lake Paiute Tribe, whose namesake water body lies within their reservation and marks the end point of the Truckee River, the region’s main source of water.
Much of Nevada has suffered through severe drought conditions for years, farmers and communities are drawing down many of the state’s groundwater reservoirs faster than they can be refilled, and global warming is sucking more and more moisture out of the region’s streams, shrubs, and soils.
“Telling entities that they can come in and stick more straws in the ground for data centers is raising a lot of questions about sound management,” says Kyle Roerink, executive director of the Great Basin Water Network, a nonprofit that works to protect water resources throughout Nevada and Utah.
“We just don’t want to be in a situation where the tail is wagging the dog,” he later added, “where this demand for data centers is driving water policy.”
Luring data centers
In the late 1850s, the mountains southeast of Reno began enticing prospectors from across the country, who hoped to strike silver or gold in the famed Comstock Lode. But Storey County had few residents or economic prospects by the late 1990s, around the time when Don Roger Norman, a media-shy real estate speculator, spotted a new opportunity in the sagebrush-covered hills.
He began buying up tens of thousands of acres of land for tens of millions of dollars and lining up development approvals to lure industrial projects to what became the Tahoe Reno Industrial Center. His partners included Lance Gilman, a cowboy-hat-wearing real estate broker, who later bought the nearby Mustang Ranch brothel and won a seat as a county commissioner.
In 1999, the county passed an ordinance that preapproves companies to develop most types of commercial and industrial projects across the business park, cutting months to years off the development process. That helped cinch deals with a flock of tenants looking to build big projects fast, including Walmart, Tesla, and Redwood Materials. Now the promise of fast permits is helping to draw data centers by the gigawatt.
On a clear, cool January afternoon, Brian Armon, a commercial real estate broker who leads the industrial practices group at NAI Alliance, takes me on a tour of the projects around the region, which mostly entails driving around the business center.
Lance Gilman, a local real estate broker, helped to develop the Tahoe Reno Industrial Center and land some of its largest tenants.
After pulling off Interstate 80 onto USA Parkway, he points out the cranes, earthmovers, and riprap foundations, where a variety of data centers are under construction. Deeper into the industrial park, Armon pulls up near Switch’s long, low, arched-roof facility, which sits on a terrace above cement walls and security gates. The Las Vegas–based company says the first phase of its data center campus encompasses more than a million square feet, and that the full build-out will cover seven times that space.
Over the next hill, we turn around in Google’s parking lot. Cranes, tents, framing, and construction equipment extend behind the company’s existing data center, filling much of the 1,210-acre lot that the search engine giant acquired in 2017.
Last August, during an event at the University of Nevada, Reno, the company announced it would spend $400 million to expand the data center campus along with another one in Las Vegas.
Thompson says that the development company, Tahoe Reno Industrial LLC, has now sold off every parcel of developable land within the park (although several lots are available for resale following the failed gamble of one crypto tenant).
When I ask Armon what’s attracting all the data centers here, he starts with the fast approvals but cites a list of other lures as well: The inexpensive land. NV Energy’s willingness to strike deals to supply relatively low-cost electricity. Cool nighttime and winter temperatures, as far as American deserts go, which reduce the energy and water needs. The proximity to tech hubs such as Silicon Valley, which cuts latency for applications in which milliseconds matter. And the lack of natural disasters that could shut down the facilities, at least for the most part.
“We are high in seismic activity,” he says. “But everything else is good. We’re not going to have a tornado or flood or a devastating wildfire.”
Then there are the generous tax policies.
In 2023, Novva, a Utah-based data center company, announced plans to build a 300,000-square-foot facility within the industrial business park.
Nevada doesn’t charge corporate income tax, and it has also enacted deep tax cuts specifically for data centers that set up shop in the state. That includes abatements of up to 75% on property tax for a decade or two—and nearly as much of a bargain on the sales and use taxes applied to equipment purchased for the facilities.
Data centers don’t require many permanent workers to run the operations, but the projects have created thousands of construction jobs. They’re also helping to diversify the region’s economy beyond casinos and generating tax windfalls for the state, counties, and cities, says Jeff Sutich, executive director of the Northern Nevada Development Authority. Indeed, just three data-center projects, developed by Apple, Google, and Vantage, will produce nearly half a billion dollars in tax revenue for Nevada, even with those generous abatements, according to the Nevada Governor’s Office of Economic Development.
The question is whether the benefits of data centers are worth the tradeoffs for Nevadans, given the public health costs, greenhouse-gas emissions, energy demands, and water strains.
The rain shadow
The Sierra Nevada’s granite peaks trace the eastern edge of California, forcing Pacific Ocean winds to rise and cool. That converts water vapor in the air into the rain and snow that fill the range’s tributaries, rivers, and lakes.
But the same meteorological phenomenon casts a rain shadow over much of neighboring Nevada, forming an arid expanse known as the Great Basin Desert. The state receives about 10 inches of precipitation a year, about a third of the national average.
The Truckee River draws from the melting Sierra snowpack at the edge of Lake Tahoe, cascades down the range, and snakes through the flatlands of Reno and Sparks. It forks at the Derby Dam, a Reclamation Act project a few miles from the Tahoe Reno Industrial Center, which diverts water to a farming region further east while allowing the rest to continue north toward Pyramid Lake.
Along the way, an engineered system of reservoirs, canals, and treatment plants diverts, stores, and releases water from the river, supplying businesses, cities, towns, and Native tribes across the region. But Nevada’s population and economy are expanding, creating more demands on these resources even as they become more constrained.
The Truckee River, which originates at Lake Tahoe and terminates at Pyramid Lake, is the major water source for cities, towns, and farms across northwestern Nevada.
Throughout much of the 2020s the state has suffered through one of the hottest and most widespread droughts on record, extending two decades of abnormally dry conditions across the American West. Some scientists fear it may constitute an emerging megadrought.
About 50% of Nevada currently faces moderate to exceptional drought conditions. In addition, more than half of the state’s hundreds of groundwater basins are already “over-appropriated,” meaning the water rights on paper exceed the levels believed to be underground.
It’s not clear if climate change will increase or decrease the state’s rainfall levels, on balance. But precipitation patterns are expected to become more erratic, whiplashing between short periods of intense rainfall and more-frequent, extended, or severe droughts.
In addition, more precipitation will fall as rain rather than snow, shortening the Sierra snow season by weeks to months over the coming decades.
“In the extreme case, at the end of the century, that’s pretty much all of winter,” says Sean McKenna, executive director of hydrologic sciences at the Desert Research Institute, a research division of the Nevada System of Higher Education.
That loss will undermine an essential function of the Sierra snowpack: reliably delivering water to farmers and cities when it’s most needed in the spring and summer, across both Nevada and California.
These shifting conditions will require the region to develop better ways to store, preserve, and recycle the water it does get, McKenna says. Northern Nevada’s cities, towns, and agencies will also need to carefully evaluate and plan for the collective impacts of continuing growth and development on the interconnected water system, particularly when it comes to water-hungry projects like data centers, he adds.
“We can’t consider each of these as a one-off, without considering that there may be tens or dozens of these in the next 15 years,” McKenna says.
Thirsty data centers
Data centers suck up water in two main ways.
As giant rooms of server racks process information and consume energy, they generate heat that must be shunted away to prevent malfunctions and damage to the equipment. The processing units optimized for training and running AI models often draw more electricity and, in turn, produce more heat.
To keep things cool, more and more data centers have turned to liquid cooling systems that don’t need as much electricity as fan cooling or air-conditioning.
These often rely on water to absorb heat and transfer it to outdoor cooling towers, where much of the moisture evaporates. Microsoft’s US data centers, for instance, could have directly evaporated nearly 185,000 gallons of “clean freshwater” in the course of training OpenAI’s GPT-3 large language model, according to a 2023 preprint study led by researchers at the University of California, Riverside. (The research has since been peer-reviewed and is awaiting publication.)
What’s less appreciated, however, is that the larger data-center drain on water generally occurs indirectly, at the power plants generating extra electricity for the turbocharged AI sector. These facilities, in turn, require more water to cool down equipment, among other purposes.
You have to add up both uses “to reflect the true water cost of data centers,” says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside and coauthor of the study.
Ren estimates that the 12 data-center projects listed in NV Energy’s report would directly consume between 860 million gallons and 5.7 billion gallons a year, based on the requested electricity capacity. (“Consumed” here means the water is evaporated, not merely withdrawn and returned to the engineered water system.) The indirect water drain associated with electricity generation for those operations could add up to 15.5 billion gallons, based on the average consumption of the regional grid.
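For a rough sense of how such estimates scale with the requested capacity, here is a simplified version of the arithmetic. The intensity factors below are illustrative assumptions chosen only to show the calculation; they are not the inputs from Ren’s study:

```python
# Back-of-the-envelope water math for ~6 GW of requested data-center capacity.
# Intensity factors are illustrative assumptions, not the study's inputs.
capacity_gw = 6.0
hours_per_year = 8760
energy_mwh = capacity_gw * 1000 * hours_per_year      # ~52.6 million MWh/yr if run flat out

direct_gal_per_mwh = (16, 108)    # assumed on-site cooling consumption (air- vs. evaporative-cooled)
indirect_gal_per_mwh = 295        # assumed grid-average water consumed per MWh generated

direct_low, direct_high = (energy_mwh * g for g in direct_gal_per_mwh)
indirect = energy_mwh * indirect_gal_per_mwh

print(f"direct:   {direct_low / 1e9:.1f} to {direct_high / 1e9:.1f} billion gallons per year")
print(f"indirect: {indirect / 1e9:.1f} billion gallons per year")
```

With these placeholder factors, the direct figure lands between well under a billion and nearly six billion gallons a year, and the indirect figure around 15 billion, the same order of magnitude as the estimates above.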
The exact water figures would depend on shifting climate conditions, the type of cooling systems each data center uses, and the mix of power sources that supply the facilities.
Solar power, which provides roughly a quarter of Nevada’s power, requires relatively little water to operate, for instance. But natural-gas plants, which generate about 56%, withdraw 2,803 gallons per megawatt-hour on average, according to the Energy Information Administration.
Geothermal plants, which produce about 10% of the state’s electricity by cycling water through hot rocks, generally consume less water than fossil fuel plants do but often require more water than other renewables, according to some research.
But here too, the water usage varies depending on the type of geothermal plant in question. Google has lined up several deals to partially power its data centers through Fervo Energy, which has helped to commercialize an emerging approach that injects water under high pressure to fracture rock and form wells deep below the surface.
The company stresses that it doesn’t evaporate water for cooling and that it relies on brackish groundwater, not fresh water, to develop and run its plants. In a recent post, Fervo noted that its facilities consume significantly less water per megawatt-hour than coal, nuclear, or natural-gas plants do.
Part of NV Energy’s proposed plan to meet growing electricity demands in Nevada includes developing several natural-gas peaking units, adding more than one gigawatt of solar power and installing another gigawatt of battery storage. It’s also forging ahead with a more than $4 billion transmission project.
But the company didn’t respond to questions concerning how it will supply all of the gigawatts of additional electricity requested by data centers, if the construction of those power plants will increase consumer rates, or how much water those facilities are expected to consume.
NV Energy operates a transmission line, substation, and power plant in or around the Tahoe Reno Industrial Center.
“NV Energy teams work diligently on our long-term planning to make investments in our infrastructure to serve new customers and the continued growth in the state without putting existing customers at risk,” the company said in a statement.
An added challenge is that data centers need to run around the clock. That will often compel utilities to develop new electricity-generating sources that can run nonstop as well, as natural-gas, geothermal, or nuclear plants do, says Emily Grubert, an associate professor of sustainable energy policy at the University of Notre Dame, who has studied the relative water consumption of electricity sources.
“You end up with the water-intensive resources looking more important,” she adds.
Even if NV Energy and the companies developing data centers do strive to power them through sources with relatively low water needs, “we only have so much ability to add six gigawatts to Nevada’s grid,” Grubert explains. “What you do will never be system-neutral, because it’s such a big number.”
Securing supplies
On a mid-February morning, I meet TRI’s Thompson and Don Gilman, Lance Gilman’s son, at the Storey County offices, located within the industrial center.
“I’m just a country boy who sells dirt,” Gilman, also a real estate broker, says by way of introduction.
We climb into his large SUV and drive to a reservoir in the heart of the industrial park, filled nearly to the lip.
Thompson explains that much of the water comes from an on-site treatment facility that filters waste fluids from companies in the park. In addition, tens of millions of gallons of treated effluent will also likely flow into the tank this year from the Truckee Meadows Water Authority Reclamation Facility, near the border of Reno and Sparks. That’s thanks to a 16-mile pipeline that the developers, the water authority, several tenants, and various local cities and agencies partnered to build, through a project that began in 2021.
“Our general improvement district is furnishing that water to tech companies here in the park as we speak,” Thompson says. “That helps preserve the precious groundwater, so that is an environmental feather in the cap for these data centers. They are focused on environmental excellence.”
The reservoir within the industrial business park provides water to data centers and other tenants.
But data centers often need drinking-quality water—not wastewater merely treated to irrigation standards—for evaporative cooling, “to avoid pipe clogs and/or bacterial growth,” the UC Riverside study notes. For instance, Google says its data centers withdrew about 7.7 billion gallons of water in 2023, and nearly 6 billion of those gallons were potable.
Tenants in the industrial park can potentially obtain access to water from the ground and the Truckee River, as well. From early on, the master developers worked hard to secure permits to water sources, since they are nearly as precious as development entitlements to companies hoping to build projects in the desert.
Initially, the development company controlled a private business, the TRI Water and Sewer Company, that provided those services to the business park’s tenants, according to public documents. The company set up wells, a water tank, distribution lines, and a sewer disposal system.
But in 2000, the board of county commissioners established a general improvement district, a legal mechanism for providing municipal services in certain parts of the state, to manage electricity and then water within the center. It, in turn, hired TRI Water and Sewer as the operating company.
As of its 2020 service plan, the general improvement district held permits for nearly 5,300 acre-feet of groundwater, “which can be pumped from well fields within the service area and used for new growth as it occurs.” The document lists another 2,000 acre-feet per year available from the on-site treatment facility, 1,000 from the Truckee River, and 4,000 more from the effluent pipeline.
Those figures haven’t budged much since, according to Shari Whalen, general manager of the TRI General Improvement District. All told, they add up to more than 4 billion gallons of water per year for all the needs of the industrial park and the tenants there, data centers and otherwise.
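Those figures are straightforward to convert: one acre-foot is about 325,851 gallons, so a quick tally of the amounts listed in the service plan looks like this (a simple sanity check, not an official accounting):

```python
# Sanity check: convert the TRI General Improvement District's listed supplies
# from acre-feet per year to gallons per year.
GALLONS_PER_ACRE_FOOT = 325_851   # standard US conversion

supplies_acre_feet = {
    "groundwater permits": 5_300,
    "on-site treatment facility": 2_000,
    "Truckee River": 1_000,
    "effluent pipeline": 4_000,
}

total_af = sum(supplies_acre_feet.values())
total_gal = total_af * GALLONS_PER_ACRE_FOOT
print(f"{total_af:,} acre-feet/year ≈ {total_gal / 1e9:.2f} billion gallons/year")
# -> 12,300 acre-feet/year ≈ 4.01 billion gallons/year
```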
Whalen says that the amount and quality of water required for any given data center depends on its design, and that those matters are worked out on a case-by-case basis.
When asked if the general improvement district is confident that it has adequate water resources to supply the needs of all the data centers under development, as well as other tenants at the industrial center, she says: “They can’t just show up and build unless they have water resources designated for their projects. We wouldn’t approve a project if it didn’t have those water resources.”
Water battles
As the region’s water sources have grown more constrained, lining up supplies has become an increasingly high-stakes and controversial business.
More than a century ago, the US federal government filed a lawsuit against an assortment of parties pulling water from the Truckee River. The suit would eventually establish that the Pyramid Lake Paiute Tribe’s legal rights to water for irrigation superseded other claims. But the tribe has been fighting to protect those rights and increase flows from the river ever since, arguing that increasing strains on the watershed from upstream cities and businesses threaten to draw away water reserved for reservation farming, decrease lake levels, and harm native fish.
The Pyramid Lake Paiute Tribe considers the water body and its fish, including the endangered cui-ui and threatened Lahontan cutthroat trout, to be essential parts of its culture, identity, and way of life. The tribe was originally named Cui-ui Ticutta, which translates to cui-ui eaters. The lake continues to provide sustenance as well as business for the tribe and its members, a number of whom operate boat charters and fishing guide services.
“It’s completely tied into us as a people,” says Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe.
“That is what has sustained us all this time,” he adds. “It’s just who we are. It’s part of our spiritual well-being.”
Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe, fears that data centers will divert water that would otherwise reach the tribe’s namesake lake.
In recent decades, the tribe has sued the Nevada State Engineer, Washoe County, the federal government, and others for overallocating water rights and endangering the lake’s fish. It also protested the TRI General Improvement District’s applications to draw thousands of additional acre‑feet of groundwater from a basin near the business park. In 2019, the State Engineer’s office rejected those requests, concluding that the basin was already fully appropriated.
More recently, the tribe took issue with the plan to build the pipeline and divert effluent that would otherwise have flowed into the Truckee, securing an agreement that required the Truckee Meadows Water Authority and other parties to add back several thousand acre-feet of water to the river.
Whalen says she’s sensitive to Wadsworth’s concerns. But she says that the pipeline promises to keep a growing amount of treated wastewater out of the river, where it could otherwise contribute to rising salt levels in the lake.
“I think that the pipeline from [the Truckee Meadows Water Authority] to our system is good for water quality in the river,” she says. “I understand philosophically the concerns about data centers, but the general improvement district is dedicated to working with everyone on the river for regional water-resource planning—and the tribe is no exception.”
Water efficiency
In an email, Thompson added that he has “great respect and admiration” for the tribe and has visited the reservation several times in an effort to help bring industrial or commercial development there.
He stressed that all of the business park’s groundwater was “validated by the State Water Engineer,” and that the rights to surface water and effluent were purchased “for fair market value.”
During the earlier interview at the industrial center, he and Gilman had both expressed confidence that tenants in the park have adequate water supplies, and that the businesses won’t draw water away from other areas.
“We’re in our own aquifer, our own water basin here,” Thompson said. “You put a straw in the ground here, you’re not going to pull water from Fernley or from Reno or from Silver Springs.”
Gilman also stressed that data-center companies have gotten more water efficient in recent years, echoing a point others made as well.
“With the newer technology, it’s not much of a worry,” says Sutich, of the Northern Nevada Development Authority. “The technology has come a long way in the last 10 years, which is really giving these guys the opportunity to be good stewards of water usage.”
An aerial view of the cooling tower fans at Google’s data center in the Tahoe Reno Industrial Center.
Indeed, Google’s existing Storey County facility is air-cooled, according to the company’s latest environmental report. The data center withdrew 1.9 million gallons in 2023 but only consumed 200,000 gallons. The rest cycles back into the water system.
Google said all the data centers under construction on its campus will also “utilize air-cooling technology.” The company didn’t respond to a question about the scale of its planned expansion in the Tahoe Reno Industrial Center, and referred a question about indirect water consumption to NV Energy.
The search giant has stressed that it strives to be water efficient across all of its data centers, and decides whether to use air or liquid cooling based on local supply and projected demand, among other variables.
Four years ago, the company set a goal of replenishing more water than it consumes by 2030. Locally, it also committed to provide half a million dollars to the National Forest Foundation to improve the Truckee River watershed and reduce wildfire risks.
Microsoft clearly suggested in earlier news reports that the Silver Springs land it purchased around the end of 2022 would be used for a data center. NAI Alliance’s real estate market report identifies that lot, as well as the parcel Microsoft purchased within the Tahoe Reno Industrial Center, as data center sites.
But the company now declines to specify what it intends to build in the region.
“While the land purchase is public knowledge, we have not disclosed specific details [of] our plans for the land or potential development timelines,” wrote Donna Whitehead, a Microsoft spokesperson, in an email.
Workers have begun grading land inside a fenced-off lot within the Tahoe Reno Industrial Center.
Microsoft has also scaled down its global data-center ambitions, backing away from several projects in recent months amid shifting economic conditions, according to various reports.
Whatever it ultimately does or doesn’t build, the company stresses that it has made strides to reduce water consumption in its facilities. Late last year, the company announced that it’s using “chip-level cooling solutions” in data centers, which continually circulate water between the servers and chillers through a closed loop that the company claims doesn’t lose any water to evaporation. It says the design requires only a “nominal increase” in energy compared to its data centers that rely on evaporative water cooling.
Others seem to be taking a similar approach. EdgeCore also said its 900,000-square-foot data center at the Tahoe Reno Industrial Center will rely on an “air-cooled closed-loop chiller” that doesn’t require water evaporation for cooling.
But some of the companies seem to have taken steps to ensure access to significant amounts of water. Switch, for instance, took a lead role in developing the effluent pipeline. In addition, Tract, which develops campuses on which third-party data centers can build their own facilities, has said it lined up more than 1,100 acre-feet of water rights, the equivalent of nearly 360 million gallons a year.
Apple, Novva, Switch, Tract, and Vantage didn’t respond to inquiries from MIT Technology Review.
Coming conflicts
The suggestion that companies aren’t straining water supplies when they adopt air cooling is, in many cases, akin to saying they’re not responsible for the greenhouse gases produced by their power use simply because the emissions occur outside their facilities. In fact, the additional water used at a power plant to meet the increased electricity needs of air cooling may exceed any gains at the data center, says Ren, of UC Riverside.
“That’s actually very likely, because it uses a lot more energy,” he adds.
That means that some of the companies developing data centers in and around Storey County may simply hand off their water challenges to other parts of Nevada or neighboring states across the drying American West, depending on where and how the power is generated, Ren says.
Google has said its air-cooled facilities require about 10% more electricity, and its environmental report notes that the Storey County facility is one of its two least-energy-efficient data centers.
Pipes running along Google’s data center campus help the search company cool its servers.
Some fear there’s also a growing mismatch between what Nevada’s water permits allow, what’s actually in the ground, and what nature will provide as climate conditions shift. Notably, the groundwater committed to all parties from the Tracy Segment basin—a long-fought-over resource that partially supplies the TRI General Improvement District—already exceeds the “perennial yield.” That refers to the maximum amount that can be drawn out every year without depleting the reservoir over the long term.
“If pumping does ultimately exceed the available supply, that means there will be conflict among users,” Roerink, of the Great Basin Water Network, said in an email. “So I have to wonder: Who could be suing whom? Who could be buying out whom? How will the tribe’s rights be defended?”
The Truckee Meadows Water Authority, the community-owned utility that manages the water system for Reno and Sparks, said it is planning carefully for the future and remains confident there will be “sufficient resources for decades to come,” at least within its territory east of the industrial center.
Storey County, the Truckee-Carson Irrigation District, and the State Engineer’s office didn’t respond to questions or accept interview requests.
Open for business
As data center proposals have begun shifting into Northern Nevada’s cities, more local residents and organizations have begun to take notice and express concerns. The regional division of the Sierra Club, for instance, recently sought to overturn the approval of Reno’s first data center, about 20 miles west of the Tahoe Reno Industrial Center.
Olivia Tanager, director of the Sierra Club’s Toiyabe Chapter, says the environmental organization was shocked by the projected electricity demands from data centers highlighted in NV Energy’s filings.
Nevada’s wild horses are a common sight along USA Parkway, the highway cutting through the industrial business park.
“We have increasing interest in understanding the impact that data centers will have to our climate goals, to our grid as a whole, and certainly to our water resources,” she says. “The demands are extraordinary, and we don’t have that amount of water to toy around with.”
During a city hall hearing in January that stretched late into the evening, she and a line of residents raised concerns about the water, energy, climate, and employment impacts of AI data centers. At the end, though, the city council upheld the planning department’s approval of the project, on a 5-2 vote.
“Welcome to Reno,” Kathleen Taylor, Reno’s vice mayor, said before casting her vote. “We’re open for business.”
Where the river ends
In late March, I walk alongside Chairman Wadsworth, of the Pyramid Lake Paiute Tribe, on the shores of Pyramid Lake, watching a row of fly-fishers in waders cast their lines into the cold waters.
The lake is the largest remnant of Lake Lahontan, an Ice Age inland sea that once stretched across western Nevada and would have submerged present-day Reno. But as the climate warmed, the lapping waters retreated, etching erosional terraces into the mountainsides and exposing tufa deposits around the lake, large formations of porous rock made of calcium carbonate. That includes the pyramid-shaped island on the eastern shore that inspired the lake’s name.
A lone angler stands along the shores of Pyramid Lake.
In the decades after the US Reclamation Service completed the Derby Dam in 1905, Pyramid Lake declined another 80 feet and nearby Winnemucca Lake dried up entirely.
“We know what happens when water use goes unchecked,” says Wadsworth, gesturing eastward toward the range across the lake, where Winnemucca once filled the next basin over. “Because all we have to do is look over there and see a dry, barren lake bed that used to be full.”
In an earlier interview, Wadsworth acknowledged that the world needs data centers. But he argued they should be spread out across the country, not densely clustered in the middle of the Nevada desert.
Given the fierce competition for resources up to now, he can’t imagine how there could be enough water to meet the demands of data centers, expanding cities, and other growing businesses without straining the limited local supplies that should, by his accounting, flow to Pyramid Lake.
He fears these growing pressures will force the tribe to wage new legal battles to protect their rights and preserve the lake, extending what he refers to as “a century of water wars.”
“We have seen the devastating effects of what happens when you mess with Mother Nature,” Wadsworth says. “Part of our spirit has left us. And that’s why we fight so hard to hold on to what’s left.”