Google’s electricity demand is skyrocketing

We got two big pieces of energy news from Google this week. The company announced that it’s signed an agreement to purchase electricity from a fusion company’s forthcoming first power plant. Google also released its latest environmental report, which shows that its energy use from data centers has doubled since 2020.

Taken together, these two bits of news offer a fascinating look at just how desperately big tech companies are hunting for clean electricity to power their data centers as energy demand and emissions balloon in the age of AI. Of course, we don’t know exactly how much of this pollution is attributable to AI because Google doesn’t break that out. (Also a problem!) So, what’s next and what does this all mean? 

Let’s start with fusion: Google’s deal with Commonwealth Fusion Systems is intended to provide the tech giant with 200 megawatts of power. This will come from Commonwealth’s first commercial plant, a facility planned for Virginia that the company refers to as the Arc power plant. The agreement represents half of that plant’s planned capacity.

What’s important to note here is that this power plant doesn’t exist yet. In fact, Commonwealth still needs to get its Sparc demonstration reactor, located outside Boston, up and running. That site, which I visited in the fall, should be completed in 2026.

(An aside: This isn’t the first deal between Big Tech and a fusion company. Microsoft signed an agreement with Helion a couple of years ago to buy 50 megawatts of power from a planned power plant, scheduled to come online in 2028. Experts expressed skepticism in the wake of that deal, as my colleague James Temple reported.)

Nonetheless, Google’s announcement is a big moment for fusion, in part because of the size of the commitment and also because Commonwealth, a spinout company from MIT’s Plasma Science and Fusion Center, is seen by many in the industry as a likely candidate to be the first to get a commercial plant off the ground. (MIT Technology Review is owned by MIT but is editorially independent.)

Google leadership was very up-front about the length of the timeline. “We would certainly put this in the long-term category,” said Michael Terrell, Google’s head of advanced energy, in a press call about the deal.

The news of Google’s foray into fusion comes just days after the tech giant’s release of its latest environmental report. While the company highlighted some wins, some of the numbers in this report are eye-catching, and not in a positive way.

Google’s emissions have increased by over 50% since 2019, rising 6% in the last year alone. That’s decidedly the wrong direction for a company that’s set a goal to reach net-zero greenhouse-gas emissions by the end of the decade.

It’s true that the company has committed billions to clean energy projects, including big investments in next-generation technologies like advanced nuclear and enhanced geothermal systems. Those deals have helped dampen emissions growth, but it’s an arguably impossible task to keep up with the energy demand the company is seeing.

Google’s electricity consumption from data centers was up 27% from the year before. It’s doubled since 2020, reaching over 30 terawatt-hours. That’s nearly the annual electricity consumption from the entire country of Ireland.
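If you want to check the math on those growth figures yourself, a couple of lines will do it. (The 2020 baseline below is inferred from the “doubled since 2020” claim; the report doesn’t state it directly.)

```python
# Rough growth arithmetic implied by the report's figures.
# Assumption: "doubled since 2020" means 2024 consumption is 2x the 2020 level.
consumption_2024_twh = 30.0                       # "over 30 terawatt-hours"
consumption_2020_twh = consumption_2024_twh / 2   # inferred baseline

# Average annual growth rate needed to double over four years
years = 4
cagr = (consumption_2024_twh / consumption_2020_twh) ** (1 / years) - 1
print(f"Implied average annual growth, 2020-2024: {cagr:.1%}")
```

Doubling over four years works out to roughly 19% average annual growth, so the 27% jump in the most recent year suggests the pace is accelerating rather than leveling off.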

As an outsider, it’s tempting to point the finger at AI, since that technology has crashed into the mainstream and percolated into every corner of Google’s products and business. And yet the report downplays the role of AI. Here’s one bit that struck me:

“However, it’s important to note that our growing electricity needs aren’t solely driven by AI. The accelerating growth of Google Cloud, continued investments in Search, the expanding reach of YouTube, and more, have also contributed to this overall growth.”

There is enough wiggle room in that statement to drive a large electric truck through. When I asked about the relative contributions here, company representative Mara Harris said via email that they don’t break out what portion comes from AI. When I followed up asking if the company didn’t have this information or just wouldn’t share it, she said she’d check but didn’t get back to me.

I’ll make the point here that we’ve made before, including in our recent package on AI and energy: Big companies should be disclosing more about the energy demands of AI. We shouldn’t be guessing at this technology’s effects.

Google has put a ton of effort and resources into setting and chasing ambitious climate goals. But as its energy needs and those of the rest of the industry continue to explode, it’s obvious that this problem is getting tougher, and it’s also clear that more transparency is a crucial part of the way forward.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

This battery recycling company is now cleaning up AI data centers

In a sandy industrial lot outside Reno, Nevada, rows of battery packs that once propelled electric vehicles are now powering a small AI data center.

Redwood Materials, one of the US’s largest battery recycling companies, showed off this array of energy storage modules, sitting on cinder blocks and wrapped in waterproof plastic, during a press tour at its headquarters on June 26. 

The event marked the launch of the company’s new business line, Redwood Energy, which will initially repurpose (rather than recycle) batteries with years of remaining life to create renewable-powered microgrids. Such small-scale energy systems can operate on or off the larger electricity grid, providing electricity for businesses or communities.

Redwood Materials says many of the batteries it takes in for processing retain more than half their capacity. 

“We can extract a lot more value from that material by using it as an energy storage project before recycling it,” JB Straubel, Redwood’s founder and chief executive, said at the event. 

This first microgrid, housed at the company’s facility in the Tahoe Reno Industrial Center, is powered by solar panels and includes 64 megawatt-hours of battery storage, making it one of the nation’s largest such systems. That power flows to Crusoe, a cryptocurrency miner that pivoted to developing AI data centers, which has built a facility with 2,000 graphics processing units adjacent to the lot of repurposed EV batteries. 

(That’s tiny as modern data centers go: Crusoe is developing a $500 billion AI data center for OpenAI and others in Abilene, Texas, where it expects to install 100,000 GPUs across its first two facilities by the end of the year, according to Forbes.)

Redwood’s project underscores a growing interest in powering data centers partially or entirely outside the electric grid. Not only would such microgrids be quicker to build than conventional power plants, but consumer ratepayers wouldn’t be on the hook for the cost of new grid-connected power plants developed to serve AI data centers. 

Since Redwood’s batteries are used, and have already been removed from vehicles, the company says its microgrids should also be substantially cheaper than ones assembled from new batteries.

A close-up of a grid of battery packs from Redwood Materials.

COURTESY REDWOOD MATERIALS

Redwood Energy’s microgrids could generate electricity for any kind of operation. But the company stresses they’re an ideal fit for addressing the growing energy needs and climate emissions of data centers. The energy consumption of such facilities could double by 2030, mainly due to the ravenous appetite of AI, according to an April report by the International Energy Agency.

“Storage is this perfectly positioned technology, especially low-cost storage, to attack each of those problems,” Straubel says.

The Tahoe Reno Industrial Center is the epicenter of a data center development boom in northern Nevada that has sparked growing concerns about climate emissions and excessive demand for energy and water, as MIT Technology Review recently reported.

Straubel says the litany of data centers emerging around it “would be logical targets” for its new business line, but adds there are growth opportunities across the expanding data center clusters in Texas, Virginia, and the Midwest as well.

“We’re talking to a broad cross section of those companies,” he says.

Crusoe, which also provides cloud services, recently announced a joint venture with the investment firm Engine No. 1 to provide “powered data center real estate solutions” to AI companies by constructing 4.5 gigawatts of new natural-gas plants.

Redwood’s microgrid should provide more than 99% of the electricity Crusoe’s local facilities need. In the event of extended periods with little sunlight, a rarity in the Nevada desert, the company could still draw from the standard power grid.

Cully Cavness, cofounder and operating chief of Crusoe, says the company is already processing AI queries and producing conclusions for its customers at the Nevada facility. (Its larger data centers are dedicated to the more computationally intensive process of training AI models.)

Redwood’s new business division offers a test case for a strategy laid out in a paper late last year, which highlighted the potential for solar-powered microgrids to supply the energy that AI data centers need.

The authors of that paper found that microgrids could be built much faster than natural-gas plants and would generally be only a little more expensive as an energy source for data centers, so long as the facilities could occasionally rely on natural-gas generators to get them through extended periods of low sunlight.

If solar-powered microgrids were used to power 30 gigawatts of new AI data centers, with just 10% backup from natural gas, it would eliminate 400 million tons of carbon dioxide emissions relative to running the centers entirely on natural gas, the study found. 

“Having a data center running off solar and storage is more or less what we were advocating for in our paper,” says Zeke Hausfather, climate lead at the payments company Stripe and a coauthor of the paper. He hopes that Redwood’s new microgrid will establish that “these sorts of systems work in the real world” and encourage other data center developers to look for similar solutions. 

Redwood Materials says electric vehicles are its fastest-growing source of used batteries, and it estimates that more than 100,000 EVs will come off US roads this year.

The company says it tests each battery to determine whether it can be reused. Those that qualify will be integrated into its modular storage systems, which can then store up energy from wind and solar installations or connect to the grid. As those batteries reach the end of their life, they’ll be swapped out of the microgrids and moved into the company’s recycling process. 

Redwood says it already has enough reusable batteries to build a gigawatt-hour’s worth of microgrids, capable of powering a little more than a million homes for an hour. In addition, the company’s new division has begun designing microgrids that are 10 times larger than the one it unveiled this week.

Straubel expects Redwood Energy to become a major business line, conceivably surpassing the company’s core recycling operation someday.

“We’re confident this is the lowest-cost solution out there,” he says.

It’s officially summer, and the grid is stressed

It’s crunch time for the grid this week. As I’m writing this newsletter, it’s 100 °F (nearly 38 °C) here in New Jersey, and I’m huddled in the smallest room in my apartment with the shades drawn and a single window air conditioner working overtime.  

Large swaths of the US have seen brutal heat this week, with multiple days in a row nearing or exceeding record-breaking temperatures. Spain recently went through a dramatic heat wave too, as did the UK, which is unfortunately bracing for another one soon. As I’ve been trying to stay cool, I’ve had my eyes on a website tracking electricity demand, which is also hitting record highs. 

We rely on electricity to keep ourselves comfortable, and more to the point, safe. These are the moments we design the grid for: when need is at its very highest. The key to keeping everything running smoothly during these times might be just a little bit of flexibility. 

While heat waves happen all over the world, let’s take my local grid as an example. I’m one of the roughly 65 million people covered by PJM Interconnection, the largest grid operator in the US. PJM covers Virginia, West Virginia, Ohio, Pennsylvania, and New Jersey, as well as parts of several neighboring states and Washington, DC.

Earlier this year, PJM forecast that electricity demand would peak at 154 gigawatts (GW) this summer. On Monday, just a few days past the official start of the season, the grid blew past that, averaging over 160 GW between 5 p.m. and 6 p.m. 

The fact that we’ve already passed both last year’s peak and this year’s forecasted one isn’t necessarily a disaster (PJM says the system’s total capacity is over 179 GW this year). But it is a good reason to be a little nervous. Usually, PJM sees its peak in July or August. As a reminder, it’s June. So we shouldn’t be surprised if we see electricity demand creep to even higher levels later in the summer.
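For a sense of the margins involved, here’s a quick sketch using the numbers above. (The reserve-margin framing is mine, not PJM’s.)

```python
# How much room was left on Monday, per the figures cited above.
forecast_peak_gw = 154.0   # PJM's pre-season forecast for this summer
observed_peak_gw = 160.0   # Monday, 5-6 p.m. average
capacity_gw = 179.0        # PJM's stated total capacity this year

overshoot_gw = observed_peak_gw - forecast_peak_gw
headroom = (capacity_gw - observed_peak_gw) / capacity_gw
print(f"Demand beat the forecast by {overshoot_gw:.0f} GW")
print(f"Capacity headroom remaining: {headroom:.0%}")
```

That leaves only about 11% of headroom in June, before the July or August peak even arrives.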

It’s not just PJM, either. MISO, the grid that covers most of the Midwest and part of the US South, put out a notice that it expected to be close to its peak demand this week. And the US Department of Energy released an emergency order for parts of the Southeast, which allows the local utility to boost generation and skirt air pollution limits while demand is high.

This pattern of maxing out the grid is only going to continue. That’s because climate change is pushing temperatures higher, and electricity demand is simultaneously swelling (in part because of data centers like those that power AI). PJM’s forecasts show that the summer peak in 2035 could reach nearly 210 GW, well beyond the 179 GW it can provide today. 

Of course, we need more power plants to be built and connected to the grid in the coming years (at least if we don’t want to keep ancient, inefficient, expensive coal plants running, as we covered last week). But there’s a quiet strategy that could limit the new construction needed: flexibility.

The power grid has to be built for moments of the absolute highest demand we can predict, like this heat wave. But most of the time, a decent chunk of capacity that exists to get us through these peaks sits idle—it only has to come online when demand surges. Another way to look at that, however, is that by shaving off demand during the peak, we can reduce the total infrastructure required to run the grid. 

If you live somewhere that’s seen a demand crunch during a heat wave, you might have gotten an email from your utility asking you to hold off on running the dishwasher in the early evening or to set your air conditioner a few degrees higher. These are called demand response programs. Some utilities run more formal versions, paying customers to ramp down their usage during periods of peak demand.

PJM’s demand response programs add up to almost eight gigawatts, enough to supply over 6 million homes. With these programs, PJM basically avoids having to fire up the equivalent of multiple massive nuclear power plants. (It did activate them on Monday afternoon, during the hottest part of the day.)

As electricity demand goes up, building in and automating this sort of flexibility could go a long way to reducing the amount of new generation needed. One report published earlier this year found that if data centers agreed to have their power curtailed for just 0.5% of the time (around 40 hours out of a year of continuous operation), the grid could handle about 18 GW of new power demand in the PJM region without adding generation capacity. 

For the whole US, this level of flexibility would allow the grid to take on an additional 98 gigawatts of new demand without building any new power plants to meet it. To give you a sense of just how significant that would be, all the nuclear reactors in the US add up to 97 gigawatts of capacity.
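The study’s numbers are easy to verify with a couple of lines (8,760 hours in a non-leap year; the gigawatt figures are as cited above):

```python
# Sanity-checking the flexibility study's arithmetic.
hours_per_year = 8760
curtailment_share = 0.005            # power curtailed 0.5% of the time

curtailed_hours = curtailment_share * hours_per_year
print(f"Curtailed hours per year: {curtailed_hours:.0f}")  # ~44, i.e. "around 40"

new_demand_gw = 98        # new US demand the grid could absorb, per the study
nuclear_fleet_gw = 97     # combined capacity of all US nuclear reactors
print(f"That headroom exceeds the entire US nuclear fleet by "
      f"{new_demand_gw - nuclear_fleet_gw} GW")
```

In other words, giving up less than two days’ worth of full-power operation per year frees up more capacity than every nuclear reactor in the country combined.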

Tweaking the thermostat and ramping down data centers during hot summer days won’t solve the demand crunch on their own, but it certainly won’t hurt to have more flexibility.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

The Debrief: Power and energy

It may sound bluntly obvious, but energy is power. Those who can produce it, especially lots of it, get to exert authority in all sorts of ways. It brings revenue and enables manufacturing, data processing, transportation, and military might. Energy resources are arguably a nation’s most important asset. Look at Russia, or Saudi Arabia, or China, or Canada, or Qatar, or—for that matter—the US. For all these nations, energy production plays key roles in their economies and their outsize global status. (Qatar, for example, has a population roughly the size of metro Portland, Oregon.) 

The US has always been a nation of energy and industry. It was a major producer of coal, which fed the Industrial Revolution. World War II was won in large part by the energy production in the United States—which fueled both manufacturing of the war machine at home and its ships, planes, and tanks in the Pacific and Europe. Throughout its history, the country has found strength in energy production. 

Yet in many ways right now the US seems to be forgetting those lessons. It is moving backward in terms of its clean-energy strategy, especially when it comes to powering the grid, in ways that will affect the nation for decades to come—even as China and others are surging forward. And that retreat is taking place just as electricity demand and usage are growing again after being flat for nearly two decades. That growth, according to the US Energy Information Administration, is “coming from the commercial sector, which includes data centers, and the industrial sector, which includes manufacturing establishments.” 

As MIT Technology Review has extensively reported, energy demand from data centers is set to soar, not plateau, as AI inhales ever more electricity from the grid. As my colleagues James O’Donnell and Casey Crownhart reported, by 2028 the share of US electricity going to power data centers may triple. (For the full report, see technologyreview.com/energy-ai.)

Both manufacturing and data centers are obviously priorities for the US writ large and the Trump administration in particular. Given those priorities, it’s surprising to see the administration and Congress making moves that would both decrease our potential energy supply and increase demand by lowering efficiency. 

This will be most true for electricity generation. The administration’s proposed budget, still being considered as we went to press, would roll back tax credits for wind, solar, and other forms of clean energy. For households, the cuts would hit credits for rooftop solar panels and residential energy efficiency programs. Simultaneously, the US is trying to roll back efficiency standards for household appliances. These standards are key to keeping consumer electricity prices down by decreasing demand. 

In short, what most analysts are expecting is more strain on the grid, which means prices will go up for everyone. Meanwhile, rollbacks to the Inflation Reduction Act and to credits for advanced manufacturing mean that fewer future-facing energy sources will be built. 

This is just belligerently shortsighted. 

That’s especially true because as the US takes steps to make energy less abundant and more expensive, China—our ostensible chief international antagonist—is moving in exactly the opposite direction. The country has made massive strides in renewable energy generation, hitting its goals six years ahead of schedule. In fact, China is now producing so much clean energy that its carbon dioxide emissions are declining as a result.

This issue is about power, in all its forms. Yet whether you’re talking about the ability to act or the act of providing electricity, power comes from energy. So when it comes to energy, we need “ands,” not “ors.” We need nuclear and solar and wind and hydropower and hydrogen and geothermal and batteries on the grid. And we need efficiency. And yes, we even need oil and gas in the medium term while we ramp up cleaner sources. That is the way to maintain and increase our prosperity, and the only way we can possibly head off some of the worst consequences of climate change.

Inside the US power struggle over coal

Coal power is on life support in the US. It used to carry the grid with cheap electricity, but now plants are closing left and right.

There are a lot of potential reasons to let coal continue its journey to the grave. Carbon emissions from coal plants are a major contributor to climate change. And those facilities are also often linked with health problems in nearby communities, as reporter Alex Kaufman explored in a new feature story on Puerto Rico’s only coal-fired power plant.

But the Trump administration wants to keep coal power alive, and the US Department of Energy recently ordered some plants to stay open past their scheduled closures. Here’s why there’s a power struggle over coal.

Coal used to be king in the US, but the country has dramatically reduced its dependence on the fuel over the past two decades. It accounted for about 20% of the electricity generated in 2024, down from roughly half in 2000.

While the demise of coal has been great for US emissions, the real driver is economics. Coal used to be the cheapest form of electricity generation around, but the fracking boom handed that crown to natural gas over a decade ago. And now, even cheaper wind and solar power is coming online in droves.

Economics was a major factor in the planned retirement of the J.H. Campbell coal plant in Michigan, which was set to close at the end of May, Dan Scripps, chair of the Michigan Public Service Commission, told the Washington Post.

Then, on May 23, US Energy Secretary Chris Wright released an emergency order that requires the plant to remain open. Wright’s order mandates 90 more days of operation, and the order can be extended past that, too. It states that the goal is to minimize the risk of blackouts and address grid security issues before the start of summer.

The DOE’s authority to require power plants to stay open is something that’s typically used in emergencies like hurricanes, rather than in response to something as routine as … seasons changing. 

It’s true that there’s growing concern in the US about meeting demand for electricity, which is rising for the first time after being basically flat for decades. (The recent rise is in large part due to massive data centers, like those needed to run AI. Have I mentioned we have a great package on AI and energy?)

And we are indeed heading toward summer, which is when the grid is stretched to its limits. In the New York area, the forecast high is nearly 100 °F (38 °C) for several days next week—I’ll certainly have my air conditioner on, and I’m sure I’ll soon be getting texts asking me to limit electricity use during times of peak demand.

But is keeping old coal plants open the answer to a stressed grid?

It might not be the most economical way forward. In fact, in almost every case today, it’s actually cheaper to build new renewables capacity than to keep existing coal plants running in the US, according to a 2023 report from Energy Innovation, an energy think tank. And coal is only getting more expensive—in an updated analysis, Energy Innovation found that three-quarters of coal plants saw costs rising faster than inflation between 2021 and 2024.

Granted, solar and wind aren’t always available, while coal plants can be fired up on demand. And getting new projects built and connected to the grid will take time (right now, there’s a huge backlog of renewable projects waiting in the interconnection queue). But some experts say we actually don’t need new generation that urgently anyway, if big electricity users can be flexible with their demand.

And we’re already seeing batteries come to the rescue on the grid at times of stress. Between May 2024 and April 2025, US battery storage capacity increased by about 40%. When Texas faced high temperatures last month, batteries did a lot to help the state make it through without blackouts, as this Bloomberg story points out. Costs are falling, too; prices in 2024 were about 19% lower than in 2023. 

Even as the Trump administration is raising concerns about grid reliability, it’s moved to gut programs designed to get more electricity generation and storage online, like the tax credits that support wind, solar, and battery production and installation. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

These new batteries are finding a niche

Lithium-ion batteries have some emerging competition: Sodium-based alternatives are starting to make inroads.

Sodium is more abundant on Earth than lithium, and batteries that use the material could be cheaper in the future. Building a new battery chemistry is difficult, mostly because lithium is so entrenched. But, as I’ve noted before, this new technology has some advantages in nooks and crannies. 

I’ve been following sodium-ion batteries for a few years, and we’re starting to see the chemistry make progress, though not significantly in the big category of electric vehicles. Rather, these new batteries are finding niches where they make sense, especially in smaller electric scooters and large energy storage installations. Let’s talk about what’s new for sodium batteries, and what it’ll take for the chemistry to really break out.

Two years ago, lithium prices were, to put it bluntly, bonkers. The price of lithium hydroxide (an ingredient used to make lithium-ion batteries) went from a little under $10,000 per metric ton in January 2021 to over $76,000 per metric ton in January 2023, according to data from Benchmark Mineral Intelligence.

More expensive lithium drives up the cost of lithium-ion batteries. Price spikes, combined with concerns about potential shortages, pushed a lot of interest in alternatives, including sodium-ion.

I wrote about this swelling interest for a 2023 story, which focused largely on vehicle makers in China and a few US startups that were hoping to get in on the game.

There’s one key point to understand here. Sodium-based batteries will need to be cheaper than lithium-based ones to have a shot at competing, especially for electric vehicles, because they tend to be worse on one key metric: energy density. A sodium-ion battery that’s the same size and weight as a lithium-ion one will store less energy, limiting vehicle range.

The issue is that lithium prices (and the lithium-ion battery market) are moving targets, as we’ve seen since that 2023 story. Prices for precursor materials have come back down since the early 2023 peak, with lithium hydroxide crossing below $9,000 per metric ton recently.

And as more and more battery factories are built, costs for manufactured products come down too, with the average price for a lithium-ion pack in 2024 dropping 20%—the biggest annual decrease since 2017, according to BloombergNEF.

I wrote about this potential difficulty in that 2023 story: “If sodium-ion batteries are breaking into the market because of cost and material availability, declining lithium prices could put them in a tough position.”

One researcher I spoke with at the time suggested that sodium-ion batteries might not compete directly with lithium-ion batteries but could instead find specialized uses where the chemistry made sense. Two years later, I think we’re starting to see what those are.

One growing segment that could be a big win for sodium-ion: electric micromobility vehicles, like scooters and three-wheelers. Since these vehicles tend to travel shorter distances at lower speeds than cars, the lower energy density of sodium-ion batteries might not be as big a deal.

There’s a great BBC story from last week that profiled efforts to put sodium-ion batteries in electric scooters. It focused on one Chinese company called Yadea, which is one of the largest makers of electric two- and three-wheelers in the world. Yadea has brought a handful of sodium-powered models to the market so far, selling about 1,000 of the scooters in the first three months of 2025, according to the company’s statement to the BBC. It’s early days, but it’s interesting to see this market emerging.

Sodium-ion batteries are also seeing significant progress in stationary energy storage installations, including some on the grid. (Again, if you’re not worried about carting the battery around and fitting it into the limited package of a vehicle, energy density isn’t so important.)

The Baochi Energy Storage Station that just opened in Yunnan province, China, is a hybrid system that uses both lithium-ion and sodium-ion batteries and has a capacity of 400 megawatt-hours. And Natron Energy in the US is among those targeting other customers for stationary storage, specifically going after data centers.

While smaller vehicles and stationary installations appear to be the early wins for sodium, some companies aren’t giving up on using the alternative for EVs as well. The Chinese battery giant CATL announced earlier this year that it plans to produce sodium-ion batteries for heavy-duty trucks under the brand name Naxtra Battery.

Ultimately, lithium is the juggernaut of the battery industry, and going head to head is going to be tough for any alternative chemistry. But sticking with niches that make sense could help sodium-ion make progress at a time when I’d argue we need every successful battery type we can get. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Over $1 billion in federal funding got slashed for this polluting industry

The clean cement industry might be facing the end of the road before it ever really got rolling. 

On Friday, the US Department of Energy announced that it was canceling $3.7 billion in funding for 24 projects related to energy and industry. That included nearly $1.3 billion for cement-related projects.

Cement is a massive climate problem, accounting for roughly 7% of global greenhouse-gas emissions. What’s more, it’s a difficult industry to clean up, with huge traditional players and expensive equipment and infrastructure to replace. This funding was supposed to help address those difficulties by supporting projects on the cusp of commercialization. Now companies will need to fill in the gap left by these cancellations, and it’s a big one. 

First up on the list for cuts is Sublime Systems, a company you’re probably familiar with if you’ve been reading this newsletter for a while. I did a deep dive last year, and the company was on our list of Climate Tech Companies to Watch in both 2023 and 2024.

The startup’s approach is to make cement using electricity. The conventional process requires high temperatures typically achieved by burning fossil fuels, so avoiding that could prevent a lot of emissions. 

In 2024, Sublime received an $87 million grant from the DOE to construct a commercial demonstration plant in Holyoke, Massachusetts. That grant would have covered roughly half the construction costs for the facility, which is scheduled to open in 2026 and produce up to 30,000 metric tons of cement each year. 

“We were certainly surprised and disappointed about the development,” says Joe Hicken, Sublime’s senior VP of business development and policy. Customers are excited by the company’s technology, Hicken adds, pointing to Sublime’s recently announced deal with Microsoft, which plans to buy up to 622,500 metric tons of cement from the company. 

Another big name, Brimstone, also saw its funding canceled. The award totaled $189 million for a commercial demonstration plant, which was expected to produce over 100,000 metric tons of cement annually.

In a statement, a Brimstone representative said the company believes the cancellation was a “misunderstanding.” The statement pointed out that the planned facility would make not only cement but also alumina, supporting US-based aluminum production. (Aluminum is classified as a critical mineral by the US Geological Survey, meaning it’s considered crucial to the US economy and national security.) 

An award to Heidelberg Materials for up to $500 million for a planned Indiana facility was also axed. The idea there was to integrate carbon capture and storage to clean up emissions from the plant, which would have made it the first cement plant in the US to demonstrate that technology. In a written statement, a representative said the decision can be appealed, and the company is considering that option.

And National Cement’s funding for the Lebec Net-Zero Project, another $500 million award, was canceled. That facility planned to make carbon-neutral cement through a combination of strategies: reducing the polluting ingredients needed, using alternative fuels like biomass, and capturing the plant’s remaining emissions. 

“We want to emphasize that this project will expand domestic manufacturing capacity for a critical industrial sector, while also integrating new technologies to keep American cement competitive,” said a company spokesperson in a written statement. 

There’s a sentiment echoed in all the responses I received: While these awards were designed to cut emissions, the companies argue that their projects can fit into the new administration’s priorities. They’re emphasizing phrases like “critical minerals,” “American jobs,” and “domestic supply chains.”

“We’ve heard loud and clear from the Trump administration the desire to displace foreign imports of things that can be made here in America,” Sublime’s Hicken says. “At the end of the day, what we deliver is what the policymakers in DC are looking for.” 

But this administration is showing that it’s not supporting climate efforts—often even those that also advance its stated goals of energy abundance and American competitiveness. 

On Monday, my colleague James Temple published a new story about cuts to climate research, including tens of millions of dollars in grants from the National Science Foundation. Researchers at Harvard were particularly hard hit. 

Even as there’s interest in advancing the US position on the world stage, these cuts are making it hard for researchers and companies alike to do the crucial work of understanding our climate and developing and deploying new technologies.

The Trump administration has shut down more than 100 climate studies

The Trump administration has terminated National Science Foundation grants for more than 100 research projects related to climate change amid a widening campaign to slash federal funding for scientists and institutions studying the rising risks of a warming world.

The move will cut off what’s likely to amount to tens of millions of dollars for studies that were previously approved and, in most cases, already in the works. 

Affected projects include efforts to develop cleaner fuels, measure methane emissions, improve understanding of how heat waves and sea-level rise disproportionately harm marginalized groups, and help communities transition to sustainable energy, according to MIT Technology Review’s analysis of a GrantWatch database—a volunteer-led effort to track federal cuts to research—and a list of terminated grants from the National Science Foundation (NSF) itself.

The NSF is one of the largest sources of US funding for university research, so the cancellations will deliver a big blow to climate science and clean-energy development.

They come on top of the White House’s broader efforts to cut research funding and revenue for universities and significantly raise their taxes. The administration has also strived to slash staff and budgets at federal research agencies, halt efforts to assess the physical and financial risks of climate change, and shut down labs that have monitored and analyzed the levels of greenhouse gases in the air for decades.

“I don’t think it takes a lot of imagination to understand where this is going,” says Daniel Schrag, co-director of the science, technology, and public policy program at Harvard University, which has seen greater funding cuts than any other university amid an escalating legal conflict with the administration. “I believe the Trump administration intends to zero out funding for climate science altogether.”

The NSF says it’s terminating grants that aren’t aligned with the agency’s program goals, “including but not limited to those on diversity, equity, and inclusion (DEI), environmental justice, and misinformation/disinformation.”

Trump administration officials have argued that DEI considerations have contaminated US science, favoring certain groups over others and undermining the public’s trust in researchers.

“Political biases have displaced the vital search for truth,” Michael Kratsios, head of the White House Office of Science and Technology Policy, said to a group of NSF administrators and others last month, according to reporting in Science.

Science v. politics

But research projects that got caught in the administration’s anti-DEI filter aren’t the only casualties of the cuts. The NSF has also canceled funding for work that has little obvious connection to DEI ambitions, such as research on catalysts.

Many believe the administration’s broader motivation is to undermine the power of the university system and prevent research findings that cut against its politics.

Trump and his officials have repeatedly argued, in public statements and executive orders, that climate fears are overblown and that burdensome environmental regulations have undermined the nation’s energy security and economic growth.

“It certainly seems like a deliberate attempt to undo any science that contradicts the administration,” says Alexa Fredston, an assistant professor of ocean sciences at the University of California, Santa Cruz. 

On May 28, a group of states including California, New York, and Illinois sued the NSF, arguing that the cuts violated diversity goals and funding priorities clearly established by Congress, which controls federal spending.

A group of universities also filed a lawsuit against the NSF over its earlier decision to reduce the indirect cost rate for research, which reimburses universities for overhead expenses associated with work carried out on campuses. The plaintiffs included the California Institute of Technology, Carnegie Mellon University, and the Massachusetts Institute of Technology, which has also lost a number of research grants.

(MIT Technology Review is owned by, but editorially independent from, MIT.)

The NSF declined to comment.

‘Theft from the American people’

GrantWatch is an effort among researchers at rOpenSci, Harvard, and other organizations to track terminations of grants issued by the National Institutes of Health (NIH) and NSF. It draws on voluntary submissions from scientists involved as well as public government information. 

A search of its database for the terms “climate change,” “clean energy,” “climate adaptation,” “environmental justice,” and “climate justice” showed that the NSF has canceled funds for 118 projects, which were supposed to receive more than $100 million in total. Searching for the word “climate” produces more than 300 research projects that were set to receive more than $230 million. (That word often indicates climate-change-related research, but in some abstracts it refers to the cultural climate.) 

Some share of those funds has already been issued to research groups. The NSF section of the database doesn’t include that “outlaid” figure, but it’s generally about half the amount of the original grants, according to Noam Ross, a computational researcher and executive director of rOpenSci, a nonprofit initiative that promotes open and reproducible science.

A search for “climate change” among the NIH projects produces another 22 studies that were terminated and were still owed nearly $50 million in grants. Many of those projects explored the mental or physical health effects of climate change and extreme weather events.

The NSF more recently released its own list of terminated projects, which mostly mirrors GrantWatch’s findings and confirms the specific terminations mentioned in this story.

“These grant terminations are theft from the American people,” Ross said in an email response. “By illegally ending this research the Trump administration is wasting taxpayer dollars, gutting US leadership in science, and telling the world that the US government breaks its promises.”

Harvard, the country’s oldest university, has been particularly hard hit.

In April, the university sued the Trump administration over cuts to its research funding and efforts to exert control over its admissions and governance policies. The White House, in turn, has moved to eliminate all federal funds for the university, including hundreds of NSF and NIH grants. 

Daniel Nocera, a professor at Harvard who has done pioneering work on so-called artificial photosynthesis, a pathway for producing clean fuels from sunlight, said in an email that all of his grants were terminated. 

“I have no research funds,” he added.

Another terminated grant involved a collaboration between Harvard and the NSF National Center for Atmospheric Research (NCAR), designed to update the atmospheric chemistry component of the Community Earth System Model, an open-source climate model widely used by scientists around the world.

The research was expected to “contribute to a better understanding of atmospheric chemistry in the climate system and to improve air quality predictions within the context of climate change,” according to the NSF abstract. 

“We completed most of the work and were able to bring it to a stopping point,” Daniel Jacob, a professor at Harvard listed as the principal investigator on the project, said in an email. “But it will affect the ability to study chemistry-climate interactions. And it is clearly not right to pull funding from an existing project.”

Plenty of the affected research projects do, in one way or another, grapple with issues of diversity, equity, and inclusion. But that’s because there is ample evidence that disadvantaged communities experience higher rates of illness from energy-sector pollution, will be harder hit by the escalating effects of extreme weather, and are underrepresented in scientific fields.

One of the largest terminations cut off about $4 million of remaining funds for the CLIMATE Justice Initiative, a fellowship program at the University of California, Irvine designed to recruit, train, and mentor a more diverse array of researchers in Earth sciences.

The NSF decision occurred halfway into the five-year program, halting funds for a number of fellows who were in the midst of environmental justice research efforts with community partners in Southern California. Kathleen Johnson, a professor at UC Irvine and director of the initiative, says the university is striving to find ways to fund as many participants as possible for the remainder of their fellowships.

“We need people from all parts of society who are trained in geoscience and climate science to address all these global challenges that we are facing,” she says. “The people who will be best positioned to do this work … are the people who understand the community’s needs and are able to therefore work to implement equitable solutions.”

“Diverse teams have been shown to do better science,” Johnson adds.

Numerous researchers whose grants were terminated didn’t respond to inquiries from MIT Technology Review or declined to comment, amid growing concerns that the Trump administration will punish scientists or institutions that criticize its policies.

Coming cuts

The termination of existing NSF and NIH grants is just the start of the administration’s plans to cut federal funding for climate and clean-energy research. 

The White House’s budget proposal for the coming fiscal year seeks to eliminate tens of billions of dollars in funding across federal agencies, specifically calling out “Green New Scam funds” at the Department of Energy; “low-priority climate monitoring satellites” at NASA; “climate-dominated research, data, and grant programs” at the National Oceanic and Atmospheric Administration; and “climate; clean energy; woke social, behavioral, and economic sciences” at the NSF.

The administration released a more detailed NSF budget proposal on May 30, which called for a 60% reduction in research spending and nearly zeroed out the clean energy technology program. It also proposed cutting funds by 97% for the US Global Change Research Program, which produces regular assessments of climate risks; 80% for the Ocean Observatories Initiative, a global network of ocean sensors that monitor shifting marine conditions; and 40% for NCAR, the atmospheric research center.

If Congress approves budget reductions anywhere near the levels the administration has put forward, scientists fear, it could eliminate the resources necessary to carry on long-running climate observation of oceans, forests, and the atmosphere. 

The administration also reportedly plans to end the leases on dozens of NOAA facilities, including the Global Monitoring Laboratory in Hilo, Hawaii. The lab supports the work of the nearby Mauna Loa Observatory, which has tracked atmospheric carbon dioxide levels for decades.

Even short gaps in these time-series studies, which scientists around the world rely upon, would have an enduring impact on researchers’ ability to analyze and understand weather and climate trends.

“We won’t know where we’re going if we stop measuring what’s happening,” says Jane Long, formerly the associate director of energy and environment at Lawrence Livermore National Lab. “It’s devastating—there’s no two ways around it.” 

Stunting science 

Growing fears that public research funding will take an even larger hit in the coming fiscal year are forcing scientists to rethink their research plans—or to reconsider whether they want to stay in the field at all, numerous observers said.

“The amount of funding we’re talking about isn’t something a university can fill indefinitely, and it’s not something that private philanthropy can fill for very long,” says Michael Oppenheimer, a professor of geosciences and international affairs at Princeton University. “So what we’re talking about is potentially cataclysmic for climate science.”

“Basically it’s a shit show,” he adds, “and how bad a shit show it is will depend a lot on what happens in the courts and Congress over the next few months.”

One climate scientist, who declined to speak on the record out of concern that the administration might punish his institution, said the declining funding is forcing researchers to shrink their scientific ambitions down to a question of “What can I do with my laptop and existing data sets?”

“If your goal was to make the United States a second-class or third-class country when it comes to science and education, you would be doing exactly what the administration is doing,” the scientist said. “People are pretty depressed, upset, and afraid.”

Given the rising challenges, Harvard’s Schrag fears that the best young climate scientists will decide to shift their careers outside of the US, or move into high tech or other fields where they can make significantly more money.

“We might lose a generation of talent—and that’s not going to get fixed four years from now,” he says. “The irony is that Trump is attacking the institutions and foundation of US science that literally made America great.”

What will power AI’s growth?

It’s been a little over a week since we published Power Hungry, a package that takes a hard look at the expected energy demands of AI. Last week in this newsletter, I broke down the centerpiece of that package, an analysis I did with my colleague James O’Donnell. (In case you’re still looking for an intro, you can check out this Roundtable discussion with James and our editor in chief Mat Honan, or this short segment I did on Science Friday.)

But this week, I want to talk about another story that I also wrote for that package, which focused on nuclear energy. I thought this was an important addition to the mix of stories we put together, because I’ve seen a lot of promises about nuclear power as a saving grace in the face of AI’s energy demand. My reporting on the industry over the past few years has left me a little skeptical. 

As I discovered while I continued that line of reporting, building new nuclear plants isn’t so simple or so fast. And as my colleague David Rotman lays out in his story for the package, the AI boom could wind up relying on another energy source: fossil fuels. So what’s going to power AI? Let’s get into it. 

When we started talking about this big project on AI and energy demand, we had a lot of conversations about what to include. And from the beginning, the climate team was really focused on examining what, exactly, was going to be providing the electricity needed to run data centers powering AI models. As we wrote in the main story: 

“A data center humming away isn’t necessarily a bad thing. If all data centers were hooked up to solar panels and ran only when the sun was shining, the world would be talking a lot less about AI’s energy consumption.” 

But a lot of AI data centers need to be available constantly. Those that are used to train models can arguably be more responsive to the changing availability of renewables, since that work can happen in bursts, any time. Once a model is being pinged with questions from the public, though, there needs to be computing power ready to run all the time. Google, for example, would likely not be keen to offer its new AI Mode only during daylight hours.

Solar and wind power, then, would seem not to be a great fit for a lot of AI electricity demand, unless they’re paired with energy storage—and that increases costs. Nuclear power plants, on the other hand, tend to run constantly, outputting a steady source of power for the grid. 

As you might imagine, though, it can take a long time to get a nuclear power plant up and running. 

Large tech companies can help support plans to reopen shuttered plants or extend the operating lifetimes of existing ones. Some plants can also make small upgrades to improve their output. I just saw this news story from the Tri-City Herald about plans to upgrade the Columbia Generating Station in eastern Washington—with tweaks over the next few years, it could produce an additional 162 megawatts of power, over 10% of the plant’s current capacity.

But all that isn’t going to be nearly enough to meet the demand that big tech companies are claiming will materialize in the future. (For more on the numbers here and why new tech isn’t going to come online fast enough, check out my full story.) 

Instead, natural gas has become the default to meet soaring demand from data centers, as David lays out in his story. And since the lifetime of plants built today is about 30 years, those new plants could be running past 2050, the date the world needs to bring greenhouse-gas emissions to net zero to meet the goals set out in the Paris climate agreement. 

One of the bits I found most interesting in David’s story is that there’s potential for a different future here: Big tech companies, with their power and influence, could actually use this moment to push for improvements. If they reduced their usage during peak hours, even for less than 1% of the year, it could greatly reduce the amount of new energy infrastructure required. Or they could, at the very least, push power plant owners and operators to install carbon capture technology, or ensure that methane doesn’t leak from the supply chain.

AI’s energy demand is a big deal, but for climate change, how we choose to meet it is potentially an even bigger one. 

This startup wants to make more climate-friendly metal in the US

A California-based company called Magrathea just turned on a new electrolyzer that can make magnesium metal from seawater. The technology has the potential to produce the material, which is used in vehicles and defense applications, with net-zero greenhouse-gas emissions.

Magnesium is an incredibly light metal, used for parts in cars and planes as well as in aluminum alloys. The metal is also used in defense and industrial applications, including the production processes for steel and titanium.

Today, China dominates production of magnesium, and the most common method generates a lot of the emissions that cause climate change. If Magrathea can scale up its process, it could help provide an alternative source of the metal and clean up industries that rely on it, including automotive manufacturing.

The star of Magrathea’s process is an electrolyzer, a device that uses electricity to split a material into its constituent elements. Using an electrolyzer in magnesium production isn’t new, but Magrathea’s approach represents an update. “We really modernized it and brought it into the 21st century,” says Alex Grant, Magrathea’s cofounder and CEO.

The whole process starts with salty water. There are small amounts of magnesium in seawater, as well as in salt lakes and groundwater. (In seawater, the concentration is about 1,300 parts per million, so magnesium makes up about 0.13% of seawater by weight.) If you take that seawater or brine and clean it up, concentrate it, and dry it out, you get a solid magnesium chloride salt.

Magrathea takes that salt (which it currently buys from Cargill) and puts it into the electrolyzer. The device reaches temperatures of about 700 °C (almost 1,300 °F) and runs electricity through the molten salt to split the magnesium from the chlorine, forming magnesium metal.

Typically, running an electrolyzer in this process would require a steady source of electricity. The temperature is generally kept just high enough to maintain the salt in a molten state. Allowing it to cool down too much would allow it to solidify, messing up the process and potentially damaging the equipment. Heating it up more than necessary would just waste energy. 

Magrathea’s approach builds in flexibility. Basically, the company runs its electrolyzer about 100 °C higher than is necessary to keep the molten salt a liquid. It then uses the extra heat in inventive ways, including to dry out the magnesium salt that eventually goes into the reactor. This preparation can be done intermittently, so the company can take in electricity when it’s cheaper or when more renewables are available, cutting costs and emissions. In addition, the process will make a co-product, magnesium oxide, that can be used to trap carbon dioxide from the atmosphere, helping to cancel out the remaining carbon pollution.

The result could be a production process with net-zero emissions, according to an independent life cycle assessment completed in January. While it likely won’t reach this bar at first, the potential is there for a much more climate-friendly process than what’s used in the industry today, Grant says.

Breaking into magnesium production won’t be simple, says Simon Jowitt, director of the Nevada Bureau of Mines and Geology and of the Center for Research in Economic Geology at the University of Nevada, Reno.

China produces roughly 95% of the global supply as of 2024, according to data from the US Geological Survey. This dominant position means companies there can flood the market with cheap metal, making it difficult for others to compete. “The economics of all this is uncertain,” Jowitt says.

The US has some trade protections in place, including an anti-dumping duty, but newer players with alternative processes can still face obstacles. US Magnesium, a company based in Utah, was the only company making magnesium in the US in recent years, but it shut down production in 2022 after equipment failures and a history of environmental concerns. 

Magrathea plans to start building a demonstration plant in Utah in late 2025 or early 2026, which will have a capacity of roughly 1,000 tons per year and should be running in 2027. In February the company announced that it signed an agreement with a major automaker, though it declined to share its name on the record. The automaker pre-purchased material from the demonstration plant and will incorporate it into existing products.

After the demonstration plant is running, the next step would be to build a commercial plant with a larger capacity of around 50,000 tons annually.