How US research cuts are threatening crucial climate data

Over the last few months, and especially the last few weeks, there’s been an explosion of news about proposed budget cuts to science in the US. One trend I’ve noticed: Researchers and civil servants are sounding the alarm that those cuts mean we might lose key data that helps us understand our world and how climate change is affecting it.

My colleague James Temple has a new story out today about researchers who are attempting to measure the temperature of mountain snowpack across the western US. Snow that melts in the spring is a major water source across the region, and monitoring the temperature far below the top layer of snow could help scientists more accurately predict how fast water will flow down the mountains, allowing farmers, businesses, and residents to plan accordingly.

But long-running government programs that monitor the snowpack across the West are among those being threatened by cuts across the US federal government. Also potentially in trouble: carbon dioxide measurements in Hawaii, hurricane forecasting tools, and a database that tracks the economic impact of natural disasters. It’s all got me thinking: What do we lose when data is in danger?

Take, for example, the work at Mauna Loa Observatory, which sits on the northern side of the world’s largest active volcano. At this Hawaii facility, researchers have been measuring the concentration of carbon dioxide in the atmosphere since 1958.

The resulting graph, called the Keeling Curve (after Charles David Keeling, the scientist who kicked off the effort), is a pillar of climate research. It shows that carbon dioxide, the main greenhouse gas warming the planet, has increased in the atmosphere from around 313 parts per million in 1958 to over 420 parts per million today.

Proposed cuts to the National Oceanic and Atmospheric Administration (NOAA) jeopardize the Keeling Curve’s future. As Ralph Keeling (current steward of the curve and Keeling’s son) put it in a new piece for Wired, “If successful, this loss will be a nightmare scenario for climate science, not just in the United States, but the world.”

This story has echoes across the climate world right now. A lab at Princeton that produces what some consider the top-of-the-line climate models used to make hurricane forecasts could be in trouble because of NOAA budget cuts. And last week, NOAA announced it would no longer track the economic impact of the biggest natural disasters in the US.

Some of the largest-scale climate efforts will feel the effects of these cuts, and as James’s new story shows, they could also seep into all sorts of specialized fields. Even seemingly niche work can have a huge impact not just on research, but on people.

The frozen reservoir of the Sierra snowpack provides about a third of California’s water supply, as well as the majority of the water used by towns and cities in northwest Nevada. Researchers there are hoping to help officials better forecast the timing of potential water supplies across the region.

This story brought to mind my visit to El Paso, Texas, a few years ago. I spoke with farmers there who rely on water coming down the Rio Grande, alongside dwindling groundwater, to support their crops. Spring snowmelt flows down from the mountains in Colorado and New Mexico and is held in the Elephant Butte Reservoir. One farmer I met showed me pages and pages of reservoir records that he had meticulously copied by hand. Those crinkled pages were a clear sign: Publicly available data was crucial to his work.

The endeavor of scientific research, particularly when it involves patiently gathering data, isn’t always exciting. Its importance is often overlooked. But as cuts continue, we’re keeping a lookout, because losing data could harm our ability to track, address, and adapt to our changing climate. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Did solar power cause Spain’s blackout?

At roughly midday on Monday, April 28, the lights went out in Spain. The grid blackout, which extended into parts of Portugal and France, affected tens of millions of people—flights were grounded, cell networks went down, and businesses closed for the day.

Over a week later, officials still aren’t entirely sure what happened, but some (including the US energy secretary, Chris Wright) have suggested that renewables may have played a role, because just before the outage happened, wind and solar accounted for about 70% of electricity generation. Others, including Spanish government officials, insisted that it’s too early to assign blame.

It’ll take weeks to get the full report, but we do know a few things about what happened. And even as we wait for the bigger picture, there are a few takeaways that could help our future grid.

Let’s start with what we know so far about what happened, according to the Spanish grid operator Red Eléctrica:

  • A disruption in electricity generation took place a little after 12:30 p.m. This may have been a power plant flipping off or some transmission equipment going down.
  • A little over a second later, the grid lost another bit of generation.
  • A few seconds after that, the main interconnector between Spain and southwestern France got disconnected as a result of grid instability.
  • Immediately after, virtually all of Spain’s electricity generation tripped offline.

One of the theories floating around is that things went wrong because the grid diverged from its normal frequency. (All power grids have a set frequency: In Europe the standard is 50 hertz, which means the alternating current completes 50 full cycles each second.) The frequency needs to stay essentially constant across the grid to keep things running smoothly.

There are signs that the outage could be frequency-related. Some experts pointed out that strange oscillations in the grid frequency occurred shortly before the blackout.

Normally, our grid can handle small problems like an oscillation in frequency or a drop that comes from a power plant going offline. But some of the grid’s ability to stabilize itself is tied up in old ways of generating electricity.

Power plants like those that run on coal and natural gas have massive rotating generators. If there are brief issues on the grid that upset the balance, those physical bits of equipment have inertia: They’ll keep moving at least for a few seconds, providing some time for other power sources to respond and pick up the slack. (I’m simplifying here—for more details I’d highly recommend this report from the National Renewable Energy Laboratory.)
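To make the inertia idea a little more concrete, here’s a toy numerical sketch of my own (not from the story or the NREL report) using the textbook swing-equation relationship between a generation shortfall and the rate of frequency decline. All parameter values here are made up purely for illustration.

```python
# Toy model: how fast grid frequency sags after a sudden loss of generation.
# Uses the classic swing-equation approximation df/dt = -f0 * dP / (2 * H),
# where dP is the per-unit power deficit and H is the inertia constant in
# seconds. All numbers are illustrative, not real grid parameters.

def frequency_after(seconds, inertia_h, f0=50.0, power_deficit=0.1):
    """Frequency (Hz) after `seconds`, assuming a constant power deficit."""
    rocof = -f0 * power_deficit / (2 * inertia_h)  # rate of change of frequency
    return f0 + rocof * seconds

# A high-inertia grid (lots of heavy spinning generators) sags more slowly
# than a low-inertia one, buying time for other sources to pick up the slack.
high_inertia = frequency_after(1.0, inertia_h=6.0)  # roughly 49.58 Hz
low_inertia = frequency_after(1.0, inertia_h=2.0)   # 48.75 Hz
```

The point of the sketch is just the comparison: one second after the same disturbance, the low-inertia grid has fallen much further from 50 hertz, leaving less margin before protective equipment starts tripping.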

Solar panels don’t have inertia—they rely on inverters to convert the direct current they produce into alternating current that’s compatible with the grid and matches its frequency. Generally, these inverters are “grid-following,” meaning that if the frequency is dropping, they follow that drop.

In the case of the blackout in Spain, it’s possible that having a lot of power on the grid coming from sources without inertia made it easier for a small problem to snowball into a much bigger one.

Some key questions here are still unanswered. The order matters, for example. During that drop in generation, did wind and solar plants go offline first? Or did everything go down together?

Whether or not solar and wind contributed to the blackout as a root cause, we do know that wind and solar don’t contribute to grid stability in the same way that some other power sources do, says Seaver Wang, climate lead of the Breakthrough Institute, an environmental research organization. Regardless of whether renewables are to blame, more capability to stabilize the grid would only help, he adds.

It’s not that a renewable-heavy grid is doomed to fail. As Wang put it in an analysis he wrote last week: “This blackout is not the inevitable outcome of running an electricity system with substantial amounts of wind and solar power.”

One solution: We can make sure the grid includes enough equipment that does provide inertia, like nuclear power and hydropower. Reversing a plan to shut down Spain’s nuclear reactors beginning in 2027 would be helpful, Wang says. Other options include building massive machines that lend physical inertia and using inverters that are “grid-forming,” meaning they can actively help regulate frequency and provide a sort of synthetic inertia.

Inertia isn’t everything, though. Grid operators can also rely on installing a lot of batteries that can respond quickly when problems arise. (Spain has much less grid storage than other places with a high level of renewable penetration, like Texas and California.)

Ultimately, if there’s one takeaway here, it’s that as the grid evolves, our methods to keep it reliable and stable will need to evolve too.

If you’re curious to hear more on this story, I’d recommend this Q&A from Carbon Brief about the event and its aftermath and this piece from Heatmap about inertia, renewables, and the blackout.


A long-abandoned US nuclear technology is making a comeback in China

China has once again beat everyone else to a clean energy milestone—its new nuclear reactor is reportedly one of the first to use thorium instead of uranium as a fuel and the first of its kind that can be refueled while it’s running.

It’s an interesting (if decidedly experimental) development out of a country that’s edging toward becoming the world leader in nuclear energy. China has now surpassed France in terms of generation, though not capacity; it still lags behind the US in both categories. But one recurring theme in media coverage about the reactor struck me, because it’s so familiar: This technology was invented decades ago, and then abandoned.

You can basically copy and paste that line into countless stories about today’s advanced reactor technology. Molten-salt cooling systems? Invented in the mid-20th century but never commercialized. Same for several alternative fuels, like TRISO. And, of course, there’s thorium.

This one research reactor in China running with an alternative fuel says a lot about this moment for nuclear energy technology: Many groups are looking into the past for technologies, with a new appetite for building them.

First, it’s important to note that China is the hot spot for nuclear energy right now. While the US still has the most operational reactors in the world, China is catching up quickly. The country is building reactors at a remarkable clip and currently has more reactors under construction than any other country by far. Just this week, China approved 10 new reactors, totaling over $27 billion in investment.

China is also leading the way for some advanced reactor technologies (that category includes basically anything that deviates from the standard blueprint of what’s on the grid today: large reactors that use enriched uranium for fuel and high-pressure water to keep the reactor cool). High-temperature reactors that use gas as a coolant are one major area of focus for China—a few reactors that use this technology have recently started up, and more are in the planning stages or under construction.

Now, Chinese state media is reporting that scientists in the country reached a milestone with a thorium-based reactor. The reactor came online in June 2024, but researchers say it recently went through refueling without shutting down. (Conventional reactors generally need to be stopped to replenish the fuel supply.) The project’s lead scientists shared the results during a closed meeting at the Chinese Academy of Sciences.

I’ll emphasize here that this isn’t some massive power plant: This reactor is tiny. It generates just two megawatts of heat—less than the research reactor on MIT’s campus, which rings in at six megawatts. (To be fair, MIT’s is one of the largest university research reactors in the US, but still … it’s small.)

Regardless, progress is progress for thorium reactors, as the world has been entirely focused on uranium for the last 50 years or so.

Much of the original research on thorium came out of the US, which pumped resources into all sorts of different reactor technologies in the 1950s and ’60s. A reactor at Oak Ridge National Laboratory in Tennessee that ran in the 1960s used uranium-233 fuel (which can be bred from thorium when it absorbs neutrons).

Eventually, though, the world more or less settled on a blueprint for nuclear reactors, focusing on those that use enriched uranium as fuel and are cooled by high-pressure water. One reason for the focus on uranium for energy tech? The research could also be applied to nuclear weapons.

But now there’s a renewed interest in alternative nuclear technologies, and the thorium-fueled reactor is just one of several examples. A prominent one we’ve covered before: Kairos Power is building reactors that use molten salt as a coolant for small nuclear reactors, also a technology invented and developed in the 1950s and ’60s before being abandoned. 

Another old-but-new concept is using high-temperature gas to cool reactors, as X-energy is aiming to do in its proposed power station at a chemical plant in Texas. (That reactor will be able to be refueled while it’s running, like the new thorium reactor.) 

Some problems from decades ago that contributed to technologies being abandoned will still need to be dealt with today. In the case of molten-salt reactors, for example, it can be tricky to find materials that can withstand the corrosive properties of super-hot salt. For thorium reactors, the process of transforming thorium into U-233 fuel has historically been one of the hurdles. 

But as early progress shows, the archives could provide fodder for new commercial reactors, and revisiting these old ideas could give the nuclear industry a much-needed boost. 


The vibes are shifting for US climate tech

The past few years have been an almost nonstop parade of good news for climate tech in the US. Headlines about billion-dollar grants from the government, massive private funding rounds, and labs churning out advance after advance have been routine. Now, though, things are starting to shift.  

About $8 billion worth of US climate tech projects have been canceled or downsized so far in 2025. (You can see a map of those projects in my latest story here.) 

There are still projects moving forward, but these cancellations definitely aren’t a good sign. And now we have tariffs to think about, adding further layers of expense and, worse, uncertainty. (Businesses, especially those whose plans require gobs of money, really don’t like uncertainty.) Honestly, I’m still getting used to an environment that isn’t such a positive one for climate technology. How worried should we be? Let’s get into the context.

Sometimes, one piece of news can really drive home a much larger trend. For example, I’ve read a bazillion studies about extreme weather and global warming, but every time a hurricane comes close to my mom’s home in Florida, the threat of climate-fueled extreme weather becomes much more real for me. A recent announcement about climate tech hit me in much the same fashion.

In February, Aspen Aerogels announced it was abandoning plans for a Georgia factory that would have made materials that can suppress battery fires. The news struck me, because just a few months before, in October, I had written about the Department of Energy’s $670 million loan commitment for the project. It was a really fun story, both because I found the tech fascinating and because MIT Technology Review got exclusive access to cover it first.

And now, suddenly, that plan is just dead. Aspen said it will shift some of its production to a factory in Rhode Island and send some overseas. (I reached out to the company with questions for my story last week, but they didn’t get back to me.)

One example doesn’t always mean there’s a trend; I got food poisoning at a sushi restaurant once, but I haven’t cut out sashimi permanently. The bad news, though, is that Aspen’s cancellation is just one of many. Over a dozen major projects in climate technology have gotten killed so far this year, as the nonprofit E2 tallied up in a new report last week. That’s far from typical.

I got some additional context from Jay Turner, who runs Big Green Machine, a database that also tracks investments in the climate-tech supply chain. That project includes some data that E2 doesn’t account for: news about when projects are delayed or take steps forward. On Monday, the Big Green Machine team released a new update, one that Turner called “concerning.”

Since Donald Trump took office on January 20, about $10.5 billion worth of investment in climate tech projects has progressed in some way. That basically means 26 projects were announced, secured new funding, increased in scale, or started construction or production.

Meanwhile, $12.2 billion across 14 projects has slowed down in some way. This covers projects that were canceled, were delayed significantly, or lost funding, as well as companies that went bankrupt. So by total investment, there’s been more bad news in climate tech than good news, according to Turner’s tracking.

It’s tempting to look for the silver lining here. The projects still moving forward are certainly positive, and we’ll hopefully continue to see some companies making progress even as we head into even more uncertain times. But the signs don’t look good.

One question that I have going forward is how a seemingly inevitable US slowdown on climate technology will ripple around the rest of the world. Several experts I’ve spoken with seem to agree that this will be a great thing for China, which has aggressively and consistently worked to establish itself as a global superpower in industries like EVs and batteries.

In other words, the energy transition is rolling on. Will the US get left behind? 


These four charts sum up the state of AI and energy

While it’s rare to look at the news without finding some headline related to AI and energy, a lot of us are stuck waving our hands when it comes to what it all means.

Sure, you’ve probably read that AI will drive an increase in electricity demand. But how that fits into the context of the current and future grid can feel less clear from the headlines. That’s true even for people working in the field. 

A new report from the International Energy Agency digs into the details of energy and AI, and I think it’s worth looking at some of the data to help clear things up. Here are four charts from the report that sum up the crucial points about AI and energy demand.

1. AI is power hungry, and the world will need to ramp up electricity supply to meet demand. 

This point is the most obvious, but it bears repeating: AI is exploding, and it’s going to lead to higher energy demand from data centers. “AI has gone from an academic pursuit to an industry with trillions of dollars at stake,” as the IEA report’s executive summary puts it.

Data centers used less than 300 terawatt-hours of electricity in 2020. That could increase to nearly 1,000 terawatt-hours in the next five years, which is more than Japan’s total electricity consumption today.

Today, the US has about 45% of the world’s data center capacity, followed by China. Those two countries will continue to represent the overwhelming majority of capacity through 2035.  

2. The electricity needed to power data centers will largely come from fossil fuels like coal and natural gas in the near term, but nuclear and renewables could play a key role, especially after 2030.

The IEA report is relatively optimistic on the potential for renewables to power data centers, projecting that nearly half of the global growth in data center electricity demand through 2035 will be met with renewables like wind and solar. (In Europe, the IEA projects, renewables will meet 85% of new demand.)

In the near term, though, natural gas and coal will also expand. An additional 175 terawatt-hours from gas will help meet demand in the next decade, largely in the US, according to the IEA’s projections. Another report, published this week by the energy consultancy BloombergNEF, suggests that fossil fuels will play an even larger role than the IEA projects, accounting for two-thirds of additional electricity generation between now and 2035.

Nuclear energy, a favorite of big tech companies looking to power operations without generating massive emissions, could start to make a dent after 2030, according to the IEA data.

3. Data centers are just a small piece of expected electricity demand growth this decade.

We should be talking more about appliances, industry, and EVs when we talk about energy! Electricity demand is on the rise from a whole host of sources: Electric vehicles, air-conditioning, and appliances will each drive more electricity demand than data centers between now and the end of the decade. In total, data centers make up a little over 8% of the expected growth in electricity demand between now and 2030.

There are interesting regional effects here, though. Growing economies will see more demand from the likes of air-conditioning than from data centers. On the other hand, the US has seen relatively flat electricity demand from consumers and industry for years, so newly rising demand from high-performance computing will make up a larger chunk. 

4. Data centers tend to be clustered together and close to population centers, making them a unique challenge for the power grid.  

The grid is no stranger to facilities that use huge amounts of energy: Cement plants, aluminum smelters, and coal mines all pull a lot of power in one place. However, data centers are a unique sort of beast.

First, they tend to cluster tightly together. Globally, data centers make up about 1.5% of total electricity demand. However, in Ireland, that number is 20%, and in Virginia, it’s 25%. That trend looks likely to continue, too: Half of data centers under development in the US are in preexisting clusters.

Data centers also tend to be closer to urban areas than other energy-intensive facilities like factories and mines. 

Since data centers are close both to each other and to communities, they could have significant impacts on the regions where they’re situated, whether by bringing on more fossil fuels close to urban centers or by adding strain to the local grid. Or both.

Overall, AI and data centers more broadly are going to be a major driving force for electricity demand. It’s not the whole story, but it’s a unique part of our energy picture to continue watching moving forward. 


Tariffs are bad news for batteries

Update: Since this story was first published in The Spark, our weekly climate newsletter, the White House announced that most reciprocal tariffs would be paused for 90 days. That pause does not apply to China, which will see an increased tariff rate of 125%.

Today, new tariffs go into effect for goods imported into the US from basically every country on the planet.

Since Donald Trump announced his plans for sweeping tariffs last week, the vibes have been, in a word, chaotic. Markets have seen some of their sharpest drops in the last century, and it’s widely anticipated that the global economic order may be forever changed.

While many try not to look at the effects on their savings and retirement accounts, experts are scrambling to understand what these tariffs might mean for various industries. As my colleague James Temple wrote in a new story last week, anxieties are especially high in climate technology.

These tariffs could be particularly rough on the battery industry. China dominates the entire supply chain and is subject to monster tariff rates, and even US battery makers won’t escape the effects.   

First, in case you need it, a super-quick refresher: Tariffs are taxes charged on goods that are imported (in this case, into the US). If I’m a US company selling bracelets, and I typically buy my beads and string from another country, I’ll now be paying the US government an additional percentage of what those goods cost to import. Under Trump’s plan, that might be 10%, 20%, or upwards of 50%, depending on the country sending them to me. 

In theory, tariffs should help domestic producers, since products from competitors outside the country become more expensive. But since so many of the products we use have supply chains that stretch all over the world, even products made in the USA often have some components that would be tariffed.

In the case of batteries, we could be talking about really high tariff rates, because most batteries and their components currently come from China. As of 2023, the country made more than 75% of the world’s lithium-ion battery cells, according to data from the International Energy Agency.

Trump’s new plan adds a 34% tariff on all Chinese goods, and that stacks on top of a 20% tariff that was already in place, making the total 54%. (Then, as of Wednesday, the White House further raised the tariff on China, making the total 104%.)

But when it comes to batteries, that’s not even the whole story. There was already a 3.5% tariff on all lithium-ion batteries, for example, as well as a 7.5% tariff on batteries from China that’s set to increase to 25% next year.

If we add all those up, lithium-ion batteries from China could have a tariff of 82% in 2026. (Or 132%, with this additional retaliatory tariff.) In any case, that’ll make EVs and grid storage installations a whole lot more expensive, along with phones, laptops, and other rechargeable devices.
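Since the article treats the stacking as simple addition, it’s easy to sketch. Using only the rates cited above, the components sum to 82.5% and 132.5%, which lines up with the roughly 82% and 132% figures in the text (real tariff rules can be more complicated than straight addition, so treat this as back-of-the-envelope arithmetic).

```python
# Back-of-the-envelope sum of the tariff components cited above for
# lithium-ion batteries imported from China in 2026. Illustrative
# arithmetic only; actual tariff stacking can be more complex.

reciprocal_2025 = 34.0        # new tariff on all Chinese goods
preexisting_china = 20.0      # tariff already in place (34 + 20 = 54% combined)
all_li_ion_batteries = 3.5    # existing tariff on all lithium-ion batteries
china_batteries_2026 = 25.0   # battery-specific tariff after next year's increase

total_2026 = (reciprocal_2025 + preexisting_china
              + all_li_ion_batteries + china_batteries_2026)  # 82.5%

retaliatory_bump = 50.0       # Wednesday's hike, taking the 54% base to 104%
total_with_retaliation = total_2026 + retaliatory_bump        # 132.5%
```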

The economic effects could be huge. The US still imports the majority of its lithium-ion batteries, and nearly 70% of those imports are from China. The US imported $4 billion worth of lithium-ion batteries from China just during the first four months of 2024.

Although US battery makers could theoretically stand to benefit, there are a limited number of US-based factories. And most of those factories are still purchasing components from China that will be subject to the tariffs, because it’s hard to overstate just how dominant China is in battery supply chains.

While China makes roughly three-quarters of lithium-ion cells, it’s even more dominant in components: 80% of the world’s cathode materials are made in China, along with over 90% of anode materials. (For those who haven’t been subject to my battery ramblings before, the cathode and anode are two of the main components of a battery—basically, the plus and minus ends.)

Even battery makers that work in alternative chemistries don’t seem to be jumping for joy over tariffs. Lyten is a California-based company working to build lithium-sulfur batteries, and most of its components can be sourced in the US. (For more on the company’s approach, check out this story from 2024.) But tariffs could still spell trouble. Lyten has plans for a new factory, scheduled for 2027, that rely on sourcing affordable construction materials. Will that be possible? “We’re not drawing any conclusions quite yet,” Lyten’s chief sustainability officer, Keith Norman, told Heatmap News.

The battery industry in the US was already in a pretty tough spot. Billions of dollars’ worth of factories have been canceled since Trump took office.  Companies making investments that can total hundreds of millions or billions of dollars don’t love uncertainty, and tariffs are certainly adding to an already uncertain environment.

We’ll be digging deeper into what the tariffs mean for climate technology broadly, and specifically some of the industries we cover. If you have questions, or if you have thoughts to share about what this will mean for your area of research or business, I’d love to hear them at casey.crownhart@technologyreview.com. I’m also on Bluesky @caseycrownhart.bsky.social.


We should talk more about air-conditioning

Things are starting to warm up here in the New York City area, and it’s got me thinking once again about something that people aren’t talking about enough: energy demand for air conditioners. 

I get it: Data centers are the shiny new thing to worry about. And I’m not saying we shouldn’t be thinking about the strain that gigawatt-scale computing installations put on the grid. But a little bit of perspective is important here.

According to a report from the International Energy Agency last year, data centers will make up less than 10% of the increase in energy demand between now and 2030, far less than the energy demand from space cooling (mostly air-conditioning).

I just finished up a new story that’s out today about a novel way to make heat exchangers, a crucial component in air conditioners and a whole host of other technologies that cool our buildings, food, and electronics. Let’s dig into why I’m writing about the guts of cooling technologies, and why this sector really needs innovation. 

One twisted thing about cooling and climate change: It’s all a vicious cycle. As temperatures rise, the need for cooling technologies increases. In turn, more fossil-fuel power plants are firing up to meet that demand, turning up the temperature of the planet in the process.

“Cooling degree days” are one measure of the need for additional cooling. Basically, you take a preset baseline temperature and figure out how much each day’s average temperature exceeds it. Say the baseline (above which you’d likely need to flip on a cooling device) is 21 °C (70 °F). If the average temperature for a day is 26 °C, that’s five cooling degree days for that day. Repeat that every day for a month, and you wind up with 150 cooling degree days.

I explain this arguably weird metric because it’s a good measure of total energy demand for cooling—it lumps together both how many hot days there are and just how hot it is.  
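The worked example above translates directly into code. Here’s a minimal sketch using the text’s numbers (a 21 °C baseline and a month of 26 °C days):

```python
# Cooling degree days: the sum of how far each day's mean temperature
# exceeds a preset baseline, using the example values from the text.

def cooling_degree_days(daily_mean_temps_c, baseline_c=21.0):
    """Total cooling degree days for a list of daily mean temperatures (°C)."""
    return sum(max(temp - baseline_c, 0.0) for temp in daily_mean_temps_c)

# One 26 °C day is 5 cooling degree days; a 30-day month of them is 150.
month_of_hot_days = [26.0] * 30
total = cooling_degree_days(month_of_hot_days)  # 150.0
```

Days at or below the baseline contribute nothing, which is why the metric captures both how many hot days there are and how hot they get.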

And the number of cooling degree days is steadily ticking up globally. Global cooling degree days were 6% higher in 2024 than in 2023, and 20% higher than the long-term average for the first two decades of the century. Regions that have high cooling demand, like China, India, and the US, were particularly affected, according to the IEA report. You can see a month-by-month breakdown of this data from the IEA here.

That increase in cooling degree days is leading to more demand for air conditioners, and for energy to power them. Air-conditioning accounted for 7% of the world’s electricity demand in 2022, and it’s only going to get more important from here.

There were fewer than 2 billion AC units in the world in 2016. By 2050, that could be nearly 6 billion, according to a 2018 report from the IEA. This is a measure of progress and, in a way, something we should be happy about; the number of air conditioners tends to rise with household income. But it does present a challenge to the grid.  

Another piece of this whole thing: It’s not just about how much total electricity we need to run air conditioners but about when that demand tends to come. As we’ve covered in this newsletter before, your air-conditioning habits aren’t unique. Cooling devices tend to flip on around the same time—when it’s hot. In some parts of the US, for example, air conditioners can represent more than 70% of residential energy demand at times when the grid is most stressed.

The good news is that we’re seeing innovations in cooling technology. Some companies are building cooling systems that include an energy storage component, so they can charge up when energy is plentiful and demand is low. Then they can start cooling when it’s most needed, without sucking as much energy from the grid during peak hours.

We’ve also covered alternatives to air conditioners called desiccant cooling systems, which use special moisture-sucking materials to help cool spaces and deal with humidity more efficiently than standard options.

And in my latest story, I dug into new developments in heat exchanger technology. Heat exchangers are a crucial component of air conditioners, but you can really find them everywhere—in heat pumps, refrigerators, and, yes, the cooling systems in large buildings and large electronics installations, including data centers.

We’ve been building heat exchangers basically the same way for nearly a century. These components move heat from one place to another, and there are a few known ways to do so with devices that are relatively straightforward to manufacture. Now, though, one team of researchers has 3D-printed a heat exchanger that outperforms some standard designs and rivals others. This is still a long way from solving our looming air-conditioning crisis, but the details are fascinating. I hope you’ll give it a read.

We need more innovation in cooling technology to help meet global demand efficiently so we don’t stay stuck in this cycle. And we’ll need policy and public support to make sure that these technologies make a difference and that everyone has access to them too. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

How to save a glacier

Glaciers generally move so slowly you can’t see their progress with the naked eye. (Their pace is … glacial.) But these massive bodies of ice do march downhill, with potentially planet-altering consequences.  

There’s a lot we don’t understand about how glaciers move and how soon some of the most significant ones could collapse into the sea. That could be a problem, since melting glaciers could lead to multiple feet of sea-level rise this century, potentially displacing millions of people who live and work along the coasts.

A new group is aiming not only to further our understanding of glaciers but also to look into options to save them if things move toward a worst-case scenario, as my colleague James Temple outlined in his latest story. One idea: refreezing glaciers in place.

The whole thing can sound like science fiction. But once you consider how huge the stakes are, I think it gets easier to understand why some scientists say we should at least be exploring these radical interventions.

It’s hard to feel very optimistic about glaciers these days. (The Thwaites Glacier in West Antarctica is often called the “doomsday glacier”—not alarming at all!)

Take two studies published just in the last month, for example. The British Antarctic Survey released the most detailed map to date of Antarctica’s bedrock—the foundation under the continent’s ice. With twice as many data points as before, the study revealed that more ice than we thought is resting on bedrock that’s already below sea level. That means seawater can flow in and help melt ice faster, so Antarctica’s ice is more vulnerable than previously estimated.

Another study examined subglacial rivers—streams that flow under the ice, often from subglacial lakes. The team found that the fastest-moving glaciers have a whole lot of water moving around underneath them, which speeds melting and lubricates the ice sheet so it slides faster, in turn melting even more ice.

And those are just two of the most recent surveys. Look at any news site and it’s probably delivered the same gnarly message at some point recently: The glaciers are melting faster than previously realized. (Our site has one, too: “Greenland’s ice sheet is less stable than we thought,” from 2016.) 

The new group is joining the race to better understand glaciers. Arête Glacier Initiative, a nonprofit research organization founded by scientists at MIT and Dartmouth, has already awarded its first grants to researchers looking into how glaciers melt and plans to study the possibility of reversing those fortunes, as James exclusively reported last week.

Brent Minchew, one of the group’s cofounders and an associate professor of geophysics at MIT, was drawn to studying glaciers because of their potential impact on sea-level rise. “But over the years, I became less content with simply telling a more dramatic story about how things were going—and more open to asking the question of what can we do about it,” he says.

Minchew is among the researchers looking into potential plans to alter the future of glaciers. Strategies being proposed by groups around the world include building physical supports to prop them up and installing massive curtains to slow the flow of warm water that speeds melting. Another approach, which will be the focus of Arête, is called basal intervention. It basically involves drilling holes in glaciers, which would allow water flowing underneath the ice to be pumped out and refrozen, hopefully slowing them down.

If you have questions about how all this would work, you’re not alone. These are almost inconceivably huge engineering projects, they’d be expensive, and they’d face legal and ethical questions. Nobody really owns Antarctica, and it’s governed by a huge treaty—how could we possibly decide whether to move forward with these projects?

Then there’s the question of the potential side effects. Just look at recent news from the Arctic Ice Project, which was researching how to slow the melting of sea ice by covering it with substances designed to reflect sunlight away. (Sea ice is different from glaciers, but some of the key issues are the same.) 

One of the project’s largest field experiments involved spreading tiny silica beads, sort of like sand, over 45,000 square feet of ice in Alaska. But after new research revealed that the materials might be disrupting food chains, the organization announced that it’s concluding its research and winding down operations.

Cutting our emissions of greenhouse gases to stop climate change at the source would certainly be more straightforward than spreading beads on ice, or trying to stop a 74,000-square-mile glacier in its tracks. 

But we’re not doing so hot on cutting emissions—in fact, levels of carbon dioxide in the atmosphere rose faster than ever in 2024. And even if the world stopped polluting the atmosphere with planet-warming gases today, things may have already gone too far to save some of the most vulnerable glaciers. 

The longer I cover climate change and face the situation we’re in, the more I understand the impulse to at least consider every option out there, even if it sounds like science fiction. 


The elephant in the room for energy tech? Uncertainty.

At a conference dedicated to energy technology that I attended this week, I noticed an outward attitude of optimism and excitement. But it’s hard to miss the current of uncertainty just underneath. 

The ARPA-E Energy Innovation Summit, held this year just outside Washington, DC, gathers some of the most cutting-edge innovators working on everything from next-generation batteries to plants that can mine for metals. Researchers whose projects have received funding from ARPA-E—part of the US Department of Energy that gives money to high-risk research in energy—gather to show their results and mingle with each other, investors, and nosy journalists like yours truly. (For more on a few of the coolest things I saw, check out this story.)

This year, though, there was an elephant in the room, and it was the current state of the US federal government. Or maybe it was climate change? In any case, the vibes were weird. 

The last time I was at this conference, two years ago, climate change was a constant refrain on stage and in conversations. The central question was undoubtedly: How do we decarbonize, generate energy, and run our lives without relying on polluting fossil fuels? 

This time around, I didn’t hear the phrase “climate change” once during the opening session, which included speeches from US Secretary of Energy Chris Wright and acting ARPA-E director Daniel Cunningham. The focus was on American energy dominance—on how we can get our hands on more, more, more energy to meet growing demand. 

Last week, Wright spoke at an energy conference in Houston and had a lot to say about climate: he called climate change a “side effect of building the modern world,” dismissed climate policies as irrational and quasi-religious, and said that when it came to climate action, the cure had become worse than the disease.

I was anticipating similar talking points at the summit, but this week, climate change hardly got a mention.

What I noticed in Wright’s speech and in the choice of programming throughout the conference is that some technologies appear to be favored, while others are decidedly less prominent. Nuclear power and fusion were definitely on the “in” list. There was a nuclear panel in the opening session, and in his remarks Wright called out companies like Commonwealth Fusion Systems and Zap Energy. He also praised small modular reactors.

Renewables, including wind and solar, were mentioned only in the context of their inconsistency—Wright dwelled on that, rather than on other facts I’d argue are just as important, like that they are among the cheapest methods of generating electricity today. 

In any case, Wright seemed appropriately hyped about energy, given his role in the administration. “Call me biased, but I think there’s no more impactful place to work in than energy,” he said during his opening remarks on the first morning of the summit. He sang the praises of energy innovation, calling it a tool to drive progress, and outlined his long career in the field. 

This all comes after a chaotic couple of months for the federal government that are undoubtedly affecting the industry. Mass layoffs have hit federal agencies, including the Department of Energy. President Donald Trump very quickly tried to freeze spending from the Inflation Reduction Act, which includes tax credits and other support for EVs and power plants. 

As I walked around the showcase and chatted with experts over coffee, I heard a range of reactions to the opening session and feelings about this moment for the energy sector. 

People working in industries the Trump administration seems to favor, like nuclear energy, tended to be more positive. Some in academia who rely on federal grants to fund their work were particularly nervous about what comes next. One researcher refused to talk to me when I said I was a journalist. In response to my questions about why they weren’t able to discuss the technology on display at their booth, another member of the same project said only that it’s a wild time.

Making progress on energy technology doesn’t require that we all agree on exactly why we’re doing it. But in a moment when we need all the low-carbon technologies we can get to address climate change—a problem scientists overwhelmingly agree is a threat to our planet—I find it frustrating that politics can create such a chilling effect in some sectors. 

At the conference, I listened to smart researchers talk about their work. I saw fascinating products and demonstrations, and I’m still optimistic about where energy can go. But I also worry that uncertainty about the future of research and government support for emerging technologies will leave some valuable innovations in the dust. 


This startup just hit a big milestone for green steel production


Green-steel startup Boston Metal just showed that it has all the ingredients needed to make steel without emitting gobs of greenhouse gases. The company successfully ran its largest reactor yet to make steel, producing over a ton of metal, MIT Technology Review can exclusively report.

The latest milestone means that Boston Metal just got one step closer to commercializing its technology. The company’s process uses electricity to make steel, and depending on the source of that electricity, it could mean cleaning up production of one of the most polluting materials on the planet. The world produces about 2 billion metric tons of steel each year, emitting over 3 billion metric tons of carbon dioxide in the process.

While there are still a lot of milestones left before reaching the scale needed to make a dent in the steel industry, the latest run shows that the company can scale up its process.

Boston Metal started up its industrial reactor for steelmaking in January, and after it had run for several weeks, the company siphoned out roughly a ton of material on February 17. (You can see a video of the molten metal here. It’s really cool.)

Work on this reactor has been underway for a while. I got to visit the facility in Woburn, Massachusetts, in 2022, when construction was nearly done. In the years since, the company has been working on testing it out to make other metals before retrofitting it for steel production. 

Boston Metal’s approach is very different from that of a conventional steel plant. Steelmaking typically involves a blast furnace, which uses a coal-based fuel called coke to drive the reactions needed to turn iron ore into iron (the key ingredient in steel). The carbon in coke combines with oxygen pulled out of the iron ore, which gets released as carbon dioxide.

Instead, Boston Metal uses electricity in a process called molten oxide electrolysis (MOE). Iron ore gets loaded into a reactor, mixed with other ingredients, and then electricity is run through it, heating the mixture to around 1,600 °C (2,900 °F) and driving the reactions needed to make iron. That iron can then be turned into steel. 

Crucially for the climate, this process emits oxygen rather than carbon dioxide (that infamous greenhouse gas). If renewables like wind and solar or nuclear power are used as the source of electricity, then this approach can virtually cut out the climate impact from steel production. 

MOE was developed at MIT, and Boston Metal was founded in 2013 to commercialize the technology. Since then, the company has worked to take it from lab scale, with reactors roughly the size of a coffee cup, to much larger ones that can produce tons of metal at a time. That’s crucial for an industry that operates on the scale of billions of tons per year.

“The volumes of steel everywhere around us—it’s immense,” says Adam Rauwerdink, senior vice president of business development at Boston Metal. “The scale is massive.”


Making the huge amounts of steel required to be commercially relevant has been quite the technical challenge. 

One key component of Boston Metal’s design is the anode. It’s basically a rounded metallic bit that sticks into the reactor, providing a way for electricity to get in and drive the reactions required. In theory, this anode doesn’t get used up, but if the conditions aren’t quite right, it can degrade over time.

Over the past few years, the company has made a lot of progress in preventing inert anode degradation, Rauwerdink says. The latest phase of work is more complicated, because now the company is adding multiple anodes in the same reactor. 

In lab-scale reactors, there’s one anode, and it’s quite small. Larger reactors require bigger anodes, and at a certain point it’s necessary to add more of them. The latest run continues to prove how Boston Metal’s approach can scale, Rauwerdink says: making reactors larger, adding more anodes, and then adding multiple reactors together in a single plant to make the volumes of material needed.

Now that the company has completed its first run of the multi-anode reactor for steelmaking, the plan is to keep exploring how the reactions happen at this larger scale. These runs will also help the company better understand what it will cost to make its products.

The next step is to build an even bigger system, Rauwerdink says—something that won’t fit in the Boston facility. While a reactor of the current size can make a ton or two of material in about a month, the truly industrial-scale equipment will make that amount of metal in about a day. That demonstration plant should come online in late 2026 and begin operation in 2027, he says. Ultimately, the company hopes to license its technology to steelmakers. 

In steel and other heavy industries, the scale can be mind-boggling. Boston Metal has been at this for over a decade, and it’s fascinating to see the company make progress toward becoming a player in this massive industry. 


Now read the rest of The Spark

Related reading

We named green steel one of our 2025 Breakthrough Technologies. Read more about why here.

I visited Boston Metal’s facility in Massachusetts in 2022—read more about the company’s technology in this story (I’d say it pretty much holds up). 

Climate tech companies like Boston Metal have seen a second boom period for funding and support following the cleantech crash a decade ago. Read more in this 2023 feature from David Rotman.


Another thing

Electricity demand is rising faster in the US than it has in decades, and meeting it will require building new power plants and expanding grid infrastructure. That could be a problem, because it’s historically been expensive and slow to get new transmission lines approved. 

New technologies could help in a major way, according to Brian Deese and Rob Gramlich. Read more in this new op-ed.

And one more

Plants have really nailed the process of making food from sunlight in photosynthesis. For a very long time, researchers have been trying to mimic this process and make an artificial leaf that can make fuels using the sun’s energy.

Now, researchers are aiming to make energy-dense fuels using a specialized, copper-containing catalyst. Read more about the innovation in my colleague Carly Kay’s latest story.

Keeping up with climate

Energy storage is still growing quickly in the US, with 18 gigawatts set to come online this year, up from 11 gigawatts in 2024. (Canary Media)

Oil companies including Shell, BP, and Equinor are rolling back climate commitments and ramping up fossil-fuel production. Oil and gas companies accounted for only a small fraction of clean energy investment, so experts say that’s not a huge loss. But putting money toward new oil and gas could be bad for emissions. (Grist)

Butterfly populations are cratering around the US, dropping by 22% in just the last 20 years. Check out this visualization to see how things are changing where you live. (New York Times)

New York City’s congestion pricing plan, which charges cars to enter the busiest parts of the city, is gaining popularity: 42% of New York City residents support the toll, up from 32% in December. (Bloomberg)

Here’s a reality check for you: Ukraine doesn’t have minable deposits of rare earth metals, experts say. While tensions between US and Ukrainian leaders ran high in a meeting to discuss a minerals deal, IEEE Spectrum reports that the reality doesn’t match the political theater. (IEEE Spectrum)

Quaise Energy has a wild drilling technology that it says could unlock the potential for geothermal energy. In a demonstration, the company recently drilled several inches into a piece of rock using its millimeter-wave technology. (Wall Street Journal)

Here’s another one for the “weird climate change effects” file: greenhouse-gas emissions could mean less capacity for satellites. It’s getting crowded up there. (Grist)

The Biden administration funded agriculture projects related to climate change, and now farmers are getting caught up in the Trump administration’s efforts to claw back the money. This is a fascinating case of how the same project can be described with entirely different language depending on political priorities. (Washington Post)

You and I are helping to pay for the electricity demands of big data centers. While some grid upgrades are needed just to serve big projects like those centers, the cost of building and maintaining the grid is shared by everyone who pays for electricity. (Heatmap)