Three climate technologies breaking through in 2026

Happy New Year! I know it’s a bit late to say, but it never quite feels like the year has started until the new edition of our 10 Breakthrough Technologies list comes out. 

For 25 years, MIT Technology Review has put together this package, which highlights the technologies that we think are going to matter in the future. This year’s version has some stars, including gene resurrection (remember all the dire wolf hype last year?) and commercial space stations.

And of course, the world of climate and energy is represented with sodium-ion batteries, next-generation nuclear, and hyperscale AI data centers. Let’s take a look at what ended up on the list, and what it says about this moment for climate tech. 

Sodium-ion batteries

I’ve been covering sodium-ion batteries for years, but this moment feels like a breakout one for the technology. 

Today, lithium-ion cells power everything from EVs, phones, and computers to huge stationary storage arrays that help support the grid. But researchers and battery companies have been racing to develop an alternative, driven by the relative scarcity of lithium and the metal’s volatile price in recent years. 

Sodium-ion batteries could be that alternative. Sodium is much more abundant than lithium, and it could unlock cheaper batteries with a lower fire risk.

There are limitations here: Sodium-ion batteries won’t be able to pack as much energy into cells as their lithium counterparts. But it might not matter, especially for grid storage and smaller EVs. 

In recent years, we’ve seen a ton of interest in sodium-based batteries, particularly from major companies in China. Now the new technology is starting to make its way into the world—CATL says it started manufacturing these batteries at scale in 2025. 

Next-generation nuclear

Nuclear reactors are an important part of grids around the world today—massive workhorse reactors generate reliable, consistent electricity. But the countries with the oldest and most built-out fleets have struggled to add to them in recent years, since reactors are massive and cost billions. Recent high-profile projects have gone way over budget and faced serious delays. 

Next-generation reactor designs could help the industry break out of the old blueprint and get more nuclear power online more quickly, and they’re starting to get closer to becoming reality. 

There’s a huge variety of proposals when it comes to what’s next for nuclear. Some companies are building smaller reactors, which they say could make it easier to finance new projects, and get them done on time. 

Other companies are focusing on tweaking key technical bits of reactors, using alternative fuels or coolants that help ferry heat out of the reactor core. These changes could help reactors generate electricity more efficiently and safely. 

Kairos Power was the first US company to receive approval to begin construction on a next-generation reactor to produce electricity. China is emerging as a major center of nuclear development, with the country’s national nuclear company reportedly working on several next-gen reactors. 

Hyperscale data centers

This one isn’t quite what I would call a climate technology, but I spent most of last year reporting on the climate and environmental impacts of AI, and the AI boom is deeply intertwined with climate and energy. 

Data centers aren’t new, but we’re seeing a wave of larger centers being proposed and built to support the rise of AI. Some of these facilities require a gigawatt or more of power—that’s like the output of an entire conventional nuclear power plant, just for one data center. 

(This feels like a good time to mention that our Breakthrough Technologies list doesn’t just highlight tech that we think will have a straightforwardly positive influence on the world. I think back to our 2023 list, which included mass-market military drones.)

There’s no denying that new, supersize data centers are an important force driving electricity demand, sparking major public pushback, and emerging as a key bit of our new global infrastructure. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

What new legal challenges mean for the future of US offshore wind

For offshore wind power in the US, the new year is bringing new legal battles.

On December 22, the Trump administration announced it would pause the leases of five wind farms currently under construction off the US East Coast. Developers were ordered to stop work immediately.

The cited reason? National security, specifically concerns that turbines can cause radar interference. But that’s a known issue, and developers have worked with the government to deal with it for years.

Companies have been quick to file lawsuits, and the court battles could begin as soon as this week. Here’s what the latest kerfuffle might mean for the struggling offshore wind industry in the US.

This pause affects $25 billion in investment in five wind farms: Vineyard Wind 1 off Massachusetts, Revolution Wind off Rhode Island, Sunrise Wind and Empire Wind off New York, and Coastal Virginia Offshore Wind off Virginia. Together, those projects had been expected to create 10,000 jobs and power more than 2.5 million homes and businesses.

In a statement announcing the move, the Department of the Interior said that “recently completed classified reports” revealed national security risks, and that the pause would give the government time to work through concerns with developers. The statement specifically says that turbines can create radar interference (more on the technical details here in a moment).

Three of the companies involved have already filed lawsuits, and they’re seeking preliminary injunctions that would allow construction to continue. Orsted and Equinor (the developers for Revolution Wind and Empire Wind, respectively) told the New York Times that their projects went through lengthy federal reviews, which did address concerns about national security.

This is just the latest salvo from the Trump administration against offshore wind. On Trump’s first day in office, he signed an executive order stopping all new lease approvals for offshore wind farms. (That order was struck down by a judge in December.)

The administration previously ordered Revolution Wind to stop work last year, also citing national security concerns. A federal judge lifted the stop-work order weeks later, after the developer showed that the financial stakes were high, and that government agencies had previously found no national security issues with the project.

There are real challenges that wind farms introduce for radar systems, which are used in everything from air traffic control to weather forecasting to national defense operations. A wind turbine’s spinning can create complex signatures on radar, resulting in so-called clutter.

Government reports, including a 2024 report from the Department of Energy and a 2025 report from the Government Accountability Office (an independent government watchdog), have pointed out this issue.

“To date, no mitigation technology has been able to fully restore the technical performance of impacted radars,” as the DOE report puts it. However, there are techniques that can help, including software that acts to remove the signatures of wind turbines. (Think of this as similar to how noise-canceling headphones work, but more complicated, as one expert told TechCrunch.)
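To make that noise-canceling analogy a bit more concrete, here is a toy sketch of subtracting a known periodic signature from a raw signal. Every signal and parameter here is invented for illustration; real radar clutter mitigation is far more sophisticated than this.

```python
import numpy as np

# Toy sketch: if the turbine's periodic signature can be modeled,
# it can be subtracted from the raw return so the echo of interest
# stands out -- loosely analogous to noise cancellation.
# All values below are made up for illustration.

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 1000)  # time axis, arbitrary units

aircraft_echo = np.exp(-((t - 6.0) ** 2) / 0.05)     # the return we care about
turbine_clutter = 2.0 * np.sin(2 * np.pi * 1.5 * t)  # periodic blade signature (assumed known)
noise = 0.02 * rng.standard_normal(t.size)

raw_return = aircraft_echo + turbine_clutter + noise
filtered = raw_return - turbine_clutter  # subtract the modeled signature

# In the raw return, the strongest sample sits on a clutter peak;
# after subtraction, the strongest sample is the aircraft echo near t = 6.
print(t[np.argmax(np.abs(raw_return))], t[np.argmax(np.abs(filtered))])
```

The hard part in practice, as the DOE report suggests, is that the turbine signature is not a clean, known sinusoid, which is why no mitigation fully restores radar performance.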

But the most widespread and helpful tactic, according to the DOE report, is collaboration between developers and the government. By working together to site and design wind farms strategically, the groups can ensure that the projects don’t interfere with government or military operations. The 2025 GAO report found that government officials, researchers, and offshore wind companies were collaborating effectively, and any concerns could be raised and addressed in the permitting process.

This and other challenges threaten an industry that could be a major boon for the grid. On the East Coast where these projects are located, and in New England specifically, winter can bring tight supplies of fossil fuels and spiking prices because of high demand. It just so happens that offshore winds blow strongest in the winter, so new projects, including the five wrapped up in this fight, could be a major help during the grid’s greatest time of need.

One 2025 study found that if 3.5 gigawatts’ worth of offshore wind had been operational during the 2024-2025 winter, it would have lowered energy prices by 11%. (That’s the combined capacity of Revolution Wind and Vineyard Wind, two of the paused projects, plus two future projects in the pipeline.) Ratepayers would have saved $400 million.

Before Donald Trump was elected, the energy consultancy BloombergNEF projected that the US would build 39 gigawatts of offshore wind by 2035. Today, that expectation has dropped to just 6 gigawatts. These legal battles could push it lower still.

What’s hardest to wrap my head around is that some of the projects being challenged are nearly finished. The developers of Revolution Wind have installed all the foundations and 58 of 65 turbines, and they say the project is over 87% complete. Empire Wind is over 60% done and is slated to deliver electricity to the grid next year.

To hit the pause button so close to the finish line is chilling, not just for current projects but for future offshore wind efforts in the US. Even if these legal battles clear up and more developers can technically enter the queue, why would they want to? Billions of dollars are at stake, and if there’s one word to describe the current state of the offshore wind industry in the US, it’s “unpredictable.”

Four bright spots in climate news in 2025

Climate news hasn’t been great in 2025. Global greenhouse-gas emissions hit record highs (again). This year is set to be either the second or third warmest on record. Climate-fueled disasters like wildfires in California and flooding in Indonesia and Pakistan devastated communities and caused billions in damage.

In addition to these worrying indicators of our continued contributions to climate change and their obvious effects, the world’s largest economy has made a sharp U-turn on climate policy this year. The US under the Trump administration withdrew from the Paris Agreement, cut funds for climate research, and scrapped billions of dollars in funding for climate tech projects.

We’re in a severe situation with climate change. But for those looking for bright spots, there was some good news in 2025. Here are a few of the positive stories our climate reporters noticed this year.

China’s flattening emissions

One of the most notable and encouraging signs of progress this year occurred in China. The world’s second-biggest economy and biggest climate polluter has managed to keep carbon dioxide emissions flat for the last year and a half, according to an analysis in Carbon Brief.

That’s happened before, but only when the nation’s economy was contracting, including in the midst of the covid-19 pandemic. But emissions are now holding steady even as China’s economy is on track to grow about 5% this year and electricity demand continues to rise.

So what’s changed? China has now installed so much solar and wind power, and put so many EVs on the road, that its economy can continue to expand without increasing the amount of carbon dioxide it’s pumping into the atmosphere, decoupling emissions from growth.

Specifically, China added an astounding 240 gigawatts of solar power capacity and 61 gigawatts of wind power in the first nine months of the year, the Carbon Brief analysis noted. That’s nearly as much solar power as the US has installed in total, in just the first three quarters of this year.

It’s too early to say China’s emissions have peaked, but the country has said it will officially reach that benchmark before 2030.

To be clear, China still isn’t moving fast enough to keep the world on track for meeting relatively safe temperature targets. (Indeed, very few countries are.) But it’s now both producing most of the world’s clean energy technologies and curbing its emissions growth, providing a model for cleaning up industrial economies without sacrificing economic prosperity—and setting the stage for faster climate progress in the coming years.

Batteries on the grid

It’s hard to articulate just how quickly batteries for grid storage are coming online. These massive arrays of cells can soak up electricity when sources like solar are available and prices are low, and then discharge power back to the grid when it’s needed most.
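That charge-when-cheap, discharge-when-expensive pattern is a form of energy arbitrage. A minimal sketch with invented hourly prices gives the flavor; it ignores round-trip losses and the requirement that charging precede discharging.

```python
# Minimal sketch of grid-battery energy arbitrage.
# Prices and battery size are hypothetical; real dispatch is
# subject to losses, ramp limits, and charge-before-discharge ordering.

hourly_prices = [30, 25, 20, 22, 35, 60, 90, 75]  # $/MWh for a slice of one day
battery_mwh = 2                                    # hours of storage to shift

cheapest = sorted(range(len(hourly_prices)), key=lambda h: hourly_prices[h])[:battery_mwh]
priciest = sorted(range(len(hourly_prices)), key=lambda h: -hourly_prices[h])[:battery_mwh]

cost = sum(hourly_prices[h] for h in cheapest)     # buy low:  20 + 22 = 42
revenue = sum(hourly_prices[h] for h in priciest)  # sell high: 90 + 75 = 165
print(f"Arbitrage margin: ${revenue - cost}")      # prints "Arbitrage margin: $123"
```

The wider the spread between cheap midday solar and expensive evening peaks, the better the economics, which is part of why batteries are filling up grids in California and Texas first.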

Back in 2015, the battery storage industry had installed only a fraction of a gigawatt of battery storage capacity across the US. That year, it set a seemingly bold target of adding 35 gigawatts by 2035. The sector passed that goal a decade early this year and then hit 40 gigawatts a couple of months later. 

Costs are still falling, which could help maintain the momentum for the technology’s deployment. This year, battery prices for EVs and stationary storage fell yet again, reaching a record low, according to data from BloombergNEF. Battery packs specifically used for grid storage saw prices fall even faster than the average; they cost 45% less than last year.

We’re starting to see what happens on grids with lots of battery capacity, too: in California and Texas, batteries are already helping meet demand in the evenings, reducing the need to run natural-gas plants. The result: a cleaner, more stable grid.

AI’s energy funding influx

The AI boom is complicated for our energy system, as we covered at length this year. Electricity demand is ticking up: the amount of power utilities supplied to US data centers jumped 22% this year and will more than double by 2030.

But at least one positive shift is coming out of AI’s influence on energy: It’s driving renewed interest and investment in next-generation energy technologies.

In the near term, much of the energy needed for data centers, including those that power AI, will likely come from fossil fuels, especially new natural-gas power plants. But tech giants like Google, Microsoft, and Meta all have goals on the books to reduce their greenhouse-gas emissions, so they’re looking for alternatives.

Meta signed a deal with XGS Energy in June to purchase up to 150 megawatts of electricity from a geothermal plant. In October, Google signed an agreement that will help reopen the Duane Arnold Energy Center in Iowa, a previously shuttered nuclear power plant.

Geothermal and nuclear could be key pieces of the grid of the future, as they can provide constant power in a way that wind and solar don’t. There’s a long way to go for many of the new versions of the tech, but more money and interest from big, powerful players can’t hurt.

Good news, bad news

Perhaps the strongest evidence of collective climate progress so far: We’ve already avoided the gravest dangers that scientists feared just a decade ago.

The world is on track for about 2.6 °C of warming over preindustrial conditions by 2100, according to Climate Action Tracker, an independent scientific effort to track the policy progress that nations have made toward their goals under the Paris climate agreement.

That’s a lot warmer than we want the planet to ever get. But it’s also a whole degree better than the 3.6 °C path that we were on a decade ago, just before nearly 200 countries signed the Paris deal.

That progress occurred because more and more nations passed emissions mandates, funded subsidies, and invested in research and development—and private industry got busy cranking out vast amounts of solar panels, wind turbines, batteries, and EVs. 

The bad news is that progress has stalled. Climate Action Tracker notes that its warming projections have remained stubbornly fixed for the last four years, as nations have largely failed to take the additional action needed to bend that curve closer to the 2 °C goal set out in the international agreement.

But having shaved off a degree of danger is still demonstrable proof that we can pull together in the face of a global threat and address a very, very hard problem. And it means we’ve done the difficult work of laying down the technical foundation for a society that can largely run without spewing ever more greenhouse gas into the atmosphere.

Hopefully, as cleantech continues to improve and climate change steadily worsens, the world will find the collective will to pick up the pace again soon.

Can AI really help us discover new materials?

Judging from headlines and social media posts in recent years, one might reasonably assume that AI is going to fix the power grid, cure the world’s diseases, and finish my holiday shopping for me. But maybe there’s just a whole lot of hype floating around out there.

This week, we published a new package called Hype Correction. The collection of stories takes a look at how the world is starting to reckon with the reality of what AI can do, and what’s just fluff.

One of my favorite stories in that package comes from my colleague David Rotman, who took a hard look at AI for materials research. AI could transform the process of discovering new materials—innovation that could be especially useful in the world of climate tech, which needs new batteries, semiconductors, magnets, and more. 

But the field still needs to prove it can make materials that are actually novel and useful. Can AI really supercharge materials research? What could that look like?

For researchers hoping to find new ways to power the world (or cure disease or achieve any number of other big, important goals), a new material could change everything.

The problem is, inventing materials is difficult and slow. Just look at plastic—the first totally synthetic plastic was invented in 1907, but it took until roughly the 1950s for companies to produce the wide range we’re familiar with today. (And of course, though it is incredibly useful, plastic also causes no shortage of complications for society.)

In recent decades, materials science has fallen a bit flat—David has been covering this field for nearly 40 years, and as he puts it, there have been just a few major commercial breakthroughs in that time. (Lithium-ion batteries are one.)

Could AI change everything? The prospect is a tantalizing one, and companies are racing to test it out.

Lila Sciences, based in Cambridge, Massachusetts, is working on using AI models to uncover new materials. The company can not only train an AI model on all the latest scientific literature, but also plug it into an automated lab, so it can learn from experimental data. The goal is to speed up the iterative process of inventing and testing new materials and look at research in ways that humans might miss.

At an MIT Technology Review event earlier this year, I got to listen to David interview Rafael Gómez-Bombarelli, one of Lila’s cofounders. As he described what the company is working on, Gómez-Bombarelli acknowledged that AI materials discovery hasn’t yet seen a big breakthrough moment. Yet.

Gómez-Bombarelli described how models Lila has trained are providing insights that are “as deep [as] or deeper than our domain scientists would have.” In the future, AI could “think” in ways that depart from how human scientists approach a problem, he added: “There will be a need to translate scientific reasoning by AI to the way we think about the world.”

It’s exciting to see this sort of optimism in materials research, but there’s still a long and winding road before we can confidently say that AI has transformed the field. One major difficulty: it’s one thing to take suggestions from a model about new experimental methods or potential structures. It’s quite another to actually make a material and show that it’s novel and useful.

You might remember that a couple of years ago, Google’s DeepMind announced it had used AI to predict the structures of “millions of new materials” and had made hundreds of them in the lab.

But as David notes in his story, after that announcement, some materials scientists pointed out that some of the supposedly novel materials were basically slightly different versions of known ones. Others couldn’t even physically exist in normal conditions (the simulations were done at ultra-low temperatures, where atoms don’t move around much).

It’s possible that AI could give materials discovery a much-needed jolt and usher in a new age that brings superconductors and batteries and magnets we’ve never seen before. But for now, I’m calling hype. 

Solar geoengineering startups are getting serious

Solar geoengineering aims to manipulate the climate by bouncing sunlight back into space. In theory, it could ease global warming. But as interest in the idea grows, so do concerns about potential consequences.

A startup called Stardust Solutions recently raised a $60 million funding round, the largest known to date for a geoengineering startup. My colleague James Temple has a new story out about the company, and how its emergence is making some researchers nervous.

So far, the field has been limited to debates, proposed academic research, and—sure—a few fringe actors to keep an eye on. Now things are getting more serious. What does it mean for geoengineering, and for the climate?

Researchers have considered the possibility of addressing planetary warming this way for decades. We already know that volcanic eruptions, which spew sulfur dioxide into the atmosphere, can reduce temperatures. The thought is that we could mimic that natural process by spraying particles up there ourselves.

The prospect is a controversial one, to put it lightly. Many have concerns about unintended consequences and uneven benefits. Even public research led by top institutions has faced barriers—one famous Harvard research program was officially canceled last year after years of debate.

One of the difficulties of geoengineering is that, in theory, a single entity, like a startup company, could make decisions that have a widespread effect on the planet. And in the last few years, we’ve seen more interest in geoengineering from the private sector.

Three years ago, James broke the story that Make Sunsets, a California-based company, was already releasing particles into the atmosphere in an effort to tweak the climate.

The company’s CEO, Luke Iseman, went to Baja California in Mexico, stuck some sulfur dioxide into a weather balloon, and sent it skyward. The amount of material was tiny, and it’s not clear that it even made it into the right part of the atmosphere to reflect any sunlight.

But fears that this group or others could go rogue and do their own geoengineering led to widespread backlash. Mexico announced plans to restrict geoengineering experiments in the country a few weeks after that news broke.

You can still buy cooling credits from Make Sunsets, and the company was just granted a patent for its system. But the startup is seen as something of a fringe actor.

Enter Stardust Solutions. The company has been working under the radar for a few years, but it has started talking about its work more publicly this year. In October, it announced a significant funding round, led by some top names in climate investing. “Stardust is serious, and now it’s raised serious money from serious people,” as James puts it in his new story.

That’s making some experts nervous. Even those who believe we should be researching geoengineering are concerned about what it means for private companies to do so.

“Adding business interests, profit motives, and rich investors into this situation just creates more cause for concern, complicating the ability of responsible scientists and engineers to carry out the work needed to advance our understanding,” write David Keith and Daniele Visioni, two leading figures in geoengineering research, in a recent opinion piece for MIT Technology Review.

Stardust insists that it won’t move forward with any geoengineering until and unless it’s commissioned to do so by governments and there are rules and bodies in place to govern use of the technology.

But there’s no telling how financial pressure might change that, down the road. And we’re already seeing some of the challenges faced by a private company in this space: the need to keep trade secrets.

Stardust is currently not sharing information about the particles it intends to release into the sky, though it says it plans to do so once it secures a patent, which could happen as soon as next year. The company argues that its proprietary particles will be safe, cheap to manufacture, and easier to track than the already abundant sulfur dioxide. But at this point, there’s no way for external experts to evaluate those claims.

As Keith and Visioni put it: “Research won’t be useful unless it’s trusted, and trust depends on transparency.”

Why the grid relies on nuclear reactors in the winter

As many of us are ramping up with shopping, baking, and planning for the holiday season, nuclear power plants are also getting ready for one of their busiest seasons of the year.

Here in the US, nuclear reactors follow predictable seasonal trends. Summer and winter tend to see the highest electricity demand, so plant operators schedule maintenance and refueling for other parts of the year.

This scheduled regularity might seem mundane, but it’s quite the feat that operational reactors are as reliable and predictable as they are. That leaves some big shoes to fill for next-generation technologies hoping to join the fleet in the next few years.

Generally, nuclear reactors operate at constant levels, as close to full capacity as possible. In 2024, for commercial reactors worldwide, the average capacity factor—the ratio of actual energy output to the theoretical maximum—was 83%. North America rang in at an average of about 90%.
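The capacity-factor arithmetic itself is straightforward. Here is a quick sketch with a hypothetical 1,000-megawatt reactor; the numbers are illustrative, not data from any real plant.

```python
# Capacity factor: actual energy delivered divided by the energy a plant
# would produce running at full rated power all year.
# Hypothetical 1,000 MW reactor; output figure is invented for illustration.

rated_power_mw = 1_000
hours_in_year = 365 * 24  # 8,760 hours

actual_output_mwh = 7_884_000                          # hypothetical metered output
theoretical_max_mwh = rated_power_mw * hours_in_year   # 8,760,000 MWh

capacity_factor = actual_output_mwh / theoretical_max_mwh
print(f"Capacity factor: {capacity_factor:.0%}")       # prints "Capacity factor: 90%"
```

A reactor at 90%, like the North American average, is effectively running flat out except for refueling outages and the occasional unplanned shutdown.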

(I’ll note here that it’s not always fair to just look at this number to compare different kinds of power plants—natural-gas plants can have lower capacity factors, but it’s mostly because they’re more likely to be intentionally turned on and off to help meet uneven demand.)

Those high capacity factors also undersell the fleet’s true reliability—a lot of the downtime is scheduled. Reactors need to refuel every 18 to 24 months, and operators tend to schedule those outages for the spring and fall, when electricity demand isn’t as high as when we’re all running our air conditioners or heaters at full tilt.

Take a look at this chart of nuclear outages from the US Energy Information Administration. There are some days, especially at the height of summer, when outages are low and nearly all commercial reactors in the US are operating at nearly full capacity. On July 28 of this year, the fleet was operating at 99.6% of capacity. Compare that with 77.6% on October 18, as reactors were taken offline for refueling and maintenance. Now we’re heading into another busy season, when reactors are coming back online and outages are approaching another low point.

That’s not to say all outages are planned. At the Sequoyah nuclear power plant in Tennessee, a generator failure in July 2024 took one of two reactors offline, an outage that lasted nearly a year. (The utility also did some maintenance during that time to extend the life of the plant.) Then, just days after that reactor started back up, the entire plant had to shut down because of low water levels.

And who can forget the incident earlier this year when jellyfish wreaked havoc on not one but two nuclear power plants in France? In the second instance, the squishy creatures got into the filters of equipment that sucks water out of the English Channel for cooling at the Paluel nuclear plant. They forced the plant to cut output by nearly half, though it was restored within days.

Barring jellyfish disasters and occasional maintenance, the global nuclear fleet operates quite reliably. That wasn’t always the case, though. In the 1970s, reactors operated at an average capacity factor of just 60%. They were shut down nearly as often as they were running.

The fleet of reactors today has benefited from decades of experience. Now we’re seeing a growing pool of companies aiming to bring new technologies to the nuclear industry.

Next-generation reactors that use new materials for fuel or cooling will be able to borrow some lessons from the existing fleet, but they’ll also face novel challenges.

That could mean early demonstration reactors aren’t as reliable as the current commercial fleet at first. “First-of-a-kind nuclear, just like with any other first-of-a-kind technologies, is very challenging,” says Koroush Shirvan, a professor of nuclear science and engineering at MIT.

That means it will probably take time for molten-salt reactors, small modular reactors, or any of the other designs out there to overcome technical hurdles and settle into their own rhythm. It’s taken decades to get to a place where we take it for granted that the nuclear fleet can follow a neat seasonal curve based on electricity demand.

There will always be hurricanes and electrical failures and jellyfish invasions that cause some unexpected problems and force nuclear plants (or any power plants, for that matter) to shut down. But overall, the fleet today operates at an extremely high level of consistency. One of the major challenges ahead for next-generation technologies will be proving that they can do the same.

This year’s UN climate talks avoided fossil fuels, again

If we didn’t have pictures and videos, I almost wouldn’t believe the imagery that came out of this year’s UN climate talks.

Over the past few weeks in Belém, Brazil, attendees dealt with oppressive heat and flooding, and at one point a literal fire broke out, delaying negotiations. The symbolism was almost too much to bear.

While many, including the president of Brazil, framed this year’s conference as one of action, the talks ended with a watered-down agreement. The final draft doesn’t even include the phrase “fossil fuels.”

As emissions and global temperatures reach record highs again this year, I’m left wondering: Why is it so hard to formally acknowledge what’s causing the problem?

This is the 30th time that leaders have gathered for the Conference of the Parties, or COP, an annual UN conference focused on climate change. COP30 also marks 10 years since the gathering that produced the Paris Agreement, in which world powers committed to limiting global warming to “well below” 2.0 °C above preindustrial levels, with a goal of staying below the 1.5 °C mark. (That’s 3.6 °F and 2.7 °F, respectively, for my fellow Americans.)

Before the conference kicked off this year, host country Brazil’s president, Luiz Inácio Lula da Silva, cast this as the “implementation COP” and called for negotiators to focus on action, and specifically to deliver a road map for a global transition away from fossil fuels.

The science is clear—burning fossil fuels emits greenhouse gases and drives climate change. Reports have shown that meeting the goal of limiting warming to 1.5 °C would require stopping new fossil-fuel exploration and development.

The problem is, “fossil fuels” might as well be a curse word at global climate negotiations. Two years ago, fights over how to address fossil fuels brought talks at COP28 to a standstill. (It’s worth noting that the conference was hosted in Dubai, in the UAE, and that its president was literally the head of the country’s national oil company.)

The agreement in Dubai ended up including a line that called on countries to transition away from fossil fuels in energy systems. It was short of what many advocates wanted, which was a more explicit call to phase out fossil fuels entirely. But it was still hailed as a win. As I wrote at the time: “The bar is truly on the floor.”

And yet this year, it seems we’ve dug into the basement.

At one point about 80 countries, a little under half of those present, demanded a concrete plan to move away from fossil fuels.

But oil producers like Saudi Arabia were insistent that fossil fuels not be singled out. Other countries, including some in Africa and Asia, also made a very fair point: Western nations like the US have burned the most fossil fuels and benefited from it economically. This contingent maintains that legacy polluters have a unique responsibility to finance the transition for less wealthy and developing nations rather than simply barring them from taking the same development route. 

The US, by the way, didn’t send a formal delegation to the talks, for the first time in 30 years. But the absence spoke volumes. In a statement to the New York Times that sidestepped the COP talks, White House spokesperson Taylor Rogers said that President Trump had “set a strong example for the rest of the world” by pursuing new fossil-fuel development.

To sum up: Some countries are economically dependent on fossil fuels, some don’t want to stop depending on fossil fuels without incentives from other countries, and the current US administration would rather keep using fossil fuels than switch to other energy sources. 

All those factors combined help explain why, in its final form, COP30’s agreement doesn’t name fossil fuels at all. Instead, there’s a vague line that leaders should take into account the decisions made in Dubai, and an acknowledgement that the “global transition towards low greenhouse-gas emissions and climate-resilient development is irreversible and the trend of the future.”

Hopefully, that’s true. But it’s concerning that even on the world’s biggest stage, naming what we’re supposed to be transitioning away from and putting together any sort of plan to actually do it seems to be impossible.

Three things to know about the future of electricity

  • Electricity demand is surging globally. Global electricity demand is projected to grow 40% over the next decade. Data center investment hit $580 billion in 2025 alone, surpassing global oil spending. In the US, data centers will account for half of all electricity growth through 2030.
  • Air-conditioning and emerging economies are reshaping energy consumption. Rising temperatures and growing prosperity in developing nations will add over 500 gigawatts of peak demand by 2035, dwarfing data centers’ contribution to overall electricity growth.
  • Renewables are finally overtaking coal, but the transition remains too slow. Solar and wind led electricity generation in the first half of 2025, with nuclear capacity poised to increase by a third this decade. Yet global emissions are likely to hit record highs again this year.

One of the dominant storylines I’ve been following through 2025 is electricity—where and how demand is going up, how much it costs, and how this all intersects with that topic everyone is talking about: AI.

Last week, the International Energy Agency released the latest version of the World Energy Outlook, the annual report that takes stock of the current state of global energy and looks toward the future. It contains some interesting insights and a few surprising figures about electricity, grids, and the state of climate change. So let’s dig into some numbers, shall we?

We’re in the age of electricity

Energy demand in general is going up around the world as populations increase and economies grow. But electricity is the star of the show, with demand projected to grow by 40% in the next 10 years.

China has accounted for the bulk of electricity growth for the past 10 years, and that’s going to continue. But emerging economies outside China will be a much bigger piece of the pie going forward. And while advanced economies, including the US and Europe, have seen flat demand in the past decade, the rise of AI and data centers will cause demand to climb there as well.

Air-conditioning is a major source of rising demand. Growing economies will give more people access to air-conditioning; income-driven AC growth will add about 330 gigawatts to global peak demand by 2035. Rising temperatures will tack on another 170 GW in that time. Together, that’s an increase of over 10% from 2024 levels.  
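A quick back-of-envelope check on those cooling figures. (The implied 2024 baseline below is my own inference from the "over 10%" line, not a number from the IEA report.)

```python
# Back-of-envelope check on the IEA cooling-demand figures above.
income_driven_gw = 330   # added peak demand from rising incomes, by 2035
warming_driven_gw = 170  # added peak demand from rising temperatures, by 2035

total_added_gw = income_driven_gw + warming_driven_gw
print(total_added_gw)  # 500 GW of added peak demand

# If 500 GW is "over 10%" of 2024 peak demand, the implied 2024
# global peak is somewhere under about 5,000 GW (an inference,
# not an IEA figure).
implied_2024_peak_ceiling_gw = total_added_gw / 0.10
print(implied_2024_peak_ceiling_gw)  # about 5,000 GW
```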

AI is a local story

This year, AI has been the story that none of us can get away from. One number that jumped out at me from this report: In 2025, investment in data centers is expected to top $580 billion. That’s more than the $540 billion spent on the global oil supply. 

It’s no wonder, then, that the energy demands of AI are in the spotlight. One key takeaway is that these demands are vastly different in different parts of the world.

Data centers still make up less than 10% of the projected increase in total electricity demand between now and 2035. It’s not nothing, but it’s far outweighed by sectors like industry and appliances, including air conditioners. Even electric vehicles will add more demand to the grid than data centers.

But in some parts of the world, AI will be the defining factor for the grid. In the US, data centers will account for half the growth in total electricity demand between now and 2030.

And as we’ve covered in this newsletter before, data centers present a unique challenge: because they tend to cluster together, their demand is concentrated in specific communities and on specific grids. Half the data center capacity in the pipeline is close to large cities.

Look out for a coal crossover

As we ask more from our grid, the key factor that’s going to determine what all this means for climate change is what’s supplying the electricity we’re using.

As it stands, the world’s grids still primarily run on fossil fuels, so every bit of electricity growth comes with planet-warming greenhouse-gas emissions attached. That’s slowly changing, though.

Together, solar and wind were the leading source of electricity in the first half of this year, overtaking coal for the first time. Coal use could peak and begin to fall by the end of this decade.

Nuclear could play a role in replacing fossil fuels: After two decades of stagnation, the global nuclear fleet could increase by a third in the next 10 years. Solar is set to continue its meteoric rise, too. Of all the electricity demand growth we’re expecting in the next decade, 80% is in places with high-quality solar irradiation—meaning they’re good spots for solar power.

Ultimately, there are a lot of ways in which the world is moving in the right direction on energy. But we’re far from moving fast enough. Global emissions are, once again, going to hit a record high this year. To limit warming and prevent the worst effects of climate change, we need to remake our energy system, including electricity, and we need to do it faster. 

Google is still aiming for its “moonshot” 2030 energy goals

Last week, we hosted EmTech MIT, MIT Technology Review’s annual flagship conference in Cambridge, Massachusetts. Over the course of three days of main-stage sessions, I learned about innovations in AI, biotech, and robotics. 

But as you might imagine, some of this climate reporter’s favorite moments came in the climate sessions. I was listening especially closely to my colleague James Temple’s discussion with Lucia Tian, head of advanced energy technologies at Google. 

They spoke about the tech giant’s growing energy demand and the technologies the company is counting on to help meet it. In case you weren’t able to join us, let’s dig into that session and consider how the company is thinking about energy in the face of AI’s rapid rise. 

I’ve been closely following Google’s work in energy this year. Like the rest of the tech industry, the company is seeing ballooning electricity demand in its data centers. That could get in the way of a major goal that Google has been talking about for years. 

See, back in 2020, the company announced an ambitious target: by 2030, it aimed to run on carbon-free energy 24-7. Basically, that means Google would purchase enough carbon-free electricity on the grids where it operates to meet its entire demand, with the purchases matched in time so the power is generated while the company is actually using it. (For more on the nuances of Big Tech’s renewable-energy pledges, check out James’s piece from last year.)

Google’s is an ambitious goal, and on stage, Tian said that the company is still aiming for it but acknowledged that it’s looking tough with the rise of AI. 

“It was always a moonshot,” she said. “It’s something very, very hard to achieve, and it’s only harder in the face of this growth. But our perspective is, if we don’t move in that direction, we’ll never get there.”

Google’s total electricity demand more than doubled from 2020 to 2024, according to its latest Environmental Report. As for that goal of 24-7 carbon-free energy? The company is basically treading water. While it was at 67% for its data centers in 2020, last year it came in at 66%. 

Not going backwards is something of an accomplishment, given the rapid growth in electricity demand. But it still leaves the company some distance away from its finish line.

To close the gap, Google has been signing what feels like constant deals in the energy space. Two recent announcements that Tian talked about on stage were a project involving carbon capture and storage at a natural-gas plant in Illinois and plans to reopen a shuttered nuclear power plant in Iowa. 

Let’s start with carbon capture. Google signed an agreement to purchase most of the electricity from a new natural-gas plant, which will capture and store about 90% of its carbon dioxide emissions. 

That announcement was controversial, with critics arguing that carbon capture keeps fossil-fuel infrastructure online longer and still releases greenhouse gases and other pollutants into the atmosphere. 

One question that James raised on stage: Why build a new natural-gas plant rather than add equipment to an already existing facility? Tacking on equipment to an operational plant would mean cutting emissions from the status quo, rather than adding entirely new fossil-fuel infrastructure. 

The company did consider many existing plants, Tian said. But, as she put it, “Retrofits aren’t going to make sense everywhere.” Space can be limited at existing plants, for example, and many may not have the right geology to store carbon dioxide underground. 

“We wanted to lead with a project that could prove this technology at scale,” Tian said. This site has an operational Class VI well, the type used for permanent sequestration, she added, and it also doesn’t require a big pipeline buildout. 

Tian also touched on the company’s recent announcement that it’s collaborating with NextEra Energy to reopen Duane Arnold Energy Center, a nuclear power plant in Iowa. The company will purchase electricity from that plant, which is scheduled to reopen in 2029. 

As I covered in a story earlier this year, Duane Arnold was basically the final option in the US for companies looking to reopen shuttered nuclear power plants. “Just a few years back, we were still closing down nuclear plants in this country,” Tian said on stage. 

While each reopening will look a little different, Tian highlighted the groups working to restart the Palisades plant in Michigan, which last spring became the first such reopening to be announced. “They’re the real heroes of the story,” she said.

I’m always interested to get a peek behind the curtain at how Big Tech is thinking about energy. I’m skeptical but certainly interested to see how Google’s, and the rest of the industry’s, goals shape up over the next few years. 

Stop worrying about your AI footprint. Look at the big picture instead.

Picture it: I’m minding my business at a party, parked by the snack table (of course). A friend of a friend wanders up, and we strike up a conversation. It quickly turns to work, and upon learning that I’m a climate technology reporter, my new acquaintance says something like: “Should I be using AI? I’ve heard it’s awful for the environment.” 

This actually happens pretty often now. Generally, I tell people not to worry—let a chatbot plan your vacation, suggest recipe ideas, or write you a poem if you want. 

That response might surprise some people, but I promise I’m not living under a rock, and I have seen all the concerning projections about how much electricity AI is using. Data centers could consume up to 945 terawatt-hours annually by 2030. (That’s roughly as much as Japan.) 

But I feel strongly about not putting the onus on individuals, partly because AI concerns remind me so much of another question: “What should I do to reduce my carbon footprint?” 

That one gets under my skin because of the context: BP helped popularize the concept of a carbon footprint in a marketing campaign in the early 2000s. That framing effectively shifts the burden of worrying about the environment from fossil-fuel companies to individuals. 

The reality is, no one person can address climate change alone: Our entire society is built around burning fossil fuels. To address climate change, we need political action and public support for researching and scaling up climate technology. We need companies to innovate and take decisive action to reduce greenhouse-gas emissions. Focusing too much on individuals is a distraction from the real solutions on the table. 

I see something similar today with AI. People are asking climate reporters at barbecues whether they should feel guilty about using chatbots too often, when we should all be focusing on the bigger picture. 

Big tech companies are playing into this narrative by providing energy-use estimates for their products at the user level. A couple of recent reports put the electricity used to query a chatbot at about 0.3 watt-hours, the same as powering a microwave for about a second. That’s so small as to be virtually insignificant.
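The microwave comparison holds up as rough arithmetic. Here's a sketch, assuming a typical microwave draws about 1,100 watts (my assumption, not a figure from the reports):

```python
# Sanity check on the "microwave for about a second" comparison.
wh_per_query = 0.3
joules_per_query = wh_per_query * 3600  # 1 Wh = 3,600 J

# Assumed typical microwave power draw; actual units range
# roughly from 900 to 1,200 W.
assumed_microwave_watts = 1100

seconds = joules_per_query / assumed_microwave_watts
print(round(seconds, 2))  # 0.98, i.e. about a second
```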

But stopping with the energy use of a single query obscures the full truth, which is that this industry is growing quickly, building energy-hungry infrastructure at a nearly incomprehensible scale to satisfy the AI appetites of society as a whole. Meta is currently building a data center in Louisiana with five gigawatts of computational power—about the same demand as the entire state of Maine at the summer peak.  (To learn more, read our Power Hungry series online.)
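The jump from a tiny per-query number to an industry-scale one is just multiplication. A rough illustration, where the daily query volume is an assumed round number chosen for the sake of the math, not a figure any provider has reported:

```python
# Illustrative only: the per-query energy is from the reports cited
# above; the daily query volume is a hypothetical round number, not
# a reported figure.
wh_per_query = 0.3
assumed_queries_per_day = 1_000_000_000  # hypothetical: 1 billion/day

daily_mwh = wh_per_query * assumed_queries_per_day / 1_000_000
annual_gwh = daily_mwh * 365 / 1_000
print(daily_mwh)   # 300.0 MWh per day
print(annual_gwh)  # 109.5 GWh per year
```

Individually negligible queries still add up to utility-scale demand once you multiply by everyone.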

Increasingly, there’s no getting away from AI, and it’s not as simple as choosing to use or not use the technology. Your favorite search engine likely gives you an AI summary at the top of your search results. Your email provider’s suggested replies? Probably AI. Same for chatting with customer service while you’re shopping online. 

Just as with climate change, we need to look at this as a system rather than a series of individual choices. 

Massive tech companies using AI in their products should be disclosing their total energy and water use and going into detail about how they complete their calculations. Estimating the burden per query is a start, but we also deserve to see how these impacts add up for billions of users, and how that’s changing over time as companies (hopefully) make their products more efficient. Lawmakers should be mandating these disclosures, and we should be asking for them, too. 

That’s not to say there’s absolutely no individual action that you can take. Just as you could meaningfully reduce your individual greenhouse-gas emissions by taking fewer flights and eating less meat, there are some reasonable things that you can do to reduce your AI footprint. Generating videos tends to be especially energy-intensive, as does using reasoning models to engage with long prompts and produce long answers. Asking a chatbot to help plan your day, suggest fun activities to do with your family, or summarize a ridiculously long email has relatively minor impact. 

Ultimately, as long as you aren’t relentlessly churning out AI slop, you shouldn’t be too worried about your individual AI footprint. But we should all be keeping our eye on what this industry will mean for our grid, our society, and our planet. 