Google’s still not giving us the full picture on AI energy use

Google just announced that a typical query to its Gemini app uses about 0.24 watt-hours of electricity. That’s about the same as running a microwave for one second—an amount that, to me, feels virtually insignificant. I run my microwave for far more seconds than that on most days.

I was excited to see this report come out, and I welcome more openness from major players in AI about their estimated energy use per query. But I’ve noticed that some folks are taking this number and using it to conclude that we don’t need to worry about AI’s energy demand. That’s not the right takeaway here. Let’s dig into why.

1. This one number doesn’t reflect all queries, and it leaves out cases that likely use much more energy.

Google’s new report considers only text queries. Previous analysis, including MIT Technology Review’s reporting, suggests that generating a photo or video will typically use more electricity.

When I spoke with Jeff Dean, Google’s chief scientist, he said the company doesn’t currently have plans to do this sort of analysis for images and videos, but that he wouldn’t rule it out.

The company started with text prompts, he says, because those are what many people use in their daily lives, while image and video generation is something fewer people are doing. But I’m seeing more AI images and videos all over my social feeds. So there’s a whole world of queries not represented here.

Also, this estimate is the median, meaning half of the queries Google sees use less energy than this figure and half use more. Longer questions and responses can push up the energy demand, and so can using a reasoning model. We don’t know how much energy these more complicated queries demand or what the distribution of the range looks like.

2. We don’t know how many queries Gemini is seeing, so we don’t know the product’s total energy impact.

One of my biggest outstanding questions about Gemini’s energy use is the total number of queries the product is seeing every day. 

This number isn’t included in Google’s report, and the company wouldn’t share it with me. And let me be clear: I absolutely pestered them about this, both in a press call they had about the news and in my interview with Dean. In the press call, the company pointed me to a recent earnings report, which includes only figures about monthly active users (450 million, for what it’s worth).

“We’re not comfortable revealing that for various reasons,” Dean told me on our call. The total number is an abstract measure that changes over time, he says, adding that the company wants users to be thinking about the energy usage per prompt.

But there are people out there all over the world interacting with this technology, not just me—and what we all add up to seems quite relevant.

OpenAI does publicly share its total, recently reporting that it sees 2.5 billion queries to ChatGPT every day. So for the curious, we can take that figure and the company’s self-reported average energy use per query (0.34 watt-hours) to get a rough idea of the total for all people prompting ChatGPT.

According to my math, over the course of a year, that would add up to over 300 gigawatt-hours—the same as powering nearly 30,000 US homes annually. When you put it that way, it starts to sound like a lot of seconds in microwaves.
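
Here’s that math spelled out as a quick Python sketch. The query volume and per-query energy are the self-reported OpenAI figures above; the per-home consumption is my own assumption, based on the EIA’s rough average of about 10,500 kilowatt-hours per US home per year.

```python
# Back-of-envelope: annual energy for all ChatGPT queries,
# using the self-reported figures cited above.
QUERIES_PER_DAY = 2.5e9   # OpenAI's reported daily query volume
WH_PER_QUERY = 0.34       # OpenAI's self-reported average, in watt-hours

daily_wh = QUERIES_PER_DAY * WH_PER_QUERY     # 8.5e8 Wh per day
annual_gwh = daily_wh * 365 / 1e9             # watt-hours -> gigawatt-hours

# Assumption (not from the article): ~10,500 kWh/year per US home, per EIA.
HOME_KWH_PER_YEAR = 10_500
homes = annual_gwh * 1e6 / HOME_KWH_PER_YEAR  # 1 GWh = 1e6 kWh

print(f"~{annual_gwh:.0f} GWh per year, about {homes:,.0f} US homes")
# -> ~310 GWh per year, about 29,548 US homes
```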

3. AI is everywhere, not just in chatbots, and we’re often not even conscious of it.

AI is touching our lives even when we’re not looking for it. AI summaries appear in web searches, whether you ask for them or not. There are built-in features in email and texting applications that can draft or summarize messages for you.

Google’s estimate is strictly for Gemini apps and wouldn’t include many of the other ways that even this one company is using AI. So even if you’re trying to think about your own personal energy demand, it’s increasingly difficult to tally up. 

To be clear, I don’t think people should feel guilty for using tools that they find genuinely helpful. And ultimately, I don’t think the most important conversation is about personal responsibility. 

There’s a tendency right now to focus on the small numbers, but we need to keep in mind what this is all adding up to. Over two gigawatts of natural-gas capacity will need to come online in Louisiana this decade to power a single Meta data center. Google Cloud is spending $25 billion on AI in the PJM grid region on the US East Coast alone. By 2028, AI could account for 326 terawatt-hours of annual electricity demand in the US, generating over 100 million metric tons of carbon dioxide.

We need more reporting from major players in AI, and Google’s recent announcement is one of the most transparent accounts yet. But one small number doesn’t negate the ways this technology is affecting communities and changing our power grid. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

How lidar measures the cost of climate disasters

The wildfires that swept through Los Angeles County in January 2025 left an indelible mark on the Southern California landscape. The Eaton and Palisades fires raged for 24 days, killing 29 people and destroying 16,000 structures, with losses estimated at $60 billion. More than 55,000 acres were consumed, and the landscape itself was physically transformed.

Researchers are now using lidar (light detection and ranging) technology to precisely measure these changes in the landscape’s geometry—helping them understand the effects of climate disasters.

Lidar, which measures how long it takes for pulses of laser light to bounce off surfaces and return, has been used in topographic mapping for decades. Today, airborne lidar from planes and drones maps the Earth’s surface in high detail. Scientists can then “diff” the data—compare before-and-after snapshots and highlight all the changes—to identify more subtle consequences of a disaster, including fault-line shifts, volcanic eruptions, and mudslides.

Falko Kuester, an engineering professor at the University of California, San Diego, co-directs ALERTCalifornia, a public safety program that uses real-time remote sensing to help detect wildfires. Kuester says lidar snapshots can tell a story over time.

“They give us a lay of the land,” he says. “This is what a particular region has been like at this point in time. Now, if you have consecutive flights at a later time, you can do a ‘difference.’ Show me what it looked like. Show me what it looks like. Tell me what changed. Was something constructed? Something burned down? Did something fall down? Did vegetation grow?” 

Shortly after the fires were contained in late January 2025, ALERTCalifornia sponsored new lidar flights over the Eaton and Palisades burn areas. NV5, an inspection and engineering firm, conducted the scans, and the US Geological Survey is now hosting the public data sets.  

Comparing a 2016 lidar snapshot with the January 2025 one, Cassandra Brigham and her team at Arizona State University visualized the elevation changes—revealing the buildings, trees, and other structures that had disappeared.

“We said, what would be a useful product for people to have as quickly as possible, since we’re doing this a couple weeks after the end of the fires?” says Brigham. Her team cleaned and reformatted the older, lower-resolution data and then subtracted it from the newer scan. The resulting visualizations reveal the scale of devastation in ways satellite imagery can’t match. Red shows lost elevation (like when a building burns), and blue shows a gain (such as tree growth or new construction).
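
To make the differencing concrete, here’s a minimal sketch of the idea in Python with NumPy and Matplotlib. The file names are hypothetical, and it assumes the two elevation grids are already aligned to a common grid; real workflows (using tools such as PDAL or rasterio) handle registration, resampling, and noise filtering first.

```python
# Minimal sketch of lidar change detection: subtract the pre-fire digital
# elevation model (DEM) from the post-fire one and map losses vs. gains.
import numpy as np
import matplotlib.pyplot as plt

dem_2016 = np.load("dem_2016.npy")  # hypothetical pre-fire elevation grid (meters)
dem_2025 = np.load("dem_2025.npy")  # hypothetical post-fire elevation grid (meters)

diff = dem_2025 - dem_2016          # negative where elevation was lost

# The "RdBu" colormap renders negative values (losses, like burned
# buildings) in red and positive values (gains, like regrowth) in blue.
plt.imshow(diff, cmap="RdBu", vmin=-15, vmax=15)
plt.colorbar(label="Elevation change (m)")
plt.title("Post-fire minus pre-fire elevation")
plt.show()
```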

Lidar is helping scientists track the cascading effects of climate-­driven disasters—from the damage to structures and vegetation destroyed by wildfires to the landslides and debris flows that often follow in their wake. “For the Eaton and Palisades fires, for example, entire hillsides burned. So all of that vegetation is removed,” Kuester says. “Now you have an atmospheric river coming in, dumping water. What happens next? You have debris flows, mud flows, landslides.” 

Lidar’s usefulness for quantifying the costs of climate disasters underscores its value in preparing for future fires, floods, and earthquakes. But as policymakers weigh steep budget cuts to scientific research, these crucial lidar data collection projects could face an uncertain future.

Jon Keegan writes about technology and AI, and he publishes Beautiful Public Data (beautifulpublicdata.com), a curated collection of government data sets.

Why recycling isn’t enough to address the plastic problem

I remember using a princess toothbrush when I was little. The handle was purple, teal, and sparkly. Like most of the other pieces of plastic that have ever been made, it’s probably still out there somewhere, languishing in a landfill. (I just hope it’s not in the ocean.)

I’ve been thinking about that toothbrush again this week after UN talks about a plastic treaty broke down on Friday. Nations had gotten together to try to write a binding treaty addressing plastic waste, but negotiators left without a deal.

Plastic is widely recognized as a huge source of environmental pollution—again, I’m wondering where that toothbrush is—but the material is also a contributor to climate change. Let’s dig into why talks fell apart and how we might address emissions from plastic.

I’ve defended plastic before in this newsletter (sort of). It’s a wildly useful material, integral to everything from eyeglass lenses to IV bags.

But the pace at which we’re producing and using plastic is absolutely bonkers. Plastic production has increased at an average rate of 9% every year since 1950. Production hit 460 million metric tons in 2019. And an estimated 52 million metric tons are dumped into the environment or burned each year.

So, in March 2022, the UN Environment Assembly set out to develop an international treaty to address plastic pollution. Pretty much everyone should agree that a bunch of plastic waste floating in the ocean is a bad thing. But as the talks developed over the past few years, it became clear that opinions diverge on what to do about the problem and how any interventions should happen.

One phrase that’s become quite contentious is the “full life cycle” of plastic. Basically, some groups are hoping to go beyond efforts to address just the end of the plastic life cycle (collecting and recycling it) by pushing for limits on plastic production. There was even talk at the Assembly of a ban on single-use plastic.

Petroleum-producing nations strongly opposed production limits in the talks. Representatives from Saudi Arabia and Kuwait told the Guardian that they considered limits to plastic production outside the scope of talks. The US reportedly also slowed down talks and proposed to strike a treaty article that references the full life cycle of plastics.

Petrostates have a vested interest because oil, natural gas, and coal are all burned for the energy used to make plastic, and they’re also used as raw materials. This stat surprised me: 12% of global oil demand and over 8% of natural gas demand go to plastic production.

That translates into a lot of greenhouse gas emissions. One report from Lawrence Berkeley National Lab found that plastics production accounted for 2.24 billion metric tons of carbon dioxide emissions in 2019—that’s roughly 5% of the global total.  

And looking into the future, emissions from plastics are only set to grow. Another estimate, from the Organisation for Economic Co-operation and Development, projects that emissions from plastics could swell from about 2 billion metric tons to 4 billion metric tons by 2060.

That projected growth is what really strikes me and makes the conclusion of the plastic treaty talks such a disappointment.

Recycling is a great tool, and new methods could make it possible to recycle more plastics and make it easier to do so. (I’m particularly interested in efforts to recycle a mix of plastics, cutting down on the slow and costly sorting process.)

But just addressing plastic at its end of life won’t be enough to address the climate impacts of the material. Most emissions from plastic come from making it. So we need new ways to make plastic, using different ingredients and fuels to take oil and gas out of the equation. And we need to be smarter about the volume of plastic we produce.  

One positive note here: The plastic treaty isn’t dead, just on hold for the moment. Officials say that there’s going to be an effort to revive the talks.

Less than 10% of plastic that’s ever been produced has been recycled. Whether it’s a water bottle, a polyester shirt you wore a few times, or a princess toothbrush from when you were a kid, it’s still out there somewhere in a landfill or in the environment. Maybe you already knew that. But also consider this: The greenhouse gases emitted to make the plastic are still in the atmosphere, too, contributing to climate change. 

How to make clean energy progress under Trump in the states—blue and red alike

The second Trump administration is proving to be more disastrous for the climate and the clean energy economy than many had feared. 

Donald Trump’s One Big Beautiful Bill Act repealed most of the clean energy incentives in former president Joe Biden’s Inflation Reduction Act. Meanwhile, his EPA administrator moved to revoke the endangerment finding, the legal basis for federal oversight of greenhouse gases. For those of us who have been following policy developments in this area closely, nearly every day brings a new blow to past efforts to salvage our climate and to build the clean energy economy of the future.


Heat Exchange

MIT Technology Review’s guest opinion series, offering expert commentary on legal, political and regulatory issues related to climate change and clean energy. You can read the rest of the pieces here.


This has left many in the climate and clean energy communities wondering: What do we do now? The answer, I would argue, is to return to state capitals—a policymaking venue that climate and renewable energy advocates already know well. This can be done strategically, focusing on a handful of key states rather than all fifty.

But I have another piece of advice: Don’t get too caught up in “red states” versus “blue states” when considering which states to target. American politics is being remade before our eyes, and long-standing policy problems are being redefined and reframed.  

Let’s take clean energy, for example. Yes, shifting away from carbon-spewing resources is about slowing down climate change, and for some this is the single most important motivation for pursuing it. But it also can be about much more. 

The case can be made just as forcefully—and perhaps more effectively—that shifting to clean energy advances affordability at a time when electricity bills are skyrocketing. It promotes energy freedom by resisting monopolistic utilities’ ownership and gatekeeping of the grid. It increases reliability as battery storage reaches new heights and renewable sources and baseload power plants like nuclear or natural gas facilities (some of which we certainly do and will need) increasingly complement one another. And it drives job creation and economic development. 

Talking about clean energy policy in these ways is less vulnerable to ideological criticisms of “climate alarmism.” Research reported in my forthcoming book, Owning the Green Grid, shows that this framing has historically been effective in red states. In addition, using the arguments above to promote all forms of energy can allow clean energy proponents to reclaim a talking point deployed in a previous era by the political right: a true “all-of-the-above” approach to energy policy.

Every energy technology—gas, nuclear, wind, solar, geothermal and storage, among others—has its own set of strengths and weaknesses. But combining them enhances overall grid performance, delivering more than the sum of their individual parts.

To be clear, this is not the approach of the current national administration in Washington, DC. Its policies have picked winners (coal, oil, and natural gas) and losers (solar and wind) among energy technologies—ironically, given conservative claims of blue states having done so in the past. Yet a true all-of-the-above approach can now be sold in state capitals throughout the country, in red states and even in fossil-fuel producing states. 

To be sure, the Trump-led Republican party has taken such extreme measures that it will constrain certain state policymaking possibilities. Notably, in May the US Senate voted to block waivers allowing California to phase out gas guzzlers in the state, over the objections of the Senate parliamentarian. The fiscal power of the federal government is also immense. But there are a variety of other ways to continue to make state-level progress on greenhouse gas emissions.

State and local advocacy efforts are nothing new for the clean energy community. For decades before the Inflation Reduction Act, the states were the primary locus of activity for clean energy policy. But in recent years, some have suggested that Democratic state governments are a necessary prerequisite to making meaningful state-level progress. This view is limiting, and it perpetuates a false—or at least unnecessary—alignment between party and energy technology. 

The electric grid is nonpartisan. Struggling to pay your utility bill is nonpartisan. Keeping the lights on is nonpartisan. Even before renewable energy was as cheap as it is today, early progress at diversifying energy portfolios was made in conservative states. Iowa, Texas, and Montana were all early adopters of renewable portfolio standards. Advocates in such places did not lead with messaging about climate change, but rather about economic development and energy independence. These policy efforts paid off: The deeply red Lone Star State, for instance, generates more wind energy than any other state and ranks only behind California in producing solar power. 

Now, in 2025, advances in technology and improvements in cost should make the economic arguments for clean energy even easier and more salient. So, in the face of a national government that is choosing last century’s energy technologies as policy winners and this century’s technologies as policy losers, the states offer clean energy advocates a familiar terrain on which to make continued progress, if they tailor their selling points to the reality on the ground.         

Joshua A. Basseches is the David and Jane Flowerree Assistant Professor of Environmental Studies and Public Policy at Tulane University. His research focuses on state-level renewable energy politics and policymaking, especially in the electricity sector.

The US could really use an affordable electric truck

On Monday, Ford announced plans for an affordable electric truck with a 2027 delivery date and an expected price tag of about $30,000, thanks in part to a new manufacturing process that it says will help cut costs.

This could be the shot in the arm that the US EV market needs. Sales are slowing, and Ford in particular has struggled recently—the automaker has lost $12 billion on its EV division over the last two and a half years. And the adoption barriers continue to mount, with the Trump administration cutting tax credits as well as rules designed to push automakers toward zero-emissions vehicles. And that’s not to mention tariffs.

But if anything can get Americans excited, it’s a truck, especially an affordable one. (There was a ton of buzz over the announcement of a bare-bones truck from Bezos-backed Slate Auto earlier this year, for example.) The big question is whether the company can deliver in this environment.

One key thing to note here: This is not the first time that there’s been a big splashy truck announcement from Ford that was supposed to change everything. The F-150 Lightning was hailed as a turning point for vehicle electrification, a signal that decarbonization had entered a new era. We cited the truck when we put “The Inevitable EV” on our 10 Breakthrough Technologies list in 2023. 

Things haven’t quite turned out that way. One problem is that the Lightning was supposed to be relatively affordable, with a price tag of about $40,000 when it was first announced in 2021. The starting price inflated to $52,000 when it actually went on sale in 2022.

The truck was initially popular and became quite hard to find at dealerships. But prices climbed and interest leveled off. The base model hit nearly $60,000 by 2023. For the past few years, Ford has cut Lightning production several times and laid off employees who assembled the trucks.

Now, though, Ford is once again promising an affordable truck, and it’s supposed to be even cheaper this time. To help cut costs, the company says it’s simplifying, creating one universal platform for a new set of EVs. Using a common structure and set of components will help produce not only a midsize truck but also other trucks, vans, and SUVs. There are also planned changes to the manufacturing process (rather than one assembly line, multiple lines will join together to form what they’re calling an assembly tree). 

Another supporting factor for cost savings is the battery. The company plans to use lithium-iron phosphate (or LFP) cells—a type of lithium-ion battery that doesn’t contain nickel or cobalt. Leaving out those relatively pricey metals means lower costs.

Side note here: That battery could be surprisingly small. In a media briefing, a Ford official reportedly said that the truck’s battery would be 15% smaller than the one in the Atto 3 crossover from the Chinese automaker BYD. Since that model has a roughly 60-kilowatt-hour pack, that could put this new battery at about 51 kilowatt-hours. That’s only about half the capacity of the Ford Lightning’s standard battery and similar to the smallest pack offered in a Tesla Model 3 today. (This could mean the truck has a relatively limited range, though the company hasn’t shared any details on that front yet.)
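
The pack arithmetic is simple enough to sketch. The 60-kilowatt-hour Atto 3 figure is from the briefing as reported; the roughly 98-kilowatt-hour Lightning standard pack is my assumption for the comparison.

```python
# Rough pack-size math from the reported briefing figures.
byd_atto3_kwh = 60                    # approximate BYD Atto 3 pack size
ford_pack_kwh = byd_atto3_kwh * 0.85  # "15% smaller" -> 51.0 kWh

LIGHTNING_KWH = 98                    # assumed Lightning standard-range pack
print(ford_pack_kwh, ford_pack_kwh / LIGHTNING_KWH)  # 51.0, ~0.52
```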

A string of big promises isn’t unusual for a splashy corporate announcement. What was unusual was the tone from officials during the event on Monday.

As Andrew Hawkins pointed out in The Verge this week, “Ford seems to realize its timing is unfortunate.” During the announcement, executives emphasized that this was a bet, one that might not work out.

CEO Jim Farley put it bluntly: “The automotive industry has a graveyard littered with affordable vehicles that were launched in our country with all good intentions, and they fizzled out with idle plants, laid-off workers, and red ink.” Woof.

From where I’m standing, it’s hard to be optimistic that this announcement will turn out differently from all those failed ones, given where the US EV market is right now.   

In a new report published in June, the energy consultancy BNEF slashed its predictions for future EV uptake. Last year, the organization predicted that 48% of new vehicles sold in the US in 2030 would be electric. In this year’s edition, that number got bumped down to just 27%.

To be clear: BNEF and other organizations are still expecting more EVs on the roads in the future than today, since the vehicles make up less than 10% of new sales in the US. But expectations are way down, in part because of a broad cut in public support for EVs. 

The tax credits that gave drivers up to $7,500 off the purchase of a new EV end in just over a month. Tariffs are going to push costs up even for domestic automakers like Ford, which still rely on imported steel and aluminum.

A revamped manufacturing process and a cheaper, desirable vehicle could be exactly the sort of move that automakers need to make for the US EV market. But I’m skeptical that this truck will be able to turn it all around. 

The greenhouse gases we’re not accounting for

In the spring of 2021, climate scientists were stumped. 

The global economy was just emerging from the covid-19 lockdowns, but for some reason the levels of methane—a greenhouse gas emitted mainly through agriculture and fossil-fuel production—had soared in the atmosphere the previous year, rising at the fastest rate on record.

Researchers around the world set to work unraveling the mystery, reviewing readings from satellites, aircraft, and greenhouse-gas monitoring stations. They eventually spotted a clear pattern: Methane emissions had increased sharply across the tropics, where wetlands were growing wetter and warmer. 

That created the ideal conditions for microbes that thrive in anaerobic muck, which gobbled up more of the carbon-rich organic matter and spat out more methane as a by-product. (Reduced pollution from nitrogen oxides, which help to break down methane in the atmosphere, also likely played a substantial role.)

The findings offer one of the clearest cases so far where climate change itself is driving additional greenhouse-gas emissions from natural systems, triggering a feedback effect that threatens to produce more warming, more emissions, and on and on. 

There are numerous additional ways this is happening or soon could, including wildfires and thawing permafrost. These are major emissions sources that aren’t included in the commitments nations have made under the Paris climate agreement—and climate risks that largely aren’t accounted for in the UN Intergovernmental Panel on Climate Change’s most recent warming scenarios.

Spark Climate Solutions (not to be confused with this newsletter) hopes to change that.

The San Francisco nonprofit is launching what’s known as a model intercomparison project, in which different research teams run the same set of experiments on different models across a variety of emissions scenarios to determine how climate change could play out. This one would specifically explore how a range of climate feedback effects could propel additional warming, additional emissions, and additional types of feedback.

“These increased emissions from natural sources add to human emissions and amplify climate change,” says Phil Duffy, chief scientist at Spark Climate Solutions, who previously served as climate science advisor to President Joe Biden. “And if you don’t look at all of them together, you can’t quantify the strength of that feedback effect.”

Other participants in the effort will include scientists at the Environmental Defense Fund, Stanford University, the Woodwell Climate Research Center, and other institutions in Europe and Australia, according to Spark Climate Solutions.

The nonprofit hopes to publish the findings in time for them to be incorporated into the UN climate panel’s seventh major assessment report, which is just getting underway, to help ensure that these dangers are more fully represented. That, in turn, would give nations a more accurate sense of the world’s carbon budgets: the quantity of greenhouse gases they can produce before the planet warms 1.5 °C or 2 °C over preindustrial levels.

But one thing is already clear: Since the current scenarios don’t fully account for these feedback effects, the world will almost certainly warm faster than is now forecast, which underscores the importance of carrying out this exercise. 

In a paper under review, scientists at EDF, Woodwell, and other institutions found that fires in the world’s northernmost forests, thawing permafrost, and warming tropical wetlands could together push the planet beyond 2 °C years faster, eliminating up to a quarter of the time left before the world passes the core goal of the Paris agreement.

Earlier this year, Spark Climate Solutions set up a broader program to advance research and awareness of what’s known as warming-induced emissions, which will launch additional collaborations similar to the modeling intercomparison project.  

The goal of the program and the research project is “to really mainstream the inclusion of this topic in climate science and climate policy, and to drive research around climate solutions,” says Ben Poulter, who leads the program at Spark Climate Solutions and was previously a scientist at the NASA Goddard Space Flight Center.

Spark notes that warming temperatures could also release more carbon dioxide from the oceans, in a process known as outgassing; additional carbon dioxide and nitrous oxide, a potent greenhouse gas that also depletes the protective ozone layer, from farmland; more carbon dioxide and methane from wildfires; and still more of all three of these gases as permafrost thaws.

The ground remains frozen year round across a vast expanse of the Northern Hemisphere, creating a frosty underground storehouse from Alaska to Siberia that’s packed with twice as much carbon as the atmosphere.

But as it thaws, it starts to decompose and release greenhouse gases, says Susan Natali, an Arctic climate scientist focused on permafrost at Woodwell. A study published in Nature in January noted that 30% of the world’s Arctic–Boreal Zone has already flipped from a carbon sink to a carbon source, when wildfires, thawing permafrost and other factors are taken into account.

Despite these increasing risks, only a minority of the models that fed into the UN climate panel’s last major report incorporated the feedback effects of thawing permafrost. And the emissions risks still weren’t fully accounted for because these ecosystems are difficult to monitor and model, Natali says.

Among the complexities: Wildfires, which are themselves hard to predict, can accelerate thawing. It’s also hard to foresee which regions will grow drier or wetter, which determines whether they release mostly methane or carbon dioxide—and those gases have very different warming effects over different time periods. There are counterbalancing effects that must be taken into account as well—for instance, as carbon-absorbing plants replace ice and snow in certain areas.

Natali says improving our understanding of these complex feedback effects is essential to understanding the dangers we face.

“It’s going to mean additional costs to human health, human life,” she says. “We want people to be safe—and it’s very hard to do that if you don’t know what’s coming and you’re not prepared for it.”

An EPA rule change threatens to gut US climate regulations

This story is part of MIT Technology Review’s “America Undone” series, examining how the foundations of US success in science and innovation are currently under threat. You can read the rest here.

The mechanism that allows the US federal government to regulate climate change is on the chopping block.

On Tuesday, US Environmental Protection Agency administrator Lee Zeldin announced that the agency is taking aim at the endangerment finding, a 2009 rule that’s essentially the tentpole supporting federal greenhouse-gas regulations.

This might sound like an obscure legal situation, but it’s a really big deal for climate policy in the US. So buckle up, and let’s look at what this rule says now, what the proposed change looks like, and what it all means.

To set the stage, we have to go back to the Clean Air Act of 1970, the law that essentially gave the EPA the power to regulate air pollution. (Stick with me—I promise I’ll keep this short and not get too into the legal weeds.)

There were some pollutants explicitly called out in this law and its amendments, including lead and sulfur dioxide. But it also required the EPA to regulate new pollutants found to be harmful. In the late 1990s and early 2000s, environmental groups and states started asking the agency to include greenhouse-gas pollution.

In 2007, the Supreme Court ruled that greenhouse gases qualify as air pollutants under the Clean Air Act, and that the EPA should study whether they’re a danger to public health. In 2009, the incoming Obama administration looked at the science and ruled that greenhouse gases pose a threat to public health because they cause climate change. That’s the endangerment finding, and it’s what allows the agency to pass rules to regulate greenhouse gases.  

The original case and argument were specifically about vehicles and the emissions from tailpipes, but this finding was eventually used to allow the agency to set rules around power plants and factories, too. It essentially underpins climate regulations in the US.

Fast-forward to today, and the Trump administration wants to reverse the endangerment finding. In a proposed rule released on Tuesday, the EPA argues that the Clean Air Act does not, in fact, authorize the agency to set emissions standards to address global climate change. Zeldin, in an appearance on the conservative politics and humor podcast Ruthless that preceded the official announcement, called the proposal the “largest deregulatory action in the history of America.”

The administration was already moving to undermine the climate regulations that rely on this rule. But this move directly targets a “fundamental building block of EPA’s climate policy,” says Deborah Sivas, an environmental-law professor at Stanford University.

The proposed rule will go up for public comment, and the agency will then take that feedback and come up with a final version. It’ll almost certainly get hit with legal challenges and will likely wind up in front of the Supreme Court.

One note here is that the EPA makes a mostly legal argument in the proposed rule reversal rather than focusing on going after the science of climate change, says Madison Condon, an associate law professor at Boston University. That could make it easier for the Supreme Court to eventually uphold it, she says, though this whole process is going to take a while. 

If the endangerment finding goes down, it would have wide-reaching ripple effects. “We could find ourselves in a couple years with no legal tools to try and address climate change,” Sivas says.

To take a step back for a moment, it’s wild that we’ve ended up in this place where a single rule is so central to regulating emissions. US climate policy is held up by duct tape and a dream. Congress could have, at some point, passed a law that more directly allows the EPA to regulate greenhouse-gas emissions (the last time we got close was a 2009 bill that passed the House but never made it to the Senate). But here we are.

This move isn’t a surprise, exactly. The Trump administration has made it very clear that it is going after climate policy in every way that it can. But what’s most striking to me is that we’re not operating in a shared reality anymore when it comes to this subject. 

While top officials tend to acknowledge that climate change is real, there’s often a “but” followed by talking points from climate denial’s list of greatest hits. (One of the more ridiculous examples is the statement that carbon dioxide is good, actually, because it helps plants.) 

Climate change is real, and it’s a threat. And the US has emitted more greenhouse gases into the atmosphere than any other country in the world. It shouldn’t be controversial to expect the government to be doing something about it. 

This startup wants to use the Earth as a massive battery

The Texas-based startup Quidnet Energy just completed a test showing it can store energy for up to six months by pumping water underground.

Using water to store electricity is hardly a new concept—pumped hydropower storage has been around for over a century. But the company hopes its twist on the technology could help bring cheap, long-duration energy storage to new places.

In traditional pumped hydro storage facilities, electric pumps move water uphill, into a natural or manmade body of water. Then, when electricity is needed, that water is released and flows downhill past a turbine, generating electricity. Quidnet’s approach instead pumps water down into impermeable rock formations and keeps it under pressure so it flows up when released. “It’s like pumped hydro, upside down,” says CEO Joe Zhou.
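
For a sense of scale, conventional pumped hydro stores gravitational potential energy, E = mgh. Here’s a rough sketch of how much water a traditional facility would need in order to hold the 35 megawatt-hours Quidnet reports discharging (see below); the 300-meter height difference is an illustrative assumption, and Quidnet’s pressurized-rock approach doesn’t work this way.

```python
# How much water would a conventional pumped hydro plant need to store
# 35 MWh? Potential energy E = m * g * h, so m = E / (g * h).
g = 9.81           # gravitational acceleration, m/s^2
height_m = 300     # assumed elevation difference between reservoirs
target_mwh = 35    # the discharge Quidnet reported, used here for scale

energy_j = target_mwh * 3.6e9        # 1 MWh = 3.6 billion joules
water_kg = energy_j / (g * height_m)
print(f"{water_kg / 1000:,.0f} tonnes of water")
# -> ~42,813 tonnes, i.e. roughly 43,000 cubic meters
```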

Quidnet started a six-month test of its technology in late 2024, pressurizing the system. In June, the company was able to discharge 35 megawatt-hours of energy from the well. There was virtually no self-discharge, meaning the system lost essentially no energy while it sat pressurized, Zhou says.

Inexpensive forms of energy storage that can store electricity for weeks or months could help inconsistent electricity sources like wind and solar go further for the grid. And Quidnet’s approach, which uses commercially available equipment, could be deployed quickly and qualify for federal tax credits to help make it even cheaper.

However, there’s still a big milestone ahead: turning the pressurized water back into electricity. The company is currently building a facility with the turbines and support equipment to do that—all the components are available to purchase from established companies. “We don’t need to invent new things based on what we’ve already developed today,” Zhou says. “We can now start just deploying at very, very substantial scales.”

That process will come with energy losses. Energy storage systems are typically measured by their round-trip efficiency: how much of the electricity that’s put into the system is returned at the end as electricity. Modeling suggests that Quidnet’s technology could reach a maximum efficiency of about 65%, Zhou says, though some design choices made to optimize for economics will likely cause the system to land at roughly 50%.
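
In other words, round-trip efficiency is just energy delivered divided by energy consumed. Here is a toy illustration of the roughly 50% figure, using the June discharge for scale. The input energy is an assumption for illustration, not a number Quidnet has reported.

```python
# Round-trip efficiency = electricity out / electricity in.
energy_out_mwh = 35.0  # the discharge Quidnet reported in June
energy_in_mwh = 70.0   # assumed pumping energy; not from the article

print(f"{energy_out_mwh / energy_in_mwh:.0%}")  # -> 50%
```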

That’s less efficient than lithium-ion batteries, but long-duration systems, if they’re cheap enough, can operate at low efficiencies and still be useful for the grid, says Paul Denholm, a senior research fellow at the National Renewable Energy Laboratory.

“It’s got to be cost-competitive; it all comes down to that,” Denholm says.

Lithium-ion batteries, the fastest-growing technology in energy storage, are the target that new forms of energy storage, like Quidnet’s, must chase. Lithium-ion batteries are about 90% cheaper today than they were 15 years ago. They’ve become a price-competitive alternative to building new natural-gas plants, Denholm says.

When it comes to competing with batteries, one potential differentiator for Quidnet could be government subsidies. While the Trump administration has clawed back funding for clean energy technologies, there’s still an energy storage tax credit, though recently passed legislation added new supply chain restrictions.

Starting in 2026, new energy storage facilities hoping to qualify for tax credits will need to prove that at least 55% of the value of a project’s materials does not come from foreign entities of concern. That rules out sourcing batteries from China, which dominates battery production today. Quidnet has a “high level of domestic content” and expects to qualify for tax credits under the new rules, Zhou says.

The facility Quidnet is building is a project with utility partner CPS Energy, and it should come online in early 2026. 

What role should oil and gas companies play in climate tech?

This week, I have a new story out about Quaise, a geothermal startup that’s trying to commercialize new drilling technology. Using a device called a gyrotron, the company wants to drill deeper, cheaper, in an effort to unlock geothermal power anywhere on the planet. (For all the details, check it out here.) 

For the story, I visited Quaise’s headquarters in Houston. I also took a trip across town to Nabors Industries, Quaise’s investor and tech partner and one of the biggest drilling companies in the world. 

Standing on top of a drilling rig in the backyard of Nabors’s headquarters, I couldn’t stop thinking about the role oil and gas companies are playing in the energy transition. This industry has resources and energy expertise—but also a vested interest in fossil fuels. Can it really be part of addressing climate change?

The relationship between Quaise and Nabors is one that we see increasingly often in climate tech—a startup partnering up with an established company in a similar field. (Another one that comes to mind is in the cement industry, where Sublime Systems has seen a lot of support from legacy players including Holcim, one of the biggest cement companies in the world.) 

Quaise got an early investment from Nabors in 2021, to the tune of $12 million. Nabors now also serves as a technical partner for the startup.

“We are agnostic to what hole we’re drilling,” says Cameron Maresh, a project engineer on the energy transition team at Nabors Industries. The company is working on other investments and projects in the geothermal industry, Maresh says, and the work with Quaise is the culmination of a yearslong collaboration: “We’re just truly excited to see what Quaise can do.”

From the outside, this sort of partnership makes a lot of sense for Quaise. It gets resources and expertise. Meanwhile, Nabors is getting involved with an innovative company that could represent a new direction for geothermal. And maybe more to the point, if fossil fuels are to be phased out, this deal gives the company a stake in next-generation energy production.

There is so much potential for oil and gas companies to play a productive role in addressing climate change. One report from the International Energy Agency examined the role these legacy players could take:  “Energy transitions can happen without the engagement of the oil and gas industry, but the journey to net zero will be more costly and difficult to navigate if they are not on board,” the authors wrote. 

In the agency’s blueprint for what a net-zero emissions energy system could look like in 2050, about 30% of energy could come from sources where the oil and gas industry’s knowledge and resources are useful. That includes hydrogen, liquid biofuels, biomethane, carbon capture, and geothermal. 

But so far, the industry has hardly lived up to its potential as a positive force for the climate. Also in that report, the IEA pointed out that oil and gas producers made up only about 1% of global investment in climate tech in 2022. Investment has ticked up a bit since then, but still, it’s tough to argue that the industry is committed. 

And now that climate tech is falling out of fashion with the government in the US, I’d venture to guess that we’re going to see oil and gas companies increasingly pulling back on their investments and promises. 

BP recently backtracked on previous commitments to cut oil and gas production and invest in clean energy. And last year the company announced that it had written off $1.1 billion in offshore wind investments in 2023 and wanted to sell other wind assets. Shell closed down all its hydrogen fueling stations for vehicles in California last year. (This might not be all that big a loss, since EVs are beating hydrogen by a huge margin in the US, but it’s still worth noting.) 

So oil and gas companies are investing what amounts to pennies and often backtrack when the political winds change direction. And, let’s not forget, fossil-fuel companies have a long history of behaving badly. 

In perhaps the most notorious example, scientists at Exxon modeled climate change in the 1970s, and their forecasts turned out to be quite accurate. Rather than publish that research, the company downplayed how climate change might affect the planet. (For what it’s worth, company representatives have argued that this was less of a coverup and more of an internal discussion that wasn’t fit to be shared outside the company.) 

While fossil fuels are still part of our near-term future, oil and gas companies, and particularly producers, would need to make drastic changes to align with climate goals—changes that wouldn’t be in their financial interest. Few seem inclined to really take the turn needed. 

As the IEA report puts it:  “In practice, no one committed to change should wait for someone else to move first.”

How nonprofits and academia are stepping up to salvage US climate programs

Nonprofits are striving to preserve a US effort to modernize greenhouse-gas measurements, amid growing fears that the Trump administration’s dismantling of federal programs will obscure the nation’s contributions to climate change.

The Data Foundation, a Washington, DC, nonprofit that advocates for open data, is fundraising for an initiative that will coordinate efforts among nonprofits, technical experts, and companies to improve the accuracy and accessibility of climate emissions information. It will build on an effort to improve the collection of emissions data that former president Joe Biden launched in 2023—and which President Trump nullified on his first day in office. 

The initiative will help prioritize responses to changes in federal greenhouse-gas monitoring and measurement programs, but the Data Foundation stresses that it will primarily serve a “long-standing need for coordination” of such efforts outside of government agencies.

The new greenhouse-gas coalition is one of a growing number of nonprofit and academic groups that have spun up or shifted focus to keep essential climate monitoring and research efforts going amid the Trump administration’s assault on environmental funding, staffing, and regulations. Those include efforts to ensure that US scientists can continue to contribute to the UN’s major climate report and publish assessments of the rising domestic risks of climate change. Otherwise, the loss of these programs will make it increasingly difficult for communities to understand how more frequent or severe wildfires, droughts, heat waves, and floods will harm them—and how dire the dangers could become. 

Few believe that nonprofits or private industry can come close to filling the funding holes that the Trump administration is digging. But observers say it’s essential to try to sustain efforts to understand the risks of climate change that the federal government has historically overseen, even if the attempts are merely stopgap measures. 

If we give up these sources of emissions data, “we’re flying blind,” says Rachel Cleetus, senior policy director with the climate and energy program at the Union of Concerned Scientists. “We’re deliberately taking away the very information that would help us understand the problem and how to address it best.”

Improving emissions estimates

The Environmental Protection Agency, the National Oceanic and Atmospheric Administration, the US Forest Service, and other agencies have long collected information about greenhouse gases in a variety of ways. These include self-reporting by industry; shipboard, balloon, and aircraft readings of gas concentrations in the atmosphere; satellite measurements of the carbon dioxide and methane released by wildfires; and on-the-ground measurements of trees. The EPA, in turn, collects and publishes the data from these disparate sources as the Inventory of US Greenhouse Gas Emissions and Sinks.

But that report comes out on a two-year lag, and studies show that some of the estimates it relies on could be way off—particularly the self-reported ones.

A recent analysis using satellites to measure methane pollution from four large landfills found they produce, on average, six times more emissions than the facilities had reported to the EPA. Likewise, a 2018 study in Science found that the actual methane leaks from oil and gas infrastructure were about 60% higher than the self-reported estimates in the agency’s inventory.

The Biden administration’s initiative—the National Strategy to Advance an Integrated US Greenhouse Gas Measurement, Monitoring, and Information System—aimed to adopt state-of-the-art tools and methods to improve the accuracy of these estimates, including satellites and other monitoring technologies that can replace or check self-reported information.

The administration specifically sought to achieve these improvements through partnerships between government, industry, and nonprofits. The initiative called for the data collected across groups to be published to an online portal in formats that would be accessible to policymakers and the public.

Moving toward a system that produces more current and reliable data is essential for understanding the rising risks of climate change and tracking whether industries are abiding by government regulations and voluntary climate commitments, says Ben Poulter, a former NASA scientist who coordinated the Biden administration effort as a deputy director in the Office of Science and Technology Policy.

“Once you have this operational system, you can provide near-real-time information that can help drive climate action,” Poulter says. He is now a senior scientist at Spark Climate Solutions, a nonprofit focused on accelerating emerging methods of combating climate change, and he is advising the Data Foundation’s Climate Data Collaborative, which is overseeing the new greenhouse-gas initiative. 

Slashed staffing and funding  

But the momentum behind the federal strategy deflated when Trump returned to office. On his first day, he signed an executive order that effectively halted it. The White House has since slashed staffing across the agencies at the heart of the effort, sought to shut down specific programs that generate emissions data, and raised uncertainties about the fate of numerous other program components. 

In April, the administration missed a deadline to share the updated greenhouse-gas inventory with the United Nations, for the first time in three decades, as E&E News reported. It eventually did release the report in May, but only after the Environmental Defense Fund filed a Freedom of Information Act request.

There are also indications that the collection of emissions data might be in jeopardy. In March, the EPA said it would “reconsider” the Greenhouse Gas Reporting Program, which requires thousands of power plants, refineries, and other industrial facilities to report emissions each year.

In addition, the tax and spending bill that Trump signed into law earlier this month rescinds provisions in Biden’s Inflation Reduction Act that provided incentives or funding for corporate greenhouse-gas reporting and methane monitoring. 

Meanwhile, the White House has also proposed slashing funding for the National Oceanic and Atmospheric Administration and shuttering a number of its labs. Those include the facility that supports the Mauna Loa Observatory in Hawaii, the world’s longest-running carbon dioxide measuring program, as well as the Global Monitoring Laboratory, which operates a global network of collection flasks that capture air samples used to measure concentrations of nitrous oxide, chlorofluorocarbons, and other greenhouse gases.

Under the latest appropriations negotiations, Congress seems set to spare NOAA and other agencies the full cuts pushed by the Trump administration, but that may or may not protect various climate programs within them. As observers have noted, the loss of experts throughout the federal government, coupled with the priorities set by Trump-appointed leaders of those agencies, could still prevent crucial emissions data from being collected, analyzed, and published.

“That’s a huge concern,” says David Hayes, a professor at the Stanford Doerr School of Sustainability, who previously worked on the effort to upgrade the nation’s emissions measurement and monitoring as special assistant to President Biden for climate policy. It’s not clear “whether they’re going to continue and whether the data availability will drop off.”

‘A natural disaster’

Amid all these cutbacks and uncertainties, those still hoping to make progress toward an improved system for measuring greenhouse gases have had to adjust their expectations: It’s now at least as important to simply preserve or replace existing federal programs as it is to move toward more modern tools and methods.

But Ryan Alexander, executive director of the Data Foundation’s Climate Data Collaborative, is optimistic that there will be opportunities to do both. 

She says the new greenhouse-gas coalition will strive to identify the highest-priority needs and help other nonprofits or companies accelerate the development of new tools or methods. It will also aim to ensure that these organizations avoid replicating one another’s efforts and deliver data with high scientific standards, in open and interoperable formats. 

The Data Foundation declines to say what other nonprofits will be members of the coalition or how much money it hopes to raise, but it plans to make a formal announcement in the coming weeks. 

Nonprofits and companies are already playing a larger role in monitoring emissions, including organizations like Carbon Mapper, which operates satellites and aircraft that detect and measure methane emissions from particular facilities. The EDF also launched a satellite last year, known as MethaneSAT, that could spot large and small sources of emissions—though it lost power earlier this month and probably cannot be recovered. 

Alexander notes that shifting from self-reported figures to observational technology like satellites could not just replace but perhaps also improve on the EPA reporting program that the Trump administration has moved to shut down.

Given the “dramatic changes” brought about by this administration, “the future will not be the past,” she says. “This is like a natural disaster. We can’t think about rebuilding in the way that things have been in the past. We have to look ahead and say, ‘What is needed? What can people afford?’”

Organizations can also use this moment to test and develop emerging technologies that could improve greenhouse-gas measurements, including novel sensors or artificial intelligence tools, Hayes says. 

“We are at a time when we have these new tools, new technologies for measurement, measuring, and monitoring,” he says. “To some extent it’s a new era anyway, so it’s a great time to do some pilot testing here and to demonstrate how we can create new data sets in the climate area.”

Saving scientific contributions

It’s not just the collection of emissions data that nonprofits and academic groups are hoping to save. Notably, the American Geophysical Union and its partners have taken on two additional climate responsibilities that traditionally fell to the federal government.

The US State Department’s Office of Global Change historically coordinated the nation’s contributions to the UN Intergovernmental Panel on Climate Change’s major reports on climate risks, soliciting and nominating US scientists to help write, oversee, or edit sections of the assessments. The US Global Change Research Program, an interagency group that ran much of the process, also covered the cost of trips to a series of in-person meetings with international collaborators. 

But the US government seems to have relinquished any involvement as the IPCC kicks off the process for the Seventh Assessment Report. In late February, the administration blocked federal scientists including NASA’s Katherine Calvin, who was previously selected as a cochair for one of the working groups, from attending an early planning meeting in China. (Calvin was the agency’s chief scientist at the time but was no longer serving in that role as of April, according to NASA’s website.)

The agency didn’t respond to inquiries from interested scientists after the UN panel issued a call for nominations in March, and it failed to present a list of nominations by the deadline in April, scientists involved in the process say. The Trump administration also canceled funding for the Global Change Research Program and, earlier this month, fired the last remaining staffers working at the Office of Global Change.

In response, 10 universities came together in March to form the US Academic Alliance for the IPCC, in partnership with the AGU, to request and evaluate applications from US researchers. The universities—which include Yale, Princeton, and the University of California, San Diego—together nominated nearly 300 scientists, some of whom the IPCC has since officially selected. The AGU is now conducting a fundraising campaign to help pay for travel expenses.

Pamela McElwee, a professor at Rutgers who helped establish the academic coalition, says it’s crucial for US scientists to continue participating in the IPCC process.

“It is our flagship global assessment report on the state of climate, and it plays a really important role in influencing country policies,” she says. “To not be part of it makes it much more difficult for US scientists to be at the cutting edge and advance the things we need to do.” 

The AGU also stepped in two months later, after the White House dismissed hundreds of researchers working on the National Climate Assessment, the congressionally mandated report analyzing the rising dangers of climate change across the country. The AGU and the American Meteorological Society together announced plans to publish a “special collection” to sustain the momentum of that effort.

“It’s incumbent on us to ensure our communities, our neighbors, our children are all protected and prepared for the mounting risks of climate change,” said Brandon Jones, president of the AGU, in an earlier statement.

The AGU declined to discuss the status of the project.

Stopgap solution

The sheer number of programs the White House is going after will require organizations to make hard choices about what they attempt to save and how they go about it. Moreover, relying entirely on nonprofits and companies to take over these federal tasks is not viable over the long term. 

Given the costs of these federal programs, it could prove prohibitive to even keep a minimum viable version of some essential monitoring systems and research programs up and running. Dispersing across various organizations the responsibility of calculating the nation’s emissions sources and sinks also creates concerns about the scientific standards applied and the accessibility of that data, Cleetus says. Plus, moving away from the records that NOAA, NASA, and other agencies have collected for decades would break the continuity of that data, undermining the ability to detect or project trends.

More basically, publishing national emissions data should be a federal responsibility, particularly for the government of the world’s second-largest climate polluter, Cleetus adds. Failing to calculate and share its contributions to climate change sidesteps the nation’s global responsibilities and sends a terrible signal to other countries. 

Poulter stresses that nonprofits and the private sector can do only so much, for so long, to keep these systems up and running.

“We don’t want to give the impression that this greenhouse-gas coalition, if it gets off the ground, is a long-term solution,” he says. “But we can’t afford to have gaps in these data sets, so somebody needs to step in and help sustain those measurements.”