The US could really use an affordable electric truck

On Monday, Ford announced plans for an affordable electric truck with a 2027 delivery date and an expected price tag of about $30,000, thanks in part to a new manufacturing process that it says will help cut costs.

This could be the shot in the arm that the US EV market needs. Sales are slowing, and Ford in particular has struggled recently—its EV division has lost $12 billion over the last two and a half years. And the adoption barriers continue to mount, with the Trump administration cutting tax credits as well as rules designed to push automakers toward zero-emissions vehicles. And that’s not to mention tariffs.

But if anything can get Americans excited, it’s a truck, especially an affordable one. (There was a ton of buzz over the announcement of a bare-bones truck from Bezos-backed Slate Auto earlier this year, for example.) The big question is whether the company can deliver in this environment.

One key thing to note here: This is not the first time Ford has made a big, splashy truck announcement that was supposed to change everything. The F-150 Lightning was hailed as a turning point for vehicle electrification, a signal that decarbonization had entered a new era. We cited the truck when we put “The Inevitable EV” on our 10 Breakthrough Technologies list in 2023.

Things haven’t quite turned out that way. One problem is that the Lightning was supposed to be relatively affordable, with a price tag of about $40,000 when it was first announced in 2021. By the time it actually went on sale in 2022, the starting price had climbed to $52,000.

The truck was initially popular and became quite hard to find at dealerships. But prices climbed and interest leveled off. The base model hit nearly $60,000 by 2023. For the past few years, Ford has cut Lightning production several times and laid off employees who assembled the trucks.

Now, though, Ford is once again promising an affordable truck, and it’s supposed to be even cheaper this time. To help cut costs, the company says it’s simplifying, creating one universal platform for a new set of EVs. Using a common structure and set of components will help produce not only a midsize truck but also other trucks, vans, and SUVs. There are also planned changes to the manufacturing process: rather than one assembly line, multiple lines will join together to form what the company calls an assembly tree.

Another supporting factor for cost savings is the battery. The company plans to use lithium-iron phosphate (or LFP) cells—a type of lithium-ion battery that doesn’t contain nickel or cobalt. Leaving out those relatively pricey metals means lower costs.

Side note here: That battery could be surprisingly small. In a media briefing, a Ford official reportedly said that the truck’s battery would be 15% smaller than the one in the Atto 3 crossover from the Chinese automaker BYD. Since that model has a roughly 60-kilowatt-hour pack, that could put this new battery at about 51 kilowatt-hours. That’s only half the capacity of the Ford Lightning’s battery and similar to the smallest pack offered in a Tesla Model 3 today. (This could mean the truck has a relatively limited range, though the company hasn’t shared any details on that front yet.)
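For a rough sense of that arithmetic, here’s the back-of-the-envelope version. (The pack sizes are the approximations cited above plus a rough Lightning figure—none of these are numbers Ford has confirmed.)

```python
# Back-of-the-envelope battery sizing, using the approximate figures above.
byd_pack_kwh = 60.0   # rough capacity of the BYD crossover's pack
reduction = 0.15      # "15% smaller," per the reported Ford briefing

ford_pack_kwh = byd_pack_kwh * (1 - reduction)
print(f"Estimated pack: {ford_pack_kwh:.0f} kWh")  # ~51 kWh

# For scale: the Lightning's standard-range pack is roughly 98 kWh.
lightning_pack_kwh = 98.0
print(f"Fraction of a Lightning pack: {ford_pack_kwh / lightning_pack_kwh:.0%}")  # ~52%
```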

A string of big promises isn’t unusual for a major company announcement. What was unusual was the tone from officials during Monday’s event.

As Andrew Hawkins pointed out in The Verge this week, “Ford seems to realize its timing is unfortunate.” During the announcement, executives emphasized that this was a bet, one that might not work out.

CEO Jim Farley put it bluntly: “The automotive industry has a graveyard littered with affordable vehicles that were launched in our country with all good intentions, and they fizzled out with idle plants, laid-off workers, and red ink.” Woof.

From where I’m standing, it’s hard to be optimistic that this announcement will turn out differently from all those failed ones, given where the US EV market is right now.   

In a new report published in June, the energy consultancy BNEF slashed its predictions for future EV uptake. Last year, the organization predicted that 48% of new vehicles sold in the US in 2030 would be electric. In this year’s edition, that number got bumped down to just 27%.

To be clear: BNEF and other organizations are still expecting more EVs on the roads in the future than today, since the vehicles make up less than 10% of new sales in the US. But expectations are way down, in part because of a broad cut in public support for EVs. 

The tax credits that gave drivers up to $7,500 off the purchase of a new EV end in just over a month. Tariffs are going to push costs up even for domestic automakers like Ford, which still rely on imported steel and aluminum.

A revamped manufacturing process and a cheaper, desirable vehicle could be exactly the sort of move that automakers need to make for the US EV market. But I’m skeptical that this truck will be able to turn it all around. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

The greenhouse gases we’re not accounting for

In the spring of 2021, climate scientists were stumped. 

The global economy was just emerging from the covid-19 lockdowns, but for some reason the levels of methane—a greenhouse gas emitted mainly through agriculture and fossil-fuel production—had soared in the atmosphere the previous year, rising at the fastest rate on record.

Researchers around the world set to work unraveling the mystery, reviewing readings from satellites, aircraft, and greenhouse-gas monitoring stations. They eventually spotted a clear pattern: Methane emissions had increased sharply across the tropics, where wetlands were growing wetter and warmer. 

That created the ideal conditions for microbes that thrive in anaerobic muck, which gobbled up more of the carbon-rich organic matter and spat out more methane as a by-product. (Reduced pollution from nitrogen oxides, which help to break down methane in the atmosphere, also likely played a substantial role.)

The findings offer one of the clearest cases so far where climate change itself is driving additional greenhouse-gas emissions from natural systems, triggering a feedback effect that threatens to produce more warming, more emissions, and on and on. 

There are numerous additional ways this is happening or soon could, including wildfires and thawing permafrost. These are major emissions sources that aren’t included in the commitments nations have made under the Paris climate agreement—and climate risks that largely aren’t accounted for in the UN Intergovernmental Panel on Climate Change’s most recent warming scenarios.

Spark Climate Solutions (not to be confused with this newsletter) hopes to change that.

The San Francisco nonprofit is launching what’s known as a model intercomparison project, in which different research teams run the same set of experiments on different models across a variety of emissions scenarios to determine how climate change could play out. This one would specifically explore how a range of climate feedback effects could propel additional warming, additional emissions, and additional types of feedback.

“These increased emissions from natural sources add to human emissions and amplify climate change,” says Phil Duffy, chief scientist at Spark Climate Solutions, who previously served as climate science advisor to President Joe Biden. “And if you don’t look at all of them together, you can’t quantify the strength of that feedback effect.”

Other participants in the effort will include scientists at the Environmental Defense Fund, Stanford University, the Woodwell Climate Research Center, and other institutions in Europe and Australia, according to Spark Climate Solutions.

The nonprofit hopes to publish the findings in time for them to be incorporated into the UN climate panel’s seventh major assessment report, which is just getting underway, to help ensure that these dangers are more fully represented. That, in turn, would give nations a more accurate sense of the world’s carbon budgets, or the quantity of greenhouse gases they can produce before the planet reaches temperatures 1.5 °C or 2 °C over preindustrial levels.

But one thing is already clear: Since the current scenarios don’t fully account for these feedback effects, the world will almost certainly warm faster than is now forecast, which underscores the importance of carrying out this exercise. 

In a paper under review, scientists at EDF, Woodwell, and other institutions found that fires in the world’s northernmost forests, thawing permafrost, and warming tropical wetlands could together push the planet beyond 2 °C years faster, eliminating up to a quarter of the time left before the world passes the core goal of the Paris agreement.

Earlier this year, Spark Climate Solutions set up a broader program to advance research and awareness of what’s known as warming-induced emissions, which will launch additional collaborations similar to the model intercomparison project.

The goal of the program and the research project is “to really mainstream the inclusion of this topic in climate science and climate policy, and to drive research around climate solutions,” says Ben Poulter, who leads the program at Spark Climate Solutions and was previously a scientist at the NASA Goddard Space Flight Center.

Spark notes that warming temperatures could also release more carbon dioxide from the oceans, in a process known as outgassing; additional carbon dioxide and nitrous oxide, a potent greenhouse gas that also depletes the protective ozone layer, from farmland; more carbon dioxide and methane from wildfires; and still more of all three of these gases as permafrost thaws.

The ground remains frozen year round across a vast expanse of the Northern Hemisphere, creating a frosty underground storehouse from Alaska to Siberia that’s packed with twice as much carbon as the atmosphere.

But as it thaws, the organic matter inside starts to decompose and release greenhouse gases, says Susan Natali, an Arctic climate scientist focused on permafrost at Woodwell. A study published in Nature in January noted that 30% of the world’s Arctic–Boreal Zone has already flipped from a carbon sink to a carbon source, when wildfires, thawing permafrost, and other factors are taken into account.

Despite these increasing risks, only a minority of the models that fed into the UN climate panel’s last major report incorporated the feedback effects of thawing permafrost. And the emissions risks still weren’t fully accounted for because these ecosystems are difficult to monitor and model, Natali says.

Among the complexities: Wildfires, which are themselves hard to predict, can accelerate thawing. It’s also hard to foresee which regions will grow drier or wetter, which determines whether they release mostly methane or carbon dioxide—and those gases have very different warming effects over different time periods. There are counterbalancing effects that must be taken into account as well—for instance, as carbon-absorbing plants replace ice and snow in certain areas.

Natali says improving our understanding of these complex feedback effects is essential to understanding the dangers we face.

“It’s going to mean additional costs to human health, human life,” she says. “We want people to be safe—and it’s very hard to do that if you don’t know what’s coming and you’re not prepared for it.”

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

An EPA rule change threatens to gut US climate regulations

This story is part of MIT Technology Review’s “America Undone” series, examining how the foundations of US success in science and innovation are currently under threat. You can read the rest here.

The mechanism that allows the US federal government to regulate climate change is on the chopping block.

On Tuesday, US Environmental Protection Agency administrator Lee Zeldin announced that the agency is taking aim at the endangerment finding, a 2009 rule that’s essentially the tentpole supporting federal greenhouse-gas regulations.

This might sound like an obscure legal situation, but it’s a really big deal for climate policy in the US. So buckle up, and let’s look at what this rule says now, what the proposed change looks like, and what it all means.

To set the stage, we have to go back to the Clean Air Act of 1970, the law that essentially gave the EPA the power to regulate air pollution. (Stick with me—I promise I’ll keep this short and not get too into the legal weeds.)

There were some pollutants explicitly called out in this law and its amendments, including lead and sulfur dioxide. But it also required the EPA to regulate new pollutants that were found to be harmful. In the late 1990s and early 2000s, environmental groups and states started asking the agency to include greenhouse-gas pollution.

In 2007, the Supreme Court ruled that greenhouse gases qualify as air pollutants under the Clean Air Act, and that the EPA should study whether they’re a danger to public health. In 2009, the EPA under the incoming Obama administration looked at the science and found that greenhouse gases pose a threat to public health because they cause climate change. That’s the endangerment finding, and it’s what allows the agency to pass rules to regulate greenhouse gases.

The original case and argument were specifically about vehicles and the emissions from tailpipes, but this finding was eventually used to allow the agency to set rules around power plants and factories, too. It essentially underpins climate regulations in the US.

Fast-forward to today, and the Trump administration wants to reverse the endangerment finding. In a proposed rule released on Tuesday, the EPA argues that the Clean Air Act does not, in fact, authorize the agency to set emissions standards to address global climate change. Zeldin, in an appearance on the conservative politics and humor podcast Ruthless that preceded the official announcement, called the proposal the “largest deregulatory action in the history of America.”

The administration was already moving to undermine the climate regulations that rely on this rule. But this move directly targets a “fundamental building block of EPA’s climate policy,” says Deborah Sivas, an environmental-law professor at Stanford University.

The proposed rule will go up for public comment, and the agency will then take that feedback and come up with a final version. It’ll almost certainly get hit with legal challenges and will likely wind up in front of the Supreme Court.

One note here: in the proposed reversal, the EPA makes a mostly legal argument rather than going after the science of climate change, says Madison Condon, an associate law professor at Boston University. That could make it easier for the Supreme Court to eventually uphold it, she says, though this whole process is going to take a while.

If the endangerment finding goes down, it would have wide-reaching ripple effects. “We could find ourselves in a couple years with no legal tools to try and address climate change,” Sivas says.

To take a step back for a moment, it’s wild that we’ve ended up in this place where a single rule is so central to regulating emissions. US climate policy is held up by duct tape and a dream. Congress could have, at some point, passed a law that more directly allows the EPA to regulate greenhouse-gas emissions (the last time we got close was a 2009 bill that passed the House but never made it through the Senate). But here we are.

This move isn’t a surprise, exactly. The Trump administration has made it very clear that it is going after climate policy in every way that it can. But what’s most striking to me is that we’re not operating in a shared reality anymore when it comes to this subject. 

While top officials tend to acknowledge that climate change is real, there’s often a “but” followed by talking points from climate denial’s list of greatest hits. (One of the more ridiculous examples is the statement that carbon dioxide is good, actually, because it helps plants.) 

Climate change is real, and it’s a threat. And the US has emitted more greenhouse gases into the atmosphere than any other country in the world. It shouldn’t be controversial to expect the government to be doing something about it. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

This startup wants to use the Earth as a massive battery

The Texas-based startup Quidnet Energy just completed a test showing it can store energy for up to six months by pumping water underground.

Using water to store electricity is hardly a new concept—pumped hydropower storage has been around for over a century. But the company hopes its twist on the technology could help bring cheap, long-duration energy storage to new places.

In traditional pumped hydro storage facilities, electric pumps move water uphill, into a natural or manmade body of water. Then, when electricity is needed, that water is released and flows downhill past a turbine, generating electricity. Quidnet’s approach instead pumps water down into impermeable rock formations and keeps it under pressure so it flows up when released. “It’s like pumped hydro, upside down,” says CEO Joe Zhou.
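For intuition on the physics being flipped here, a minimal sketch of the energy math behind conventional pumped hydro (the site numbers are illustrative, not Quidnet’s):

```python
# Gravitational energy stored by pumping water uphill: E = m * g * h
g = 9.81                    # gravity, m/s^2
head_m = 300                # illustrative height difference between reservoirs
volume_m3 = 1_000_000       # illustrative upper-reservoir volume
mass_kg = volume_m3 * 1000  # water is about 1,000 kg per cubic meter

energy_mwh = mass_kg * g * head_m / 3.6e9  # 1 MWh = 3.6e9 joules
print(f"{energy_mwh:.0f} MWh")             # ~820 MWh for this illustrative site

# Quidnet swaps the height term for rock pressure: water forced into an
# impermeable formation stays pressurized until it's released back up the well.
```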

Quidnet started a six-month test of its technology in late 2024, pressurizing the system. In June, the company was able to discharge 35 megawatt-hours of energy from the well. There was virtually no self-discharge, meaning the system lost essentially no energy while it sat pressurized, Zhou says.

Inexpensive forms of energy storage that can store electricity for weeks or months could help inconsistent electricity sources like wind and solar go further for the grid. And Quidnet’s approach, which uses commercially available equipment, could be deployed quickly and qualify for federal tax credits to help make it even cheaper.

However, there’s still a big milestone ahead: turning the pressurized water back into electricity. The company is currently building a facility with the turbines and support equipment to do that—all the components are available to purchase from established companies. “We don’t need to invent new things based on what we’ve already developed today,” Zhou says. “We can now start just deploying at very, very substantial scales.”

That process will come with energy losses. Energy storage systems are typically measured by their round-trip efficiency: how much of the electricity that’s put into the system is returned at the end as electricity. Modeling suggests that Quidnet’s technology could reach a maximum efficiency of about 65%, Zhou says, though some design choices made to optimize for economics will likely cause the system to land at roughly 50%.
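Round-trip efficiency is simply electricity out divided by electricity in. Here’s a sketch of what those numbers imply—pairing the test well’s 35-megawatt-hour discharge with the projected efficiencies is my illustration, not a company figure:

```python
# Charging electricity implied by a given discharge at each round-trip efficiency.
def charging_energy_mwh(delivered_mwh: float, round_trip_eff: float) -> float:
    """Electricity you must put in to get `delivered_mwh` back out."""
    return delivered_mwh / round_trip_eff

delivered = 35.0          # MWh, the discharge from Quidnet's test well
for eff in (0.65, 0.50):  # modeled maximum vs. expected economic design point
    needed = charging_energy_mwh(delivered, eff)
    print(f"{eff:.0%} round trip -> about {needed:.0f} MWh of input")
# 65% -> ~54 MWh in; 50% -> 70 MWh in
```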

That’s less efficient than lithium-ion batteries, but long-duration systems, if they’re cheap enough, can operate at low efficiencies and still be useful for the grid, says Paul Denholm, a senior research fellow at the National Renewable Energy Laboratory.

“It’s got to be cost-competitive; it all comes down to that,” Denholm says.

Lithium-ion batteries, the fastest-growing technology in energy storage, are the target that new forms of energy storage, like Quidnet’s, must chase. Lithium-ion batteries are about 90% cheaper today than they were 15 years ago. They’ve become a price-competitive alternative to building new natural-gas plants, Denholm says.

When it comes to competing with batteries, one potential differentiator for Quidnet could be government subsidies. While the Trump administration has clawed back funding for clean energy technologies, there’s still an energy storage tax credit, though recently passed legislation added new supply chain restrictions.

Starting in 2026, new energy storage facilities hoping to qualify for tax credits will need to prove that at least 55% of the value of a project’s materials is not from foreign entities of concern. That rules out sourcing batteries from China, which dominates battery production today. Quidnet has a “high level of domestic content” and expects to qualify for tax credits under the new rules, Zhou says.

The facility Quidnet is building is a project with utility partner CPS Energy, and it should come online in early 2026. 

What role should oil and gas companies play in climate tech?

This week, I have a new story out about Quaise, a geothermal startup that’s trying to commercialize new drilling technology. Using a device called a gyrotron, the company wants to drill deeper, cheaper, in an effort to unlock geothermal power anywhere on the planet. (For all the details, check it out here.) 

For the story, I visited Quaise’s headquarters in Houston. I also took a trip across town to Nabors Industries, Quaise’s investor and tech partner and one of the biggest drilling companies in the world. 

Standing on top of a drilling rig in the backyard of Nabors’s headquarters, I couldn’t stop thinking about the role oil and gas companies are playing in the energy transition. This industry has resources and energy expertise—but also a vested interest in fossil fuels. Can it really be part of addressing climate change?

The relationship between Quaise and Nabors is one that we see increasingly often in climate tech—a startup partnering up with an established company in a similar field. (Another one that comes to mind is in the cement industry, where Sublime Systems has seen a lot of support from legacy players including Holcim, one of the biggest cement companies in the world.) 

Quaise got an early investment from Nabors in 2021, to the tune of $12 million. Nabors now also serves as a technical partner for the startup.

“We are agnostic to what hole we’re drilling,” says Cameron Maresh, a project engineer on the energy transition team at Nabors Industries. The company is working on other investments and projects in the geothermal industry, Maresh says, and the work with Quaise is the culmination of a yearslong collaboration: “We’re just truly excited to see what Quaise can do.”

From the outside, this sort of partnership makes a lot of sense for Quaise. It gets resources and expertise. Meanwhile, Nabors is getting involved with an innovative company that could represent a new direction for geothermal. And maybe more to the point, if fossil fuels are to be phased out, this deal gives the company a stake in next-generation energy production.

There is so much potential for oil and gas companies to play a productive role in addressing climate change. One report from the International Energy Agency examined the role these legacy players could take: “Energy transitions can happen without the engagement of the oil and gas industry, but the journey to net zero will be more costly and difficult to navigate if they are not on board,” the authors wrote.

In the agency’s blueprint for what a net-zero emissions energy system could look like in 2050, about 30% of energy could come from sources where the oil and gas industry’s knowledge and resources are useful. That includes hydrogen, liquid biofuels, biomethane, carbon capture, and geothermal. 

But so far, the industry has hardly lived up to its potential as a positive force for the climate. Also in that report, the IEA pointed out that oil and gas producers made up only about 1% of global investment in climate tech in 2022. Investment has ticked up a bit since then, but still, it’s tough to argue that the industry is committed. 

And now that climate tech is falling out of fashion with the government in the US, I’d venture to guess that we’re going to see oil and gas companies increasingly pulling back on their investments and promises. 

BP recently backtracked on previous commitments to cut oil and gas production and invest in clean energy. And last year the company announced that it had written off $1.1 billion in offshore wind investments in 2023 and wanted to sell other wind assets. Shell closed down all its hydrogen fueling stations for vehicles in California last year. (This might not be all that big a loss, since EVs are beating hydrogen by a huge margin in the US, but it’s still worth noting.) 

So oil and gas companies are investing what amounts to pennies and often backtrack when the political winds change direction. And, let’s not forget, fossil-fuel companies have a long history of behaving badly. 

In perhaps the most notorious example, scientists at Exxon modeled climate change in the 1970s, and their forecasts turned out to be quite accurate. Rather than publish that research, the company downplayed how climate change might affect the planet. (For what it’s worth, company representatives have argued that this was less of a coverup and more of an internal discussion that wasn’t fit to be shared outside the company.) 

While fossil fuels are still part of our near-term future, oil and gas companies, and particularly producers, would need to make drastic changes to align with climate goals—changes that wouldn’t be in their financial interest. Few seem inclined to really take the turn needed. 

As the IEA report puts it: “In practice, no one committed to change should wait for someone else to move first.”

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

How nonprofits and academia are stepping up to salvage US climate programs

Nonprofits are striving to preserve a US effort to modernize greenhouse-gas measurements, amid growing fears that the Trump administration’s dismantling of federal programs will obscure the nation’s contributions to climate change.

The Data Foundation, a Washington, DC, nonprofit that advocates for open data, is fundraising for an initiative that will coordinate efforts among nonprofits, technical experts, and companies to improve the accuracy and accessibility of climate emissions information. It will build on an effort to improve the collection of emissions data that former president Joe Biden launched in 2023—and which President Trump nullified on his first day in office. 

The initiative will help prioritize responses to changes in federal greenhouse-gas monitoring and measurement programs, but the Data Foundation stresses that it will primarily serve a “long-standing need for coordination” of such efforts outside of government agencies.

The new greenhouse-gas coalition is one of a growing number of nonprofit and academic groups that have spun up or shifted focus to keep essential climate monitoring and research efforts going amid the Trump administration’s assault on environmental funding, staffing, and regulations. Those include efforts to ensure that US scientists can continue to contribute to the UN’s major climate report and publish assessments of the rising domestic risks of climate change. Otherwise, the loss of these programs will make it increasingly difficult for communities to understand how more frequent or severe wildfires, droughts, heat waves, and floods will harm them—and how dire the dangers could become. 

Few believe that nonprofits or private industry can come close to filling the funding holes that the Trump administration is digging. But observers say it’s essential to try to sustain efforts to understand the risks of climate change that the federal government has historically overseen, even if the attempts are merely stopgap measures. 

If we give up these sources of emissions data, “we’re flying blind,” says Rachel Cleetus, senior policy director with the climate and energy program at the Union of Concerned Scientists. “We’re deliberately taking away the very information that would help us understand the problem and how to address it best.”

Improving emissions estimates

The Environmental Protection Agency, the National Oceanic and Atmospheric Administration, the US Forest Service, and other agencies have long collected information about greenhouse gases in a variety of ways. These include self-reporting by industry; shipboard, balloon, and aircraft readings of gas concentrations in the atmosphere; satellite measurements of the carbon dioxide and methane released by wildfires; and on-the-ground measurements of trees. The EPA, in turn, collects and publishes the data from these disparate sources as the Inventory of US Greenhouse Gas Emissions and Sinks.

But that report comes out on a two-year lag, and studies show that some of the estimates it relies on could be way off—particularly the self-reported ones.

A recent analysis using satellites to measure methane pollution from four large landfills found they produce, on average, six times more emissions than the facilities had reported to the EPA. Likewise, a 2018 study in Science found that the actual methane leaks from oil and gas infrastructure were about 60% higher than the self-reported estimates in the agency’s inventory.

The Biden administration’s initiative—the National Strategy to Advance an Integrated US Greenhouse Gas Measurement, Monitoring, and Information System—aimed to adopt state-of-the-art tools and methods to improve the accuracy of these estimates, including satellites and other monitoring technologies that can replace or check self-reported information.

The administration specifically sought to achieve these improvements through partnerships between government, industry, and nonprofits. The initiative called for the data collected across groups to be published to an online portal in formats that would be accessible to policymakers and the public.

Moving toward a system that produces more current and reliable data is essential for understanding the rising risks of climate change and tracking whether industries are abiding by government regulations and voluntary climate commitments, says Ben Poulter, a former NASA scientist who coordinated the Biden administration effort as a deputy director in the Office of Science and Technology Policy.

“Once you have this operational system, you can provide near-real-time information that can help drive climate action,” Poulter says. He is now a senior scientist at Spark Climate Solutions, a nonprofit focused on accelerating emerging methods of combating climate change, and he is advising the Data Foundation’s Climate Data Collaborative, which is overseeing the new greenhouse-gas initiative. 

Slashed staffing and funding  

But the momentum behind the federal strategy deflated when Trump returned to office. On his first day, he signed an executive order that effectively halted it. The White House has since slashed staffing across the agencies at the heart of the effort, sought to shut down specific programs that generate emissions data, and raised uncertainties about the fate of numerous other program components. 

For the first time in three decades, the administration missed an April deadline to share the updated greenhouse-gas inventory with the United Nations, as E&E News reported. It eventually did release the report in May, but only after the Environmental Defense Fund filed a Freedom of Information Act request.

There are also indications that the collection of emissions data might be in jeopardy. In March, the EPA said it would “reconsider” the Greenhouse Gas Reporting Program, which requires thousands of power plants, refineries, and other industrial facilities to report emissions each year.

In addition, the tax and spending bill that Trump signed into law earlier this month rescinds provisions in Biden’s Inflation Reduction Act that provided incentives or funding for corporate greenhouse-gas reporting and methane monitoring. 

Meanwhile, the White House has also proposed slashing funding for the National Oceanic and Atmospheric Administration and shuttering a number of its labs. Those include the facility that supports the Mauna Loa Observatory in Hawaii, the world’s longest-running carbon dioxide measuring program, as well as the Global Monitoring Laboratory, which operates a global network of collection flasks that capture air samples used to measure concentrations of nitrous oxide, chlorofluorocarbons, and other greenhouse gases.

Under the latest appropriations negotiations, Congress seems set to spare NOAA and other agencies the full cuts pushed by the Trump administration, but that may or may not protect various climate programs within them. As observers have noted, the loss of experts throughout the federal government, coupled with the priorities set by Trump-appointed leaders of those agencies, could still prevent crucial emissions data from being collected, analyzed, and published.

“That’s a huge concern,” says David Hayes, a professor at the Stanford Doerr School of Sustainability, who previously worked on the effort to upgrade the nation’s emissions measurement and monitoring as special assistant to President Biden for climate policy. It’s not clear “whether they’re going to continue and whether the data availability will drop off.”

‘A natural disaster’

Amid all these cutbacks and uncertainties, those still hoping to make progress toward an improved system for measuring greenhouse gases have had to adjust their expectations: It’s now at least as important to simply preserve or replace existing federal programs as it is to move toward more modern tools and methods.

But Ryan Alexander, executive director of the Data Foundation’s Climate Data Collaborative, is optimistic that there will be opportunities to do both. 

She says the new greenhouse-gas coalition will strive to identify the highest-priority needs and help other nonprofits or companies accelerate the development of new tools or methods. It will also aim to ensure that these organizations avoid replicating one another’s efforts and deliver data with high scientific standards, in open and interoperable formats. 

The Data Foundation declines to say what other nonprofits will be members of the coalition or how much money it hopes to raise, but it plans to make a formal announcement in the coming weeks. 

Nonprofits and companies are already playing a larger role in monitoring emissions, including organizations like Carbon Mapper, which operates satellites and aircraft that detect and measure methane emissions from particular facilities. The EDF also launched a satellite last year, known as MethaneSAT, that could spot large and small sources of emissions—though it lost power earlier this month and probably cannot be recovered. 

Alexander notes that shifting from self-reported figures to observational technology like satellites could not just replace but perhaps also improve on the EPA reporting program that the Trump administration has moved to shut down.

Given the “dramatic changes” brought about by this administration, “the future will not be the past,” she says. “This is like a natural disaster. We can’t think about rebuilding in the way that things have been in the past. We have to look ahead and say, ‘What is needed? What can people afford?’”

Organizations can also use this moment to test and develop emerging technologies that could improve greenhouse-gas measurements, including novel sensors or artificial intelligence tools, Hayes says. 

“We are at a time when we have these new tools, new technologies for measurement, measuring, and monitoring,” he says. “To some extent it’s a new era anyway, so it’s a great time to do some pilot testing here and to demonstrate how we can create new data sets in the climate area.”

Saving scientific contributions

It’s not just the collection of emissions data that nonprofits and academic groups are hoping to save. Notably, the American Geophysical Union and its partners have taken on two additional climate responsibilities that traditionally fell to the federal government.

The US State Department’s Office of Global Change historically coordinated the nation’s contributions to the UN Intergovernmental Panel on Climate Change’s major reports on climate risks, soliciting and nominating US scientists to help write, oversee, or edit sections of the assessments. The US Global Change Research Program, an interagency group that ran much of the process, also covered the cost of trips to a series of in-person meetings with international collaborators. 

But the US government seems to have relinquished any involvement as the IPCC kicks off the process for the Seventh Assessment Report. In late February, the administration blocked federal scientists including NASA’s Katherine Calvin, who was previously selected as a cochair for one of the working groups, from attending an early planning meeting in China. (Calvin was the agency’s chief scientist at the time but was no longer serving in that role as of April, according to NASA’s website.)

The agency didn’t respond to inquiries from interested scientists after the UN panel issued a call for nominations in March, and it failed to present a list of nominations by the deadline in April, scientists involved in the process say. The Trump administration also canceled funding for the Global Change Research Program and, earlier this month, fired the last remaining staffers working at the Office of Global Change.

In response, 10 universities came together in March to form the US Academic Alliance for the IPCC, in partnership with the AGU, to request and evaluate applications from US researchers. The universities—which include Yale, Princeton, and the University of California, San Diego—together nominated nearly 300 scientists, some of whom the IPCC has since officially selected. The AGU is now conducting a fundraising campaign to help pay for travel expenses.

Pamela McElwee, a professor at Rutgers who helped establish the academic coalition, says it’s crucial for US scientists to continue participating in the IPCC process.

“It is our flagship global assessment report on the state of climate, and it plays a really important role in influencing country policies,” she says. “To not be part of it makes it much more difficult for US scientists to be at the cutting edge and advance the things we need to do.” 

The AGU also stepped in two months later, after the White House dismissed hundreds of researchers working on the National Climate Assessment, a recurring federal report analyzing the rising dangers of climate change across the country. The AGU and the American Meteorological Society together announced plans to publish a “special collection” to sustain the momentum of that effort.

“It’s incumbent on us to ensure our communities, our neighbors, our children are all protected and prepared for the mounting risks of climate change,” said Brandon Jones, president of the AGU, in an earlier statement.

The AGU declined to discuss the status of the project.

Stopgap solution

The sheer number of programs the White House is going after will require organizations to make hard choices about what they attempt to save and how they go about it. Moreover, relying entirely on nonprofits and companies to take over these federal tasks is not viable over the long term. 

Given the costs of these federal programs, it could prove prohibitive to even keep a minimum viable version of some essential monitoring systems and research programs up and running. Dispersing across various organizations the responsibility of calculating the nation’s emissions sources and sinks also creates concerns about the scientific standards applied and the accessibility of that data, Cleetus says. Plus, moving away from the records that NOAA, NASA, and other agencies have collected for decades would break the continuity of that data, undermining the ability to detect or project trends.

More basically, publishing national emissions data should be a federal responsibility, particularly for the government of the world’s second-largest climate polluter, Cleetus adds. Failing to calculate and share its contributions to climate change sidesteps the nation’s global responsibilities and sends a terrible signal to other countries. 

Poulter stresses that nonprofits and the private sector can do only so much, for so long, to keep these systems up and running.

“We don’t want to give the impression that this greenhouse-gas coalition, if it gets off the ground, is a long-term solution,” he says. “But we can’t afford to have gaps in these data sets, so somebody needs to step in and help sustain those measurements.”

This startup wants to use beams of energy to drill geothermal wells

A beam of energy hit the slab of rock, which quickly began to glow. Pieces cracked off, sparks ricocheted, and dust whirled around under a blast of air. 

From inside a modified trailer, I peeked through the window as a millimeter-wave drilling rig attached to an unassuming box truck melted a hole into a piece of basalt in less than two minutes. After the test was over, I stepped out of the trailer into the Houston heat. I could see a ring of black, glassy material stamped into the slab fragments, evidence of where the rock had melted.  

This rock-melting drilling technology from the geothermal startup Quaise is certainly unconventional. The company hopes it’s the key to unlocking geothermal energy and making it feasible anywhere.

Geothermal power tends to work best in those parts of the world that have the right geology and heat close to the surface. Iceland and the western US, for example, are hot spots for this always-available renewable energy source because they have all the necessary ingredients. But by digging deep enough, companies could theoretically tap into the Earth’s heat from anywhere on the globe.

That’s a difficult task, though. In some places, accessing temperatures high enough to efficiently generate electricity would require drilling miles and miles beneath the surface. Often, that would mean going through very hard rock, like granite.

Quaise’s proposed solution is a new mode of drilling that eschews the traditional technique of scraping into rock with a hard drill bit. Instead, the company plans to use a gyrotron, a device that emits high-frequency electromagnetic radiation. Today, the fusion power industry uses gyrotrons to heat plasma to 100 million °C, but Quaise plans to use them to blast, melt, and vaporize rock. This could, in theory, make drilling faster and more economical, allowing geothermal energy to be accessed anywhere.

Since Quaise’s founding in 2018, the company has demonstrated that its systems work in the controlled conditions of the laboratory, and it has started trials in a semi-controlled environment, including the backyard of its Houston headquarters. Now these efforts are leaving the lab, and the team is taking gyrotron drilling technology to a quarry to test it in real-world conditions. 

Some experts caution that reinventing drilling won’t be as simple, or as fast, as Quaise’s leadership hopes. The startup is also attempting to raise a large funding round this year, at a time when economic uncertainty is slowing investment and the US climate technology industry is in a difficult spot politically because of tariffs and a slowdown in government support. Quaise’s big idea aims to accelerate an old source of renewable energy. This make-or-break moment might determine how far that idea can go.

Blasting through

Rough calculations from the geothermal industry suggest that enough energy is stored inside the Earth to meet our energy demands for tens or even hundreds of thousands of years, says Matthew Houde, cofounder and chief of staff at Quaise. After that, other sources like fusion should be available, “assuming we continue going on that long, so to speak,” he quips. 

“We want to be able to scale this style of geothermal beyond the locations where we’re able to readily access those temperatures today with conventional drilling,” Houde says. The key, he adds, is simply going deep enough: “If we can scale those depths to 10 to 20 kilometers, then we can enable super-hot geothermal to be worldwide accessible.”

Though that’s technically possible, there are few examples of humans drilling close to this depth. One research project that began in 1970 in what was then the Soviet Union reached just over 12 kilometers, but it took nearly 20 years and was incredibly expensive.

Quaise hopes to speed up drilling and cut its cost, Houde says. The company’s goal is to drill through rock at a rate of between three and five meters per hour of steady operation.

One key factor slowing down many operations that drill through hard rocks like granite is nonproductive time. For example, equipment frequently needs to be brought all the way back up to the surface for repairs or to replace drill bits.

Quaise’s key to potentially changing that is its gyrotron. The device emits millimeter waves, beams of energy with wavelengths that fall between microwaves and infrared waves. It’s a bit like a laser, but the beam is not visible to the human eye. 

Quaise’s goal is to heat up the target rock, effectively drilling it away. The gyrotron beams waves at a target rock via a waveguide, a hollow metal tube that directs the energy to the right spot. (One of the company’s main technological challenges is to avoid accidentally making plasma, an ionized, superheated state of matter, as it can waste energy and damage key equipment like the waveguide.)

Here’s how it works in practice: When Quaise’s rig is drilling a hole, the tip of the waveguide is positioned a foot or so away from the rock it’s targeting. The gyrotron lets out a burst of millimeter waves for about a minute. They travel down the waveguide and hit the target rock, which heats up and then cracks, melts, or even vaporizes.

Then the beam stops, and the drill bit at the end of the waveguide is lowered to the surface of the rock, rotating and scraping off broken shards and melted bits of rock as it descends. A steady blast of air carries the debris up to the surface, and the process repeats. The energy in the millimeter waves does the hard work; the scraping and compressed air just clear away the fractured or melted material.
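As a structural sketch of that cycle—an illustration of the sequence described above, with invented names and numbers, not Quaise’s control software:

```python
# Illustrative beam/scrape/purge drilling loop; all names and values are hypothetical.

def fire_gyrotron(seconds: float) -> None:
    """Send a burst of millimeter waves down the waveguide at the rock face."""
    pass  # stands in for the real hardware interface

def lower_and_rotate_bit() -> None:
    """Lower the bit at the waveguide's tip, scraping off cracked and melted rock."""
    pass

def purge_with_air() -> None:
    """Blast compressed air so the debris is carried up and out of the hole."""
    pass

def drill(target_depth_m: float, advance_per_cycle_m: float = 0.07) -> float:
    """Repeat the cycle until the hole reaches the target depth."""
    depth = 0.0
    while depth < target_depth_m:
        fire_gyrotron(seconds=60)     # roughly a one-minute burst, as described
        lower_and_rotate_bit()
        purge_with_air()
        depth += advance_per_cycle_m  # 3-5 m/hour goal is ~0.05-0.08 m per cycle
    return depth
```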

This system is what I saw in action at the company’s Houston headquarters. The drilling rig in the yard is a small setup, something like what a construction company might use to drill micro piles for a foundation or what researchers would use to take geological samples. In total, the gyrotron has a power of 100 kilowatts. A cooling system helps the superconducting magnet in the gyrotron reach the necessary temperature (about -200 °C), and a filtration system catches the debris that sloughs off samples. 

Quaise truck and mobile drill unit

CASEY CROWNHART

Soon after my visit, this backyard setup was packed up and shipped to central Texas to be used for further field testing in a rock quarry. The company announced in July that it had used that rig to drill a 100-meter-deep hole at that field test site. 

Quaise isn’t the first to develop nonmechanical drilling, says Roland Horne, head of the geothermal program at Stanford University. “Burning holes in rocks is impressive. However, that’s not the whole of what’s involved in drilling,” he says. The operation will need to be able to survive the high temperatures and pressures at the bottom of wells as they’re drilled, he says.

So far, the company has found success drilling holes into columns of rock inside metal casings, as well as into the quarry rock in its field trials. But there’s a long road between drilling into predictable material in a relatively predictable environment and creating a miles-deep geothermal well.

Rocky roads

In April, Quaise fully integrated its second 100-kilowatt gyrotron onto an oil and gas rig owned by the company’s investor and technology partner Nabors. This rig is the sort that would typically be used for training or engineering development, and it’s set up along with a row of other rigs at the Nabors headquarters, just across town from the Quaise lab. At 182 feet high, the top is visible above the office building from the parking lot.

When I visited in April, the company was still completing initial tests, using special thermal paper and firing short blasts to test the setup. In May the company tested this integrated rig, drilling a hole four inches in diameter and 30 feet deep. Another test in June reached a depth of 40 feet. These holes were drilled into columns of basalt that had been lowered into the ground as a test material.

While the company tests its 100-kilowatt systems at the rig and the quarry, the next step is an even larger system, which features a gyrotron that’s 10 times more powerful. This one-megawatt system will drill larger holes, over eight inches across, and represents the commercial-scale version of the company’s technology. Drilling tests are set to begin with this larger drill in 2026. 

The one-megawatt system actually needs a little over three megawatts of power overall, including the energy needed to run support equipment like cooling systems and the compressor that blows air into the hole, carrying the rock dust back up to the surface. That power demand is similar to what an oil and gas rig requires today. 

Quaise is in the process of setting up a pilot plant in Oregon, basically on the side of a volcano, says Trenton Cladouhos, the company’s vice president of geothermal resource development. This project will use conventional drilling, and its main purpose is to show that Quaise can build and run a geothermal plant, Cladouhos says. 

The company is drilling an exploration well this year and plans to begin drilling production wells (those that can eventually be used to generate electricity) in 2026. That pilot project will reach about 20 megawatts of power with the first few wells, operating on rock that’s around 350 °C. The company plans to have it operational as early as 2028.

Quaise’s strategy with the Oregon project is to show that it can use super-hot rocks to produce geothermal power efficiently, says CEO Carlos Araque. After it fires up the plant and begins producing electricity, the company can go back in and deepen the holes with millimeter-wave drilling in the future, he adds.

A drilling test shows Quaise’s millimeter-wave technology drilling into a piece of granite.
QUAISE

Araque says the company already has some customers lined up for the energy it’ll produce, though he declined to name them, saying only that one was a big tech company, and there’s a utility involved as well.

But the startup will need more capital to finish this project and complete its testing with the larger, one-megawatt gyrotron. And uncertainty hangs over the sector, given the Trump administration’s tariffs and rollback of financial support for climate tech (though geothermal has been relatively unscathed).

Quaise still has some technical barriers to overcome before it begins building commercial power plants. 

One potential hurdle: drilling in different directions. Right now, millimeter-wave drilling can go in a straight line, straight down. Developing a geothermal plant like the one at the Oregon site will likely require what’s called directional drilling, the ability to drill in directions other than vertical.

And the company will likely face challenges as it transitions from lab testing to field trials. One key challenge for geothermal companies attempting to operate at this depth will be keeping wells functional long enough to keep a power plant running, says Jefferson Tester, a professor at Cornell University and an expert in geothermal energy.

Quaise’s technology is very aspirational, Tester says, and it can be difficult for new ideas in geothermal to compete economically. “It’s eventually all about cost,” he says. And companies with ambitious ideas run the risk that their investors will run out of patience before they can develop their technology enough to make it onto the grid.

“There’s a lot more to learn—I mean, we’re reinventing drilling,” says Steve Jeske, a project manager at Quaise. “It seems like it shouldn’t work, but it does.”

In defense of air-conditioning

I’ll admit that I’ve rarely hesitated to point an accusing finger at air-conditioning. I’ve noted in many stories and newsletters that AC is a significant contributor to global electricity demand, and it’s only going to suck up more power as temperatures rise.

But I’ll also be the first to admit that it can be a life-saving technology, one that may become even more necessary as climate change intensifies. And in the wake of Europe’s recent deadly heat wave, it’s been oddly villainized.

We should all be aware of the growing electricity toll of air-conditioning, but the AC hate is misplaced. Yes, AC is energy intensive, but so is heating our homes, something that’s rarely decried in the same way that cooling is. Both are tools for comfort and, more important, for safety. So why is air-conditioning cast as such a villain?

In the last days of June and the first few days of July, temperatures hit record highs across Europe. Over 2,300 deaths during that period were attributed to the heat wave, according to early research from World Weather Attribution, an academic collaboration that studies extreme weather. And human-caused climate change accounted for 1,500 of the deaths, the researchers found. (That is, the number of fatalities would have been under 800 if not for higher temperatures because of climate change.)

We won’t have the official death toll for months, but these early figures show just how deadly heat waves can be. Europe is especially vulnerable, because in many countries, particularly in the northern part of the continent, air-conditioning is not common.

Popping on a fan, drawing the shades, or opening the windows on the hottest days used to cut it in many European countries. Not anymore. The UK was 1.24 °C (2.23 °F) warmer over the past decade than it was between 1961 and 1990, according to the Met Office, the UK’s national climate and weather service. One recent study found that homes across the country are uncomfortably or dangerously warm much more frequently than they used to be.

The reality is, some parts of the world are seeing an upward shift in temperatures that’s not just uncomfortable but dangerous. As a result, air-conditioning usage is going up all over the world, including in countries with historically low rates.

The reaction to this long-term trend, especially in the face of the recent heat wave, has been apoplectic. People are decrying AC across social media and opinion pages, arguing that we need to suck it up and deal with being a little bit uncomfortable.

Now, let me preface this by saying that I do live in the US, where roughly 90% of homes are cooled with air-conditioning today. So perhaps I am a little biased in favor of AC. But it baffles me when people talk about air-conditioning this way.

I spent a good amount of my childhood in the southeastern US, where it’s very obvious that heat can be dangerous. I was used to many days where temperatures were well above 90 °F (32 °C), and the humidity was so high your clothes would stick to you as soon as you stepped outdoors. 

For some people, being active or working in those conditions can lead to heatstroke. Prolonged exposure, even if it’s not immediately harmful, can lead to heart and kidney problems. Older people, children, and those with chronic conditions can be more vulnerable.

In other words, air-conditioning is more than a convenience; in certain conditions, it’s a safety measure. That should be an easy enough concept to grasp. After all, in many parts of the world we expect access to heating in the name of safety. Nobody wants to freeze to death. 

And it’s important to clarify here that while air-conditioning does use a lot of electricity in the US, heating actually has a higher energy footprint. 

In the US, about 19% of residential electricity use goes to air-conditioning. That sounds like a lot, and it’s significantly more than the 12% of electricity that goes to space heating. However, we need to zoom out to get the full picture, because electricity makes up only part of a home’s total energy demand. A lot of homes in the US use natural gas for heating—that’s not counted in the electricity being used, but it’s certainly part of the home’s total energy use.

When we look at the total, space heating accounts for a full 42% of residential energy consumption in the US, while air-conditioning accounts for only 9%.
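The flip between those two comparisons is an accounting change, not a contradiction: gas burned in a furnace never shows up in the electricity-only numbers. A minimal sketch with illustrative household figures, chosen only to roughly reproduce the national shares above:

```python
# Two ways of slicing the same home's energy use; all inputs are illustrative.
electricity_kwh = 10_000  # a year of household electricity
gas_kwh_equiv = 12_000    # a year of natural gas, converted to kWh

ac_kwh = 0.19 * electricity_kwh  # 19% of electricity goes to AC
heat_kwh = 0.12 * electricity_kwh + 0.75 * gas_kwh_equiv  # electric heat + most of the gas

total_kwh = electricity_kwh + gas_kwh_equiv
print(f"AC share of electricity:  {ac_kwh / electricity_kwh:.0%}")  # 19%
print(f"AC share of total energy: {ac_kwh / total_kwh:.0%}")        # ~9%
print(f"Heating share of total:   {heat_kwh / total_kwh:.0%}")      # ~46%
```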

I’m not letting AC off the hook entirely here. There’s obviously a difference between running air-conditioning (or other, less energy-intensive technologies) when needed to stay safe and blasting systems at max capacity because you prefer it chilly. And there’s a lot of grid planning we’ll need to do to make sure we can handle the expected influx of air-conditioning around the globe. 

But the world is changing, and temperatures are rising. If you’re looking for a villain, look beyond the air conditioner and into the atmosphere.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

California is set to become the first US state to manage power outages with AI

California’s statewide power grid operator is poised to become the first in North America to deploy artificial intelligence to manage outages, MIT Technology Review has learned. 

“We wanted to modernize our grid operations. This fits in perfectly with that,” says Gopakumar Gopinathan, a senior advisor on power system technologies at the California Independent System Operator—known as the CAISO and pronounced KAI-so. “AI is already transforming different industries. But we haven’t seen many examples of it being used in our industry.” 

At the DTECH Midwest utility industry summit in Minneapolis on July 15, CAISO is set to announce a deal to run a pilot program using new AI software called Genie, from the energy-services giant OATI. The software uses generative AI to carry out real-time analyses for grid operators, and it comes with the potential to autonomously make decisions about key functions on the grid, a switch that might resemble going from uniformed traffic officers to sensor-equipped stoplights. 

But while CAISO may deliver electrons to cutting-edge Silicon Valley companies and laboratories, the actual task of managing the state’s electrical system is surprisingly analog. 

Today, CAISO engineers scan outage reports for keywords about maintenance that’s planned or in the works, read through the notes, and then load each item into the grid software system to run calculations on how a downed line or transformer might affect power supply.

“Even if it takes you less than a minute to scan one on average, when you amplify that over 200 or 300 outages, it adds up,” says Abhimanyu Thakur, OATI’s vice president of platforms, visualization, and analytics. “Then different departments are doing it for their own respective keywords. Now we consolidate all of that into a single dictionary of keywords and AI can do this scan and generate a report proactively.” 
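
To make that "single dictionary" idea concrete, here's a deliberately simplified sketch. The department names and keywords are invented for illustration, and a real system like Genie layers generative AI on top of this kind of consolidation rather than relying on plain substring matching.

```python
# Hypothetical illustration of consolidating per-department keyword
# scans into one pass. Not OATI's Genie; names and keywords are invented.
from collections import defaultdict

# Before: each department maintained and scanned its own keyword list.
DEPARTMENT_KEYWORDS = {
    "transmission": ["downed line", "conductor", "line outage"],
    "substations": ["transformer", "breaker", "relay"],
    "maintenance": ["planned work", "scheduled", "clearance"],
}

# After: one consolidated dictionary mapping each keyword to its department.
CONSOLIDATED = {
    kw: dept
    for dept, keywords in DEPARTMENT_KEYWORDS.items()
    for kw in keywords
}

def scan_outage_reports(reports):
    """Flag each report for every department whose keywords appear in it."""
    flagged = defaultdict(list)
    for report in reports:
        text = report.lower()
        depts = {dept for kw, dept in CONSOLIDATED.items() if kw in text}
        for dept in depts:
            flagged[dept].append(report)
    return flagged

reports = [
    "Planned work: transformer T-12 out for scheduled maintenance",
    "Downed line reported on feeder 7 after storm",
]
for dept, hits in sorted(scan_outage_reports(reports).items()):
    print(f"{dept}: {len(hits)} report(s)")
```

The point of the consolidation is that hundreds of reports get scanned once, for everyone, instead of once per department.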

If CAISO finds that Genie produces reliable, more efficient data analyses for managing outages, Gopinathan says, the operator may consider automating more functions on the grid. “After a few rounds of testing, I think we’ll have an idea about what is the right time to call it successful or not,” he says. 

Regardless of the outcome, the experiment marks a significant shift. Most grid operators are using the same systems that utilities have used “for decades,” says Richard Doying, who spent more than 20 years as a top executive at the Midcontinent Independent System Operator, the grid operator for an area encompassing 15 states from the upper Midwest down to Louisiana. 

“These organizations are carved up for people working on very specific, specialized tasks and using their own proprietary tools that they’ve developed over time,” says Doying, now a vice president at the consultancy Grid Strategies. “To the extent that some of these new AI tools are able to draw from data across different areas of an organization and conduct more sophisticated analysis, that’s only helpful for grid operators.”

Last year, a Department of Energy report found that AI had potential to speed up studies on grid capacity and transmission, improve weather forecasting to help predict how much energy wind and solar plants would produce at a given time, and optimize planning for electric-vehicle charging networks. Another report, by the Energy Department's Loan Programs Office, concluded that adding more "advanced" technology such as sensors to various pieces of equipment will generate data that can enable AI to do much more over time. 

In April, the PJM Interconnection—the nation’s largest grid system, spanning 13 states along the densely populated mid-Atlantic and Eastern Seaboard—took a big step toward embracing AI by inking a deal with Google to use its Tapestry software to improve regional planning and speed up grid connections for new power generators. 

ERCOT, the Texas grid system, is considering adopting technology similar to what CAISO is now set to use, according to a source with knowledge of the plans who requested anonymity because they were not authorized to speak publicly. ERCOT did not respond to a request for comment. 

Australia offers an example of what the future may look like. In New South Wales, where grid sensors and smart technology are more widely deployed, AI software rolled out in February is now predicting the production and flow of electricity from rooftop solar units across the state and automatically adjusting how much power from those panels can enter the grid. 

Until now, much of the discussion around AI and energy has focused on the electricity demands of AI data centers (check out MIT Technology Review’s Power Hungry series for more on this).

“We’ve been talking a lot about what the grid can do for AI and not nearly as much about what AI can do for the grid,” says Charles Hua, a coauthor of one of last year’s Energy Department reports who now serves as executive director of PowerLines, a nonprofit that advocates for improving the affordability and reliability of US grids. “In general, there’s a huge opportunity for grid operators, regulators, and other stakeholders in the utility regulatory system to use AI effectively and harness it for a more resilient, modernized, and strengthened grid.” 

For now, Gopinathan says, he’s remaining cautiously optimistic. 

“I don’t want to overhype it,” he says. 

Still, he adds, “it’s a first step for bigger automation.”

“Right now, this is more limited to our outage management system. Genie isn’t talking to our other parts yet,” he says. “But I see a world where AI agents are able to do a lot more.”

China’s energy dominance in three charts

China is the dominant force in next-generation energy technologies today. It’s pouring hundreds of billions of dollars into putting renewable sources like wind and solar on its grid, manufacturing millions of electric vehicles, and building out capacity for energy storage, nuclear power, and more. This investment has been transformational for the country’s economy and has contributed to establishing China as a major player in global politics. 

Meanwhile, in the US, a massive new tax and spending bill just cut hundreds of billions in credits, grants, and loans for clean energy technologies. It’s a stark reversal from previous policies, and it could have massive effects at a time when it feels as if everyone is chasing China on energy.

So while we all try to get our heads around what’s next for climate tech in the US and beyond, let’s look at just how dominant China is when it comes to clean energy, as documented in three charts.

China is on an absolute tear installing wind and solar power. The country reached nearly 900 gigawatts of installed capacity for solar at the end of 2024, and the rapid pace of building has continued into this year. An additional 198 GW was installed between January and May, with 93 GW coming in May alone.

For context, those additions over the first five months of the year account for more than double the capacity of the grid in California. Not the renewables capacity of that state—the entire grid. 
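
A quick sanity check on that comparison, taking California's total installed generating capacity to be roughly 85 gigawatts (an approximate figure that varies by source and year):

```python
# Rough arithmetic only; the California figure is an approximation.
china_solar_added_jan_to_may = 198  # GW, per the text above
california_grid_capacity = 85       # GW, approximate total installed capacity

print(f"{china_solar_added_jan_to_may / california_grid_capacity:.1f}x")  # ~2.3x
```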

Meanwhile, the policy shift in the US is projected to slow down new solar and wind additions. With tax credits and other support stripped away, much of the new capacity that was expected to come online by the end of the decade will now face delays or cancellations. 

That’s significant because, of all the new electricity generation capacity that’s come online in the US recently, renewables make up the vast majority. Solar and battery storage alone are expected to make up over 80% of capacity additions in 2025. So slowing down wind and solar basically means slowing down adding new electricity capacity, at a time when demand is very much set to rise. (Hello, AI?)

China’s EV market is also booming—the country is currently flirting with a big symbolic milestone, nearing the point where over half of all new vehicles sold in the country are electric. (It already passed that mark for a single month and could do so on a yearly basis in the next couple of years.)

It’s not just selling those vehicles within China, either: the country exports them globally, with customers including established markets like Europe and growing ones like India and Brazil. As of 2024, more than 70% of electric and plug-in hybrid vehicles on roads around the world were built in China.

Some leaders at legacy automakers are taking notice. Ford CEO Jim Farley shared some striking comments at the Aspen Ideas Festival last month about how far ahead China is on vehicle technology and price. “They have far superior in-vehicle technology,” Farley said. “We are in a global competition with China, and it’s not just EVs. And if we lose this, we do not have a future Ford.” 

Looking ahead, China is still pouring money into renewables, storage, grids, and energy efficiency technologies. It’s also outspending the rest of the world on nuclear power. The country tripled its investment in renewable power from 2015 to 2025.

The situation isn’t set in stone, though: The US actually very briefly overtook China on battery investments over the past year, as Cat Clifford at Cipher reported last week. But changes resulting from the new bill could very quickly reverse that progress, cementing China as the place for battery manufacturing and innovation.

In a story earlier this week, the MIT economist David Autor laid out the high stakes for this race. Advanced manufacturing and technology are beneficial for US prosperity, and putting public support and trade protections in place for key industries could be crucial to keeping them going, he says.  

I’d add that this whole discussion shouldn’t be about a zero-sum competition between the US and China. But many experts argue that the US, where I and many readers live, is surrendering its leadership and ability to develop key energy technologies of the future.  

Ultimately, the numbers don’t lie: By a lot of measures, China is the world’s leader in energy. The question is, will that change anytime soon?  
