This startup wants to use the Earth as a massive battery

The Texas-based startup Quidnet Energy just completed a test showing it can store energy for up to six months by pumping water underground.

Using water to store electricity is hardly a new concept—pumped hydropower storage has been around for over a century. But the company hopes its twist on the technology could help bring cheap, long-duration energy storage to new places.

In traditional pumped hydro storage facilities, electric pumps move water uphill, into a natural or manmade body of water. Then, when electricity is needed, that water is released and flows downhill through a turbine, generating electricity. Quidnet’s approach instead pumps water down into impermeable rock formations and keeps it under pressure so it flows up when released. “It’s like pumped hydro, upside down,” says CEO Joe Zhou.

Quidnet started a six-month test of its technology in late 2024, pressurizing the system. In June, the company discharged the well, releasing pressurized water carrying 35 megawatt-hours of stored energy. There was virtually no self-discharge, meaning the system lost essentially no energy while the water sat underground, Zhou says.

Inexpensive forms of energy storage that can hold electricity for weeks or months could help intermittent electricity sources like wind and solar go further for the grid. And Quidnet’s approach, which uses commercially available equipment, could be deployed quickly and qualify for federal tax credits that would make it even cheaper.

However, there’s still a big milestone ahead: turning the pressurized water back into electricity. The company is currently building a facility with the turbines and support equipment to do that—all the components are available to purchase from established companies. “We don’t need to invent new things based on what we’ve already developed today,” Zhou says. “We can now start just deploying at very, very substantial scales.”

That process will come with energy losses. Energy storage systems are typically measured by their round-trip efficiency: how much of the electricity that’s put into the system is returned at the end as electricity. Modeling suggests that Quidnet’s technology could reach a maximum efficiency of about 65%, Zhou says, though some design choices made to optimize for economics will likely cause the system to land at roughly 50%.
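To put those efficiency figures in concrete terms, here’s a quick back-of-the-envelope sketch in Python. The charging energy is made up; only the 65% and 50% round-trip figures come from the modeling described above.

    # Hypothetical illustration of what round-trip efficiency means for a
    # storage project. Only the 65% and 50% figures come from the article;
    # the charging energy is invented for the example.

    def delivered_energy(energy_in_mwh, round_trip_efficiency):
        """Electricity returned to the grid for a given charging energy."""
        return energy_in_mwh * round_trip_efficiency

    energy_in = 70.0  # MWh of electricity used to pressurize the well (hypothetical)

    for efficiency in (0.65, 0.50):  # modeled maximum vs. economics-optimized design
        out = delivered_energy(energy_in, efficiency)
        print(f"{efficiency:.0%} round trip: {energy_in:.0f} MWh in -> {out:.1f} MWh out")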

That’s less efficient than lithium-ion batteries, but long-duration systems, if they’re cheap enough, can operate at low efficiencies and still be useful for the grid, says Paul Denholm, a senior research fellow at the National Renewable Energy Laboratory.

“It’s got to be cost-competitive; it all comes down to that,” Denholm says.

Lithium-ion batteries, the fastest-growing technology in energy storage, are the target that new forms of energy storage, like Quidnet’s, must chase. Lithium-ion batteries are about 90% cheaper today than they were 15 years ago. They’ve become a price-competitive alternative to building new natural-gas plants, Denholm says.

When it comes to competing with batteries, one potential differentiator for Quidnet could be government subsidies. While the Trump administration has clawed back funding for clean energy technologies, there’s still an energy storage tax credit, though recently passed legislation added new supply chain restrictions.

Starting in 2026, new energy storage facilities hoping to qualify for tax credits will need to prove that at least 55% of the value of a project’s materials does not come from foreign entities of concern. That rules out sourcing batteries from China, which dominates battery production today. Quidnet has a “high level of domestic content” and expects to qualify for tax credits under the new rules, Zhou says.

The facility Quidnet is building is a project with utility partner CPS Energy, and it should come online in early 2026. 

What role should oil and gas companies play in climate tech?

This week, I have a new story out about Quaise, a geothermal startup that’s trying to commercialize new drilling technology. Using a device called a gyrotron, the company wants to drill deeper, cheaper, in an effort to unlock geothermal power anywhere on the planet. (For all the details, check it out here.) 

For the story, I visited Quaise’s headquarters in Houston. I also took a trip across town to Nabors Industries, Quaise’s investor and tech partner and one of the biggest drilling companies in the world. 

Standing on top of a drilling rig in the backyard of Nabors’s headquarters, I couldn’t stop thinking about the role oil and gas companies are playing in the energy transition. This industry has resources and energy expertise—but also a vested interest in fossil fuels. Can it really be part of addressing climate change?

The relationship between Quaise and Nabors is one that we see increasingly often in climate tech—a startup partnering up with an established company in a similar field. (Another one that comes to mind is in the cement industry, where Sublime Systems has seen a lot of support from legacy players including Holcim, one of the biggest cement companies in the world.) 

Quaise got an early investment from Nabors in 2021, to the tune of $12 million. Nabors now also serves as a technical partner for the startup.

“We are agnostic to what hole we’re drilling,” says Cameron Maresh, a project engineer on the energy transition team at Nabors Industries. The company is working on other investments and projects in the geothermal industry, Maresh says, and the work with Quaise is the culmination of a yearslong collaboration: “We’re just truly excited to see what Quaise can do.”

From the outside, this sort of partnership makes a lot of sense for Quaise. It gets resources and expertise. Meanwhile, Nabors is getting involved with an innovative company that could represent a new direction for geothermal. And maybe more to the point, if fossil fuels are to be phased out, this deal gives the company a stake in next-generation energy production.

There is so much potential for oil and gas companies to play a productive role in addressing climate change. One report from the International Energy Agency examined the role these legacy players could take:  “Energy transitions can happen without the engagement of the oil and gas industry, but the journey to net zero will be more costly and difficult to navigate if they are not on board,” the authors wrote. 

In the agency’s blueprint for what a net-zero emissions energy system could look like in 2050, about 30% of energy could come from sources where the oil and gas industry’s knowledge and resources are useful. That includes hydrogen, liquid biofuels, biomethane, carbon capture, and geothermal. 

But so far, the industry has hardly lived up to its potential as a positive force for the climate. Also in that report, the IEA pointed out that oil and gas producers made up only about 1% of global investment in climate tech in 2022. Investment has ticked up a bit since then, but still, it’s tough to argue that the industry is committed. 

And now that climate tech is falling out of fashion with the government in the US, I’d venture to guess that we’re going to see oil and gas companies increasingly pulling back on their investments and promises. 

BP recently backtracked on previous commitments to cut oil and gas production and invest in clean energy. And last year the company announced that it had written off $1.1 billion in offshore wind investments in 2023 and wanted to sell other wind assets. Shell closed down all its hydrogen fueling stations for vehicles in California last year. (This might not be all that big a loss, since EVs are beating hydrogen by a huge margin in the US, but it’s still worth noting.) 

So oil and gas companies are investing what amounts to pennies and often backtrack when the political winds change direction. And, let’s not forget, fossil-fuel companies have a long history of behaving badly. 

In perhaps the most notorious example, scientists at Exxon modeled climate change in the 1970s, and their forecasts turned out to be quite accurate. Rather than publish that research, the company downplayed how climate change might affect the planet. (For what it’s worth, company representatives have argued that this was less of a coverup and more of an internal discussion that wasn’t fit to be shared outside the company.) 

While fossil fuels are still part of our near-term future, oil and gas companies, and particularly producers, would need to make drastic changes to align with climate goals—changes that wouldn’t be in their financial interest. Few seem inclined to really take the turn needed. 

As the IEA report puts it:  “In practice, no one committed to change should wait for someone else to move first.”

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

How nonprofits and academia are stepping up to salvage US climate programs

Nonprofits are striving to preserve a US effort to modernize greenhouse-gas measurements, amid growing fears that the Trump administration’s dismantling of federal programs will obscure the nation’s contributions to climate change.

The Data Foundation, a Washington, DC, nonprofit that advocates for open data, is fundraising for an initiative that will coordinate efforts among nonprofits, technical experts, and companies to improve the accuracy and accessibility of climate emissions information. It will build on an effort to improve the collection of emissions data that former president Joe Biden launched in 2023—and which President Trump nullified on his first day in office. 

The initiative will help prioritize responses to changes in federal greenhouse-gas monitoring and measurement programs, but the Data Foundation stresses that it will primarily serve a “long-standing need for coordination” of such efforts outside of government agencies.

The new greenhouse-gas coalition is one of a growing number of nonprofit and academic groups that have spun up or shifted focus to keep essential climate monitoring and research efforts going amid the Trump administration’s assault on environmental funding, staffing, and regulations. Those include efforts to ensure that US scientists can continue to contribute to the UN’s major climate report and publish assessments of the rising domestic risks of climate change. Otherwise, the loss of these programs will make it increasingly difficult for communities to understand how more frequent or severe wildfires, droughts, heat waves, and floods will harm them—and how dire the dangers could become. 

Few believe that nonprofits or private industry can come close to filling the funding holes that the Trump administration is digging. But observers say it’s essential to try to sustain efforts to understand the risks of climate change that the federal government has historically overseen, even if the attempts are merely stopgap measures. 

If we give up these sources of emissions data, “we’re flying blind,” says Rachel Cleetus, senior policy director with the climate and energy program at the Union of Concerned Scientists. “We’re deliberately taking away the very information that would help us understand the problem and how to address it best.”

Improving emissions estimates

The Environmental Protection Agency, the National Oceanic and Atmospheric Administration, the US Forest Service, and other agencies have long collected information about greenhouse gases in a variety of ways. These include self-reporting by industry; shipboard, balloon, and aircraft readings of gas concentrations in the atmosphere; satellite measurements of the carbon dioxide and methane released by wildfires; and on-the-ground measurements of trees. The EPA, in turn, collects and publishes the data from these disparate sources as the Inventory of US Greenhouse Gas Emissions and Sinks.

But that report comes out on a two-year lag, and studies show that some of the estimates it relies on could be way off—particularly the self-reported ones.

A recent analysis using satellites to measure methane pollution from four large landfills found they produce, on average, six times more emissions than the facilities had reported to the EPA. Likewise, a 2018 study in Science found that the actual methane leaks from oil and gas infrastructure were about 60% higher than the self-reported estimates in the agency’s inventory.

The Biden administration’s initiative—the National Strategy to Advance an Integrated US Greenhouse Gas Measurement, Monitoring, and Information System—aimed to adopt state-of-the-art tools and methods to improve the accuracy of these estimates, including satellites and other monitoring technologies that can replace or check self-reported information.

The administration specifically sought to achieve these improvements through partnerships between government, industry, and nonprofits. The initiative called for the data collected across groups to be published to an online portal in formats that would be accessible to policymakers and the public.

Moving toward a system that produces more current and reliable data is essential for understanding the rising risks of climate change and tracking whether industries are abiding by government regulations and voluntary climate commitments, says Ben Poulter, a former NASA scientist who coordinated the Biden administration effort as a deputy director in the Office of Science and Technology Policy.

“Once you have this operational system, you can provide near-real-time information that can help drive climate action,” Poulter says. He is now a senior scientist at Spark Climate Solutions, a nonprofit focused on accelerating emerging methods of combating climate change, and he is advising the Data Foundation’s Climate Data Collaborative, which is overseeing the new greenhouse-gas initiative. 

Slashed staffing and funding  

But the momentum behind the federal strategy deflated when Trump returned to office. On his first day, he signed an executive order that effectively halted it. The White House has since slashed staffing across the agencies at the heart of the effort, sought to shut down specific programs that generate emissions data, and raised uncertainties about the fate of numerous other program components. 

In April, the administration missed a deadline to share the updated greenhouse-gas inventory with the United Nations, for the first time in three decades, as E&E News reported. It eventually did release the report in May, but only after the Environmental Defense Fund filed a Freedom of Information Act request.

There are also indications that the collection of emissions data might be in jeopardy. In March, the EPA said it would “reconsider” the Greenhouse Gas Reporting Program, which requires thousands of power plants, refineries, and other industrial facilities to report emissions each year.

In addition, the tax and spending bill that Trump signed into law earlier this month rescinds provisions in Biden’s Inflation Reduction Act that provided incentives or funding for corporate greenhouse-gas reporting and methane monitoring. 

Meanwhile, the White House has also proposed slashing funding for the National Oceanic and Atmospheric Administration and shuttering a number of its labs. Those include the facility that supports the Mauna Loa Observatory in Hawaii, the world’s longest-running carbon dioxide measuring program, as well as the Global Monitoring Laboratory, which operates a global network of collection flasks that capture air samples used to measure concentrations of nitrous oxide, chlorofluorocarbons, and other greenhouse gases.

Under the latest appropriations negotiations, Congress seems set to spare NOAA and other agencies the full cuts pushed by the Trump administration, but that may or may not protect various climate programs within them. As observers have noted, the loss of experts throughout the federal government, coupled with the priorities set by Trump-appointed leaders of those agencies, could still prevent crucial emissions data from being collected, analyzed, and published.

“That’s a huge concern,” says David Hayes, a professor at the Stanford Doerr School of Sustainability, who previously worked on the effort to upgrade the nation’s emissions measurement and monitoring as special assistant to President Biden for climate policy. It’s not clear “whether they’re going to continue and whether the data availability will drop off.”

‘A natural disaster’

Amid all these cutbacks and uncertainties, those still hoping to make progress toward an improved system for measuring greenhouse gases have had to adjust their expectations: It’s now at least as important to simply preserve or replace existing federal programs as it is to move toward more modern tools and methods.

But Ryan Alexander, executive director of the Data Foundation’s Climate Data Collaborative, is optimistic that there will be opportunities to do both. 

She says the new greenhouse-gas coalition will strive to identify the highest-priority needs and help other nonprofits or companies accelerate the development of new tools or methods. It will also aim to ensure that these organizations avoid replicating one another’s efforts and deliver data with high scientific standards, in open and interoperable formats. 

The Data Foundation declines to say what other nonprofits will be members of the coalition or how much money it hopes to raise, but it plans to make a formal announcement in the coming weeks. 

Nonprofits and companies are already playing a larger role in monitoring emissions, including organizations like Carbon Mapper, which operates satellites and aircraft that detect and measure methane emissions from particular facilities. The EDF also launched a satellite last year, known as MethaneSAT, that could spot large and small sources of emissions—though it lost power earlier this month and probably cannot be recovered. 

Alexander notes that shifting from self-reported figures to observational technology like satellites could not just replace but perhaps also improve on the EPA reporting program that the Trump administration has moved to shut down.

Given the “dramatic changes” brought about by this administration, “the future will not be the past,” she says. “This is like a natural disaster. We can’t think about rebuilding in the way that things have been in the past. We have to look ahead and say, ‘What is needed? What can people afford?’”

Organizations can also use this moment to test and develop emerging technologies that could improve greenhouse-gas measurements, including novel sensors or artificial intelligence tools, Hayes says. 

“We are at a time when we have these new tools, new technologies for measurement, measuring, and monitoring,” he says. “To some extent it’s a new era anyway, so it’s a great time to do some pilot testing here and to demonstrate how we can create new data sets in the climate area.”

Saving scientific contributions

It’s not just the collection of emissions data that nonprofits and academic groups are hoping to save. Notably, the American Geophysical Union and its partners have taken on two additional climate responsibilities that traditionally fell to the federal government.

The US State Department’s Office of Global Change historically coordinated the nation’s contributions to the UN Intergovernmental Panel on Climate Change’s major reports on climate risks, soliciting and nominating US scientists to help write, oversee, or edit sections of the assessments. The US Global Change Research Program, an interagency group that ran much of the process, also covered the cost of trips to a series of in-person meetings with international collaborators. 

But the US government seems to have relinquished any involvement as the IPCC kicks off the process for the Seventh Assessment Report. In late February, the administration blocked federal scientists including NASA’s Katherine Calvin, who was previously selected as a cochair for one of the working groups, from attending an early planning meeting in China. (Calvin was the agency’s chief scientist at the time but was no longer serving in that role as of April, according to NASA’s website.)

The agency didn’t respond to inquiries from interested scientists after the UN panel issued a call for nominations in March, and it failed to present a list of nominations by the deadline in April, scientists involved in the process say. The Trump administration also canceled funding for the Global Change Research Program and, earlier this month, fired the last remaining staffers working at the Office of Global Change.

In response, 10 universities came together in March to form the US Academic Alliance for the IPCC, in partnership with the AGU, to request and evaluate applications from US researchers. The universities—which include Yale, Princeton, and the University of California, San Diego—together nominated nearly 300 scientists, some of whom the IPCC has since officially selected. The AGU is now conducting a fundraising campaign to help pay for travel expenses.

Pamela McElwee, a professor at Rutgers who helped establish the academic coalition, says it’s crucial for US scientists to continue participating in the IPCC process.

“It is our flagship global assessment report on the state of climate, and it plays a really important role in influencing country policies,” she says. “To not be part of it makes it much more difficult for US scientists to be at the cutting edge and advance the things we need to do.” 

The AGU also stepped in two months later, after the White House dismissed hundreds of researchers working on the National Climate Assessment, the major federal report, produced roughly every four years, that analyzes the rising dangers of climate change across the country. The AGU and the American Meteorological Society together announced plans to publish a “special collection” to sustain the momentum of that effort.

“It’s incumbent on us to ensure our communities, our neighbors, our children are all protected and prepared for the mounting risks of climate change,” said Brandon Jones, president of the AGU, in an earlier statement.

The AGU declined to discuss the status of the project.

Stopgap solution

The sheer number of programs the White House is going after will require organizations to make hard choices about what they attempt to save and how they go about it. Moreover, relying entirely on nonprofits and companies to take over these federal tasks is not viable over the long term. 

Given the costs of these federal programs, it could prove prohibitive to even keep a minimum viable version of some essential monitoring systems and research programs up and running. Dispersing across various organizations the responsibility of calculating the nation’s emissions sources and sinks also creates concerns about the scientific standards applied and the accessibility of that data, Cleetus says. Plus, moving away from the records that NOAA, NASA, and other agencies have collected for decades would break the continuity of that data, undermining the ability to detect or project trends.

More basically, publishing national emissions data should be a federal responsibility, particularly for the government of the world’s second-largest climate polluter, Cleetus adds. Failing to calculate and share its contributions to climate change sidesteps the nation’s global responsibilities and sends a terrible signal to other countries. 

Poulter stresses that nonprofits and the private sector can do only so much, for so long, to keep these systems up and running.

“We don’t want to give the impression that this greenhouse-gas coalition, if it gets off the ground, is a long-term solution,” he says. “But we can’t afford to have gaps in these data sets, so somebody needs to step in and help sustain those measurements.”

This startup wants to use beams of energy to drill geothermal wells

A beam of energy hit the slab of rock, which quickly began to glow. Pieces cracked off, sparks ricocheted, and dust whirled around under a blast of air. 

From inside a modified trailer, I peeked through the window as a millimeter-wave drilling rig attached to an unassuming box truck melted a hole into a piece of basalt in less than two minutes. After the test was over, I stepped out of the trailer into the Houston heat. I could see a ring of black, glassy material stamped into the slab fragments, evidence of where the rock had melted.  

This rock-melting drilling technology from the geothermal startup Quaise is certainly unconventional. The company hopes it’s the key to unlocking geothermal energy and making it feasible anywhere.

Geothermal power tends to work best in those parts of the world that have the right geology and heat close to the surface. Iceland and the western US, for example, are hot spots for this always-available renewable energy source because they have all the necessary ingredients. But by digging deep enough, companies could theoretically tap into the Earth’s heat from anywhere on the globe.

That’s a difficult task, though. In some places, accessing temperatures high enough to efficiently generate electricity would require drilling miles and miles beneath the surface. Often, that would mean going through very hard rock, like granite.

Quaise’s proposed solution is a new mode of drilling that eschews the traditional technique of scraping into rock with a hard drill bit. Instead, the company plans to use a gyrotron, a device that emits high-frequency electromagnetic radiation. Today, the fusion power industry uses gyrotrons to heat plasma to 100 million °C, but Quaise plans to use them to blast, melt, and vaporize rock. This could, in theory, make drilling faster and more economical, allowing for geothermal energy to be accessed anywhere.  

Since Quaise’s founding in 2018, the company has demonstrated that its systems work in the controlled conditions of the laboratory, and it has started trials in a semi-controlled environment, including the backyard of its Houston headquarters. Now these efforts are leaving the lab, and the team is taking gyrotron drilling technology to a quarry to test it in real-world conditions. 

Some experts caution that reinventing drilling won’t be as simple, or as fast, as Quaise’s leadership hopes. The startup is also attempting to raise a large funding round this year, at a time when economic uncertainty is slowing investment and the US climate technology industry is in a difficult political spot, squeezed by tariffs and a pullback in government support. Quaise’s big idea aims to accelerate an old source of renewable energy. This make-or-break moment might determine how far that idea can go.

Blasting through

Rough calculations from the geothermal industry suggest that enough energy is stored inside the Earth to meet our energy demands for tens or even hundreds of thousands of years, says Matthew Houde, cofounder and chief of staff at Quaise. After that, other sources like fusion should be available, “assuming we continue going on that long, so to speak,” he quips. 

“We want to be able to scale this style of geothermal beyond the locations where we’re able to readily access those temperatures today with conventional drilling,” Houde says. The key, he adds, is simply going deep enough: “If we can scale those depths to 10 to 20 kilometers, then we can enable super-hot geothermal to be worldwide accessible.”

Though that’s technically possible, there are few examples of humans drilling close to this depth. One research project that began in 1970 in the former Soviet Union reached just over 12 kilometers, but it took nearly 20 years and was incredibly expensive. 

Quaise hopes to speed up drilling and cut its cost, Houde says. The company’s goal is to drill through rock at a rate of between three and five meters per hour of steady operation.

One key factor slowing down many operations that drill through hard rocks like granite is nonproductive time. For example, equipment frequently needs to be brought all the way back up to the surface for repairs or to replace drill bits.

Quaise’s key to potentially changing that is its gyrotron. The device emits millimeter waves, beams of energy with wavelengths that fall between microwaves and infrared waves. It’s a bit like a laser, but the beam is not visible to the human eye. 

Quaise’s goal is to heat up the target rock, effectively drilling it away. The gyrotron beams waves at a target rock via a waveguide, a hollow metal tube that directs the energy to the right spot. (One of the company’s main technological challenges is to avoid accidentally making plasma, an ionized, superheated state of matter, as it can waste energy and damage key equipment like the waveguide.)

Here’s how it works in practice: When Quaise’s rig is drilling a hole, the tip of the waveguide is positioned a foot or so away from the rock it’s targeting. The gyrotron lets out a burst of millimeter waves for about a minute. They travel down the waveguide and hit the target rock, which heats up and then cracks, melts, or even vaporizes.

Then the beam stops, and the drill bit at the end of the waveguide is lowered to the surface of the rock, rotating and scraping off broken shards and melted bits of rock as it descends. A steady blast of air carries the debris up to the surface, and the process repeats. The energy in the millimeter waves does the hard work; the scraping and compressed air simply carry the fractured or melted material away.

This system is what I saw in action at the company’s Houston headquarters. The drilling rig in the yard is a small setup, something like what a construction company might use to drill micro piles for a foundation or what researchers would use to take geological samples. In total, the gyrotron has a power of 100 kilowatts. A cooling system helps the superconducting magnet in the gyrotron reach the necessary temperature (about -200 °C), and a filtration system catches the debris that sloughs off samples. 

Quaise truck and mobile drill unit. (Photo: Casey Crownhart)

Soon after my visit, this backyard setup was packed up and shipped to central Texas to be used for further field testing in a rock quarry. The company announced in July that it had used that rig to drill a 100-meter-deep hole at that field test site. 

Quaise isn’t the first to develop nonmechanical drilling, says Roland Horne, head of the geothermal program at Stanford University. “Burning holes in rocks is impressive. However, that’s not the whole of what’s involved in drilling,” he says. The operation will need to be able to survive the high temperatures and pressures at the bottom of wells as they’re drilled, he says.

So far, the company has found success drilling holes into columns of rock inside metal casings, as well as into the quarry rock in its field trials. But there’s a long road between drilling into predictable material in a relatively predictable environment and creating a miles-deep geothermal well.

Rocky roads

In April, Quaise fully integrated its second 100-kilowatt gyrotron onto an oil and gas rig owned by the company’s investor and technology partner Nabors. This rig is the sort that would typically be used for training or engineering development, and it’s set up along with a row of other rigs at the Nabors headquarters, just across town from the Quaise lab. The rig stands 182 feet tall; its top is visible above the office building from the parking lot.

When I visited in April, the company was still completing initial tests, using special thermal paper and firing short blasts to test the setup. In May the company tested this integrated rig, drilling a hole four inches in diameter and 30 feet deep. Another test in June reached a depth of 40 feet. These holes were drilled into columns of basalt that had been lowered into the ground as a test material.

While the company tests its 100-kilowatt systems at the rig and the quarry, the next step is an even larger system, which features a gyrotron that’s 10 times more powerful. This one-megawatt system will drill larger holes, over eight inches across, and represents the commercial-scale version of the company’s technology. Drilling tests are set to begin with this larger drill in 2026. 

The one-megawatt system actually needs a little over three megawatts of power overall, including the energy needed to run support equipment like cooling systems and the compressor that blows air into the hole, carrying the rock dust back up to the surface. That power demand is similar to what an oil and gas rig requires today. 

Quaise is in the process of setting up a pilot plant in Oregon, basically on the side of a volcano, says Trenton Cladouhos, the company’s vice president of geothermal resource development. This project will use conventional drilling, and its main purpose is to show that Quaise can build and run a geothermal plant, Cladouhos says. 

The company is drilling an exploration well this year and plans to begin drilling production wells (those that can eventually be used to generate electricity) in 2026. That pilot project will reach about 20 megawatts of power with the first few wells, operating on rock that’s around 350 °C. The company plans to have it operational as early as 2028.

Quaise’s strategy with the Oregon project is to show that it can use super-hot rocks to produce geothermal power efficiently, says CEO Carlos Araque. After it fires up the plant and begins producing electricity, the company can go back in and deepen the holes with millimeter-wave drilling in the future, he adds.

A drilling test shows Quaise’s millimeter-wave technology drilling into a piece of granite. (Image: Quaise)

Araque says the company already has some customers lined up for the energy it’ll produce, though he declined to name them, saying only that one was a big tech company, and there’s a utility involved as well.

But the startup will need more capital to finish this project and complete its testing with the larger, one-megawatt gyrotron. And uncertainty is hanging over climate tech more broadly, given the Trump administration’s tariffs and its rollback of financial support for the sector (though geothermal has been relatively unscathed).

Quaise still has some technical barriers to overcome before it begins building commercial power plants. 

One potential hurdle: drilling in different directions. Right now, millimeter-wave drilling can go in a straight line, straight down. Developing a geothermal plant like the one at the Oregon site will likely require what’s called directional drilling, the ability to drill in directions other than vertical.

And the company will likely face challenges as it transitions from lab testing to field trials. One key challenge for geothermal technology companies attempting to operate at this depth will be  keeping wells functional for a long time to keep a power plant operating, says Jefferson Tester, a professor at Cornell University and an expert in geothermal energy.

Quaise’s technology is very aspirational, Tester says, and it can be difficult for new ideas in geothermal to compete economically. “It’s eventually all about cost,” he says. And companies with ambitious ideas run the risk that their investors will run out of patience before they can develop their technology enough to make it onto the grid.

“There’s a lot more to learn—I mean, we’re reinventing drilling,” says Steve Jeske, a project manager at Quaise. “It seems like it shouldn’t work, but it does.”

In defense of air-conditioning

I’ll admit that I’ve rarely hesitated to point an accusing finger at air-conditioning. I’ve outlined in many stories and newsletters that AC is a significant contributor to global electricity demand, and it’s only going to suck up more power as temperatures rise.

But I’ll also be the first to admit that it can be a life-saving technology, one that may become even more necessary as climate change intensifies. And in the wake of Europe’s recent deadly heat wave, it’s been oddly villainized.

We should all be aware of the growing electricity toll of air-conditioning, but the AC hate is misplaced. Yes, AC is energy intensive, but so is heating our homes, something that’s rarely decried in the same way that cooling is. Both are tools for comfort and, more important, for safety.  So why is air-conditioning cast as such a villain?

In the last days of June and the first few days of July, temperatures hit record highs across Europe. Over 2,300 deaths during that period were attributed to the heat wave, according to early research from World Weather Attribution, an academic collaboration that studies extreme weather. And human-caused climate change accounted for 1,500 of the deaths, the researchers found. (That is, the number of fatalities would have been under 800 if not for higher temperatures because of climate change.)

We won’t have the official death toll for months, but these early figures show just how deadly heat waves can be. Europe is especially vulnerable, because in many countries, particularly in the northern part of the continent, air-conditioning is not common.

Popping on a fan, drawing the shades, or opening the windows on the hottest days used to cut it in many European countries. Not anymore. The UK was 1.24 °C (2.23 °F) warmer over the past decade than it was between 1961 and 1990, according to the Met Office, the UK’s national climate and weather service. One recent study found that homes across the country are uncomfortably or dangerously warm much more frequently than they used to be.

The reality is, some parts of the world are seeing an upward shift in temperatures that’s not just uncomfortable but dangerous. As a result, air-conditioning usage is going up all over the world, including in countries with historically low rates.

The reaction to this long-term trend, especially in the face of the recent heat wave, has been apoplectic. People are decrying AC across social media and opinion pages, arguing that we need to suck it up and deal with being a little bit uncomfortable.

Now, let me preface this by saying that I do live in the US, where roughly 90% of homes are cooled with air-conditioning today. So perhaps I am a little biased in favor of AC. But it baffles me when people talk about air-conditioning this way.

I spent a good amount of my childhood in the southeastern US, where it’s very obvious that heat can be dangerous. I was used to many days where temperatures were well above 90 °F (32 °C), and the humidity was so high your clothes would stick to you as soon as you stepped outdoors. 

For some people, being active or working in those conditions can lead to heatstroke. Prolonged exposure, even if it’s not immediately harmful, can lead to heart and kidney problems. Older people, children, and those with chronic conditions can be more vulnerable.

In other words, air-conditioning is more than a convenience; in certain conditions, it’s a safety measure. That should be an easy enough concept to grasp. After all, in many parts of the world we expect access to heating in the name of safety. Nobody wants to freeze to death. 

And it’s important to clarify here that while air-conditioning does use a lot of electricity in the US, heating actually has a higher energy footprint. 

In the US, about 19% of residential electricity use goes to air-conditioning. That sounds like a lot, and it’s significantly more than the 12% of electricity that goes to space heating. However, we need to zoom out to get the full picture, because electricity makes up only part of a home’s total energy demand. A lot of homes in the US use natural gas for heating—that’s not counted in the electricity being used, but it’s certainly part of the home’s total energy use.

When we look at the total, space heating accounts for a full 42% of residential energy consumption in the US, while air conditioning accounts for only 9%.
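If the jump from the electricity-only numbers to the whole-home numbers seems confusing, a little arithmetic with the percentages above shows what’s going on. The only assumption I’m adding is that essentially all residential air-conditioning runs on electricity.

    # Back-of-the-envelope check using only the shares quoted above, plus the
    # assumption that essentially all residential AC runs on electricity.

    ac_share_of_electricity = 0.19       # AC as a share of residential electricity
    ac_share_of_total_energy = 0.09      # AC as a share of total residential energy
    heating_share_of_electricity = 0.12  # space heating's share of residential electricity
    heating_share_of_total_energy = 0.42 # space heating's share of total residential energy

    # If AC is 19% of electricity but only 9% of total energy, electricity itself
    # must supply roughly 9/19, or about 47%, of home energy use.
    electricity_share_of_total = ac_share_of_total_energy / ac_share_of_electricity
    print(f"Implied electricity share of home energy: {electricity_share_of_total:.0%}")

    # Electric space heating is then only about 6% of total home energy...
    electric_heating_total = heating_share_of_electricity * electricity_share_of_total
    print(f"Electric space heating, share of total energy: {electric_heating_total:.1%}")

    # ...so most of the 42% heating share has to come from fuels burned on site,
    # mainly natural gas.
    fuel_heating_total = heating_share_of_total_energy - electric_heating_total
    print(f"Heating supplied by on-site fuels: {fuel_heating_total:.1%}")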

I’m not letting AC off the hook entirely here. There’s obviously a difference between running air-conditioning (or other, less energy-intensive technologies) when needed to stay safe and blasting systems at max capacity because you prefer it chilly. And there’s a lot of grid planning we’ll need to do to make sure we can handle the expected influx of air-conditioning around the globe. 

But the world is changing, and temperatures are rising. If you’re looking for a villain, look beyond the air conditioner and into the atmosphere.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

California is set to become the first US state to manage power outages with AI

California’s statewide power grid operator is poised to become the first in North America to deploy artificial intelligence to manage outages, MIT Technology Review has learned. 

“We wanted to modernize our grid operations. This fits in perfectly with that,” says Gopakumar Gopinathan, a senior advisor on power system technologies at the California Independent System Operator—known as the CAISO and pronounced KAI-so. “AI is already transforming different industries. But we haven’t seen many examples of it being used in our industry.” 

At the DTECH Midwest utility industry summit in Minneapolis on July 15, CAISO is set to announce a deal to run a pilot program using new AI software called Genie, from the energy-services giant OATI. The software uses generative AI to carry out real-time analyses for grid operators and comes with the potential to autonomously make decisions about key functions on the grid, a switch that might resemble going from uniformed traffic officers to sensor-equipped stoplights.

But while CAISO may deliver electrons to cutting-edge Silicon Valley companies and laboratories, the actual task of managing the state’s electrical system is surprisingly analog. 

Today, CAISO engineers scan outage reports for keywords about maintenance that’s planned or in the works, read through the notes, and then load each item into the grid software system to run calculations on how a downed line or transformer might affect power supply.

“Even if it takes you less than a minute to scan one on average, when you amplify that over 200 or 300 outages, it adds up,” says Abhimanyu Thakur, OATI’s vice president of platforms, visualization, and analytics. “Then different departments are doing it for their own respective keywords. Now we consolidate all of that into a single dictionary of keywords and AI can do this scan and generate a report proactively.” 
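The workflow Thakur describes, with each department scanning the same reports against its own keyword list, is easy to picture in code. Here’s a hypothetical sketch of the consolidation step; the keywords and the sample report are invented, and this is not OATI’s or CAISO’s actual software.

    # Hypothetical sketch: merge departmental keyword lists into one dictionary,
    # then scan an outage report against it. All keywords and the sample report
    # are invented for illustration.

    DEPARTMENT_KEYWORDS = {
        "transmission": ["line outage", "conductor", "relay"],
        "generation": ["turbine trip", "derate", "boiler"],
        "maintenance": ["planned work", "inspection", "crew"],
    }

    # One consolidated dictionary: keyword -> departments that care about it.
    CONSOLIDATED = {}
    for dept, words in DEPARTMENT_KEYWORDS.items():
        for word in words:
            CONSOLIDATED.setdefault(word, []).append(dept)

    def scan_report(report_text):
        """Return the keywords found in a report and which departments to flag."""
        text = report_text.lower()
        return {kw: depts for kw, depts in CONSOLIDATED.items() if kw in text}

    sample = "Planned work on relay testing; possible line outage window Tuesday."
    print(scan_report(sample))
    # {'line outage': ['transmission'], 'relay': ['transmission'], 'planned work': ['maintenance']}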

If CAISO finds that Genie produces reliable, more efficient data analyses for managing outages, Gopinathan says, the operator may consider automating more functions on the grid. “After a few rounds of testing, I think we’ll have an idea about what is the right time to call it successful or not,” he says. 

Regardless of the outcome, the experiment marks a significant shift. Most grid operators are using the same systems that utilities have used “for decades,” says Richard Doying, who spent more than 20 years as a top executive at the Midcontinent Independent System Operator, the grid operator for an area encompassing 15 states from the upper Midwest down to Louisiana. 

“These organizations are carved up for people working on very specific, specialized tasks and using their own proprietary tools that they’ve developed over time,” says Doying, now a vice president at the consultancy Grid Strategies. “To the extent that some of these new AI tools are able to draw from data across different areas of an organization and conduct more sophisticated analysis, that’s only helpful for grid operators.”

Last year, a Department of Energy report found that AI had potential to speed up studies on grid capacity and transmission, improve weather forecasting to help predict how much energy wind and solar plants would produce at a given time, and optimize planning for electric-vehicle charging networks. Another report by the energy department’s Loan Programs Office concluded that adding more “advanced” technology such as sensors to various pieces of equipment will generate data that can enable AI to do much more over time. 

In April, the PJM Interconnection—the nation’s largest grid system, spanning 13 states along the densely populated mid-Atlantic and Eastern Seaboard—took a big step toward embracing AI by inking a deal with Google to use its Tapestry software to improve regional planning and speed up grid connections for new power generators. 

ERCOT, the Texas grid system, is considering adopting technology similar to what CAISO is now set to use, according to a source with knowledge of the plans who requested anonymity because they were not authorized to speak publicly. ERCOT did not respond to a request for comment. 

Australia offers an example of what the future may look like. In New South Wales, where grid sensors and smart technology are more widely deployed, AI software rolled out in February is now predicting the production and flow of electricity from rooftop solar units across the state and automatically adjusting how much power from those panels can enter the grid. 

Until now, much of the discussion around AI and energy has focused on the electricity demands of AI data centers (check out MIT Technology Review’s Power Hungry series for more on this).

“We’ve been talking a lot about what the grid can do for AI and not nearly as much about what AI can do for the grid,” says Charles Hua, a coauthor of one of last year’s Energy Department reports who now serves as executive director of PowerLines, a nonprofit that advocates for improving the affordability and reliability of US grids. “In general, there’s a huge opportunity for grid operators, regulators, and other stakeholders in the utility regulatory system to use AI effectively and harness it for a more resilient, modernized, and strengthened grid.”

For now, Gopinathan says, he’s remaining cautiously optimistic. 

“I don’t want to overhype it,” he says. 

Still, he adds, “it’s a first step for bigger automation.”

“Right now, this is more limited to our outage management system. Genie isn’t talking to our other parts yet,” he says. “But I see a world where AI agents are able to do a lot more.”

China’s energy dominance in three charts

China is the dominant force in next-generation energy technologies today. It’s pouring hundreds of billions of dollars into putting renewable sources like wind and solar on its grid, manufacturing millions of electric vehicles, and building out capacity for energy storage, nuclear power, and more. This investment has been transformational for the country’s economy and has contributed to establishing China as a major player in global politics. 

Meanwhile, in the US, a massive new tax and spending bill just cut hundreds of billions in credits, grants, and loans for clean energy technologies. It’s a stark reversal from previous policies, and it could have massive effects at a time when it feels as if everyone is chasing China on energy.

So while we all try to get our heads around what’s next for climate tech in the US and beyond, let’s look at just how dominant China is when it comes to clean energy, as documented in three charts.

China is on an absolute tear installing wind and solar power. The country reached nearly 900 gigawatts of installed solar capacity at the end of 2024, and the rapid pace of building has continued into this year. An additional 198 GW was installed between January and May, with 93 GW coming in May alone.

For context, those additions over the first five months of the year account for more than double the capacity of the grid in California. Not the renewables capacity of that state—the entire grid. 

Meanwhile, the policy shift in the US is projected to slow down new solar and wind additions. With tax credits and other support stripped away, much of the new capacity that was expected to come online by the end of the decade will now face delays or cancellations. 

That’s significant because, of all the new electricity generation capacity that’s come online in the US recently, renewables make up the vast majority. Solar and battery storage alone are expected to make up over 80% of capacity additions in 2025. So slowing down wind and solar basically means slowing down adding new electricity capacity, at a time when demand is very much set to rise. (Hello, AI?)

China’s EV market is also booming—the country is currently flirting with a big symbolic milestone, nearing the point where over half of all new vehicles sold in the country are electric. (It already passed that mark for a single month and could do so on a yearly basis in the next couple of years.)

It’s not just selling those vehicles within China, either: the country exports them globally, with customers including established markets like Europe and growing ones like India and Brazil. As of 2024, more than 70% of electric and plug-in hybrid vehicles on roads around the world were built in China. Some leaders at legacy automakers are taking notice. Ford CEO Jim Farley shared some striking comments at the Aspen Ideas Festival last month about how far ahead China is on vehicle technology and price. “They have far superior in-vehicle technology,” Farley said. “We are in a global competition with China, and it’s not just EVs. And if we lose this, we do not have a future Ford.”

Looking ahead, China is still pouring money into renewables, storage, grids, and energy efficiency technologies. It’s also outspending the rest of the world on nuclear power. The country tripled its investment in renewable power from 2015 to 2025.

The situation isn’t set in stone, though: The US actually very briefly overtook China on battery investments over the past year, as Cat Clifford at Cipher reported last week. But changes resulting from the new bill could very quickly reverse that progress, cementing China as the place for battery manufacturing and innovation.

In a story earlier this week, the MIT economist David Autor laid out the high stakes for this race. Advanced manufacturing and technology are beneficial for US prosperity, and putting public support and trade protections in place for key industries could be crucial to keeping them going, he says.  

I’d add that this whole discussion shouldn’t be about a zero-sum competition between the US and China. But many experts argue that the US, where I and many readers live, is surrendering its leadership and ability to develop key energy technologies of the future.  

Ultimately, the numbers don’t lie: By a lot of measures, China is the world’s leader in energy. The question is, will that change anytime soon?  

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Google’s electricity demand is skyrocketing

We got two big pieces of energy news from Google this week. The company announced that it’s signed an agreement to purchase electricity from a fusion company’s forthcoming first power plant. Google also released its latest environmental report, which shows that its energy use from data centers has doubled since 2020.

Taken together, these two bits of news offer a fascinating look at just how desperately big tech companies are hunting for clean electricity to power their data centers as energy demand and emissions balloon in the age of AI. Of course, we don’t know exactly how much of this pollution is attributable to AI because Google doesn’t break that out. (Also a problem!) So, what’s next and what does this all mean? 

Let’s start with fusion: Google’s deal with Commonwealth Fusion Systems is intended to provide the tech giant with 200 megawatts of power. This will come from Commonwealth’s first commercial plant, a facility planned for Virginia that the company refers to as the Arc power plant. The agreement represents half that plant’s planned capacity.

What’s important to note here is that this power plant doesn’t exist yet. In fact, Commonwealth still needs to get its Sparc demonstration reactor, located outside Boston, up and running. That site, which I visited in the fall, should be completed in 2026.

(An aside: This isn’t the first deal between Big Tech and a fusion company. Microsoft signed an agreement with Helion a couple of years ago to buy 50 megawatts of power from a planned power plant, scheduled to come online in 2028. Experts expressed skepticism in the wake of that deal, as my colleague James Temple reported.)

Nonetheless, Google’s announcement is a big moment for fusion, in part because of the size of the commitment and also because Commonwealth, a spinout company from MIT’s Plasma Science and Fusion Center, is seen by many in the industry as a likely candidate to be the first to get a commercial plant off the ground. (MIT Technology Review is owned by MIT but is editorially independent.)

Google leadership was very up-front about the length of the timeline. “We would certainly put this in the long-term category,” said Michael Terrell, Google’s head of advanced energy, in a press call about the deal.

The news of Google’s foray into fusion comes just days after the tech giant’s release of its latest environmental report. While the company highlighted some wins, some of the numbers in this report are eye-catching, and not in a positive way.

Google’s emissions have increased by over 50% since 2019, rising 6% in the last year alone. That’s decidedly the wrong direction for a company that’s set a goal to reach net-zero greenhouse-gas emissions by the end of the decade.

It’s true that the company has committed billions to clean energy projects, including big investments in next-generation technologies like advanced nuclear and enhanced geothermal systems. Those deals have helped dampen emissions growth, but it’s an arguably impossible task to keep up with the energy demand the company is seeing.

Google’s electricity consumption from data centers was up 27% from the year before. It’s doubled since 2020, reaching over 30 terawatt-hours. That’s nearly the annual electricity consumption from the entire country of Ireland.
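Those two growth figures are worth lining up. A doubling over four years works out to roughly 19% average annual growth, so the 27% jump in the latest year means the curve is bending upward rather than flattening. Here’s the arithmetic, using only the numbers above.

    import math

    # Rough arithmetic using only the figures cited above.
    doubling_years = 4    # consumption "doubled since 2020"
    latest_growth = 0.27  # most recent year-over-year increase

    # Doubling over four years implies roughly 19% average annual growth.
    avg_growth = 2 ** (1 / doubling_years) - 1
    print(f"Implied average annual growth since 2020: {avg_growth:.0%}")

    # At the latest 27% pace, consumption would double again in under three years.
    years_to_double = math.log(2) / math.log(1 + latest_growth)
    print(f"Years to double at 27% per year: {years_to_double:.1f}")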

As an outsider, it’s tempting to point the finger at AI, since that technology has crashed into the mainstream and percolated into every corner of Google’s products and business. And yet the report downplays the role of AI. Here’s one bit that struck me:

“However, it’s important to note that our growing electricity needs aren’t solely driven by AI. The accelerating growth of Google Cloud, continued investments in Search, the expanding reach of YouTube, and more, have also contributed to this overall growth.”

There is enough wiggle room in that statement to drive a large electric truck through. When I asked about the relative contributions here, company representative Mara Harris said via email that they don’t break out what portion comes from AI. When I followed up asking if the company didn’t have this information or just wouldn’t share it, she said she’d check but didn’t get back to me.

I’ll make the point here that we’ve made before, including in our recent package on AI and energy: Big companies should be disclosing more about the energy demands of AI. We shouldn’t be guessing at this technology’s effects.

Google has put a ton of effort and resources into setting and chasing ambitious climate goals. But as its energy needs and those of the rest of the industry continue to explode, it’s obvious that this problem is getting tougher, and it’s also clear that more transparency is a crucial part of the way forward.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

This battery recycling company is now cleaning up AI data centers

In a sandy industrial lot outside Reno, Nevada, rows of battery packs that once propelled electric vehicles are now powering a small AI data center.

Redwood Materials, one of the US’s largest battery recycling companies, showed off this array of energy storage modules, sitting on cinder blocks and wrapped in waterproof plastic, during a press tour at its headquarters on June 26. 

The event marked the launch of the company’s new business line, Redwood Energy, which will initially repurpose (rather than recycle) batteries with years of remaining life to create renewable-powered microgrids. Such small-scale energy systems can operate on or off the larger electricity grid, providing electricity for businesses or communities.

Redwood Materials says many of the batteries it takes in for processing retain more than half their capacity. 

“We can extract a lot more value from that material by using it as an energy storage project before recycling it,” JB Straubel, Redwood’s founder and chief executive, said at the event. 

This first microgrid, housed at the company’s facility in the Tahoe Reno Industrial Center, is powered by solar panels and capable of generating 64 megawatt-hours of electricity, making it one of the nation’s largest such systems. That power flows to Crusoe, a cryptocurrency miner that pivoted into developing AI data centers, which has built a facility with 2,000 graphics processing units adjacent to the lot of repurposed EV batteries. 

(That’s tiny as modern data centers go: Crusoe is developing a $500 billion AI data center for OpenAI and others in Abilene, Texas, where it expects to install 100,000 GPUs across its first two facilities by the end of the year, according to Forbes.)

Redwood’s project underscores a growing interest in powering data centers partially or entirely outside the electric grid. Not only would such microgrids be quicker to build than conventional power plants, but consumer ratepayers wouldn’t be on the hook for the cost of new grid-connected power plants developed to serve AI data centers. 

Since Redwood’s batteries are used, and have already been removed from vehicles, the company says its microgrids should also be substantially cheaper than ones assembled from new batteries.

A close-up of a grid of battery packs from Redwood Materials.

COURTESY REDWOOD MATERIALS

Redwood Energy’s microgrids could generate electricity for any kind of operation. But the company stresses they’re an ideal fit for addressing the growing energy needs and climate emissions of data centers. The energy consumption of such facilities could double by 2030, mainly due to the ravenous appetite of AI, according to an April report by the International Energy Agency.

“Storage is this perfectly positioned technology, especially low-cost storage, to attack each of those problems,” Straubel says.

The Tahoe Reno Industrial Center is the epicenter of a data center development boom in northern Nevada that has sparked growing concerns about climate emissions and excessive demand for energy and water, as MIT Technology Review recently reported.

Straubel says the data centers emerging around the industrial center “would be logical targets” for Redwood’s new business line, but he adds that there are growth opportunities across the expanding data center clusters in Texas, Virginia, and the Midwest as well.

“We’re talking to a broad cross section of those companies,” he says.

Crusoe, which also provides cloud services, recently announced a joint venture with the investment firm Engine No. 1 to provide “powered data center real estate solutions” to AI companies by constructing 4.5 gigawatts of new natural-gas plants.

Redwood’s microgrid should provide more than 99% of the electricity Crusoe’s local facilities need. In the event of extended periods with little sunlight, a rarity in the Nevada desert, the company could still draw from the standard power grid.
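For a sense of scale, here’s a rough, heavily hedged sketch of why a battery bank that size could carry a small GPU cluster through the night. The per-GPU power budget, including cooling and other overhead, is my assumption, not a figure from Redwood or Crusoe.

```python
# Rough sizing sketch: could ~64 MWh of storage carry a 2,000-GPU facility overnight?
# The per-GPU power budget (including cooling and other overhead) is an assumption,
# not a figure from Redwood or Crusoe.

gpu_count = 2_000
power_per_gpu_kw = 1.2                                     # assumed all-in draw per GPU
facility_load_mw = gpu_count * power_per_gpu_kw / 1_000    # ~2.4 MW

storage_mwh = 64                                           # reported storage capacity
hours_of_autonomy = storage_mwh / facility_load_mw
print(f"Facility load: {facility_load_mw:.1f} MW")
print(f"Hours the batteries alone could cover it: {hours_of_autonomy:.0f}")  # ~27
```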

Cully Cavness, cofounder and operating chief of Crusoe, says the company is already processing AI queries and producing conclusions for its customers at the Nevada facility. (Its larger data centers are dedicated to the more computationally intensive process of training AI models.)

Redwood’s new business division offers a test case for a strategy laid out in a paper late last year, which highlighted the potential for solar-powered microgrids to supply the energy that AI data centers need.

The authors of that paper found that microgrids could be built much faster than natural-gas plants and would generally be only a little more expensive as an energy source for data centers, so long as the facilities could occasionally rely on natural-gas generators to get them through extended periods of low sunlight.

If solar-powered microgrids were used to power 30 gigawatts of new AI data centers, with just 10% backup from natural gas, it would eliminate 400 million tons of carbon dioxide emissions relative to running the centers entirely on natural gas, the study found. 

“Having a data center running off solar and storage is more or less what we were advocating for in our paper,” says Zeke Hausfather, climate lead at the payments company Stripe and a coauthor of the paper. He hopes that Redwood’s new microgrid will establish that “these sorts of systems work in the real world” and encourage other data center developers to look for similar solutions. 

Redwood Materials says electric vehicles are its fastest-growing source of used batteries, and it estimates that more than 100,000 EVs will come off US roads this year.

The company says it tests each battery to determine whether it can be reused. Those that qualify will be integrated into its modular storage systems, which can then store energy from wind and solar installations or connect to the grid. As those batteries reach the end of their life, they’ll be swapped out of the microgrids and moved into the company’s recycling process.

Redwood says it already has enough reusable batteries to build a gigawatt-hour’s worth of microgrids, capable of powering a little more than a million homes for an hour. In addition, the company’s new division has begun designing microgrids that are 10 times larger than the one it unveiled this week.
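For a rough sense of where that “million homes for an hour” figure comes from, here’s a back-of-envelope sketch. The assumed average household draw of about 1 kilowatt is a round number close to typical US consumption, not something Redwood has specified.

```python
# Back-of-envelope check on "a gigawatt-hour can power about a million homes for an hour".
# The average household draw below is an assumption, not a figure from Redwood.

storage_energy_kwh = 1_000_000        # 1 GWh expressed in kilowatt-hours
avg_home_draw_kw = 1.0                # assumed average household draw

homes_for_one_hour = storage_energy_kwh / avg_home_draw_kw
print(f"Homes powered for one hour: {homes_for_one_hour:,.0f}")  # ~1,000,000
```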

Straubel expects Redwood Energy to become a major business line, conceivably surpassing the company’s core recycling operation someday.

“We’re confident this is the lowest-cost solution out there,” he says.

It’s officially summer, and the grid is stressed

It’s crunch time for the grid this week. As I’m writing this newsletter, it’s 100 °F (nearly 38 °C) here in New Jersey, and I’m huddled in the smallest room in my apartment with the shades drawn and a single window air conditioner working overtime.  

Large swaths of the US have seen brutal heat this week, with temperatures nearing or breaking records for multiple days in a row. Spain recently went through a dramatic heat wave too, as did the UK, which is unfortunately bracing for another one soon. As I’ve been trying to stay cool, I’ve had my eyes on a website tracking electricity demand, which is also hitting record highs.

We rely on electricity to keep ourselves comfortable, and more to the point, safe. These are the moments we design the grid for: when need is at its very highest. The key to keeping everything running smoothly during these times might be just a little bit of flexibility. 

While heat waves happen all over the world, let’s take my local grid as an example. I’m one of the roughly 65 million people covered by PJM Interconnection, the largest grid operator in the US. PJM covers Virginia, West Virginia, Ohio, Pennsylvania, and New Jersey, as well as all or parts of several neighboring states.

Earlier this year, PJM forecast that electricity demand would peak at 154 gigawatts (GW) this summer. On Monday, just a few days past the official start of the season, the grid blew past that, averaging over 160 GW between 5 p.m. and 6 p.m. 

The fact that we’ve already passed both last year’s peak and this year’s forecasted one isn’t necessarily a disaster (PJM says the system’s total capacity is over 179 GW this year). But it is a good reason to be a little nervous. Usually, PJM sees its peak in July or August. As a reminder, it’s June. So we shouldn’t be surprised if we see electricity demand creep to even higher levels later in the summer.

It’s not just PJM, either. MISO, the grid that covers most of the Midwest and part of the US South, put out a notice that it expected to be close to its peak demand this week. And the US Department of Energy released an emergency order for parts of the Southeast, which allows the local utility to boost generation and skirt air pollution limits while demand is high.

This pattern of maxing out the grid is only going to continue. That’s because climate change is pushing temperatures higher, and electricity demand is simultaneously swelling (in part because of data centers like those that power AI). PJM’s forecasts show that the summer peak in 2035 could reach nearly 210 GW, well beyond the 179 GW it can provide today. 

Of course, we need more power plants to be built and connected to the grid in the coming years (at least if we don’t want to keep ancient, inefficient, expensive coal plants running, as we covered last week). But there’s a quiet strategy that could limit the new construction needed: flexibility.

The power grid has to be built for moments of the absolute highest demand we can predict, like this heat wave. But most of the time, a decent chunk of capacity that exists to get us through these peaks sits idle—it only has to come online when demand surges. Another way to look at that, however, is that by shaving off demand during the peak, we can reduce the total infrastructure required to run the grid. 

If you live somewhere that’s seen a demand crunch during a heat wave, you might have gotten an email from your utility asking you to hold off on running the dishwasher in the early evening or to set your air conditioner a few degrees higher. These are called demand response programs. Some utilities run more formal versions, paying customers to ramp down their usage during periods of peak demand.

PJM’s demand response programs add up to almost eight gigawatts—enough to supply over 6 million homes. With these programs, PJM essentially avoids having to fire up the equivalent of multiple massive nuclear power plants. (It did activate these programs on Monday afternoon, during the hottest part of the day.)
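The same kind of back-of-envelope math sits behind that “over 6 million homes” comparison; the average household draw here is again my assumption, not PJM’s.

```python
# Rough check on "almost eight gigawatts ... enough to supply over 6 million homes".
# The assumed average household draw is mine, not PJM's.

demand_response_gw = 8.0      # PJM's demand response capacity
avg_home_draw_kw = 1.2        # assumed average household draw

homes_supplied = demand_response_gw * 1_000_000 / avg_home_draw_kw
print(f"Homes supplied: {homes_supplied:,.0f}")  # roughly 6.7 million
```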

As electricity demand goes up, building in and automating this sort of flexibility could go a long way to reducing the amount of new generation needed. One report published earlier this year found that if data centers agreed to have their power curtailed for just 0.5% of the time (around 40 hours out of a year of continuous operation), the grid could handle about 18 GW of new power demand in the PJM region without adding generation capacity. 

For the whole US, this level of flexibility would allow the grid to take on an additional 98 gigawatts of new demand without building any new power plants to meet it. To give you a sense of just how significant that would be, all the nuclear reactors in the US add up to 97 gigawatts of capacity.
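To make those flexibility figures concrete, here’s a quick sketch of the arithmetic; the only inputs are the numbers quoted above.

```python
# Curtailment of 0.5% of a year of continuous operation, in hours:
hours_per_year = 8_760
curtailed_hours = 0.005 * hours_per_year
print(f"Curtailed hours per year: {curtailed_hours:.0f}")  # ~44, i.e. "around 40"

# Headroom the report finds that flexibility would create, versus the US nuclear fleet:
new_demand_pjm_gw = 18
new_demand_us_gw = 98
us_nuclear_capacity_gw = 97
print(f"US-wide headroom: {new_demand_us_gw} GW vs. ~{us_nuclear_capacity_gw} GW of nuclear capacity")
```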

Tweaking thermostats and ramping down data centers during hot summer days won’t solve the demand crunch on their own, but more flexibility certainly won’t hurt.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.