This American nuclear company could help India’s thorium dream

For just the second time in nearly two decades, the United States has granted an export license to an American company planning to sell nuclear technology to India, MIT Technology Review has learned. The decision to greenlight Clean Core Thorium Energy’s license is a major step toward closer cooperation between the two countries on atomic energy and marks a milestone in the development of thorium as an alternative to uranium for fueling nuclear reactors. 

Under the license, issued last week, the Chicago-based company can ship its thorium fuel to India, where it could be loaded into the cores of existing reactors. Once Clean Core receives final approval from Indian regulators, it will become one of the first American companies to sell nuclear technology to India, just as the world’s most populous nation begins relaxing the strict rules that have long kept the US private sector out of its atomic power industry.

“This license marks a turning point, not just for Clean Core but for the US-India civil nuclear partnership,” says Mehul Shah, the company’s chief executive and founder. “It places thorium at the center of the global energy transformation.”

Thorium has long been seen as a good alternative to uranium because it’s more abundant, produces smaller amounts of long-lived radioactive waste with fewer byproducts that remain dangerous for centuries, and reduces the risk that materials from the fuel cycle will be diverted into weapons manufacturing.

But at least some uranium fuel is needed to make thorium atoms split, making it an imperfect replacement. It’s also less well suited for use in the light-water reactors that power the vast majority of commercial nuclear plants worldwide. And in any case, the complex, highly regulated nuclear industry is extremely resistant to change.

For India, which has scant uranium reserves but abundant deposits of thorium, the latter metal has been part of a long-term strategy for reducing dependence on imported fuels. The nation started negotiating a nuclear export treaty with the US in the early 2000s, and a 123 Agreement—a special, Senate-approved treaty the US requires with another country before sending it any civilian nuclear products—was approved in 2008.

A new approach

While most thorium advocates have envisioned new reactors designed to run on this fuel, which would mean rebuilding the nuclear industry from the ground up, Shah and his team took a different approach. Clean Core created a new type of fuel that blends thorium with a more concentrated type of uranium called HALEU (high-assay low-enriched uranium). This blended fuel can be used in India’s pressurized heavy-water reactors, which make up the bulk of the country’s existing fleet and many of the new units under development now. 

Thorium isn’t a fissile material itself, meaning its atoms aren’t inherently unstable enough for an extra neutron to easily split the nuclei and release energy. But the metal has what’s known as “fertile properties,” meaning it can absorb neutrons and transform into the fissile material uranium-233. Uranium-233 produces fewer long-lived radioactive isotopes than the uranium-235 that makes up the fissionable part of traditional fuel pellets. Most commercial reactors run on low-enriched uranium, which is about 5% U-235. When the fuel is spent, roughly 95% of the energy potential is left in the metal. And what remains is a highly toxic cocktail of long-lived radioactive isotopes such as cesium-137 and plutonium-239, which keep the waste dangerous for tens of thousands of years. Another concern is that the plutonium could be extracted for use in weapons. 
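For readers who want the conversion spelled out, the fertile-to-fissile chain described above can be written as a simple reaction sequence (the half-lives are standard reference values, not figures from the article):

$$
{}^{232}\text{Th} + n \;\longrightarrow\; {}^{233}\text{Th} \;\xrightarrow{\ \beta^-,\ \sim 22\ \text{min}\ }\; {}^{233}\text{Pa} \;\xrightarrow{\ \beta^-,\ \sim 27\ \text{days}\ }\; {}^{233}\text{U}
$$

The uranium-233 at the end of that chain is the fissile material that actually sustains the reaction.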

Enriched up to 20%, HALEU allows reactors to extract more of the available energy and thus reduce the volume of waste. Clean Core’s fuel goes further: The HALEU provides the initial spark that ignites the fertile thorium, triggering a reaction that can burn much hotter and use up the vast majority of the material in the core, as a study published last year in the journal Nuclear Engineering and Design showed.

“Thorium provides attributes needed to achieve higher burnups,” says Koroush Shirvan, an MIT professor of nuclear science and engineering who helped design Clean Core’s fuel assemblies. “It is enabling technology to go to higher burnups, which reduces your spent fuel volume, increases your fuel efficiency, and reduces the amount of uranium that you need.” 

Compared with traditional uranium fuel, Clean Core says, its fuel reduces waste by more than 85% while avoiding the most problematic isotopes produced during fission. “The result is a safer, more sustainable cycle that reframes nuclear power not as a source of millennia-long liabilities but as a pathway to cleaner energy and a viable future fuel supply,” says Milan Shah, Clean Core’s chief operating officer and Mehul’s son.

Pressurized heavy-water reactors are particularly well suited to thorium because heavy water—a version of H2O that has an extra neutron on the hydrogen atom—absorbs fewer neutrons during the fission process, increasing efficiency by allowing more neutrons to be captured by the thorium.

There are 46 so-called PHWRs operating worldwide: 17 in Canada, 19 in India, three each in Argentina and South Korea, and two each in China and Romania, according to data from the International Atomic Energy Agency. In 1954, India set out a three-stage development plan for nuclear power that involved eventually phasing thorium into the fuel cycle for its fleet. 

Yet in the 56 years since India built its first commercial nuclear plant, its state-controlled industry has remained relatively shut off to the private sector and the rest of the world. When the US signed the 123 Agreement with India in 2008, the moment heralded an era in which the subcontinent could become a testing ground for new American reactor designs. 

In 2010, however, India passed the Civil Liability for Nuclear Damage Act. The legislation was a response to what lawmakers saw as legal shortcomings exposed by the 1984 Bhopal chemical factory disaster, when Union Carbide, the American company later acquired by the industrial giant Dow Chemical, avoided major payouts to the victims of a catastrophe that killed thousands. Under this law, responsibility for an accident at an Indian nuclear plant would fall on suppliers. The statute effectively killed any exports to India, since few companies could shoulder that burden. Only Russia’s state-owned Rosatom charged ahead with exporting reactors to India.

But things are changing. In a joint statement issued after a February 2025 summit, Prime Minister Narendra Modi and President Donald Trump “announced their commitment to fully realise the US-India 123 Civil Nuclear Agreement by moving forward with plans to work together to build US-designed nuclear reactors in India through large scale localisation and possible technology transfer.” 

In March 2025, US federal officials gave the nuclear developer Holtec International an export license to sell Indian companies its as-yet-unbuilt small modular reactors, which are based on the light-water reactor design used in the US. In April, the Indian government suggested it would reform the nuclear liability law to relax rules on foreign companies in hopes of drawing more overseas developers. Last month, a top minister confirmed that the Modi administration would overhaul the law. 

“For India, the thing they need to do is get another international vendor in the marketplace,” says Chris Gadomski, the chief nuclear analyst at the consultancy BloombergNEF.

Path of least resistance

But Shah sees larger potential for Clean Core. Unlike Holtec, whose export license was endorsed by the two Mumbai-based industrial giants Larsen & Toubro and Tata Consulting Engineers, Clean Core had its permit approved by two of India’s atomic regulators and its main state-owned nuclear company. By focusing on fuel rather than new reactors, Clean Core could become a vendor to the majority of the existing plants already operating in India. 

Its technology diverges not only from that of other US nuclear companies but also from the approach used in China. Last year, China made waves by bringing its first thorium-fueled reactor online. This enabled it to establish a new foothold in a technology the US had invented and then abandoned, and it gave Beijing another leg up in atomic energy.

But scaling that technology will require building out a whole new kind of reactor. That comes at a cost. A recent Johns Hopkins University study found that China’s success in building nuclear reactors stemmed in large part from standardization and repetition of successful designs, virtually all of which have been light-water reactors. Using thorium in existing heavy-water reactors lowers the bar for popularizing the fuel, according to the younger Shah. 

“We think ours is the path of least resistance,” Milan Shah says. “Maybe not being completely revolutionary in the way you look at nuclear today, but incredibly evolutionary to progress humanity forward.” 

The company has plans to go beyond pressurized heavy-water reactors. Within two years, the elder Shah says, Clean Core plans to design a version of its fuel that could work in the light-water reactors that make up the entire US fleet of 94. But it’s not a simple conversion. For starters, there’s the size: While the PHWR fuel rods are about 50 centimeters long, the rods that go into light-water reactors are roughly four meters. Then there’s a long-standing challenge: Light water absorbs neutrons that the thorium would otherwise capture on its way to becoming fissile fuel.

For Anil Kakodkar, the former chairman of India’s Atomic Energy Commission and a mentor to Shah, popularizing thorium could help rectify one of the darker chapters in his country’s nuclear development. In 1974, India became the first country to test an atomic weapon after the signing of the global Treaty on the Non-Proliferation of Nuclear Weapons. New Delhi was never a signatory to the pact. But the milestone prompted neighboring Pakistan to develop its own weapons.

In response, President Jimmy Carter tried to demonstrate Washington’s commitment to reversing the Cold War arms race by sacrificing the first US effort to commercialize nuclear waste recycling, since the technology to separate plutonium and other radioisotopes from uranium in spent fuel was widely seen as a potential new source of weapons-grade material. By running its own reactors on thorium, Kakodkar says, India can chart a new path for newcomer nations that want to harness the power of the atom without stoking fears that nuclear weapons capability will spread. 

“The proliferation concerns will be dismissed to a significant extent, allowing more rapid growth of nuclear power in emerging countries,” he says. “That will be a good thing for the world at large.” 

Alexander C. Kaufman is a reporter who has covered energy, climate change, pollution, business, and geopolitics for more than a decade. 

Google’s still not giving us the full picture on AI energy use

Google just announced that a typical query to its Gemini app uses about 0.24 watt-hours of electricity. That’s about the same as running a microwave for one second—something that, to me, feels virtually insignificant. I run the microwave for so many more seconds than that on most days.

I was excited to see this report come out, and I welcome more openness from major players in AI about their estimated energy use per query. But I’ve noticed that some folks are taking this number and using it to conclude that we don’t need to worry about AI’s energy demand. That’s not the right takeaway here. Let’s dig into why.

1. This one number doesn’t reflect all queries, and it leaves out cases that likely use much more energy.

Google’s new report considers only text queries. Previous analysis, including MIT Technology Review’s reporting, suggests that generating a photo or video will typically use more electricity.

When I spoke with Jeff Dean, Google’s chief scientist, he said the company doesn’t currently have plans to do this sort of analysis for images and videos, but that he wouldn’t rule it out.

The reason the company started with text prompts is that those are something many people out there are using in their daily lives, he says, while image and video generation is something that not as many people are doing. But I’m seeing more AI images and videos all over my social feeds. So there’s a whole world of queries not represented here.

Also, this estimate is the median, meaning it’s just the number in the middle of the range of queries Google is seeing. Longer questions and responses can push up the energy demand, and so can using a reasoning model. We don’t know how much energy these more complicated queries demand, or what the distribution of the range looks like.

2. We don’t know how many queries Gemini is seeing, so we don’t know the product’s total energy impact.

One of my biggest outstanding questions about Gemini’s energy use is the total number of queries the product is seeing every day. 

This number isn’t included in Google’s report, and the company wouldn’t share it with me. And let me be clear: I absolutely pestered them about this, both in a press call they had about the news and in my interview with Dean. In the press call, the company pointed me to a recent earnings report, which includes only figures about monthly active users (450 million, for what it’s worth).

“We’re not comfortable revealing that for various reasons,” Dean told me on our call. The total number is an abstract measure that changes over time, he says, adding that the company wants users to be thinking about the energy usage per prompt.

But there are people out there all over the world interacting with this technology, not just me—and what we all add up to seems quite relevant.

OpenAI does share its total publicly, saying recently that ChatGPT sees 2.5 billion queries every day. So for the curious, we can use this as an example and take the company’s self-reported average energy use per query (0.34 watt-hours) to get a rough idea of the total for all people prompting ChatGPT.

According to my math, over the course of a year, that would add up to over 300 gigawatt-hours—the same as powering nearly 30,000 US homes annually. When you put it that way, it starts to sound like a lot of seconds in microwaves.
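For anyone who wants to check that math, here’s the back-of-envelope version. The query volume and per-query figures come from OpenAI’s own statements cited above; the household figure of roughly 10,500 kilowatt-hours per year is an outside assumption (a commonly cited US average), not something either company reported:

```python
# Back-of-envelope estimate of ChatGPT's annual energy use,
# using the self-reported figures cited above.
queries_per_day = 2.5e9   # OpenAI's reported daily query count
wh_per_query = 0.34       # OpenAI's self-reported average per query

wh_per_year = queries_per_day * wh_per_query * 365
gwh_per_year = wh_per_year / 1e9             # 1 GWh = 1 billion Wh
print(f"{gwh_per_year:.0f} GWh per year")    # ~310 GWh

# Assumption: a typical US home uses ~10,500 kWh per year.
home_kwh_per_year = 10_500
homes = gwh_per_year * 1e6 / home_kwh_per_year  # 1 GWh = 1 million kWh
print(f"~{homes:,.0f} US homes")                # ~30,000 homes
```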

3. AI is everywhere, not just in chatbots, and we’re often not even conscious of it.

AI is touching our lives even when we’re not looking for it. AI summaries appear in web searches, whether you ask for them or not. Built-in features in email and texting applications can draft or summarize messages for you.

Google’s estimate is strictly for Gemini apps and wouldn’t include many of the other ways that even this one company is using AI. So even if you’re trying to think about your own personal energy demand, it’s increasingly difficult to tally up. 

To be clear, I don’t think people should feel guilty for using tools that they find genuinely helpful. And ultimately, I don’t think the most important conversation is about personal responsibility. 

There’s a tendency right now to focus on the small numbers, but we need to keep in mind what this is all adding up to. Over two gigawatts of natural-gas generating capacity will need to come online in Louisiana to power a single Meta data center this decade. Google Cloud is spending $25 billion on AI just in the PJM grid region on the US East Coast. By 2028, AI could account for 326 terawatt-hours of annual electricity demand in the US, generating over 100 million metric tons of carbon dioxide.

We need more reporting from major players in AI, and Google’s recent announcement is one of the most transparent accounts yet. But one small number doesn’t negate the ways this technology is affecting communities and changing our power grid. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

How lidar measures the cost of climate disasters

The wildfires that swept through Los Angeles County in January 2025 left an indelible mark on the Southern California landscape. The Eaton and Palisades fires raged for 24 days, killing 29 people and destroying 16,000 structures, with losses estimated at $60 billion. More than 55,000 acres were consumed, and the landscape itself was physically transformed.

Researchers are now using lidar (light detection and ranging) technology to precisely measure these changes in the landscape’s geometry—helping them understand the effects of climate disasters.

Lidar, which measures how long it takes for pulses of laser light to bounce off surfaces and return, has been used in topographic mapping for decades. Today, airborne lidar from planes and drones maps the Earth’s surface in high detail. Scientists can then “diff” the data—compare before-and-after snapshots and highlight all the changes—to identify more subtle consequences of a disaster, including fault-line shifts, volcanic eruptions, and mudslides.

Falko Kuester, an engineering professor at the University of California, San Diego, co-directs ALERTCalifornia, a public safety program that uses real-time remote sensing to help detect wildfires. Kuester says lidar snapshots can tell a story over time.

“They give us a lay of the land,” he says. “This is what a particular region has been like at this point in time. Now, if you have consecutive flights at a later time, you can do a ‘difference.’ Show me what it looked like. Show me what it looks like. Tell me what changed. Was something constructed? Something burned down? Did something fall down? Did vegetation grow?” 

Shortly after the fires were contained in late January 2025, ALERTCalifornia sponsored new lidar flights over the Eaton and Palisades burn areas. NV5, an inspection and engineering firm, conducted the scans, and the US Geological Survey is now hosting the public data sets.  

Comparing a 2016 lidar snapshot and the January 2025 snapshot, Cassandra Brigham and her team at Arizona State University visualized the elevation changes—revealing the buildings, trees, and structures that had disappeared.

“We said, what would be a useful product for people to have as quickly as possible, since we’re doing this a couple weeks after the end of the fires?” says Brigham. Her team cleaned and reformatted the older, lower-resolution data and then differenced it against the newer scans. The resulting visualizations reveal the scale of devastation in ways satellite imagery can’t match. Red shows lost elevation (like when a building burns), and blue shows a gain (such as tree growth or new construction).
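For the technically curious, here’s a minimal sketch of what that differencing step looks like in code. The file names, the color scale, and the rasterio workflow are illustrative assumptions, not the ASU team’s actual pipeline; real workflows also involve reprojection, alignment, and nodata handling that this sketch assumes is already done:

```python
# Sketch: difference two co-registered lidar-derived elevation models
# and render the change map described above (red = loss, blue = gain).
import rasterio
import matplotlib.pyplot as plt

with rasterio.open("dem_2016.tif") as pre, rasterio.open("dem_2025.tif") as post:
    elev_pre = pre.read(1).astype(float)    # band 1: elevation in meters
    elev_post = post.read(1).astype(float)  # assumes same grid and extent

# Negative values mean elevation was lost (a burned building, a felled
# tree); positive values mean it was gained (new growth, construction).
change = elev_post - elev_pre

# A diverging colormap centered on zero: red below, blue above.
plt.imshow(change, cmap="RdBu", vmin=-10, vmax=10)
plt.colorbar(label="Elevation change (m)")
plt.title("Post-fire minus pre-fire elevation")
plt.show()
```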

Lidar is helping scientists track the cascading effects of climate-­driven disasters—from the damage to structures and vegetation destroyed by wildfires to the landslides and debris flows that often follow in their wake. “For the Eaton and Palisades fires, for example, entire hillsides burned. So all of that vegetation is removed,” Kuester says. “Now you have an atmospheric river coming in, dumping water. What happens next? You have debris flows, mud flows, landslides.” 

Lidar’s usefulness for quantifying the costs of climate disasters underscores its value in preparing for future fires, floods, and earthquakes. But as policymakers weigh steep budget cuts to scientific research, these crucial lidar data collection projects could face an uncertain future.

Jon Keegan writes about technology and AI, and he publishes Beautiful Public Data (beautifulpublicdata.com), a curated collection of government data sets.

Why recycling isn’t enough to address the plastic problem

I remember using a princess toothbrush when I was little. The handle was purple, teal, and sparkly. Like most of the other pieces of plastic that have ever been made, it’s probably still out there somewhere, languishing in a landfill. (I just hope it’s not in the ocean.)

I’ve been thinking about that toothbrush again this week after UN talks about a plastic treaty broke down on Friday. Nations had gotten together to try to write a binding treaty to address plastic waste, but negotiators left without a deal.

Plastic is widely recognized as a huge source of environmental pollution—again, I’m wondering where that toothbrush is—but the material is also a contributor to climate change. Let’s dig into why talks fell apart and how we might address emissions from plastic.

I’ve defended plastic before in this newsletter (sort of). It’s a wildly useful material, integral in everything from glasses lenses to IV bags.

But the pace at which we’re producing and using plastic is absolutely bonkers. Plastic production has increased at an average rate of 9% every year since 1950. Production hit 460 million metric tons in 2019. And an estimated 52 million metric tons are dumped into the environment or burned each year.

So, in March 2022, the UN Environment Assembly set out to develop an international treaty to address plastic pollution. Pretty much everyone should agree that a bunch of plastic waste floating in the ocean is a bad thing. But as we’ve learned over the past few years, as these talks developed, opinions diverge on what to do about it and how any interventions should happen.

One phrase that’s become quite contentious is the “full life cycle” of plastic. Basically, some groups are hoping to go beyond efforts to address just the end of the plastic life cycle (collecting and recycling it) by pushing for limits on plastic production. There was even talk at the Assembly of a ban on single-use plastic.

Petroleum-producing nations strongly opposed production limits in the talks. Representatives from Saudi Arabia and Kuwait told the Guardian that they considered limits to plastic production outside the scope of talks. The US reportedly also slowed down talks and proposed to strike a treaty article that references the full life cycle of plastics.

Petrostates have a vested interest because oil, natural gas, and coal are all burned for energy used to make plastic, and they’re also used as raw materials. This stat surprised me: 12% of global oil demand and over 8% of natural-gas demand go toward plastic production.

That translates into a lot of greenhouse gas emissions. One report from Lawrence Berkeley National Lab found that plastics production accounted for 2.24 billion metric tons of carbon dioxide emissions in 2019—that’s roughly 5% of the global total.  

And looking into the future, emissions from plastics are only set to grow. Another estimate, from the Organisation for Economic Co-operation and Development, projects that emissions from plastics could swell from about 2 billion metric tons to 4 billion metric tons by 2060.

That trajectory is what really strikes me and makes the conclusion of the plastic treaty talks such a disappointment.

Recycling is a great tool, and new methods could make it possible to recycle more plastics and make it easier to do so. (I’m particularly interested in efforts to recycle a mix of plastics, cutting down on the slow and costly sorting process.)

But just addressing plastic at its end of life won’t be enough to address the climate impacts of the material. Most emissions from plastic come from making it. So we need new ways to make plastic, using different ingredients and fuels to take oil and gas out of the equation. And we need to be smarter about the volume of plastic we produce.  

One positive note here: The plastic treaty isn’t dead, just on hold for the moment. Officials say that there’s going to be an effort to revive the talks.

Less than 10% of plastic that’s ever been produced has been recycled. Whether it’s a water bottle, a polyester shirt you wore a few times, or a princess toothbrush from when you were a kid, it’s still out there somewhere in a landfill or in the environment. Maybe you already knew that. But also consider this: The greenhouse gases emitted to make the plastic are still in the atmosphere, too, contributing to climate change. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

How to make clean energy progress under Trump in the states—blue and red alike

The second Trump administration is proving to be more disastrous for the climate and the clean energy economy than many had feared. 

Donald Trump’s One Big Beautiful Bill Act repealed most of the clean energy incentives in former president Joe Biden’s Inflation Reduction Act. Meanwhile, his EPA administrator moved to revoke the endangerment finding, the legal basis for federal oversight of greenhouse gases. For those of us who have been following policy developments in this area closely, nearly every day brings a new blow to past efforts to salvage our climate and to build the clean energy economy of the future.


Heat Exchange

MIT Technology Review’s guest opinion series, offering expert commentary on legal, political and regulatory issues related to climate change and clean energy. You can read the rest of the pieces here.


This has left many in the climate and clean energy communities wondering: What do we do now? The answer, I would argue, is to return to state capitals, a policymaking venue that climate and renewable energy advocates already know well. This can be done strategically, focusing on a handful of key states rather than all 50.

But I have another piece of advice: Don’t get too caught up in “red states” versus “blue states” when considering which states to target. American politics is being remade before our eyes, and long-standing policy problems are being redefined and reframed.  

Let’s take clean energy, for example. Yes, shifting away from carbon-spewing resources is about slowing down climate change, and for some this is the single most important motivation for pursuing it. But it also can be about much more. 

The case can be made just as forcefully—and perhaps more effectively—that shifting to clean energy advances affordability at a time when electricity bills are skyrocketing. It promotes energy freedom by resisting monopolistic utilities’ ownership and gatekeeping of the grid. It increases reliability as battery storage reaches new heights and renewable sources and baseload power plants like nuclear or natural gas facilities (some of which we certainly do and will need) increasingly complement one another. And it drives job creation and economic development. 

Talking about clean energy policy in these ways is safer from ideological criticisms of “climate alarmism.” Research reported in my forthcoming book, Owning the Green Grid, shows that this framing has historically been effective in red states. In addition, using the arguments above to promote all forms of energy can allow clean energy proponents to reclaim a talking point deployed in a previous era by the political right: a true “all-of-the-above” approach to energy policy.

Every energy technology—gas, nuclear, wind, solar, geothermal and storage, among others—has its own set of strengths and weaknesses. But combining them enhances overall grid performance, delivering more than the sum of their individual parts.

To be clear, this is not the approach of the current national administration in Washington, DC. Its policies have picked winners (coal, oil, and natural gas) and losers (solar and wind) among energy technologies—ironically, given conservative claims of blue states having done so in the past. Yet a true all-of-the-above approach can now be sold in state capitals throughout the country, in red states and even in fossil-fuel producing states. 

To be sure, the Trump-led Republican party has taken such extreme measures that it will constrain certain state policymaking possibilities. Notably, in May the US Senate voted to block waivers allowing California to phase out gas guzzlers in the state, over the objections of the Senate parliamentarian. The fiscal power of the federal government is also immense. But there are a variety of other ways to continue to make state-level progress on greenhouse gas emissions.

State and local advocacy efforts are nothing new for the clean energy community. For decades before the Inflation Reduction Act, the states were the primary locus of activity for clean energy policy. But in recent years, some have suggested that Democratic state governments are a necessary prerequisite to making meaningful state-level progress. This view is limiting, and it perpetuates a false—or at least unnecessary—alignment between party and energy technology. 

The electric grid is nonpartisan. Struggling to pay your utility bill is nonpartisan. Keeping the lights on is nonpartisan. Even before renewable energy was as cheap as it is today, early progress at diversifying energy portfolios was made in conservative states. Iowa, Texas, and Montana were all early adopters of renewable portfolio standards. Advocates in such places did not lead with messaging about climate change, but rather about economic development and energy independence. These policy efforts paid off: The deeply red Lone Star State, for instance, generates more wind energy than any other state and ranks only behind California in producing solar power. 

Now, in 2025, advances in technology and improvements in cost should make the economic arguments for clean energy even easier and more salient. So, in the face of a national government that is choosing last century’s energy technologies as policy winners and this century’s technologies as policy losers, the states offer clean energy advocates a familiar terrain on which to make continued progress, if they tailor their selling points to the reality on the ground.         

Joshua A. Basseches is the David and Jane Flowerree Assistant Professor of Environmental Studies and Public Policy at Tulane University. His research focuses on state-level renewable energy politics and policymaking, especially in the electricity sector.

The US could really use an affordable electric truck

On Monday, Ford announced plans for an affordable electric truck with a 2027 delivery date and an expected price tag of about $30,000, thanks in part to a new manufacturing process that it says will help cut costs.

This could be the shot in the arm that the US EV market needs. Sales are slowing, and Ford in particular has struggled recently: The automaker has lost $12 billion over the last two and a half years on its EV division. And the adoption barriers continue to mount, with the Trump administration cutting tax credits as well as rules designed to push automakers toward zero-emissions vehicles. And that’s not to mention tariffs.

But if anything can get Americans excited, it’s a truck, especially an affordable one. (There was a ton of buzz over the announcement of a bare-bones truck from Bezos-backed Slate Auto earlier this year, for example.) The big question is whether the company can deliver in this environment.

One key thing to note here: This is not the first time that there’s been a big splashy truck announcement from Ford that was supposed to change everything. The F-150 Lightning was hailed as a turning point for vehicle electrification, a signal that decarbonization had entered a new era. We cited the truck when we put “The Inevitable EV” on our 10 Breakthrough Technologies list in 2023. 

Things haven’t quite turned out that way. One problem is that the Lightning was supposed to be relatively affordable, with a price tag of about $40,000 when it was first announced in 2021. The starting price inflated to $52,000 when it actually went on sale in 2022.

The truck was initially popular and became quite hard to find at dealerships. But prices climbed and interest leveled off. The base model hit nearly $60,000 by 2023. For the past few years, Ford has cut Lightning production several times and laid off employees who assembled the trucks.

Now, though, Ford is once again promising an affordable truck, and it’s supposed to be even cheaper this time. To help cut costs, the company says it’s simplifying, creating one universal platform for a new set of EVs. Using a common structure and set of components will help produce not only a midsize truck but also other trucks, vans, and SUVs. There are also planned changes to the manufacturing process: Rather than one assembly line, multiple lines will join together to form what the company calls an assembly tree.

Another supporting factor for cost savings is the battery. The company plans to use lithium-iron phosphate (or LFP) cells—a type of lithium-ion battery that doesn’t contain nickel or cobalt. Leaving out those relatively pricey metals means lower costs.

Side note here: That battery could be surprisingly small. In a media briefing, a Ford official reportedly said that the truck’s battery would be 15% smaller than the one in the Atto crossover from the Chinese automaker BYD. Since that model has a roughly 60-kilowatt-hour pack, that could put this new battery at 51 kilowatt-hours. That’s only half the capacity of the Ford Lightning’s battery and similar to the smallest pack offered in a Tesla Model 3 today. (This could mean the truck has a relatively limited range, though the company hasn’t shared any details on that front yet.) 
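The arithmetic behind that estimate is simple enough to check; the roughly 60-kilowatt-hour Atto pack and the 15% reduction are the figures reported above, and the rest follows:

```python
# Rough check of the battery-size inference reported above.
atto_pack_kwh = 60.0                        # BYD Atto pack, per the article
ford_pack_kwh = atto_pack_kwh * (1 - 0.15)  # "15% smaller"
print(f"{ford_pack_kwh:.0f} kWh")           # ~51 kWh, as the article says
```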

A string of big promises isn’t too unusual for a big company announcement. What was unusual was the tone from officials during the event on Monday.

As Andrew Hawkins pointed out in The Verge this week, “Ford seems to realize its timing is unfortunate.” During the announcement, executives emphasized that this was a bet, one that might not work out.

CEO Jim Farley put it bluntly: “The automotive industry has a graveyard littered with affordable vehicles that were launched in our country with all good intentions, and they fizzled out with idle plants, laid-off workers, and red ink.” Woof.

From where I’m standing, it’s hard to be optimistic that this announcement will turn out differently from all those failed ones, given where the US EV market is right now.   

In a new report published in June, the energy consultancy BNEF slashed its predictions for future EV uptake. Last year, the organization predicted that 48% of new vehicles sold in the US in 2030 would be electric. In this year’s edition, that number got bumped down to just 27%.

To be clear: BNEF and other organizations are still expecting more EVs on the roads in the future than today, since the vehicles make up less than 10% of new sales in the US. But expectations are way down, in part because of a broad cut in public support for EVs. 

The tax credits that gave drivers up to $7,500 off the purchase of a new EV end in just over a month. Tariffs are going to push costs up even for domestic automakers like Ford, which still rely on imported steel and aluminum.

A revamped manufacturing process and a cheaper, desirable vehicle could be exactly the sort of move that automakers need to make for the US EV market. But I’m skeptical that this truck will be able to turn it all around. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

The greenhouse gases we’re not accounting for

In the spring of 2021, climate scientists were stumped. 

The global economy was just emerging from the covid-19 lockdowns, but for some reason the levels of methane—a greenhouse gas emitted mainly through agriculture and fossil-fuel production—had soared in the atmosphere the previous year, rising at the fastest rate on record.

Researchers around the world set to work unraveling the mystery, reviewing readings from satellites, aircraft, and greenhouse-gas monitoring stations. They eventually spotted a clear pattern: Methane emissions had increased sharply across the tropics, where wetlands were growing wetter and warmer. 

That created the ideal conditions for microbes that thrive in anaerobic muck, which gobbled up more of the carbon-rich organic matter and spat out more methane as a by-product. (Reduced pollution from nitrogen oxides, which help to break down methane in the atmosphere, also likely played a substantial role.)

The findings offer one of the clearest cases so far where climate change itself is driving additional greenhouse-gas emissions from natural systems, triggering a feedback effect that threatens to produce more warming, more emissions, and on and on. 

There are numerous additional ways this is happening or soon could, including wildfires and thawing permafrost. These are major emissions sources that aren’t included in the commitments nations have made under the Paris climate agreement—and climate risks that largely aren’t accounted for in the UN Intergovernmental Panel on Climate Change’s most recent warming scenarios.

Spark Climate Solutions (not to be confused with this newsletter) hopes to change that.

The San Francisco nonprofit is launching what’s known as a model intercomparison project, in which different research teams run the same set of experiments on different models across a variety of emissions scenarios to determine how climate change could play out. This one would specifically explore how a range of climate feedback effects could propel additional warming, additional emissions, and additional types of feedback.

“These increased emissions from natural sources add to human emissions and amplify climate change,” says Phil Duffy, chief scientist at Spark Climate Solutions, who previously served as climate science advisor to President Joe Biden. “And if you don’t look at all of them together, you can’t quantify the strength of that feedback effect.”

Other participants in the effort will include scientists at the Environmental Defense Fund, Stanford University, the Woodwell Climate Research Center, and other institutions in Europe and Australia, according to Spark Climate Solutions.

The nonprofit hopes to publish the findings in time for them to be incorporated into the UN climate panel’s seventh major assessment report, which is just getting underway, to help ensure that these dangers are more fully represented. That, in turn, would give nations a more accurate sense of the world’s carbon budgets, or the quantity of greenhouse gases they can produce before the planet reaches temperatures 1.5 °C or 2 °C over preindustrial levels.

But one thing is already clear: Since the current scenarios don’t fully account for these feedback effects, the world will almost certainly warm faster than is now forecast, which underscores the importance of carrying out this exercise. 

In a paper under review, scientists at EDF, Woodwell, and other institutions found that fires in the world’s northernmost forests, thawing permafrost, and warming tropical wetlands could together push the planet beyond 2 °C years faster, eliminating up to a quarter of the time left before the world passes the core goal of the Paris agreement.

Earlier this year, Spark Climate Solutions set up a broader program to advance research and awareness of what’s known as warming-induced emissions, which will launch additional collaborations similar to the modeling intercomparison project.  

The goal of the program and the research project is “to really mainstream the inclusion of this topic in climate science and climate policy, and to drive research around climate solutions,” says Ben Poulter, who leads the program at Spark Climate Solutions and was previously a scientist at the NASA Goddard Space Flight Center.

Spark notes that warming temperatures could also release more carbon dioxide from the oceans, in a process known as outgassing; additional carbon dioxide and nitrous oxide, a potent greenhouse gas that also depletes the protective ozone layer, from farmland; more carbon dioxide and methane from wildfires; and still more of all three of these gases as permafrost thaws.

The ground remains frozen year round across a vast expanse of the Northern Hemisphere, creating a frosty underground storehouse from Alaska to Siberia that’s packed with twice as much carbon as the atmosphere.

But as it thaws, it starts to decompose and release greenhouse gases, says Susan Natali, an Arctic climate scientist focused on permafrost at Woodwell. A study published in Nature in January noted that 30% of the world’s Arctic–Boreal Zone has already flipped from a carbon sink to a carbon source, when wildfires, thawing permafrost and other factors are taken into account.

Despite these increasing risks, only a minority of the models that fed into the UN climate panel’s last major report incorporated the feedback effects of thawing permafrost. And the emissions risks still weren’t fully accounted for because these ecosystems are difficult to monitor and model, Natali says.

Among the complexities: Wildfires, which are themselves hard to predict, can accelerate thawing. It’s also hard to foresee which regions will grow drier or wetter, which determines whether they release mostly methane or carbon dioxide—and those gases have very different warming effects over different time periods. There are counterbalancing effects that must be taken into account as well—for instance, as carbon-absorbing plants replace ice and snow in certain areas.

Natali says improving our understanding of these complex feedback effects is essential to understanding the dangers we face.

“It’s going to mean additional costs to human health, human life,” she says. “We want people to be safe—and it’s very hard to do that if you don’t know what’s coming and you’re not prepared for it.”

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

An EPA rule change threatens to gut US climate regulations

This story is part of MIT Technology Review’s “America Undone” series, examining how the foundations of US success in science and innovation are currently under threat. You can read the rest here.

The mechanism that allows the US federal government to regulate climate change is on the chopping block.

On Tuesday, US Environmental Protection Agency administrator Lee Zeldin announced that the agency is taking aim at the endangerment finding, a 2009 rule that’s essentially the tentpole supporting federal greenhouse-gas regulations.

This might sound like an obscure legal situation, but it’s a really big deal for climate policy in the US. So buckle up, and let’s look at what this rule says now, what the proposed change looks like, and what it all means.

To set the stage, we have to go back to the Clean Air Act of 1970, the law that essentially gave the EPA the power to regulate air pollution. (Stick with me—I promise I’ll keep this short and not get too into the legal weeds.)

There were some pollutants explicitly called out in this law and its amendments, including lead and sulfur dioxide. But it also required the EPA to regulate new pollutants that were found to be harmful. In the late 1990s and early 2000s, environmental groups and states started asking the agency to include greenhouse-gas pollution.

In 2007, the Supreme Court ruled that greenhouse gases qualify as air pollutants under the Clean Air Act, and that the EPA should study whether they’re a danger to public health. In 2009, the incoming Obama administration looked at the science and ruled that greenhouse gases pose a threat to public health because they cause climate change. That’s the endangerment finding, and it’s what allows the agency to pass rules to regulate greenhouse gases.  

The original case and argument were specifically about vehicles and the emissions from tailpipes, but this finding was eventually used to allow the agency to set rules around power plants and factories, too. It essentially underpins climate regulations in the US.

Fast-forward to today, and the Trump administration wants to reverse the endangerment finding. In a proposed rule released on Tuesday, the EPA argues that the Clean Air Act does not, in fact, authorize the agency to set emissions standards to address global climate change. Zeldin, in an appearance on the conservative politics and humor podcast Ruthless that preceded the official announcement, called the proposal the “largest deregulatory action in the history of America.”

The administration was already moving to undermine the climate regulations that rely on this rule. But this move directly targets a “fundamental building block of EPA’s climate policy,” says Deborah Sivas, an environmental-law professor at Stanford University.

The proposed rule will go up for public comment, and the agency will then take that feedback and come up with a final version. It’ll almost certainly get hit with legal challenges and will likely wind up in front of the Supreme Court.

One note here is that the EPA makes a mostly legal argument in the proposed rule reversal rather than focusing on going after the science of climate change, says Madison Condon, an associate law professor at Boston University. That could make it easier for the Supreme Court to eventually uphold it, she says, though this whole process is going to take a while. 

If the endangerment finding goes down, it would have wide-reaching ripple effects. “We could find ourselves in a couple years with no legal tools to try and address climate change,” Sivas says.

To take a step back for a moment, it’s wild that we’ve ended up in this place where a single rule is so central to regulating emissions. US climate policy is held up by duct tape and a dream. Congress could have, at some point, passed a law that more directly allows the EPA to regulate greenhouse-gas emissions (the last time we got close was a 2009 bill that passed the House but never made it through the Senate). But here we are.

This move isn’t a surprise, exactly. The Trump administration has made it very clear that it is going after climate policy in every way that it can. But what’s most striking to me is that we’re not operating in a shared reality anymore when it comes to this subject. 

While top officials tend to acknowledge that climate change is real, there’s often a “but” followed by talking points from climate denial’s list of greatest hits. (One of the more ridiculous examples is the statement that carbon dioxide is good, actually, because it helps plants.) 

Climate change is real, and it’s a threat. And the US has emitted more greenhouse gases into the atmosphere than any other country in the world. It shouldn’t be controversial to expect the government to be doing something about it. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

This startup wants to use the Earth as a massive battery

The Texas-based startup Quidnet Energy just completed a test showing it can store energy for up to six months by pumping water underground.

Using water to store electricity is hardly a new concept—pumped hydropower storage has been around for over a century. But the company hopes its twist on the technology could help bring cheap, long-duration energy storage to new places.

In traditional pumped hydro storage facilities, electric pumps move water uphill, into a natural or manmade body of water. Then, when electricity is needed, that water is released and flows downhill past a turbine, generating electricity. Quidnet’s approach instead pumps water down into impermeable rock formations and keeps it under pressure so it flows up when released. “It’s like pumped hydro, upside down,” says CEO Joe Zhou.

Quidnet started a six-month test of its technology in late 2024, pressurizing the system. In June, the company was able to discharge 35 megawatt-hours of energy from the well. There was virtually no self-discharge, meaning no energy loss, Zhou says.

Inexpensive forms of energy storage that can store electricity for weeks or months could help inconsistent electricity sources like wind and solar go further for the grid. And Quidnet’s approach, which uses commercially available equipment, could be deployed quickly and qualify for federal tax credits to help make it even cheaper.

However, there’s still a big milestone ahead: turning the pressurized water back into electricity. The company is currently building a facility with the turbines and support equipment to do that—all the components are available to purchase from established companies. “We don’t need to invent new things based on what we’ve already developed today,” Zhou says. “We can now start just deploying at very, very substantial scales.”

That process will come with energy losses. Energy storage systems are typically measured by their round-trip efficiency: how much of the electricity that’s put into the system is returned at the end as electricity. Modeling suggests that Quidnet’s technology could reach a maximum efficiency of about 65%, Zhou says, though some design choices made to optimize for economics will likely cause the system to land at roughly 50%.
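To make the metric concrete, here’s a minimal sketch. The 35-megawatt-hour discharge comes from the test described above; the input-energy figure is a hypothetical chosen to illustrate the roughly 50% efficiency Zhou describes:

```python
# Round-trip efficiency: the fraction of electricity put into storage
# that comes back out as electricity.
def round_trip_efficiency(energy_out_mwh: float, energy_in_mwh: float) -> float:
    return energy_out_mwh / energy_in_mwh

# Hypothetical: if delivering the 35 MWh discharged in the test had
# required 70 MWh of pumping electricity, efficiency would be 50%.
print(round_trip_efficiency(35.0, 70.0))  # 0.5
```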

That’s less efficient than lithium-ion batteries, but long-duration systems, if they’re cheap enough, can operate at low efficiencies and still be useful for the grid, says Paul Denholm, a senior research fellow at the National Renewable Energy Laboratory.

“It’s got to be cost-competitive; it all comes down to that,” Denholm says.

Lithium-ion batteries, the fastest-growing technology in energy storage, are the target that new forms of energy storage, like Quidnet’s, must chase. Lithium-ion batteries are about 90% cheaper today than they were 15 years ago. They’ve become a price-competitive alternative to building new natural-gas plants, Denholm says.

When it comes to competing with batteries, one potential differentiator for Quidnet could be government subsidies. While the Trump administration has clawed back funding for clean energy technologies, there’s still an energy storage tax credit, though recently passed legislation added new supply chain restrictions.

Starting in 2026, new energy storage facilities hoping to qualify for tax credits will need to prove that at least 55% of the value of a project’s materials is not from foreign entities of concern. That rules out sourcing batteries from China, which dominates battery production today. Quidnet has a “high level of domestic content” and expects to qualify for tax credits under the new rules, Zhou says.

The facility Quidnet is building is a project with utility partner CPS Energy, and it should come online in early 2026. 

What role should oil and gas companies play in climate tech?

This week, I have a new story out about Quaise, a geothermal startup that’s trying to commercialize new drilling technology. Using a device called a gyrotron, the company wants to drill deeper, cheaper, in an effort to unlock geothermal power anywhere on the planet. (For all the details, check it out here.) 

For the story, I visited Quaise’s headquarters in Houston. I also took a trip across town to Nabors Industries, Quaise’s investor and tech partner and one of the biggest drilling companies in the world. 

Standing on top of a drilling rig in the backyard of Nabors’s headquarters, I couldn’t stop thinking about the role oil and gas companies are playing in the energy transition. This industry has resources and energy expertise—but also a vested interest in fossil fuels. Can it really be part of addressing climate change?

The relationship between Quaise and Nabors is one that we see increasingly often in climate tech—a startup partnering up with an established company in a similar field. (Another one that comes to mind is in the cement industry, where Sublime Systems has seen a lot of support from legacy players including Holcim, one of the biggest cement companies in the world.) 

Quaise got an early investment from Nabors in 2021, to the tune of $12 million. Now the company also serves as a technical partner for the startup. 

“We are agnostic to what hole we’re drilling,” says Cameron Maresh, a project engineer on the energy transition team at Nabors Industries. The company is working on other investments and projects in the geothermal industry, Maresh says, and the work with Quaise is the culmination of a yearslong collaboration: “We’re just truly excited to see what Quaise can do.”

From the outside, this sort of partnership makes a lot of sense for Quaise. It gets resources and expertise. Meanwhile, Nabors is getting involved with an innovative company that could represent a new direction for geothermal. And maybe more to the point, if fossil fuels are to be phased out, this deal gives the company a stake in next-generation energy production.

There is so much potential for oil and gas companies to play a productive role in addressing climate change. One report from the International Energy Agency examined the role these legacy players could take:  “Energy transitions can happen without the engagement of the oil and gas industry, but the journey to net zero will be more costly and difficult to navigate if they are not on board,” the authors wrote. 

In the agency’s blueprint for what a net-zero emissions energy system could look like in 2050, about 30% of energy could come from sources where the oil and gas industry’s knowledge and resources are useful. That includes hydrogen, liquid biofuels, biomethane, carbon capture, and geothermal. 

But so far, the industry has hardly lived up to its potential as a positive force for the climate. Also in that report, the IEA pointed out that oil and gas producers made up only about 1% of global investment in climate tech in 2022. Investment has ticked up a bit since then, but still, it’s tough to argue that the industry is committed. 

And now that climate tech is falling out of fashion with the government in the US, I’d venture to guess that we’re going to see oil and gas companies increasingly pulling back on their investments and promises. 

BP recently backtracked on previous commitments to cut oil and gas production and invest in clean energy. And last year the company announced that it had written off $1.1 billion in offshore wind investments in 2023 and wanted to sell other wind assets. Shell closed down all its hydrogen fueling stations for vehicles in California last year. (This might not be all that big a loss, since EVs are beating hydrogen by a huge margin in the US, but it’s still worth noting.) 

So oil and gas companies are investing what amounts to pennies and often backtrack when the political winds change direction. And, let’s not forget, fossil-fuel companies have a long history of behaving badly. 

In perhaps the most notorious example, scientists at Exxon modeled climate change in the 1970s, and their forecasts turned out to be quite accurate. Rather than publish that research, the company downplayed how climate change might affect the planet. (For what it’s worth, company representatives have argued that this was less of a coverup and more of an internal discussion that wasn’t fit to be shared outside the company.) 

While fossil fuels are still part of our near-term future, oil and gas companies, and particularly producers, would need to make drastic changes to align with climate goals—changes that wouldn’t be in their financial interest. Few seem inclined to really take the turn needed. 

As the IEA report puts it:  “In practice, no one committed to change should wait for someone else to move first.”

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.