I tried OpenAI’s new Atlas browser but I still don’t know what it’s for

OpenAI rolled out a new web browser last week called Atlas. It comes with ChatGPT built in, along with an agent, so that you can browse, get direct answers, and have automated tasks performed on your behalf all at the same time. 

I’ve spent the past several days tinkering with Atlas. I’ve used it for all my normal web browsing, tried to take advantage of the built-in ChatGPT functions, and threw some weird agentic tasks its way to see how it handled those. My impression is that Atlas is… fine? But my big takeaway is that it’s pretty pointless for anyone not employed by OpenAI, and that Atlas is little more than cynicism masquerading as software. 

If you want to know why, let’s start by looking at its agentic capabilities—which is really where it differentiates itself.

When I was browsing Amazon, I asked the Atlas agent to do some shopping for me, using a pre-set prompt of its own suggestion. (“Start a cart with items I’m likely to want based on my browsing here and highlight any active promo codes. Let me review before checkout.”) It picked out a notebook that I’d recently purchased and no longer needed, some deodorant I’d recently purchased and no longer needed, and a vacuum cleaner that I’d considered but decided was too expensive and no longer needed because I bought a cheaper one. 

I would guess that it took 10 minutes or so for it to do all that. I cleaned out my cart and considered myself lucky that it didn’t buy anything.  

When I logged onto Facebook, which is already lousy with all sorts of AI slop, I asked it to create a status update for me. So it dug through my browser history and came back with an incredibly long status. I won’t bore you with all of it (and there was a lot), but here are the highlights from what it suggested: “I dipped into Smartsheet and TeamSnap (because editors juggle rosters too!), flirted with Shopify and Amazon (holiday gift‑shopping? side hustle? you decide), and kept tabs on the news … . Somewhere in there I even remembered to log into Slack, schedule Zoom meetings, and read a few NYTimes and Technology Review pieces. Who says an editor’s life isn’t glamorous? 😊” 

Uh. Okay. I decided against posting that. There were other, equally uninspiring examples, but you get the picture. 

Aside from the agent, the other unique feature is having ChatGPT built right into the browser. Notice I said “unique,” not “useful.” I struggled to find any obvious utility in having it right there, versus just going to chatgpt dot com. In some cases, the built-in chatbot was worse and dumber than the regular site. 

For example, I asked the built-in ChatGPT to summarize an MIT Technology Review article I was reading. Yet instead of answering the question about the page I was on, it referred back to the page I had been on when I started the session. Which is to say it spit back some useless nonsense. Thanks, AI. 

OpenAI is marketing Atlas pretty aggressively when you come to ChatGPT now, suggesting people download it. And it may in fact score a lot of downloads because of that. But without giving people more of a reason to actually switch from more entrenched browsers, like Chrome or Safari, this feels like an empty salvo in the new browser wars. 

It’s been hard for me to understand why Atlas exists. Who is this browser for, exactly? Who is its customer? And the answer I have come to is that Atlas is for OpenAI. The real customer, the true end user of Atlas, is not the person browsing websites; it is the company collecting data about what and how that person is browsing.

This review first appeared in The Debrief, Mat Honan’s weekly subscriber-only newsletter.

An AI app to measure pain is here

How are you feeling?

I’m genuinely interested in the well-being of all my treasured Checkup readers, of course. But this week I’ve also been wondering how science and technology can help answer that question—especially when it comes to pain. 

In the latest issue of MIT Technology Review magazine, Deena Mousa describes how an AI-powered smartphone app is being used to assess how much pain a person is in.

The app, and other tools like it, could help doctors and caregivers. They could be especially useful in the care of people who aren’t able to tell others how they are feeling.

But they are far from perfect. And they open up all kinds of thorny questions about how we experience, communicate, and even treat pain.

Pain can be notoriously difficult to describe, as almost everyone who has ever been asked to will know. At a recent medical visit, my doctor asked me to rank my pain on a scale from 1 to 10. I found it incredibly difficult to do. A 10, she said, meant “the worst pain imaginable,” which brought back unpleasant memories of having appendicitis.

A short while before the problem that brought me in, I’d broken my toe in two places, which had hurt like a mother—but less than appendicitis. If appendicitis was a 10, breaking a toe was an 8, I figured. If that was the case, maybe my current pain was a 6. As a pain score, it didn’t sound as bad as I actually felt. I couldn’t help wondering if I might have given a higher score if my appendix were still intact. I wondered, too, how someone else with my medical issue might score their pain.

In truth, we all experience pain in our own unique ways. Pain is subjective, and it is influenced by our past experiences, our moods, and our expectations. The way people describe their pain can vary tremendously, too.

We’ve known this for ages. In the 1940s, the anesthesiologist Henry Beecher noted that wounded soldiers were much less likely to ask for pain relief than similarly injured people in civilian hospitals. Perhaps they were putting on a brave face, or maybe they just felt lucky to be alive, given their circumstances. We have no way of knowing how much pain they were really feeling.

Given this messy picture, I can see the appeal of a simple test that can score pain and help medical professionals understand how best to treat their patients. That’s what is being offered by PainChek, the smartphone app Deena wrote about. The app works by assessing small facial movements, such as lip raises or brow pinches. A user is then required to fill out a separate checklist to identify other signs of pain the patient might be displaying. It seems to work well, and it is already being used in hospitals and care settings.

But the app is judged against subjective reports of pain. It might be useful for assessing the pain of people who can’t describe it themselves—perhaps because they have dementia, for example—but it won’t add much to assessments from people who can already communicate their pain levels.

There are other complications. Say a test could spot that a person was experiencing pain. What can a doctor do with that information? Perhaps prescribe pain relief—but most of the pain-relieving drugs we have were designed to treat acute, short-term pain. If a person is grimacing from a chronic pain condition, the treatment options are more limited, says Stuart Derbyshire, a pain neuroscientist at the National University of Singapore.

The last time I spoke to Derbyshire was back in 2010, when I covered work by researchers in London who were using brain scans to measure pain. That was 15 years ago, but pain-measuring brain scanners have yet to become a routine part of clinical care.

That scoring system was also built on subjective pain reports. Those reports are, as Derbyshire puts it, “baked into the system.” It’s not ideal, but when it comes down to it, we must rely on these wobbly, malleable, and sometimes incoherent self-descriptions of pain. It’s the best we have.

Derbyshire says he doesn’t think we’ll ever have a “pain-o-meter” that can tell you what a person is truly experiencing. “Subjective report is the gold standard, and I think it always will be,” he says.

This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.

What’s next for carbon removal?

MIT Technology Review’s What’s Next series looks across industries, trends, and technologies to give you a first look at the future. You can read the rest of them here.

In the early 2020s, a little-known aquaculture company in Portland, Maine, snagged more than $50 million by pitching a plan to harness nature to fight back against climate change. The company, Running Tide, said it could sink enough kelp to the seafloor to sequester a billion tons of carbon dioxide by this year, according to one of its early customers.

Instead, the business shut down its operations last summer, marking the biggest bust to date in the nascent carbon removal sector.

Its demise was the most obvious sign of growing troubles and dimming expectations for a space that has spawned hundreds of startups over the last few years. A handful of other companies have shuttered, downsized, or pivoted in recent months as well. Venture investments have flagged. And the collective industry hasn’t made a whole lot more progress toward that billion-ton benchmark.

The hype phase is over and the sector is sliding into the turbulent business trough that follows, warns Robert Höglund, cofounder of CDR.fyi, a public-benefit corporation that provides data and analysis on the carbon removal industry.

“We’re past the peak of expectations,” he says. “And with that, we could see a lot of companies go out of business, which is natural for any industry.”

The open question is: If the carbon removal sector is heading into a painful if inevitable clearing-out cycle, where will it go from there? 

The odd quirk of carbon removal is that it never made a lot of sense as a business proposition: It’s an atmospheric cleanup job, necessary for the collective societal good of curbing climate change. But it doesn’t produce a service or product that any individual or organization strictly needs—or is especially eager to pay for.

To date, a number of businesses have voluntarily agreed to buy tons of carbon dioxide that companies intend to eventually suck out of the air. But whether they’re motivated by sincere climate concerns or pressures from investors, employees, or customers, corporate do-goodism will only scale any industry so far. 

Most observers argue that whether carbon removal continues to bobble along or transforms into something big enough to make a dent in climate change will depend largely on whether governments around the world decide to pay for a whole, whole lot of it—or force polluters to. 

“Private-sector purchases will never get us there,” says Erin Burns, executive director of Carbon180, a nonprofit that advocates for the removal and reuse of carbon dioxide. “We need policy; it has to be policy.”

What’s the problem?

The carbon removal sector began to scale up in the early part of this decade, as increasingly grave climate studies revealed the need to dramatically cut emissions and suck down vast amounts of carbon dioxide to keep global warming in check.

Specifically, nations may have to continually remove as much as 11 billion tons of carbon dioxide per year by around midcentury to have a solid chance of keeping the planet from warming past 2 °C over preindustrial levels, according to a UN climate panel report in 2022.

A number of startups sprang up to begin developing the technology and building the infrastructure that would be needed, trying out a variety of approaches like sinking seaweed or building carbon-dioxide-sucking factories.

And they soon attracted customers. Companies including Stripe, Google, Shopify, Microsoft, and others began agreeing to pre-purchase tons of carbon removal, hoping to stand up the nascent industry and help offset their own climate emissions. Venture investments also flooded into the space, peaking in 2023 at nearly $1 billion, according to data provided by PitchBook.

From early on, players in the emerging sector sought to draw a sharp distinction between conventional carbon offset projects, which studies have shown frequently exaggerate climate benefits, and “durable” carbon removal that could be relied upon to suck down and store away the greenhouse gas for decades to centuries. There’s certainly a big difference in the price: While buying carbon offsets through projects that promise to preserve forests or plant trees might cost a few dollars per ton, a ton of carbon removal can run hundreds to thousands of dollars, depending on the approach. 

That high price, however, brings big challenges. Removing 10 billion tons of carbon dioxide a year at, say, $300 a ton adds up to a global price tag of $3 trillion—a year. 
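That arithmetic is simple enough to write down. Here is a back-of-envelope sketch in Python, using the article’s round figures; real-world prices range from hundreds to thousands of dollars per ton depending on the approach:

```python
# Back-of-envelope: the global cost of carbon removal at scale,
# using the article's round figures (real prices vary by approach).
tons_per_year = 10e9   # tons of CO2 removed annually
price_per_ton = 300    # US dollars per ton of durable removal

annual_cost = tons_per_year * price_per_ton
print(f"${annual_cost / 1e12:.1f} trillion per year")  # -> $3.0 trillion per year
```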

Which brings us back to the fundamental question: Who should or would foot the bill to develop and operate all the factories, pipelines, and wells needed to capture, move, and bury billions upon billions of tons of carbon dioxide?

The state of the market

The market is still growing, as companies voluntarily purchase tons of carbon removal to make strides toward their climate goals. In fact, sales reached an all-time high in the second quarter of this year, mostly thanks to several massive purchases by Microsoft.

But industry sources fear that demand isn’t growing fast enough to support a significant share of the startups that have formed or even the projects being built, undermining the momentum required to scale the sector up to the size needed by midcentury.

To date, all those hundreds of companies that have spun up in recent years have disclosed deals to sell some 38 million tons of carbon dioxide pulled from the air, according to CDR.fyi. That’s roughly the amount the US pumps out in energy-related emissions every three days. 

And they’ve only delivered around 940,000 tons of carbon removal. The US emits that much carbon dioxide in less than two hours. (Not every transaction is publicly announced or revealed to CDR.fyi, so the actual figures could run a bit higher.)
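For a rough sense of scale, those comparisons can be checked with a quick Python sketch. The figure of roughly 4.8 billion tons of US energy-related CO2 emissions per year is an approximate outside estimate, not CDR.fyi data:

```python
# Rough scale check for the comparisons above, assuming US
# energy-related CO2 emissions of ~4.8 billion tons per year.
us_annual_tons = 4.8e9

sold_tons = 38e6        # carbon removal sold to date (CDR.fyi)
delivered_tons = 940e3  # carbon removal actually delivered to date

print(f"Sold: ~{sold_tons / (us_annual_tons / 365):.1f} days of US emissions")
print(f"Delivered: ~{delivered_tons / (us_annual_tons / (365 * 24)):.1f} hours")
# -> roughly 2.9 days and 1.7 hours, matching the article's comparisons
```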

Another concern is that the same handful of big players continue to account for the vast majority of the overall purchases, leaving the health and direction of the market dependent on their whims and fortunes. 

Most glaringly, Microsoft has agreed to buy 80% of all the carbon removal purchased to date, according to CDR.fyi. The second-biggest buyer is Frontier, a coalition of companies that includes Google, Meta, Stripe, and Shopify, which has committed to spend $1 billion.

If you strip out those two buyers, the market shrinks from 16 million tons under contract during the first half of this year to just 1.2 million, according to data provided to MIT Technology Review by CDR.fyi. 

Signs of trouble

Meanwhile, the investor appetite for carbon removal is cooling. For the 12-month period ending in the second quarter of 2025, venture capital investments in the sector fell more than 13% from the same period last year, according to data provided by PitchBook. That tightening funding will make it harder and harder for companies that aren’t bringing in revenue to stay afloat.

Other companies that have already shut down include the carbon removal marketplace Nori; the direct air capture company Noya; and Alkali Earth, which was attempting to use industrial by-products to tie up carbon dioxide.

Still other businesses are struggling. Climeworks, one of the first companies to build direct-air-capture (DAC) factories, announced it was laying off 10% of its staff in May, as it grapples with challenges on several fronts.

The company’s plans to collaborate on the development of a major facility in the US have been at least delayed as the Trump administration has held back tens of millions of dollars in funding granted in 2023 under the Department of Energy’s Regional Direct Air Capture Hubs program. It now appears the government could terminate the funding altogether, along with perhaps tens of billions of dollars’ worth of additional grants previously awarded for a variety of other US carbon removal and climate tech projects.

“Market rumors have surfaced, and Climeworks is prepared for all scenarios,” Christoph Gebald, one of the company’s co-CEOs, said in a previous statement to MIT Technology Review. “The need for DAC is growing as the world falls short of its climate goals and we’re working to achieve the gigaton capacity that will be needed.”

But purchases from direct-air-capture projects fell nearly 16% last year and account for just 8% of all carbon removal transactions to date. Buyers are increasingly looking to categories that promise to deliver tons faster and for less money, notably including burying biochar or installing carbon capture equipment on bioenergy plants. (Read more in my recent story on that method of carbon removal, known as BECCS, here.)

CDR.fyi recently described the climate for direct air capture in grim terms: “The sector has grown rapidly, but the honeymoon is over: Investment and sales are falling, while deployments are delayed across almost every company.”

“Most DAC companies,” the organization added, “will fold or be acquired.”

What’s next?

In the end, most observers believe carbon removal isn’t really going to take off unless governments bring their resources and regulations to bear. That could mean making direct purchases, subsidizing these sectors, or getting polluters to pay the costs to do so—for instance, by folding carbon removal into market-based emissions reductions mechanisms like cap-and-trade systems. 

More government support does appear to be on the way. Notably, the European Commission recently proposed allowing “domestic carbon removal” within its EU Emissions Trading System after 2030, integrating the sector into one of the largest cap-and-trade programs. The system forces power plants and other polluters in member countries to increasingly cut their emissions or pay for them over time, as the cap on pollution tightens and the price on carbon rises. 

That could create incentives for more European companies to pay direct-air-capture or bioenergy facilities to draw down carbon dioxide as a means of helping them meet their climate obligations.

There are also indications that the International Civil Aviation Organization, a UN organization that establishes standards for the aviation industry, is considering incorporating carbon removal into its market-based mechanism for reducing the sector’s emissions. That might take several forms, including allowing airlines to purchase carbon removal to offset their use of traditional jet fuel or requiring the use of carbon dioxide obtained through direct air capture in some share of sustainable aviation fuels.

Meanwhile, Canada has committed to spend $10 million on carbon removal and is developing a protocol to allow direct air capture in its national offsets program. And Japan will begin accepting several categories of carbon removal in its emissions trading system.

Despite the Trump administration’s efforts to claw back funding for the development of carbon-sucking projects, the US does continue to subsidize storage of carbon dioxide, whether it comes from power plants, ethanol refineries, direct-air-capture plants, or other facilities. The so-called 45Q tax credit, which is worth up to $180 a ton, was among the few forms of government support for climate-tech-related sectors that survived in the 2025 budget reconciliation bill. In fact, the subsidies for putting carbon dioxide to other uses increased.

Even in the current US political climate, Burns is hopeful that local or federal legislators will continue to enact policies that support specific categories of carbon removal in the regions where they make the most sense, because the projects can provide economic growth and jobs as well as climate benefits.

“I actually think there are lots of models for what carbon removal policy can look like that aren’t just things like tax incentives,” she says. “And I think that this particular political moment gives us the opportunity in a unique way to start to look at what those regionally specific and pathway-specific policies look like.”

The dangers ahead

But even if more nations do provide the money or enact the laws necessary to drive the business of durable carbon removal forward, there are mounting concerns that a sector conceived as an alternative to dubious offset markets could increasingly come to replicate their problems.

Various incentives are pulling in that direction.

Financial pressures are building on suppliers to deliver tons of carbon removal. Corporate buyers are looking for the fastest and most affordable way of hitting their climate goals. And the organizations that set standards and accredit carbon removal projects often earn more money as the volume of purchases rises, creating clear conflicts of interest.

Some of the same carbon registries that have long signed off on carbon offset projects, including Verra and Gold Standard, have begun creating standards or issuing credits for various forms of carbon removal.

“Reliable assurance that a project’s declared ton of carbon savings equates to a real ton of emissions removed, reduced, or avoided is crucial,” Cynthia Giles, a senior EPA advisor under President Biden, and Cary Coglianese, a law professor at the University of Pennsylvania, wrote in a recent editorial in Science. “Yet extensive research from many contexts shows that auditors selected and paid by audited organizations often produce results skewed toward those entities’ interests.”

Noah McQueen, the director of science and innovation at Carbon180, has stressed that the industry must strive to counter the mounting credibility risks, noting in a recent LinkedIn post: “Growth matters, but growth without integrity isn’t growth at all.”

In an interview, McQueen said that heading off the problem will require developing and enforcing standards to truly ensure that carbon removal projects deliver the climate benefits promised. McQueen added that to gain trust, the industry needs to earn buy-in from the communities in which these projects are built and avoid the environmental and health impacts that power plants and heavy industry have historically inflicted on disadvantaged communities.

Getting it right will require governments to take a larger role in the sector than just subsidizing it, argues David Ho, a professor at the University of Hawaiʻi at Mānoa who focuses on ocean-based carbon removal.

He says there should be a massive, multinational research drive to determine the most effective ways of mopping up the atmosphere with minimal environmental or social harm, likening it to a Manhattan Project (minus the whole nuclear bomb bit).

“If we’re serious about doing this, then let’s make it a government effort,” he says, “so that you can try out all the things, determine what works and what doesn’t, and you don’t have to please your VCs or concentrate on developing [intellectual property] so you can sell yourself to a fossil-fuel company.”

Ho adds that there’s a moral imperative for the world’s historically biggest climate polluters to build and pay for the carbon-sucking and storage infrastructure required to draw down billions of tons of greenhouse gas. That’s because the world’s poorest, hottest nations, which have contributed the least to climate change, will nevertheless face the greatest dangers from intensifying heat waves, droughts, famines, and sea-level rise.

“It should be seen as waste management for the waste we’re going to dump on the Global South,” he says, “because they’re the people who will suffer the most from climate change.”

Correction (October 24): An earlier version of this article referred to Noya as a carbon removal marketplace. It was a direct air capture company.

This startup is about to conduct the biggest real-world test of aluminum as a zero-carbon fuel

The crushed-up soda can disappears in a cloud of steam and—though it’s not visible—hydrogen gas. “I can just keep this reaction going by adding more water,” says Peter Godart, squirting some into the steaming beaker. “This is room-temperature water, and it’s immediately boiling. Doing this on your stove would be slower than this.” 

Godart is the founder and CEO of Found Energy, a startup in Boston that aims to harness the energy in scraps of aluminum metal to power industrial processes without fossil fuels. Since 2022, the company has worked to develop ways to rapidly release energy from aluminum on a small scale. Now it’s just switched on a much larger version of its aluminum-powered engine, which Godart claims is the largest aluminum-water reactor ever built. 

Early next year, it will be installed to supply heat and hydrogen to a tool manufacturing facility in the southeastern US, using the aluminum waste produced by the plant itself as fuel. (The manufacturer did not want to be named until the project is formally announced.)

If everything works as planned, this technology, which uses a catalyst to unlock the energy stored within aluminum metal, could transform a growing share of aluminum scrap into a zero-carbon fuel. The high heat generated by the engine could be especially valuable to reduce the substantial greenhouse-gas emissions generated by industrial processes, like cement production and metal refining, that are difficult to power with electricity directly.

“We invented the fuel, which is a blessing and a curse,” says Godart, surrounded by the pipes and wires of the experimental reactor. “It’s a huge opportunity for us, but it also means we do have to develop all of the systems around us. We’re redefining what even is an engine.”

Engineers have long eyed using aluminum as a fuel thanks to its superior energy density. Once it has been refined and smelted from ore, aluminum metal contains more than twice as much energy as diesel fuel by volume and almost eight times as much as hydrogen gas. When it reacts with oxygen in water or air, it forms aluminum oxides. This reaction releases heat and hydrogen gas, which can be tapped for zero-carbon power.
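For the curious, here is a rough stoichiometry sketch of that reaction in Python, using approximate textbook values for the aluminum-to-hydroxide route rather than anything specific to Found Energy’s process:

```python
# Stoichiometry sketch for the aluminum-water reaction, using
# approximate textbook values (not Found Energy's numbers):
#   2 Al + 6 H2O -> 2 Al(OH)3 + 3 H2   (plus heat)
M_AL = 26.98             # g/mol, molar mass of aluminum
M_H2 = 2.016             # g/mol, molar mass of hydrogen gas
HEAT_PER_MOL_AL = 418e3  # J of heat per mol Al, approximate

mol_al = 1000 / M_AL                 # moles of Al in 1 kg
h2_kg = mol_al * 1.5 * M_H2 / 1000   # 1.5 mol H2 per mol Al
heat_mj = HEAT_PER_MOL_AL * mol_al / 1e6

print(f"1 kg Al -> ~{h2_kg:.2f} kg H2 and ~{heat_mj:.0f} MJ of heat")
# -> ~0.11 kg of hydrogen plus ~15 MJ of heat per kilogram of aluminum
```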

Liquid metal

The trouble with aluminum as a fuel—and the reason your soda can doesn’t spontaneously combust—is that as soon as the metal starts to react, an oxidized layer forms across its surface that prevents the rest of it from reacting. It’s like a fire that puts itself out as it generates ash. “People have tried it and abandoned this idea many, many times,” says Godart.

Some believe using aluminum as a fuel remains a fool’s errand. “This potential use of aluminum crops up every few years and has no possibility of success even if aluminum scrap is used as the fuel source,” says Geoff Scamans, a metallurgist at Brunel University of London who spent a decade working on using aluminum to power vehicles in the 1980s. He says the aluminum-water reaction isn’t efficient enough for the metal to make sense as a fuel given how much energy it takes to refine and smelt aluminum from ore to begin with: “A crazy idea is always a crazy idea.”

But Godart believes he and his company have found a way to make it work. “The real breakthrough was thinking about catalysis in a different way,” he says: Instead of trying to speed up the reaction by bringing water and aluminum together onto a catalyst, they “flipped it around” and “found a material that we could actually dissolve into the aluminum.”

Peter Godart holding up two glass jars: one with metal spheres and the other with flat metal shapes

JAMES DINNEEN

The liquid metal catalyst at the heart of the company’s approach “permeates the microstructure” of the aluminum, says Godart. As the aluminum reacts with water, the catalyst forces the metal to froth and split open, exposing more unreacted aluminum to the water. 

The composition of the catalyst is proprietary, but Godart says it is a “low-melting-point liquid metal that’s not mercury.” His dissertation research focused on using a liquid mixture of gallium and indium as the catalyst, and he says the principle behind the current material is the same.

During a visit in early October, Godart demonstrated the central reaction in the Found R&D lab, which after the company’s $12 million seed round last year now fills the better part of two floors of an industrial building in Boston’s Charlestown neighborhood. Using a pair of tongs to avoid starting the reaction with the moisture on his fingers, he placed a pellet of aluminum treated with the secret catalyst in a beaker and then added water. Immediately, the metal began to bubble with hydrogen. Then the water steamed away, leaving behind a frothing gray mass of aluminum hydroxide.

“One of the impediments to this technology taking off is that [the aluminum-water reaction] was just too sluggish,” says Godart. “But you can see here we’re making steam. We just made a boiler.”

From Europa to Earth

Godart was a scientist at NASA when he first started thinking about fresh ways to unlock the energy stored in aluminum. He was working on building aluminum robots that could consume themselves for fuel while roving on Jupiter’s icy moon Europa. But that work was cut short when Congress reduced funding for the mission.

“I was sort of having this little mini crisis where I was like, I need to do something about climate change, about Earth problems,” says Godart. “And I was like, you know—I bet this aluminum technology would be even better for Earth applications.” After completing a dissertation on aluminum fuels at MIT, he started Found Energy in his house in Cambridge in 2022 (the next year, he earned a place on MIT Technology Review’s annual 35 Innovators Under 35 list).

Until this year, the company was working at a tiny scale, tweaking the catalyst and testing different conditions within a small 10-kilowatt reactor to make the reaction release more heat and hydrogen more quickly. Then, in January, it began designing an engine that’s 10 times larger, big enough to supply a useful amount of power for industrial processes beyond the lab.

This larger engine took up most of the lab on the second floor. The reactor vessel resembled a water boiler turned on its side, with piping and wires connected to monitoring equipment that took up almost as much space as the engine itself. On one end, there was a pipe to inject water and a piston to deliver pellets of aluminum fuel into the reactor at variable rates. On the other end, outflow pipes carried away the reaction products: steam, hydrogen gas, aluminum hydroxide, and the recovered catalyst. Godart says none of the catalyst is lost in the reaction, so it can be used again to make more fuel.

The company first switched on the engine to begin testing in July. In September, it managed to power it up to its targeted power of 100 kilowatts—roughly as much as can be supplied by the diesel engine in a small pickup truck. In early 2026, it plans to install the 100-kilowatt engine to supply heat and hydrogen to the tool manufacturing facility. This pilot project is meant to serve as the proof of concept needed to raise the money for a 1-megawatt reactor, 10 times larger again.

The initial pilot will use the engine to supply hot steam and hydrogen. But the energy released in the reactor could be put to use in a variety of ways across a range of temperatures, according to Godart. The hot steam could spin a turbine to produce electricity, or the hydrogen could produce electricity in a fuel cell. By burning the hydrogen within the steam, the engine can produce superheated steam as hot as 1,300 °C, which could be used to generate electricity more efficiently or refine chemicals. Burning the hydrogen alone could generate temperatures of 2,400 °C, hot enough to make steel.

Picking up scrap

Godart says he and his colleagues hope the engine will eventually power many different industrial processes, but the initial target is the aluminum refining and recycling industry itself, as it already handles scrap metal and aluminum oxide supply chains. “Aluminum recyclers are coming to us, asking us to take their aluminum waste that’s difficult to recycle and then turn that into clean heat that they can use to re-melt other aluminum,” he says. “They are begging us to implement this for them.”

Citing nondisclosure agreements, he wouldn’t name any of the companies offering up their unrecyclable aluminum, which he says is something of a “dirty secret” for an industry that’s supposed to be recycling all it collects. But estimates from the International Aluminium Institute, an industry group, suggest that globally a little over 3 million metric tons of aluminum collected for recycling currently goes unrecycled each year; another 9 million metric tons isn’t collected for recycling at all or is incinerated with other waste. Together, that’s a little under a third of the estimated 43 million metric tons of aluminum scrap that currently gets recycled each year.

Even if all that unused scrap were recovered for fuel, it would still supply only a fraction of the overall industrial demand for heat, let alone the overall industrial demand for energy. But the plan isn’t to be limited by available scrap. Eventually, Godart says, the hope is to “recharge” the aluminum hydroxide that comes out of the reactor by using clean electricity to convert it back into aluminum metal and react it again. According to the company’s estimates, this “closed loop” approach could supply all global demand for industrial heat by using and reusing a total of around 300 million metric tons of aluminum—around 4% of Earth’s abundant aluminum reserves. 

However, all that recharging would require a lot of energy. “If you’re doing that, [aluminum fuel] is an energy storage technology, not so much an energy providing technology,” says Jeffrey Rissman, who studies industrial decarbonization at Energy Innovation, a think tank in California. As with other forms of energy storage like thermal batteries or green hydrogen, he says, that could still make sense if the fuel can be recharged using low-cost, clean electricity. But that will be increasingly hard to come by amid the scramble for clean power for everything from AI data centers to heat pumps.

Despite these obstacles, Godart is confident his company will find a way to make it work. The existing engine may already be able to squeeze out more power from aluminum than anticipated. “We actually believe this can probably do half a megawatt,” he says. “We haven’t fully throttled it.”

James Dinneen is a science and environmental journalist based in New York City. 

What a massive thermal battery means for energy storage

Rondo Energy just turned on what it says is the world’s largest thermal battery, an energy storage system that can take in electricity and provide a consistent source of heat.

The company announced last week that its first full-scale system is operational, with 100 megawatt-hours of capacity. The thermal battery is powered by an off-grid solar array and will provide heat for enhanced oil recovery (more on this in a moment).

Thermal batteries could help clean up difficult-to-decarbonize sectors like manufacturing and heavy industrial processes like cement and steel production. With Rondo’s latest announcement, the industry has reached a major milestone in its effort to prove that thermal energy storage can work in the real world. Let’s dig into this announcement, what it means to have oil and gas involved, and what comes next.

The concept behind a thermal battery is overwhelmingly simple: Use electricity to heat up some cheap, sturdy material (like bricks) and keep it hot until you want to use that heat later, either directly in an industrial process or to produce electricity.

Rondo’s new system has been operating for 10 weeks and achieved all the relevant efficiency and reliability benchmarks, according to the company. The bricks reach temperatures over 1,000 °C (about 1,800 °F), and over 97% of the energy put into the system is returned as heat.
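For a sense of what 100 megawatt-hours of brick storage might physically mean, here is a rough sizing sketch; the specific heat and temperature swing are assumed round numbers for firebrick, not Rondo’s specifications:

```python
# Rough sizing: how much brick does 100 MWh of heat storage imply?
# Assumed values, not Rondo's specs: firebrick specific heat of
# ~1.0 kJ/(kg*K) and a usable swing of ~800 K (e.g. 200 -> 1,000 °C).
capacity_j = 100e6 * 3600  # 100 MWh expressed in joules
specific_heat = 1000       # J/(kg*K), assumed for firebrick
delta_t = 800              # K, assumed usable temperature swing

brick_mass_tonnes = capacity_j / (specific_heat * delta_t) / 1000
print(f"~{brick_mass_tonnes:,.0f} tonnes of brick")  # -> ~450 tonnes
```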

This is a big step from the 2 MWh pilot system that Rondo started up in 2023, and it’s the first of the mass-produced, full-size heat batteries that the company hopes to put in the hands of customers.

Thermal batteries could be a major tool in cutting emissions: 20% of total energy demand today is used to provide heat for industrial processes, and most of that is generated by burning fossil fuels. So this project’s success is significant for climate action.

There’s one major detail here, though, that dulls some of that promise: This battery is being used for enhanced oil recovery, a process where steam is injected down into wells to get stubborn oil out of the ground.

It can be tricky for a climate technology to show its merit by helping harvest fossil fuels. Some critics argue that these sorts of techniques keep that polluting infrastructure running longer.

When I spoke to Rondo founder and chief innovation officer John O’Donnell about the new system, he defended the choice to work with oil and gas.

“We are decarbonizing the world as it is today,” O’Donnell says. To his mind, it’s better to help an oil and gas company use solar power for its operation than leave it to continue burning natural gas for heat. Between cheap solar, expensive natural gas, and policies in California, he adds, Rondo’s technology made sense for the customer.

Having a willing customer pay for a full-scale system has been crucial to Rondo’s effort to show that it can deliver its technology.

And the next units are on the way: Rondo is currently building three more full-scale units in Europe. The company will be able to bring them online cheaper and faster because of what it’s learned from the California project, O’Donnell says. 

The company is also positioned to build more batteries, and to do it quickly: its factory in Thailand can currently produce 2.4 gigawatt-hours’ worth of heat batteries.

I’ve been following progress on thermal batteries for years, and this project obviously represents a big step forward. For all the promises of cheap, robust energy storage, there’s nothing like actually building a large-scale system and testing it in the field.

It’s definitely hard to get excited about enhanced oil recovery—we need to stop burning fossil fuels, and do it quickly, to avoid the worst impacts of climate change. But I see the argument that as long as oil and gas operations exist, there’s value in cleaning them up.

And as O’Donnell puts it, heat batteries can help: “This is a really dumb, practical thing that’s ready now.”

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

3 Things Stephanie Arnett is into right now

Dungeon Crawler Carl, by Matt Dinniman

This science fiction book series confronted me with existential questions like “Are we alone in the universe?” and “Do I actually like LitRPG??” (LitRPG, which stands for “literary role-playing game,” is a relatively new genre that merges the conventions of computer RPGs with those of science fiction and fantasy novels.) In the series, aliens destroy most of Earth, leaving the titular Carl and Princess Donut, his ex-girlfriend’s cat, to fight in a bloodthirsty game of survival with rules that are part reality TV and part video game dungeon crawl. I particularly recommend the audiobook, voiced by Jeff Hays, which makes the numerous characters easy to differentiate. 

Journaling, offline and open-source

For years I’ve tried to find a perfect system to keep track of all my random notes and weird little rabbit holes of inspiration. None of my paper journals or paid apps have been able to top how customizable and convenient the developer-favorite notetaking app Obsidian is. Thanks to this app, I’ve been able to cancel subscription services I was using to track my reading habits, fitness goals, and journaling, and I also use it to track tasks I do for work, like drafting this article. It’s open-source and files are stored on my device, so I don’t have to worry about whether I’m sharing my private thoughts with a company that might scrape them for AI.

Bird-watching with Merlin 

Sometimes I have to make a conscious effort to step away from my screens and get out in the world. The latest version of the birding app Merlin, from the Cornell Lab of Ornithology, helps ease the transition. I can “collect” and identify species via step-by-step questions, photos, or, my favorite, audio that I record so that the app can analyze it to indicate which birds are singing in real time. Using the audio feature, I “captured” the red-eyed vireo flitting up in the tree canopy and backlit by the sun. Fantastic for my backyard feeder or while I’m out on the trail.

Dispatch: Partying at one of Africa’s largest AI gatherings

It’s late August in Rwanda’s capital, Kigali, and people are filling a large hall at one of Africa’s biggest gatherings of minds in AI and machine learning. The room is draped in white curtains, and a giant screen blinks with videos created with generative AI. A classic East African folk song by the Tanzanian singer Saida Karoli plays loudly on the speakers.

Friends greet each other as waiters serve arrowroot crisps and sugary mocktails. A man and a woman wearing leopard skins atop their clothes sip beer and chat; many women are in handwoven Ethiopian garb with red, yellow, and green embroidery. The crowd teems with life. “The best thing about the Indaba is always the parties,” computer scientist Nyalleng Moorosi tells me. Indaba means “gathering” in Zulu, and Deep Learning Indaba, where we’re meeting, is an annual AI conference where Africans present their research and technologies they’ve built.

Moorosi is a senior researcher at the Distributed AI Research Institute and has dropped in for the occasion from the mountain kingdom of Lesotho. Dressed in her signature “Mama Africa” headwrap, she makes her way through the crowded hall.

Moments later, a cheerful set of Nigerian music begins to play over the speakers. Spontaneously, people pop up and gather around the stage, waving flags of many African nations. Moorosi laughs as she watches. “The vibe at the Indaba—the community spirit—is really strong,” she says, clapping.

Moorosi is one of the founding members of the Deep Learning Indaba, which began in 2017 from a nucleus of 300 people gathered in Johannesburg, South Africa. Since then, the event has expanded into a prestigious pan-African movement with local chapters in 50 countries.

This year, nearly 3,000 people applied to join the Indaba; about 1,300 were accepted. They hail primarily from English-speaking African countries, but this year I noticed a new influx from Chad, Cameroon, the Democratic Republic of Congo, South Sudan, and Sudan. 

Moorosi tells me that the main “prize” for many attendees is to be hired by a tech company or accepted into a PhD program. Indeed, the organizations I’ve seen at the event include Microsoft Research’s AI for Good Lab, Google, the Mastercard Foundation, and the Mila–Quebec AI Institute. But she hopes to see more homegrown ventures create opportunities within Africa.

That evening, before the dinner, we’d both attended a panel on AI policy in Africa. Experts discussed AI governance and called for those developing national AI strategies to seek more community engagement. People raised their hands to ask how young Africans could access high-level discussions on AI policy, and whether Africa’s continental AI strategy was being shaped by outsiders. Later, in conversation, Moorosi told me she’d like to see more African priorities (such as African Union–backed labor protections, mineral rights, or safeguards against exploitation) reflected in such strategies. 

On the last day of the Indaba, I ask Moorosi about her dreams for the future of AI in Africa. “I dream of African industries adopting African-built AI products,” she says, after a long moment. “We really need to show our work to the world.” 

Abdullahi Tsanni is a science writer based in Senegal who specializes in narrative features. 

Job titles of the future: AI embryologist

Embryologists are the scientists behind the scenes of in vitro fertilization who oversee the development and selection of embryos, prepare them for transfer, and maintain the lab environment. They’ve been a critical part of IVF for decades, but their job has gotten a whole lot busier in recent years as demand for the fertility treatment skyrockets and clinics struggle to keep up. The United States is in fact facing a critical shortage of both embryologists and genetic counselors. 

Klaus Wiemer, a veteran embryologist and IVF lab director, believes artificial intelligence might help by predicting embryo health in real time and unlocking new avenues for productivity in the lab. 

Wiemer is the chief scientific officer and head of clinical affairs at Fairtility, a company that uses artificial intelligence to shed light on the viability of eggs and embryos before proceeding with IVF. The company’s algorithm, called CHLOE (for Cultivating Human Life through Optimal Embryos), has been trained on millions of embryo data points and outcomes and can quickly sift through a patient’s embryos to point the clinician to the ones with the highest potential for successful implantation. This, the company claims, will improve time to pregnancy and live births. While its effectiveness has been tested only retrospectively to date, CHLOE is the first and only FDA-approved AI tool for embryo assessment. 

Current challenge 

When a patient undergoes IVF, the goal is to make genetically normal embryos. Embryologists collect cells from each embryo and send them off for external genetic testing. The results of this biopsy can take up to two weeks, and the process can add thousands of dollars to the treatment cost. Moreover, passing the screen just means an embryo has the correct number of chromosomes. That number doesn’t necessarily reflect the overall health of the embryo. 

“An embryo has one singular function, and that is to divide,” says Wiemer. “There are millions of data points concerning embryo cell division, cell division characteristics, area and size of the inner cell mass, and the number of times the trophectoderm [the layer that contributes to the future placenta] contracts.”

The AI model allows a group of embryos to be constantly measured against the optimal characteristics at each stage of development. “What CHLOE answers is: How well did that embryo develop? And does it have all the necessary components that are needed in order to make a healthy implantation?” says Wiemer. CHLOE produces an AI score reflecting all the analysis done on an embryo. 
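To make the workflow concrete, here is a purely illustrative Python sketch of how a viability-scoring model might be used to rank a patient’s embryos. The field names, scores, and review threshold are all hypothetical; CHLOE’s actual model and outputs are proprietary:

```python
# Purely illustrative: ranking a patient's embryos by a model's
# viability score. Field names, scores, and the review threshold are
# hypothetical; CHLOE's actual model and outputs are proprietary.
from dataclasses import dataclass

@dataclass
class Embryo:
    embryo_id: str
    ai_score: float  # hypothetical implantation-potential score, 0 to 1

def rank_for_transfer(embryos, min_score=0.5):
    """Return embryos above a review threshold, best candidates first."""
    candidates = [e for e in embryos if e.ai_score >= min_score]
    return sorted(candidates, key=lambda e: e.ai_score, reverse=True)

cohort = [Embryo("E1", 0.82), Embryo("E2", 0.41), Embryo("E3", 0.67)]
print([e.embryo_id for e in rank_for_transfer(cohort)])  # -> ['E1', 'E3']
```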

In the near future, Wiemer says, reducing the percentage of abnormal embryos that IVF clinics transfer to patients should not require a biopsy: “Every embryology laboratory will be doing automatic assessments of embryo development.” 

A changing field

Wiemer, who started his career in animal science, says the difference between animal embryology and human embryology is the extent of paperwork. “Embryologists spend 40% of their time on non-embryology skills,” he adds. “AI will allow us to declutter the embryology field so we can get back to being true scientists.” This means spending more time studying the embryos, ensuring that they are developing normally, and using all that newfound information to get better at picking which embryos to transfer. 

“CHLOE is like having a virtual assistant in the lab to help with embryo selection, ensure conditions are optimal, and send out reports to patients and clinical staff,” he says. “Getting to study data and see what impacts embryo development is extremely rewarding, given that this capability was impossible a few years ago.” 

Amanda Smith is a freelance journalist and writer reporting on culture, society, human interest, and technology.

Inside the archives of the NASA Ames Research Center

At the southern tip of San Francisco Bay, surrounded by the tech giants Google, Apple, and Microsoft, sits the historic NASA Ames Research Center. Its rich history includes a grab bag of fascinating scientific research involving massive wind tunnels, experimental aircraft, supercomputing, astrobiology, and more.

Founded in 1939 as a West Coast lab for the National Advisory Committee for Aeronautics (NACA), NASA Ames was built to close the US gap with Germany in aeronautics research. Named for NACA founding member Joseph Sweetman Ames, the facility grew from a shack on Moffett Field into a sprawling compound with thousands of employees. A collection of 5,000 images from NASA Ames’s archives paints a vivid picture of bleeding-edge work at the heart of America’s technology hub. 

Wind tunnels

NASA AMES RESEARCH CENTER ARCHIVES

A key motivation for the new lab was the need for huge wind tunnels to jump-start America’s aeronautical research, which was far behind Germany’s. Smaller tunnels capable of speeds up to 300 miles per hour were built first, followed by a massive 40-by-80-foot tunnel for full-scale aircraft. Powered up in March 1941, these tunnels became vital after Pearl Harbor, helping scientists rapidly develop advanced aircraft.

Today, NASA Ames operates the world’s largest pressurized wind tunnel, with subsonic and transonic chambers for testing rockets, aircraft, and wind turbines.

Pioneer and Voyager 2

NASA AMES RESEARCH CENTER ARCHIVES

From 1965 to 1992, Ames managed the Pioneer missions, which explored the moon, Venus, Jupiter, and Saturn. It also contributed to Voyager 2, launched in 1977, which journeyed past four planets before entering interstellar space in 2018. Ames’s archive preserves our first glimpses of strange new worlds seen during these pioneering missions.

Odd aircraft

aircraft in flight

NASA AMES RESEARCH CENTER ARCHIVES

The skeleton of a hulking airship hangar, obsolete even before its completion, remains on NASA Ames’s campus.

Many odd-looking experimental aircraft, such as vertical take-off and landing (VTOL) aircraft, jets, and rotorcraft, have been developed and tested at the facility over the years, and new designs continue to take shape there today.

Vintage illustrations

NASA AMES RESEARCH CENTER ARCHIVES

Awe-inspiring retro illustrations in the Ames archives depict surfaces of distant planets, NASA spacecraft descending into surreal alien landscapes, and fantastical renderings of future ring-shaped human habitats in space. The optimism and excitement of the ’70s and ’80s are evident. 

Bubble suits and early VR

person in an early VR suit

NASA AMES RESEARCH CENTER ARCHIVES

In the 1980s, NASA Ames researchers worked to develop next-generation space suits, such as the bulbous, hard-shelled AX-5 model. NASA Ames’s Human-Machine Interaction Group also did pioneering work in the 1980s with virtual reality and came up with some wild-looking hardware. Long before today’s AR/VR boom, Ames researchers glimpsed the technology’s potential, which was limited only by computing power.

Decades of federally funded research at Ames fueled breakthroughs in aviation, spaceflight, and supercomputing, an enduring legacy now at risk as federal grants for science face deep cuts.

A version of this story appeared on Beau­tiful Public Data (beautifulpublicdata.com), a newsletter by Jon Keegan that curates visually interesting data sets collected by local, state, and federal government agencies.

AI could predict who will have a heart attack

For all the modern marvels of cardiology, we struggle to predict who will have a heart attack. Many people never get screened at all. Now, startups like Bunkerhill Health, Nanox.AI, and HeartLung Technologies are applying AI algorithms to screen millions of CT scans for early signs of heart disease. This technology could be a breakthrough for public health, applying an old tool to uncover patients whose high risk for a heart attack is hiding in plain sight. But it remains unproven at scale while raising thorny questions about implementation and even how we define disease. 

Last year, an estimated 20 million Americans had chest CT scans done, after an event like a car accident or to screen for lung cancer. Frequently, these scans show evidence of coronary artery calcium (CAC), a marker for heart attack risk, but that evidence is often buried in or left out of a radiology report focused on ruling out bony injuries, life-threatening internal trauma, or cancer.

Dedicated testing for CAC remains an underutilized method of predicting heart attack risk. Over decades, plaque in heart arteries moves through its own life cycle, hardening from lipid-rich residue into calcium. Heart attacks themselves typically occur when younger, lipid-rich plaque unpredictably ruptures, kicking off a cascade of clotting and inflammation that ultimately blocks the heart’s blood supply. Calcified plaque is generally stable, but finding CAC suggests that younger, more rupture-prone plaque is likely present too. 

Coronary artery calcium can often be spotted on chest CTs, and its concentration can be subjectively described. Normally, quantifying a person’s CAC score involves obtaining a heart-specific CT scan. Algorithms that calculate CAC scores from routine chest CTs, however, could massively expand access to this metric. In practice, these algorithms could then be deployed to alert patients and their doctors about abnormally high scores, encouraging them to seek further care. Today, the footprint of the startups offering AI-derived CAC scores is not large, but it is growing quickly. As their use grows, these algorithms may identify high-risk patients who are traditionally missed or who are on the margins of care. 
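As a concrete illustration, here is a minimal Python sketch of how an AI-derived calcium score might be triaged, using commonly cited Agatston bands; the alerting threshold is an assumption for illustration, not any vendor’s actual cutoff:

```python
# Sketch: triaging an AI-derived coronary artery calcium (CAC) score
# with commonly used Agatston bands (0, 1-99, 100-399, 400+). The
# alerting threshold below is an assumption, not any vendor's cutoff.
def cac_category(agatston: float) -> str:
    if agatston == 0:
        return "no identifiable calcified plaque"
    if agatston < 100:
        return "mild calcification"
    if agatston < 400:
        return "moderate calcification"
    return "severe calcification"

def flag_for_followup(agatston: float) -> bool:
    return agatston >= 100  # assumed threshold for alerting a clinician

print(cac_category(250), flag_for_followup(250))  # -> moderate calcification True
```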

Historically, CAC scans were believed to have marginal benefit and were marketed to the worried well. Even today, most insurers won’t cover them. Attitudes, though, may be shifting. More expert groups are endorsing CAC scores as a way to refine cardiovascular risk estimates and persuade skeptical patients to start taking statins. 

The promise of AI-derived CAC scores is part of a broader trend toward mining troves of medical data to spot otherwise undetected disease. But while it seems promising, the practice raises plenty of questions. For example, CAC scores haven’t proved useful as a blunt instrument for universal screening. A 2022 Danish study evaluating a population-based program showed no benefit in mortality rates for patients who had undergone CAC screening tests. If AI delivered this information automatically, would the calculus really shift? 

And with widespread adoption, abnormal CAC scores will become common. Who follows up on these findings? “Many health systems aren’t yet set up to act on incidental calcium findings at scale,” says Nishith Khandwala, the cofounder of Bunkerhill Health. Without a standard procedure for doing so, he says, “you risk creating more work than value.” 

There’s also the question of whether these AI-generated scores would actually improve patient care. For a symptomatic patient, a CAC score of zero may offer false reassurance. For the asymptomatic patient with a high CAC score, the next steps remain uncertain. Beyond statins, it isn’t clear if these patients would benefit from starting costly cholesterol-lowering drugs such as Repatha or other PCSK9 inhibitors. A high score may also encourage some patients to pursue unnecessary and costly downstream procedures that could even end up doing harm. Currently, AI-derived CAC scoring is not reimbursed as a separate service by Medicare or most insurers. The business case for this technology today, effectively, lies in these potentially perverse incentives. 

At a fundamental level, this approach could actually change how we define disease. Adam Rodman, a hospitalist and AI expert at Beth Israel Deaconess Medical Center in Boston, has observed that AI-derived CAC scores share similarities with the “incidentaloma,” a term coined in the 1980s to describe unexpected findings on CT scans. In both cases, the normal pattern of diagnosis—in which doctors and patients deliberately embark on testing to figure out what’s causing a specific problem—was fundamentally disrupted. But, as Rodman notes, incidentalomas were still found by humans reviewing the scans. 

Now, he says, we are entering an era of “machine-based nosology,” where algorithms define diseases on their own terms. As machines make more diagnoses, they may catch things we miss. But Rodman and I began to wonder if a two-tiered diagnostic future may emerge, where “haves” pay for brand-name algorithms while “have-nots” settle for lesser alternatives. 

For patients who have no risk factors or are detached from regular medical care, an AI-derived CAC score could potentially catch problems earlier and rewrite the script. But how these scores reach people, what is done about them, and whether they can ultimately improve patient outcomes at scale remain open questions. For now—holding the pen as they toggle between patients and algorithmic outputs—clinicians still matter. 

Vishal Khetpal is a fellow in cardiovascular disease. The views expressed in this article do not represent those of his employers.