Why AI could eat quantum computing’s lunch

Tech companies have been funneling billions of dollars into quantum computers for years. The hope is that they’ll be a game changer for fields as diverse as finance, drug discovery, and logistics.

Those expectations have been especially high in physics and chemistry, where the weird effects of quantum mechanics come into play. In theory, this is where quantum computers could have a huge advantage over conventional machines.

But while the field struggles with the realities of tricky quantum hardware, another challenger is making headway in some of these most promising use cases. AI is now being applied to fundamental physics, chemistry, and materials science in a way that suggests quantum computing’s purported home turf might not be so safe after all.

The scale and complexity of quantum systems that can be simulated using AI is advancing rapidly, says Giuseppe Carleo, a professor of computational physics at the Swiss Federal Institute of Technology (EPFL). Last month, he coauthored a paper published in Science showing that neural-network-based approaches are rapidly becoming the leading technique for modeling materials with strong quantum properties. Meta also recently unveiled an AI model trained on a massive new data set of materials that has jumped to the top of a leaderboard for machine-learning approaches to material discovery.

Given the pace of recent advances, a growing number of researchers are now asking whether AI could solve a substantial chunk of the most interesting problems in chemistry and materials science before large-scale quantum computers become a reality. 

“The existence of these new contenders in machine learning is a serious hit to the potential applications of quantum computers,” says Carleo. “In my opinion, these companies will find out sooner or later that their investments are not justified.”

Exponential problems

The promise of quantum computers lies in their potential to carry out certain calculations much faster than conventional computers. Realizing this promise will require much larger quantum processors than we have today. The biggest devices have just crossed the thousand-qubit mark, but achieving an undeniable advantage over classical computers will likely require tens of thousands, if not millions. Once that hardware is available, though, a handful of quantum algorithms, like the encryption-cracking Shor’s algorithm, have the potential to solve problems exponentially faster than classical algorithms can. 

But for many quantum algorithms with more obvious commercial applications, like searching databases, solving optimization problems, or powering AI, the speed advantage is more modest. And last year, a paper coauthored by Microsoft’s head of quantum computing, Matthias Troyer, showed that these theoretical advantages disappear if you account for the fact that quantum hardware operates orders of magnitude slower than modern computer chips. The difficulty of getting large amounts of classical data in and out of a quantum computer is also a major barrier. 

So Troyer and his colleagues concluded that quantum computers should instead focus on problems in chemistry and materials science that require simulation of systems where quantum effects dominate. A computer that operates along the same quantum principles as these systems should, in theory, have a natural advantage here. In fact, this has been a driving idea behind quantum computing ever since the renowned physicist Richard Feynman first proposed it.

The rules of quantum mechanics govern many things with huge practical and commercial value, like proteins, drugs, and materials. Their properties are determined by the interactions of their constituent particles, in particular their electrons—and simulating these interactions in a computer should make it possible to predict what kinds of characteristics a molecule will exhibit. This could prove invaluable for discovering things like new medicines or more efficient battery chemistries, for example. 

But the intuition-defying rules of quantum mechanics—in particular, the phenomenon of entanglement, which allows the quantum states of distant particles to become intrinsically linked—can make these interactions incredibly complex. Precisely tracking them requires complicated math that gets exponentially tougher the more particles are involved. That can make simulating large quantum systems intractable on classical machines.
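To see how quickly that intractability sets in, consider a back-of-the-envelope sketch (an illustration, not a calculation from any of the papers discussed here): every additional two-level particle doubles the number of amplitudes needed to write down the system's quantum state exactly.

```python
# A system of n two-level particles (spins or qubits) needs 2**n complex
# amplitudes to describe its quantum state exactly; at 16 bytes per complex
# number, exact storage becomes impossible long before molecule-sized systems.
for n in (10, 30, 50, 300):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9
    print(f"{n:>3} particles: {amplitudes:.2e} amplitudes, ~{gigabytes:.2e} GB")
```

Thirty particles already need roughly 17 gigabytes; fifty need petabytes; a few hundred would require more numbers than there are atoms in the observable universe.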

This is where quantum computers could shine. Because they also operate on quantum principles, they are able to represent quantum states much more efficiently than is possible on classical machines. They could also take advantage of quantum effects to speed up their calculations.

But not all quantum systems are the same. Their complexity is determined by the extent to which their particles interact, or correlate, with each other. In systems where these interactions are strong, tracking all these relationships can quickly explode the number of calculations required to model the system. But in most systems that are of practical interest to chemists and materials scientists, correlation is weak, says Carleo. That means their particles don’t affect each other’s behavior significantly, which makes the systems far simpler to model.

The upshot, says Carleo, is that quantum computers are unlikely to provide any advantage for most problems in chemistry and materials science. Classical tools that can accurately model weakly correlated systems already exist, the most prominent being density functional theory (DFT). The insight behind DFT is that all you need to understand a system’s key properties is its electron density, a measure of how its electrons are distributed in space. This makes for much simpler computation but can still provide accurate results for weakly correlated systems.

Simulating large systems using these approaches requires considerable computing power. But in recent years there’s been an explosion of research using DFT to generate data on chemicals, biomolecules, and materials—data that can be used to train neural networks. These AI models learn patterns in the data that allow them to predict what properties a particular chemical structure is likely to have, but they are orders of magnitude cheaper to run than conventional DFT calculations. 
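As a rough sketch of that workflow (a toy illustration with synthetic stand-in data, not Meta's model or dataset), the idea is to pay the DFT cost once to label training examples, fit a neural network to those labels, and then screen new candidates with a cheap forward pass:

```python
# Toy surrogate-model workflow: expensive "DFT" labels are computed once,
# then a neural network predicts the property for new structures in microseconds.
# Synthetic data stands in for real atomic descriptors and DFT energies.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

X = rng.normal(size=(5000, 32))                          # stand-in structural descriptors
y = X[:, :8].sum(axis=1) + 0.1 * rng.normal(size=5000)   # stand-in DFT-computed property

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
surrogate.fit(X_train, y_train)                          # the expensive step, done once
print("held-out R^2:", round(surrogate.score(X_test, y_test), 3))

# Screening new candidates is now a fast forward pass instead of a fresh DFT run.
candidates = rng.normal(size=(100, 32))
predicted_property = surrogate.predict(candidates)
```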

This approach has dramatically expanded the size of systems that can be modeled—to as many as 100,000 atoms at a time—and how long simulations can run, says Alexandre Tkatchenko, a physics professor at the University of Luxembourg. “It’s wonderful. You can really do most of chemistry,” he says.

Olexandr Isayev, a chemistry professor at Carnegie Mellon University, says these techniques are already being widely applied by companies in chemistry and life sciences. And for researchers, previously out-of-reach problems such as optimizing chemical reactions, developing new battery materials, and understanding protein binding are finally becoming tractable.

As with most AI applications, the biggest bottleneck is data, says Isayev. Meta’s recently released materials data set was made up of DFT calculations on 118 million molecules. A model trained on this data achieved state-of-the-art performance, but creating the training material took vast computing resources, well beyond what’s accessible to most research teams. That means fulfilling the full promise of this approach will require massive investment.

Modeling a weakly correlated system using DFT is not an exponentially scaling problem, though. This suggests that with more data and computing resources, AI-based classical approaches could simulate even the largest of these systems, says Tkatchenko. Given that quantum computers powerful enough to compete are likely still decades away, he adds, AI’s current trajectory suggests it could reach important milestones, such as precisely simulating how drugs bind to a protein, much sooner.

Strong correlations

When it comes to simulating strongly correlated quantum systems—ones whose particles interact a lot—methods like DFT quickly run out of steam. While more exotic, these systems include materials with potentially transformative capabilities, like high-temperature superconductivity or ultra-precise sensing. But even here, AI is making significant strides.

In 2017, EPFL’s Carleo and Microsoft’s Troyer published a seminal paper in Science showing that neural networks could model strongly correlated quantum systems. The approach doesn’t learn from data in the classical sense. Instead, Carleo says, it is similar to DeepMind’s AlphaZero model, which mastered the games of Go, chess, and shogi using nothing more than the rules of each game and the ability to play itself.

In this case, the rules of the game are provided by Schrödinger’s equation, which can precisely describe a system’s quantum state, or wave function. The model plays against itself by arranging particles in a certain configuration and then measuring the system’s energy level. The goal is to reach the lowest energy configuration (known as the ground state), which determines the system’s properties. The model repeats this process until energy levels stop falling, indicating that the ground state—or something close to it—has been reached.
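The loop described above can be sketched in a few dozen lines. The toy example below is an illustration of the general idea, not code from the 2017 paper: a tiny network assigns a positive amplitude to every configuration of a six-spin transverse-field Ising chain, and plain gradient descent lowers the energy until it stops falling. Real implementations sample configurations with Monte Carlo methods rather than enumerating all of them, and use far richer networks, but the logic is the same.

```python
# Toy neural-network wave function for a 6-spin transverse-field Ising chain.
# An illustration of the general idea, not the method from the 2017 Science paper:
# real codes use Monte Carlo sampling instead of enumerating every configuration.
import itertools
import numpy as np

N, J, h = 6, 1.0, 1.0
# All 2**6 = 64 spin configurations, ordered to match the Hamiltonian basis below.
configs = np.array(list(itertools.product([1, -1], repeat=N)), dtype=float)

# H = -J * sum_i sz_i sz_{i+1} - h * sum_i sx_i  (periodic boundary conditions)
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
I2 = np.eye(2)

def site_operator(op, i):
    """Embed a single-site operator at position i in the 64-dimensional space."""
    out = np.array([[1.0]])
    for j in range(N):
        out = np.kron(out, op if j == i else I2)
    return out

H = sum(-J * site_operator(sz, i) @ site_operator(sz, (i + 1) % N)
        - h * site_operator(sx, i) for i in range(N))

def amplitudes(params):
    """One hidden layer mapping each spin configuration to a positive amplitude."""
    W = params[:N * 8].reshape(N, 8)
    b = params[N * 8:N * 8 + 8]
    v = params[N * 8 + 8:]
    return np.exp(np.tanh(configs @ W + b) @ v)

def energy(params):
    psi = amplitudes(params)
    return psi @ H @ psi / (psi @ psi)   # Rayleigh quotient >= true ground energy

# "Self-play": plain gradient descent with finite-difference gradients,
# repeated until the energy stops falling.
rng = np.random.default_rng(0)
params = 0.1 * rng.normal(size=N * 8 + 8 + 8)
for _ in range(500):
    grad = np.zeros_like(params)
    for k in range(params.size):
        step = np.zeros_like(params)
        step[k] = 1e-5
        grad[k] = (energy(params + step) - energy(params - step)) / 2e-5
    params -= 0.05 * grad

print("variational energy :", energy(params))
print("exact ground energy:", np.linalg.eigvalsh(H)[0])
```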

The power of these models is their ability to compress information, says Carleo. “The wave function is a very complicated mathematical object,” he says. “What has been shown by several papers now is that [the neural network] is able to capture the complexity of this object in a way that can be handled by a classical machine.”

Since the 2017 paper, the approach has been extended to a wide range of strongly correlated systems, says Carleo, and results have been impressive. The Science paper he published with colleagues last month put leading classical simulation techniques to the test on a variety of tricky quantum simulation problems, with the goal of creating a benchmark to judge advances in both classical and quantum approaches.

Carleo says that neural-network-based techniques are now the best approach for simulating many of the most complex quantum systems they tested. “Machine learning is really taking the lead in many of these problems,” he says.

These techniques are catching the eye of some big players in the tech industry. In August, researchers at DeepMind showed in a paper in Science that they could accurately model excited states in quantum systems, which could one day help predict the behavior of things like solar cells, sensors, and lasers. Scientists at Microsoft Research have also developed an open-source software suite to help more researchers use neural networks for simulation.

One of the main advantages of the approach is that it piggybacks on massive investments in AI software and hardware, says Filippo Vicentini, a professor of AI and condensed-matter physics at École Polytechnique in France, who was also a coauthor on the Science benchmarking paper: “Being able to leverage these kinds of technological advancements gives us a huge edge.”

There is a caveat: Because the ground states are effectively found through trial and error rather than explicit calculations, they are only approximations. But this is also why the approach could make progress on what has looked like an intractable problem, says Juan Carrasquilla, a researcher at ETH Zurich, and another coauthor on the Science benchmarking paper.

If you want to precisely track all the interactions in a strongly correlated system, the number of calculations you need to do rises exponentially with the system’s size. But if you’re happy with an answer that is just good enough, there’s plenty of scope for taking shortcuts. 

“Perhaps there’s no hope to capture it exactly,” says Carrasquilla. “But there’s hope to capture enough information that we capture all the aspects that physicists care about. And if we do that, it’s basically indistinguishable from a true solution.”

And while strongly correlated systems are generally too hard to simulate classically, there are notable instances where this isn’t the case. That includes some systems that are relevant for modeling high-temperature superconductors, according to a 2023 paper in Nature Communications.

“Because of the exponential complexity, you can always find problems for which you can’t find a shortcut,” says Frank Noe, research manager at Microsoft Research, who has led much of the company’s work in this area. “But I think the number of systems for which you can’t find a good shortcut will just become much smaller.”

No magic bullets

However, Stefanie Czischek, an assistant professor of physics at the University of Ottawa, says it can be hard to predict what problems neural networks can feasibly solve. For some complex systems they do incredibly well, but then on other seemingly simple ones, computational costs balloon unexpectedly. “We don’t really know their limitations,” she says. “No one really knows yet what are the conditions that make it hard to represent systems using these neural networks.”

Meanwhile, there have also been significant advances in other classical quantum simulation techniques, says Antoine Georges, director of the Center for Computational Quantum Physics at the Flatiron Institute in New York, who also contributed to the recent Science benchmarking paper. “They are all successful in their own right, and they are also very complementary,” he says. “So I don’t think these machine-learning methods are just going to completely put all the other methods out of business.”

Quantum computers will also have their niche, says Martin Roetteler, senior director of quantum solutions at IonQ, which is developing quantum computers built from trapped ions. While he agrees that classical approaches will likely be sufficient for simulating weakly correlated systems, he’s confident that some large, strongly correlated systems will be beyond their reach. “The exponential is going to bite you,” he says. “There are cases with strongly correlated systems that we cannot treat classically. I’m strongly convinced that that’s the case.”

In contrast, he says, a future fault-tolerant quantum computer with many more qubits than today’s devices will be able to simulate such systems. This could help find new catalysts or improve understanding of metabolic processes in the body—an area of interest to the pharmaceutical industry.

Neural networks are likely to increase the scope of problems that can be solved, says Jay Gambetta, who leads IBM’s quantum computing efforts, but he’s unconvinced they’ll solve the hardest challenges businesses are interested in.

“That’s why many different companies that essentially have chemistry as their requirement are still investigating quantum—because they know exactly where these approximation methods break down,” he says.

Gambetta also rejects the idea that the technologies are rivals. He says the future of computing is likely to involve a hybrid of the two approaches, with quantum and classical subroutines working together to solve problems. “I don’t think they’re in competition. I think they actually add to each other,” he says.

But Scott Aaronson, who directs the Quantum Information Center at the University of Texas, says machine-learning approaches are directly competing against quantum computers in areas like quantum chemistry and condensed-matter physics. He predicts that a combination of machine learning and quantum simulations will outperform purely classical approaches in many cases, but that won’t become clear until larger, more reliable quantum computers are available.

“From the very beginning, I’ve treated quantum computing as first and foremost a scientific quest, with any industrial applications as icing on the cake,” he says. “So if quantum simulation turns out to beat classical machine learning only rarely, I won’t be quite as crestfallen as some of my colleagues.”

One area where quantum computers look likely to have a clear advantage is in simulating how complex quantum systems evolve over time, says EPFL’s Carleo. This could provide invaluable insights for scientists in fields like statistical mechanics and high-energy physics, but it seems unlikely to lead to practical uses in the near term. “These are more niche applications that, in my opinion, do not justify the massive investments and the massive hype,” Carleo adds.

Nonetheless, the experts MIT Technology Review spoke to said a lack of commercial applications is not a reason to stop pursuing quantum computing, which could lead to fundamental scientific breakthroughs in the long run.

“Science is like a set of nested boxes—you solve one problem and you find five other problems,” says Vicentini. “The complexity of the things we study will increase over time, so we will always need more powerful tools.”

What’s next for reproductive rights in the US

This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.

Earlier this week, Americans cast their votes in a seminal presidential election. But it wasn’t just the future president of the US that was on the ballot. Ten states also voted on abortion rights.

Two years ago, the US Supreme Court overturned Roe v. Wade, a legal decision that protected the right to abortion. Since then, abortion bans have been enacted in multiple states, and millions of people in the US have lost access to local clinics.

Now, some states are voting to extend and protect access to abortion. This week, seven states voted in support of such measures. And voters in Missouri, a state that has long restricted access, have voted to overturn its ban.

It’s not all good news for proponents of reproductive rights—some states voted against abortion access. And questions remain over the impact of a second term under former president Donald Trump, who is set to return to the post in January.

Roe v. Wade, the legal decision that enshrined a constitutional right to abortion in the US in 1973, guaranteed the right to an abortion up to the point of fetal viability, which is generally considered to be around 24 weeks of pregnancy. It was overturned by the US Supreme Court in the summer of 2022.

Within 100 days of the decision, 13 states had enacted total bans on abortion from the moment of conception. Clinics in these states could no longer offer abortions. Other states also restricted abortion access. In that 100-day period, 66 of the 79 clinics across 15 states stopped offering abortion services, and 26 closed completely, according to research by the Guttmacher Institute.

The political backlash to the decision was intense. This week, abortion was on the ballot in 10 states: Arizona, Colorado, Florida, Maryland, Missouri, Montana, Nebraska, Nevada, New York, and South Dakota. And seven of them voted in support of abortion access.

The impact of these votes will vary by state. Abortion was already legal in Maryland, for example. But the new measures should make it more difficult for lawmakers to restrict reproductive rights in the future. In Arizona, abortions after 15 weeks had been banned since 2022. There, voters approved an amendment to the state constitution that will guarantee access to abortion until fetal viability.

Missouri was the first state to enact an abortion ban once Roe v. Wade was overturned. The state’s current Right to Life of the Unborn Child Act prohibits doctors from performing abortions unless there is a medical emergency. It has no exceptions for rape or incest. This week, the state voted to overturn that ban and protect access to abortion up to fetal viability. 

Not all states voted in support of reproductive rights. Amendments to expand access failed to garner enough support in Nebraska, South Dakota, and Florida. In Florida, for example, where abortions after six weeks of pregnancy are banned, an amendment to protect access until fetal viability got 57% of the vote, falling just short of the 60% the state required for it to pass.

It’s hard to predict how reproductive rights will fare over the course of a second Trump term. Trump himself has been inconsistent on the issue. During his first term, he installed members of the Supreme Court who helped overturn Roe v. Wade. During his most recent campaign he said that decisions on reproductive rights should be left to individual states.

Trump, himself a Florida resident, has refused to comment on how he voted in the state’s recent ballot question on abortion rights. When asked, he said that the reporter who posed the question “should just stop talking about that,” according to the Associated Press.

State decisions can affect reproductive rights beyond abortion access. Just look at Alabama. In February, the Alabama Supreme Court ruled that frozen embryos can be considered children under state law. Embryos are routinely cryopreserved in the course of in vitro fertilization treatment, and the ruling was considered likely to significantly restrict access to IVF in the state. (In March, the state passed another law protecting clinics from legal repercussions should they damage or destroy embryos during IVF procedures, but the status of embryos remains unchanged.)

The fertility treatment became a hot topic during this year’s campaign. In October, Trump bizarrely referred to himself as “the father of IVF.” That title is usually reserved for Robert Edwards, the British researcher who won the 2010 Nobel prize in physiology or medicine for developing the technology in the 1970s.

Whatever is in store for reproductive rights in the US in the coming months and years, all we’ve seen so far suggests that it’s likely to be a bumpy ride.


Now read the rest of The Checkup

Read more from MIT Technology Review’s archive

My colleague Rhiannon Williams reported on the immediate aftermath of the decision that reversed Roe v. Wade when it was announced a couple of years ago. 

The Alabama Supreme Court ruling on embryos could also affect the development of technologies designed to serve as “artificial wombs,” as Antonio Regalado explained at the time.

Other technologies are set to change the way we have babies. Some, which could lead to the creation of children with four parents or none at all, stand to transform our understanding of parenthood.  

We’ve also reported on attempts to create embryo-like structures using stem cells. These structures look like embryos but are created without eggs or sperm. There’s a “wild race” afoot to make these more like the real thing. But both scientific and ethical questions remain over how far we can—and should—go.

My colleagues have been exploring what the US election outcome might mean for climate policies. Senior climate editor James Temple writes that Trump’s victory is “a stunning setback for climate change.” And senior reporter Casey Crownhart explains how efforts including a trio of laws implemented by the Biden administration, which massively increased climate funding, could be undone.

From around the web

Donald Trump has said he’ll let Robert F. Kennedy Jr. “go wild on health.” Here’s where the former environmental lawyer and independent candidate—who has no medical or public health degrees—stands on vaccines, fluoride, and the Affordable Care Act. (New York Times)

Bird flu has been detected in pigs on a farm in Oregon. It’s a worrying development that virologists were dreading. (The Conversation)

And, in case you need it, here’s some lighter reading:

Scientists are sequencing the DNA of tiny marine plankton for the first time. (Come for the story of the scientific expedition; stay for the beautiful images of jellies and sea sapphires.) (The Guardian)

Dolphins are known to communicate with whistles and clicks. But scientists were surprised to find a “highly vocal” solitary dolphin in the Baltic Sea. They think the animal is engaging in “dolphin self-talk.” (Bioacoustics)

How much do you know about baby animals? Test your knowledge in this quiz. (National Geographic)

The US is about to make a sharp turn on climate policy

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Voters have elected Donald Trump to a second term in the White House.

In the days leading up to the election, I kept thinking about what four years means for climate change right now. We’re at a critical moment that requires decisive action to rapidly slash greenhouse-gas emissions from power plants, transportation, industry, and the rest of the economy if we’re going to achieve our climate goals.

The past four years have seen the US take climate action seriously, working with the international community and pumping money into solutions. Now, we’re facing a period where things are going to be very different. A Trump presidency will have impacts far beyond climate, but for the sake of this newsletter, we’ll stay focused on what four years means in the climate fight as we start to make sense of this next chapter. 

Joe Biden arguably did more to combat climate change than any other American president. One of his first actions in office was rejoining the Paris climate accord—Trump pulled out of the international agreement to fight climate change during his first term in office. Biden then quickly set a new national goal to cut US carbon emissions in half, relative to their peak, by 2030.

The Environmental Protection Agency rolled out rules for power plants to slash pollution that harms both human health and the climate. The agency also announced new regulations for vehicle emissions to push the country toward EVs.

And the cornerstone of the Biden years has been unprecedented climate investment. A trio of laws—the Bipartisan Infrastructure Law, the CHIPS and Science Act, and the Inflation Reduction Act—pumped hundreds of billions of dollars into infrastructure and research, much of it on climate.

Now, this ship is about to make a quick turn. Donald Trump has regularly dismissed the threat of climate change and promised throughout the campaign to counter some of Biden’s key moves.

We can expect to see a dramatic shift in how the US talks about climate on the international stage. Trump has vowed to once again withdraw from the Paris agreement. Things are going to be weird at the annual global climate talks that kick off next week.

We can also expect to see efforts to undo some of Biden’s key climate actions, most centrally the Inflation Reduction Act, as my colleague James Temple covered earlier this year.

What, exactly, Trump can do will depend on whether Republicans take control of both houses of Congress. A clean sweep would open up more lanes for targeting legislation passed under Biden. (As of sending this email, Republicans have secured enough seats to control the Senate, but control of the House is uncertain and could remain so for days or even weeks.)

I don’t think the rug will be entirely pulled out from under the IRA—portions of the investment from the law are beginning to pay off, and the majority of the money has gone to Republican districts. But there will certainly be challenges to pieces, especially the EV tax credits, which Trump has been laser-focused on during the campaign.

This all adds up to a very different course on climate than what many had hoped we might see for the rest of this decade.

A Trump presidency could add 4 billion metric tons of carbon dioxide emissions to the atmosphere by 2030 over what was expected from a second Biden term, according to an analysis published in April by the website Carbon Brief (this was before Biden dropped out of the race). That projection sees emissions under Trump dropping to 28% below the peak by the end of the decade—nowhere near the 50% target set by Biden at the beginning of his term.

The US, which is currently the world’s second-largest greenhouse-gas emitter and has added more climate pollution to the atmosphere than any other nation, is now very unlikely to hit Biden’s 2030 goal. That’s basically the final nail in the coffin for efforts to limit global warming to 1.5 °C (2.7 °F) over preindustrial levels.

In the days, weeks, and years ahead we’ll be covering what this change will mean for efforts to combat climate change and to protect the most vulnerable from the dangerous world we’re marching toward—indeed, already living in. Stay tuned for more from us.


Now read the rest of The Spark

Related reading

Trump wants to unravel Biden’s landmark climate law. Read our coverage from earlier this year to see what’s most at risk.

It’s been two years since the Inflation Reduction Act was passed, ushering in hundreds of billions of dollars in climate investment. Read more about the key provisions in this newsletter from August.


Another thing

Jennifer Doudna, one of the inventors of the gene-editing tool CRISPR, says the tech could be a major tool to help address climate change and deal with the growing risks of our changing world. 

The hope is that CRISPR’s ability to chop out specific pieces of DNA will make it faster and easier to produce climate-resilient crops and livestock, while avoiding the pitfalls of previous attempts to tweak the genomes of plants and animals. Read the full story from my colleague James Temple.

Keeping up with climate  

Startup Redoxblox is building a technology that’s not exactly a thermal battery, but it’s not not a thermal battery either. The company raised just over $30 million to build its systems, which store energy in both heat and chemical bonds. (Heatmap)

It’s been a weird fall in the US Northeast—a rare drought has brought a string of wildfires, and New York City is seeing calls to conserve water. (New York Times)

It’s been bumpy skies this week for electric-plane startups. Beta Technologies raised over $300 million in funding, while Lilium may be filing for insolvency soon. (Canary Media)

→ The runway for futuristic electric planes is still a long one. (MIT Technology Review)

Meta’s plan to build a nuclear-powered AI data center has been derailed by a rare species of bee living on land earmarked for the project. (Financial Times)

The atmospheric concentration of methane—a powerful greenhouse gas—has been mysteriously climbing since 2007, and that growth nearly doubled in 2020. Now scientists may have finally figured out the culprits: microbes in wetlands that are getting warmer and wetter. (Washington Post)

Greenhouse-gas emissions from the European Union fell by 8% in 2023. The drop is thanks to efforts to shut down coal-fired power plants and generate more electricity from renewables like solar and wind. (The Guardian)

Four electric school buses could help officials figure out how to charge future bus fleets. A project in Brooklyn will aim to use onsite renewables and smart charging to control the costs and grid stress of EV charging depots. (Canary Media)

How ChatGPT search paves the way for AI agents

This story originally appeared in The Algorithm, our weekly newsletter on AI. To get stories like this in your inbox first, sign up here.

OpenAI’s Olivier Godement, head of product for its platform, and Romain Huet, head of developer experience, are on a whistle-stop tour around the world. Last week, I sat down with the pair in London before DevDay, the company’s annual developer conference. London’s DevDay is the first one for the company outside San Francisco. Godement and Huet are heading to Singapore next. 

It’s been a busy few weeks for the company. In London, OpenAI announced updates to its new Realtime API platform, which allows developers to build voice features into their applications. The company is rolling out new voices and a function that lets developers generate prompts, which will allow them to build apps and more helpful voice assistants more quickly. Meanwhile for consumers, OpenAI announced it was launching ChatGPT search, which allows users to search the internet using the chatbot. Read more here

Both developments pave the way for the next big thing in AI: agents. These are AI assistants that can complete complex chains of tasks, such as booking flights. (You can read my explainer on agents here.) 

“Fast-forward a few years—every human on Earth, every business, has an agent. That agent knows you extremely well. It knows your preferences,” Godement says. The agent will have access to your emails, apps, and calendars and will act like a chief of staff, interacting with each of these tools and even working on long-term problems, such as writing a paper on a particular topic, he says. 

OpenAI’s strategy is to both build agents itself and allow developers to use its software to build their own agents, says Godement. Voice will play an important role in what agents will look and feel like. 

“At the moment most of the apps are chat based … which is cool, but not suitable for all use cases. There are some use cases where you’re not typing, not even looking at the screen, and so voice essentially has a much better modality for that,” he says. 

But there are two big hurdles that need to be overcome before agents can become a reality, Godement says. 

The first is reasoning. Building AI agents requires us to be able to trust that they will complete complex tasks and do the right things, says Huet. That’s where OpenAI’s “reasoning” feature comes in. Introduced in OpenAI’s o1 model last month, it uses reinforcement learning to teach the model how to process information using “chain of thought.” Giving the model more time to generate answers allows it to recognize and correct mistakes, break down problems into smaller ones, and try different approaches to answering questions, Godement says.
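For developers, the practical difference shows up in how the model is called and how much latency and cost you accept. The snippet below is a generic illustration, not OpenAI's training method: it sends the same puzzle to a standard chat model, prompted to reason step by step, and to a reasoning model such as o1-preview, which generates its chain of thought internally before answering. It assumes the OpenAI Python SDK and an API key, and the model names are examples that may change.

```python
# Illustration only: comparing a standard chat model with a "reasoning" model.
# Assumes the OpenAI Python SDK (v1.x) and OPENAI_API_KEY set in the environment;
# the model names below are examples and may change over time.
from openai import OpenAI

client = OpenAI()
problem = ("A bat and a ball cost $1.10 together. The bat costs $1.00 more "
           "than the ball. How much does the ball cost?")

# Standard model: the chain of thought has to be asked for in the prompt.
fast = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Think step by step. " + problem}],
)

# Reasoning model: extra "thinking" tokens are generated internally before the
# final answer, trading speed and cost for more reliable multi-step reasoning.
slow = client.chat.completions.create(
    model="o1-preview",
    messages=[{"role": "user", "content": problem}],
)

print(fast.choices[0].message.content)
print(slow.choices[0].message.content)
```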

But OpenAI’s claims about reasoning should be taken with a pinch of salt, says Chirag Shah, a computer science professor at the University of Washington. Large language models are not exhibiting true reasoning. It’s most likely that they have picked up what looks like logic from something they’ve seen in their training data.

“These models sometimes seem to be really amazing at reasoning, but it’s just like they’re really good at pretending, and it only takes a little bit of picking at them to break them,” he says.

There is still much more work to be done, Godement admits. In the short term, AI models such as o1 need to be much more reliable, faster, and cheaper. In the long term, the company needs to apply its chain-of-thought technique to a wider pool of use cases. OpenAI has focused on science, coding, and math. Now it wants to address other fields, such as law, accounting, and economics, he says. 

Second on the to-do list is the ability to connect different tools, Godement says. An AI model’s capabilities will be limited if it has to rely on its training data alone. It needs to be able to surf the web and look for up-to-date information. ChatGPT search is one powerful way OpenAI’s new tools can now do that. 

These tools need to be able not only to retrieve information but to take actions in the real world. Competitor Anthropic announced a new feature where its Claude chatbot can “use” a computer by interacting with its interface to click on things, for example. This is an important feature for agents if they are going to be able to execute tasks like booking flights. Godement says o1 can “sort of” use tools, though not very reliably, and that research on tool use is a “promising development.” 

In the next year, Godement says, he expects the adoption of AI for customer support and other assistant-based tasks to grow. However, he says that it can be hard to predict how people will adopt and use OpenAI’s technology. 

“Frankly, looking back every year, I’m surprised by use cases that popped up that I did not even anticipate,” he says. “I expect there will be quite a few surprises that, you know, none of us could predict.” 


Now read the rest of The Algorithm

Deeper Learning

This AI-generated version of Minecraft may represent the future of real-time video generation

When you walk around in a version of the video game Minecraft from the AI companies Decart and Etched, it feels a little off. Sure, you can move forward, cut down a tree, and lay down a dirt block, just like in the real thing. If you turn around, though, the dirt block you just placed may have morphed into a totally new environment. That doesn’t happen in Minecraft. But this new version is entirely AI-generated, so it’s prone to hallucinations. Not a single line of code was written.

Ready, set, go: This version of Minecraft is generated in real time, using a technique known as next-frame prediction. The AI companies behind it did this by training their model, Oasis, on millions of hours of Minecraft game play and recordings of the corresponding actions a user would take in the game. The AI is able to sort out the physics, environments, and controls of Minecraft from this data alone. Read more from Scott J. Mulligan.
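For a sense of what next-frame prediction means mechanically, here is a deliberately tiny sketch, with synthetic frames and a linear model standing in for the real generative network (it is not the Oasis architecture): the model learns to map a (frame, action) pair to the next frame, then rolls forward on its own outputs, which is exactly where small errors start compounding into the hallucinations described above.

```python
# Toy action-conditioned next-frame prediction with synthetic "gameplay" data.
# A linear model stands in for the real generative network; the point is the
# pipeline: collect (frame, action, next frame) triples, fit a predictor,
# then roll it forward autoregressively and watch errors compound.
import numpy as np

rng = np.random.default_rng(0)
H, W, A = 8, 8, 4                      # tiny 8x8 "frames" and 4 discrete actions

def true_dynamics(frame, action):
    """Stand-in for game physics: each action shifts the frame a different way."""
    shifts = [(0, 1), (0, -1), (1, 0), (-1, 0)]
    return np.roll(frame, shifts[action], axis=(0, 1))

# Collect training triples, the analogue of recorded gameplay plus player inputs.
X, Y = [], []
for _ in range(200):                   # 200 short "recordings"
    frame = rng.random((H, W))
    for _ in range(25):
        action = rng.integers(A)
        nxt = true_dynamics(frame, action)
        X.append(np.concatenate([frame.ravel(), np.eye(A)[action]]))
        Y.append(nxt.ravel())
        frame = nxt
X, Y = np.array(X), np.array(Y)

# Fit the next-frame predictor (least squares here; deep networks in reality).
weights, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Autoregressive rollout: each prediction becomes the next input frame.
frame = rng.random((H, W))
for t in range(5):
    action = rng.integers(A)
    inp = np.concatenate([frame.ravel(), np.eye(A)[action]])
    predicted = (inp @ weights).reshape(H, W)
    drift = np.abs(predicted - true_dynamics(frame, action)).mean()
    print(f"step {t}: mean pixel error vs. true dynamics = {drift:.3f}")
    frame = predicted                  # errors accumulate, like hallucinations
```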

Bits and Bytes

AI search could break the web
At its best, AI search can better infer a user’s intent, amplify quality content, and synthesize information from diverse sources. But if AI search becomes our primary portal to the web, it threatens to disrupt an already precarious digital economy, argues Benjamin Brooks, a fellow at the Berkman Klein Center at Harvard University, who used to lead public policy for Stability AI. (MIT Technology Review)

AI will add to the e-waste problem. Here’s what we can do about it.
Equipment used to train and run generative AI models could produce up to 5 million tons of e-waste by 2030, a relatively small but significant fraction of the global total. (MIT Technology Review)

How an “interview” with a dead luminary exposed the pitfalls of AI
A state-funded radio station in Poland fired its on-air talent and brought in AI-generated presenters. But the experiment caused an outcry and was stopped when one of them “interviewed” a dead Nobel laureate. (The New York Times)

Meta says yes, please, to more AI-generated slop
In Meta’s latest earnings call, CEO Mark Zuckerberg said we’re likely to see “a whole new category of content, which is AI generated or AI summarized content or kind of existing content pulled together by AI in some way.” Zuckerberg added that he thinks “that’s going to be just very exciting.” (404 Media)

How exosomes could become more than just an “anti-aging” fad

This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.

Over the past month or so, I’ve been working on a story about exosomes. You might have seen them advertised—they’re being touted as a hot new beauty treatment, a fountain of youth, and generally a cure-all therapy for a whole host of ailments.

Any cell biologist, though, will tell you what exosomes really are: tiny little blobs that bud off from cells and contain a mixture of proteins and other components. We’re not entirely clear what those components are or what they do, despite the promises made by medspas and cosmetic clinics charging thousands of dollars for exosome “therapies.” As one recipient of an exosome treatment told me, “I feel like it’s a little bit of health marketing bullshit.”

But there is some very exciting scientific research underway to better understand exactly what exosomes do. Scientists are exploring not only how these tiny particles might help cells communicate, but also how they might be used to diagnose or treat diseases. One company is trying to use exosomes to deliver drugs to the brains of people with rare neurological disorders.

It might take longer for these kinds of exosome applications to get to the clinic, but when they do, at least they’ll be evidence based.

Exosomes are a type of extracellular vesicle. This is a scientific way of saying they are basically little packages that bud off from cells. They were once thought to contain cellular garbage, but now scientists believe they convey important signals between cells and tissues.

Exactly what those signals are is still being figured out. The contents of exosomes from cancer cells will probably be somewhat different from those from healthy cells, for example.

Because of that, many scientists hope that exosomes could one day be used to help us diagnose diseases. In theory, you could isolate exosomes from a blood sample, examine their contents, and figure out what might be going on in a person’s cells. Exosomes might provide clues as to how stressed or close to death a cell is. They might indicate the presence of a tumor.

Raghu Kalluri, a cancer biologist at MD Anderson Cancer Center in Houston, is one of the researchers exploring this possibility. “I believe that exosomes are likely providing a forensic fingerprint of what the cells are undergoing,” he says.

But understanding these signals won’t be straightforward. Exosomes from cancer cells might send signals to surrounding cells in order to “subjugate” them into helping the cancer grow, says Kalluri. Cells around a tumor might also send distress signals, alerting the immune system to fight back against it. “There’s definitely a role for these exosomes in cancer progression and metastasis,” he says. “Precisely what [that role is] is an active area of research right now.”

Exosomes could also be useful for delivering drug treatments. After all, they are essentially little packages of proteins and other matter that can be shuttled between cells. Why not fill them with a medicine and use them to target specific regions of the body?

Because exosomes are made in our bodies, they are less likely to be seen as “foreign” and rejected by our immune systems. And the outer layer of an exosome can serve as a protective coat, shielding the drug from being degraded until it reaches its destination, says James Edgar, who studies exosomes at the University of Cambridge. “It’s a really attractive method for drug delivery,” he says.

Dave Carter is one scientist working on it. Carter and his colleagues at Evox Therapeutics in Oxford, UK, are engineering cells to produce compounds that might help treat rare neurological diseases. These compounds could then be released from the cells in exosomes.

In their research, Carter and his colleagues can change almost everything about the exosomes they study. They can alter their contents, loading them with proteins or viruses or even gene-editing therapies. They can tweak the proteins on their surfaces to make them target different cells and tissues. They can control how long exosomes stay in an animal’s circulation.

“I always used to love playing with Lego,” he adds. “I feel like I’m playing with Lego when I’m working with exosomes.”

Others are hopeful that exosomes themselves hold some kind of therapeutic value. Some hope that exosomes derived from stem cells, for example, might have some regenerative capacity.

Ke Cheng at Columbia University in New York is interested in the idea of using exosomes to treat heart and lung conditions. Several preliminary studies suggest that exosomes from heart and stem cells might help animals like mice and pigs recover from heart injuries, such as those caused by a heart attack.

There are certainly plenty of clinical trials of exosomes underway. When I searched for “exosomes” on clinicaltrials.gov, I got over 400 results. These are early-stage trials, however—and are of variable quality.

Still, it’s an exciting time for exosome research. “It’s a growing field … I think we will see a lot of exciting science in the next five years,” says Cheng. “I’m very optimistic.”


Now read the rest of The Checkup

Read more from MIT Technology Review’s archive

You can read about the costly exosome treatments being sold in aesthetic clinics and medspas in my longer piece, which was published earlier this week. 

It can be difficult to establish credibility in a medical field when you’re being undercut by clinics selling unapproved treatments and individuals making outlandish claims. Just ask the doctors and scientists trying to legitimize longevity medicine.

Some treatments can take off culturally without the backing of rigorous evidence, only to go up in flames when the trial results come in. We saw this earlier this year, when FDA advisors rejected the use of MDMA (or ecstasy) for post-traumatic stress disorder (PTSD) owing to “significant confounders” in the trials. 

For some people, unproven treatments might represent a last hope for survival. In those cases, how do we balance access to experimental medicine with the need to protect people who are vulnerable?

Stem cells from human embryos promised to “launch a medical revolution in which ailing organs and tissues might be repaired” when they were isolated just over 25 years ago. So why haven’t they?  

From around the web

Having a disability shouldn’t prevent you from getting married. But that’s exactly the conundrum facing some people in the US, as this heartbreaking short documentary shows. (STAT)

A Neuralink rival says its eye implant restored vision in blind people. Science Corporation’s retinal implant enabled some legally blind individuals to read from a book, play cards, and fill out crossword puzzles. (Wired)

Women in Texas are dying after doctors delay treating them for miscarriages. Doctors treating Josseli Barnica waited 40 hours for the heart of her fetus to stop beating, despite the fact that miscarriage was “inevitable.” Her husband says doctors worried that “it would be a crime to give her an abortion.” She died of a preventable infection three days later. (ProPublica)

Between 30% and 50% of twins share a secret language or mode of communication, a phenomenon known as cryptophasia. The Youlden twins call theirs Umeri. (BBC Future)

Can a machine express fear? Try your hand at creating AI-generated images frightening enough to “spook the machine” as part of a project to explore how machines might express humanlike emotions. It is Halloween, after all. (Spook the Machine)

The surprising barrier that keeps us from building the housing we need

Ahead of abortion access, ahead of immigration, and way ahead of climate change, US voters under 30 are most concerned about one issue: housing affordability. And it’s not just young voters who are identifying soaring rents and eye-watering home sale prices as among their top worries. For the first time in recent memory, the cost of housing could be a major factor in the presidential election.  

It’s not hard to see why. From the beginning of the pandemic to early 2024, US home prices rose by 47%. In large swaths of the country, buying a home is no longer a possibility even for those with middle-class incomes. For many, that marks the end of an American dream built around owning a house. Over the same time, rents have gone up 26%.

Vice President Kamala Harris has offered an ambitious plan to build more: “Right now, a serious housing shortage is part of what is driving up cost,” she said last month in Las Vegas. “So we will cut the red tape and work with the private sector to build 3 million new homes.” Included in her proposals is a $40 billion innovation fund to support housing construction.

Former president Donald Trump, meanwhile, has also called for cutting regulations but mostly emphasizes a far different way to tackle the housing crunch: mass deportation of the immigrants he says are flooding the country, and whose need for housing he claims is responsible for the huge jump in prices. (While a few studies show some local impact on the cost of housing from immigration in general, the effect is relatively small, and there is no plausible economic scenario in which the number of immigrants over the last few years accounts for the magnitude of the increase in home prices and rents across much of the country.)

The opposing views offered by Trump and Harris have implications not only for how we try to lower home prices but for how we view the importance of building. Moreover, the attention to the housing crisis reveals a broader issue with the construction industry at large: the sector has been tech-averse for decades, and it has become less productive over the past 50 years.

The reason for the current rise in the cost of housing is clear to most economists: a lack of supply. Simply put, we don’t build enough houses and apartments, and we haven’t for years. Depending on how you count it, the US has a shortage of around 1.2 million to more than 5.5 million single-family houses.

Permitting delays and strict zoning rules create huge obstacles to building more and faster—as do other widely recognized issues, like the political power of NIMBY activists across the country and an ongoing shortage of skilled workers. But there is also another, less talked-about problem that’s plaguing the industry: We’re not very efficient at building, and we seem somehow to be getting worse.

Together these forces have made it more expensive to build houses, leading to increases in prices. Albert Saiz, a professor of urban economics and real estate at MIT, calculates that construction costs account for more than two-thirds of the price of a new house in much of the country, including the Southwest and West, where much of the building is happening. Even in places like California and New England, where land is extremely expensive, construction accounts for 40% to 60% of the value of a new home, according to Saiz.

Part of the problem, Saiz says, is that “if you go to any construction site, you’ll see the same methods used 30 years ago.”

The productivity woes are evident across the construction industry, not just in the housing sector. From clean-energy advocates dreaming of renewables and an expanded power grid to tech companies racing to add data centers, everyone seems to agree: We need to build more and do it quickly. The practical reality, though, is that it costs more, and takes more time, to construct anything.

For decades, companies across the industry have largely ignored ways they could improve the efficiency of their operations. They have shunned data science and the kinds of automation that have transformed other sectors of the economy. According to an estimate by the McKinsey Global Institute, construction, one of the largest parts of the global economy, is the least digitized major sector worldwide—and it isn’t even close.

The reality is that even if we ease the endless permitting delays and begin cutting red tape, we will still be faced with a distressing fact: The construction industry is not very efficient when it comes to building stuff.

The awful truth

Productivity is our best measure of long-term progress in an industry, at least according to economists. Technically, it’s a measure of how much a worker can produce; as companies adopt more efficient practices and new technologies, productivity grows and businesses can make stuff (in this case, homes and buildings) faster and more cheaply. Yet something shocking has happened in the construction industry: Productivity seems to have stalled and even gone into reverse over the last few decades.

In a recent paper called “The Strange and Awful Path of Productivity in the US Construction Sector,” two leading economists at the University of Chicago showed that productivity growth in US construction came to a halt beginning around 1970. Productivity is notoriously difficult to quantify, but the Chicago researchers calculated it in one of the key parts of the construction business: housing. They found that the number of houses or total square footage (houses are getting bigger) built per employee each year was flat or even falling over the last 50 years. And the researchers believe the lack of productivity growth holds true for all different types of construction.

Chad Syverson, one of the authors, admits he is still trying to pinpoint the reason—“It’s probably a few things.” While he says it’s difficult to quantify the specific impact of various factors on productivity, including the effects of regulatory red tape and political fights that often delay construction, “part of the industry’s problem is its own operational inefficiency,” he says. “There’s no doubt about it.” In other words, the industry just isn’t very innovative.

The lack of productivity in construction over the last half-century, at a time when all other sectors grew dramatically, is “really amazing,” he says—and not in a good way.

US manufacturing, in contrast, continued growing at around 2% to 3% annually over the same period. Auto workers, as a result, now produce far more cars than they once did, leading to cheaper vehicles if you adjust for inflation (and, by most measures, safer and better ones).

Productivity in construction is not just a US problem, according to the McKinsey Global Institute, which has tracked the issue for nearly a decade. Not all countries are faring as badly as the US, but worldwide construction productivity has been flat over the last few decades, says Jan Mischke, who heads the McKinsey work.

Beyond adding to the costs and threatening the financial viability of many planned projects, Mischke says, the lack of productivity is “reflected in all the mess, time and cost overruns, concerns about quality, rework, and all the things that everyone who has ever built anything will have seen.” 

The nature of construction work can make it difficult to improve longstanding processes and introduce new technologies, he says: “Most other sectors become better over time by doing the same thing twice or three times or 3 million times. They learn and improve. All that is essentially missing in construction, where every single project starts from scratch and reinvents the wheel.”

Mischke also sees another reason for the industry’s lack of productivity: the “misaligned incentives” of the various players, who often make more money the longer a project takes.

Though the challenges are endemic to the business, Mischke adds that builders can take steps to overcome them by moving to digital technologies, implementing more standardized processes, and improving the efficiency of their business practices.

It’s an urgent problem to solve as many countries race to build housing, expand clean-energy capabilities, and update infrastructure like roads and airports. In their latest report, the McKinsey researchers warn of the dangers if productivity doesn’t improve: “The net-zero transition may be delayed, growth ambitions may be deferred, and countries may struggle to meet the infrastructure and housing needs for their populations.”

But the report also says there’s a flip side to the lack of progress in much of the industry: Individual companies that begin to improve their efficiency could gain a huge competitive advantage.

Building on the data

When Jit Kee Chin joined Suffolk Construction as its chief data officer in 2017, the title was unique in the industry. But Chin, armed with a PhD in experimental physics from MIT and a 10-year stint at McKinsey, brought to the large Boston-based firm the kind of technical and management expertise often missing from construction companies. And she recognized that large construction projects—including the high-rise apartment buildings and sprawling data centers that Suffolk often builds—generate vast amounts of useful data.

At the time, much of the data was siloed; information on the progress of a project was in one place, scheduling in another, and safety data and reports in yet another. “The systems didn’t talk to each other, and it was very difficult to cross-correlate,” says Chin. Getting all the data together so it could be understood and utilized across the business was an early task.

“Almost all construction companies are talking about how to better use their data now,” says Chin, who is currently Suffolk’s CTO, and since her hiring, “a couple others have even appointed chief data officers.” But despite such encouraging signs, she sees the effort to improve productivity in the industry as still very much a work in progress.  

One ongoing and obvious target: the numerous documents that are constantly being revised as they move along from architect to engineers to subcontractors. It’s the lifeblood of any construction project, and Chin says the process “is by no means seamless.” Architects and subcontractors sometimes use different software; meanwhile, the legally binding documents spelling out details of a project are still circulated as printouts. A more frictionless flow of information among the multitude of players is critical to better coordinate the complex building process.

Ultimately, though, building is a physical activity. And while automation has largely been absent from building trades, robots are finally cheap enough to be attractive to builders, especially companies facing a shortage of workers. “The cost of off-the-shelf robotic components has come down to a point where it is feasible to think of simple robots automating a very repetitive task,” says Chin. And advances in robotic image recognition, lidar, AI, and dexterity, she says, mean robots are starting to be able to safely navigate construction sites.

One step in construction where digital designs meet the physical world is the process of laying out blueprints for walls and other structures on the floor of a building. It’s an exacting, time-consuming manual practice, prone to errors.

The Dusty Robotics field printer marks the layout for walls and other structures.
DUSTY ROBOTICS

And startups like Dusty Robotics are betting it’s an almost perfect application for a Roomba-like robot. Tessa Lau, its CEO, recalls that when she researched the industry before founding the company in 2018, she was struck by seeing “people on their hands and knees snapping chalk lines.”

Based in Silicon Valley, the company builds a box-shaped machine that scoots about a site on sturdy wheels to mark the layout. Though the company often markets it as a field printer to allay any fears about automation, it’s an AI-powered robot that uses advanced sensors to plan and guide its travels.

Not only does the robot automate a critical job, but because that task is so central in the construction process, it also helps open a digital window into the overall workflow of a project.

A history lesson

Whatever the outcome of the upcoming election, don’t hold your breath waiting for home prices to fall; even if we do build more (or somehow decrease demand), it will probably take years for the supply to catch up. But the political spotlight on housing affordability could be a rare opportunity to focus on the broad problem of construction productivity.  

While some critics have argued that Harris’s plan is too vague and lacks the ambition required to solve the housing crisis, her message that we need to build more and faster is the right one. “It takes too long and it costs too much to build. Whether it’s a new housing development, a new factory, or a new bridge, projects take too long to go from concept to reality,” Harris said in a speech in late September. Then she asked: “You know how long it took to build [the Empire State Building]?”

Harris stresses cutting red tape to unleash a building boom. That’s critical, but it’s only part of the long-term answer. The construction of the famous New York City skyscraper took just over a year in 1931—a feat that provides valuable clues to how the industry itself can finally increase its productivity.

The explanation for why it was built so quickly has less to do with new technologies—in fact, the engineers mostly opted for processes and materials that were familiar and well-tested at the time—and more to do with how the project leaders managed every aspect of the design and construction process for speed and efficiency. The activity of the thousands of workers was carefully scheduled and tracked, and the workflow was highly choreographed to minimize delays. Even the look of the 1,250-foot building was largely a result of choosing the fastest and simplest way to build.

To a construction executive like Suffolk’s Chin, who estimates it would take at least four years to construct such a building today, the lessons of the Empire State Building resonate, especially the operational discipline and the urgency to finish the structure as quickly as possible. “It’s a stark difference when you think about how much time it took and how much time it would take to build that building now,” she says.

If we want an affordable future, the construction business needs to recapture that sense of urgency and efficiency. To do so, the industry will need to change the way it operates and alter its incentive structures; it will need to incorporate the right mix of automation and find financial models that will transform outdated business practices. The good news is that advances in data science, automation, and AI are offering companies new opportunities to do just that.

The hope, then, is that capitalism will do capitalism. Innovative firms will (hopefully) build more cheaply and faster, boost their profits, and become more competitive. Such companies will prosper, and others will begin to mimic the early adopters, investing in the new technologies and business models. In other words, the reality of seeing some builders profit by using data and automation will finally help drag the construction industry into the modern digital age.

Inside a fusion energy facility

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

On an overcast day in early October, I picked up a rental car and drove to Devens, Massachusetts, to visit a hole in the ground.

Commonwealth Fusion Systems has raised over $2 billion in funding since it spun out of MIT in 2018, all in service of building the first commercial fusion reactor. The company has ambitions to build power plants, but currently the goal is to finish putting together its first demonstration system, the SPARC reactor. The plan is to have it operating by 2026.

I visited the company’s site recently to check in on progress. Things are starting to come together around the hole in the floor where SPARC will eventually be installed. Looking around the site, I found it becoming easier to imagine a future that could actually include fusion energy. But there’s still a lot of work left to do. 

Fusion power has been a dream for decades. The idea is simple: Slam atoms together and use the energy that’s released to power the world. The systems would require small amounts of abundant fuel and wouldn’t produce dangerous waste. The problem is, executing this vision has been much slower than many had hoped.

Commonwealth is one of the leaders in commercial fusion. My colleague James Temple wrote a feature story, published in early 2022, about the company’s attempts to bring the technology to reality. At the time, the Devens location was still a muddy construction site, with the steel and concrete just starting to go into the ground.

Things are much more polished now—when I visited earlier this month, I pulled into one of the designated visitor parking spots and checked in at a reception desk in a bustling office building before beginning my tour. There were two main things to see: the working magnet factory and the cluster of buildings that will house and support the SPARC reactor.

We started in the magnet factory. SPARC is a tokamak, a device relying on powerful magnets to contain the plasma where fusion reactions take place. There will be three different types of magnets in SPARC, all arranged to keep the plasma in position and moving around in the right way.

The company is making its own magnets from tape made of a high-temperature superconductor, which generates a magnetic field when an electric current runs through it. SPARC will contain thousands of miles’ worth of this tape in its magnets. In the factory, specialized equipment winds up the tape and tucks it into metal cases, which are then stacked together and welded into protective shells.

After our quick loop around the magnet factory, I donned a helmet, neon vest, and safety glasses and got a short safety talk that included a stern warning to not stare directly at any welding. Then we walked across a patio and down a gravel driveway to the main complex of buildings that will house the SPARC reactor.

Except for some remaining plywood stairs and dust, the complex appeared to be nearly completed. There’s a huge wall of glass on the front of the building—a feature intended to show that the company is open with the community about the goings-on inside, as my tour guide, chief marketing officer Joe Paluska, put it.  

Four main buildings surround the central tokamak hall. These house support equipment needed to cool down the magnets, heat up the plasma, and measure conditions in the reactor. Most of these big, industrial systems that support SPARC are close to being ready to turn on or are actively being installed, explained Alex Creely, director of tokamak operations, in a call after my tour.

When it was finally time to see the tokamak hall that will house SPARC, we had to take a winding route to get there. A maze of concrete walls funneled us to the entrance, and I lost track of my left and right turns. Called the labyrinth, this is a safety feature, designed to keep stray neutrons from escaping the hall once the reactor is operating. (Neutrons are a form of radiation, and enough exposure can be dangerous to humans.) 

Finally, we stepped into a cavernous space. From our elevated vantage point on a metal walkway, we peered down into a room with gleaming white floors and equipment scattered around the perimeter. At the center was a hole, covered with a tarp and surrounded by bright-yellow railings. That empty slot is where the star of the show, SPARC, will eventually be installed.

The tokamak hall at Commonwealth Fusion Systems will house the company’s SPARC reactor.
COMMONWEALTH FUSION SYSTEMS

While there’s still very little tokamak in the tokamak hall right now, Commonwealth has an ambitious timeline planned: The goal is to have SPARC running and the first plasma in the reactor by 2026. The company plans to demonstrate that it can produce more energy in the reactor than is needed to power it (a milestone known as Q>1 in the fusion world) by 2027.
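For readers who want that shorthand unpacked: Q is the standard fusion energy gain factor, the ratio of the fusion power produced in the plasma to the heating power pumped into it. Written out with the conventional physics definition (which does not count the electricity needed to run the rest of the plant), the milestone is simply:

$$ Q = \frac{P_{\text{fusion}}}{P_{\text{heating}}} > 1 $$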

When we published our 2022 story on Commonwealth, the plan was to flip on the reactor and reach the Q>1 milestone by 2025, so the timeline has slipped. It’s not uncommon for big projects in virtually every industry to take longer than expected. But there’s an especially long and fraught history of promises and missed milestones in fusion. 

Commonwealth has certainly made progress over the past few years, and it’s getting easier to imagine the company actually turning on a reactor and meeting the milestones the field has been working toward for decades. But there’s still a tokamak-shaped hole in suburban Massachusetts waiting to be filled. 


Now read the rest of The Spark

Related reading

Read our 2022 feature on Commonwealth Fusion Systems and its path to commercializing fusion energy here

In late 2022, a reactor at a national lab in the US generated more energy than was put in, a first for the industry. Here’s what meeting that milestone actually means for clean energy

There’s still a lot of research to be done in fusion—here’s what’s coming next

Another company called Helion says its first fusion power plant is five years away. Experts are skeptical, to say the least.

AI e-waste

PHOTO ILLUSTRATION BY SARAH ROGERS/MITTR | PHOTOS GETTY

Another thing

Generative AI will add to our growing e-waste problem. A new study estimates that AI could add up to 5 million tons of e-waste by 2030. 

It’s a small fraction of the total, but there’s still good reason to think carefully about how we handle discarded servers and high-performance computing equipment, according to experts. Read more in my latest story.

Keeping up with climate  

New York City will buy 10,000 induction stoves from a startup called Copper. The stoves will be installed in public housing in the city. (Heatmap)

Demand is growing for electric cabs in India, but experts say there’s not nearly enough supply to meet it. (Rest of World)

Pivot Bio aims to tweak the DNA of bacteria so they can help deliver nutrients to plants. The company is trying to break into an industry dominated by massive agriculture and chemical companies. (New York Times)

→ Check out our profile of Pivot Bio, which was one of our 15 Climate Tech Companies to Watch this year. (MIT Technology Review)

At least 62 people are dead and many more are missing in dangerous flooding across Spain. (Washington Post)

A massive offshore wind lease sale this week offered up eight patches of ocean off the coast of Maine in the US. Four sold, opening the door for up to 6.8 gigawatts of additional offshore wind power. (Canary Media)

Climate change contributed to the deaths of 38,000 people across Europe in the summer of 2022, according to a new study. (The Guardian)

→ The legacy of Europe’s heat waves will be more air-conditioning, and that could be its own problem. (MIT Technology Review)

There are nearly 9,000 public fast-charging sites in the US, and a surprising wave of installations in the Midwest and Southeast. (Bloomberg)

Some proposed legislation aims to ban factory farming, but determining what that category includes is way more complicated than you might think. (Ambrook Research)

OpenAI brings a new web search tool to ChatGPT

ChatGPT can now search the web for up-to-date answers to a user’s queries, OpenAI announced today. 

Until now, ChatGPT was mostly restricted to generating answers from its training data, which is current up to October 2023 for GPT-4o, and had only limited web search capabilities. Searches about general topics will still draw on this information from the model itself, but ChatGPT will now automatically search the web in response to queries about recent information such as sports, stocks, or news of the day, and it can deliver rich multimedia results. Users can also manually trigger a web search, but for the most part, the chatbot will make its own decision about when an answer would benefit from information taken from the web, says Adam Fry, OpenAI’s product lead for search.
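To make that routing behavior concrete, here is a deliberately simplified, hypothetical sketch in Python of the pattern described above: the assistant decides per query whether to answer from the model alone or to fetch fresh web results first. The function names and keyword heuristic are illustrative inventions, not OpenAI’s implementation or API.

```python
# Hypothetical sketch of "decide when to search" routing. All names and the
# keyword heuristic below are stand-ins; a real assistant lets the model
# itself judge whether an answer needs up-to-date information.
from dataclasses import dataclass

@dataclass
class Answer:
    text: str
    sources: list[str]

RECENCY_HINTS = ("today", "latest", "score", "stock", "news", "weather")

def needs_web_search(query: str) -> bool:
    # Toy stand-in for the model's own judgment about freshness.
    return any(hint in query.lower() for hint in RECENCY_HINTS)

def answer_from_model(query: str) -> Answer:
    # Answer purely from the model's built-in (training-data) knowledge.
    return Answer(text=f"[model-only answer to: {query}]", sources=[])

def answer_with_web_search(query: str) -> Answer:
    # In a real system this would query a search index, then have the model
    # synthesize a response with links back to the pages it drew on.
    hits = [f"https://example.com/result-for-{query.replace(' ', '-')}"]
    return Answer(text=f"[web-grounded answer to: {query}]", sources=hits)

def respond(query: str, force_search: bool = False) -> Answer:
    # Users can force a search; otherwise the assistant decides on its own.
    if force_search or needs_web_search(query):
        return answer_with_web_search(query)
    return answer_from_model(query)

print(respond("What's the latest news on offshore wind?"))
print(respond("Explain how a tokamak confines plasma."))
```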

“Our goal is to make ChatGPT the smartest assistant, and now we’re really enhancing its capabilities in terms of what it has access to from the web,” Fry tells MIT Technology Review. The feature is available today for the chatbot’s paying users. 

ChatGPT triggers a web search when the user asks about local restaurants in this example

While ChatGPT search, as it is known, is initially available to paying customers, OpenAI intends to make it available for free later, even when people are logged out. The company also plans to combine search with its voice features and Canvas, its interactive platform for coding and writing, although these capabilities will not be available in today’s initial launch.

The company unveiled a standalone prototype of web search in July. Those capabilities are now built directly into the chatbot. OpenAI says it has “brought the best of the SearchGPT experience into ChatGPT.” 

OpenAI is the latest tech company to debut an AI-powered search assistant, challenging similar tools from competitors such as Google, Microsoft, and the startup Perplexity. Meta, too, is reportedly developing its own AI search engine. As with Perplexity’s interface, users of ChatGPT search can interact with the chatbot in natural language, and it will offer an AI-generated answer with sources and links to further reading. In contrast, Google’s AI Overviews offer a short AI-generated summary at the top of the search results page, as well as a traditional list of indexed links.

These new tools could eventually challenge Google’s 90% market share in online search. AI search is a very important way to draw more users, says Chirag Shah, a professor at the University of Washington, who specializes in online search. But he says it is unlikely to chip away at Google’s search dominance. Microsoft’s high-profile attempt with Bing barely made a dent in the market, Shah says. 

Instead, OpenAI is trying to create a new market for more powerful and interactive AI agents, which can take complex actions in the real world, Shah says. 

The new search function in ChatGPT is a step toward these agents. 

It can also deliver highly contextualized responses that take advantage of chat histories, allowing users to go deeper in a search. Currently, ChatGPT search is able to recall conversation histories and continue the conversation with questions on the same topic. 

ChatGPT itself can also remember things about users to draw on later—sometimes it does this automatically, and you can also ask it to remember something. Those “long-term” memories affect how it responds to chats. Search doesn’t have this yet—a new web search starts from scratch—but it should get this capability in the “next couple of quarters,” says Fry. When it does, OpenAI says, memory will allow the tool to deliver far more personalized search results based on what it knows.

“Those might be persistent memories, like ‘I’m a vegetarian,’ or it might be contextual, like ‘I’m going to New York in the next few days,’” says Fry. “If you say ‘I’m going to New York in four days,’ it can remember that fact and the nuance of that point,” he adds. 

To help develop ChatGPT’s web search, OpenAI says it leveraged its partnerships with news organizations such as Reuters, the Atlantic, Le Monde, the Financial Times, Axel Springer, Condé Nast, and Time. However, its results include information not only from these publishers, but any other source online that does not actively block its search crawler.   

It’s a positive development that ChatGPT will now be able to retrieve information from these reputable online sources and generate answers based on them, says Suzan Verberne, a professor of natural-language processing at Leiden University, who has studied information retrieval. It also allows users to ask follow-up questions.

But despite the enhanced ability to search the web and cross-check sources, the tool is not immune to the persistent tendency of AI language models to make things up or get things wrong. When MIT Technology Review tested the new search function and asked it for vacation destination ideas, ChatGPT suggested “luxury European destinations” such as Japan, Dubai, the Caribbean islands, Bali, the Seychelles, and Thailand. It offered as a source an article from the Times, a British newspaper, which listed these locations, along with destinations actually in Europe, as luxury holiday options.

“Especially when you ask about untrue facts or events that never happened, the engine might still try to formulate a plausible response that is not necessarily correct,” says Verberne. There is also a risk that misinformation might seep into ChatGPT’s answers from the internet if the company has not filtered its sources well enough, she adds. 

Another risk is that the current push to access the web through AI search will disrupt the internet’s digital economy, argues Benjamin Brooks, a fellow at Harvard University’s Berkman Klein Center, who previously led public policy for Stability AI, in an op-ed published by MIT Technology Review today.

“By shielding the web behind an all-knowing chatbot, AI search could deprive creators of the visits and ‘eyeballs’ they need to survive,” Brooks writes.

This AI-generated version of Minecraft may represent the future of real-time video generation

When you walk around in a version of the video game Minecraft from the AI companies Decart and Etched, it feels a little off. Sure, you can move forward, cut down a tree, and lay down a dirt block, just like in the real thing. If you turn around, though, the dirt block you just placed may have morphed into a totally new environment. That doesn’t happen in Minecraft. But this new version is entirely AI-generated, so it’s prone to hallucinations. Not a single line of code was written.

For Decart and Etched, this demo is a proof of concept. They imagine that the technology could be used for real-time generation of videos or video games more generally. “Your screen can turn into a portal—into some imaginary world that doesn’t need to be coded, that can be changed on the fly. And that’s really what we’re trying to target here,” says Dean Leitersdorf, cofounder and CEO of Decart, which came out of stealth this week.

Their version of Minecraft is generated in real time using a technique known as next-frame prediction. They did this by training their model, Oasis, on millions of hours of Minecraft gameplay paired with recordings of the corresponding actions a user would take in the game. The AI is able to sort out the physics, environments, and controls of Minecraft from this data alone.
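To make the idea of next-frame prediction concrete, here is a minimal, hypothetical training sketch in PyTorch: a small network takes a stack of recent frames plus the player’s recorded action and is trained to reproduce the frame that actually came next. The architecture, shapes, and data below are illustrative stand-ins, not Decart’s Oasis model or training pipeline.

```python
# Toy next-frame prediction: predict frame t+1 from recent frames and the action.
import torch
import torch.nn as nn

class NextFramePredictor(nn.Module):
    def __init__(self, frame_channels=3, context_frames=4, num_actions=16, hidden=64):
        super().__init__()
        # Encode the stack of recent frames with a small CNN.
        self.encoder = nn.Sequential(
            nn.Conv2d(frame_channels * context_frames, hidden, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(hidden, hidden, 3, padding=1),
            nn.ReLU(),
        )
        # Embed the discrete action (e.g. "move forward", "place block").
        self.action_embed = nn.Embedding(num_actions, hidden)
        # Decode back to a predicted next frame.
        self.decoder = nn.Conv2d(hidden, frame_channels, 3, padding=1)

    def forward(self, frames, action):
        # frames: (batch, context*channels, H, W); action: (batch,)
        x = self.encoder(frames)
        # Broadcast the action embedding across spatial positions.
        a = self.action_embed(action)[:, :, None, None]
        return self.decoder(x + a)

model = NextFramePredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Toy batch standing in for (gameplay frames, recorded action, true next frame).
frames = torch.rand(8, 3 * 4, 64, 64)
actions = torch.randint(0, 16, (8,))
next_frame = torch.rand(8, 3, 64, 64)

pred = model(frames, actions)
loss = loss_fn(pred, next_frame)  # learn to reproduce the recorded next frame
loss.backward()
optimizer.step()
```

At playback time this kind of model runs autoregressively: each predicted frame is fed back in as context for the next one, which is also why small errors can compound into the hallucinations described above.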

The companies acknowledge that their version of Minecraft is a little wonky. The resolution is quite low, you can only play for minutes at a time, and it’s prone to hallucinations like the one described above. But they believe that with innovations in chip design and further improvements, there’s no reason they can’t develop a high-fidelity version of Minecraft, or really any game. 

“What if you could say ‘Hey, add a flying unicorn here’? Literally, talk to the model. Or ‘Turn everything here into medieval ages,’ and then, boom, it’s all medieval ages. Or ‘Turn this into Star Wars,’ and it’s all Star Wars,” says Leitersdorf.

A major limitation right now is hardware. They relied on Nvidia cards for their current demo, but in the future, they plan to use Sohu, a new card that Etched has in development, which the firm claims will improve performance by a factor of 10. This gain would significantly cut down on the cost and energy needed to produce real-time interactive video. It would allow Decart and Etched to make a better version of their current demo, allowing the game to run longer, with fewer hallucinations, and at higher resolution. They say the new chip would also make it possible for more players to use the model at once.

“Custom chips for AI hold the potential to unlock significant performance gains and energy efficiency gains,” says Siddharth Garg, a professor of electrical and computer engineering at NYU Tandon, who is not associated with Etched or Decart.

Etched says that its gains come from designing its cards specifically for AI workloads. For example, the chip uses a single core, which the company says makes it possible to handle complicated mathematical operations more efficiently. The chip also focuses on inference (where an AI makes predictions) rather than training (where an AI learns from data).

“We are building something much more specialized than all of the chips out on the market today,” says Robert Wachen, cofounder and COO of Etched. The company plans to run projects on the new card next year. Until the chip is deployed and its capabilities are verified, Etched’s claims remain unsubstantiated. And given the extent of AI specialization already in the top GPUs on the market, Garg is “very skeptical about a 10x improvement just from smarter or more specialized design.”

But the two companies have big ambitions. If the efficiency gains are close to what Etched claims, they believe, they will be able to generate real-time virtual doctors or tutors. “All of that is coming down the pipe, and it comes from having a better architecture and better hardware to power it. So that’s what we’re really trying to get people to realize with the proof of concept here,” says Wachen.

For the time being, you can try out the demo of their version of Minecraft here.

The arrhythmia of our current age

Thumpa-thumpa, thumpa-thumpa, bump, 

thumpa, skip, 

thumpa-thump, pause …

My heart wasn’t supposed to be beating like this. Way too fast, with bumps, pauses, and skips. On my smart watch, my pulse was topping out at 210 beats per minute and jumping every which way as my chest tightened. Was I having a heart attack? 

The day was July 4, 2022, and I was on a 12-mile bike ride on Martha’s Vineyard. I had just pedaled past Inkwell Beach, where swimmers sunbathed under colorful umbrellas, and into a hot, damp headwind blowing off the sea. That’s when I first sensed a tugging in my chest. My legs went wobbly. My head started to spin. I pulled over, checked my watch, and discovered that I was experiencing atrial fibrillation—a fancy name for a type of arrhythmia. The heart beats, but not in the proper time. Atria are the upper chambers of the heart; fibrillation means an attack of “uncoordinated electrical activity.”   

I recount this story less to describe a frightening moment for me personally than to consider the idea of arrhythmia—a critical rhythm of life suddenly going rogue and unpredictable, triggered by … what? That July afternoon was steamy and over 90 °F, but how many times had I biked in heat far worse? I had recently recovered from a not-so-bad bout of covid—my second. Plus, at age 64, I wasn’t a kid anymore, even if I didn’t always act accordingly.  

Whatever the proximal cause, what was really gripping me on July 4, 2022, was the idea of arrhythmia as metaphor. That a pulse once seemingly so steady was now less sure, and how this wobbliness might be extrapolated into a broader sense of life in the 2020s. I know it’s quite a leap from one man’s abnormal ticker to the current state of an entire species and era, but that’s where my mind went as I was taken to the emergency department at Martha’s Vineyard Hospital. 

Maybe you feel it, too—that the world seems to have skipped more than a beat or two as demagogues rant and democracy shudders, hurricanes rage, glaciers dissolve, and sunsets turn a deeper orange as fires spew acrid smoke into the sky, and into our lungs. We can’t stop watching tiny screens where influencers pitch products we don’t need alongside news about senseless wars that destroy, murder, and maim tens of thousands. Poverty remains intractable for billions. So do loneliness and a rising crisis in mental health, even as we fret over whether AI is going to save us or turn us into pets; and on and on.

For most of my life, I’ve leaned into optimism, confident that things will work out in the end. But as a nurse admitted me and attached ECG leads to my chest, I felt a wave of doubt about the future. Lying on a gurney, I watched my pulse jump up and down on a monitor, erratically and still way too fast, as another nurse poked a needle into my hand to deliver an IV bag of saline that would hydrate my blood vessels. Soon after, a young, earnest doctor came in to examine me, and I heard the word uttered for the first time. 

“You are having an arrhythmia,” he said.

Even with my heart beating rat-a-tat-tat, I couldn’t help myself. Intrigued by the word, which I had heard before but had never really heard, I pulled out the phone that is always at my side and looked it up.

ar·rhyth·mi·a
Noun: “a condition in which the heart beats with an irregular or abnormal rhythm.” Greek a-, “without,” and rhuthmos, “rhythm.”

I lay back and closed my eyes and let this Greek origin of the word roll around in my mind as I repeated it several times—rhuthmos, rhuthmos, rhuthmos.

Rhythm, rhythm, rhythm …

I tapped my finger to follow the beat of my heart, but of course I couldn’t, because my heart wasn’t beating in the steady and predictable manner that my finger could easily have followed before July 4, 2022. After all, my heart was built to tap out in a rhythm, a rhuthmos—not an arhuthmos.

Later I discovered that the Greek rhuthmos, ῥυθμός, like the English rhythm, refers not only to heartbeats but to any steady motion, symmetry, or movement. For the ancient Greeks this word was closely tied to music and dance; to the physics of vibration and polarity; to a state of balance and harmony. The concept of rhuthmos was incorporated into classical Greek sculpture using a strict formula of proportions called the Kanon, an example being the Doryphoros (Spear Bearer), originally sculpted by Polykleitos in the fifth century BCE. Standing today in the Acropolis Museum in Athens, this statue appears to be moving with an easy fluidity, a rhuthmos that’s somehow drawn out of the milky-colored stone.

The Greeks also thought of rhuthmos as harmony and balance in emotions, with Greek playwrights penning tragedies where the rhuthmos of life, nature, and the gods goes awry. “In this rhythm, I am caught,” cries Prometheus in Aeschylus’s Prometheus Bound, where rhuthmos becomes a steady, unrelenting punishment inflicted by Zeus when Prometheus introduces fire to humans, providing them with a tool previously reserved for the gods. Each day Prometheus, who is chained to a rock, has his liver eaten out by an eagle, only to have the liver grow back each night, a cycle repeated day after day in a steady beat for an eternity of penance, pain, and vexation.

In modern times, cardiologists have used rhuthmos to refer to the physical beating of the muscle in our chests that mixes oxygen and blood and pumps it through 60,000 miles of veins, arteries, and capillaries to fingertips, toe tips, frontal cortex, kidneys, eyes, everywhere. In 2006, the journal Rhythmos launched as a quarterly medical publication that focuses on cardiac electrophysiology. This subspecialty of cardiology involves the electrical signals animating the heart with pulses that keep it beating steadily—or, for me in the summer of 2022, not. 

The question remained: Why?

As far as I know, I wasn’t being punished by Zeus, although I couldn’t entirely rule out the possibility that I had annoyed some god or goddess and was catching hell for it. Possibly covid was the culprit—that microscopic bundle of RNA with the power of a god to mess with us mortals—but who knows? As science learns more about this pernicious bug, evidence suggests that it can play havoc with the nervous system and tissue that usually make sure the heart stays in rhuthmos.

A-fib also can be instigated by even moderate imbibing of alcohol, by aging, and sometimes by a gene called KCNQ1. Mutations in this gene “appear to increase the flow of potassium ions through the channel formed with the KCNQ1 protein,” according to MedlinePlus, part of the National Library of Medicine. “The enhanced ion transport can disrupt the heart’s normal rhythm, resulting in atrial fibrillation.” Was a miscreant mutation playing a role in my arrhythmia?

Angst and fear can influence A-fib too. I had plenty of both during the pandemic, along with most of humanity. Lest we forget—and we’re trying really, really hard to forget—covid anxiety continued to rage in the summer of 2022, even after vaccines had arrived and most of the world had reopened. 

Back then, the damage done to fragile brains forced to shelter in place for months and months was still fresh. Cable news and social media continued to amplify the terror of seeing so many people dead or facing permanent impairment. Politics also seemed out of control, with demagogues—another Greek word—running amok. Shootings, invasions, hatred, and fury seemed to lurk everywhere. This is one reason I stopped following the news for days at a time—something I had never done, as a journalist and news junkie. I felt that my fragile heart couldn’t bear so much visceral tragedy, so much arhuthmos.

We each have our personal stories from those dark days. For me, covid came early in 2020 and led to a spring and summer with a pervasive brain fog, trouble breathing, and eventually a depression of the sort that I had never experienced before. At the same time, I had friends who ended up in the ICU, and I knew people whose parents and other relatives had passed. My mother was dying of dementia, and my father had been in and out of the ICU a half-dozen times with myasthenia gravis, an autoimmune disease that can be fatal. This family dissolution had started before covid hit, but the pandemic made the implosion of my nuclear family seem worse and undoubtedly contributed to the failure of my heart’s pulse to stay true. 


Likewise, the wider arhuthmos some of us are feeling now began long before the novel coronavirus shut down ordinary life in March 2020. Statistics tell us that anxiety, stress, depression, and general mental unhealthiness have been steadily ticking up for years. This seems to suggest that something bigger has been going on for some time—a collective angst that seems to point to the darker side of modern life itself. 

Don’t get me wrong. Modern life has provided us with spectacular benefits—Manhattan, Boeing 787 Dreamliners, IMAX films, cappuccinos, and switches and dials on our walls that instantly illuminate or heat a room. Unlike our ancestors, most of us no longer need to fret about when we will eat next or whether we’ll find a safe place to sleep, or worry that a saber-toothed tiger will eat us. Nor do we need to experience an A-fib attack without help from an eager and highly trained young doctor, an emergency department, and an IV to pump hydration into our veins. 

But there have been trade-offs. New anxieties and threats have emerged to make us feel uneasy and arrhythmic. These start with an uneven access to things like emergency departments, eager young doctors, shelter, and food—which can add to anxiety not only for those without them but also for anyone who finds this situation unacceptable. Even being on the edge of need can make the heart gambol about.

Consider, too, the basic design features of modern life, which tend toward straight lines—verticals and horizontals. This comes from an instinct we have to tidy up and organize things, and from the fact that verticals and horizontals in architecture are stable and functional. 

All this straightness, however, doesn’t always sit well with brains that evolved to see patterns and shapes in the natural world, which isn’t horizontal and vertical. Our ancestors looked out over vistas of trees and savannas and mountains that were not made from straight lines. Crooked lines, a bending tree, the fuzzy contour of a grassy vista, a horizon that bobs and weaves—these feel right to our primordial brains. We are comforted by the curve of a robin’s breast and the puffs and streaks and billows of clouds high in the sky, the soft earth under our feet when we walk.

Not to overly romanticize nature, which can be violent, unforgiving, and deadly. Devastating storms and those predators with sharp teeth were a major reason why our forebears lived in trees and caves and built stout huts surrounded by walls. Homo sapiens also evolved something crucial to our survival—optimism that they would survive and prevail. This has been a powerful tool—one of the reasons we are able to forge ahead, forget the horrors of pandemics and plagues, build better huts, and learn to make cappuccinos on demand. 

As one of the great optimists of our day, Kevin Kelly, has said: “Over the long term, the future is decided by optimists.” 

But is everything really okay in this future that our ancestors built for us? Is the optimism that’s hardwired into us and so important for survival and the rise of civilization one reason for the general anxiety we’re feeling in a future that has in some crucial ways turned out less ideal than those who constructed it had hoped? 

At the very least, modern life seems to be downplaying elements that are as critical to our feelings of safety as sturdy walls, standing armies, and clean ECGs—and truly more crucial to our feelings of happiness and prosperity than owning two cars or showing off the latest swimwear on Miami Beach. These fundamentals include love and companionship, which statistics tell us are in short supply. Today millions have achieved the once optimistic dream of living like minor pharaohs and kings in suburban tract homes and McMansions, yet inadvertently many find themselves separated from the companionship and community that are basic human cravings. 

Modern science and technology can be dazzling and good and useful. But they’ve also been used to design things that hurt us broadly while spectacularly benefiting just a few of us. We have let the titans of social media hijack our genetic cravings to be with others, our need for someone to love and to love us, so that we will stay glued to our devices, even in the ED when we think we might be having a heart attack. Processed foods are designed to play on our body’s craving for sweets and animal fat, something that evolution bestowed so we would choose food that is nutritious and safe to eat (mmm, tastes good) and not dangerous (ugh, sour milk). But now their easy abundance overwhelms our bodies and makes many of us sick. 

We invented money so that acquiring things and selling what we make in order to live better would be faster and easier. In the process, we also invented a whole new category of anxiety—about money. We worry about having too little of it and sometimes too much; we fear that someone will steal it or trick us into spending it on things we don’t need. Some of us feel guilty about not spending enough of it on feeding the hungry or repairing our climate. Money also distorts elections, which require huge amounts of it. You may have gotten a text message just now, asking for some to support a candidate you don’t even like. 

The irony is that we know how to fix at least some of what makes us on edge. For instance, we know we shouldn’t drive gas-guzzling SUVs and that we should stop looking at endless perfect kitchens, too-perfect influencers, and 20-second rants on TikTok. We can feel helpless even as new ideas and innovations proliferate. This may explain one of the great contradictions of this age of arrhythmia—one demonstrated in a 2023 UNESCO global survey about climate change that questioned 3,000 young people from 80 different countries, aged 16 to 24. Not surprisingly, 57% were “eco-anxious.” But an astonishing 67% were “eco-optimistic,” meaning many were both anxious and hopeful. 

Me too. 

All this anxiety and optimism have been hard on our hearts—literally and metaphorically. Too much worry can cause this fragile muscle to break down, to lose its rhythm. So can too much of modern life. Cardiovascular disease remains the No. 1 killer of adults, in the US and most of the world, with someone in America dying of it every 33 seconds, according to the Centers for Disease Control and Prevention. The incidence of A-fib has tripled in the past 50 years (possibly because we’re diagnosing it more); it afflicted almost 50 million people globally in 2016.


For me, after that initial attack on Martha’s Vineyard, the A-fib episodes kept coming. I charted them on my watch, the blips and pauses in my pulse, the moments when my heart raced at over 200 beats per minute, causing my chest to tighten and my throat to feel raw. Sometimes I tasted blood, or thought I did. I kept bicycling through the summer and fall of 2022, gingerly watching my heart rate to see if I could keep the beats from taking a sudden leap from normal to out of control. 

When an arrhythmic episode happened, I struggled to catch my breath as I pulled over to the roadside to wait for the misfirings to pass. Sometimes my mind grew groggy, and I got confused. It became difficult during these cardio-disharmonious moments to maintain my cool with other people. I became less able to process the small setbacks that we all face every day—things I previously had been able to let roll off my back.

Early in 2023 I had my heart checked by a cardiologist. He conducted an echocardiogram and had me jog on a treadmill hooked up to monitors. “There has been no damage to your heart,” he declared after getting the results, pointing to a black-and-white video of my heart muscle contracting and constricting, drawing in blood and pumping it back out again. I felt relieved, although he also said that the A-fib was likely to persist, so he prescribed a blood thinner called Eliquis as a precaution to prevent stroke. Apparently, during unnatural pauses in one’s heartbeat, blood can clot and send tiny, scab-like fragments into the brain, potentially clogging up critical capillaries and other blood vessels. “You don’t want that to happen,” said the cardiologist.

Toward the end of my heart exam, the doctor mentioned a possible fix for my arrhythmia. I was skeptical, although what he proposed turned out to be one of the great pluses of being alive right now—a solution that was unavailable to my ancestors or even to my grandparents. “It’s called a heart ablation,” he said. The procedure, a simple operation, redirects errant electric signals in the heart muscle to restore a normal pattern of beating. Doctors will run a tube into your heart, find the abnormal tissue throwing off the rhythm, and zap it with either extreme heat, cold, or (the newest option) electrical pulses. There are an estimated 240,000 such procedures a year in the United States. 

“Can you really do that?” I asked.

“We can,” said the doctor. “It doesn’t always work the first time. Sometimes you need a second or third procedure, but the success rate is high.”

A few weeks later, I arrived at Beth Israel Hospital in Boston at 11 a.m. on a Tuesday. My first cardiologist was unavailable to do the procedure, so after being prepped in the pre-op area I was greeted by Andre d’Avila, a specialist in cardiac electrophysiology, who explained again how the procedure worked. He said that he and an electrophysiology fellow would insert long, snakelike catheters through the femoral veins in my groin; the catheters contain wires tipped with a tiny ultrasound camera and a cauterizer that would be used to selectively and carefully burn the surfaces of my atrial muscles. The idea was to create patterns of scar tissue to block and redirect the errant electrical signals and restore a steady rhuthmos to my heart. The whole thing would take about two or three hours, and I would likely be going home that afternoon.

Moments later, an orderly came and wheeled me through busy hallways to an OR, where Dr. d’Avila introduced the technicians and nurses on his team. Monitors pinged and machines whirred as an anesthesiologist placed a mask over my mouth and nose, and I slipped into unconsciousness.

The ablation was a success. Since I woke up, my heart has kept a steady beat, restoring my internal rhuthmos, even if the procedure sadly did not repair the myriad worrisome externalities—the demagogues, carbon footprints, and the rest. Still, the undeniably miraculous singeing of my atrial muscles left me with a realization that if human ingenuity can fix my heart and restore its rhythm, shouldn’t we be able to figure out how to fix other sources of arhuthmos in our lives? 

We already have solutions to some of what ails us. We know how to replace fossil fuels with renewables, make cities less sharp-edged, and create smart gizmos and apps that calm our minds rather than agitating them. 

For my own small fix, I thank Dr. d’Avila and his team, and the inventors of the ablation procedure. I also thank Prometheus, whose hubris in bringing fire to mortals literally saved me by providing the hot-tipped catalyst to repair my ailing heart. Perhaps this can give us hope that the human species will bring the larger rhythms of life into a better, if not perfect, beat. Call me optimistic, but also anxious, about our prospects even as I can now place my finger on my wrist and feel once again the steady rhuthmos of my heart.