How tracking animal movement may save the planet

There was something strange about the way the sharks were moving between the islands of the Bahamas.

Tiger sharks tend to hug the shoreline, explains marine biologist Austin Gallagher, but when he began tagging the 1,000-pound animals with satellite transmitters in 2016, he discovered that these predators turned away from it, toward two ancient underwater hills made of sand and coral fragments that stretch out 300 miles toward Cuba. They were spending a lot of time “crisscrossing, making highly tortuous, convoluted movements” to be near them, Gallagher says. 

It wasn’t immediately clear what attracted sharks to the area: while satellite images clearly showed the subsea terrain, they didn’t pick up anything out of the ordinary. It was only when Gallagher and his colleagues attached 360-degree cameras to the animals that the researchers were able to confirm what the sharks were so drawn to: vast, previously unseen seagrass meadows—a biodiverse habitat that offered a smorgasbord of prey.

The discovery did more than solve a minor mystery of animal behavior. Using the data they gathered from the sharks, the researchers were able to map an expanse of seagrass stretching across 93,000 square kilometers of Caribbean seabed—extending the total known global seagrass coverage by more than 40%, according to a study Gallagher’s team published in 2022. This revelation could have huge implications for efforts to protect threatened marine ecosystems—seagrass meadows are a nursery for one-fifth of key fish stocks and a habitat for endangered marine species—and also for all of us above the waves, as seagrasses can capture carbon up to 35 times faster than tropical rainforests.

Animals have long been able to offer unique insights about the natural world around us, acting as organic sensors picking up phenomena that remain invisible to humans. More than 100 years ago, leeches signaled storms ahead by slithering out of the water; canaries warned of looming catastrophe in coal mines until the 1980s; and mollusks that close when exposed to toxic substances are still used to trigger alarms in municipal water systems in Minneapolis and Poland. 

Attaching 360-degree cameras to tiger sharks helped demystify the animals’ strange movements around the Bahamas. COURTESY OF BENEATH THE WAVES

These days, we have more insight into animal behavior than ever before thanks to sensor tags, which have helped researchers answer key questions about globe-spanning migrations and the sometimes hard-to-reach places animals visit along the way. In turn, tagged animals have increasingly become partners in scientific discovery and planetary monitoring.

But the data we gather from these animals still adds up to only a relatively narrow slice of the whole picture. Results are often confined to silos, and for many years tags were big and expensive, suitable only for a handful of animal species—like tiger sharks—that are powerful (or large) enough to transport them. 

This is beginning to change. Researchers are asking: What will we find if we follow even the smallest animals? What if we could monitor a sample of all the world’s wildlife to see how different species’ lives intersect? What could we learn from a big-data system of animal movement, continuously monitoring how creatures big and small adapt to the world around us? It may be, some researchers believe, a vital tool in the effort to save our increasingly crisis-plagued planet. 

Wearables for the wild

Just a few years ago, a project called ICARUS seemed ready to start answering the big questions about animal movement. 

A team led by Martin Wikelski, a director at the Max Planck Institute of Animal Behavior in southern Germany and a pioneer in the field, launched a new generation of affordable and lightweight GPS sensors that could be worn by animals as small as songbirds, fish, and rodents. 

Martin Wikelski envisions a big-data system that monitors animal behavior to help us better understand the environment. CHRISTIAN ZIEGLER/MAX PLANCK INSTITUTE FOR ORNITHOLOGY

These Fitbits for wild creatures, to use Wikelski’s analogy, could produce live location data accurate to a few meters and simultaneously allow scientists to monitor animals’ heart rates, body heat, and sudden movements, plus the temperature, humidity, and air pressure in their surroundings. The signals they transmitted would be received by a three-meter antenna affixed to the International Space Station—the result of a €50 million investment from the German Aerospace Centre and the Russian Space Agency—and beamed down to a data bank on Earth, producing a map of the animals’ paths in close to real time as they crisscrossed the globe.

Wikelski and his peers hoped the project, formally the International Cooperation for Animal Research Using Space, would provide insights about a much wider variety of animals than they’d previously been able to track. It also aimed to show proof of concept for Wikelski’s dream of the past several decades: the Internet of Animals—a big-data system that monitors and analyzes animal behavior to help us understand the planet and predict the future of the environment.

Researchers have been laying the groundwork for years, connecting disparate data sets on animal movement, the environment, and weather and analyzing them with the help of AI and automated analytics. But Wikelski had his sights on something even grander and more comprehensive: a dashboard in which 100,000 sensor-tagged animals could be simultaneously monitored as near-real-time data flowed in from Earth-imaging satellites and ground-based sources. 

By bringing together each of these snapshots of animals’ lives, we might begin to understand the forces that are shaping life across the planet. The project had the potential to help us better understand and conserve the world’s most vulnerable species, showing how animals are responding to the challenges of climate change and ecosystem loss. It also promised another way to monitor the Earth itself during a period of increasing instability, transforming our animal co-inhabitants into sentinels of a changing world. 

When ICARUS first went into space in 2018, it was widely celebrated in the press. Yet what should have been a moment of glory for Wikelski and the field of animal ecology instead became a test of his will. The ICARUS antenna first went down for a year because of a technical issue; it went back up but was only just out of testing in February 2022 when the Russian invasion of Ukraine halted the project altogether.

Wikelski and his peers, though, have used the time since to innovate and evangelize. They now envision a more complete and technologically advanced version of the Internet of Animals than the one they hoped to build just a few years ago, thanks to advances in tracking technologies, AI, and satellite systems. They have built even smaller and cheaper sensors and found a new, more affordable way to work in space, using microsatellites called CubeSats. Their efforts have even persuaded NASA to invest time and resources in the possibility of building the Internet of Animals.

Now Wikelski and his collaborators are again on the verge, with an experimental CubeSat successfully transmitting data as part of a testing phase that started last June. If all goes as planned, another fully operational ICARUS CubeSat will begin collecting data next year, with more launches to follow. 

The potential benefits of this system are extraordinary and not yet fully understood, says Scott Yanco, a researcher in movement ecology at the University of Michigan. Perhaps it could help prevent mountain lion attacks or warn of a zoonotic disease about to make the jump to humans. It could alert researchers to behavioral changes that seem to happen in some animals before earthquakes, a phenomenon Wikelski has studied, and determine what conditions tell boobies in the Indo-Pacific to lay fewer eggs in the years before strong El Niños, or signal to weaver birds in the Niger Delta to build their nests higher up before floods.

“You can talk to 100 scientists about this,” Yanco says, “and they’re all going to give you a different answer of what they’re interested in.”

But first, a lot still needs to go right. 

Animals as sentinels

When I first spoke with Wikelski, in early 2022, ICARUS was live, tracking 46 species from the ISS 400 kilometers overhead. Wearing a pair of square-rimmed glasses and speaking in a German accent with a tone of unfailing urgency, he was excited to tell me about a tagged blackbird who made a 1,000-or-so-kilometer crossing from Belarus to Albania. 

That was actually pretty routine, Wikelski said, but almost everything else he had been seeing over the past year of road-testing had been stranger than expected. White storks were crossing back and forth over the Sahara five times a season, without apparent reason. Cuckoos, which are tree-dwelling birds ill suited to long periods at sea, were making uninterrupted journeys from India to the Horn of Africa. “Now, any time you look, totally novel aspects appear, and novel connections appear across continents,” he told me.

This could have been a mystifying mess. But for Wikelski, it was “beautiful data.” 

The practice of tagging animals to monitor their movements has been used for more than 100 years, though it began with a stroke of luck. In the 1820s, a hunter in a village in central Africa threw a 30-inch spear that lodged itself nonfatally in the neck of a white stork. This became what might have been the world’s first tag on a wild animal, says Yanco: the bird somehow flew back to Germany in the spring, helping settle the mystery of where storks disappeared to in the winter. 

By the 1890s, scientists had started tracking wild birds with bands fitted around their legs—but 49 out of every 50 ring-tagged birds were never seen again. Starting in the 1960s, thousands of birds received very-high-frequency radio tags known as “pingers,” but these were only powerful enough to broadcast a few kilometers. To capture the data, researchers had to embark on cartoonish chase scenes, in which tagged birds were pursued by an oversize homing antenna pointed out the roof of a car, plane, or hang-glider. 

More than 100 years ago, leeches held in a contraption called the Tempest Prognosticator provided signals of storms ahead by slithering out of water in glass bottles.

In the 1820s, a hunter in central Africa threw a spear that lodged itself nonfatally in the neck of a white stork. This became what might have been the world’s first wild-animal tag.

NASA invented space-based animal tracking in 1970 when it strapped a transmitter collar the weight of two bowling balls around the neck of Monique the Space Elk, a local news celebrity at the time.

Canaries warned of looming catastrophe in coal mines until the 1980s.

Wikelski tried all three. During a stint at the University of Illinois in Urbana-Champaign in the mid-’90s, he was studying thrushes and would gun an Oldsmobile around the Midwest at over 70 miles per hour. He’d set off as the songbirds got going at around 2 a.m., which tended to draw the attention of local police. Wikelski found that contrary to the conventional wisdom, thrushes used just 29% of their energy on their overnight migrations, less than they expended hunting and sheltering during stopovers. But the hassle of his process, which also entailed capturing and recapturing birds to weigh them, convinced Wikelski that, among other things, he needed better tools.

Thinking bigger (and higher) 

It was not immediately clear that the solution to Wikelski’s problems would be in space, though the idea of tracking animals via satellite had been explored decades before his Oldsmobile experiments. 

In fact, NASA invented space-based animal tracking back in 1970 when it strapped a transmitter collar the weight of two bowling balls around the neck of Monique the Space Elk, a local news celebrity at the time. (Monique was actually two elks: the anointed Monique, who wore a dummy collar for testing and press photos, and another, who accidentally caught a misfired tranquilizer dart and subsequently got the satellite transmitter collar.) After the Moniques met untimely deaths—one from starvation, the other at the hands of a hunter—the project went dormant too. 

But its research lived on in Argos, a weather monitoring system established in 1978 by the National Oceanic and Atmospheric Administration (NOAA) and the French space agency. It pioneered a way to track a tagged animal’s location by beaming up a short stream of analog data and measuring wave compression—the so-called Doppler shift—as a polar-orbiting satellite zoomed overhead at thousands of miles an hour. But this captured locations to only a few hundred meters, at best, and typically required a clear line of sight between tag and satellite—a challenge when working with animals below the canopy of rainforests, for instance.
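To make the Doppler trick concrete, here is a deliberately simplified Python sketch, not the actual Argos processing chain: the frequency, altitude, and orbital speed are rough, publicly known ballpark figures, and the geometry is flattened to a plane. It shows how the received frequency sweeps through the transmitted one at the moment of closest approach, and how the steepness of that sweep hints at the tag’s distance from the satellite’s ground track.

```python
# A simplified sketch of Doppler positioning, not the actual Argos algorithm.
# The satellite receives a fixed-frequency ping from a tag; the received
# frequency crosses the transmitted frequency at the moment of closest
# approach, and the slope of that crossing depends on how far the tag sits
# from the satellite's ground track.
import numpy as np

C = 3.0e8            # speed of light, m/s
F_TX = 401.65e6      # Argos uplinks sit around 401.65 MHz
ALT = 850e3          # rough polar-orbit altitude, m
V_SAT = 7.4e3        # rough orbital speed, m/s

def received_frequency(t, cross_track_km):
    """Received frequency over time t (s) for a tag offset from the ground track."""
    sat = np.stack([V_SAT * t, np.zeros_like(t), np.full_like(t, ALT)], axis=-1)
    tag = np.array([0.0, cross_track_km * 1e3, 0.0])
    rng = np.linalg.norm(sat - tag, axis=-1)      # slant range, m
    range_rate = np.gradient(rng, t)              # m/s, positive = receding
    return F_TX * (1 - range_rate / C)            # first-order Doppler shift

t = np.linspace(-300, 300, 601)                   # ten minutes around the overpass
for offset_km in (100, 500):                      # tags 100 km and 500 km off-track
    f = received_frequency(t, offset_km)
    idx = np.argmin(np.abs(f - F_TX))             # moment of closest approach
    slope = np.gradient(f, t)[idx]
    print(f"offset {offset_km} km: zero-Doppler at t = {t[idx]:+.0f} s, slope {slope:.1f} Hz/s")
# The zero-Doppler time fixes the tag's position along the orbit track; the
# slope constrains its distance from the track (with a left/right ambiguity
# that Argos resolves with additional passes).
```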

Wikelski worked extensively with Argos but found that the technology didn’t enable him to capture the highly detailed whole-life data he craved. By the late ’90s, he was on an island in Panama, exploring an alternative approach that followed hundreds of animals from 38 species, including small mammals and insects. 

Using six long-distance radio towers, Wikelski and Roland Kays, now the director of the Biodiversity Laboratory at the North Carolina Museum of Natural Sciences, started to develop the Automated Radio Telemetry System (ARTS), a radio collar tracking system that could penetrate thick canopy. Crucially, ARTS revealed interactions between species—for example, how predatory ocelots support the island’s palm trees by eating large quantities of rabbit-like agoutis, after the rodents bury palm seeds underground as a snack for later. The researchers also found that despite what everyone believed, many of the animal inhabitants don’t remain on the island year-round, but frequently travel to the mainland. Kays and Wikelski had demonstrated in microcosm the kinds of insights that fine-grained multispecies tracking could provide even in challenging environments.

But Wikelski was frustrated that he couldn’t follow animals off the map. “If we don’t know the fate of an animal, we will never be able to really do good biology,” he says. The only solution would be to have a map with no edge. 

This was around the time that GPS trackers became small enough to be used in animal tags. While Argos-style radio tags estimated location by transmitting signals up to orbiting receivers, GPS tags, like the navigation systems in cars, work the other way around: they receive timing signals from several satellites at once and use them to compute a precise position.
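As a rough illustration of that contrast, here is a toy Python sketch, not real GPS firmware: the satellite coordinates are hypothetical, and the receiver clock error that real GPS must also solve for (which is why four or more satellites are normally needed) is ignored. The receiver recovers its position purely from its measured distances to satellites at known locations.

```python
# Toy position fix from ranges to satellites at known positions.
# Hypothetical coordinates; receiver clock bias ignored for simplicity.
import numpy as np
from scipy.optimize import least_squares

sats = np.array([            # assumed satellite positions, Earth-centered frame (km)
    [15600.0,  7540.0, 20140.0],
    [18760.0,  2750.0, 18610.0],
    [17610.0, 14630.0, 13480.0],
    [19170.0,   610.0, 18390.0],
])
true_pos = np.array([6371.0, 0.0, 0.0])            # a point on Earth's surface
ranges = np.linalg.norm(sats - true_pos, axis=1)   # "measured" ranges (km)

def residuals(p):
    # difference between predicted and measured distance to each satellite
    return np.linalg.norm(sats - p, axis=1) - ranges

fix = least_squares(residuals, x0=np.zeros(3)).x   # crude initial guess
print(np.round(fix, 1))                            # should land on ~[6371., 0., 0.]
```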

Wikelski became a man possessed by the idea of using this technology to create a truly global animal monitoring system. He envisioned digital tags that could capture GPS data throughout the day and upload packets of data to satellites that would periodically pass overhead. This idea would generate both excitement and a lot of skepticism. Peers told Wikelski that his dream system was unrealistic and unworkable.

At the turn of the millennium, he took a position at Princeton with the notion that the institutional pedigree might earn an audience for his “crazy” idea. Not long after he arrived, the chief of NASA’s Jet Propulsion Laboratory came for a talk, and Wikelski asked whether the agency would benefit from a satellite system that could track birds. “He looked at me as if I came from a different planet,” Wikelski remembers. Still, he got a meeting with NASA—though he says he was laughed out of the building. By this time, the agency had apparently forgotten all about Monique. 

Undeterred, in 2002 Wikelski launched ICARUS, a half-joke (for fans of Greek mythology) at his own immodest ambitions. It aimed to use digital GPS tags and satellites that would relay the information to a data center on Earth nearly as instantly as the ARTS system had.

Wikelski’s big ideas continued to run into big doubts. “At the time, people told us technology-wise, it will never work,” he says. Even 10 years ago, when Wikelski was making proposals to space agencies, he was told to avoid digital tech altogether in favor of tried-and-tested Argos-style communication. “Don’t go digital!” he recalls people telling him. “This is completely impossible! You have to do it analog.” 

Moving away from the fringe

In the two decades since ICARUS was established, the scientific community has caught up, thanks to developments in consumer tech. The Internet of Things has made two-way digital communications with small devices viable, lithium batteries have shrunk to sizes that more animals can carry, and smartphones have made low-cost GPS and accelerometers increasingly available.

“We’re going from where we couldn’t really track most vertebrate species on the planet to flipping it. We’re now able to track most things,” says Yanco, emphasizing that this is possible “to varying degrees of accuracy and resolution.” 

The other key advance has been in data systems, and in particular the growth of Movebank, a central repository of animal tracking data that was developed from Wikelski’s ARTS system. Movebank brings together terrestrial-animal tracking data from various streams, including location data from the Argos system and from new high-res digital satellites, like ICARUS’s antenna on the ISS. (There are also plans to incorporate CubeSat data.) To date, it has collected 6 billion data points from more than 1,400 species, tracking animals’ full life cycles in ways that Wikelski once could only dream about. It is now a key part of the plumbing of the animal internet. 

The field also had some practical successes, which in turn allowed it to marshal additional resources. In London in 2016, for instance, where air pollution was responsible for nearly 10,000 human deaths a year, researchers from Imperial College and the tech startup Plume Labs released 10 racing pigeons equipped with sensors for nitrogen dioxide and ozone emissions from traffic. Daily updates (tweeted out by the Pigeon Air Patrol account) showed how following a pigeon’s path through the neighborhoods revealed pollution hot spots that weather stations missed.

Diego Ellis Soto, a NASA research fellow and a Yale PhD candidate studying animal ecology, highlights an experiment from 2018: flocks of storks were outfitted with high-resolution GPS collars to monitor the air movements they encountered over the open ocean. Tagged storks were able to capture live data on turbulence, which can be notoriously hard for airlines to predict.

Among the critical roles for these animal sensors was one that was once considered eccentric: predicting weather and the world’s fast-changing climate patterns. Animals equipped with temperature and pressure sensors essentially act as free-roaming weather buoys that can beam out readings from areas underserved by weather stations, including polar regions, small islands, and much of the Global South. Satellites struggle to measure many environmental variables, including ocean temperatures, which can also be prohibitively expensive for drones to collect. “Eighty percent of all measurements in Antarctica of sea surface temperature are collected by elephant seals, and not by robots or icebreakers,” Ellis Soto says. “These seals can just swim underneath the ice and [do] stuff that robots can’t do.” The seals are now tagged yearly, and the data they collect helps refine weather models that predict El Niño and sea-level rise.

When the ICARUS antenna was installed on the ISS in August 2018, it seemed poised to unlock even more capabilities and discoveries. In the antenna’s short life, the project recorded the movements of bats, birds, and antelope in near-real time, from Alaska to the islands of Papua New Guinea, and transferred the data to Movebank. But when the experiment ground to a premature halt, Wikelski knew he’d have to do something different, and he concocted a plan by which ICARUS could continue—whether it could rely on a major space agency or not.

Another shot

Rather than a system of major satellites, the new incarnation of ICARUS will run on CubeSats: low-cost, off-the-shelf microsatellites launched into low Earth orbit (around the same height as the ISS) for around $800,000, meaning even developing nations that harbor space ambitions can be part of the project. CubeSats also offer the benefit of truly global coverage; the ISS’s orbital path means it can’t pick up signals from polar regions further north than southern Sweden or further south than the tip of Chile.

There’s currently one ICARUS CubeSat in testing, having launched into orbit last summer. If all goes well, a CubeSat funded by the Max Planck Society, in collaboration with the University of the Bundeswehr Munich, will launch next April, followed by another in winter 2025, and—they’re hoping—another in 2026. Each additional satellite allows the tags to upload one more time per day, increasing the temporal resolution and bringing the system closer to truly real-time tracking.

Outfitting even small animals with lightweight, inexpensive GPS sensors, like the one on this blackbird, and monitoring how they move around the world could provide insights into the global effects of climate change.

Wikelski and his partners have also rededicated themselves to making even smaller tags. They’re close to the goal of getting them down to three grams, which would in theory make it possible to track more than half of mammal species and around two-fifths of birds, plus hundreds of species of crocodiles, turtles, and lizards. ICARUS’s tags are also now cheaper (costing just $150) and smarter. ICARUS developed AI-on-chip systems that can reduce the energy use by orders of magnitude to cut down on the size of batteries, Wikelski explains. There are also new tags being tested by scientists from the University of Copenhagen and Wikelski’s institute at Max Planck that harvest energy from animal movements, like a self-winding wristwatch. Finally, these new ICARUS sensors can also be reprogrammed remotely, thanks to their two-way Internet of Things–style communications. A new ecosystem of tag makers—professional and DIY—is further driving down prices, open-sourcing innovation, and allowing experimentation. 

Still, not everyone has bought into ICARUS. Critics question the costs compared with those of existing terrestrial monitoring initiatives like MOTUS, a national Canadian bird conservation program that uses a network of 750 receiving towers. Others argue that researchers can make better use of the thousands of animals already tracked by Argos, which is upgrading to more accurate tags and is also set to launch a series of CubeSats. The total cost of a fully realized ICARUS system—100,000 animals at any one time, some of which die or disappear as new ones are tagged—is around $10 million to $15 million a year. “If you’re thinking about how to tag a moose or bighorn sheep, you might need to hire a helicopter and the whole team and the vet,” says Ellis Soto, who has long collaborated with Wikelski. “So the costs can be extremely, extremely limiting.” 

But, proponents argue, the initiative would yield far more information than other Earth-imaging space missions and be significantly cheaper than sending humans or drones to collect data from remote locations like polar ice sheets. Wikelski also emphasizes that no one entity will bear the cost. He is working with local communities in Bhutan, South Africa, Thailand, China, Russia, and Nigeria and gets requests from people across the world who want to connect tags to ICARUS. With cheap satellites and cheap tags, he sees a route to scale.

Even as ICARUS explores a grassroots future, one of the biggest changes since the initial launch is the backing that Internet of Animals technology has received from the biggest player in the field: NASA. The agency is now two years into a five-year project to explore how it might get more involved in building out such a system. “We’re very much focused on developing future mission concepts that will come after the current set of ICARUS missions,” says Ryan Pavlick, a researcher in remote sensing of biodiversity at NASA’s Jet Propulsion Laboratory. In 2024, this will mean “architecture studies” that aim to understand what technical systems might meet the animal-tracking needs of stakeholders including NOAA, the US Fish and Wildlife Service, and the United States Geological Survey.

While NASA’s project aims to deliver benefits for the American people, a fully realized Internet of Animals would necessarily be global and interspecies. When we spoke in November 2023, Wikelski had just got off the phone discussing how ICARUS can help monitor the global “deal for nature” established by the UN’s COP15 biodiversity conference, whose targets include reducing extinction rates by a factor of 10. 

Jill Deppe, who leads the National Audubon Society’s Migratory Bird Initiative, has boundless enthusiasm for how an Internet of Animals could affect organizations like hers. For a century, Audubon has watched migratory birds disappear on journeys to Chile or Colombia. A system that could tell us where birds are dying across the entire Western Hemisphere would allow Audubon to precisely target investments in habitat protection and efforts to address threats, she says.  

“Our on-the-ground conservation work is all done on a local scale,” says Deppe. For migratory birds, ICARUS can link these isolated moments into a storyline that spans continents: “How do all of those factors and processes interact? And what does that mean for the birds’ survival?”

Movebank’s live-updating dashboard also makes more dynamic conservation action possible. Beaches can be closed as exhausted shorebirds land, wind farms can halt turbines as bats migrate through, and conservation-conscious farmers—who already aim to flood fields or drain them at times that suit migrating flocks—can do so with real knowledge. 

In return, will animals really help us see the future of the planet’s climate? 

No one is suggesting that animals take over from the system of satellites, weather stations, balloons, and ocean buoys that currently feed into meteorologists’ complex models. Yet technology that complements these dependable data streams, that captures the ever-changing biological signals of seals, storks, sharks, and other species, is already starting to fill in gaps in our knowledge. Once considered cryptic signs from the fates, or harbingers of doom, their behaviors are messages that have only just begun to show us ways to live on a changing planet. 

Matthew Ponsford is a freelance reporter based in London. 

Why China’s EV ambitions need virtual power plants

This story first appeared in China Report, MIT Technology Review’s newsletter about technology in China. Sign up to receive it in your inbox every Tuesday.

The first time I heard the term “virtual power plants,” I was reporting on how extreme heat waves in 2022 had overwhelmed the Chinese grid and led the government to restrict electric-vehicle charging as an emergency solution. I was told at the time that virtual power plants (VPPs) could make grid breakdowns like that less likely to happen again, but I didn’t have a chance to dig in and learn what that meant.

If you, like me, are unsure how a power plant can be virtual, my colleague June Kim just published an insightful article explaining the technology and how it works. For this week’s newsletter, I took the chance to ask her some more questions about VPPs. It turns out the technology has a particularly good synergy with the EV industry, which is why the Chinese government has started to invest in VPPs. 

“VPPs are basically just aggregations of distributed energy resources that can balance electricity on the grid,” June says—resources including electric-vehicle chargers, heat pumps, rooftop solar panels, and home battery packs for power backups. “They’re working in coordination to replace the function of a centralized coal plant or gas plant … but also add a whole host of other functionalities that are beneficial for the grid,” she says.

To really make the most of these resources, VPPs introduce another layer: a central smart system that coordinates energy consumption and supply. 

This system allows utility companies to handle times of higher energy demand by making adjustments like shifting EV charge time to 2 a.m. to avoid peak hours.
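As a concrete sketch of the kind of coordination June describes, the short Python example below takes a made-up 24-hour demand forecast and greedily shifts each EV’s charging into the least-loaded hours before its owner needs the car. The numbers, the 7 kW home charger, and the greedy rule are all illustrative assumptions, not any utility’s actual dispatch logic.

```python
# Minimal sketch of VPP-style charge scheduling (hypothetical numbers).
from dataclasses import dataclass

@dataclass
class EV:
    name: str
    hours_needed: int       # hours of charging required
    departure_hour: int     # must be fully charged by this hour (0-23)

# Forecast grid demand for hours 0..23 (arbitrary units); evening peak at 17-20.
demand = [30, 28, 27, 27, 28, 32, 40, 55, 60, 58, 55, 54,
          53, 52, 55, 60, 70, 85, 95, 92, 80, 65, 50, 38]

def schedule(evs, demand):
    """Assign each EV's charging hours to the least-loaded feasible hours."""
    load = list(demand)                     # running total: grid + scheduled EVs
    plan = {}
    for ev in evs:
        feasible = [h for h in range(24) if h < ev.departure_hour]
        chosen = []
        for _ in range(ev.hours_needed):
            # pick the currently least-loaded remaining hour before departure
            h = min((h for h in feasible if h not in chosen), key=lambda h: load[h])
            chosen.append(h)
            load[h] += 7                    # assume a 7 kW home charger
        plan[ev.name] = sorted(chosen)
    return plan

print(schedule([EV("sedan", 4, 7), EV("suv", 6, 8)], demand))
# Both cars end up charging in the small hours (roughly midnight to 6 a.m.),
# exactly the "shift charging to 2 a.m." behavior described above.
```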

The US government is working to triple VPP capacity by 2030, June says. That capacity is equivalent to 80 to 160 fossil-fuel plants that don’t have to be built. “They expect that EV batteries and the EV charging infrastructure are going to be the biggest factor in building up this additional VPP capacity,” she says.

Considering the significant impact that EVs have on the grid, it’s no surprise that China, where an EV revolution is taking place faster than in any other country, has also turned its attention to VPPs.

By the end of 2023, there were over 20 million EVs in China, almost half the global total. Together, these cars can consume monstrous amounts of energy—but their batteries can also be an emergency backup source. The power shortage that happens in China almost every summer is an urgent reminder that the country needs to figure out how to incorporate these millions of EVs into the existing grid.

Luckily, there are already some moves in this area, both from the Chinese government and from Chinese EV companies.

In January 2024, China’s National Development and Reform Commission, the top economic planning authority, released a blueprint for integrating EV charging infrastructure into the grid. The country plans to start pilot programs with dynamic electricity pricing in a few cities: lower prices late at night can incentivize EV owners to charge their vehicles when the grid is not stressed. The goal is that no more than 40% of EV charging will take place outside these “trough hours.” There will also be a batch of bidirectional charging stations in public and private spaces. At these chargers, batteries can either draw electricity from the grid or send it back.

Meanwhile, NIO, a leading Chinese EV company, is transforming its own charging networks. Last month, 10 NIO charging stations opened in Shanghai that allow vehicles to feed energy back into the grid. The company also has over 2,000 battery-swapping stations across the country. These are ideal energy storage resources for the VPP network. Some of them have already been connected to VPP pilot programs in eastern China, the company said in July 2023.

One of the key obstacles to adoption of VPPs is getting people to sign up to participate. But there’s a compelling reward on offer: money. 

If the reverse-charging infrastructure grows larger, millions of Chinese EV owners could make a little income by charging at the right times and selling electricity at others. 

We don’t know how much earning potential there is, since these pilot programs are still in their very early stages in China. But existing VPP projects in the US can offer some reference. Over the course of one summer, a Massachusetts home can make an estimated $550; participants in a separate VPP project in Texas can earn an estimated $150 per year. “It’s not huge, but it’s not nothing,” June says.

Obviously, it will take a long time to transform our electric grids. But developing VPPs along with the EV charging network seems like a win-win situation for China: it helps the country maintain its lead in the EV industry, and it also makes the grid more resilient and less dependent on coal power plants. I won’t be surprised if Chinese local governments and companies work together to roll out virtual power plants in earnest over the next few years.

Do you think China will catch up quickly on adopting virtual power plants? Tell me your thoughts at zeyi@technologyreview.com.

Catch up with China

1. The economic shadows of the pandemic have finally receded. This Lunar New Year, the number of travelers and the amount of spending in China finally surpassed pre-pandemic levels. (Bloomberg $)

2. The European Union is probing China’s state-owned train manufacturer for government subsidies that could give it an unfair advantage when bidding for overseas procurements. (Politico)

  • Last year, the European Commission started another anti-subsidy investigation over imports of Chinese electric vehicles. (MIT Technology Review)

3. China’s burgeoning sci-fi literature scene helped bring the prestigious Hugo Awards to the country last year. But leaked emails show that the awards’ administration team actively censored authors who could upset the Chinese government. (The Guardian)

4. A Volkswagen supplier found a component that might have been produced in Xinjiang, where the use of forced labor has been documented. Now thousands of Porsche, Bentley, and Audi cars are being held at US ports waiting for replacement parts. (Financial Times $)

5. The leading Chinese EV maker BYD is considering building a factory in Mexico. If that happens, we might be able to buy BYD vehicles in the US soon. (Nikkei Asia $)

  • Exports of BYD cars have grown so much in recent years that the company is now buying and hiring massive ships to help deliver them. (MIT Technology Review)

6. A new report by OpenAI and Microsoft says hackers from China, Russia, North Korea, and Iran have used their large language models, but mostly for mundane tasks like drafting emails. (New York Times $)

7. China’s first domestically made passenger airplane made its first overseas trip to Singapore. (Reuters $)

8. New Chinese restaurant chains that combine traditional cuisine with fast food are blowing up in China. When are they going to open one in the US? (Time)

Lost in translation

Huaqiangbei is a neighborhood in Shenzhen known as a hub of domestic innovation and imitation. It has always played a pivotal role in introducing expensive products (like iPhones and AirPods) to Chinese users, either through smuggling or by producing knockoff versions. And the launch of Apple’s Vision Pro has again reminded people of Huaqiangbei’s influence on consumer trends, according to Chinese tech columnist Wang Qingrui.

One Shenzhen-based company, EmdoorVR, has already launched a VR headset that looks almost identical to the Vision Pro. This imitator, which is much more limited in function, is named VisionSE and sells for less than one-tenth the price. However, many Huaqiangbei brands have yet to follow suit, since they are not confident about the future of VR headsets. Their hesitation could be another signal that it will be hard for the Vision Pro to find as much acceptance as Apple’s previous successes did.

One more thing

For many Chinese families, playing mah-jongg is an essential New Year tradition. But machines are transforming how the game is played: a viral video on social media shows a mah-jongg machine without the usual tiles. Instead, it displays everything on five different screens. It also automatically voices the moves and calculates the results. Not many people in the comments are impressed. Mah-jongg is “99% about feeling the tiles,” says one.

Three things to love about batteries

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

I wouldn’t exactly say I have favorites when it comes to climate technologies. Anything that could help us get closer to tackling climate change is worth writing about, both to share the potential upsides and to carefully examine for pitfalls. But I have a special spot in my heart and my reporting notebook for batteries.

After all, what’s not to love? They play a crucial role in climate action, there are a million different kinds that can meet basically any need, and they’re at least a little bit magical. 

In honor of everyone’s favorite Hallmark-ified holiday, I thought I’d share a love letter to batteries. In any case, this should give you some sense of why I keep coming back to this subject. (Most recently, I dove into the topic of an alternative battery chemistry, lithium-sulfur—give that a read if you haven’t!)

So, how do I love batteries? Let me count the ways. 

They’re practical 

Imagine a world that’s on its way to reaching net-zero greenhouse gas emissions by 2050. That would put us on track to limit global warming to less than 2 °C, or 3.6 °F. To get there, the two biggest sectors to clean up are electricity and transportation: how we power the world and get around. And the common denominator is—you guessed it—batteries. 

Some low-emissions power sources, like wind and solar, aren’t consistently available, so they need a little backup. That’s where grid storage comes in—we’ll need to build about 100 times more energy storage on the grid by 2050 to be on track for our net-zero scenario.

This won’t all be batteries—storing energy with pumped hydro, compressed air, and other methods could be key. But batteries, especially if cheaper alternatives can scale, will be a major piece of the puzzle.

Electrifying transport is a similar story. We need to move from gas guzzlers to zero-emissions vehicles. And batteries are going to help us do it. 

In our net-zero scenario, the world needs about 14 terawatt hours’ worth of batteries for EVs every year by 2050, according to the International Energy Agency. That’s something like 90 times greater than production in 2020. 
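Just to sanity-check those two figures against each other, here is a one-liner’s worth of arithmetic (my own back-of-the-envelope math, not an IEA calculation): dividing the 2050 requirement by the growth factor implies a 2020 baseline on the order of 150 to 160 GWh.

```python
# Back-of-the-envelope check of the figures quoted above.
needed_2050_twh = 14    # ~14 TWh of EV batteries per year by 2050 (IEA net-zero scenario)
growth_factor = 90      # "something like 90 times greater than production in 2020"
implied_2020_gwh = needed_2050_twh / growth_factor * 1000
print(f"Implied 2020 EV battery production: ~{implied_2020_gwh:.0f} GWh")   # ~156 GWh
```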

They’re versatile

One of my favorite things about battery technology is its adaptability. Researchers are finding and developing new chemistries all the time, and it’s fascinating to follow. 

Lithium-ion batteries tend to be the default for the industries I typically write about (think transportation and energy storage). That’s mostly because these batteries were developed for personal devices that became widespread beginning in the 1990s, so they’ve had a head start on scaling and the cost cuts that come along with it. 

Even in existing battery technologies, there’s lots of nuance and innovation. Lithium-ion batteries follow a similar blueprint, but there’s a whole world of flavors. Your phone and laptop probably house pouch cells with higher levels of cobalt, whereas your EV likely runs off cylindrical ones that are high in nickel. And a growing fraction of lithium-ion cells don’t include either of those metals—companies are looking at these options for stationary storage or lower-cost vehicles.

But don’t stop there. Next-generation batteries could give us a different chemistry for every occasion. Need a robust, low-cost battery? Try sodium-ion. Even cheaper, for stationary storage? Zinc flow batteries or iron-air might be the chemistry for you. Something for a long-range, high-performance EV? Check out solid-state batteries, or maybe something of the lithium-sulfur variety.

I’m often asked which battery chemistry is going to “win.” Not all batteries are going to make it to widespread adoption, and not all battery companies are going to succeed. But I think the answer is that we’ll hopefully see not a single dominant type of battery, but an ever-growing menu of options. 

They’re at least a little bit magic

Last but not least, I think that one of the main reasons that I’m obsessed with batteries is that I find them a little bit mystifying. Tiny ions shuttling around in a metal container can store energy for us to use, whenever and wherever we want. 

I’ll never get sick of it, and I hope you won’t either. Here’s to spending more time with the ones we love in the year ahead. 

Related reading

Read more about lithium-sulfur batteries, which could unlock cheaper EVs with longer range, in my latest story. 

For another alternative, check out this story from last year on the sodium-ion batteries that could be closer to hitting the roads.

Form Energy and its iron-air batteries made our 2023 list of 15 Climate Tech Companies to Watch. Read all about them here.

I’m not the first MIT Technology Review reporter to dive in on batteries. Read this 2018 story from my colleague James Temple on why lithium-ion batteries won’t be able to clean up the grid on their own. 

Another thing

If you, like me, can’t get enough batteries, I’ve got a great event coming up this week for you! Join me, senior editor James Temple, and editor-at-large David Rotman for the latest in our Roundtables series, where we’ll be diving into a rousing conversation about batteries and their materials. 

This event is open to subscribers, so subscribe if you haven’t yet and come ask all the questions you have about batteries, minerals, and mining! See you there!

More from us

Sales might be down, but heat pumps are still hot. The devices, which can heat and cool spaces using electricity, are gaining ground on fossil fuels in the US. Check out the data in this story for more on why it matters, and what this says about decarbonization prospects for the country and beyond. 

Also, I’d like to introduce you to a new colleague, James O’Donnell! He’s joining the AI team, and he’s coming out swinging with a story about how Google is using a new satellite to detect methane leaks. Give it a read, and stay tuned for more great stories from him to come. 

Keeping up with climate  

Charging EVs might seem like it’s all about being fast, but slow chargers could be the key to getting more renters to adopt the technology. (Grist)

Chinese automaker BYD has seen massive growth in its EV sales, beating out Tesla in the last quarter of 2023 to become the world’s largest EV maker. Here’s how that happened. (New York Times)

→ BYD is moving so fast that the company is getting into shipping to move more vehicles. (MIT Technology Review)

Consumer demand for EVs is slowing a bit. Some companies are looking to smaller vehicles to help jumpstart interest. (IEEE Spectrum)

Dirt is a major carbon store, holding three times as much as the entire atmosphere. The problem for people looking to leverage dirt for carbon removal is that nobody knows exactly how much carbon can be stored in dirt. (Grist)

Last year was an awful one for the offshore wind industry, but things might be looking up in the year ahead. (Heatmap)

→ Here’s what’s coming next for offshore wind. (MIT Technology Review)

This carbon removal startup is powered by sunlight and seawater. Banyu Carbon’s reversible photoacid could help suck up greenhouse gases from the ocean, though experts have questions about the scalability and ecological effects. (Bloomberg)

How sulfur could be a surprise ingredient in cheaper, better batteries

The key to building less-expensive batteries that could extend the range of EVs might lie in a cheap, abundant material: sulfur.

Addressing climate change is going to require a whole lot of batteries, both to drive an increasingly electric fleet of vehicles and to store renewable power on the grid. Today, lithium-ion batteries are the dominant choice for both industries.

But as the need for more batteries grows, digging up the required materials becomes more challenging. The solution may lie in a growing number of alternatives that avoid some of the most limited and controversial metals needed for lithium-ion batteries, like cobalt and nickel.

One contender chemistry, lithium-sulfur, could soon reach a major milestone, as startup Lyten plans to deliver limited quantities of lithium-sulfur cells to its first customers later this year. The cells (which can be strung together to build batteries of different sizes) will go to customers in the aerospace and defense industries, a step on the journey to building batteries that can stand up to the test of EVs.  

When it comes to new options for batteries, “we need something that we can make a lot of, and make it quickly. And that’s where lithium-sulfur comes in,” says Celina Mikolajczak, chief battery technology officer at Lyten.

Sulfur is widely abundant and inexpensive—a major reason that lithium-sulfur batteries could come with a much cheaper price tag. The cost of materials is around half that of lithium-ion cells, Mikolajczak says. 

That doesn’t mean the cost for the new batteries will immediately be lower, though. Lithium-ion has had decades to slowly cut costs, as production has scaled and companies have worked out the kinks. But a lower cost of materials means the potential for cheaper batteries in the future. 

Not only could lithium-sulfur batteries eventually provide a cheaper way to store energy—they could also beat out lithium-ion on a crucial metric: energy density. A lithium-sulfur battery can pack in nearly twice as much energy as a lithium-ion battery of the same weight. That could be a major plus for electric vehicles, allowing automakers to build vehicles that can go farther on a single charge without weighing them down.

However, there are still major technical barriers Lyten needs to overcome for its products to be ready to hit the road in an EV. Chief among them is getting batteries to last.

Today’s lithium-ion batteries built for EVs can last for 800 cycles or more (meaning they can be drained and recharged 800 times). Lithium-sulfur options tend to degrade much faster, with many efforts today hovering somewhere around 100 cycles, says Shirley Meng, a battery researcher at the University of Chicago and Argonne National Laboratory.

That’s because taming the chemical reactions that power lithium-sulfur batteries has proved to be a challenge. Unwanted reactions between lithium and sulfur can sap the life out of batteries and drive them to an early grave.

Lyten is far from the first to go after the promise of lithium-sulfur batteries, with companies big and small making forays into the chemistry for decades. Some, like UK-based Oxis Energy, have shuttered, while others, including Sion Power, have pivoted away from lithium-sulfur.  But growing demand for alternatives, and a higher level of interest and funding, could mean that Lyten succeeds where earlier efforts have failed, Meng says.

Lyten has made progress in stretching the lifetime of its batteries, recently seeing some samples reach as high as 300 cycles, Mikolajczak says. She attributes the success to Lyten’s 3D graphene material, which helps prevent unwanted side reactions and boost the cell’s energy density. The company is also looking to use 3D graphene, a more complicated structure than the two-dimensional variety, in other products like sensors and composites.

Even with recent progress, Lyten is still far from producing batteries that can last long enough to power an EV. In the meantime, the company plans to bring its cells to market in places where lifetime isn’t quite so important. 

Since lithium-sulfur batteries can be extremely lightweight, the company is working with customers building devices like drones, for which replacing the batteries frequently would be worth the savings on weight, says Keith Norman, Lyten’s chief sustainability officer. 

The company opened a pilot manufacturing line in 2023 with a maximum capacity of 200,000 cells annually. It recently began producing a small number of cells, which are scheduled for delivery to paying customers later this year. 

The company hasn’t publicly shared which companies will receive the first batteries.  Moving forward, two of the company’s main focuses are improving lifetime and scaling production of both 3D graphene and battery cells, Norman says. 

The road to lithium-sulfur batteries that can power EVs is still a long one, but as Mikolajczak points out, today’s staple chemistry, lithium-ion, has improved leaps and bounds on cost, lifetime, and energy density in the years that companies have been working to tweak it. 

People have tried out a massive range of chemistry options in batteries, Mikolajczak says. “To make one of them reality requires that you put in the work.”

A new satellite will use Google’s AI to map methane leaks from space

A methane-measuring satellite set to launch in March aims to use Google’s AI to quantify, map, and ultimately help reduce leaks. The mission is part of a collaboration with the nonprofit Environmental Defense Fund, and the result, they say, will be the most detailed portrait yet of methane emissions. It should help to identify the worst spots, and who is responsible.

With methane responsible for roughly a third of the warming caused by greenhouse gases, regulators in the United States and elsewhere are pushing for stronger rules to curb the leaks that spring from oil and gas plants.

MethaneSAT will measure the plumes of methane that billow invisibly from oil and gas operations around the globe, and Google and EDF will then map those leaks for use by researchers, regulators, and the public.

“We’re effectively putting on a really high-quality set of glasses, allowing us to look at the Earth and these emissions with a sharpness that we’ve never had before,” says Steve Hamburg, chief scientist and MethaneSAT project lead at EDF.

Methane experts say, however, that the path from finding a leak to getting a company to plug it will be arduous, and one that the collaboration cannot solve on its own.

Once in orbit, MethaneSAT’s software and spectrometers, which measure different wavelengths of light to detect methane, will pinpoint both the concentrated locations of methane plumes and the broader areas where the gas diffuses and spreads. It will also use Google’s image detection algorithms to create the first comprehensive, global map of the oil and gas industry’s infrastructure, like pump jacks and storage tanks, where leaks most commonly occur.

“Once those maps are lined up, we expect people will be able to have a far better understanding of the types of machinery that contribute most to methane leaks,” says Yael Maguire, who leads geo-sustainability efforts at Google. 

This tool could solve a significant stumbling block for methane researchers, according to Rob Jackson, professor of Earth system science at Stanford. There are millions of oil and gas operations around the world, and information about where many of these facilities are located is tightly guarded and, where it is available, expensive to access. Some countries also block researchers from studying their infrastructure or using low-flying planes to measure emissions. With satellites, that may change.

“I think AI is the future of this field, where we should be creating databases of all these infrastructure types,” says Jackson, as measuring plumes from space sidesteps much of the oil and gas industry’s opaqueness on Earth. “One door that satellites are unlocking is the ability to peer everywhere. There will be nowhere to hide, eventually.” 

The MethaneSAT collaboration comes at a time when governments around the world are taking stronger stances on reducing methane leaks. Fueled by the momentum of COP28, the Biden administration announced a new set of rules in December that will require more monitoring and repair of leaks. In January, the administration also proposed a fine against companies for excess methane, though that rule has not been finalized and is being fought by the industry. The European Union also agreed to stricter standards in November.

Once the MethaneSAT collaboration identifies where leaks are coming from, EDF will use the global Methane Alert and Response System from the United Nations, which sends data about methane leaks to governments and policymakers for them to act on. Hamburg from EDF says the first data and images from the satellite are expected in early summer.

Though Jackson is optimistic that more accurate data from Google and EDF will put pressure on companies, he cautions that going from awareness to action is not straightforward. For one, even if a particular oil and gas operation is identified as a bad actor, it’s no small task to figure out who owns that infrastructure, and what tools are available to get them to act. On top of that, some regions and governments are likely to be less responsive to the data than others.

“I’m not confident that simply having this information will mean that companies and countries will switch off methane leaks like a light switch,” he says. 

This chart shows why heat pumps are still hot in the US

Heat pumps are still a hot technology, though sales in the US, one of the world’s largest markets, fell in 2023. Even with the drop, the appliances beat out gas furnaces for the second year in a row and saw their overall market share increase compared to furnaces, sales of which also fell last year.

Heat pumps heat and cool spaces using electricity, and they could be a major tool in the effort to cut greenhouse gas emissions. (About 10% of global emissions are generated from heating buildings.) Many homes and other buildings around the world use fossil fuels for heating in systems like gas furnaces—heat pumps are generally more efficient, and crucially, can be powered using renewable electricity. Experts say heat pump sales will need to grow quickly in order to keep buildings safe and comfortable while meeting climate goals. 

Heat pumps have been around for decades, but the technology has been experiencing a clear moment in the sun in recent years, with global sales increasing by double digits in both 2021 and 2022, according to the International Energy Agency (IEA). Heat pumps were featured on MIT Technology Review’s 2024 list of 10 Breakthrough Technologies.

Sales fell by nearly 17% in 2023 in the US, one of the technology’s largest markets, according to new data from the Air-Conditioning, Heating, and Refrigeration Institute. The slowdown comes after nearly a decade of constant growth. The AHRI data isn’t comprehensive, but the organization includes manufacturers accounting for about 90% of the units sold in the US annually.

However, the decline likely says less about heat pumps than it does about the whole HVAC sector, since gas furnaces and air conditioners saw even steeper drops. Gas furnace sales declined even more than heat pump sales did, so heat pumps actually made up a slightly larger share of sales in 2023 than in 2022.

The broad slowdown reflects wider consumer pessimism amid higher interest rates and inflation, says Yannick Monschauer, an analyst at the IEA, via email.

“We have also been observing slowing heat pump sales in other parts of the world for 2023,” Monschauer adds. In Europe, a rush to electrify, driven by the energy crisis and rising natural gas prices, has slowed. 

New incentive programs could help speed progress in 2024 and beyond. The Inflation Reduction Act, a sweeping climate bill passed in 2022, includes individual tax credits of up to $2,000 toward a new heat pump, which went into effect at the beginning of 2023.

However, the more generous incentives in that law have yet to take effect, says Wael Kanj, a research associate at Rewiring America, a nonprofit group focused on electrification in the US.  

New rebates will provide up to $8,000 toward a new heat pump system for low- and middle-income households. Distributing the rebates is up to individual states, and analysts anticipate those programs getting up and running in late 2024 or early 2025, Kanj says.

Heat pumps are a crucial component of plans to combat climate change. In a scenario where the world reaches net-zero emissions by 2050, heat pumps need to account for 20% of global heating capacity by the end of this decade, according to an IEA analysis.

“The next five, ten, 15 years are really going to be important,” Kanj says. “We definitely need to pick up the pace.”

Advanced solar panels still need to pass the test of time

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

It must be tough to be a solar panel. Panels are consistently exposed to sun, heat, and humidity—and the ones installed today are expected to last 30 years or more.

But how can we tell that new solar technologies will stand the test of time? I’m fascinated by the challenge of predicting how new materials will hold up in decades of tough conditions. That’s been especially tricky for one emerging technology in particular: perovskites. They’re a class of materials that developers are increasingly interested in incorporating into solar panels because of their high efficiency and low cost. 

The problem is, perovskites are notorious for degrading when exposed to high temperatures, moisture, and bright light … all the things they’ll need to withstand to make it in the real world. And it’s not as if we can sit around for decades, testing out different cells in the field for the expected lifetime of a solar panel—climate change is an urgent problem. The good news: researchers have made progress in both stretching out the lifetime of perovskite materials and working out how to predict which materials will be winners in the long run. 

There’s almost constant news about perovskite solar materials breaking records. The latest such news comes from Oxford PV—in January, the company announced that one of its panels reached a 25% conversion efficiency, meaning a quarter of the solar energy beaming onto the panel was converted to electricity. Most high-end commercial panels have around a 20% efficiency, with some models topping 23%. 

The improvement is somewhat incremental, but it’s significant, and it’s all because of teamwork. Oxford PV and other companies are working to bring tandem solar technology to the market. These panels are basically sandwiches that combine layers of silicon (the material that dominates today’s solar market) and perovskites. Since the two materials soak up different wavelengths of light, they can be stacked together, adding up to a more efficient solar material. 

We’re seeing advances in tandem technology, which is why we named super-efficient tandem solar cells one of our 2024 Breakthrough Technologies. But perovskites’ nasty tendency to degrade is a major barrier standing in the way. 

Early perovskite solar cells went bad so quickly that researchers had to race across the laboratory to measure their efficiency. In the time it took to get from the area where solar cells were made to the side of the room where the testing equipment was, the materials basically lost their ability to soak up sunlight. 

The lifetime of perovskite materials isn’t nearly this fleeting now, but it’s not clear that the problem has been entirely solved. 

There’s been some real-world testing of new perovskite solar materials, with mixed results. Oxford PV hasn’t published detailed data, though as CTO Chris Case told Nature last year, the company’s outdoor tests show that the best cells lose only about 1% of their efficiency in their first year of operation, a rate that slows down afterwards. 

Other testing in more intense conditions has produced less positive results: one academic study found that perovskite cells in hot and humid Saudi Arabia lost 20% of their efficiency after one year of operation. 
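
To see why those two reported loss rates lead to such different outcomes over a panel’s expected life, here is a quick sketch that simply compounds each rate over 30 years. It assumes, purely for illustration, that the annual loss stays constant (Oxford PV says its rate slows after the first year, so this overstates losses for that case).

```python
# Compound two reported annual efficiency-loss rates over a 30-year panel life.
# Assumes a constant annual rate, which is an illustrative simplification.

LIFETIME_YEARS = 30
for label, annual_loss in [("~1% per year (Oxford PV outdoor tests)", 0.01),
                           ("20% per year (hot, humid field study)", 0.20)]:
    remaining = (1 - annual_loss) ** LIFETIME_YEARS
    print(f"{label}: {remaining:.0%} of initial efficiency remains")
```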

Those results are for one year of testing. How can we tell what will happen in 30 years? 

Since we don’t have years to test every new material that scientists dream up, researchers often put them through especially punishing conditions in the lab, bumping up the temperature and shining bright lights onto panels to see how quickly they’ll degrade. 

This sort of testing is standard for silicon solar panels, which make up over 90% of the commercial solar market today. But researchers are still working out just how well the correlations with known tests will transfer to new materials like perovskites. 

One of the issues has been that light, moisture, and heat all contribute to the quick degradation of perovskites. But it hasn’t been clear exactly which factor, or combination of them, would be best to apply in the lab to measure how a solar panel would fare in the real world. 

One study, published last year in Nature, suggested that a combination of high temperature and illumination would be the key to accelerated tests that reliably predict real-world performance. The researchers found that high-temperature tests lasting just a few hundred hours (a couple of weeks) translated well to nearly six months of performance in outdoor testing. 
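
For a rough sense of the acceleration that study implies, here is the arithmetic, reading “a couple of weeks” as about 336 hours and “nearly six months” as about 4,300 hours; both conversions are assumptions here, not figures from the paper.

```python
# Implied acceleration factor: hours of outdoor exposure represented by
# each hour of high-temperature lab testing (rough, assumed conversions).

test_hours = 14 * 24          # "a couple of weeks" of accelerated testing
outdoor_hours = 6 * 30 * 24   # "nearly six months" outdoors
print(f"implied acceleration factor ≈ {outdoor_hours / test_hours:.0f}x")
```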

Companies say they’re bringing new solar materials to the market as soon as this year. Then we’ll start to really see just how well these tests predict new technologies’ ability to withstand the tough job a commercial solar panel needs to do. I know I’ll be watching. 

Related reading

Read more about why super-efficient tandem solar cells made our list of 10 Breakthrough Technologies in 2024 here.

Here’s a look inside the race to get these next-generation solar technologies into the world.

Perovskites have been hailed as the hot new thing in solar for years. What’s been the holdup? In short: stability, stability, stability. 

Photo illustration concept of virtual power plant, showing two power plant stacks with a glitch effect.

SARAH ROGERS/MITTR | GETTY

Explained

Welcome to the wonderful world of virtual power plants (VPPs). While they’re not physical facilities, VPPs could have actual benefits for emissions by stitching together different parts of the grid to help meet electricity demand. 

What exactly is a VPP? How does it work? What does this all mean for climate action? Get the answers to all these questions and more in my colleague June Kim’s latest story.

Two more things 

Scattering small particles in the upper levels of the atmosphere could help reflect sunlight, slowing down planetary warming. While this idea, called solar geoengineering, sounds farfetched, it’s possible that small efforts could get started within a decade, as David Keith and Wake Smith write in a new op-ed. 

Read more about how geoengineering could start, and what these experts are saying we need to do about it, here. 

The US is pausing exports of liquefied natural gas. The move was met with a wide range of reactions and plenty of questions about what it will mean for emissions. 

As Arvind Ravikumar writes in a new op-ed, people are asking all the wrong questions about LNG. Whether this is a good idea depends on what the fuel would be replacing. Read his full take here. 

Keeping up with climate  

In an age of stronger hurricanes, some scientists say our current rating system can’t keep up. Adding a Category 6 could help us designate super-powerful storms. (Inside Climate News)

→ Here’s what we know about hurricanes and climate change. (MIT Technology Review)

A fringe idea to put massive sunshades in space to cool down the planet is gaining momentum. Or we could, you know, stop burning fossil fuels? (New York Times)

Trains powered by hydrogen are starting to hit the rails. Here’s why experts say that might not be the best use for the fuel. (Canary Media)

According to the sponges, we’ve already sailed past climate goals. Scientists examining the skeletons of creatures called sclerosponges concluded that human-caused climate change has probably raised temperatures by 1.7 °C (3.1 °F) since the late 19th century. (New York Times)

A century-old law you’ve never heard of is slowing down offshore wind in the US. By requiring the use of US-built ships within the country’s waters, the Jones Act is behind some of the speed bumps facing the offshore wind industry. (Hakai Magazine)

→ Here’s what’s next for offshore wind, including when we can expect the first US-built ship to hit the waters. (MIT Technology Review)

Sorting recycling is a tough job, but AI might be able to help. New sorting systems could rescue more plastic from the landfill, though rolling out new technology to sorting facilities will be a challenge. (Washington Post)

How virtual power plants are shaping tomorrow’s energy system

MIT Technology Review Explains: Let our writers untangle the complex, messy world of technology to help you understand what’s coming next. You can read more from the series here.

For more than a century, the prevalent image of power plants has been characterized by towering smokestacks, endless coal trains, and loud spinning turbines. But the plants powering our future will look radically different—in fact, many may not have a physical form at all. Welcome to the era of virtual power plants (VPPs).

The shift from conventional energy sources like coal and gas to variable renewable alternatives such as solar and wind means the decades-old way we operate the energy system is changing. 

Governments and private companies alike are now counting on VPPs’ potential to help keep costs down and stop the grid from becoming overburdened. 

Here’s what you need to know about VPPs—and why they could be the key to helping us bring more clean power and energy storage online.

What are virtual power plants and how do they work?

A virtual power plant is a system of distributed energy resources—like rooftop solar panels, electric vehicle chargers, and smart water heaters—that work together to balance energy supply and demand on a large scale. They are usually run by local utility companies who oversee this balancing act.

A VPP is a way of “stitching together” a portfolio of resources that can help the grid respond to high energy demand while reducing the energy system’s carbon footprint, says Rudy Shankar, director of Lehigh University’s Energy Systems Engineering program.

The “virtual” nature of VPPs comes from their lack of a central physical facility, like a traditional coal or gas plant. By generating electricity and balancing the energy load, the aggregated batteries and solar panels provide many of the functions of conventional power plants.

They also have unique advantages.

Kevin Brehm, a manager at Rocky Mountain Institute who focuses on carbon-free electricity, says comparing VPPs to traditional plants is a “helpful analogy,” but VPPs “do certain things differently and therefore can provide services that traditional power plants can’t.”

One significant difference is VPPs’ ability to shape consumers’ energy use in real time. Unlike conventional power plants, VPPs can communicate with distributed energy resources and allow grid operators to control the demand from end users.

For example, smart thermostats linked to air conditioning units can adjust home temperatures and manage how much electricity the units consume. On hot summer days these thermostats can pre-cool homes before peak hours, when air conditioning usage surges. Staggering cooling times can help prevent abrupt demand hikes that might overwhelm the grid and cause outages. Similarly, electric vehicle chargers can adapt to the grid’s requirements by either drawing electricity or supplying it back. 
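
As a minimal sketch of that idea, the snippet below staggers the pre-cooling start times of a fleet of hypothetical smart thermostats and compares the resulting peak load with the uncoordinated case. Every number in it (fleet size, per-unit load, run time) is an illustrative assumption, not data from any utility program.

```python
# Toy model: staggering air-conditioner pre-cooling to flatten a demand peak.
# All parameters are illustrative assumptions.

AC_LOAD_KW = 3.0    # assumed draw of one home's air conditioner
HOMES = 10_000      # assumed number of enrolled homes
PEAK_HOUR = 17      # assumed 5 p.m. system peak

def aggregate_load(start_hours, runtime=2, hours=range(12, 22)):
    """Total AC load in each hour, given every home's cooling start hour."""
    load = {h: 0.0 for h in hours}
    for start in start_hours:
        for h in range(start, start + runtime):
            if h in load:
                load[h] += AC_LOAD_KW
    return load

# Uncoordinated: every unit switches on at the peak hour.
uncoordinated = aggregate_load([PEAK_HOUR] * HOMES)
# VPP-style: start times staggered across the early afternoon to pre-cool.
staggered = aggregate_load([13 + i % 4 for i in range(HOMES)])

print("peak load, uncoordinated:", max(uncoordinated.values()), "kW")
print("peak load, staggered:   ", max(staggered.values()), "kW")
```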

These distributed energy sources connect to the grid through communication technologies like Wi-Fi, Bluetooth, and cellular services. In aggregate, adding VPPs can increase overall system resilience. By coordinating hundreds of thousands of devices, VPPs have a meaningful impact on the grid—they shape demand, supply power, and keep the electricity flowing reliably.

How popular are VPPs now?

Until recently, VPPs were mostly used to control consumer energy use. But as solar and battery technology has evolved, utilities can now also use these distributed resources to supply electricity back to the grid when needed.

In the United States, the Department of Energy estimates VPP capacity at around 30 to 60 gigawatts. This represents about 4% to 8% of peak electricity demand nationwide, a minor fraction within the overall system. However, some states and utility companies are moving quickly to add more VPPs to their grids.
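
Those two figures imply a rough scale for the overall system; the back-of-the-envelope arithmetic below simply divides the cited capacity range by the cited share of peak demand.

```python
# Implied national peak demand from the DOE figures cited above.
for capacity_gw, share_of_peak in [(30, 0.04), (60, 0.08)]:
    print(f"{capacity_gw} GW at {share_of_peak:.0%} of peak "
          f"-> implied peak ≈ {capacity_gw / share_of_peak:.0f} GW")
```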

Green Mountain Power, Vermont’s largest utility company, made headlines last year when it expanded its subsidized home battery program. Customers have the option to lease a Tesla home battery at a discounted rate or purchase their own, receiving assistance of up to $10,500 if they agree to share stored energy with the utility as required. The Vermont Public Utility Commission, which approved the program, said it can also provide customers with emergency power during outages.

In Massachusetts, three utility companies (National Grid, Eversource, and Cape Light Compact) have implemented a VPP program that pays customers in exchange for utility control of their home batteries.

Meanwhile, in Colorado efforts are underway to launch the state’s first VPP system. The Colorado Public Utilities Commission is urging Xcel Energy, its largest utility company, to develop a fully operational VPP pilot by this summer.

Why are VPPs important for the clean energy transition?

Grid operators must meet the annual or daily “peak load,” the moment of highest electricity demand. To do that, they often resort to gas “peaker” plants, which sit dormant most of the year and are switched on only in times of high demand. VPPs could reduce the grid’s reliance on these plants.

The Department of Energy currently aims to expand national VPP capacity to 80 to 160 GW by 2030. That’s roughly equivalent to 80 to 160 fossil fuel plants that need not be built, says Brehm.

Many utilities say VPPs can lower energy bills for consumers in addition to reducing emissions. Research suggests that leveraging distributed sources during peak demand is up to 60% more cost effective than relying on gas plants.

Another significant, if less tangible, advantage of VPPs is that they encourage people to be more involved in the energy system. Usually, customers merely receive electricity. Within a VPP system, they both consume power and contribute it back to the grid. This dual role can improve their understanding of the grid and get them more invested in the transition to clean energy.

What’s next for VPPs?

The capacity of distributed energy sources is expanding rapidly, according to the Department of Energy, owing to the widespread adoption of electric vehicles, charging stations, and smart home devices. Connecting these to VPP systems enhances the grid’s ability to balance electricity demand and supply in real time. Better AI can also help VPPs become more adept at coordinating diverse assets, says Shankar.

Regulators are also coming on board. The National Association of Regulatory Utility Commissioners has started holding panels and workshops to educate its members about VPPs and how to implement them in their states. The California Energy Commission is set to fund research exploring the benefits of integrating VPPs into its grid system. This kind of interest from regulators is new but promising, says Brehm.

Still, hurdles remain. Enrolling in a VPP can be confusing for consumers because the process varies among states and companies. Simplifying it for people will help utility companies make the most of distributed energy resources such as EVs and heat pumps. Standardizing the deployment of VPPs can also speed up their growth nationally by making it easier to replicate successful projects across regions.

“It really comes down to policy,” says Brehm. “The technology is in place. We are continuing to learn about how to best implement these solutions and how to interface with consumers.”

Solar geoengineering could start soon if it starts small

For half a century, climate researchers have considered the possibility of injecting small particles into the stratosphere to counteract some aspects of climate change. The idea is that by reflecting a small fraction of sunlight back to space, these particles could partially offset the energy imbalance caused by accumulating carbon dioxide, thereby reducing warming as well as extreme storms and many other climate risks. 

Debates about this idea, a form of solar geoengineering called stratospheric aerosol injection (SAI), commonly focus either on small-scale outdoor research that seeks to understand the physical processes involved or on deployment at a climate-altering scale. The gulf between these is gigantic: an experiment might use mere kilograms of aerosol material whereas deployment that could substantially slow or even reverse warming would involve millions of metric tons per year—a billionfold difference in scale. Appreciably cooling the planet via SAI would also require a purpose-built fleet of high-altitude aircraft, which could take one or two decades to assemble. This long lead time encourages policymakers to ignore the hard decisions about regulating deployment of SAI. 

Such complacency is ill-advised. The barrier between research and deployment may be less distinct than is often assumed. Our analysis suggests a country or group of countries could conceivably start a subscale solar geoengineering deployment in as little as five years, one that would produce unmistakable changes in the composition of the stratosphere. A well-managed subscale deployment would benefit research by reducing important uncertainties about SAI, but it could not be justified as research alone—similar research could be carried out with a much smaller amount of aerosol particles. And it would have a non-negligible impact on the climate, providing as much cooling as sulfur pollution from international shipping did before the recent cleanup of shipping fuels. At the same time, the magnitude of the cooling would be small enough that its effects on climate, on a national or regional scale, would be very difficult to detect in the face of normal variability. 

While the climate impact of such a subscale deployment would be small (and most likely beneficial), the political impact could be profound. It could trigger a backlash that would upend climate geopolicy and threaten international stability. It could be an on-ramp to large-scale deployment. And it could be exploited by fossil fuel interests seeking to slow the essential task of cutting emissions. 

We oppose near-term deployment of solar geoengineering. In accord with the Climate Overshoot Commission, the most senior group of political leaders to examine the topic, we support a moratorium on deployment until the science is internationalized and critically assessed, and until some governance architecture is widely agreed upon. But if we are correct that such subscale deployments are plausible, then policymakers may need to confront solar geoengineering—its promise and disruptive potential, and its profound challenges to global governance—earlier than is now widely assumed. 

Obstacles to early deployment 

Humans already emit a huge quantity of aerosols into the troposphere (the turbulent lowest layer of the atmosphere) from sources such as shipping and heavy industry, but these aerosols fall to Earth or are removed by rainfall and other processes within about a week. Volcanic eruptions can have a more lasting effect. When eruptions are powerful enough to punch through the troposphere into the stratosphere, the aerosols deposited there can endure for roughly a year. SAI would, like the largest volcanic eruptions, inject aerosols or their precursors into the stratosphere. Given their vastly longer atmospheric endurance, aerosols placed there can have a cooling impact 100 times larger than they would if emitted at the surface. 

Getting aerosols to the stratosphere is another matter. Passenger jets routinely reach the lower stratosphere on transpolar flights. But to get efficient global coverage, aerosols are best deployed at low latitudes, where the stratosphere’s natural overturning circulation will carry them poleward and thus distribute them worldwide. The average height of the top of the troposphere is about 17 kilometers in the tropics, and models suggest injection needs to be a few kilometers higher than that to be captured in the upwelling stratospheric circulation. The altitude for efficient deployment is commonly assumed to be at least 20 kilometers, nearly twice the height at which commercial jets or large military aircraft cruise. 

Although small spy planes can cruise in this very thin air, they can carry only one to two metric tons of payload. That would be insufficient except for small-scale tests: offsetting a substantial fraction of global warming—say, 1 °C of cooling—would require platforms that could deliver several million metric tons per year of material to the stratosphere. Neither rockets nor balloons are suitable for hauling such a large mass to this high perch. Consequently, full-scale deployment would require a fleet of novel aircraft—a few hundred in order to achieve a 1 °C cooling target. Procuring just the first aircraft in the manner typical of large commercial or military aircraft development programs might take roughly a decade, and manufacturing the required fleet would take several years more. 

But starting with full-scale deployment is both imprudent and unlikely. Even if we are turning the global thermostat down, the faster we change the climate, the higher the risk of unforeseen impacts. A country or group of countries that wishes to deploy solar geoengineering is likely to appreciate the political and technical benefits of a slower start, one with a gradual reversal of warming that facilitates optimization and “learning by doing,” while minimizing the likelihood and impact of unintended consequences. 

We envision scenarios where, instead of attempting to inject aerosols in the most efficient way near the equator, a country or group of countries attempts to place a smaller amount of material in the lower stratosphere at higher latitudes. They could do this with existing aircraft, because the top of the troposphere slopes sharply downward as you move away from the equator. At 35° north and south, it is found at roughly 12 kilometers. Adding a 3-kilometer margin, an effective deployment altitude at 35° north and south would be 15 kilometers. This remains too high for airliners but is just below the 15.5-kilometer service ceiling of top-of-the-line business jets made by Gulfstream, Bombardier, and Dassault. The list of countries with territory at or near 35° north or south includes not only rich countries such as the US, Australia, Japan, South Korea, Spain, and China, but also poorer ones such as Morocco, Algeria, Iraq, Iran, Pakistan, India, Chile, and Argentina.

Subscale deployment

How might subscale deployment be accomplished? Most stratospheric scientific studies of aerosol injection assume the operative material is sulfur dioxide (SO2) gas, which is 50% sulfur by mass. Another plausible option is hydrogen sulfide (H2S), which cuts the mass requirement almost in half, though it is more hazardous to ground and flight crews than SO2 and thus might be eliminated from consideration. Carbon disulfide (CS2) gas cuts the mass requirement by 40% and is generally less hazardous than SO2. It is also possible to use elemental sulfur, which is the safest and easiest to handle, but this would require a method of combusting it on board before venting or the use of afterburners. No one has yet done the engineering studies required to determine which of these sulfur compounds would be the best choice. 
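
The mass comparisons above follow directly from stoichiometry. The sketch below checks them with standard molar masses; it is simple chemistry, not part of the engineering studies the authors call for.

```python
# Sulfur content of candidate carrier compounds, from standard molar masses.

S, O, H, C = 32.06, 16.00, 1.008, 12.01  # g/mol

compounds = {"SO2": (S + 2 * O, 1), "H2S": (2 * H + S, 1), "CS2": (C + 2 * S, 2)}

for name, (molar_mass, n_sulfur) in compounds.items():
    s_fraction = n_sulfur * S / molar_mass
    tons_per_ton_s = 1 / s_fraction             # carrier mass needed per ton of sulfur
    saving_vs_so2 = 1 - tons_per_ton_s / 2.0    # SO2 needs ~2 tons per ton of sulfur
    print(f"{name}: {s_fraction:.0%} sulfur by mass, {tons_per_ton_s:.2f} t per t S, "
          f"{saving_vs_so2:.0%} less carrier mass than SO2")
```

The output matches the text: SO2 is 50% sulfur by mass, H2S cuts the carrier mass by nearly half, and CS2 cuts it by about 40%.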

Using assumptions confirmed with Gulfstream, we estimate that any of its G500/600 aircraft could loft about 10 kilotons of material per year to 15.5 kilometers. If highly mass-efficient CS2 were used, a fleet of no more than 15 aircraft could carry up 100 kilotons of sulfur a year. Aged but operable used G650s cost about $25 million. Adding in the cost of modification, maintenance, spare parts, salaries, fuel, materials, and insurance, we expect the average total cost of a decade-long subscale deployment would be about $500 million a year. Large-scale deployment would cost at least 10 times as much.
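
A quick consistency check on that fleet size, using the roughly 10 kilotons of annual payload per aircraft estimated above and the stoichiometric sulfur content of CS2 (about 84% by mass, a calculated value rather than a figure given in the article):

```python
# How many aircraft does 100 kilotons of sulfur per year imply, if each can
# loft ~10 kilotons of CS2 annually? Rough check, not the authors' estimate.
import math

PAYLOAD_PER_AIRCRAFT_KT = 10     # from the estimate above
CS2_SULFUR_FRACTION = 0.84       # stoichiometric sulfur content of CS2 (assumed here)
TARGET_SULFUR_KT = 100

carrier_needed_kt = TARGET_SULFUR_KT / CS2_SULFUR_FRACTION
aircraft_needed = math.ceil(carrier_needed_kt / PAYLOAD_PER_AIRCRAFT_KT)
print(f"{carrier_needed_kt:.0f} kt of CS2 per year -> about {aircraft_needed} aircraft")
```

That lands at roughly a dozen aircraft, comfortably within the “no more than 15” stated above.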

How much is 100 kilotons of sulfur per year? It is a mere 0.3% of current global annual emissions of sulfur pollution into the atmosphere. Its contribution to the health impact of particulate air pollution would be substantially less than a tenth of what it would be if the same amount were emitted at the surface. As for its impact on climate, it would be about 1% of the sulfur injected in the stratosphere by the 1992 eruption of Mount Pinatubo in the Philippines. That well-studied event supports the assertion that no high-consequence unknown effects would occur. 
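
Working backwards from those percentages gives a sense of the reference points; the arithmetic below is just a rearrangement of the figures in this paragraph.

```python
# Reference points implied by the comparisons above.
subscale_kt_per_year = 100
global_sulfur_mt = subscale_kt_per_year / 0.003 / 1000   # 0.3% of global emissions
pinatubo_sulfur_mt = subscale_kt_per_year / 0.01 / 1000  # 1% of Pinatubo's injection
print(f"implied global sulfur pollution ≈ {global_sulfur_mt:.0f} Mt per year")
print(f"implied Pinatubo stratospheric sulfur ≈ {pinatubo_sulfur_mt:.0f} Mt")
```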

At the same time, 100 kilotons of sulfur per year is not insubstantial: it would be more than twice the natural background flux of sulfur from the troposphere into the stratosphere, absent unusual volcanic activity. The cooling effect would be enough to delay the global rise in temperature by about a third of a year, an offset that would last as long as the subscale deployment was maintained. And because solar geoengineering is more effective at countering the rise in extreme precipitation than the rise in temperature, the deployment would delay the increasing intensity of tropical cyclones by more than half a year. These benefits are not negligible to those most at risk from climate impacts (though none of these benefits would necessarily be apparent, given the climate system’s natural variability).

We should mention that our 100 kilotons per year scenario is arbitrary. We define a subscale deployment to mean a deployment large enough to substantially increase the amount of aerosol in the stratosphere while being well below the level that is required to delay warming by a decade. With that definition, such a deployment could be several times larger or smaller than our sample scenario. 

Of course no amount of solar geoengineering can eliminate the need to reduce the concentration of greenhouse gases in the atmosphere. At best, solar geoengineering is a supplement to emissions cuts. But even the subscale deployment scenario we consider here would be a significant supplement: over a decade, it would have approximately half the cooling effect as eliminating all emissions from the European Union. 

The politics of subscale deployment

The subscale deployment we’ve outlined here could serve several plausible scientific and technological goals. It would demonstrate the storage, lofting, and dispersion technologies for larger-scale deployment. If combined with an observational program, it would assess monitoring capabilities as well. It would directly clarify how sulfate is carried around the stratosphere and how sulfate aerosols interact with the ozone layer. After a few years of such a subscale deployment, we would have a far better understanding of the scientific and technological barriers to large-scale deployment. 

At the same time, subscale deployment would pose risks for the deployer. It could trigger political instability and invite retribution from other countries and international bodies that would not respond well to entities fiddling with the planet’s thermostat without global coordination and oversight. Opposition might stem from a deep-rooted aversion to environmental modification or from more pragmatic concerns that large-scale deployment would be detrimental to some regions. 

Deployers might be motivated by a wide range of considerations. Most obviously, a state or coalition of states might conclude that solar geoengineering could significantly reduce their climate risk, and that such a subscale deployment would strike an effective balance between the goals of pushing the world toward large-scale deployment and minimizing the risk of political backlash. 

The deployers could decide that a subscale project might make bigger interventions possible. While scientists may be comfortable drawing inferences about solar geoengineering from tiny experiments and models, politicians and the public may be very cautious about atmospheric interventions that can alter the climate system and affect all the creatures that dwell within it. A subscale deployment that encountered no major surprises could go a long way toward reducing extreme concerns about full-scale deployment. 

The deployers could also claim some limited benefit from the subscale deployment itself. While the effects would be too small to be readily evident on the ground, the methods used to attribute extreme weather events to climate change could substantiate claims of small reductions in the severity of such events. 

They might also argue that the deployment is simply restoring atmospheric protection that was recently lost. The reduction in sulfur emissions from ships is now saving lives by creating cleaner air, but it is also accelerating warming by thinning the reflective veil that such pollution created. The subscale scenario we sketched out would restore almost half of that sunshade protection, without the countervailing air pollution.  

The deployers might also convince themselves that their action was consistent with international law because they could perform deployment entirely within their domestic airspace and because the effects, while global, would not produce “significant transboundary harm,” the relevant threshold under customary international law. 

The governance implications of such a subscale deployment would depend on the political circumstances. If it were done by a major power without meaningful attempts at multilateral engagement, one would expect dramatic backlash. On the other hand, were deployment undertaken by a coalition that included highly climate-vulnerable states and that invited other states to join the coalition and develop a shared governance architecture, many states might be publicly critical but privately pleased that geoengineering reduced climate risks.   

SAI is sometimes described as an imaginary sociotechnical scenario residing in a distant sci-fi future. But it is technically feasible to start subscale deployments of the kind we describe here in five years. A state or coalition of states that wished to meaningfully test both the science and politics of deployment may consider such subscale or demonstration deployments as climate risks become more salient. 

We are not advocating for such action—in fact, we reiterate our support for a moratorium against deployment until the science is critically assessed and some governance architecture is widely agreed upon. Yet a sound understanding of the interlinked technology and politics of SAI is hampered by the perception that it must start with a significant effort that would substantially slow or even reverse warming. The example we’ve outlined here illustrates that the infrastructural barriers to deployment are more easily overcome than is commonly assumed. Policymakers must take this into account—and soon—as they consider how to develop solar geoengineering in the public interest and what guardrails should be put in place.

David W. Keith is a professor of geophysical sciences and founding faculty director of the Climate Systems Engineering initiative at the University of Chicago. 

Wake Smith is a lecturer at the Yale School of Environment and a research fellow at the Harvard Kennedy School.  

We thank Christian V. Rice of VPE Aerospace for performing the payload calculations herein. Please consult this PDF for more detail on our estimates.