Less than 1% of clothing is recycled, and most of the rest ends up dumped in a landfill or burned. A team of researchers hopes to change that with a new process that breaks down mixed-fiber clothing into reusable, recyclable parts without any sorting or separation in advance.
“We need a better way to recycle modern garments that are complex, because we are never going to stop buying clothes,” says Erha Andini, a chemical engineer at the University of Delaware and lead author of a study on the process, which is out today in Science Advances. “We are looking to create a closed-loop system for textile recycling.”
Many garments are made of a mix of natural and synthetic fibers. Once these fibers are combined, they are difficult to separate. This presents a problem for recycling, which often needs textiles to be sorted into uniform categories, similar to how we sort glass, aluminum, and paper.
To tackle this problem, Andini and her team used a solvent that breaks the chemical bonds in polyester fabric while leaving cotton and nylon intact. To speed up the process, they power it with microwave energy and add a zinc oxide catalyst. This combination reduces the breakdown time to 15 minutes, whereas traditional plastic recycling methods take over an hour. The polyester ultimately breaks down into BHET, an organic compound that can, in theory, be turned back into polyester. While similar methods have been used to recycle pre-sorted plastic, this is the first time they’ve been used to recycle mixed-fiber textiles without any sorting required.
In addition to speeding things up, the use of microwave energy reduces the technique’s carbon footprint, since the shorter processing time consumes less energy, says Andini.
Nevertheless, the process could be difficult to scale, says Bryan Vogt, a chemical engineer at Penn State University, who was not involved in the study. That’s because the solvent used to break down polyester is expensive and difficult to recover after use. Further, according to Andini, even though BHET is easily turned back into clothing, it’s less clear what to do with the leftover fibers. Nylon could be especially tricky, as the fabric is degraded significantly by the team’s chemical recycling technique.
“We are chemical engineers, so we think of this process as a whole,” says Andini. “Hopefully, once we are able to get pure components from each part, we can transform them back into yarn and make clothes again.”
Andini, who just received a fellowship for entrepreneurs, is developing a business plan to commercialize the process. In the coming years, she aims to launch a startup that will take the clothes recycling technique out of the lab and into the real world. That could be a significant step toward reducing the large amounts of textile waste in landfills. “It’ll be a matter of having the capital or not,” she says, “but we’re working on it and excited for it.”
Hydropower is the world’s leading source of renewable electricity, generating more power in 2022 than all other renewables combined. But while hydropower is helping clean up our electrical grid, it’s not always a positive force for fish.
Dams that create reservoirs on rivers can change habitats. And for some species, especially those that migrate long distances, hydropower facilities can create dangerous or insurmountable barriers. In some parts of the world, including the US, Canada, and Europe, governments have put safeguards in place to protect ecosystems from hydropower’s potential harms.
New environmental regulations can leave older facilities facing costly renovations or force them to shutter entirely. That’s a big problem, because pulling hydropower plants off the grid eliminates a flexible, low-emissions power source that can contribute to progress in fighting climate change. New technologies, including fish-safe turbines, could help utilities and regulators come closer to striking a balance between the health of river ecosystems and global climate goals.
That’s where companies like Natel Energy come in. The company started with two big goals: high performance and fish survival, says Gia Schneider, Natel’s cofounder and chief commercial officer.
The company is making new designs for the turbines that generate electricity in hydropower plants as rushing water spins their blades. Conventional turbine blades can move as fast as 30 meters per second, or about 60 to 70 miles per hour, Schneider says. When straight, thin edges are moving that quickly and striking fish, “it’s fairly obvious why that’s not a good outcome,” she says.
Natel’s turbine design focuses on preventing fast-moving equipment from making fatal contact with fish. The blades have a thicker leading edge that pushes water out in front of it, creating a stagnation zone, or “basically an airbag for fish,” Schneider says. The blades are also curved, so even if fish are struck, they don’t take a direct hit.
The company has tested its turbines with a range of species, including American eels, alewife, and rainbow trout. In the case of one recent study with American eels, scientists found that over 99% of eels survived after 48 hours of passing through Natel’s equipment. In comparison, one 2010 study found that just 40% of tagged European eels were able to pass through the turbines of a hydropower plant, though survival depended a lot on the size of both the eel and equipment in question.
Changing turbine designs won’t help fish survive all power plants: at some of the biggest plants with the tallest dams, rapid changes in water pressure can kill fish. But Schneider says that the company’s technology could be slotted into up to half of the existing US hydropower fleet to make plants more fish-safe.
Hydropower is one of the world’s oldest renewable energy sources. By 2030, more than 20% of the global fleet’s generating units will be more than 55 years old, according to the International Energy Agency. The average age of a hydropower plant in the US today is roughly 65 years.
In the US, privately held hydropower plants are licensed by an agency called the Federal Energy Regulatory Commission for a term of up to 50 years. Roughly 17 gigawatts’ worth of hydropower facilities (enough to power 13 million homes) are up for relicensing by 2035, according to the National Hydropower Association.
Since many of those facilities were started up, there have been significant changes to environmental requirements, and some plants may face high costs and difficult engineering work as they try to adhere to new rules and stay in operation. Adding screens to basically filter fish out of the intake for hydropower plants is one potential solution in some cases, but both installation and maintenance of such a system can add significant cost. In these facilities, Natel’s technology represents an alternative, Schneider says.
Natel has installed several projects in Maine, Oregon, and Austria. They all involve relatively small turbines, but the company is on its way to undertaking bigger projects and recently won a bid process with a manufacturing partner to supply a larger turbine, three meters in diameter, to an existing plant, Schneider says. The company is also licensing its fish-safe turbine designs to existing manufacturers.
Whether utilities move to adopt fish-safe design could depend on how it affects efficiency, or the amount of energy that can be captured by a given water flow. Natel’s turbine designs will, in some cases, be slightly less efficient than today’s conventional ones, Schneider says, though the difference is marginal, and they likely still represent an improvement over older designs.
While there’s sometimes a trade-off between fish-safe design and efficiency, that’s not always the case. A 2019 study from the US Army Corps of Engineers found that one new design improved fish safety while also producing more power.
Slotting new turbines into hydropower plants won’t solve all the environmental challenges associated with the technology, though. For example, the new equipment would only be relevant for downstream migration, like when eels move from freshwater rivers out into the ocean to reproduce. Other solutions would still be needed to allow a path for upstream migration.
Ideally, the best solution for many plants would likely be natural bypasses or ramps, which allow free passage of many species in both directions, says Ana T. Silva, a senior research scientist at the Norwegian Institute for Nature Research. However, because of space requirements, these can’t always be installed or used.
Natel CTO Abe Schneider holds a large trout used in fish passage testing at the Monroe Hydro Plant in Madras, Oregon.
People have been trying to improve fish passage for a long time, says Michael Milstein, a senior public affairs officer at NOAA Fisheries, part of the US National Oceanic and Atmospheric Administration. The solutions in place today include fish ladders, where fish swim or hop up through a series of successively higher pools to pass dams. Other dams are too tall for that, so fish are captured and loaded onto trucks to go around them.
The challenge, Milstein says, is that “every river is different, and every dam is different.” Solutions need to be adapted to each individual situation, he adds; fish-safe turbines would be most important when there’s no bypass and going through a facility is the only option fish have.
The issue of protecting ecosystems and providing safe passage for fish has sparked fierce debates over existing hydropower projects across the western US and around the world.
Even with the current state-of-the-art technology, “it’s not always possible to provide sufficient passage,” Milstein says. Several dams are currently being removed from the Klamath River in Oregon and Northern California because of the effects on local ecosystems. The dams drastically changed the river, wiping out habitat for local salmon, steelhead, and lamprey and creating ideal conditions for parasites to decimate fish populations.
But while hydropower facilities can have negative environmental impacts, climate change can also be extremely harmful to wildlife, Natel’s Schneider points out. If too many hydropower plants are shut down, it could leave a gap that keeps more fossil fuels on the grid, hampering efforts to address climate change.
Reducing hydropower plants’ impact on local environments could help ensure that more of them can stay online, generating renewable electricity that plays an important role in our electrical grid. “Fish-safe turbines won’t solve everything—there are many, many problems in our rivers,” Schneider says. “But we need to start tackling all of them, so this is one tool.”
This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.
How fish-safe hydropower technology could keep more renewables on the grid
Hydropower is the world’s leading source of renewable electricity, generating more power in 2022 than all other renewables combined. But while hydropower is helping clean up electrical grids, it’s not always great for fish. Dams can change their habitats. And for migratory species, hydropower facilities can create dangerous or insurmountable barriers.
That’s why, in some parts of the world, governments have put safeguards in place to protect ecosystems from hydropower’s potential harms. These can sometimes force older facilities to close, and that’s a big problem: pulling hydropower plants off the grid eliminates a flexible, low-emissions power source that can contribute to progress in fighting climate change.
But there’s some good news: new technologies, including fish-safe turbines, could help utilities and regulators come closer to striking a balance between the health of river ecosystems and global climate goals. Read the full story.
—Casey Crownhart
What it’s like to be a space debris engineer
Although significant attention has been devoted to launching spacecraft, the question of what to do with their remains has been largely ignored until recently. Satellites have simply been left in orbit at the ends of their lives, creating debris that must be monitored and, if possible, maneuvered around to avoid collisions.
But there are people working on cleaning Earth’s orbit up. Meet Stijn Lemmens. He’s a senior space debris mitigation analyst at the European Space Agency. Lemmens works on counteracting space pollution by collaborating with spacecraft designers and the wider industry to create missions less likely to clutter the orbital environment. Read all about him and his work.
—Elna Schütz
This story is from the latest issue of MIT Technology Review. Subscribe to read the whole thing, if you don’t already!
The must-reads
I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.
1 Apple is planning to bring AI features to the Vision Pro
It must be hoping that this could boost sales of the device, which have been disappointing so far. (Bloomberg $)
+ The Vision Pro is now on sale outside the US. (Ars Technica)

2 Detroit is changing how its police use facial recognition
It’s making the rules much stricter, after bad matches led to three wrongful arrests. (NYT $)
+ The movement to limit face recognition tech might finally get a win. (MIT Technology Review)

3 What is AI search good for?
Given the errors, it’s best to think of its answers as a starting point rather than the final word. (Vox)
+ Here’s why chatbots make things up—and why it’s such a deep-rooted problem. (MIT Technology Review)
+ OpenAI has built an AI tool that it says can spot hallucinations. (IEEE Spectrum)

4 Amazon plans to spend over $100 billion on data centers over the next decade
And yep, you guessed it: it’s all about meeting demand for AI tools. (WSJ $)
+ Amazon is copying Shein and Temu’s playbook, prioritizing cheapness over speed. (The Atlantic $)

5 Brazil’s Pantanal fire season is already breaking records
And it isn’t even meant to have started yet. (ABC)
+ How NASA is using AI and drones to tackle wildfires. (CNET)
+ Meet the scientists trying to understand the world’s worst wildfires. (MIT Technology Review)

6 Combined covid-flu vaccines are coming
Moderna has just completed successful phase III trials for the drug. (Nature)
+ The next generation of mRNA vaccines is on its way. (MIT Technology Review)

7 These parents are campaigning for a phone-free childhood
They’re trying to do the right thing—but the odds are painfully stacked against them. (The Guardian)
+ New York City plans to ban phones from schools. (NPR)

8 There’s a big problem with electric vehicles: buggy software
When you add more complexity, you add more points of failure. (The Verge)
+ How did China come to dominate the world of electric cars? (MIT Technology Review)

9 Hot AI Jesus is all over Facebook
And he appears to be astonishingly popular engagement bait. (The Atlantic $)

10 Tennis hopes to use video games to win over new fans
After all, it’s worked well as a strategy for soccer. (FT $)
Quote of the day
“I think we’re starting to increasingly lose touch with what an unedited face looks like.”
—Dr Kerry McInerney, a research associate at the University of Cambridge, tells CNN that AI is turbo-charging already-unrealistic beauty standards online.
The big story
A brief, weird history of brainwashing
April 2024
On a spring day in 1959, war correspondent Edward Hunter testified before a US Senate subcommittee investigating “the effect of Red China Communes on the United States.”
Hunter introduced a new concept to the American public: a supposedly scientific system for changing people’s minds, even making them love things they once hated.
Much of it was baseless, but Hunter’s sensational tales still became an important part of the disinformation and pseudoscience that fueled a “mind-control race” during the Cold War. US officials prepared themselves for a psychic war with the Soviet Union and China by spending millions of dollars on research into manipulating the human brain.
But while the science never exactly panned out, residual beliefs fostered by this bizarre conflict continue to play a role in ideological and scientific debates to this day. Read the full story.
—Annalee Newitz
We can still have nice things
A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)
When someone loses part of a leg, a prosthetic can make it easier to get around. But most prosthetics are static, cumbersome, and hard to move. A new neural interface connects a bionic limb to nerve endings in the thigh, allowing the limb to be controlled by the brain. The new device, which is described today in Nature Medicine, could help people with lower-leg amputations feel as if their prosthesis is part of them.
“When you ask a patient ‘What is your body?’ they don’t include the prosthesis,” says MIT biophysicist Hugh Herr, one of the lead authors on the study. The work is personal for him: he lost both his lower legs in a climbing accident when he was 17. He says linking the brain to the prosthesis can make it feel more like part of someone’s anatomy, which can have a positive emotional impact.
Getting the neural interface hooked up to a prosthetic takes two steps. First, patients undergo surgery. Following a lower leg amputation, portions of shin and calf muscle still remain. The operation connects shin muscle, which contracts to make the ankle flex upward, to calf muscle, which counteracts this movement. The prosthetic can also be fitted at this point. Reattaching the remnants of these muscles can enable the prosthetic to move more dynamically. It can also reduce phantom limb pain, and patients are less likely to trip and fall.
“The surgery stands on its own,” says Amy Pietrafitta, a para-athlete who received it in 2018. “I feel like I have my leg back.” But natural movements are still limited when the prosthetic isn’t connected to the nervous system.
In step two, surface electrodes measure nerve activity from the brain to the calf and shin muscles, indicating an intention to move the lower leg. A small computer in the bionic leg decodes those nerve signals and moves the leg accordingly, allowing the patient to move the limb more naturally.
“If you have intact biological limbs, you can walk up and down steps, for example, and not even think about it. It’s involuntary,” says Herr. “That’s the case with our patients, but their limb is made of titanium and silicone.”
The authors compared the mobility of seven patients using a neural interface with that of patients who had not received the surgery. Patients using the neural interface could walk 41% faster and climb sloped surfaces and steps. They could also dodge obstacles more nimbly and had better balance. And they described feeling that the prosthetic was truly a part of their body rather than just a tool that they used to get around.
“It’s a very forward-thinking approach,” says Hamid Charkhkar, a biomedical engineer at Case Western Reserve University, who was not involved in the study. “Our limbs are not like shoes. They’re not worn over our bodies. They are integrally attached to our bodies via bones, muscles, and nerves.”
There are limitations. The surgery can be done during amputation or several years later, but it won’t work equally well for every patient. If it’s done later, for example, some people’s upper thigh muscles could have atrophied too severely for them to receive the full benefits.
The surgery connecting the shin and calf muscles has become the standard of care at Brigham and Women’s Hospital in Boston. But the surface electrodes that give patients full neural control of their limbs are a few years away from being clinically implemented. Plus, the neural interfaces have only been used in laboratory settings, and it will be important to know how they hold up in the real world.
Herr and his team at MIT hope to provide users with even greater control over their prosthetic limbs. In the future, their efforts will likely involve replacing the surface electrodes with magnetic spheres, which can more accurately track muscle dynamics.
“The goal that we have is to really reconstruct bodies, to rebuild bodies,” says Herr. And to fully achieve that ambition, he says, “neural integration and embodiment is our long-term goal.”
This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.
This week I’ve been thinking about thought. It was all brought on by reading my colleague Niall Firth’s recent cover story about the use of artificial intelligence in video games. The piece describes how game companies are working to incorporate AI into their products to create more immersive experiences for players.
These companies are applying large language models to generate new game characters with detailed backstories—characters that could engage with a player in any number of ways. Enter in a few personality traits, catchphrases, and other details, and you can create a background character capable of endless unscripted, never-repeating conversations with you.
This is what got me thinking. Neuroscientists and psychologists have long been using games as research tools to learn about the human mind. Numerous video games have been either co-opted or especially designed to study how people learn, navigate, and cooperate with others, for example. Might AI video games allow us to probe more deeply, and unravel enduring mysteries about our brains and behavior?
I decided to call up Hugo Spiers to find out. Spiers is a neuroscientist at University College London who has been using a game to study how people find their way around. In 2016, Spiers and his colleagues worked with Deutsche Telekom and the games company Glitchers to develop Sea Hero Quest, a mobile video game in which players have to navigate a sea in a boat. They have since been using the game to learn more about how people lose navigational skills in the early stages of Alzheimer’s disease.
The use of video games in neuroscientific research kicked into gear in the 1990s, Spiers tells me, following the release of 3D games like Wolfenstein 3D and Duke Nukem. “For the first time, you could have an entirely simulated world in which to test people,” he says.
Scientists could observe and study how players behaved in these games: how they explored their virtual environment, how they sought rewards, how they made decisions. And research volunteers didn’t need to travel to a lab—their gaming behavior could be observed from wherever they happened to be playing, whether that was at home, at a library, or even inside an MRI scanner.
For scientists like Spiers, one of the biggest advantages of using games in research is that people want to play them. The use of games allows scientists to explore fundamental experiences like fun and curiosity. Researchers often offer a small financial incentive to volunteers who take part in their studies. But they don’t have to pay people to play games, says Spiers.
You’re much more likely to have fun if you’re motivated. It’s just not quite the same when you’re doing something purely for the money. And not having to pay participants allows researchers to perform huge studies on smaller budgets. Spiers has been able to collect data on over 4 million people from 195 countries, all of whom have willingly played Sea Hero Quest.
AI could help researchers go even further. A rich, immersive world filled with characters that interact in realistic ways could help them study how our minds respond to various social settings and how we relate to other individuals. By observing how players interact with AI characters, scientists can learn more about how we cooperate—and compete—with others. It would be far cheaper and easier than hiring actors to engage with research volunteers, says Spiers.
Spiers himself is interested in learning how people hunt, whether for food, clothes, or a missing pet. “We still use these bits of our brain that our ancestors would have used daily, and of course some traditional communities still hunt,” he tells me. “But we know almost nothing about how the brain does this.” He envisions using AI-driven nonplayer characters to learn more about how humans cooperate for hunting.
There are other, newer questions to explore. At a time when people are growing attached to “virtual companions,” and an increasing number of AI girlfriends and boyfriends are being made available, AI video-game characters could also help us understand these novel relationships. “People are forming a relationship with an artificial agent,” says Spiers. “That’s inherently interesting. Why would you not want to study that?”
Now read the rest of The Checkup
Read more from MIT Technology Review’s archive:
My fellow London-based colleagues had a lot of fun generating an AI game character based on Niall. He turned out to be a sarcastic, smug, and sassy monster.
Google DeepMind has developed a generative AI model that can generate a basic but playable video game from a short description, a hand-drawn sketch, or a photo, as my colleague Will Heaven wrote earlier this year. The resulting games look a bit like Super Mario Bros.
Today’s world is undeniably gamified, argues Bryan Gardiner. He explores how we got here in another article from the Play issue of the magazine.
Large language models behave in unexpected ways. And no one really knows why, as Will wrote in March.
Technologies can be used to study the brain in lots of different ways—some of which are much more invasive than others. Tech that aims to read your mind and probe your memories is already being used, as I wrote in a previous edition of The Checkup.
From around the web:
Bad night of sleep left you needing a pick-me-up? Scientists have designed an algorithm to deliver tailored sleep-and-caffeine-dosing schedules to help tired individuals “maximize the benefits of limited sleep opportunities and consume the least required amount of caffeine.” (Yes, it may have been developed with the US Army in mind, but surely we all stand to benefit?) (Sleep)
Is dog cloning a sweet way to honor the memory of a dearly departed pet, or a “frivolous and wasteful and ethically obnoxious” pursuit in which humans treat living creatures as nothing more than their own “stuff”? This feature left me leaning toward the latter view, especially after learning that people tend to like having dogs with health problems … (The New Yorker)
States that have enacted the strongest restrictions to abortion access have also seen prescriptions for oral contraceptives plummet, according to new research. (Mother Jones)
And another study has linked Texas’s 2021 ban on abortion in early pregnancy with an increase in the number of infant deaths recorded in the state. In 2022, across the rest of the US, the number of infant deaths ascribed to anomalies present at birth decreased by 3.1%. In Texas, this figure increased by 22.9%. (JAMA Pediatrics)
We are three months into the bird flu outbreak in US dairy cattle. But the country still hasn’t implemented a sufficient testing infrastructure and doesn’t fully understand how the virus is spreading. (STAT)
This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.
How AI video games can help reveal the mysteries of the human mind
Video gaming companies are applying large language models to generate new game characters with detailed backstories—characters that could engage with a player in any number of ways. Enter in a few personality traits, catchphrases, and other details, and you can create a character capable of endless unscripted, never-repeating conversations with you. (You can read our story all about that here.)
Beyond gaming, however, it’s a development that raises a tantalizing prospect: might AI video games allow neuroscientists and psychologists to probe more deeply, and unravel enduring mysteries about our brains and behavior? Our senior reporter Jessica Hamzelou decided to find out. Here’s what she learned.
This story is from The Checkup, our weekly newsletter all about biotech and health. Sign up to receive it in your inbox every Thursday.
Inside the US government’s brilliantly boring websites
Before the internet, Americans may have interacted with the federal government by stepping into grand buildings adorned with impressive stone columns and gleaming marble floors.
Today, the neoclassical architecture of those physical spaces has been (at least partially) replaced by the digital architecture of website design—HTML code, tables, forms, and buttons.
There are about 26,000 federal websites in the US. And for a long time, they were buggy or poorly designed. That all started changing in 2014, when President Obama created two new teams to help improve government tech. Read about what they’ve achieved since.
This story is from the latest issue of MIT Technology Review, which explores the theme of Play. Subscribe to read the whole thing, if you don’t already!
The must-reads
I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.
1 Trump-Biden debate conspiracies are already all over the internet
And plenty of them are being pushed by Trump himself. (Wired $)
+ Election misinformation is being repeated by AI tools like ChatGPT and Copilot too. (NBC)
+ Spare a thought for pollsters. Their job is only getting harder and harder these days. (Ars Technica)

2 The voices of AI can tell us a lot
It’s new technology, but stereotypes of a compliant, endlessly empathetic female assistant are as old as it gets. (NYT $)

3 An effort is underway to encourage responsible use of AI in music
But of course, it relies on getting enough adoption—and that’s a big ask. (CNET)
+ Especially as there’s a giant legal battle underway over getting AI companies to pay to use records for training data. (MIT Technology Review)
+ Content-licensing sellers have formed the first AI dataset trade body. (Reuters $)
+ Time is the latest publisher to strike a licensing deal with OpenAI. (Axios)

4 We’re getting a better idea of how weight loss drugs work
Researchers have zeroed in on two groups of neurons in the brain that seem to regulate the feeling of fullness. (Nature)

5 Google says Gemini AI is 20% faster than ChatGPT
And execs say it can now cite its sources, which is arguably even more important. (Quartz $)
+ It’s not just Nvidia: here are the AI stocks to watch. (WP $)

6 Amazon is investigating AI search startup Perplexity
Over whether it violated its rules by scraping its websites. (Wired $)
+ Perplexity’s CEO openly admitted to some pretty dodgy data practices when the company was getting off the ground. (404 Media)

7 ISS astronauts had to take shelter after a Russian satellite disintegrated
It broke up into over 100 pieces, raising speculation it could’ve been subject to an anti-satellite missile test. (Gizmodo)
+ Why the first-ever space junk fine is such a big deal. (MIT Technology Review)

8 A lot of Gen Zs describe themselves as content creators
Passively lurking online is just not the vibe anymore. (WP $)

9 Would you clone your dog?
It’d set you back $50,000—and in a way, you have to ask what you’re really getting for that. (New Yorker $)
+ These scientists are working to extend the life span of pet dogs—and their owners. (MIT Technology Review)

10 Why the internet’s going wild for Nerds Gummy Clusters
No joke—people are getting tattoos. (Slate $)
Quote of the day
“Let’s not go overboard on this. Datacentres are, in the most extreme case, a 6% addition [in energy demand] but probably only 2% to 2.5%. The question is, will AI accelerate a more than 6% reduction? And the answer is: certainly.”
—Bill Gates claims AI will be more of a help than a hindrance in achieving climate goals, amid rising concern about its energy footprint, The Guardian reports.
The big story
Inside NASA’s bid to make spacecraft as small as possible
NASA/JPL-CALTECH
October 2023
Since the 1970s, we’ve sent a lot of big things to Mars. But when NASA successfully sent twin Mars Cube One spacecraft, the size of cereal boxes, in November 2018, it was the first time we’d ever sent something so small.
Just making it this far heralded a new age in space exploration. NASA and the community of planetary science researchers caught a glimpse of a future long sought: a pathway to much more affordable space exploration using smaller, cheaper spacecraft. Read the full story.
—David W. Brown
We can still have nice things
A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)
+ The Bear probably wouldn’t exist if it weren’t for the late, great Bourdain.
+ Exhausted? Remember your energy is a finite resource. Use it wisely.
+ Always late to everything? This has to be one of the funniest excuses I’ve heard yet.
AI music is suddenly in a make-or-break moment. On June 24, Suno and Udio, two leading AI music startups that make tools to generate complete songs from a prompt in seconds, were sued by major record labels. Sony Music, Warner Music Group, and Universal Music Group claim the companies made use of copyrighted music in their training data “at an almost unimaginable scale,” allowing the AI models to generate songs that “imitate the qualities of genuine human sound recordings.”
Two days later, the Financial Times reported that YouTube is pursuing a comparatively aboveboard approach. Rather than training AI music models on secret data sets, the company is reportedly offering unspecified lump sums to top record labels in exchange for licenses to use their catalogues for training.
In response to the lawsuits, both Suno and Udio released statements mentioning efforts to ensure that their models don’t imitate copyrighted works, but neither company has specified whether their training sets contain them. Udio said its model “has ‘listened’ to and learned from a large collection of recorded music,” and two weeks before the lawsuits, Suno CEO Mikey Shulman told me its training set is “both industry standard and legal” but the exact recipe is proprietary.
While the ground here is changing fast, none of these moves should be all that surprising: litigious training-data battles have become something like a rite of passage for generative AI companies. The trend has led many of those companies, including OpenAI, to pay for licensing deals while the cases unfold.
However, the stakes are higher for AI music than for image generators or chatbots. Generative AI companies working in text or photos have options to work around lawsuits; for example, they can cobble together open-source corpuses to train models. In contrast, music in the public domain is much more limited (and not exactly what most people want to listen to).
Other AI companies can also more easily cut licensing deals with interested publishers and creators, of which there are many; but rights in music are far more concentrated than those in film, images, or text, industry experts say. They’re largely managed by the three biggest record labels—the new plaintiffs—whose publishing arms collectively own more than 10 million songs and much of the music that has defined the last century. (The filing names a long list of artists who the labels allege were wrongfully included in training data, ranging from ABBA to those on the Hamilton soundtrack.)
On top of all this, it’s also just more difficult to create music worth listening to—generating a readable poem or passable illustration with AI is one technical challenge, but infusing a model with the taste required to create music we like is another.
It’s of course possible that the AI companies will win the case, and none of this will matter; they would have carte blanche to train on a century of copyrighted music. But experts say the case from the record labels is strong, and it’s more likely that AI companies will soon have to pay up—and pay a lot—if they want to survive. If a court were to rule that AI music companies could not train for free on these labels’ catalogues, then expensive licensing deals, like the one YouTube is reportedly pursuing, would seem to be the only path forward. This would effectively ensure that the company with the deepest pockets ends up on top.
More than any training-data case yet, the outcome of this one will determine the shape of a big slice of AI—and whether there is a future for it at all.
Merits of the case
Suno’s music generator has been public for less than a year, but the company has already garnered 12 million users, a $125 million funding round last month, and a partnership with Microsoft Copilot. Udio is even newer to the scene, having launched in April with $10 million in seed funding from musician-investors like will.i.am and Common.
The record labels allege that both of the startups are engaging in copyright infringement on the training and the output sides of their models.
“The plaintiffs here have the best odds of almost anyone suing an AI company,” says James Grimmelmann, a professor of digital and information law at Cornell Law School. He draws comparisons to the ongoing New York Times case against OpenAI, which he says offered, until now, the best example of a rights holder with a strong case against an AI company. But the suit against Suno and Udio “is worse for a bunch of reasons.”
The Times has accused OpenAI of copyright infringement in its model training by using the publication’s articles without consent. Grimmelmann says OpenAI has a bit of plausible deniability in this accusation, because the company could say that it scraped much of the internet for a training corpus and copies of New York Times articles appeared in places without the company’s knowledge.
For Suno and Udio, that defense is far less believable. “This is not like, ‘We scraped the web for all audio and we couldn’t tell the commercially produced songs apart from everything else,’” Grimmelmann says. “It’s pretty clear that they had to have been pulling in large databases of commercial recordings.”
In addition to complaints about training, the new case alleges that tools like Suno and Udio are more imitative than generative AI, meaning that their output mimics the style of artists and songs protected by copyright.
While Grimmelmann notes that the Times cited examples in which ChatGPT reproduced entire copies of its articles, record labels claim they were able to generate problematic responses from the AI music models with much simpler prompts. For instance, prompting Udio with “my tempting 1964 girl smokey sing hitsville soul pop,” the plaintiffs say, yielded a song that “any listener familiar with the Temptations would instantly recognize as resembling the copyrighted sound recording ‘My Girl.’” (The court documents include links to examples on Udio, but the songs appear to have been removed.) The plaintiffs mention similar examples from Suno, including an ABBA-adjacent song called “Prancing Queen” that was generated with the prompt “70s pop” and the lyrics for “Dancing Queen.”
What’s more, Grimmelmann explains, there is more copyrightable information in a song than a news article. “There’s just a lot more information density in capturing the way that Mariah Carey’s voice works than there is in words,” he says, which is perhaps part of the reason past lawsuits navigating music copyright have sometimes been so drawn-out and complex.
In a statement, Shulman wrote that Suno prioritizes originality and that the model is “designed to generate completely new outputs, not to memorize and regurgitate preexisting content.” He added, “That is why we don’t allow user prompts that reference specific artists.” Udio’s statement similarly mentioned “state-of-the-art filters to ensure our model does not reproduce copyrighted works or artists’ voices.”
Indeed, the tools will block a request if it names an artist. But the record labels allege that the safeguards have significant loopholes. Following the news of the lawsuits, for instance, social media users shared examples suggesting that if users separate an artist’s name with spaces, the request may go through. My own request for “a song like Kendrick” was blocked by Suno, citing an artist’s name, but “a song like k e n d r i c k” resulted in a “hip-hop rhythmic beat-driven” track and “a song like k o r n” resulted in “nu-metal heavy aggressive.” (To be fair, they didn’t resemble the respective artists’ unique styles, but to even respond in the right tightly defined genre seems to suggest that the model is in fact familiar with each artist’s work.) Similar workarounds were blocked on Udio.
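That spacing trick illustrates a classic weakness of keyword blocklists: a filter that checks for an artist’s name as a literal substring misses the same name with separators inserted, while normalizing the prompt first catches it. The sketch below is purely illustrative — the blocklist entries and function names are hypothetical, and neither Suno nor Udio has published how its actual filters work.

```python
import re

# Hypothetical blocklist entries, for illustration only.
BLOCKED_ARTISTS = {"kendrick", "korn"}

def naive_filter(prompt: str) -> bool:
    """Reject the prompt only if a blocked name appears verbatim."""
    lowered = prompt.lower()
    return any(name in lowered for name in BLOCKED_ARTISTS)

def normalized_filter(prompt: str) -> bool:
    """Strip every non-letter first, so 'k e n d r i c k' collapses
    back to 'kendrick' before the blocklist check."""
    collapsed = re.sub(r"[^a-z]", "", prompt.lower())
    return any(name in collapsed for name in BLOCKED_ARTISTS)

naive_filter("a song like k e n d r i c k")       # False: spaced name slips through
normalized_filter("a song like k e n d r i c k")  # True: normalization catches it
```

The trade-off is that aggressive normalization raises false positives (innocent phrases can collapse into a blocked name), which is one reason real moderation systems layer fuzzier matching and output-side checks on top of simple prompt screens.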
Possible outcomes
There are three ways the case could go, Grimmelmann says. One is wholly in favor of the AI startups: the lawsuits fail and the court determines that companies did not violate fair use or imitate copyrighted works too closely in their outputs. If the models are found to fall under fair use, it would mean songwriters and rights holders would need to find a different legal mechanism to pursue compensation.
Another possibility is a mixed bag: the court finds the AI companies did not violate fair use in their training but must better control their models’ output to make sure it does not improperly imitate copyrighted works. Grimmelmann says this would be similar to one of the initial rulings against Napster, in which the company was forced to ban searches for copyrighted works in its libraries (though users quickly found workarounds).
The third and essentially nuclear option is that the court finds fault on both the training and the output sides of the AI models. This would mean the companies could not train on copyrighted works without licenses, and also could not allow outputs that closely imitate copyrighted works. The companies could be ordered to pay damages for infringement, which could run into the hundreds of millions for each company. If they aren’t bankrupted by such a ruling, it would force them to completely restructure their training through licensing deals, which could also be cost-prohibitive.
COURTESY SUNO.AI
To license or not to license
Though the immediate goals of the plaintiffs are to get the AI companies to cease training and pay damages, the chairman of the Recording Industry Association of America, Mitch Glazier, is already looking ahead toward a future of licensing. “As in the past, music creators will enforce their rights to protect the creative engine of human artistry and enable the development of a healthy and sustainable licensed market that recognizes the value of both creativity and technology,” he wrote in a recent op-ed in Billboard.
Such a market for licenses could mirror what has already unfolded for text generators. OpenAI has struck licensing deals with a number of news publishers, including Politico, the Atlantic, and the Wall Street Journal. The deals promise to make content from the publishers discoverable in OpenAI’s products, though the ability for the models to transparently cite where they’re getting information from is limited at best.
If AI music companies follow that pattern, the only ones with the means to create powerful music models might be those with the most cash. That’s perhaps exactly what YouTube is thinking. The company did not immediately respond to questions from MIT Technology Review about the details of its negotiations, but given the massive amount of data required to train AI models and the concentration of rights owners in music, it’s fair to assume the price of deals with record labels would be eye-popping.
In theory, an AI company could bypass the licensing process altogether by building its model exclusively on music in the public domain, but it would be a Herculean task. There have been similar efforts in the realm of text and image generation, including a legal consultancy in Chicago that created a model trained on dense regulatory documents, and a model from Hugging Face that trained on images of Mickey Mouse from the 1920s. But the models are small and unremarkable. If Suno or Udio is forced to train on only what’s in the public domain—think military march music and the royalty-free songs found in corporate videos—the resulting model would be a far cry from what they have today.
If AI companies do move forward with licensing agreements, negotiations may be tricky, says Grimmelmann. Music licensing is complicated by the fact that two different copyrights are at play: one for the song, which generally covers the composition, like the music and lyrics, and one for the master, which covers the recording—like what you’d hear if you streamed the song.
Some artists, like Taylor Swift and Frank Ocean, have come to own the masters of their catalogues after drawn-out legal battles, and would therefore be in the driver’s seat for any potential licensing deal. Many others, though, retain only the song copyright, while the record labels retain the masters. In these cases, the record label might theoretically be able to grant AI companies a license to use the music without an artist’s permission—but at the risk of burning relationships with artists and sparking more legal battles.
The question of whether to license their music to such companies has divided musician groups. In contract rules adopted in April by SAG-AFTRA, which represents recording artists as well as actors, AI clones of member voices are allowed, though there are minimum rates for compensation. Back in December, a group called the Indie Musicians Caucus expressed frustrations that the leading instrumental musicians’ union, the 70,000-member American Federation of Musicians (AFM), was not doing enough to protect its rank and file against AI companies in contracts. The caucus wrote that it would vote against any agreement “obligating AFM members to dig [their] own graves by participating—without a right to consent, compensation, or credit—in the training of our permanent Generative AI replacements.”
But at this point, AFM does not appear eager to facilitate any deals. I asked Kenneth Shirk, international secretary-treasurer at AFM, whether he thought musicians should engage with AI companies and push to be fairly compensated, whatever that means, or instead resist licensing deals completely.
“Looking at those questions makes me think, would you rather have a swarm of fire ants crawling all over you, or roll around in a bed of broken glass?” he told me. “We want musicians to get paid. But we also want to ensure that there’s a career in music to be had for those that are going to come after us.”
This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.
Some people track sports scores or their favorite artists’ tour set lists. Meanwhile, I’m just waiting to hear which climate tech startups are getting big funding awards from government agencies. It’s basically the same thing.
Every few years, the US agency that’s often called the “energy moonshot factory” announces such awards for a few companies to help them scale up their technology. (The agency’s official name is the Advanced Research Projects Agency—Energy, or ARPA-E.) The grants are designed to help companies take their tech from the lab or pilot stage and get it out into the world.
The latest batch of these awards was just announced, totaling over $63 million split between four companies. Let’s dig into the winners and consider what each one’s technology says about their respective corners of climate action.
Antora Energy: Heat batteries for industry
Let’s start with the company you’re most likely to know if you follow this newsletter: Antora Energy. The California-based company is building thermal batteries for use in heavy industry. I covered the company and its first pilot project last year, and thermal batteries were the readers’ choice winner on our list of Breakthrough Technologies this year.
In case you need a quick refresher, the basic idea behind Antora’s technology is to store energy from cheap, clean wind and solar power in the form of heat, and then use that heat in industrial facilities. It’s an elegant solution to the problem that renewables are available only sometimes, while industry needs clean energy all the time if it wants to cut its carbon emissions, which amount to a whopping 30% of the global total.
Antora was awarded $14.5 million to scale its technology. One thing the company hopes to achieve with the cash influx is progress on its second product, which delivers not only heat but also electricity.
Queens Carbon: Lower-emissions cement
Cement is a climate villain hiding in plain sight, as I’ve covered in this newsletter before. Producing the gray slabs that scaffold our world accounts for about 7% of global emissions.
The challenge in cleaning up the process lies, at least in part, in the fact that lava-hot temperatures are required to kick off the chemical reactions that make cement—I’m talking over 1,500 °C (2,700 °F).
Queens Carbon developed a new process that cuts down the temperature needed to under 540 °C (1,000 °F). Still toasty, but easier to reach efficiently and with electricity, the company’s CEO, CTO, and cofounder Daniel Kopp said on a press call about the awards. Ideally, that electricity will be supplied with renewables, which could mean big emissions savings.
Queens Carbon will also pocket $14.5 million, and the funding should help with the construction of a pilot plant currently being built in partnership with a major cement producer, Kopp said on the press call. The company plans to scale up to a full-size plant in late 2028 or 2029.
Ion Storage Systems: Next-generation batteries for EVs
The world is always clamoring for better batteries, and Maryland-based Ion Storage Systems wants to deliver with its solid-state lithium-metal technology.
We named lithium-metal batteries one of our 10 Breakthrough Technologies in 2021. The chemistry could deliver higher energy density, meaning longer range in EVs.
Ion Storage Systems is planning to produce its batteries first for military customers. With the funding ($20 million worth), the company may be able to get its tech ready for larger-scale production for the wider customer base of the electric-vehicle market.
I was really interested to hear about the emphasis on manufacturing from CTO Greg Hitz on the press call, as scaling up manufacturing has been a major challenge for other companies trying to build solid-state batteries. Hitz also said that the company’s batteries don’t need to be squeezed at high pressure within cells or heated up, and they can be more simply integrated into battery packs.
AeroShield Materials: High-tech insulation for more efficient buildings
Last but certainly not least is AeroShield Materials. Between 30% and 40% of the energy we put into our buildings for heating and cooling is lost through windows and doors—that’s about $40 billion per year for residential buildings, said Elise Strobach, the company’s CEO and cofounder, on the press call.
AeroShield is making materials called aerogels that are clear, lightweight, and fire resistant. They can help make windows 65% more energy efficient, Strobach says.
Insulation isn’t always the most exciting topic, but efficiency is one of the best ways to cut down the need for more energy and provide a straightforward way to slash emissions. AeroShield is starting with windows and doors but plans to explore other projects like retrofitting windows and producing insulation for freezer and refrigerator doors, Strobach said on the call. The $14.5 million award will help build a pilot manufacturing facility.
These projects cover a huge range of businesses, from transportation and buildings to heavy industry. The one thing they have in common? All urgently need to clean up their act if the world is going to address climate change. Each of these awards is a big vote of confidence from an agency that’s had a lot of experience in energy technology—but what really matters is what these companies do with the money now.
There’s a growing pool of money for scientists exploring whether we can reflect away more sunlight to ease warming caused by climate change.
Quadrature Climate Foundation is among the organizations providing millions of dollars for research into solar geoengineering. This sort of funding can help scientists pursue lab work, modeling, and maybe even outdoor experiments that could improve our understanding of the often controversial field.
For more on where the money is coming from and how this might affect our efforts to address climate change, check out my colleague James Temple’s story here.
One more issue
We often talk about tech that’s serious business—but technology also has a huge effect on how we have fun. That’s the idea behind our latest print edition, the Play issue.
For the issue, I wrote about board games that take on the topic of climate change. Are they accurate about the challenge ahead, and crucially, can they be fun? Check out my take here. (For a more in-depth look at one particular game, a new climate-themed Catan, give this newsletter a read.)
I’d also highly recommend this feature from my colleague Eileen Guo, who looked into the growing business of surf pools—facilities that bring a usually ocean-based activity onto land. She gave one a spin, and considered how these spots affect places facing water scarcity.
The whole issue is great—find all the stories here.
Keeping up with climate
A new startup will take sodium sulfate, a waste material from manufacturing lithium-ion batteries, and turn it into chemicals that can go into new batteries. Aepnus Technology calls its approach a “fully circular” one. (Heatmap)
Solugen just scored a loan worth over $200 million from the US Department of Energy. The company uses biology to make chemicals used in industries from agriculture to concrete. (C&EN News)
Some Olympic teams, including the delegation from the US, plan to bring their own air conditioners to the Paris games this summer. It could be a big setback for the event’s climate goals. (Associated Press)
Advanced recycling promises an almost miraculous solution to our plastics crisis, but a close look at the industry reveals some problems. Very little plastic is made with these methods, and the industry is selling them on the basis of some tricky accounting. (ProPublica)
You may not know the name Yet-Ming Chiang, but you’ve probably heard of some of the companies he’s had a hand in starting, including Sublime Systems and Form Energy. Learn more about this MIT professor and serial entrepreneur here. (Cipher)
Running Tide had grand plans to suck carbon dioxide out of the atmosphere with the help of the ocean. Now, the startup is shutting down. Here’s what the company’s implosion means for carbon removal’s future. (Latitude Media)
→ The company was in some rocky waters a couple of years ago, as my colleague James Temple revealed at the time. (MIT Technology Review)
Volkswagen is investing $1 billion in the EV startup Rivian. The deal has the two companies creating a joint venture, and it could provide a path forward for Rivian, which has faced some struggles getting its vehicles to market. (TechCrunch)
This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.
Training AI music models is about to get very expensive
AI music is suddenly in a make-or-break moment. On June 24, Suno and Udio, two startups that let you generate complete songs from a prompt in seconds, were sued by major record labels. The labels alleged the startups had used copyrighted music as training data “at an almost unimaginable scale”.
Just two days later, the Financial Times reported that YouTube is pursuing a comparatively above-board approach. Rather than training AI music models on secret data sets, the company is reportedly offering unspecified lump sums to top record labels in exchange for licenses to use their catalogs for training data.
While the ground here is moving fast, none of these moves should be all that surprising: litigious training-data battles have become something like a rite of passage for generative AI companies. The trend has led many to pay for licensing deals while the cases unfold.
Every few years, the US agency that’s often called the “energy moonshot factory” announces big funding awards for a few companies to help them scale up their technology. (The agency’s official name is the Advanced Research Projects Agency—Energy, or ARPA-E.)
The grants are designed to help companies take their tech from the lab or pilot stage and get it out into the world. The latest batch of these awards was just announced, totaling over $63 million split between four companies. Read our story that digs into the winners and examines what each one’s technology says about their respective corners of climate action.
—Casey Crownhart
This story is from The Spark, our weekly newsletter giving you the inside track on all things climate tech. Sign up to receive it in your inbox every Wednesday.
Lego bricks are making science more accessible
Etienne Boulter walked into his lab at the Université Côte d’Azur in Nice, France, one morning with a Lego Technic excavator set tucked under his arm. His plan was simple yet ambitious: to use the pieces of the set to build a mechanical cell stretcher.
Boulter and his colleagues study mechanobiology—the way things like stretching or compression affect cells—and this piece of equipment is essential for his research. Commercial cell stretchers cost over $50,000. But one day, after playing with the Lego set, Boulter and his colleagues found a way to build one out of its components for only a little over $200.
Their Lego system stretches a silicone plate where cells are growing. This process causes the cells to deform and mimics how our own skin cells stretch. And Boulter is not alone. In fact, he’s one of many researchers turning to Lego components to build inexpensive yet extremely effective lab equipment. Read the full story.
—Elizabeth Fernandez
This story is from the latest issue of MIT Technology Review, which explores the theme of Play.
The must-reads
I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.
1 The Supreme Court ruled the White House can contact social media firms
It’s a blow for right-wing campaigners who argue their views are being censored online. (WP $)
+ Here’s what it means for the election. (NPR)
+ Russian propagandists are promoting deepfakes of Biden. (Wired $)
2 How AI has revolutionized protein science
And the most exciting part? We’re really only at the beginning of discovering what machine learning could unlock. (Quanta $)
+ Google DeepMind’s new AlphaFold can model a much larger slice of biological life. (MIT Technology Review)
3 Inside California’s green energy revolution
The state is showing how you can run a thriving modern economy on clean energy. (New Yorker $)
4 Toys ‘R’ Us used OpenAI’s video AI system Sora to make a commercial
It’s a milestone for the use of AI in video production—but the response to it was very mixed. (NBC)
+ I tested out a buzzy new text-to-video AI model from China. (MIT Technology Review)
5 Secret Telegram channels are providing refuge for LGBTQ+ people in Russia
Up to and including advice on how to leave the country, which is becoming less and less safe. (Wired $)
6 We really need AI to be able to cite its sources
The trouble is, even if it could, would they be factually accurate? (The Atlantic $)
+ At least 10% of scientific research may already be co-authored by AI. (The Economist $)
7 Consultants are raking it in thanks to the AI boom
But of course they are. (NYT $)
8 It’s become worryingly normalized to snoop on your partner’s online life
Yet it’s still a really, really bad idea. (WP $)
9 Lawn Mowing Simulator is the latest anti-escapist video game
Struggling to see the appeal personally, but hey, each to their own. (The Guardian)
10 McDonald’s has rejected plant-based burgers
After tests of its McPlant burger in San Francisco and Dallas failed. (Quartz $)
+ Here’s what a lab-grown burger tastes like. (MIT Technology Review)
Quote of the day
“There’s no question that this crosses a line that they hadn’t previously crossed. I think that suggests that the lines are becoming meaningless.”
—Darren Linvill, a founder of the Media Forensics Hub at Clemson University, tells the New York Times that aggressively targeting a US-based Chinese dissident’s 16-year-old daughter online represents a new low for the country’s security services.
The big story
Think that your plastic is being recycled? Think again.
MICHAEL BYERS
October 2023
The problem of plastic waste hides in plain sight, a ubiquitous part of our lives we rarely question. But a closer examination of the situation is shocking.
To date, humans have created around 11 billion metric tons of plastic, the vast majority of which ends up in landfills or the environment. Only 9% of the plastic ever produced has been recycled.
To make matters worse, plastic production is growing dramatically; in fact, half of all plastics in existence have been produced in just the last two decades.
So what do we do? Sadly, solutions such as recycling and reuse aren’t equal to the scale of the task. The only answer is drastic cuts in production in the first place. Read the full story.
—Douglas Main
We can still have nice things
A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)
+ Enjoy these award-winning black-and-white photos.
+ Is owning a pet good for you? On balance, it seems so!
+ I just learned that there’s more than one type of aurora.
+ ’Tis the season for potato salad, and this recipe is so good.
Stijn Lemmens has a cleanup job like few others. A senior space debris mitigation analyst at the European Space Agency (ESA), Lemmens works on counteracting space pollution by collaborating with spacecraft designers and the wider industry to create missions less likely to clutter the orbital environment.
Although significant attention has been devoted to launching spacecraft into space, the idea of what to do with their remains has been largely ignored. Many previous missions did not have an exit strategy. Instead of being pushed into orbits where they could reenter Earth’s atmosphere and burn up, satellites were simply left in orbit at the ends of their lives, creating debris that must be monitored and, if possible, maneuvered around to avoid a collision. “For the last 60 years, we’ve been using [space] as if it were an infinite resource,” Lemmens says. “But particularly in the last 10 years, it has become rather clear that this is not the case.”
Engineering the ins and outs: Step one in reducing orbital clutter—or, colloquially, space trash—is designing spacecraft that safely leave space when their missions are complete. “I thought naïvely, as a student, ‘How hard can that be?’” says Lemmens. The answer turned out to be more complicated than he expected.
At ESA, he works with scientists and engineers on specific missions to devise good approaches. Some incorporate propulsion that works reliably even decades after launch; others involve designing systems that can move spacecraft to keep them from colliding with other satellites and with space debris. They also work on plans to get the remains through the atmosphere without large risks to aviation and infrastructure.
Standardizing space: Earth’s atmosphere exerts a drag on satellites that will eventually pull them out of orbit. National and international guidelines recommend that satellites lower their altitude at the end of their operational lives so that they will reenter the atmosphere and make this possible. Previously the goal was for this to take 25 years at most; Lemmens and his peers now suggest five years or less, a time frame that would have to be taken into account from the start of mission planning and design.
Explaining the need for this change in policy can feel a bit like preaching, Lemmens says, and it’s his least favorite part of the job. It’s a challenge, he says, to persuade people not to think of the vastness of space as “an infinite amount of orbits.” Without change, the amount of space debris may create a serious problem in the coming decades, cluttering orbits and increasing the number of collisions.
Shaping the future: Lemmens says his wish is for his job to become unnecessary in the future, but with around 11,500 satellites and over 35,000 debris objects being tracked, and more launches planned, that seems unlikely to happen.
Researchers are looking into more drastic changes to the way space missions are run. We might one day, for instance, be able to dismantle satellites and find ways to recycle their components in orbit. Such an approach isn’t likely to be used anytime soon, Lemmens says. But he is encouraged that more spacecraft designers are thinking about sustainability: “Ideally, this becomes the normal in the sense that this becomes a standard engineering practice that you just think of when you’re designing your spacecraft.”