Here’s why we don’t have a cold vaccine. Yet.

For those of us in the Northern Hemisphere, it’s the season of the sniffles. As the weather turns, we’re all spending more time indoors. The kids have been back at school for a couple of months. And cold germs are everywhere.

My youngest started school this year, and along with artwork and seedlings, she has also been bringing home lots of lovely bugs to share with the rest of her family. As she coughed directly into my face for what felt like the hundredth time, I started to wonder if there was anything I could do to stop this endless cycle of winter illnesses. We all got our flu jabs a month ago. Why couldn’t we get a vaccine to protect us against the common cold, too?

Scientists have been working on this for decades. It turns out that creating a cold vaccine is hard. Really hard.

But not impossible. There’s still hope. Let me explain.

Technically, colds are infections that affect your nose and throat, causing symptoms like sneezing, coughing, and generally feeling like garbage. Unlike some other infections—covid-19, for example—they aren’t defined by the specific virus that causes them.

That’s because there are a lot of viruses that cause colds, including rhinoviruses, adenoviruses, and even seasonal coronaviruses (they don’t all cause covid!). Within those virus families, there are many different variants.

Take rhinoviruses, for example. These viruses are thought to be behind most colds. They’re human viruses—over the course of evolution, they have become perfectly adapted to infecting us, rapidly multiplying in our noses and airways to make us sick. There are around 180 rhinovirus variants, says Gary McLean, a molecular immunologist at Imperial College London in the UK.

Once you factor in the other cold-causing viruses, there are around 280 variants all told. That’s 280 suspects behind the cough that my daughter sprayed into my face. It’s going to be really hard to make a vaccine that will offer protection against all of them.

The second challenge lies in the prevalence of those variants.

Scientists tailor flu and covid vaccines to whatever strain happens to be circulating. Months before flu season starts, the World Health Organization advises countries on which strains their vaccines should protect against. Early recommendations for the Northern Hemisphere can be based on which strains seem to be dominant in the Southern Hemisphere, and vice versa.

That approach wouldn’t work for the common cold, because all those hundreds of variants are circulating all the time, says McLean.

That’s not to say that people haven’t tried to make a cold vaccine. There was a flurry of interest in the 1960s and ’70s, when scientists made valiant efforts to develop vaccines for the common cold. Sadly, they all failed. And we haven’t made much progress since then.

In 2022, a team of researchers reviewed all the research that had been published up to that year. They identified only one clinical trial—and it was conducted back in 1965.

Interest has certainly died down since then, too. Some question whether a cold vaccine is even worth the effort. After all, most colds don’t require much in the way of treatment and don’t last more than a week or two. There are many, many more dangerous viruses out there we could be focusing on.

And while cold viruses do mutate and evolve, no one really expects them to cause the next pandemic, says McLean. They’ve evolved to cause mild disease in humans—something they’ve been doing successfully for a long, long time. Flu viruses—which can cause serious illness, disability, or even death—pose a much bigger risk, so they probably deserve more attention.

But colds are still irritating, disruptive, and potentially harmful. Rhinoviruses are considered to be the leading cause of human infectious disease. They can cause pneumonia in children and older adults. And once you add up doctor visits, medication, and missed work, the economic cost of colds is pretty hefty: a 2003 study put it at $40 billion per year for the US alone.

So it’s reassuring that we needn’t abandon all hope: Some scientists are making progress! McLean and his colleagues are working on ways to prepare the immune systems of people with asthma and lung diseases to potentially protect them from cold viruses. And a team at Emory University has developed a vaccine that appears to protect monkeys from around a third of rhinoviruses.

There’s still a long way to go. Don’t expect a cold vaccine to materialize in the next five years, at least. “We’re not quite there yet,” says Michael Boeckh, an infectious-disease researcher at Fred Hutch Cancer Center in Seattle, Washington. “But will it at some point happen? Possibly.”

At the end of our Zoom call, perhaps after reading the disappointed expression on my sniffling, cold-riddled face (yes, I did end up catching my daughter’s cold), McLean told me he hoped he was “positive enough.” He admitted that he used to be more optimistic about a cold vaccine. But he hasn’t given up hope. He’s even running a trial of a potential new vaccine in people, although he wouldn’t reveal the details.

“It could be done,” he said.

An AI app to measure pain is here

How are you feeling?

I’m genuinely interested in the well-being of all my treasured Checkup readers, of course. But this week I’ve also been wondering how science and technology can help answer that question—especially when it comes to pain.

In the latest issue of MIT Technology Review magazine, Deena Mousa describes how an AI-powered smartphone app is being used to assess how much pain a person is in.

The app, and other tools like it, could help doctors and caregivers. They could be especially useful in the care of people who aren’t able to tell others how they are feeling.

But they are far from perfect. And they open up all kinds of thorny questions about how we experience, communicate, and even treat pain.

Pain can be notoriously difficult to describe, as almost anyone who has ever been asked to do so will know. At a recent medical visit, my doctor asked me to rank my pain on a scale from 1 to 10. I found it incredibly difficult to do. A 10, she said, meant “the worst pain imaginable,” which brought back unpleasant memories of having appendicitis.

A short while before the problem that brought me in, I’d broken my toe in two places, which had hurt like a mother—but less than appendicitis. If appendicitis was a 10, breaking a toe was an 8, I figured. If that was the case, maybe my current pain was a 6. As a pain score, it didn’t sound as bad as I actually felt. I couldn’t help wondering if I might have given a higher score if my appendix were still intact. I wondered, too, how someone else with my medical issue might score their pain.

In truth, we all experience pain in our own unique ways. Pain is subjective, and it is influenced by our past experiences, our moods, and our expectations. The way people describe their pain can vary tremendously, too.

We’ve known this for ages. In the 1940s, the anesthesiologist Henry Beecher noted that wounded soldiers were much less likely to ask for pain relief than similarly injured people in civilian hospitals. Perhaps they were putting on a brave face, or maybe they just felt lucky to be alive, given their circumstances. We have no way of knowing how much pain they were really feeling.

Given this messy picture, I can see the appeal of a simple test that can score pain and help medical professionals understand how best to treat their patients. That’s what is being offered by PainChek, the smartphone app Deena wrote about. The app works by assessing small facial movements, such as lip raises or brow pinches. The user then fills in a separate checklist to record other signs of pain the patient might be displaying. It seems to work well, and it is already being used in hospitals and care settings.
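To make that idea concrete, here is a minimal sketch of how a hybrid score along these lines might work, combining automated facial-action flags with a caregiver checklist. The item names, scoring, and severity bands are my own illustrative assumptions, not PainChek’s actual (proprietary) model.

```python
# Illustrative sketch of a hybrid pain score: automated facial-action
# flags plus a caregiver checklist. All item names, scoring, and bands
# are hypothetical assumptions, not PainChek's actual model.

FACIAL_ACTIONS = ["brow_pinch", "lip_raise", "eye_tightening"]       # detected automatically
CHECKLIST_ITEMS = ["groaning", "guarding_movement", "restlessness"]  # entered by a carer

def pain_score(facial_flags: dict[str, bool], checklist: dict[str, bool]) -> int:
    """Count the signs of pain that are present (one point each)."""
    present = [facial_flags.get(a, False) for a in FACIAL_ACTIONS]
    present += [checklist.get(i, False) for i in CHECKLIST_ITEMS]
    return sum(present)

def pain_band(score: int) -> str:
    """Map the raw count onto coarse bands (thresholds are made up)."""
    if score == 0:
        return "no pain"
    return "mild" if score <= 2 else "moderate" if score <= 4 else "severe"

score = pain_score(
    facial_flags={"brow_pinch": True, "lip_raise": True},
    checklist={"groaning": True},
)
print(score, pain_band(score))  # 3 moderate
```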

But the app is judged against subjective reports of pain. It might be useful for assessing the pain of people who can’t describe it themselves—because they have dementia, for example—but it won’t add much to assessments from people who can already communicate their pain levels.

There are other complications. Say a test could spot that a person was experiencing pain. What can a doctor do with that information? Perhaps prescribe pain relief—but most of the pain-relieving drugs we have were designed to treat acute, short-term pain. If a person is grimacing from a chronic pain condition, the treatment options are more limited, says Stuart Derbyshire, a pain neuroscientist at the National University of Singapore.

The last time I spoke to Derbyshire was back in 2010, when I covered work by researchers in London who were using brain scans to measure pain. That was 15 years ago. But pain-measuring brain scanners are yet to become a routine part of clinical care.

That brain-scan approach was also built on subjective pain reports. Those reports are, as Derbyshire puts it, “baked into the system.” It’s not ideal, but when it comes down to it, we must rely on these wobbly, malleable, and sometimes incoherent self-descriptions of pain. It’s the best we have.

Derbyshire says he doesn’t think we’ll ever have a “pain-o-meter” that can tell you what a person is truly experiencing. “Subjective report is the gold standard, and I think it always will be,” he says.

Take our quiz: How much do you know about antimicrobial resistance?

This week we had some terrifying news from the World Health Organization: Antibiotics are failing us. A growing number of bacterial infections aren’t responding to these medicines—including common ones that affect the blood, gut, and urinary tract. Get infected with one of these bugs, and there’s a fair chance antibiotics won’t help. 

The scary truth is that a growing number of harmful bacteria and fungi are becoming resistant to drugs. Just a few weeks ago, the US Centers for Disease Control and Prevention published a report finding a sharp rise in infections caused by a dangerous type of bacteria that are resistant to some of the strongest antibiotics. Now, the WHO report shows that the problem is surging around the world.

In this week’s Checkup, we’re trying something a bit different—a little quiz. You’ve probably heard about antimicrobial resistance (AMR) before, but how much do you know about microbes, antibiotics, and the scale of the problem? Here’s our attempt to put the “fun” in “fundamental threat to modern medicine.” Test your knowledge below!

This test could reveal the health of your immune system

Attentive readers might have noticed my absence over the last couple of weeks. I’ve been trying to recover from a bout of illness.

It got me thinking about the immune system, and how little I know about my own immune health. The vast array of cells, proteins, and biomolecules that works to defend us from disease is mind-bogglingly complicated. Immunologists are still getting to grips with how it all works.

Those of us who aren’t immunologists are even more in the dark. I had my flu jab last week and have no idea how my immune system responded. Will it protect me from the flu virus this winter? Is it “stressed” from whatever other bugs it has encountered in the last few months? And since my husband had his shot at the same time, I can’t help wondering how our responses will compare. 

So I was intrigued to hear about a new test that is being developed to measure immune health. One that even gives you a score.

Writer David Ewing Duncan hoped that the test would reveal more about his health than any other he’d ever taken. He described the experience in a piece published jointly by MIT Technology Review and Aventine.

The test David took was developed by John Tsang at Yale University and his colleagues. The team wanted to work out a way of measuring how healthy a person’s immune system might be.

It’s a difficult thing to do, for several reasons. First, there’s the definition of “healthy.” I find it’s a loose concept that becomes more complicated the more you think about it. Yes, we all have a general sense of what it means to be in good health. But is it just the absence of disease? Is it about resilience? Does it have something to do with withstanding the impact of aging?

Tsang and his colleagues wanted to measure “deviation from health.” They looked at blood samples from 228 people who had immune diseases caused by single-gene mutations, as well as 42 other people who were free from disease. Together, those individuals spanned a spectrum from ill health to good health.

Another major challenge lies in trying to capture the complexity of the immune system, which involves hundreds of proteins and cells interacting in various ways. (Side note: Last year, MIT Technology Review recognized Ang Cui at Harvard University as one of our Innovators under 35 for her attempts to make sense of it all using machine learning. She created the Immune Dictionary to describe how hundreds of proteins affect immune cells—something she likens to a “periodic table” for the immune system.)

Tsang and his colleagues tackled this by running a series of tests on those blood samples. The vast scope of these tests is what sets them apart from the blood tests you might get during a visit to the doctor. The team looked at how genes were expressed by cells in the blood. They measured a range of immune cells and more than 1,300 proteins.

The team members used machine learning to find correlations between these measurements and health, allowing them to create an immune health score for each of the volunteers. They call it the immune health metric, or IHM.
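For readers who want a feel for the general recipe, here is a toy sketch: take a matrix of blood measurements and labels marking who is disease-free, then let a simple classifier’s predicted probability of “healthy” serve as a score. The synthetic data and the choice of logistic regression are my assumptions for illustration; the published IHM pipeline is far more sophisticated.

```python
# Toy sketch of deriving an immune-health-style score from blood data.
# The synthetic data and model choice are illustrative assumptions;
# the published IHM pipeline is far more sophisticated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Pretend data: rows are volunteers, columns are immune measurements
# (cell frequencies, protein levels, gene-expression summaries).
X = rng.normal(size=(270, 50))
y = np.array([0] * 228 + [1] * 42)  # 0 = immune disease, 1 = disease-free

X_std = StandardScaler().fit_transform(X)
model = LogisticRegression(max_iter=1000).fit(X_std, y)

# The fitted probability of being "healthy" acts as a 0-to-1 score.
scores = model.predict_proba(X_std)[:, 1]
print(scores[:5].round(2))
```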

When they used this approach to calculate immune scores for people who had already volunteered in other studies, they found that the IHM seemed to align with other measures of health, such as how people respond to diseases, treatments, and vaccines. The study was published in the journal Nature Medicine last year.

The researchers behind it hope that a test like this could one day help identify people who are at risk of cancer and other diseases, or explain why some people respond differently to treatments or immunizations.

But the test isn’t ready for clinical use. If, like me, you’re finding yourself curious to know your own IHM, you’ll just have to wait.

A pivotal meeting on vaccine guidance is underway—and former CDC leaders are alarmed

This week has been an eventful one for America’s public health agency. Two former leaders of the US Centers for Disease Control and Prevention explained the reasons for their sudden departures from the agency in a Senate hearing. And they described how CDC employees are being instructed to turn their backs on scientific evidence.

The CDC’s former director Susan Monarez and former chief medical officer Debra Houry took questions from a Senate committee on Wednesday. They painted a picture of a health agency in turmoil—and at risk of harming the people it is meant to serve.

On Thursday, an advisory CDC panel that develops vaccine guidance met for a two-day discussion on multiple childhood vaccines. During the meeting, which was underway as The Checkup went to press, members of the panel were set to discuss those vaccines and propose recommendations on their use.

Monarez worries that access to childhood vaccines is under threat—and that the public health consequences could be dire. “If vaccine protections are weakened, preventable diseases will return,” she said.

As the current secretary of health and human services, Robert F. Kennedy Jr. oversees federal health and science agencies that include the CDC, which monitors and responds to threats to public health. Part of that role involves developing vaccine recommendations.

As we’ve noted before, RFK Jr. has long been a prominent critic of vaccines. He has incorrectly linked commonly used ingredients to autism and made other incorrect statements about risks associated with various vaccines.

Still, he oversaw the recruitment of Monarez—who does not share those beliefs—to lead the agency. When she was sworn in on July 31, Monarez, who is a microbiologist and immunologist, had already been serving as acting director of the agency. She had held prominent positions at other federal agencies and departments too, including the Advanced Research Projects Agency for Health (ARPA-H) and the Biomedical Advanced Research and Development Authority (BARDA). Kennedy described her as “a public health expert with unimpeachable scientific credentials.”

His opinion seems to have changed somewhat since then. Just 29 days after Monarez took on her position, she was turfed out of the agency. And in yesterday’s hearing, she explained why.

On August 25, Kennedy asked Monarez to do two things, she said. First, he wanted her to commit to firing scientists at the agency. And second, he wanted her to “pre-commit” to approving vaccine recommendations made by the agency’s Advisory Committee on Immunization Practices (ACIP), regardless of whether there was any scientific evidence to support those recommendations. “He just wanted blanket approval,” she said during her testimony.

She refused both requests.

Monarez testified that she didn’t want to get rid of hardworking scientists who played an important role in keeping Americans safe. And she said she could not commit to approving vaccine recommendations without reviewing the scientific evidence behind them and still maintain her integrity. She was sacked.

Those vaccine recommendations are currently under discussion, and scientists like Monarez are worried about how they might change. Kennedy fired all 17 members of the previous committee in June. (Monarez said she was not consulted on the firings and found out about them through media reports.)

“A clean sweep is needed to reestablish public confidence in vaccine science,” Kennedy wrote in a piece for the Wall Street Journal at the time. He went on to replace those individuals with eight new members, some of whom have been prominent vaccine critics and have spread misinformation about vaccines. One later withdrew.

That new panel met two weeks later. The meeting included a presentation about thimerosal—a chemical that Kennedy has incorrectly linked to autism, and which is no longer included in vaccines in the US—and a proposal to recommend that the MMRV vaccine (for measles, mumps, rubella, and varicella) not be offered to children under the age of four.

Earlier this week, five new committee members were named. They include individuals who have advocated against vaccine mandates and who have argued that mRNA-based covid vaccines should be removed from the market.

All 12 members are convening for a meeting that runs today and tomorrow. At that meeting, members will propose recommendations for the MMRV vaccine and vaccines for covid-19 and hepatitis B, according to an agenda published on the CDC website.

Those are the recommendations for which Monarez says she was asked to provide “blanket approval.” “My worst fear is that I would then be in a position of approving something that reduces access [to] lifesaving vaccines to children and others who need them,” she said.

That job now goes to Jim O’Neill, the deputy health secretary and acting CDC director (and a longevity enthusiast), who holds the authority to approve those recommendations.

We don’t yet know what those recommendations will be. But if they are approved, they could reshape access to vaccines for children and vulnerable people in the US. As six former chairs of the committee wrote for STAT: “ACIP is directly linked to the Vaccines for Children program, which provides vaccines without cost to approximately 50% of children in the US, and the Affordable Care Act that requires insurance coverage for ACIP-recommended vaccines to approximately 150 million people in the US.”

Drops in vaccine uptake have already contributed to this year’s measles outbreak in the US, which is the biggest in decades. Two children have died. We are seeing the impact of undermined trust in childhood vaccines. As Monarez put it: “The stakes are not theoretical.”

We can’t “make American children healthy again” without tackling the gun crisis

Note for readers: This newsletter discusses gun violence, a raw and tragic issue in America. It was already in progress on Wednesday when a school shooting occurred at Evergreen High School in Colorado and Charlie Kirk was shot and killed at Utah Valley University. 

Earlier this week, the Trump administration’s Make America Healthy Again movement released a strategy for improving the health and well-being of American children. The report was titled—you guessed it—Make Our Children Healthy Again.

Robert F. Kennedy Jr., who leads the Department of Health and Human Services, and his colleagues are focusing on four key aspects of child health: diet, exercise, chemical exposure, and overmedicalization.

Anyone who’s been listening to RFK Jr. posturing on health and wellness won’t be surprised by these priorities. And the first two are pretty obvious. On the whole, American children should be eating more healthily. And they should be getting more exercise.

But there’s a glaring omission. The leading cause of death for American children and teenagers isn’t ultraprocessed food or exposure to some chemical. It’s gun violence.

Yesterday’s news of yet more high-profile shootings at a school and a university throws this disconnect into even sharper relief. Experts believe it is time to treat gun violence in the US as what it is: a public health crisis.

I live in London, UK, with my husband and two young children. We don’t live in a particularly fancy part of the city—in one recent ranking of London boroughs from most to least posh, ours came in at 30th out of 33. I do worry about crime. But I don’t worry about gun violence.

That changed when I temporarily moved my family to the US a couple of years ago. We rented the ground-floor apartment of a lovely home in Cambridge, Massachusetts—a beautiful area with good schools, pastel-colored houses, and fluffy rabbits hopping about. It wasn’t until after we’d moved in that my landlord told me he had guns in the basement.

My daughter joined the kindergarten of a local school that specialized in music, and we took her younger sister along to watch the kids sing songs about friendship. It was all so heartwarming—until we noticed the school security officer at the entrance carrying a gun.

Later in the year, I received an email alert from the superintendent of the Cambridge Public Schools. “At approximately 1:45 this afternoon, a Cambridge Police Department Youth Officer assigned to Cambridge Rindge and Latin School accidentally discharged their firearm while using a staff bathroom inside the school,” the message began. “The school day was not disrupted.”

These experiences, among others, truly brought home to me the cultural differences over firearms between the US and the UK (along with most other countries). For the first time, I worried about my children’s exposure to them. I banned my children from accessing parts of the house. I felt guilty that my four-year-old had to learn what to do if a gunman entered her school. 

But it’s the statistics that are the most upsetting.

In 2023, 46,728 people died from gun violence in the US, according to a report published in June by the Johns Hopkins Bloomberg School of Public Health. That includes both homicides and suicides, and it breaks down to 128 deaths per day, on average. The majority of those who die from gun violence are adults. But the figures for children are sickening, too. In 2023, 2,566 young people died from gun violence. Of those, 234 were under the age of 10.

Gun death rates among children have more than doubled since 2013. Firearms are involved in more child deaths than cancer or car crashes.

Many other children survive gun violence with nonfatal—but often life-changing—injuries. And the impacts are felt beyond those who are physically injured. Witnessing gun violence or hearing gunshots can understandably cause fear, sadness, and distress.  

That’s worth bearing in mind when you consider that there have been 434 school shootings in the US since Columbine in 1999. The Washington Post estimates that 397,000 students have experienced gun violence at school in that period. Another school shooting took place at Evergreen High School in Colorado on Wednesday, adding to that total.

“Being indirectly exposed to gun violence takes its toll on our mental health and children’s ability to learn,” says Daniel Webster, Bloomberg Professor of American Health at the Johns Hopkins Center for Gun Violence Solutions in Baltimore.

The MAHA report states that “American youth face a mental health crisis,” going on to note that “suicide deaths among 10- to 24-year-olds increased by 62% from 2007 to 2021” and that “suicide is now the leading cause of death in teens aged 15-19.” What it doesn’t say is that around half of these suicides involve guns.

“When you add all these dimensions, [gun violence is] a very huge public health problem,” says Webster.

Researchers who study gun violence have been saying the same thing for years. And in 2024, then US Surgeon General Vivek Murthy declared it a public health crisis. “We don’t have to subject our children to the ongoing horror of firearm violence in America,” Murthy said in a statement at the time. Instead, he argued, we should tackle the problem using a public health approach.

Part of that approach involves identifying who is at the greatest risk and offering support to lower that risk, says Webster. Young men who live in poor communities tend to have the highest risk of gun violence, he says, as do those who experience crisis or turmoil. Trying to mediate conflicts or limit access to firearms, even temporarily, can help lower the incidence of gun violence, he says.

There’s an element of social contagion, too, adds Webster. Shooting begets more shooting. He likens it to the outbreak of an infectious disease. “When more people get vaccinated … infection rates go down,” he says. “Almost exactly the same thing happens with gun violence.”

But existing efforts are already under threat. The Trump administration has eliminated hundreds of millions of dollars in grants for organizations working to reduce gun violence.

Webster thinks the MAHA report has “missed the mark” when it comes to the health and well-being of children in the US. “This document is almost the polar opposite to how many people in public health think,” he says. “We have to acknowledge that injuries and deaths from firearms are a big threat to the health and safety of children and adolescents.”

Putin says organ transplants could grant immortality. Not quite.

This week I’m writing from Manchester, where I’ve been attending a conference on aging. Wednesday was full of talks and presentations by scientists who are trying to understand the nitty-gritty of aging—all the way down to the molecular level. Once we can understand the complex biology of aging, we should be able to slow or prevent the onset of age-related diseases, they hope.

Then my editor forwarded me a video of the leaders of Russia and China talking about immortality. “These days at 70 years old you are still a child,” China’s Xi Jinping, 72, was translated as saying, according to footage livestreamed by CCTV to multiple media outlets.

“With the developments of biotechnology, human organs can be continuously transplanted, and people can live younger and younger, and even achieve immortality,” Russia’s Vladimir Putin, also 72, is reported to have replied.

[Photo: Russian President Vladimir Putin, Chinese President Xi Jinping, and North Korean leader Kim Jong Un walk side by side. Sergei Bobylev/Sputnik/Kremlin pool photo via AP]

There’s a striking contrast between that radical vision and the incremental longevity science presented at the meeting. Repeated rounds of organ transplantation surgery aren’t likely to help anyone radically extend their lifespan anytime soon.

First, back to Putin’s proposal: the idea of continually replacing aged organs to stay young. It’s a simplistic way to think about aging. After all, aging is so complicated that researchers can’t agree on what causes it, why it occurs, or even how to define it, let alone “treat” it.

Having said that, there may be some merit to the idea of repairing worn-out body parts with biological or synthetic replacements. Replacement therapies—including bioengineered organs—are being developed by multiple research teams. Some have already been tested in people. This week, let’s take a look at the idea of replacement therapies.

No one fully understands why our organs start to fail with age. On the face of it, replacing them seems like a good idea. After all, we already know how to do organ transplants. They’ve been a part of medicine since the 1950s and have been used to save hundreds of thousands of lives in the US alone.

And replacing old organs with young ones might have more broadly beneficial effects. When a young mouse is stitched to an old one, the older mouse seems to benefit from the arrangement: its health improves.

The problem is that we don’t really know why. We don’t know what it is about young body tissues that makes them health-promoting. We don’t know how long these effects might last in a person. We don’t know how different organ transplants will compare, either. Might a young heart be more beneficial than a young liver? No one knows.

And that’s before you consider the practicalities of organ transplantation. There is already a shortage of donor organs—thousands of people die on waiting lists. Transplantation requires major surgery and, typically, a lifetime of prescription drugs that damp down the immune system, leaving a person more susceptible to certain infections and diseases.

So the idea of repeated organ transplantations shouldn’t really be a particularly appealing one. “I don’t think that’s going to happen anytime soon,” says Jesse Poganik, who studies aging at Brigham and Women’s Hospital in Boston and is also in Manchester for the meeting.

Poganik has been collaborating with transplant surgeons in his own research. “The surgeries are good, but they’re not simple,” he tells me. And they come with real risks. His own 24-year-old cousin developed a form of cancer after a liver and heart transplant. She died a few weeks ago, he says.

So when it comes to replacing worn-out organs, scientists are looking for both biological and synthetic alternatives.  

We’ve been replacing body parts for centuries. Wooden toes were used as far back as the 15th century. Joint replacements have been around for more than a hundred years. And major innovations over the last 70 years have given us devices like pacemakers, hearing aids, brain implants, and artificial hearts.

Scientists are exploring other ways to make tissues and organs, too. There are different approaches here, but they include everything from injecting stem cells to seeding “scaffolds” with cells in a lab.

In 1999, researchers used volunteers’ own cells to seed bladder-shaped collagen scaffolds. The resulting bioengineered bladders were transplanted into seven people in an initial trial.

Now scientists are working on more complicated organs. Jean Hébert, a program manager at the US government’s Advanced Research Projects Agency for Health, has been exploring ways to gradually replace the cells in a person’s brain. The idea is that, eventually, the recipient will end up with a young brain.

Hébert showed my colleague Antonio Regalado how, in his early experiments, he removed parts of mice’s brains and replaced them with embryonic stem cells. That work seems a world away from the biochemical studies being presented at the British Society for Research on Ageing annual meeting in Manchester, where I am now.

On Wednesday, one scientist described how he’d been testing potential longevity drugs on the tiny nematode worm C. elegans. These worms live for only about 15 to 40 days, and his team can perform tens of thousands of experiments with them. About 40% of the drugs that extend lifespan in C. elegans also help mice live longer, he told us.

To me, that’s not an amazing hit rate. And we don’t know how many of those drugs will work in people. Probably less than 40% of that 40%, which would leave fewer than one in six of the original candidates.

Other scientists presented work on chemical reactions happening at the cellular level. It was deep, basic science, and my takeaway was that there’s a lot aging researchers still don’t fully understand.

It will take years—if not decades—to get the full picture of aging at the molecular level. And if we rely on a series of experiments in worms, and then mice, and then humans, we’re unlikely to make progress for a really long time. In that context, the idea of replacement therapy feels like a shortcut.

“Replacement is a really exciting avenue because you don’t have to understand the biology of aging as much,” says Sierra Lore, who studies aging at the University of Copenhagen in Denmark and the Buck Institute for Research on Aging in Novato, California.

Lore says she started her research career studying aging at the molecular level, but she soon changed course. She now plans to focus her attention on replacement therapies. “I very quickly realized we’re decades away [from understanding the molecular processes that underlie aging],” she says. “Why don’t we just take what we already know—replacement—and try to understand and apply it better?”

So perhaps Putin’s straightforward approach to delaying aging holds some merit. Whether it will grant him immortality is another matter.

RFK Jr.’s plan to improve America’s diet is missing the point

A lot of Americans don’t eat well. And they’re paying for it with their health. A diet high in sugar, sodium, and saturated fat can increase the risk of problems like diabetes, heart disease, and kidney disease, to name a few. And those are among the leading causes of death in the US.

This is hardly news. But this week Robert F. Kennedy Jr., who heads the US Department of Health and Human Services, floated a new solution to the problem. Kennedy and education secretary Linda McMahon think that teaching medical students more about the role of nutrition in health could help turn things around.

“I’m working with Linda on forcing medical schools … to put nutrition into medical school education,” Kennedy said during a cabinet meeting on August 26. The next day, HHS released a statement calling for “increased nutrition education” for medical students.

“We can reverse the chronic-disease epidemic simply by changing our diets and lifestyles,” Kennedy said in an accompanying video statement. “But to do that, we need nutrition to be a basic part of every doctor’s training.”

It certainly sounds like a good idea. If more Americans ate a healthier diet, we could expect to see a decrease in those diseases. But this framing of America’s health crisis is overly simplistic, especially given that plenty of the administration’s other actions have directly undermined health in multiple ways—including by canceling a vital nutrition education program.

At any rate, there are other, more effective ways to tackle the chronic-disease crisis.

The biggest killers, heart disease and stroke, are responsible for more than a third of deaths, according to the US Centers for Disease Control and Prevention. A healthy diet can reduce your risk of developing those conditions. And it makes total sense to educate the future doctors of America about nutrition.

Medical bodies are on board with the idea, too. “The importance of nutrition in medical education is increasingly clear, and we support expanded, evidence-based instruction to better equip physicians to prevent and manage chronic disease and improve patient outcomes,” David H. Aizuss, chair of the American Medical Association’s board of trustees, said in a statement.

But it’s not as though medical students aren’t getting any nutrition education. And that training has increased in the last five years, according to surveys carried out by the Association of American Medical Colleges (AAMC).

Kennedy has referred to a 2021 survey suggesting that medical students in the US get only around one hour of nutrition education per year. But the AAMC argues that nutrition education increasingly happens through “integrated experiences” rather than stand-alone lectures.

“Medical schools understand the critical role that nutrition plays in preventing, managing, and treating chronic health conditions, and incorporate significant nutrition education across their required curricula,” Alison J. Whelan, AAMC’s chief academic officer, said in a statement.

That’s not to say there isn’t room for improvement. Gabby Headrick, a food systems dietician and associate director of food and nutrition policy at George Washington University’s Institute for Food Safety & Nutrition Security, thinks nutritionists could take a more prominent role in patient care, too.

But it’s somewhat galling for the administration to choose medical education as its focus given the recent cuts in federal funding that will affect health. For example, funding for the National Diabetes Prevention Program, which offers support and guidance to help thousands of people adopt healthy diets and exercise routines, was canceled by the Trump administration in March.

The focus on medical schools also overlooks one of the biggest factors behind poor nutrition in the US: access to healthy food. A recent survey by the Pew Research Center found that increased costs make it harder for most Americans to eat well. Twenty percent of the people surveyed acknowledged that their diets were not healthy.

“So many people know what a healthy diet is, and they know what should be on their plate every night,” says Headrick, who has researched this issue. “But the vast majority of folks just truly do not have the money or the time to get the food on the plate.”

The Supplemental Nutrition Assistance Program (SNAP) has been helping low-income Americans afford some of those healthier foods. It supported over 41 million people in 2024. But under the Trump administration’s tax and spending bill, the program is set to lose around $186 billion in funding over the next 10 years.

Kennedy’s focus is on education. And it just so happens that there is a nutrition education program in place—one that helps people of all ages learn not only what healthy foods are, but how to source them on a budget and use them to prepare meals.

SNAP-Ed, as it’s known, has already provided this support to millions of Americans. Under the Trump administration, it is set to be eliminated.

It is difficult to see how these actions are going to help people adopt healthier diets. What might be a better approach? I put the question to Headrick: If she were in charge, what policies would she enact?

“Universal health care,” she told me. Being able to access health care without risking financial hardship not only improves health outcomes and life expectancy; it also spares people from medical debt—something that affects around 40% of adults in the US, according to a recent survey.

And the Trump administration’s plans to cut federal health spending by about a trillion dollars over the next decade certainly aren’t going to help with that. All told, around 16 million people could lose their health insurance by 2034, according to estimates by the Congressional Budget Office.

“The evidence suggests that if we cut folks’ social benefit programs, such as access to health care and food, we are going to see detrimental impacts,” says Headrick. “And it’s going to cause an increased burden of preventable disease.”

I gave the police access to my DNA—and maybe some of yours

Last year, I added my DNA profile to a private genealogical database, FamilyTreeDNA, and clicked “Yes” to allow the police to search my genes.

In 2018, police in California announced they’d caught the Golden State Killer, a man who had eluded capture for decades. They did it by uploading crime-scene DNA to websites like the one I’d joined, where genealogy hobbyists share genetic profiles to find relatives and explore ancestry. Once the police had “matches” to a few relatives of the killer, they built a large family tree from which they plucked the likely suspect.

This process, called forensic investigative genetic genealogy, or FIGG, has since helped solve hundreds of murders and sexual assaults. Still, while the technology is potent, it’s incompletely realized. It operates via a mishmash of private labs and unregulated websites, like FamilyTree, which give users a choice to opt into or out of police searches. The number of profiles available for search by police hovers around 1.5 million, not yet enough to find matches in all cases.

To do my bit to increase those numbers, I traveled to Springfield, Massachusetts.

The staff of the local district attorney, Anthony D. Gulluni, was giving away free FamilyTree tests at a minor-league hockey game in an effort to widen its DNA net and help solve several cold-case murders. After glancing over a consent form, I spit into a tube and handed it back. According to the promotional material from Gulluni’s office, I’d “become a hero.”

But I wasn’t really driven by some urge to capture distantly related serial killers. Rather, my spit had a less gallant and more quarrelsome motive: to troll privacy advocates whose fears around DNA I think are overblown and unhelpful. By giving up my saliva for inspection, I was going against the view that a person’s DNA is the individualized, sacred text that privacy advocates sometimes claim.

Indeed, the only reason FIGG works is that relatives share DNA: You share about 50% with a parent, 25% with a grandparent, about 12.5% with a first cousin, and so on. When I got my FamilyTree report back, my DNA had “matched” with 3,309 people.
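Those fractions all follow from one halving rule: each parent-child link transmits roughly half of a person’s autosomal DNA, so the expected share falls by half per link, doubled when two people descend from a shared couple. Here is a quick sketch of that standard genealogical arithmetic (my own illustration, not FamilyTree’s matching code):

```python
# Expected fraction of autosomal DNA shared with a relative, using the
# halving-per-meiosis rule: each parent-child link transmits ~half.
# `meioses` counts the links on the path between two people; `ancestors`
# is 1 for direct-line relatives and 2 when a couple is shared
# (e.g. full first cousins share two grandparents).

def expected_shared(meioses: int, ancestors: int = 1) -> float:
    return ancestors * 0.5 ** meioses

print(expected_shared(1))     # parent: 0.5
print(expected_shared(2))     # grandparent: 0.25
print(expected_shared(4, 2))  # full first cousin: 0.125
print(expected_shared(8, 2))  # third cousin: ~0.008
```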

Some people are frightened by FIGG or reject its punitive aims. One European genealogist I know says her DNA is kept private because she opposes the death penalty and doesn’t want to risk aiding US authorities in cases where lethal injection might be applied. But if enough people share their DNA, conscientious objectors won’t matter. Scientists estimate that a database including 2% of the US population, or 6 million people, could identify the source of nearly any crime-scene DNA, given how many distant relatives each of us has.
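A back-of-envelope calculation shows why such modest coverage is so powerful: each of us has thousands of cousins, and the chance that none of them is in a database covering 2% of the population is vanishingly small. The cousin counts below are rough assumptions for illustration, and real hit rates are lower because very distant matches aren’t always detectable.

```python
# Why 2% coverage is enough: with thousands of cousins per person, the
# probability that *none* of them is in the database collapses. Cousin
# counts here are rough assumptions, not measured values.

COVERAGE = 0.02  # fraction of the population enrolled in the database

cousins = {"2nd": 30, "3rd": 800, "4th": 5000}  # assumed averages

p_no_match = 1.0
for degree, n in cousins.items():
    p_no_match *= (1 - COVERAGE) ** n

print(f"chance of at least one cousin match: {1 - p_no_match:.6f}")
# prints ~1.0; real-world hit rates are lower, since distant matches
# aren't always detectable or identifiable.
```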

Scholars of big data have termed this phenomenon “tyranny of the minority.” One person’s voluntary disclosure can end up exposing the same information about many others. And that tyranny can be abused.

DNA information held in private genealogy websites like FamilyTree is lightly guarded by terms of service. These agreements have flip-flopped over time; at one point all users were included in law enforcement searches by default. Rules are easily ignored, too. Recent court filings indicate that the FBI, in its zeal to solve crimes, sometimes barges past restrictions to look for matches in databases whose policies exclude police.

“Noble aims; no rules” is how one genetic genealogist described the overall situation in her field.

My uncertainty grew the more questions I asked. Who even controls my DNA file? That’s not easy to find out. FamilyTree is a brand operated by another company, Gene by Gene, which in 2021 was sold to a third company, MyDNA—ultimately owned by an Australian mogul whose name appears nowhere on its website. When I reached FamilyTree’s general manager, the genealogist Dave Vance, he told me that three-quarters of the profiles on the site were “opted in” to law enforcement searches.

One proposed solution is for the federal government to organize its own national DNA database for FIGG. But that would require new laws, new technical standards, and a debate about how our society wants to employ this type of big data—not just individual consent like mine. No such national project—or consensus—exists.

I’m still ready to join a national crime-fighting database, but I regret doing it the way I did—spitting in a tube on the sidelines of a hockey game and signing a consent form that affects not just me but all my thousands of genetic relatives. To them, I say: Whoops. Your DNA; my bad.
