Raman to go

For a harried wastewater manager, a commercial farmer, a factory owner, or anyone who might want to analyze dozens of water samples, and fast, it sounds almost miraculous. Light beamed from a central laser zips along fiber-optic cables and hits one of dozens of probes waiting at the edge of a field, or at the mouth of a sewage outflow, or wherever it’s needed. In turn, these probes return nearly instant chemical analysis of the water and its contaminants—fertilizer concentration, pesticides, even microplastics. No need to walk around taking samples by hand, or wait days for results from a lab. 

This networked system of pen-size probes is the brainchild of Nili Persits, a final-year doctoral candidate in electrical engineering at MIT. Persits, who sports a collection of tattoos and a head of bouncy curls, seems to radiate energy, much like the powerful lasers she works with. She hopes that her work to develop a highly sensitive probe will help a technology known as Raman spectroscopy step beyond the rarefied realm of laboratory settings and out into the real world. These spectrometers—which use a blast of laser light to analyze an object’s chemical makeup—have proved their utility in fields ranging from medical research to art restoration, but they come with frustrating drawbacks. 

Raman setup on a media cart

KEN RICHARDSON AND REBECCA RODRIGUEZ

In a cluttered room full of dangling cables and winking devices in MIT’s Building 26, it’s easy to see the problem. A line of brushed-aluminum boxes stretching eight or so feet across a table makes up the conventional Raman spectrometer. It costs at minimum $70,000—in some cases, more than twice that amount—and the vibration-damping table it sits on adds another $15,000 to the tab. Even now, after six years of practice, it takes Persits most of a day to set it up and calibrate it before she can begin to analyze anything. “It’s so bulky, so expensive, so limited,” she says. “You can’t take it anywhere.” 

Elsewhere in the lab, two other devices hint at the future of Raman spectroscopy. The first is a system about the size of a desk. Although this version is too big and too sensitive to be moved, it can support up to 100 probes connected to it by fiber-optic cables, making it possible to analyze samples kilometers away. 

The typical Raman system is “so bulky, so expensive, so limited. You can’t take it anywhere.”

The second is a truly portable Raman device, a laser about the size and shape of a Wi-Fi router, with just one probe and a cell-phone-size photodetector (a device that converts photons into electrical signals) attached. While other portable Raman systems do exist, Persits says their resolution and sensitivity leave a lot to be desired. And this one delivers results on par with those of bigger and pricier versions, she says. Whereas the bigger device is intended for large-scale operations such as chemical manufacturing facilities or wastewater monitoring, this one is suited for smaller uses such as medical studies. 

Persits has spent the last several years perfecting these devices and their attached probes, designing them to be easy to use and more affordable than traditional Raman systems. This new technology, she says, “could be used for so many different applications that Raman wasn’t really a possibility for before.” 

A molecular photograph with a hefty price tag 

All Raman spectrometers, big or small, take advantage of a quirk in the way that light behaves. If you shine a red laser at a wall, you’ll see a red dot. Of the photons that bounce off the wall and hit your retina, nearly all of them remain red. But for a precious few photons—one in 100 million—something strange happens. The springlike molecular bonds of the materials in the wall jangle the photon, which absorbs or loses energy on the rebound. This changes its wavelength, thereby changing its color. The color change corresponds to whatever type of molecule the photon collided with, whether it’s the polymers in the wall’s latex paint or the pigments that create its hue. 
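
To make that color shift concrete, here is a minimal sketch, using illustrative numbers rather than values from Persits’s setup, of how a molecular vibration translates into a new wavelength. Chemists describe a vibration by its “Raman shift” in inverse centimeters; subtracting that shift from the laser’s wavenumber gives the wavelength of the scattered photon.

```python
# Illustrative sketch: convert a laser wavelength plus a molecular Raman shift
# (in inverse centimeters) into the wavelength of the color-shifted photon.
# The numbers below are hypothetical examples, not values from Persits's instruments.

def raman_scattered_wavelength_nm(laser_nm: float, shift_cm1: float) -> float:
    """Wavelength of a Stokes-scattered photon for a given laser line and Raman shift."""
    laser_cm1 = 1e7 / laser_nm              # wavelength (nm) -> wavenumber (cm^-1)
    scattered_cm1 = laser_cm1 - shift_cm1   # the photon loses energy to the molecular bond
    return 1e7 / scattered_cm1              # back to wavelength in nm

# A red 785 nm laser scattering off a bond that vibrates at ~1,000 cm^-1
print(round(raman_scattered_wavelength_nm(785, 1000), 1))  # ~851.9 nm
```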

This phenomenon, called Raman scattering, is happening right now, all around you. But you can’t see this color-shifted photon confetti—it’s far too faint, so looking for it is like trying to see a distant star on a sunny day. 

A traditional Raman spectrometer separates out this faint signal by guiding it through an obstacle course of mirrors, lenses, and filters. After the light of a powerful, single-color laser is beamed at a sample, the scattered light is directed through a filter to remove the returning photons that retained their original hue. The color-shifted photons then pass through a diffraction grating—a finely grooved optical element that, like a prism, separates them by color—before they hit a detector that measures their wavelength and intensity. This detector, Persits says, is essentially the same as a digital camera’s light sensor. 

Raman probes designed by Nili Persits sit atop a cart, but the coiled fiber-optic cables allow them to be used on samples far away.
1. A mounted probe can be used to study non-liquid, uncontained samples like plants.
2. A probe encased in a protective sleeve is immersed in a liquid sample.
3. An optical receiver detects Raman photons collected by a probe and relayed by a fiber-optic cable.
4. A probe to measure small-volume liquids in a cuvette.
KEN RICHARDSON AND REBECCA RODRIGUEZ

At the end of the spectroscopy process, a researcher is left with something akin to a photograph—not of an object’s appearance, but of its molecular makeup. This allows researchers to study the chemical components of DNA, detect contaminants in food, or figure out if an antique painting is authentic or a modern counterfeit, among many other uses. What’s more, Raman spectroscopy makes it possible to analyze samples without grinding them up, dissolving them, or dousing them in chemicals.  

“The problem with spectrometers is that they have this intrinsic trade-off,” Persits says. The more light that goes into the spectrometer itself—specifically, into the color-separating diffraction grating and the detector—the harder it is to separate photons by wavelength, lowering the resolution of the resulting chemical snapshot. And because Raman light is so weak, researchers like Persits need to gather as much of it as possible, particularly when they’re searching for chemicals that occur in minute concentrations. One way to do this is to make the detector much bigger—even room-size, in the case of astrophysics applications. This, however, makes the setup “exponentially more expensive,” she says. 

Raman spectroscopy on the go

In 2013, Persits had bigger things to worry about than errant photons and unwieldy spectrometers. She was living in Tel Aviv with her husband, Lev, and their one-year-old daughter. She’d been working in R&D at a government defense agency—an easy, predictable job she describes as “engineering death”—when a thyroid cancer diagnosis ground her life to a halt. 

As Persits recovered from two surgeries and radiation therapy, she had time to take stock of her life. She resolved to complete her stalled master’s degree and, once that was done, begin a PhD program. Her husband encouraged her to apply beyond Israel, to the best institutions in the United States. In 2017, when her MIT acceptance letter arrived, it was a shock to Persits, but not to her husband. “That man has patience,” she says with a laugh, recalling Lev’s unflagging support. “He believes in me more than me.”

The family moved to Massachusetts that fall, and soon after, Persits joined the research group of Rajeev Ram, a professor of electrical engineering who specializes in photonics and electronics. “I’m looking for people who are willing to take risks and work on a new area,” Ram says. He saw particular promise in Persits’s keen interest in research outside her sphere of expertise. He put her to work learning the ins and outs of Raman spectroscopy, beginning with a project to analyze the metabolic components of blood plasma. 

“The first couple of years were pretty stressful,” Persits says. In 2016, she and her husband had welcomed their second child, another girl, making the pressures of grad school even more acute. The night before her quantum mechanics exam, she recalls, she was awake until 3 a.m. with a vomiting child. On another occasion, a sprinkler in the lab malfunctioned, ruining the Raman spectrometer she’d inherited from a past student. 

“We can have real-time assessment of what’s going on. Are our plants happy?”

Persits persevered, and things started to settle into place. She began to build on the earlier work of Ram and optical engineer Amir Atabaki, a former postdoc in the Ram lab who is now a research fellow at the Lawrence Berkeley National Laboratory in California. Atabaki had figured out a fix for that fundamental Raman trade-off—the brighter the light, the lower the resolution of the chemical snapshot—by using a tunable laser that emits a range of different colors, instead of a fixed laser limited to a single hue. Persits compares the process to photographing a rainbow. A traditional Raman spectrometer is like a camera that takes a picture of all the rainbow’s colors simultaneously; the updated system, in contrast, takes snapshots of only one color at a time.

This tunable laser eliminates the need for the bulkiest, costliest parts of a Raman spectrometer—those that diffract light and collect it in a photon-gathering sensor. This makes it possible to use miniaturized and “very simple” silicon photodetectors, Persits says, which “cost nothing” compared with the standard detectors.  
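
The sketch below illustrates, under stated assumptions, how such a one-color-at-a-time system could build up a spectrum: the laser is stepped through wavelengths while a single photodetector behind a fixed narrowband filter records one intensity per step. The fixed filter, its pass-band value, and the callback name are assumptions made for illustration, not details of the lab’s actual design.

```python
# A minimal sketch of the swept-laser idea, assuming detection through one
# fixed narrowband filter in front of a simple silicon photodetector. Each
# laser step maps a different Raman shift onto that fixed detection band.
# The pass-band value and the read_photodetector callback are hypothetical.

import numpy as np

DETECTION_BAND_CM1 = 11_600.0  # assumed fixed filter pass-band, in wavenumbers

def build_spectrum(laser_steps_nm, read_photodetector):
    """Reconstruct a Raman spectrum one laser color at a time."""
    shifts, intensities = [], []
    for laser_nm in laser_steps_nm:
        laser_cm1 = 1e7 / laser_nm                        # laser wavenumber
        shifts.append(laser_cm1 - DETECTION_BAND_CM1)     # shift sampled at this step
        intensities.append(read_photodetector(laser_nm))  # one reading per step
    return np.array(shifts), np.array(intensities)
```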

close-up of the device
One of Persits’s probes shines a red laser dot on a small-volume sample in a 0.5-milliliter cuvette.
KEN RICHARDSON AND REBECCA RODRIGUEZ

Persits’s key innovation was an exceptionally sensitive probe that’s the size of a large marker and is connected to the laser via a fiber-optic cable. These cables can be as long (even kilometers long) or short as needed. Armed with a tunable laser, simple photodetectors, and her robust, internet-enabled probes, Persits was able to develop both her handheld Raman device and the larger, nonportable version. This second system is more expensive, with a vibration-damping table needed for its sensitive laser, but it can support dozens of different probes, in essence offering multiple Raman systems for the price of one. It also has a much broader spectral range, allowing it to distinguish a greater variety of chemicals. 

These probes open up a remarkable host of possibilities. Take biologics, a class of drugs generated by genetically engineered cells, which account for more than half of all modern cancer treatments. For drug manufacturers, it’s important to make sure these cells are happy, healthy, and producing the desired compounds. But the mere act of checking in on them—cracking open the bioreactors in which they grow to remove a sample—stresses them out and introduces the risk of contamination. Persits’s probes can be left in vessels to monitor how much the cells are eating and what chemicals they’re secreting, all without any disturbance. 

Persits is particularly excited about the technology’s potential to simplify water monitoring. First, though, she and her team had to make sure that water testing was even feasible. “A lot of techniques don’t work in water,” she says. Last summer, an experiment with hydroponic bok choy proved the technology’s mettle. The team could watch, day by day, as the plants sucked up circulating nitrate fertilizer until none remained in the water. “We can actually have real-time assessment of what’s going on,” Persits says. “Are our plants happy? Are they getting enough nutrients?” 

In the future, this may allow for precision dosing of fertilizers on large commercial farms, saving farmers money and reducing the hazardous runoff of nitrates into local waterways. The technology can also be adapted for a range of other watery uses, such as monitoring chemical leakage from factories and refineries or searching for microplastics and other pollutants in drinking water. 

With graduation at the end of May, Persits has set her sights on the next phase of her career. Last year, funding and support from the Activate fellowship helped her launch her own company, Dottir Labs. Dottir—which stands for “digital optical technology” and also alludes to her two daughters, now 12 and eight—aims to bring her Raman systems to market. “Dottir is really focusing on the larger-scale applications where there are few alternatives to this type of chemical sensing,” Persits says. 

Like the subject of one of her tattoos, which shows a lotus growing from desert ground, Persits’s research career has been defined by surprising transformation—photons that change color after a glancing blow, bulky machines that she shrank down and supplemented with a web of probes. These transformations could nudge the world in a new direction as well, leading to cleaner water, safer drugs, and a healthier environment for all of us downstream.

Taking on climate change, Rad Lab style

When I last wrote, the Institute had just announced MIT’s Climate Project. Now that it’s underway, I’d like to tell you a bit more about how we came to launch this ambitious new enterprise. 

In the fall of 2022, as soon as I accepted the president’s job at MIT, several of my oldest friends spontaneously called to say, in effect, “Can you please fix the climate?”

And once I arrived, I heard the same sentiment, framed in local terms: “Can you please help us organize ourselves to help fix the climate?” 

Everyone understood that MIT brought tremendous strength to that challenge: More than 20% of our faculty already do leading-edge climate work. And everyone understood that in a place defined by its decentralization, focusing our efforts in this way would require a fresh approach. This was my kind of challenge—creating the structures and incentives to help talented people do much more together than they could do alone, so we could direct that collective power to help deliver climate solutions to the world, in time.

My first step was to turn to Vice Provost Richard Lester, PhD ’80, a renowned nuclear engineer with a spectacular record of organizing big, important efforts at MIT—including the Climate Grand Challenges. Working with more than 100 faculty, over the past year Richard led us to define the hardest climate problems where MIT could make the most substantial difference—our six Climate Missions:

  • Decarbonizing Energy and Industry
  • Restoring the Atmosphere, Protecting the Land and Oceans
  • Empowering Frontline Communities
  • Building and Adapting Healthy, Resilient Cities
  • Inventing New Policy Approaches
  • Wild Cards

Each mission will be a problem-solving community, focused on the research, translation, outreach, and innovation it will take to get emerging ideas out of the lab and deployed at scale. We are unabashedly focused on outcomes, and the faculty leaders we are recruiting for each mission will help develop their respective roadmaps.

In facing this vast challenge, we’re consciously building the Climate Project in the spirit of MIT’s Rad Lab, an incredible feat of cooperative research that achieved scientific miracles, at record speed, with an extraordinary sense of purpose. With the leadership and ingenuity of the people of MIT, and our partners around the globe, we aim for the Climate Project at MIT to do the same. 

Sally Kornbluth
March 20, 2024

I went to COP28. Now the real work begins.

As an international student at MIT, I find that the privileges I’ve experienced in the States have made me even more conscious of my nation’s struggles. Brief visits home remind me that in Jamaica, I can’t always count on what I often take for granted in Massachusetts: water flowing through the faucet, timely public transportation, a safe neighborhood to live in. And after working hard in school for years so my family and I won’t have to struggle so much to meet our basic needs, I’ve recently been challenging myself to think about the needs of nations too. Being from a developing nation, I am very aware of the urgent need for sustainable development, which the UN defines as “development that meets the needs of the present, without compromising the ability of future generations to meet their own needs.” 

Jamaica is among the countries least responsible for the acceleration of global warming, yet it is already facing some of its worst effects. Many Jamaicans can’t afford air-conditioning to cope with the extreme heat, and in my city, many of the trees that once provided shade are being cut down to build apartments, leaving people sweltering in a concrete jungle. Even if ambitious net-zero emissions targets are met, these severe consequences may continue to worsen for some years. 

Runako Gentles leaning against a fence overlooking the ocean
At home in Jamaica, Gentles has seen the impact of climate change firsthand.
COURTESY OF RUNAKO GENTLES

Beyond significantly lowering the standard of living for the poor and lower-middle classes, climate change is also threatening agriculture and tourism, two major sources of Jamaica’s GDP. Given that the country is already struggling with crime and widespread poverty, what’s going to happen as climate change continues causing droughts to worsen, beaches to shrink, and energy bills to rise?  

My MIT degree could definitely help me migrate to another country with a higher standard of living. But if young people like me leave these critical problems for someone else to solve, then what will the future look like for my family, friends, and neighbors? 

I grew up wanting to be a physician, but at MIT I became significantly more interested in the health of communities, the planet, and the economy. I decided to major in environmental engineering as a step toward addressing the social, economic, and environmental dimensions of issues like climate change, pollution, and water management. Then I took advantage of opportunities to attend conferences where I could gather with experts, industry leaders, and other young people eager to tackle these issues. Last fall I was elated to be selected as one of MIT’s six student delegates to COP28, the 28th Conference of the Parties to the UN Framework Convention on Climate Change. Some 84,000 attendees would converge in the United Arab Emirates over the course of two weeks in November and December for the world’s largest global climate conference. I would be among those attending the second half. 

We can’t wait for someone else to address the crises affecting not only our generation but also those to come.

After a 12-hour nonstop flight, I landed in the UAE around 7:30 p.m. local time and woke up early the next morning ready to get down to business. I was tired, but it was go time. Having attended the Global Youth Climate training program and MIT’s pre-COP28 sessions, I had spent a lot of time thinking about how to make the most of the conference. There were hundreds of plenary meetings, pavilions, side events, and booths to choose from. I combed through the COP schedule each day, noting events with themes relevant to developing nations and those in which I would likely find the leaders I wanted to connect with. 

I spent the week zipping from building to building in the enormous Dubai Exhibition Centre, listening to panels, presentations, and press conferences, as well as questioning speakers, observing negotiations, taking copious notes on my iPad, and networking. A highlight was getting to interview some of the senior Jamaican delegates. I shared with them my long-term plan to help the Caribbean adapt to climate change and develop sustainably. UnaMay Gordon, one of Jamaica’s leading climate-change specialists, gave me a memorable piece of advice: Be present, represent youth, and bring other young people along to engage with these issues. I was glad to receive the Jamaican delegates’ insights—and their contact information. I took full advantage of the opportunity to approach experts and introduce myself as an MIT undergraduate. It was my first COP, and I was a man on a mission. 

I left the UAE even more determined to support sustainable development, eager to bring about positive change in the MIT community during my final semester on campus—and feeling I had a lot of work to do before graduation. Progress toward becoming a more sustainable society cannot just rely on the relatively slow process of persuading governments to pass laws that enact COP agreements. Individual COP attendees play a pivotal role in supporting the sustainability transition by helping their communities take action. 

For my last semester, I decided I could have the most impact by helping implement a campus sustainability initiative, sharing my knowledge and experiences, and encouraging more undergraduates to get involved in sustainability efforts. I started by attending the Sustainability Connect 2024 meeting run by the MIT Office of Sustainability (MITOS), which led to my joining the MIT Food Waste Fighters and working to address the need for better separation of garbage in our campus dorms to help produce biofuels and reduce methane emissions from food waste in landfills. This gave me experience implementing on-the-ground strategy to take on a problem that is also very relevant to developing nations. 

Runako Gentles speaking at TEDxMIT
Gentles speaks at TEDxMIT in April.
JOHN WERNER

Meanwhile, I dove into organizing a student-led series of sustainability talks hosted by my department’s civil engineering society, Chi Epsilon, in collaboration with MITOS and the MIT Climate and Sustainability Consortium (MCSC). As an MCSC scholar, I worked on an opinion piece and a research article about my analysis of earthquakes induced by carbon dioxide sequestration. I was also chosen to give a talk at TEDxMIT in April on how MIT can equip undergrads so they’re ready to seize opportunities to support the sustainability transition.

It was a lot to tackle on top of my classes, but I really wanted to do all I could in my last few months to galvanize the MIT community. And at the same time, I wanted to remind everyone of the importance of having empathy for those who are most vulnerable to—and least responsible for—the consequences of unsustainable behavior and of innovation that doesn’t factor in sustainability. 

I hope my work empowers more MIT undergraduates to step up and help tackle the many obstacles to achieving sustainable development while setting the stage for a more just society. We can’t wait for someone else to address the crises affecting not only our generation but also those to come. We need more minds and hands to work on ensuring that the places we live remain livable.

Runako Gentles ’24 plans to return to Jamaica upon graduation and will begin a master’s program in environmental engineering at Stanford in the fall.

What’s one memento you kept from your time at MIT?

Alumni leave MIT armed with knowledge and a whole lot of memories. During Tech Reunions in 2023, the MIT Alumni Association asked returning alums what else they had held onto since leaving campus. Here are just a few of their responses. 

Diane Marie McKnight ’75, SM ’78, PhD ’79, kept a bronze oarlock used for securing an oar on a boat. “I sand-casted it myself as part of my last class in mechanical engineering, and I learned how to use a lathe,” she said.

Amy (Schonsheck) Simpkins ’03 got her Institute keepsake early—a “cheap hoodie sweatshirt that was on special at the Coop the first week of my freshman year.” She still wears it almost every day.

Alan Paul Lehotsky ’73 said that in addition to his brass rat, he still has the Groucho glasses he wore to graduation. He admitted that the mustache has not held up very well.

Elliot Owen ’18, SM ’20, still has the precision-machined aluminum flexures that he used for his graduate research. “It is easy to create structures with a low stiffness in the direction of travel and high stiffness in all other directions,” he said. “I keep them on my bookshelf and show them off when I have people over. Most people are very surprised to see a solid piece of metal flex and move so easily and without friction.”

Walt Gibbons ’73, SM ’75, had the most popular response, provided by 22 of the 69 alums interviewed. He named his MIT brass rat.

“I kept a propeller from one of the first planes I ever built,” said Morgan Ferguson ’23. “It was a spare propeller from a plane that I worked on as part of a team of undergraduate and graduate students at MIT that develops aircraft for the annual AIAA [American Institute of Aeronautics and Astronautics] Design/Build/Fly competition. I continue to work on these planes.” His latest aircraft is shown above.

Jeanne Yu ’13 said, “The one thing I kept from MIT was my sense of resilience.”

Check out the recent MIT alumni video about physical objects grads have kept—and why they kept them—at bit.ly/MITMemento.

The silver-platter season

In the spring of 1974, I was new to both MIT and rugby football. As a Course 2 graduate student, I shared a basement office with several other students, including two players on the Tech rugby club who encouraged me to join them. Being both an Anglophile and a beer drinker, I was pretty easily talked into participating in this sport, with its British roots and after-match parties.

I played mainly on the squad’s B side that season but was among those asked to join the A side players in the annual tournament of the New England Rugby Football Union (NERFU), held at UMass Amherst. We needed extra men for the exhausting tournament schedule, in which players from both the A and B sides would be combined in various ways for different matches. Today NERFU has many more teams and several divisions of competition. But in 1974 it had just one division and held a single annual tournament.  

Institute records show rugby being played as early as 1882, making the Tech club the oldest in NERFU and one of the oldest in the nation. In 1974, it fielded two 15-man sides that practiced twice a week and played every Saturday during the spring and fall seasons. (There was no women’s side then.) Our school-supplied uniforms were classics of a bygone era—striped long-sleeve jerseys with collars and rubber buttons.

Rugby matches are grueling affairs involving continuous running and tackling and (for forwards like me, who make up half the team) pushing in organized scrums and ad hoc rucks. (In both scrums and rucks, players grab teammates’ shirts, binding together to push against the opposing team while attempting to gain possession of a ball on the ground with their feet.) In 1974, substitution was allowed only in cases of injury. Usually, one match per week was all a player would play. Making it to the tournament’s championship match would require playing four or five in two days, so some players would need to sit out some of the matches. 

group photo of the 1974 rugby champions
The storied MIT rugby club of 1974. The author is in the back row, third from the right.
MIT RUGBY FOOTBALL CLUB

Unlike now, in the 1970s there were few (if any) US high school or under-19 rugby teams, so American college teams were generally inexperienced. However, the 1974 MIT club had several international players who had been playing since grade school in England, Scotland, New Zealand, France, Argentina, or Japan. It also included grad students and an assistant professor (Ron Prinn, ScD ’71), which raised the average age of the team. MIT was thus not a typical college team, although we might have been mistaken for one. Undoubtedly some club teams in the 1974 tournament rested their best players when scheduled to play us. 

Our coach was Serge Gallant, a savvy, bearded Frenchman and former scrum half forced by concussions to retire from playing. Shin Yoshida ’76, our fly half, was our star player. Shin would kick high-arching punts downfield, accurately positioned to allow our team to immediately tackle opponents receiving them, or occasionally to recover the ball ourselves. Much like a fast-break offense from a basketball team with smaller players, this helped neutralize the height and power of bigger teams.    

The 1974 NERFU tournament, held on May 11 and 12, pitted 24 teams against each other in five rounds of single-elimination matches. The MIT club had some role in the seeding, so we managed to get a first-round bye and the prospect of an easy opponent in the second round. However, the remaining matches promised to be very difficult.

Our first match on Saturday was in the second round against Springfield, whom we beat handily, 13–0. Our last match of the day was against Charles River, a club that had beaten us the week before. We eked out a 16–12 victory in double overtime. 

Since we’d advanced to the semifinal round to be held on Sunday, arrangements were made for our team to pile into a few rooms of an Amherst motel for the night. But first most of us went out to a local restaurant. Despite our camaraderie and shared joy over having won our first two matches, our celebration was subdued, with none of the usual libations and rugby songs. We were pleasantly surprised when a former MIT rugby player turned businessman picked up our meal tab. 

At the restaurant we exchanged friendly banter with a well-known forward on the Providence city club, our next opponent. During the meal he playfully growled at us while chomping on a handful of spring onions. However, he did not play against us in the semifinals on Sunday. He was rested for the finals match he never got to play.

During the Providence match, their sideline people kept yelling “Get the foot,” meaning to target Yoshida and take him out of the game. But our “enforcers” took care of theirs, and he was not hurt. We went on to win, 6–3. 

I had played in the third- and fourth-round matches and was exhausted. So when our coach asked me to play in the finals, I begged off. My spot was taken by Mark Sneeringer ’76, PhD ’82, an amiable sophomore from Gettysburg, Pennsylvania. Because I wasn’t playing, I was picked to serve as a line judge.

For the championship match Tech faced off against the Beacon Hill club, which had won the year before. This was another tight and grueling game that went into double overtime. In the first overtime, our forwards were gasping for breath. Roger Simmonds, PhD ’78 (an Englishman and our most experienced player), lifted spirits and energy levels with an impromptu pep talk noting how well the forwards were playing and how worn out the Beacon Hill squad was.    

In the second overtime, team captain Paul Dwyer, SM ’73, finally scored the game-winning try. Because I was a line judge, my jumping for joy with a cloth in my hand caused temporary confusion. That was soon resolved when I explained that my action was not an officiating signal. We’d bested Beacon Hill, 7–3. 

Our reward for winning the championship was a silver platter. In those days, beer was always on hand after rugby matches, so while still on the pitch, we awkwardly drank beer from the platter as if it were a trophy cup. 

Having pulled off a major upset in the NERFU tournament, MIT was no longer a dark horse in the 1974 fall season, and other teams made sure to give us their best efforts. The loss of Yoshida, Dwyer, and other key players from the spring season weakened our fall A side, to which I was promoted. We began the fall season with two wins and two losses and then lost the rest of our matches, including one in which the Boston club thoroughly overpowered and crushed us. 

Nevertheless, Tech reigned as the NERFU champion until the next tournament. NERFU would eventually add a college division to its annual competition, so to this day, MIT’s rugby club remains the only college side ever to capture the top-tier NERFU title.

After retiring from a long career in mechanical and nuclear engineering, Dan Guzy, MechE ’75, has written four books and many articles on local history.

The energy transition’s effects on jobs

A county-by-county analysis by MIT researchers shows the places in the US that stand to see the biggest economic changes from the switch to cleaner energy because their job markets are most closely linked to fossil fuels. 

While many of those places have intensive drilling and mining operations, the researchers find, areas that rely on industries such as heavy manufacturing could also be among the most significantly affected—a reality that policies intended to support American workers during the energy transition may not be taking into account, given that some of these communities don’t qualify for federal assistance under the Inflation Reduction Act.

This map shows which US counties have the highest concentration of jobs that could be affected by a transition to renewable energy. Counties in blue are less likely to be affected, and counties in red are more likely.
COURTESY OF THE RESEARCHERS

“The impact on jobs of the energy transition is not just going to be where oil and natural gas are drilled,” says Christopher Knittel, an economist at the MIT Sloan School of Management and coauthor of the paper. “It’s going to be all the way up and down the value chain of things we make in the US. That’s a more extensive, but still focused, problem.” 

Using several data sources measuring energy consumption by businesses, as well as detailed employment data from the US Census Bureau, Knittel and Kailin Graham, a master’s student in the Technology and Policy Program, calculated the “employment carbon footprint” of every county in the US.
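
As a rough illustration only, one way such a county-level figure could be computed is to weight each county’s jobs by the carbon intensity of the industries those jobs sit in. The formula, column names, and data layout below are assumptions made for the sketch, not the exact method from Graham and Knittel’s paper.

```python
# Hedged sketch of a county "employment carbon footprint": average emissions
# per job, weighting county employment by assumed industry carbon intensities.
# Column names and the weighting scheme are hypothetical, not the paper's.

import pandas as pd

def employment_carbon_footprint(jobs: pd.DataFrame,
                                intensity: pd.DataFrame) -> pd.Series:
    """Tons of CO2 per job in each county.

    jobs:      one row per (county, industry) with an 'employees' column
    intensity: one row per industry with a 'tons_co2_per_employee' column
    """
    merged = jobs.merge(intensity, on="industry")
    merged["co2"] = merged["employees"] * merged["tons_co2_per_employee"]
    by_county = merged.groupby("county").agg(total_co2=("co2", "sum"),
                                             total_jobs=("employees", "sum"))
    return by_county["total_co2"] / by_county["total_jobs"]
```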

“Our results are unique in that we cover close to the entire US economy and consider the impacts on places that produce fossil fuels but also on places that consume a lot of coal, oil, or natural gas for energy,” says Graham. “This approach gives us a much more complete picture of where communities might be affected and how support should be targeted.”

He adds, “It’s important that policymakers understand these economy-wide employment impacts. Our aim in providing these data is to help policymakers incorporate these considerations into future policies.”

An invisibility cloak for would-be cancers

One of the immune system’s roles is to detect and kill cells that have acquired cancerous mutations. However, some early-stage cancer cells manage to survive. A new study on colon cancer from MIT and the Dana-Farber Cancer Institute has identified one reason why: they turn on a gene called SOX17, which renders them essentially invisible to immune surveillance.

The researchers focused on precancerous growths called polyps that often form as mutations accumulate in the intestinal stem cells, whose job is to continually regenerate the lining of the intestines. Using a technique they had developed for growing mini colon tumors in a lab dish and then implanting them in mice, they engineered tumors to express mutations that are often found in human colon cancers.

In the mice, the researchers observed a dramatic increase in the tumors’ expression of SOX17. This gene encodes a transcription factor that is normally active only during embryonic development, when it helps control development of the intestines and the formation of blood vessels.

The experiments revealed that when SOX17 is turned on in cancer cells, it helps them create an immunosuppressive environment. Among its effects, SOX17 prevents cells from synthesizing the receptor that normally detects interferon gamma, one of the immune system’s primary weapons against cancer cells. Without those receptors, cancerous and precancerous cells can simply ignore messages from the immune system, which would normally direct them to die off.

The absence of this signaling also lets cancer cells minimize their production of molecules called MHC proteins, which display cancerous antigens to the immune system, and prevents them from producing molecules called chemokines, which normally recruit T cells that would help destroy the cancerous cells.

When the researchers generated colon tumor organoids with SOX17 knocked out and implanted them in mice, the animals’ immune systems were able to attack the tumors much more effectively. This suggests that blocking the gene or the pathway it activates could offer a new way to treat early-stage cancers before they grow into larger tumors.

“Just by turning off SOX17 in fairly complex tumors, we were able to essentially obliterate the ability of these tumor cells to persist,” says MIT research scientist Norihiro Goto, the lead author of a paper on the work.

But transcription factors such as the one encoded by the SOX17 gene are considered difficult to target using drugs, in part because of their structure. The researchers now plan to identify other proteins that this transcription factor interacts with, in hopes that it might be easier to block some of those interactions. They also plan to investigate what triggers SOX17 to turn on in precancerous cells.

“Activation of the SOX17 program in the earliest innings of colorectal cancer formation is a critical step that shields precancerous cells from the immune system,” says Ömer Yilmaz, an MIT associate professor of biology, a member of the Koch Institute for Integrative Cancer Research, and one of the study’s senior authors. “If we can inhibit the SOX17 program, we might be better able to prevent colon cancer, particularly in patients that are prone to developing colon polyps.”

A linguistic warning sign for dementia

Older people with mild cognitive impairment, especially when characterized by episodic memory loss, are at increased risk for dementia due to Alzheimer’s disease. Now a study by researchers from MIT, Cornell, and Massachusetts General Hospital has identified a key deficit unrelated to memory that may help reveal the condition early—when any available treatments are likely to be most effective.

The issue has to do with a subtle aspect of language processing: people with amnestic mild cognitive impairment (aMCI) struggle with certain ambiguous sentences in which pronouns could refer to people not referenced in the sentences themselves. For instance, in “The electrician fixed the light switch when he visited the tenant,” it is not clear without context whether “he” refers to the electrician or some other visitor. But in “He visited the tenant when the electrician repaired the light switch,” “he” and “the electrician” cannot be the same person. And in “The babysitter emptied the bottle and prepared the formula,” there is no reference to a person beyond the sentence.

The researchers found that people with aMCI performed significantly worse than others at producing sentences of the first type. “It’s not that aMCI individuals have lost the ability to process syntax or put complex sentences together, or lost words; it’s that they’re showing a deficit when the mind has to figure out whether to stay in the sentence or go outside it to figure out who we’re talking about,” explains coauthor Barbara Lust, a professor emerita at Cornell and a research affiliate at MIT. 

“While our aMCI participants have memory deficits, this does not explain their language deficits,” adds MIT linguistics scholar Suzanne Flynn, another coauthor. The findings could steer neuroscience studies on dementia toward brain regions that process language. “The more precise we can become about the neuronal locus of deterioration,” she says, “that’s going to make a big difference in terms of developing treatment.”

Purpose-built AI builds better customer experiences

In the bygone era of contact centers, the customer experience was tethered to a single channel—the phone call. The journey began with a pre-recorded message prompting the customer to press a number corresponding to their query. Today’s contact centers have evolved beyond the confines of traditional phone calls to multiple channels, from email to social media to chatbots.

Customers have access to more business information than ever. But improving the quality of customer experiences means becoming more customer-centric and data-driven and scaling available human representatives for round-the-clock assistance.

Enabling these improvements is no small feat for enterprises, though, says Michele Carlson, senior product marketing manager at NICE. With large data streams and the demand for personalized experiences, artificial intelligence has become the key enabler of better customer experiences.

“There’s such an enormous amount of data available that without artificial intelligence as this driving force for better customer experiences, it would be impossible to meet customers’ expectations today.”

Amid the many moving parts in a contact center, from managing multiple incoming calls to taking accurate notes on each interaction to measuring success metrics, AI can help smooth friction. Sentiment analysis can help supervisors identify in real time which calls require escalation or further support, and AI tools can summarize calls and automate note-taking to free up agents to focus more closely on customer needs. These use cases not only improve customer and employee experiences but also save time and money.
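
As a purely illustrative sketch, the escalation use case might reduce to a rule like the one below, in which a running sentiment score from some model is compared against a threshold. The scoring scale, the threshold, and the field names are assumptions for illustration, not NICE’s product behavior.

```python
# Illustrative only: flagging live calls for supervisor escalation when a
# running sentiment score drops below a threshold. The sentiment scale
# (-1 to +1), the threshold, and the field names are hypothetical.

from dataclasses import dataclass

@dataclass
class LiveCall:
    call_id: str
    agent: str
    sentiment: float  # -1.0 (very negative) to +1.0 (very positive)

def calls_needing_escalation(calls, threshold=-0.4):
    """Return the live calls whose sentiment has fallen below the threshold."""
    return [c for c in calls if c.sentiment < threshold]

queue = [LiveCall("c-101", "agent-7", -0.6), LiveCall("c-102", "agent-3", 0.2)]
print([c.call_id for c in calls_needing_escalation(queue)])  # ['c-101']
```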

While the promises of AI have many enterprises making swift investments, Carlson cautions leaders to be goal-oriented first. Rather than deploying AI because it’s popular, organizations should choose AI-driven solutions that are purpose-built to support and align with their goals. 

“There are so many available artificial intelligence solutions right now, but it’s really critical to choose AI that is designed and built on data that is specific to your organization,” says Carlson.

Looking ahead, Carlson sees the evolution toward AI-enabled customer centricity as a signal of a paradigm shift in customer experience, in which AI will not only streamline operational details but also offer insights into high-level business strategy.

“As everyone gets introduced to this technology,” says Carlson, “it’s going to be those that are open to using new things and open to using AI, but also the ones that are selecting the right types of artificial intelligence to complement their business that are going to be the most successful in using it, and gaining the efficiency and optimizing the customer experiences.”


This episode of Business Lab is produced in partnership with NICE.

Full Transcript

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma, and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Our topic is deploying customer service with AI to maximize results. As artificial intelligence evolves in the call center, it can provide real-time guidance. But measuring success remains key to operational efficiency and customer satisfaction.

Two words for you: better service.

My guest is Michele Carlson, senior product marketing manager at NICE.

This podcast is produced in partnership with NICE.

Welcome, Michele.

Michele: Thank you so much, Laurel. I’m so excited to be here.

Laurel: Well, welcome. And let’s begin by setting some context for our conversation. Long ago, in technology years, one would talk to a live person when calling a customer service number, and then we moved on to automated menu choices and beyond. So how have call centers evolved to better serve customers? And to bring us up to the present day, how is AI an enabler of that evolution?

Michele: The really good place to get started is, how did this all begin? So right now, contact centers are more customer-focused than they’ve ever been. Like you mentioned, they first started with a call, or maybe what we call an IVR, or an interactive voice response system, where you would put in a phone number, or put in a number if you wanted to go to a certain queue to answer a certain type of question. And now we’ve advanced far beyond that.

So there still are things like IVRs in the market, but there are more channels than ever now that customers are interacting with. So it’s not just the phone calls, it’s email, it’s social, it’s the chatbots on their website. It’s the more sophisticated website. So there’s more places that customers can get information about a business than ever before. So that’s something that’s really changed for contact centers.

So the way that they’re really handling that to give better customer experience, and to engage more with their customers, is focusing more on becoming customer-centric. Which are things like more personalization, being more data-driven, having greater availability for their agents. And all of these options that, for us as consumers, are really exciting because we can reach out to a business in many different ways at many different hours of the day, 24/7 access to get our questions answered.

While this is exciting for customers, it also creates a challenge for contact centers. Because, yes, it’s a way that they can evolve to serve their customers in all these places, but it’s a challenge for them. And you asked about artificial intelligence or AI, how is AI supporting that?

And that’s a big enabler for contact centers to be able to deliver these better experiences to customers, because there are so many channels, there’s so much need and expectation for personalization. There is a need to be more data-driven. And artificial intelligence allows businesses, allows contact centers, to evaluate and see what their customers are calling about, when their customers are going to call, what channels their customers are interacting with, and even the questions that customers are asking on different channels.

Using all of that data is a way that they can personalize and deliver better experiences. And artificial intelligence allows them to look at all that data. There’s such an enormous amount of data available that without artificial intelligence as this driving force for better customer experiences, it would be impossible to meet customers’ expectations today.

And so it’s really exciting to think … As you mentioned, it’s a long time ago in technology years, which is really a very short time. We’ve seen this evolution really pick up pace in the last few years with the integration of things like conversational AI and generative AI into that contact center space. And we’ll talk more about those in the course of our conversation, too.

Laurel: So yeah, speaking of data, it’s such a central role to most technology deployments and digital transformations. So then, what is the role of data in this context? And how can organizations best manage and use the data, since it is coming from so many different places as well as where it needs to be saved, to ensure a more efficient experience with contact centers?

Michele: Yeah, so the role data plays in our world today is a substantial one. “Data is the new oil.” It’s not my quote, but I’ll borrow it.

And data, there is so much of it. And the idea is it’s so very valuable, and it’s really critical to have all this data gathered together to be able to use it and be able to understand it.

So what contact centers are doing, the ones that are really successful in this, is they’re benefiting by aligning their data and building what we’re calling an interaction-centric approach.

Rather than saying I’m just going to look at my customers in a web version, or I’m just going to look at my customers through voice, being able to look at data from all over and all these different places makes this interaction-centric approach really crucial to getting started and using the data in a way that makes sense for the business.

So this is allowing them to move from things like voice and digital messaging to chatbots and social media, just on one platform. So if you or me, if we were to call into a contact center, they would know where our journey has gone. If we went to the website, if we went to the chatbot, if we called, how our call went, who we spoke with, what the outcome of that interaction was.

And that lens, in having the data, is more powerful in keeping this customer-centric approach, or this customer-centric mindset. Because it brings together all of these touch points on one channel, so that you can move interactions into one platform, which allows all these organizations to then look at different types of applications and solutions to solve different problems within their contact centers and their customer experience groups.

Laurel: So could you share some of those specific examples of how AI-driven solutions can address these unique challenges in contact centers, and also provide improvements in both customer and employee experiences?

Michele: Yes, of course. And I really like how you frame that question. Because it’s about both the customer experience and the employee experience. Without helping your employees and supporting your employees, it would be very difficult to provide, in turn, that great customer experience. And artificial intelligence-driven, AI-driven types of solutions, just to go back to that previous question around data, the AI solutions are only as good as the data that’s available to them.

So in a contact center where customer experience is the goal, you want your artificial intelligence and the data to be driven off of interactions with your customers, and that’s a very crucial foundational element across the board in choosing and using an artificially intelligent solution. One of the ways that organizations are doing this, they’re thinking about, we started with that IVR [interactive voice response]. By the time I get to item nine in the menu, I’ve usually forgotten what the previous items are.

But rather than using an IVR, you can use artificially intelligent routing. So you can predict why a customer is calling, who and which agent they might best interact with. And you can use data kind of on both sides to understand the customer’s needs, and the agents, to direct the call so it has the best outcome.

Once the interaction begins, we can use data, artificial intelligence, to measure sentiment, customer sentiment. And in the course of the interaction, an agent can get a notification from their supervisor that says, “Here’s a couple different things that you can do to help improve this call.” Or, “Hey, in our coaching session, we talked about being more empathetic, and that’s what this means for this customer.” So, giving specific prompts to make the interaction move better in real-time.

Another example: supervisors are also burdened. They usually have a large team of somewhere up to 20, sometimes 25 different agents who all have calls going at the same time.

And it’s difficult for supervisors to keep a pulse on, who is on which interaction with what customer? And is this escalation important, or which is the most important place? Because we can only be one place at one time. As much as we try with modern technology to do many things, we can only do one really well at once.

So for supervisors, they can get a notification about which calls are in need of escalation, and where they can best support their agent. And they can see how their teams are performing at one time as well.

Once the call is over, artificial intelligence can do things like summarize the interaction. During a customer interaction, agents take in a lot of information. And it is difficult to then decipher that, and their next call is going to be coming in very quickly. So artificial intelligence can generate a summary of that interaction, instead of the agent having to write notes.

And this is a huge improvement because it improves the experience for customers. That next time they call, they know those notes are going to go over to the agent, the agent can use them. Agents also really appreciate this, because it’s difficult for them in shorthand to recreate very complicated, in healthcare for example, all of the different coding numbers for different types of procedures, or are the provider, or multiple providers, or explanations of benefits to summarize all of that concisely before they take their next call.

So an auto-summarization tool does that automatically based off of the conversation, saving the agents up to a minute of post-call notes, but also saving businesses upwards of $14 million a year for 1,000 agents. Which is great, but agents appreciate it because 85% of them don’t really like all of their desktop applications. They have a lot of applications that they manage. So artificial intelligence is helping with these call summaries.

It can also help with reporting after the fact, to see how all of the calls are trending, is there high sentiment or low sentiment? And also in the quality management aspect of managing a contact center, every single call is evaluated for compliance, for greeting, for how the agent resolved the call. And one of the big challenges in quality management without artificial intelligence is that it’s very subjective.

So you and I could listen to the same call, and we could have very different viewpoints of how the call went. And agents, it’s difficult for them to get conflicting feedback on their performance. And so artificial intelligence can listen to the call, extract data points baseline, and consistently evaluate every single interaction that’s coming into a contact center.

They get better feedback and then they grow, they learn, they have a better overall experience because of this consistency in the evaluations.

So to answer your question, there are a lot of different ways artificial intelligence can support these contact center needs. And if you’re a business and customer satisfaction is your main goal, it’s really critical to think about not just one point of an interaction you have with a customer, but really before, during, and after every interaction, there’s all these opportunities to bring in data for greater consistency, and that’s something that is gained through using artificial intelligence.

Laurel: Yeah, that’s certainly quite a bit there. So when a company is thinking about integrating AI into their customer experiences, what are some common pitfalls they need to look out for, and how can those be mitigated or avoided?

Michele: Yeah, I think one of the most common pitfalls, and we’re all attracted to what’s new and exciting, and artificial intelligence is definitely on that list. And one of the reasons, or one of the pitfalls I’ve seen as organizations are getting started, is that they focus too much on using AI.

Somebody said they read a cool article, “We’ve got to use AI for that.” And yeah, you could use AI for that. But really you’re choosing a type of technology, or you’re choosing artificial intelligence, to solve a specific problem. So what I would encourage everyone to do is, think about what is your goal? And then choose AI-driven solutions to then support and align with your goals.

So for typical goals in the contact center, these might be around measuring customer experience like CSAT, sentiment, first call resolution, average handle time, a digital resolution rate, digital containment rate. These are all different types of metrics or goals an organization could have.

But among the chief dos and don’ts is, make sure you’re choosing AI that is specific to what your goals are. I would say very close second is making sure you’re choosing AI that is purpose-built for customer experience. Or purpose-built for, if you’re not in a contact center, whatever your specific type of organization does.

There are so many available artificial intelligence solutions right now, but it’s really critical to choose AI that is designed and built on data that is specific to your organization. So in this instance, customer experience.

And that allows you to benefit from how those models and how that AI is built so that you can use something out of the box. You don’t have to build everything on your own, because that could be very time-consuming. And also creates some ethical dilemmas if you don’t have a large enough data set because your AI is only going to be as good as the data that it’s trained upon. So you want to make sure it has as much data, and relevant data, for your use case as possible.

Laurel: So you did touch on this a little bit. Which is, how can AI and automation enhance the day-to-day work of contact center agents without creating additional challenges? How can it actually continuously improve both the employee and customer experience?

Michele: Yeah, of course. So I’ll give a couple more examples; I think there are a few I gave earlier. So the first, I think, is just being objective about how a call has been handled. I think that’s one of the most critical use cases.

And so at NICE, we have AI models that learn these different agent soft skills. So everything from how to ask good probing questions, to being empathetic, to taking ownership and resolving an issue efficiently. These models are looking at how to do that. And I think that’s one of the pieces that helps in the day-to-day work for contact center agents. Because they are getting consistent feedback on how they’re performing, but also the models continue to improve over time as well because you’re giving the models new data to work from, new calls, new interactions. And then that is improving both the evaluations for the agents, but it’s improving the customer experience as well.

Because if your baseline was that your sentiment level was at a five, and now you’ve raised that baseline to an eight, you’re consistently improving. You’ve, one, measured what you want to do, which is improve customer experience. Two, you’ve given your agents a consistent measurement to deliver on your goal. And three, you’re continuing to measure over time as you have more interactions.

So not only are your agents getting better, but your models become more finely tuned for your organization as well.

Laurel: So as we’re discussing this, in terms of coaching and training agents, how can AI-driven tools effectively provide that kind of real-time guidance without being intrusive, and also strike that balance between support and autonomy for the agents?

Michele: Yeah, and I think that’s a great thing to be thinking about. If you are a contact center agent, you are on the phone, and then you’re also multitasking on your screen. You’re looking for data, you’re looking for information, you have the customer’s card up and hopefully information from their previous interaction. You maybe have an IM message with your supervisor. You have a lot going on at one time.

So I think when you’re thinking about things like real-time guidance, and coaching and training, this is where it becomes really crucial. I mentioned this being interaction-centric and having everything on one platform, but having the ability to use that sentiment data or customer satisfaction data in multiple places can be very powerful. Because then you’re not introducing new information in real time.

I think that’s the biggest piece to be aware of: real time should not be the first time agents are seeing this information about how they could become more empathetic, or how they can deliver on the coaching they had with their supervisor after a previous interaction.

So it comes back to anchoring on this interaction-centric piece and converging everything on one data platform. In the industry we call it CCaaS, contact center as a service. By delivering on one platform, you enable your organization to use the same data point in multiple places.

So the agent is using this data: they get a popup in real time, but they’ve also had conversations with their supervisor about these skill sets after their previous interactions. And it’s that cycle, and that consistency, that makes agents more aware of and more adaptable to this environment. You’re not going to them and giving them yet another thing they need to resolve; you’re providing them with information that is relevant and real-time for the particular interaction they’re on.
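As an illustration of that “same data point in multiple places” idea, here is a minimal sketch in which one hypothetical sentiment score attached to an interaction feeds both a real-time agent hint and a later coaching summary. The names, types, and thresholds are invented for the example; this is not how any particular CCaaS platform is implemented.

```python
# Minimal sketch: one sentiment score, consumed in two places
# (a real-time popup for the agent and an after-the-fact coaching view).

from dataclasses import dataclass, field

@dataclass
class Interaction:
    agent: str
    customer: str
    sentiment: float                 # -1.0 (negative) .. 1.0 (positive), hypothetical scale
    notes: list[str] = field(default_factory=list)

def realtime_hint(interaction: Interaction) -> str | None:
    """Popup shown to the agent during the call, only when sentiment dips."""
    if interaction.sentiment < -0.2:
        return "Customer sentiment is dropping - acknowledge the issue and restate next steps."
    return None

def coaching_summary(history: list[Interaction]) -> str:
    """Supervisor view built later from the same scores."""
    avg = sum(i.sentiment for i in history) / len(history)
    return f"{history[0].agent}: average sentiment {avg:+.2f} across {len(history)} interactions"

calls = [Interaction("Ana", "cust-1", -0.4), Interaction("Ana", "cust-2", 0.6)]
print(realtime_hint(calls[0]))
print(coaching_summary(calls))
```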

Laurel: So you’ve touched on this here and there, but a key element of deploying any kind of new tool or technology is measuring its success. What metrics should organizations then prioritize to measure both customer satisfaction and operational efficiency?

Michele: Some of the key metrics that organizations focus on most in the contact center are net promoter score, or NPS, and customer satisfaction score, or CSAT. Those are key measurements of how a customer perceives the interaction. You’ll also see things like customer sentiment, how a customer is feeling about the interaction, included in those measurements.

And then you get into measurements that are more around the length of the call, or efficiency-driven, like average handle time or average talk time. Between the CSAT-type measurements and the efficiency-type measurements, those make up the measurements for most voice interactions. How long a call is correlates directly to the cost of the call.

Then what’s kind of exciting in this new space is that a lot more organizations are moving into digital interactions as well. And organizations are looking at things like digital containment, or the number of digital resolutions: how many customer questions was my website or my chatbot able to resolve?

Those then translate into cost savings relative to a voice interaction. Voice interactions are about a hundred times more expensive than a web or chatbot interaction. So by building effective chatbots and effective IVAs, organizations are also, in turn, delivering on their overall cost goals.
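As a rough illustration of that math, here is a back-of-the-envelope sketch of containment savings. The per-contact costs, the monthly volume, and the containment rate are hypothetical; only the roughly hundred-to-one cost ratio comes from the point above.

```python
# Back-of-the-envelope sketch of digital containment savings.
# All dollar figures and volumes are hypothetical.

COST_PER_VOICE_CALL = 5.00          # assumed fully loaded cost of one voice call
COST_PER_DIGITAL_SESSION = 0.05     # ~100x cheaper, per the ratio cited above

monthly_contacts = 100_000
containment_rate = 0.30             # share of contacts resolved by the bot or website

contained = monthly_contacts * containment_rate
escalated = monthly_contacts - contained

cost_with_bot = contained * COST_PER_DIGITAL_SESSION + escalated * COST_PER_VOICE_CALL
cost_all_voice = monthly_contacts * COST_PER_VOICE_CALL

print(f"Estimated monthly savings: ${cost_all_voice - cost_with_bot:,.0f}")
```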

I’d say the other metric everyone is focused on is agent retention. Are you giving your agents the tools to support them in the coaching process, the quality process, and their interactions, so that they have a better experience answering questions within your organization, and are you giving them tools to grow as well?

Being a contact center agent is probably one of the most difficult jobs in that business space. Agents are on the phone and inundated with information. So any tools you can provide to help them access information more quickly are hugely beneficial.

Laurel: So it’s clear that there are lots of opportunities for greater efficiency and for optimizing customer experiences. But looking into the future, how do you see AI and customer experience evolving?

Michele: I think there are definitely going to be more use cases where we see … And here at NICE, we’re already integrating generative AI and conversational AI into our solutions. And as you adopt these new technologies, it’s only going to build on itself, and there are going to be more evolutions in this space.

I think one of the most exciting things that we’ve introduced recently is this idea of using generative AI. So we’ve put guardrails around it, and the guardrails are really crucial when you’re working with artificial intelligence and the large language models, LLMs. We’ve all played with ChatGPT or Claude, and you can interact with those.

And what is really exciting that we’ve done is, we’ve used that type of technology to generate conversations and answers and information. But we’ve put guardrails on it so that organizations can better interact with just their customer experience specific data.

And what this means is, when you are in leadership in an organization, for example, and you’re looking for a report, it may take you 12 emails back and forth to say what you’d like to see in that report. But if you have, again, all of these interactions on one platform, you’ve made it interaction-centric, and you’re using all these solutions that complement each other for every part of the interaction.

What you can do is, instead of emailing a data analyst back and forth for a report, you could interact with generative AI. You could type a question to say, “Hey, who are my top 10 performing agents by sentiment, and what are their key skills that they are using in those interactions?” Then you can generate a report based off of that.
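Behind a natural-language request like that, the system ultimately has to run some kind of aggregation over interaction data. The sketch below shows one hand-rolled version of the query such a request might resolve to; the record fields, scores, and skill labels are invented for illustration and are not NICE’s implementation.

```python
# Hypothetical illustration: the aggregation a request like
# "top 10 agents by sentiment, with their key skills" might map to.

from collections import defaultdict

interactions = [
    {"agent": "Ana",   "sentiment": 0.82, "skills": ["empathy", "ownership"]},
    {"agent": "Ben",   "sentiment": 0.64, "skills": ["probing questions"]},
    {"agent": "Ana",   "sentiment": 0.91, "skills": ["empathy"]},
    {"agent": "Carla", "sentiment": 0.77, "skills": ["ownership", "efficiency"]},
]

# Group interactions by agent.
by_agent = defaultdict(list)
for rec in interactions:
    by_agent[rec["agent"]].append(rec)

# Rank agents by their average sentiment score, highest first.
ranking = sorted(
    by_agent.items(),
    key=lambda kv: sum(r["sentiment"] for r in kv[1]) / len(kv[1]),
    reverse=True,
)

# Report the top 10 with the distinct skills observed in their interactions.
for agent, recs in ranking[:10]:
    avg = sum(r["sentiment"] for r in recs) / len(recs)
    skills = sorted({s for r in recs for s in r["skills"]})
    print(f"{agent}: avg sentiment {avg:.2f}, skills: {', '.join(skills)}")
```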

What we’re seeing is that all of these solutions are not necessarily replacing people, but we’re seeing a lot of AI-adjacent or AI-augmented interactions in this contact center space that are coming into play.

And what this is doing is allowing decision-makers to focus more on their overall strategy and the overall experience they’re delivering to customers, rather than getting into the specifics of emailing about a report. The same goes for agents, who can type into a conversational AI interface to look for specific types of information, rather than searching everywhere for it.

So we are seeing a lot more AI-augmented users. And as everyone gets introduced to this technology, it’s going to be those who are open to using new things and open to using AI, but also those who are selecting the right types of artificial intelligence to complement their business, who are going to be the most successful in using it, gaining efficiency, and optimizing the customer experience.

Laurel: That’s great insight, Michele. Thank you so much for being on the Business Lab today.

Michele: Thanks so much, Laurel. It was great to be here.

Laurel: That was Michele Carlson, senior product marketing manager at NICE, whom I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review.

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the global director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world.

For more information about us and the show, please check out our website at technologyreview.com. This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studio. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.