Building a more reliable supply chain

In 2021, when a massive container ship became wedged in the Suez Canal, you could almost hear the collective sigh of frustration around the globe. It was a here-we-go-again moment in a year full of supply chain hiccups. Every minute the ship remained stuck represented about $6.7 million in paralyzed global trade.

The 12 months leading up to the debacle had seen countless manufacturing, production, and shipping snags, thanks to the covid-19 pandemic. The upheaval illuminated the critical role of supply chains in consumers’ everyday lives—nothing, from baby formula to fresh produce to ergonomic office chairs, seemed safe.

For companies producing just about any physical product, the many “black swan” events (catastrophic incidents that are nearly impossible to predict) of the last four years illustrate the importance of supply chain resilience—businesses’ ability to anticipate, respond to, and bounce back from disruptions. Yet many organizations still don’t have robust measures in place for future setbacks.

In a poll of 250 business leaders conducted by MIT Technology Review Insights in partnership with Infosys Cobalt, just 12% say their supply chains are in a “fully modern, integrated” state. Almost half of respondents’ firms (47%) regularly experience some supply chain disruptions—nearly one in five (19%) say they feel “constant pressure,” and 28% experience “occasional disruptions.” A mere 6% say disruptions aren’t an issue. But there’s hope on the horizon. In 2024, rapidly advancing technologies are making transparent, collaborative, and data-driven supply chains more realistic.

“Emerging technologies can play a vital role in creating more sustainable and circular supply chains,” says Dinesh Rao, executive vice president and co-head of delivery at digital services and consulting company Infosys. “Recent strides in artificial intelligence and machine learning, blockchain, and other systems will help build the ability to deliver future-ready, resilient supply chains.”

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Unlocking the power of sustainability

According to UN climate experts, 2023 was the warmest year on record. This puts the heat squarely on companies to accelerate their sustainability efforts. “It’s quite clear that the sense of urgency is increasing,” says Jonas Bohlin, chief product officer for environmental, social, and governance (ESG) platform provider Position Green.

That pressure is coming from all directions. New regulations, such as the Corporate Sustainability Reporting Directive (CSRD) in the EU, require that companies publicly report on their sustainability efforts. Investors want to channel their money into green opportunities. Customers want to do business with environmentally responsible companies. And organizations’ reputations for sustainability are playing a bigger role in attracting and retaining employees.

On top of all these external pressures, there is also a significant business case for sustainability efforts. When companies conduct climate risk audits, for example, they are confronted with escalating threats to business continuity from extreme weather events such as floods, wildfires, and hurricanes, which are occurring with increasing frequency and severity.

Mitigating the risks associated with direct damage to facilities and assets, supply chain disruptions, and service outages very quickly becomes a high-priority issue of business resiliency and competitive advantage. A related concern is the impact of climate change on the availability of natural resources, such as water in drought-prone regions like the American Southwest.

Much more than carbon

“The biggest misconception that people have is that sustainability is about carbon emissions,” says Pablo Orvananos, global sustainability consulting lead at Hitachi Digital Services. “That’s what we call carbon tunnel vision. Sustainability is much more than carbon. It’s a plethora of environmental issues and social issues, and companies need to focus on all of it.”

Companies looking to act will find a great deal of complexity surrounding corporate sustainability efforts. Companies are responsible not only for the direct emissions of their own operations and fuel use (Scope 1), but also for the emissions of the energy they purchase (Scope 2) and those generated across their supply chains (Scope 3). New regulations require organizations to look beyond just emissions. Companies must ask questions about a broad range of environmental and societal issues: Are supply chain partners sourcing raw materials in an environmentally conscious manner? Are they treating workers fairly?
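
To make the scope accounting concrete, here is a minimal sketch of how an emissions inventory might be tallied by scope. The categories follow the standard Scope 1/2/3 breakdown described above; the records, figures, and helper function are invented for illustration.

```python
from collections import defaultdict

# Hypothetical emissions records (metric tons CO2-equivalent).
# Scope 1: direct emissions; Scope 2: purchased energy;
# Scope 3: suppliers and the rest of the value chain.
emissions_records = [
    {"source": "company fleet fuel",     "scope": 1, "tco2e": 1_200},
    {"source": "purchased electricity",  "scope": 2, "tco2e": 3_400},
    {"source": "supplier manufacturing", "scope": 3, "tco2e": 18_700},
    {"source": "upstream shipping",      "scope": 3, "tco2e": 5_100},
]

def footprint_by_scope(records):
    """Sum emissions per scope, plus an overall total."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["scope"]] += rec["tco2e"]
    totals["total"] = sum(r["tco2e"] for r in records)
    return dict(totals)

print(footprint_by_scope(emissions_records))
# -> {1: 1200.0, 2: 3400.0, 3: 23800.0, 'total': 28400}
```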

Sustainability can’t be siloed into one specific task, such as decarbonizing the data center. The only way to achieve sustainability is with a comprehensive, holistic approach, says Daniel Versace, an ESG research analyst at IDC. “A siloed approach to ESG is an approach that’s bound to fail,” he adds.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

The worst technology failures of 2023

Welcome to our annual list of the worst technologies. This year, one technology disaster in particular holds lessons for the rest of us: the Titan submersible that imploded while diving to see the Titanic.

Everyone had warned Stockton Rush, the sub’s creator, that it wasn’t safe. But he believed innovation meant tossing out the rule book and taking chances. He set aside good engineering in favor of wishful thinking. He and four others died. 

To us it shows how the spirit of innovation can pull ahead of reality, sometimes with unpleasant consequences. It was a phenomenon we saw time and again this year, like when GM’s Cruise division put robotaxis into circulation before they were ready. Was the company in such a hurry because it’s been losing $2 billion a year? Others find convoluted ways to keep hopes alive, like a company that is showing off its industrial equipment but is quietly still using bespoke methods to craft its lab-grown meat. The worst cringe, though, is when true believers can’t see the looming disaster, but we do. That’s the case for the new “Ai Pin,” developed at a cost of tens of millions, that’s meant to replace smartphones. It looks like a titanic failure to us. 

Titan submersible

This summer we were glued to our news feeds as drama unfolded 3,500 meters below the ocean’s surface. An experimental submersible with five people aboard was lost after descending to see the wreck of the Titanic.

[Image: The OceanGate submersible underwater. Credit: GDA via AP Images]

The Titan was a radical design for a deep-sea submersible: a minivan-size carbon fiber tube, operated with a joystick, that aerospace engineer Stockton Rush believed would open the depths to a new kind of tourism. His company, OceanGate, had been warned the vessel hadn’t been proved to withstand 400 atmospheres of pressure. His answer? “I think it was General MacArthur who said ‘You’re remembered for the rules you break,’” Rush told a YouTuber.

But breaking the rules of physics doesn’t work. On June 22, four days after contact was lost with the Titan, a deep-sea robot spotted the sub’s remains. It was most likely destroyed in a catastrophic implosion.

In addition to Rush, the following passengers perished:

  • Hamish Harding, 58, tourist
  • Shahzada Dawood, 48, tourist
  • Suleman Dawood, 19, tourist
  • Paul-Henri Nargeolet, 77, Titanic expert

More: The Titan Submersible Was “an Accident Waiting to Happen” (The New Yorker), OceanGate Was Warned of Potential for “Catastrophic” Problems With Titanic Mission (New York Times), OceanGate CEO Stockton Rush said in 2021 he knew he’d “broken some rules” (Business Insider)


Lab-grown meat

Instead of killing animals for food, why not manufacture beef or chicken in a laboratory vat? That’s the humane idea behind “lab-grown meat.”

The problem, though, is making the stuff at a large scale. Take Upside Foods. The startup, based in Berkeley, California, had raised more than half a billion dollars and was showing off rows of big, gleaming steel bioreactors.

But journalists soon learned that Upside was a bird in borrowed feathers. Its big tanks weren’t working; it was growing chicken skin cells in much smaller plastic laboratory flasks. Thin layers of cells were then being manually scooped up and pressed into chicken pieces. In other words, Upside was using lots of labor, plastic, and energy to make hardly any meat.

Samir Qurashi, a former employee, told the Wall Street Journal he knows why Upside puffed up the potential of lab-grown food. “It’s the ‘fake it till you make it’ principle,” he said.

And even though lab-grown chicken has FDA approval, there’s doubt whether lab meat will ever compete with the real thing. Chicken goes for $4.99 a pound at the supermarket. Upside still isn’t saying how much the lab version costs to make, but a few bites of it sell for $45 at a Michelin-starred restaurant in San Francisco.

Upside has admitted the challenges. “We signed up for this work not because it’s easy, but because the world urgently needs it,” the company says.

More: I tried lab-grown chicken at a Michelin-starred restaurant (MIT Technology Review), The Biggest Problem With Lab-Grown Chicken Is Growing the Chicken (Bloomberg), Insiders Reveal Major Problems at Lab-Grown-Meat Startup Upside Foods (Wired)


Cruise robotaxi

Sorry, autopilot fans, but we can’t ignore the setbacks this year. Tesla just did a massive software recall after cars using its Autopilot driver-assistance mode slammed into emergency vehicles. But the biggest reversal was at Cruise, the division of GM that became the first company to offer driverless taxi rides in San Francisco, day or night, with a fleet exceeding 400 cars.

Cruise argues that robotaxis don’t get tired, don’t get drunk, and don’t get distracted. It even ran a full-page newspaper ad declaring that “humans are terrible drivers.”

[Image: A Cruise vehicle parked on the street in front of a residential home as a person descends a front staircase in the background. Credit: Cruise]

But Cruise forgot that to err is human—not what we want from robots. Soon, it was Cruise’s sensor-laden Chevy Bolts that started racking up noticeable mishaps, including dragging a pedestrian for 20 feet. This October, the California Department of Motor Vehicles suspended GM’s robotaxis, citing an “unreasonable risk to public safety.”

It’s a blow for Cruise, which has since laid off 25% of its staff and parted ways with its CEO and cofounder, Kyle Vogt, a onetime MIT student. “We have temporarily paused driverless service,” Cruise’s website now reads. It says it’s reviewing safety and taking steps to “regain public trust.”

More: GM’s Self-Driving Car Unit Skids Off Course (Wall Street Journal), Important Updates from Cruise (Getcruise.com)


Plastic proliferation

Plastic is great. It’s strong, it’s light, and it can be pressed into just about any shape: lawn chairs, bobbleheads, bags, tires, or thread.

The problem is there’s too much of it, as Doug Main reported in MIT Technology Review this year. Humans make 430 million tons of plastic a year (significantly more than the weight of all people combined), but only 9% gets recycled. The rest ends up in landfills and, increasingly, in the environment. Not only does the average whale have kilograms of the stuff in its belly, but tiny bits of “microplastic” have been found in soft drinks, plankton, and human bloodstreams, and even floating in the air. The health effects of spreading microplastic pollution have barely been studied.

Awareness of the planetary scourge is growing, and some are calling for a “plastics treaty” to help stop the pollution. It’s going to be a hard sell. That’s because plastic is so cheap and useful. Yet researchers say the best way to cut plastic waste is not to make it in the first place.

More: Think your plastic is being recycled? Think again (MIT Technology Review),  Oh Good, Hurricanes Are Now Made of Microplastics (Wired)


Humane Ai Pin

The New York Times declared it Silicon Valley’s “Big, Bold Sci-Fi Bet” for what comes after the smartphone. The product? A plastic badge called the Ai Pin, with a camera, chips, and sensors.

[Image: Humane’s Ai Pin worn on a sweatshirt. Credit: Humane]

A device to wean us off our phone addiction is a worthy goal, but this blocky $699 pin (which also requires a $24-a-month subscription) isn’t it. An early review called the device, developed by the startup Humane, “equal parts magic and awkward.” Emphasis on the awkward. Users must speak voice commands to send messages or chat with an AI (a laser projector in the pin can also display information on your hand). It weighs as much as a golf ball, so you probably won’t be attaching it to a T-shirt.

It is the creation of husband-and-wife former Apple executives Bethany Bongiorno and Imran Chaudhri, who arrived at their product idea with the guidance of a Buddhist monk named Brother Spirit, raising $240 million and filing 25 patents along the way, according to the Times.

Clearly, there’s a lot of thought, money, and engineering involved in its creation. But as The Verge’s wearables reviewer Victoria Song points out, “it flouts the chief rule of good wearable design: you have to want to wear the damn thing.” As it is, the Ai Pin is neat, but it’s still no competition for the lure of a screen.

More: Can A.I. and Lasers Cure Our Smartphone Addiction? (New York Times), Screens are good, actually (The Verge)


Social media superconductor

A room-temperature superconductor is a material offering no electrical resistance. If it existed, it would make possible new types of batteries and powerful quantum computers, and bring nuclear fusion closer to reality. It’s a true Holy Grail.

So when a report emerged this July from Korea that a substance called LK-99 was the real thing, attention seekers on the internet were ready to share. The news popped up first in Asia, along with an online video of a bit of material floating above a magnet. Then came the booster fuel of social media hot takes.

[Image: A pellet of LK-99 being repelled by a magnet. Credit: Hyun-Tak Kim/Wikimedia]

“Today might have seen the biggest physics discovery of my lifetime,” said a post to X that has been viewed 30 million times. “I don’t think people fully grasp the implications … Here’s how it could totally change our lives.”

No matter that the post had been written by a marketer at a coffee company. It was exciting—and hilarious—to see well-funded startups drop their work on rockets and biotech to try to make the magic substance. Kenneth Chang, a reporter at the New York Times, dubbed LK-99 “the Superconductor of the Summer.”

But summer’s dreams soon ripped at the seams after real physicists couldn’t replicate the work. No, LK-99 is not a superconductor. Instead, impurities in the recipe could have misled the Korean researchers—and, thanks to social media, the rest of us too.

More: LK-99 Is the Superconductor of the Summer (New York Times), LK-99 isn’t a superconductor—how science sleuths solved the mystery (Nature)


Rogue geoengineering

Solar geoengineering is the idea of cooling the planet by releasing reflective materials into the atmosphere. It’s a fraught concept, because it won’t stop the greenhouse effect—only mask it. And who gets to decide to block the sun?

Mexico banned geoengineering trials early this year after a startup called Make Sunsets tried to commercialize the effort. Cofounder Luke Iseman had launched balloons in Mexico designed to disperse reflective sulfur dioxide into the sky. The startup is still selling “cooling credits” for $10 each on its website.

Injecting particles into the sky is theoretically cheap and easy, and climate warming is a huge threat. But moving too fast can create a backlash that stalls progress, according to my colleague James Temple. “They’re violating the rights of communities to dictate their own future,” one critic said.

Iseman remains unrepentant. “I don’t poll billions before taking a flight,” he has said. “I’m not going to ask for permission from every person in the world before I try to do a bit to cool Earth.” 

More: The flawed logic of rushing out extreme climate solutions (MIT Technology Review), Mexico bans solar geoengineering experiments after startup’s field tests (The Verge), Researchers launched a solar geoengineering test flight in the UK last fall (MIT Technology Review)

Augmenting the realities of work

Imagine an integrated workplace with 3D visualizations that augment presentations, interactive and accelerated onboarding, and controlled training simulations. This is the future of immersive technology that Blair MacIntyre, global head of Immersive Technology Research at JPMorgan Chase, is working to build. Augmented reality (AR) and virtual reality (VR) technologies can blend physical and digital dimensions together and infuse new innovations and efficiencies into business and customer experiences.

“These technologies can offer newer ways of collaborating over distance both synchronously and asynchronously than we can get with the traditional work technologies that we use right now,” says MacIntyre. “It’s these new ways to collaborate, ways of using the environment and space in new and interesting ways that will hopefully offer new value and change the way we work.”

Many enterprises are integrating VR into business practices like video conference calls. But having some participants in a virtual world and others sidelined creates imbalances in the employee experience. MacIntyre’s team is looking for ways to use AR/VR technologies that are additive, like 3D data visualizations that enhance financial forecasting within a bank, rather than ones that overhaul entire experiences.

Although the potential of AR/VR is quickly evolving, it’s unlikely that customers’ interactions or workplace environments will be entirely moved to the virtual world anytime soon. Rather, MacIntyre’s immersive technology research looks to infuse efficiencies into existing practices.

“It’s thinking about how the technologies integrate and how we can add value where there is value and not trying to replace everything we do with these technologies,” MacIntyre says.

AI can help remove some of the tedium that has made immersive technologies impractical for widespread enterprise use in the past. Using VR technology in the workplace can cut users off from note-taking, traditional input devices, and files. AI tools can take and transcribe notes and fill in other gaps to help remove that friction and eliminate redundancies.
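
As a small illustration of that kind of friction-reducing tooling, the sketch below transcribes a recorded VR meeting with the open-source Whisper speech-recognition model. The audio file name is hypothetical, and a production system would likely stream audio from the headset rather than batch-process a recording.

```python
# pip install openai-whisper
import whisper

# Load a small general-purpose speech-recognition model.
model = whisper.load_model("base")

# Transcribe a (hypothetical) recording captured during a VR session,
# turning spoken discussion into searchable meeting notes.
result = model.transcribe("vr_meeting.wav")

for segment in result["segments"]:
    # Each segment carries start/end timestamps and recognized text.
    print(f"[{segment['start']:7.1f}s] {segment['text'].strip()}")
```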

Connected Internet of things (IoT) devices are also key to enabling AR/VR technologies. To create a valuable immersive experience, MacIntyre says, it’s imperative to know as much as possible about the user’s surroundings, as well as their needs, habits, and preferences.

“If we can figure out more ways of enabling people to work together in a distributed way, we can start enabling more people to participate meaningfully in a wider variety of jobs,” says MacIntyre.

This episode of Business Lab is produced in association with JPMorgan Chase.

Full transcript

Laurel: From MIT Technology Review, I’m Laurel Ruma, and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Our topic today is emerging technologies, specifically, immersive technologies like augmented and virtual reality. Keeping up with technology trends may be a challenge for most enterprises, but it’s a critical way to think about future possibilities from product to customer service to employee experience. Augmented and virtual realities aren’t necessarily new, but when it comes to applying them beyond gaming, it’s a brave new world.

Two words for you: emerging realities.

My guest is Blair MacIntyre, who is the global head of Immersive Technology Research at JPMorgan Chase.

This podcast is produced in association with JPMorgan Chase.

Welcome, Blair.

Blair MacIntyre: Thank you. It’s great to be here.

Laurel: Well, let’s do a little bit of context setting. Your career has been focused on researching and exploring immersive technology, including software and design tools, privacy and ethics, and game and experience design. So what brought you to JPMorgan Chase, and could you describe your current role?

Blair: So before joining the firm, I had spent the last 23 years as a professor at Georgia Tech and Northeastern University. During that time, as you say, I explored a lot of ways that we can both create things with these technologies, immersive technologies and also, what they might be useful for and what the impacts on people in society and how we experience life are. But as these technologies have become more real, moved out of the lab, starting to see real products from real companies, we have this opportunity to actually see how they might be useful in practice and to have, for me, an impact on how these technologies will be deployed and used that goes beyond the traditional impact that professors might have. So beyond writing papers, beyond teaching students. That’s what brought me to the firm, and so my current role is, really, to explore that, to understand all the different ways that immersive technology could impact the firm and its customers. Right? So we think about not just customer-facing and not just products, but also employees and their experience as well.

Laurel: That’s really interesting. So why does JPMorgan Chase have a dedicated immersive technology focus in its global technology applied research division, and what are the primary goals of your team’s research within finance and large enterprises as a whole?

Blair: That’s a great question. So JPMorgan Chase has a fairly wide variety of research going on within the company. There’s large efforts in AI/ML, in quantum computing, blockchain. So they’re interested in looking at all of the range of new technologies and how they might impact the firm and our customers, and immersive technologies represent one of those technologies that could over time have a relatively large impact, I think, especially on the employee experience and how we interact with our customers. So they really want to have a group of people focusing on, really, looking both in the near and long term, and thinking about how we can leverage the technology now and how we might be able to leverage it down the road, and not just how we can, but what we should not do. Right? So we’re interested in understanding of these applications that are being proposed or people are imagining could be used. Which ones actually have value to the company, and which ones may not actually have value in practice?

Laurel: So when people think of immersive technologies like augmented reality and virtual reality, AR and VR, many think of headsets or smartphone apps for gaming and retail shopping experiences. Could you give an overview of the state of immersive technology today and what use cases you find to be the most innovative and interesting in your research?

Blair: So, as you say, I think many people think about smartphones, and we’ve seen, at least in movies and TV shows, head mounts of various kinds. The market, I would divide it right now into the two parts, the handheld phone and tablet experience. So you can do augmented reality now, and that really translates to we take the camera feed, and we can overlay computer graphics on it to do things like see what something you might want to buy looks like in your living room or do, in an enterprise situation, remote maintenance assistance where I can take my phone, point it at a piece of technology, and a remote expert could draw on it or help me do something with it.

There’s the phone-based things, and we carry these things in our pockets all the time, and they’re relatively cheap. So there’s a lot of opportunities when it’s appropriate to use those, but the big downside of those devices is that you have to hold them in your hands, so if you wanted to try to put information all around you, you would have to hold the device up and look around, which is uncomfortable and awkward. So that is where the head mount displays come in.

So either virtual reality displays which, right now, many of us think about computer games and education as use cases in the consumer world or augmented reality displays. These sorts of displays now let us do the same kind of things we might do with our phones, but we can do it without our hands having to hold something so we can be doing whatever work it was we wanted to do, right? Repairing the equipment, taking notes, working with things in the world around us, and we can have information spread all around us, which I think is the big advantage of head mounts.

So many of the things people imagine when they think about augmented reality in particular involve this serendipitous access to information. I’m walking into a conference room, and I see sort of my notes and information about the people I’m meeting there and the materials from our last meeting, whatever it is, or I’m walking down the street, and I see advertising or other kinds of, say, tourism information, but those things only work if the device is out of mind. If I can put it on, and then go about my life, I’m not going to walk into a conference room, and hold up a phone, and look at everybody through it.

So that, I think, is the big difference. You could implement the same sorts of applications on both the handheld devices and the head-worn devices, but the two different form factors are going to make very different applications appropriate for those two sorts of technologies.

On the virtual reality side, we’re at the point now where the displays we can buy are light enough and comfortable enough that we could wear them for half an hour, a couple hours without discomfort. So a lot of the applications that people imagine there, I think the most popular things that people have done research on and that I see having a near-term impact in the enterprise are immersive training applications where you can get into a situation rather than, say, watching a video or a little click-through presentation as part of your annual training. You could really be in an experience and hopefully learn more from it. So I think those sorts of experiences where we’re totally immersed and focused is where virtual reality comes in.

The big thing that I think is most exciting about head-worn displays in particular where we can wear them while we’re doing work as opposed to just having these ephemeral experiences with a phone is the opportunity to do things together, to collaborate. So I might want to look at a map on a table and see a bunch of data floating above the map, but it would be better if you and our other colleagues were around the table with me, and we can all see the same things, or if we want to take a training experience, I could be in there getting my training experience, but maybe someone else is joining me and being able to both offer feedback or guidance and so on.

Essentially, when I think about these technologies, I think about the parallels to how we do work regularly, right? We generally collaborate with people. We might grab a colleague and have them look at our laptop to show them something. I might send someone something on my phone, and then we can talk about it. So much of what we do involves interactions with other people and with the data that we are doing our job with that anything we do with these immersive technologies is really going to have to mimic that and give us the ability to do our real work in these immersive spaces with the people that we normally work with.
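
To ground the handheld AR workflow MacIntyre describes (taking a camera feed and overlaying computer graphics on it), here is a toy sketch using OpenCV. It draws a fixed annotation on live video; a real AR system would add tracking so the overlay stays anchored to an object, and the label text here is invented.

```python
# pip install opencv-python
import cv2

# Open the default camera, as a phone-based AR app would.
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Overlay computer graphics on the camera feed: a box and a label.
    # A real AR system would first track an object and place the
    # annotation relative to it; here the position is fixed.
    cv2.rectangle(frame, (100, 100), (320, 260), (0, 255, 0), 2)
    cv2.putText(frame, "replace this valve", (100, 90),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)

    cv2.imshow("toy AR overlay", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```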

Laurel: Well, speaking of working with people, how can the scale of an institution like JPMorgan Chase help propel this research forward in immersive technology, and what opportunities does it provide that are otherwise limited in a traditional university or startup research environment?

Blair: I think it comes down to a few different things. On one hand, we have the access to people who are really doing the things that we want to build technologies to help with. Right? So if I wanted to look at how I could use immersive visualization of data to help people in human resources do planning or help people who are doing financial modeling look at the data in new and interesting ways, now I could actually do the research in conjunction with the real people who do that work. Right? I’ve been at the firm for a little over a year, and in many conversations we’ve had, either we’ve had an idea or somebody has come to us with an idea. Through the course of the conversations, relatively quickly, we home in on things that are much more sophisticated, much more powerful than what we might have thought of at a university where we didn’t have that sort of direct access to people doing the work.

On the other hand, if we actually build something, we can actually test it with the same people, which is an amazing opportunity. Right? When I go to a conference, we’re going to put 20 people who actually represent the real users of those systems. So, for me, that’s where I think the big opportunity of doing research in an enterprise is, is building solutions for the real people of that enterprise and being able to test it with those people.

Laurel: Recent years have actually changed what customers and employees expect from enterprises as well, like omnichannel retail experiences. So immersive technologies can be used to bridge gaps between physical and virtual environments as you were saying earlier. What are the different opportunities that AR and VR can offer enterprises, and how can these technologies be used to improve employee and customer experience?

Blair: So I alluded back to some of that in previous answers. I think the biggest opportunities have to do with how employees within the organization can do new things together, can interact, and also how companies can interact with customers. Now, we’re not going to move all of our interactions with our customers into the virtual world, or the metaverse, or whatever you want to call it nowadays anytime soon. Right? But I think there are opportunities for customers who are interested in those technologies, and comfortable with them, and excited by them to get new kinds of experiences and new ways of interacting with our firm or other firms than you could get with webpages and in-person meetings.

The other big opportunity I think is as we move to a more hybrid work environment and a distributed work environment, so a company like JPMorgan Chase is huge and spread around the world. We have over 300,000 employees now in most countries around the world. There might be groups of people, but they’re connected together through video right now. These technologies, I think, can offer new ways of collaborating over distance both synchronously and asynchronously than we can get with the traditional work technologies that we use right now. So it’s those new ways to collaborate, ways of using the environment and space in new and interesting ways, that are going to, hopefully, offer new value and change the way we work.

Laurel: Yeah, and staying on that topic, we can’t really have a discussion about technology without talking about AI which is another evolving, increasingly popular technology. So that’s being used by many enterprises to reduce redundancies and automate repetitive tasks. In this way, how can immersive technology provide value to people in their everyday work with the help of AI?

Blair: So I think the big opportunity that AI brings to immersive technologies is helping ease a lot of the tedium and burden that may have prevented these technologies from being practical in the past, and this could happen in a variety of ways. When I’m in a virtual reality experience, I don’t have access to a keyboard, I don’t have access to traditional input devices, I don’t have necessarily the same sorts of access to my files, and so on. With a lot of the new AI technologies that are coming around, I can start relying on the computer to take notes. I can have new ways of pulling up information that I otherwise wouldn’t have access to. So, I think AI reducing the friction of using these technologies is a huge opportunity, and the research community is actively looking at that because friction has been one of the big problems with these technologies up till now.

Laurel: So, other than AI, what are other emerging technologies that can aid in immersive technology research and development?

Blair: So, aside from AI, if we step back and look at all of the emerging technologies as a whole and how they complement each other, I think we can see new opportunities. So, in our research, we work closely with people doing computer vision and other sort of sensing research to understand the world. We work closely with people looking at internet of things and connected devices because at a 10,000-foot level, all of these technologies are based on the idea of understanding, sensing the world, understanding what people are doing in it, understanding what people’s needs might be, and then somehow providing information to them or actuating things in the world, displaying stuff on walls or displays.

From that viewpoint, immersive technologies are primarily one way of displaying things in a new and interesting way and getting input from people, knowing what people want to do, allowing them to interact with data. But in order to do that, they need to know as much about the world around the user as possible, the structure of it, but also, who’s there, what we are doing, and so on. So all of these other technologies, especially the Internet of things (IoT) and other forms and ways of sensing what’s happening in the world, are very complementary and together can create new sorts of experiences that neither could do alone.

Laurel: So what are some of the challenges, but also, possible opportunities in your research that contrast the future potential of AR and VR to where the technology is today?

Blair: So I think one of the big limitations of technology today is that most of the experiences are very siloed and disconnected from everything else we do. During the pandemic, many of us experimented with how we could have conferences online in various ways, right? A lot of companies, small companies and larger companies, started looking at how you could create immersive meetings and big group experiences using virtual reality technology, but all of those experiences that people created were these closed systems that you couldn’t bring things into. So one of the things we’re really interested in is how we stop thinking about creating new kinds of experiences and new ways of doing things, and instead think about how do we add these technologies to our existing work practices to enhance them in some way.

So, for example: right now, we do video meetings. It would be more interesting for some people to be able to join those meetings, say, in VR. Companies have experimented with that, but most of the experiments that people are doing assume that everyone is going to move into virtual reality, or we’re going to bring, say, the people in as a little video wall on the side of a big virtual reality room, making them second-class citizens.

I’m really interested and my team is interested in how we can start incorporating technologies like this while keeping everyone a first-class participant in these meetings. As one example, a lot of the systems that large enterprises build, and we’re no different, are web-based right now. So if, let’s say, I have a system to do financial forecasting, you could imagine there’s a bunch of those at a bank, and it’s a web-based system, I’m really interested in how do we add the ability for people to go into a virtual reality or augmented reality experience, say, a 3D visualization of some kind of data at the moment they want to do it, do the work that they want to do, invite colleagues in to discuss things, and then go back to the work as it was always done on a desktop web browser.

So that idea of thinking of these technologies as a capability, a feature instead of a new whole application and way of doing things permeates all the work we’re doing. When I look down the road at where this can go, I see in, say, let’s say, two to five years, I see people with displays maybe sitting on their desk. They have their tablet and their phone, and they might also have another display or two sitting there. They’re doing their work, and at different times, they might be in a video chat, they might pick up a head mount and put it on to do different things, but it’s all integrated. I’m really interested in how we connect these together and reduce friction. Right? If it takes you four or five minutes to move your work into a VR experience, nobody is going to do it because it just is too problematic. So it’s that. It’s thinking about how the technologies integrate and how we can add value where there is value and not trying to replace everything we do with these technologies.

Laurel: So to stay on that future focus, how do you foresee the immersive technology landscape entirely evolving over the next decade, and how will your research enable those changes?

Blair: So, at some level, it’s really hard to answer that question. Right? So if I think back 10 years to where immersive technologies were, it would have been inconceivable for us to imagine the devices that are coming out. So, at some level, I can say, “Well, I have no idea where we’re going to be in 10 years.” On the other hand, it’s pretty safe to imagine the kinds of technologies that we’re experimenting with now just getting better, and more comfortable, and more easy to integrate into work. So I think the landscape is going to evolve in the near term to be more amenable to work.

Especially for augmented reality, the threshold that these devices would have to get to such that a lot of people would be willing to wear them all the time while they’re walking down the street, playing sports, doing whatever, that’s a very high bar because it has to be small, it has to be light, it has to be cheap, it has to have a battery that lasts all day, etcetera, etcetera. On the other hand, in the enterprise, in any business situation, it’s easy to imagine the scenario I described. It’s sitting on my desk, I pick it up, I put it on, I take it off.

In the medium term after that, I think we will see more consumer applications as people start solving more of the problems that are preventing people from wearing these devices for longer periods of time. Right? It’s not just size, and battery power, and comfort, it’s also things like optics. Right? A lot of people — not a lot, but say, let’s say 10%, 15% of people might experience headaches, or nausea, or other kinds of discomfort when they wear a VR display as they’re currently built, and a lot of that has to do with the fact that the optics you look through when you put on these displays are built in a way that makes it hard to comfortably focus on objects at different distances from you, without getting into the nitty-gritty details. For many of us, that’s fine. We can deal with the slight problems. But for some people, it’s problematic.

So as we figure out how to solve problems like that, more people can wear them, and more people can use them. I think that’s a really critical issue for not just consumers, but for the enterprise because if we think about a future where more of our business applications and the kind of way we work are done with technologies like this, these technologies have to be accessible to everybody. Right? If that 10% or 15% of people get headaches and feel nauseous wearing this device, you’ve now disenfranchised a pretty significant portion of your workforce, but I think those can be solved, and so we need to be thinking about how we can enable everybody to use them.

On the other hand, technologies like this can enfranchise more people, where right now, working remotely, working in a distributed sense is hard. For many kinds of work, it’s difficult to do remotely. If we can figure out more ways of enabling people to work together in a distributed way, we can start enabling more people to participate meaningfully in a wider variety of jobs.

Laurel: Blair, that was fantastic. It’s so interesting. I really appreciate your perspective and sharing it here with us on the Business Lab.

Blair: It was great to be here. I enjoyed talking to you.

Laurel: That was Blair MacIntyre, the global head of Immersive Technology Research at JPMorgan Chase, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review.

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the global director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.


This podcast is for informational purposes only and it is not intended as legal, tax, financial, investment, accounting or regulatory advice. Opinions expressed herein are the personal views of the individual(s) and do not represent the views of JPMorgan Chase & Co. The accuracy of any statements, linked resources, reported findings or quotations are not the responsibility of JPMorgan Chase & Co.

Customer experience horizons

Customer experience (CX) is a leading driver of brand loyalty and organizational performance. According to NTT’s State of CX 2023 report, 92% of CEOs believe improvements in CX directly impact productivity and customer brand advocacy. They also recognize that the quality of their employee experience (EX) is critical to success. The real potential for transforming business, according to 95% of CEOs, is bringing customer and employee experience improvements together into one end-to-end strategy. This, they anticipate, will deliver revenue growth, business agility, and resilience.

To succeed, organizations need to reimagine what’s possible with customer and employee experience and understand horizon trends that will affect their business. This MIT Technology Review Insights report explores the strategies and technologies that will transform customer experience and contact center employee experience in the years ahead. It is based on nearly two dozen interviews with customer experience leaders, conducted between December 2022 and April 2023. The interviews explored the future of customer experience and employee experience and the role of the contact center as a strategic driver of business value.

The main findings of this report are as follows:

  • Richly contextualized experiences will create mutual value for customers and brands. Organizations will grow long-term loyalty by intelligently using customer data to contextualize every interaction. They’ll gather data that serves a meaningful purpose past the point of sale, and then use that information to deliver future experiences that are more personalized than any competitor could provide. The value of data sharing will be evident to the customer, building trust and securing the relationship between individual and brand.
  • Brands will view every touchpoint as a relationship-building opportunity. Rather than view customer interactions as queries to be resolved as quickly and efficiently as possible, brands will increasingly view every touchpoint as an opportunity to deepen the relationship and grow lifetime value. Organizations will proactively share knowledge and anticipate customer issues; they’ll become trusted advisors and advocate on behalf of the customer. Both digital and human engagement will be critical to building loyal ongoing relationships.
  • AI will create a predictive “world without questions.” In the future, brands will have to fulfill customer needs preemptively, using contextual and real-time data to reduce or eliminate the need to ask repetitive questions. Surveys will also become less relevant, as sentiment analysis and generative AI provide deep insights into the quality of customer experiences and areas for improvement. Leading organizations will develop robust AI roadmaps that include conversational, generative, and predictive AI across both the customer and employee experience (a small illustration of the sentiment-analysis piece follows this list).
  • Work becomes personalized. Brands will recognize that humans have the same needs, whether as a customer or an employee. Those include being known, understood, and helped—in other words, treated with empathy. One size does not fit all, and leading organizations will empower employees to work in a way that meets their personal and professional objectives. Employees will have control over their hours and schedule; be routed interactions where they are best able to succeed; and receive personalized training and coaching recommendations. Their knowledge, experiences, and interests will benefit customers as they resolve complex issues, influence purchase decisions, or discuss shared values such as sustainability. This will increase engagement, reduce attrition, and manage costs.
  • The contact center will be a hub for customer advocacy and engagement. Offering the richest sources of real-time customer data, the contact center becomes an organization’s eyes and ears to provide a single source of truth for customer insights. Having a complete perspective of experience across the entire journey, the contact center will increasingly advocate for the customer across the enterprise. For many organizations, the contact center is already an innovation test bed. This trend will accelerate, as technologies like generative AI rapidly find application across a variety of use cases to transform productivity and strategic decision-making.
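
As a small illustration of the sentiment-analysis piece of such an AI roadmap, the sketch below scores customer messages with an off-the-shelf classifier from the Hugging Face transformers library. The example messages are invented; a real deployment would feed in contact-center transcripts and aggregate the scores over time.

```python
# pip install transformers torch
from transformers import pipeline

# Off-the-shelf sentiment classifier (downloads a default model).
sentiment = pipeline("sentiment-analysis")

# Invented examples standing in for real contact-center transcripts.
messages = [
    "The agent fixed my billing issue in two minutes. Fantastic.",
    "I've asked the same question three times and nobody answers.",
]

for msg, score in zip(messages, sentiment(messages)):
    # Each result has a label (POSITIVE/NEGATIVE) and a confidence score,
    # which can stand in for after-the-fact surveys as a quality signal.
    print(f"{score['label']:8} {score['score']:.2f}  {msg}")
```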

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Enabling enterprise growth with data intelligence

Data — how it’s stored and managed — has become a key competitive differentiator. As global data continues to grow exponentially, organizations face many hurdles, from piles of historical data and real-time streams from IoT sensors to building data-driven supply chains. Bharti Patel, senior vice president of product engineering at Hitachi Vantara, sees these challenges as an opportunity to create a better data strategy.

“Before enterprises can become data-driven, they must first become data intelligent,” says Patel. “That means knowing more about the data you have, whether you need to keep it or not, or where it should reside to derive the most value out of it.”

Patel stresses that the data journey begins with data planning that includes all stakeholders from CIOs and CTOs to business users. Patel describes universal data intelligence as enterprises having the ability to gain better insights from data streams and meet increasing demands for transparency by offering seamless access to data and insights no matter where it resides.

Building this intelligence means building a data infrastructure that is scalable, secure, cost-effective, and socially responsible. The public cloud is often lauded as a way for enterprises to innovate with agility at scale, while on-premises infrastructure is viewed as less accessible and user-friendly. But while data streams continue to grow, IT budgets are not, and Patel notes that many organizations that use the cloud are facing cost challenges. Combating this, says Patel, means combining the best of on-prem and cloud environments in private data centers to keep costs low but insights flowing.

Looking ahead, Patel foresees a future of total automation. Today, data resides in many places, from the minds of experts to documentation to IT support tickets, making it impossible for any one person to analyze all that data and glean meaningful insights.

“As we go into the future, we’ll see more manual operations converted into automated operations,” says Patel. “First, we’ll see humans in the loop, and eventually we’ll see a trend towards fully autonomous data centers.”

This episode of Business Lab is produced in partnership with Hitachi Vantara.

Full transcript

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Our topic today is building better data infrastructures. Doing just the basics with data can be difficult, but when it comes to scaling and adopting emerging technologies, it’s crucial to organize data, tear down data silos, and focus on how data infrastructure, which is so often in the background, comes to the front of your data strategy.

Two words for you: data intelligence.

My guest is Bharti Patel. Bharti is a senior vice president of product engineering at Hitachi Vantara.

This episode of Business Lab is sponsored by Hitachi Vantara.

Welcome, Bharti.

Bharti Patel: Hey, thank you Laurel. Nice to be with you again.

Laurel: So let’s start off with kind of giving some context to this discussion. As global data continues to grow exponentially, according to IDC, it’s projected to double between 2022 and 2026. Enterprises face many hurdles to becoming data-driven. These hurdles include, but aren’t of course limited to, piles of historical data, new real-time data streams, and supply chains becoming more data-driven. How should enterprises be evaluating their data strategies? And what are the markers of a strong data infrastructure?

Bharti: Yeah, Laurel, I can’t agree more with you here. Data is growing exponentially, and as per one of the studies that we conducted recently, where we talked to about 1,200 CIOs and CTOs from about 12 countries, we have more proof that data is almost going to double every two to three years. And I think what’s more interesting here is that data is going to grow, but budgets are not going to grow in the same proportion. So instead of worrying about it, I want to tackle this problem differently. I want to look at how we convert this challenge into an opportunity by deriving value out of this data. So let’s talk a little more about this in the context of what’s happening in the industry today.

I’m sure everyone by now has heard about generative AI and why generative AI, or gen AI, is a buzzword. AI has been in the industry forever. However, what has changed recently is that ChatGPT has exposed the power of AI to common people, right from school-going kids to grandparents, by providing a very simple natural language interface. And just to talk a little bit more about ChatGPT, it is the fastest-growing app in the industry. It touched 100 million users in just about two months. And what has changed because of this very fast adoption is that this has got businesses interested in it. Everyone wants to see how to unleash the power of generative AI. In fact, according to McKinsey, it’s going to add about $2.6 trillion to $4.4 trillion to the global economy. That means we are talking about big numbers here. But everyone’s talking about ChatGPT. What is the science behind it? The science behind it is the large language models.

And if you think of these large language models, they are AI models with billions or even trillions of parameters, and they are the science behind ChatGPT. However, to get the most out of these large language models, or LLMs, they need to be fine-tuned. Otherwise you’re just relying on public data, and you’re not getting the information you want, correct all the time. And of course there is a risk of people feeding bad data associated with it. So how do you make the most of it? Here is where your private data sets come in. Your proprietary data sets are very, very important here. And if you use this private data to fine-tune your models, I have no doubt in my mind that it will create differentiation for you in the long run to remain competitive.

So I think even with this, we’re just scratching the surface here when it comes to gen AI. And what more needs to be thought about for enterprise adoption is all the features that are needed like explainability, traceability, quality, trustworthiness, reliability. So if you again look at all these parameters, actually data is again the centerpiece of everything here. And you have to harness this private data, you have to curate it, and you have to create the data sets that will give you the maximum return on investment. Now, before enterprises can become data-driven, I think they must first become data intelligent.

And that means knowing more about the data you have, whether you need to keep it or not, or where it should reside to derive the most value out of it. And as I talk to more and more CIOs and CTOs, it is very evident that there’s a lot of data out there and we need to find a way to fix the problem. Because that data may or may not be useful, but you are storing it, you are keeping it, and you are spending money on it. So that is definitely a problem that needs to be solved. Then back to your question of, what is the right infrastructure, what are some of the parameters of it? So in my mind, it needs to be nimble, it needs to be scalable, trusted, secured, cost-effective, and finally socially responsible.
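
Here is a minimal sketch of the private-data fine-tuning Patel describes, using the Hugging Face transformers library on a small open causal language model. The corpus file, model choice, and training settings are assumptions for illustration, not a description of Hitachi Vantara’s stack.

```python
# pip install transformers datasets torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base_model = "distilgpt2"  # small open model, stands in for a larger LLM
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 family has no pad token
model = AutoModelForCausalLM.from_pretrained(base_model)

# Hypothetical proprietary corpus: one document per line.
dataset = load_dataset("text", data_files={"train": "private_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True,
                                 remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned", num_train_epochs=1,
                           per_device_train_batch_size=4),
    train_dataset=tokenized,
    # mlm=False -> standard next-token (causal) language modeling
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```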

Laurel: That certainly gives us a lot of perspective, Bharti. So customers are demanding more access to data and enterprises also need to get better insights from the streams of data that they’re accumulating. So could you describe what universal data intelligence is, and then how it relates to data infrastructure?

Bharti: Universal data intelligence is the ability for businesses to offer seamless access to data and insights irrespective of where it resides. So basically we are talking about getting full insights into your data in a hybrid environment. Also, on the same lines, we talk about our approach to infrastructure, which is a distributed approach. And what I mean by distributed is that you do as little data movement as possible, because moving data from one place to another is expensive. So what we are doing here at Hitachi Vantara is designing systems; think of it as an elastic fabric that ties it all together, and we are able to get insights from the data no matter where it resides, in a very, very timely manner. And this data could be in any format: structured or unstructured, and block, file, or object.

And just to give you an example of the same, recently we worked with the Arizona Department of Water Resources to simplify their data management strategy. They have data coming from more than 300,000 water resources, which means we are talking about huge data sets here. And what we did for them was design an intelligent data discovery and automation tool. In fact, we completed the data discovery, the metadata cataloging, and the platform migration in just two weeks with minimal downtime. And we are hearing all the time from them that they are really happy with it, and they’re now able to understand, integrate, and analyze the data sets to meet the needs of their water users, their planners, and their decision makers.

Laurel: So that’s a great example. So data and how it’s stored and managed is clearly a competitive differentiator as well. But although the amount of data is increasing, many budgets, as you mentioned, particularly IT budgets are not. So how can organizations navigate building a data infrastructure that’s effective and cost-efficient? And then do you have another example of how to do more with less?

Bharti: Yeah, I think that’s a great question. And this goes back to having data intelligence as the first step to becoming data-driven and reaping the full benefits of the data. So I think it goes back to you needing to know what exists and why it exists. And all of it should be available to the decision makers and the people who are working on the data at their fingertips. Just to give an example here, suppose you have data that you’re just retaining because you need to just retain it for legal purposes, and the likelihood of it being used is extremely, extremely low. So there’s no point in storing that data on an expensive storage device. It makes sense to transfer that data to a low cost object storage.

At the same time, you might have data that you need to access all the time, where speed and low latency are important; that kind of data needs to reside on fast NVMe storage. Many of our customers do this all the time, across all sectors. Through policies, they constantly transfer data from our highly efficient file systems to object storage, while retaining pointers in the file system so they can access the data again if they ever need it.
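That policy-driven tiering—demote cold files to object storage, leave a pointer behind—can be sketched in a few lines of Python. This is a minimal illustration, not Hitachi Vantara's implementation: the ObjectStore class, the 90-day threshold, and the .stub convention are all hypothetical.

```python
import os
import time

COLD_AGE_SECONDS = 90 * 24 * 3600  # hypothetical policy: untouched for 90 days

class ObjectStore:
    """Stand-in for a low-cost object store; a real system would use
    an S3-compatible API instead of an in-memory dict."""
    def __init__(self):
        self._blobs = {}

    def put(self, key: str, data: bytes) -> str:
        self._blobs[key] = data
        return f"objstore://{key}"

    def get(self, uri: str) -> bytes:
        return self._blobs[uri.removeprefix("objstore://")]

def tier_cold_files(directory: str, store: ObjectStore) -> None:
    """Move files not accessed recently to object storage, leaving a
    small pointer (stub) file behind in the fast file system."""
    now = time.time()
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if not os.path.isfile(path) or name.endswith(".stub"):
            continue
        if now - os.path.getatime(path) > COLD_AGE_SECONDS:
            with open(path, "rb") as f:
                uri = store.put(name, f.read())
            os.remove(path)
            with open(path + ".stub", "w") as f:
                f.write(uri)  # the pointer that lets us recall the data later

def recall(stub_path: str, store: ObjectStore) -> bytes:
    """Follow a stub's pointer to fetch the data back on demand."""
    with open(stub_path) as f:
        return store.get(f.read())
```

The point of the stub is that applications keep a stable path into the file system while the bytes themselves live on cheaper media.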

Laurel: So the public cloud is often cited as a way for enterprises to scale, be more agile, and innovate, while, by contrast, legacy on-premises infrastructure is seen as less user-friendly and accessible. How accurate is this perception, and how should enterprises approach data modernization and management of that data?

Bharti: Yeah, I’ve got to admit here that the public cloud and the hyperscalers have raised the bar in terms of what is possible when it comes to innovation. However, we are also seeing and hearing from our customers that the cost is a concern there. And in fact, many of our customers, they move to cloud very fast and now they’re facing the cost challenge. When their CIOs see the bills going exponentially up, they’re asking like, “Hey, well how could we keep it flat?” That’s where I think we see a big opportunity, how to provide the same experience that cloud provides in a private data center so that when customers are talking about partition of the data, we have something equivalent to offer.

And here, again, I have to say that we want to address this in a slightly different manner. We want to address it so that customers are able to take full advantage of the elasticity of the cloud and, at the same time, take full advantage of their on-prem environments. And we want to do it in an almost seamless manner: they can manage their data from their private data centers through to the cloud, and get the best of both worlds.

Laurel: An interesting perspective there, but this also kind of requires different elements of the business to come in. So from a leadership perspective, what are some best practices that you’ve instituted or recommended to make that transition to better data management?

Bharti: Yeah, I would say the data journey starts with data planning, which should not be done in a siloed manner. Getting it right from the onset is extremely important. At the beginning of your data planning, you've got to get all the stakeholders together—your CIO, your CTO, your business users. This strategy should never be made in a silo. And I do want to highlight another aspect, which people probably don't think about very much: how do you bring your partners into the mix? I have an example here. Prior to joining Hitachi Vantara, I was the CTO of an air purifier company. As we were defining our data strategy, we were looking at our Salesforce data, at the data in NetSuite, and at the customer tickets, all to see how we could drive marketing campaigns.

And as I was looking at this data, I felt that something was missing. What was missing was the weather data—not our data, but third-party data. For us to design effective marketing campaigns, it was very important to have insights into that weather data: for example, whether there were allergens in a particular region, or wildfires in a particular region. That data was so important. So having a strategy where you are able to bring all stakeholders and all parts of the data together, and to think about it from the beginning, is the right way to get started.

Laurel: And with big hairy problems and goals, there’s also this consideration that data centers contribute to an enterprise’s carbon emissions. Thinking about partnerships and modernizing data management and everything we’ve talked about so far, how can enterprises meet sustainability goals while also modernizing their data infrastructure to accommodate all of their historical and real-time data, especially when it comes from, as you mentioned, so many different sources?

Bharti: Yeah, I’m glad that you are bringing up this point because it’s very important not to ignore this. And in fact, with all the gen AI and all the things that we are talking about, like one fine-tuning of one model can actually generate up to five times the carbon emissions that are possible from a passenger car in a lifetime. So we’re talking about a huge, huge environmental effect here. And this particular topic is extremely important to Hitachi. And in fact, our goal is to go carbon-neutral with our operations by 2030 and across our value chain by 2050. And how we are addressing this problem here is kind of both on the hardware side and also on the software side. Right from the onset, we are designing our hardware, we are looking at end-to-end components to see what kind of carbon footprint it creates and how we could really minimize it. And in fact, once our hardware is ready, actually, it needs to pass through a very stringent set of energy certifications. And so that’s on the hardware side.

Now, on the software side, I have just started an initiative where we are looking at how we can move to modern languages that are likely to create a smaller carbon footprint. This is where we are looking at replacing our existing Java [code base] with Rust, wherever it makes sense. Again, this is a big problem we all need to think about; it cannot be solved overnight, but we have to keep addressing it in a phased manner.

Laurel: Well, those certainly are impressive goals. How can emerging technologies like generative AI, as you were saying before, help push an organization into the next generation of data infrastructure systems, and also help differentiate it from competitors?

Bharti: Yeah, I want to take a two-pronged approach here. First is what I call table stakes—if you don't do it, you'll be completely wiped out. These are simple things: how you automate certain tasks, how you create a better customer experience. But in my mind, that's not enough. You've got to think about what kind of disruptions you will create for yourself and for your customers. So a couple of the ideas we are working on here are companions, or copilots. Think of them as AI agents in the data center. These agents help the data center environment move from being reactive to being proactive.

So basically these agents are running in your data center all the time, watching whether there is a new patch available and whether you should update to it, or whether there's a new white paper with better insights for managing some of your resources. These agents are constantly acting in your data center. They are aware of what's going on on the internet, based on how you have designed them, and they're able to provide you with creative solutions. And I think that's going to be the disruption here, and that's something we are working on.
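As a rough sketch of that reactive-to-proactive shift, here is the shape such an agent's main loop might take in Python. The feed-checking functions and the recommendation format are hypothetical placeholders, not a description of Hitachi Vantara's copilots:

```python
import time

def check_patch_feed() -> list[str]:
    """Hypothetical: poll a vendor feed for newly released patches."""
    return []  # e.g., ["storage-firmware 4.2.1"]

def check_advisory_feed() -> list[str]:
    """Hypothetical: poll for new white papers or best-practice advisories."""
    return []

def agent_loop(poll_seconds: int = 3600) -> None:
    """Turn external signals into recommendations before an operator
    has to notice a problem—proactive rather than reactive."""
    while True:
        for patch in check_patch_feed():
            print(f"RECOMMEND: schedule rollout of {patch} in the next window")
        for advisory in check_advisory_feed():
            print(f"RECOMMEND: review new guidance: {advisory}")
        time.sleep(poll_seconds)
```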

Laurel: So looking to the future, what tools, technologies, or trends do you see emerging as more and more enterprises look to modernize their data infrastructure and really benefit from data intelligence?

Bharti: Again, I’ll go back to what I’m talking about, generative AI here, and I’ll give an example. For one of our customers, we are managing their data center, and I’m also part of that channel where we see constant back and forth between the support and the engineering. The support is asking, “Hey, this is what is happening, what should we be doing?” So just think of it like a different scenario that you have all this and you were able to collect this data and feed it into the LLMs. When you’re talking about this data, this data resides at several places. It resides in the heads of our experts. It is there in the documentation, it’s there in the support tickets, it’s there in logs, like life logs. It is there in the traces. So it’s almost impossible for a human being to analyze this data and get meaningful insights.

However, if we combine LLMs with the power of, say, knowledge graphs, vector databases, and other tools, it becomes possible to analyze this data at the speed of light and present a recommendation to the user through a very simple interface—in most cases, a simple natural language interface. That's a complete paradigm shift: from so many sources that you need to analyze constantly, to full automation. That's why I feel these copilots will become an essential part of data centers. In the beginning they'll help with automation, dealing with the problems prevalent in any data center, like resource management and optimization, and proactive problem determination and resolution. As we go into the future, we'll see more manual operations converted into automated operations—first with humans in the loop, and eventually a trend toward fully autonomous data centers.
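The combination Bharti describes—an LLM grounded in tickets, docs, and logs through a vector index—is essentially retrieval-augmented generation. Below is a deliberately library-free Python sketch of the retrieval half; the toy embed function and the commented-out ask_llm call are hypothetical stand-ins for a real embedding model and LLM API.

```python
import math

def embed(text: str) -> list[float]:
    """Toy bag-of-letters vector; a real system would call an embedding model."""
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Knowledge scattered across tickets, docs, and logs, indexed once.
corpus = [
    "Ticket 4211: array rebuild slow after disk replacement",
    "Doc: tuning rebuild priority on storage pools",
    "Log: pool-7 rebuild throughput degraded 40%",
]
index = [(doc, embed(doc)) for doc in corpus]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Rank indexed snippets by similarity to the question."""
    q = embed(question)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

question = "Why is the rebuild on pool-7 so slow?"
context = retrieve(question)
# A real pipeline would then ground the model's answer in the context:
# answer = ask_llm(question, context)  # ask_llm is a hypothetical LLM call
print(context)
```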

Laurel: Well, that is quite a future. Thank you very much for joining us today on the Business Lab.

Bharti: Thank you, Laurel. Bye-bye.

Laurel: That was Bharti Patel, senior vice president of Product Marketing at Hitachi Vantara, whom I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review.

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Turning medical data into actionable knowledge

Advances in imaging technologies are giving physicians unprecedented insights into disease states, but fragmented and siloed information technology systems make it difficult to provide the personalized, coordinated care that patients expect.

In the field of medical imaging, health care providers began replacing radiographic films with digital images stored in a picture archiving and communication system (PACS) in the 1980s. As this wave of digitization progressed, individual departments—ranging from cardiology to pathology to nuclear medicine, orthopedics, and beyond—began acquiring their own distinct IT solutions.

PACS remains an indispensable tool for viewing and interpreting imaging results, but leading health care providers are now beginning to move beyond PACS. The new paradigm brings data from multiple medical specialties together into a single platform, with a single user interface that strives to provide a holistic understanding of the patient and facilitate clinical reporting. By connecting data from multiple specialties and enabling secure, efficient access to relevant patient data, advanced information technology platforms can enhance patient care, simplify workflows for clinicians, and reduce costs for health care organizations—organizing data around patients rather than clinical departments.

Meeting patient expectations

Health care providers generate an enormous volume of data. Today, nearly one-third of the world's data volume is generated by the health care industry. Health care data is expanding at a 36% compound annual growth rate, outpacing even media and entertainment, whose data is growing at a 25% rate. This makes the need for comprehensive health care data management systems increasingly urgent.

The volume of health care industry data is only part of the challenge. Different data types stored in different formats create an additional hurdle to the efficient storage, retrieval, and sharing of clinically important patient data.

PACS was designed to view and store data in the Digital Imaging and Communications in Medicine (DICOM) standard, and a process known as "DICOM-wrapping" is used to give PACS access to patient information stored in PDF, MP4, and other file formats. Besides adding steps that impede efficient workflow, DICOM-wrapping makes it difficult for clinicians to work with a file in its native format. PACS users are given what is essentially a screenshot of an Excel file, which makes it impossible to use the data analysis features of the Excel software.
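To make the wrapping concrete: in the standard's Encapsulated PDF objects, the native document rides inside the DICOM dataset as a byte payload, and it only becomes analyzable again once extracted. A minimal sketch using the open-source pydicom library (the file names are hypothetical):

```python
import pydicom

# Read a DICOM object that wraps a native document (Encapsulated PDF).
ds = pydicom.dcmread("report_wrapped.dcm")  # hypothetical file

# The PDF is carried as raw bytes in the EncapsulatedDocument element.
pdf_bytes = ds.EncapsulatedDocument

# Unwrapping restores the file to its native, analyzable format.
with open("report.pdf", "wb") as f:
    f.write(pdf_bytes)
```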

With an open image and data management (IDM) system coupled with an intuitive reading and reporting workspace, patient data can be consolidated in one location instead of in multiple data silos, giving clinicians the information they need to provide the highest level of patient-centered care. In a 2017 survey by health insurance company Humana, patients said they aren't interested in the details of health care IT, but they are nearly unanimous in their expectations: 97% said their health care providers should have access to their complete medical history.

Adapting to clinical needs

To meet patient expectations, health care IT must offer providers and health systems flexibility—both in its initial setup and in its capacity to scale to meet evolving organizational demands.

A modular architecture enables health care providers and systems to tailor their system to their specific needs. Depending on clinical needs, health care providers can integrate specialist applications for reading and reporting, AI-powered functionalities, advanced visualization, and third-party tools. The best systems are scalable, so that they can grow as an organization grows, with the ability to flexibly scale hardware by expanding the number of servers and storage capacity.

A simple, unified UI shortens the learning curve across the organization, while the adoption of a single enterprise system helps reduce IT costs by enabling the consolidation and integration of previously distinct systems. Through password-protected data transfers, these systems can also facilitate communication with patients.

Many into one

One solution to the challenges and opportunities created by the growing volume of medical data is Siemens Healthineers' Syngo Carbon Core. It combines two elements: Syngo Carbon Space, a front-end workspace for reporting and routine reading, and, on the back end, Syngo Carbon IDM, a powerful and flexible IDM system. By combining these two elements, Syngo Carbon Core allows health care providers to manage data around patients, not departments.

Syngo Carbon Space brings imaging data, diagnostic software elements, and clinical tools together into a single, intuitive workspace for both routine and more complex cases. Customizable layouts allow clinicians to tailor their routine reader to their needs and preferences, with workflow optimization tools that maximize efficiency. In addition, organizations have the flexibility to use editable structured reporting templates or free-format reports. The translation of findings into coded and discrete data helps specialists generate patient-centered reports that guide clinical decision-making. Through the workspace, clinicians can also directly access Syngo Carbon's Advanced Visualization, which incorporates additional tools and AI-powered applications, without having to switch to another application.

On the back end of Syngo Carbon Core, robust IDM consolidates patient data and seamlessly integrates systems across an enterprise. Its open design enables the integration of existing DICOM Long Term Archives (LTAs), including legacy PACS systems. All data is kept in its native format—meaning that a PDF remains a PDF, for example—to ensure interoperability.

The growing volume of data generated in modern health care environments creates challenges, but also presents tremendous opportunities for delivering high-quality, personalized medicine. With comprehensive health care data management systems, health care providers can turn data into a strategic asset for their organizations and their patients.

This content was produced by Siemens Healthineers. It was not written by MIT Technology Review’s editorial staff.

Why embracing complexity is the real challenge in software today

Technology Radar is a snapshot of the current technology landscape produced by Thoughtworks twice a year; it's based on technologies we've been using as an organization and communicates our perspective on them. There is always a long list of candidates for us to work through and discuss, and with each edition that passes, the number of technologies the group discusses grows longer. It seems there are, increasingly, more ways to solve a problem. On the one hand this is a good thing—the marketplace is doing its job, offering a wealth of options for technologists. Yet on the other, it adds to our cognitive load: there are more things to learn about and evaluate.

It’s no accident that many of the most widely discussed trends in technology—such as data mesh and, most recently, generative AI (GenAI)—are presented as solutions to this complexity. However, it’s important that we don’t ignore complexity or see it as something that can be fixed: we need to embrace it and use it to our advantage.

Redistributing complexity

The reason we can’t just wish away or “fix” complexity is that every solution—whether it’s a technology or methodology—redistributes complexity in some way. Solutions reorganize problems. When microservices emerged (a software architecture approach where an application or system is composed of many smaller parts), they seemingly solved many of the maintenance and development challenges posed by monolithic architectures (where the application is one single interlocking system). However, in doing so microservices placed new demands on engineering teams; they require greater maturity in terms of practices and processes. This is one of the reasons why we cautioned people against what we call “microservice envy” in a 2018 edition of the Technology Radar, with CTO Rebecca Parsons writing that microservices would never be recommended for adoption on Technology Radar because “not all organizations are microservices-ready.” We noticed there was a tendency to look to adopt microservices simply because it was fashionable.

This doesn’t mean the solution is poor or defective. It’s more that we need to recognize the solution is a tradeoff. At Thoughtworks, we’re fond of saying “it depends” when people ask questions about the value of a certain technology or approach. It’s about how it fits with your organization’s needs and, of course, your ability to manage its particular demands. This is an example of essential complexity in tech—it’s something that can’t be removed and which will persist however much you want to get to a level of simplicity you find comfortable.

In terms of microservices, we’ve noticed increasing caution about rushing to embrace this particular architectural approach. Some of our colleagues even suggested the term “monolith revivalists” to describe those turning away from microservices back to monolithic software architecture. While it’s unlikely that the software world is going to make a full return to monoliths, frameworks like Spring Modulith—a framework that helps developers structure code in such a way that it becomes easier to break apart a monolith into smaller microservices when needed—suggest that practitioners are becoming more keenly aware of managing the tradeoffs of different approaches to building and maintaining software.

Supporting practitioners with concepts and tools

Because technical solutions have a habit of reorganizing complexity, we need to carefully attend to how this complexity is managed. Failing to do so can have serious implications for the productivity and effectiveness of engineering teams. At Thoughtworks we have a number of concepts and approaches that we use to manage complexity. Sensible defaults, for instance, are starting points for a project or piece of work. They’re not things that we need to simply embrace as a rule, but instead practices and tools that we collectively recognize are effective for most projects. They give individuals and teams a baseline to make judgements about what might be done differently.

One of the benefits of sensible defaults is that they can guard you against the allure of novelty and hype. As interesting or exciting as a new technology might be, sensible defaults anchor you in what matters to you. This isn't to say that new technologies like generative AI shouldn't be treated with enthusiasm and excitement—some of our teams have been experimenting with these tools and seen impressive results—but rather that adopting new tools needs to be done in a way that properly integrates with the way you work and what you want to achieve. Indeed, there is a wealth of approaches to GenAI, from high-profile tools like ChatGPT to self-hosted LLMs. Using GenAI effectively is as much a question of knowing the right way to implement it for you and your team as it is of technical expertise.

Interestingly, the tools that can help us manage complexity aren't necessarily new. One thing that came up in the latest edition of Technology Radar was risk-based failure modeling, a process used to understand the impact and likelihood of the various ways a system can fail, along with our ability to detect them. It has its origins in failure modes and effects analysis (FMEA), a practice dating back to the period following World War II, used in complex engineering projects in fields such as aerospace. This signals that some challenges endure; while new solutions will always emerge to combat them, we should also be comfortable looking to the past for tools and techniques.
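FMEA's core arithmetic is simple enough to sketch: each failure mode is scored for severity, likelihood of occurrence, and difficulty of detection, and the product—the risk priority number (RPN)—ranks where to spend mitigation effort. The failure modes and scores below are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (easily caught) .. 10 (almost undetectable)

    @property
    def rpn(self) -> int:
        # Classic FMEA risk priority number: higher means riskier.
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("payment API timeout", severity=7, occurrence=5, detection=3),
    FailureMode("silent data corruption", severity=9, occurrence=2, detection=9),
    FailureMode("stale cache served", severity=4, occurrence=6, detection=5),
]

for mode in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{mode.name}: RPN={mode.rpn}")
```

Silent data corruption tops this ranking not because it is frequent, but because it is severe and hard to detect—exactly the kind of insight the scoring is meant to surface.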

Learning to live with complexity

McKinsey’s argument that the productivity of development teams can be successfully measured caused a stir across the software engineering landscape. While having the right metrics in place is certainly important, prioritizing productivity in our thinking can cause more problems than it solves when it comes to complex systems and an ever-changing landscape of solutions. Technology Radar called this out with an edition with the theme, “How productive is measuring productivity?”This highlighted the importance of focusing on developer experience with the help of tools like DX DevEx 360. 

Focusing on productivity in the way McKinsey suggests can cause us to mistakenly see coding as the "real" work of software engineering, overlooking things like architectural decisions, tests, security analysis, and performance monitoring. This is risky—organizations that adopt such a view will struggle to see tangible benefits from their digital projects. This is why the key challenge in software today is embracing complexity: not treating it as something to be minimized at all costs, but as a challenge that requires thoughtfulness in processes, practices, and governance. The key question is whether the industry realizes this.

This content was produced by Thoughtworks. It was not written by MIT Technology Review’s editorial staff.

Optimizing platforms offers customers and stakeholders a better way to bank

When it comes to banking, whether it's personal, business, or private, customer experience is everything. Building new technologies and platforms, employing them at scale, and optimizing workflows are especially critical for any large bank looking to meet evolving customer and internal stakeholder demands for faster and more personalized ways of doing business. Institutions like JPMorgan Chase are implementing best practices, cost-efficient cloud migration, and emerging AI and machine learning (ML) tools to build better ways to bank, says Vrinda Menon, Head of Managed Accounts, Client Onboarding and Client Services Technology at J.P. Morgan Private Bank.

Menon stresses that it is critical that technologists stay very focused on the business impact of the software and tools they develop.

“We coach our teams that success and innovation does not come from rebuilding something that somebody has already built, but instead from leveraging it and taking the next leap with additional features upon it to create high impact business outcomes,” says Menon.

At JPMorgan Chase, technologists are encouraged, where possible, to see the bigger picture and solve for the larger pattern rather than just the singular problem at hand. To reduce redundancies and automate tasks, Menon and her team focus on data and measurements that indicate where emerging technologies like AI and machine learning could enhance processes like onboarding or transaction processing at scale. 

AI/ML has become commonplace across many industries, private banking being no exception, says Menon. At a base level, AI/ML can extract data from documents, classify information, analyze data intelligently, and detect issues and outliers across a wide range of use cases. But Menon is looking to the near future, when AI/ML can help proactively predict client needs based on various signals. For example, a private banking client who has recently been married may ask their bank for a title change. Using the client's data in context along with this new request, AI/ML tools could proactively help bankers identify additional things to ask the client, such as the need to update beneficiaries or the opportunity to optimize taxes by filing jointly.

“You have an opportunity to be more proactive and think about it holistically so you can address their needs before they even come to you to ask for that level of engagement and detail,” says Menon.

This episode of Business Lab is produced in association with JPMorgan Chase. 

Full transcript

Laurel Ruma: From MIT Technology Review. I’m Laurel Ruma and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Our topic today is investing in building great experiences. A number of people benefit from enterprise investment in emerging and new technologies, including customers who want better, faster, and newer ways of doing business. Internal stakeholders, meanwhile, want the same investment in better tools and systems to build those faster and newer ways of doing business. Balancing both needs is possible.

Two words for you: optimizing platforms.

Today we’re talking with Vrinda Menon, the chief technology officer of Managed Accounts, Client Onboarding and Client Services at JPMorgan Private Bank.

This podcast is produced in association with JPMorgan Chase.

Welcome, Vrinda.

Vrinda Menon: Thank you so much, Laurel. I’m looking forward to this discussion.

Laurel: Great. So, let’s start with how often people think of JPMorgan Chase. They likely associate the company with personal banking, ATMs and credit cards, but could you describe what services the private bank provides and how operations and client services have evolved and transformed since you began your role at JPMorgan Chase?

Vrinda: Sure. JPMorgan Chase indeed does far more than personal banking, credit cards, and ATMs. The private bank of JPMorgan Chase is often referred to as the crown jewel of our franchise. We service our high-net-worth and ultra-high-net-worth clients across the globe. We provide them services like investment management, trust and estate planning, banking services, brokerage services, and customized lending, just to name a few. In terms of what has transformed in recent years since I joined, I would say that we've become far more tech savvy as an organization, thanks in no small measure to new leadership in operations and client services as well. I think three things have changed very dramatically since I joined. The first is culture. In my first few months, I spent a week doing the job of an operations analyst. In doing that, I started to understand firsthand the painful manual work people were subject to, and how they felt they did not have permission to have things changed for them.

Working off that, and connecting with a lot more people on the ground who do these types of activities, we worked with them to make those changes and helped them see light at the end of the tunnel. Suddenly the demand for more change and more automation started building as a groundswell, with support from our partners in operations and services. Now, routine, repetitive, mundane, mind-numbing work is not an option at the table. It's become a thing of the past. Secondly, we've grown an army of citizen developers who have access to tools and technologies that let them do quick automation without having to depend on broader programs and broader pieces of technology. We've also done something super interesting: over the past three years, we've taken every new analyst in the private bank and trained them on Python.

And so, they’ve started to see the benefits of doing things themselves. So, culture change I think has been one of the biggest things that we’ve achieved in the past few years since I joined. Second, we built a whole set of capabilities, we call them common capabilities. Things like how do you configure new workflows? How do you make decisions using spreadsheets and decision models versus coding it into systems? So,  you can configure it, you can modify it, and you can do things more effectively. And then tools like checklists, which can be again put into systems and automated in a few minutes, in many cases. Today, we have millions of tasks and millions of decisions being executed through these capabilities, which has suddenly game-changed our ability to provide automation at scale.

And last but not least, AI and machine learning now play an important role in the underpinnings of everything we do in operations and client services. For example, we do a lot of process analytics. We do load balancing: when a client calls, which agent or group of people do we direct that call to so they can service the client most effectively? In the space of payments, we do a lot with machine learning. Fraud detection is another. And I will say that I'm so glad we've had the time to invest in and think through all of these foundational capabilities. We are now poised and ready to take on the next big leap of changes that are right at our fingertips, especially in the evolving world of AI and machine learning and, of course, the public cloud.
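The "decision models versus coding it into systems" idea Menon mentions can be sketched simply: the rules live in data (below, a list of rows standing in for a spreadsheet), so changing behavior means editing the table rather than the system. The rules, fields, and thresholds here are invented for illustration:

```python
# Each row of the decision table: (condition on the request, decision).
DECISION_TABLE = [
    (lambda r: r["amount"] <= 1_000, "auto-approve"),
    (lambda r: r["amount"] <= 50_000 and r["kyc_complete"], "approve"),
    (lambda r: True, "route-to-human"),  # catch-all default rule
]

def decide(request: dict) -> str:
    """Evaluate rows in order; the first matching rule wins."""
    for condition, decision in DECISION_TABLE:
        if condition(request):
            return decision
    return "route-to-human"

print(decide({"amount": 500, "kyc_complete": False}))    # auto-approve
print(decide({"amount": 20_000, "kyc_complete": True}))  # approve
print(decide({"amount": 80_000, "kyc_complete": True}))  # route-to-human
```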

Laurel: Excellent. Yeah, you’ve certainly outlined the diversity of the firm’s offerings. So, when building new technologies and platforms, what are some of the working methodologies and practices that you employ to build at scale and then optimize those workflows?

Vrinda: Yeah, as I said before, the private bank has a lot of offerings, but amplify that with all the other offerings of the JPMorgan Chase franchise—a commercial bank, a corporate and investment bank, a consumer and community bank—and many of our clients cross all of these lines of business. It brings a lot of benefits, but it also brings complexities. One of the things I personally obsess over is how we simplify things, not add to the complexity. Second is a mantra of reuse. Don't reinvent, because it's easy for technologists to look at a piece of software and say, "That's great, but I can build something better." Instead, there are three things that I ask people to focus on, and that our organization collectively focuses on with our partners. First of all, look at the business outcome. We coach our teams that success and innovation does not come from rebuilding something that somebody has already built, but instead from leveraging it and taking the next leap with additional features upon it to create high impact business outcomes.

So, focusing on outcome is number one. Second, if you are given a problem, try to look at it from a bigger picture to see whether you can solve the pattern instead of that specific problem. I'll give you an example. We built a chatbot called Casey. It's one of the most loved products in our private bank right now. Casey doesn't do anything really complex, but what it does is solve a very common pattern: ask a few simple questions, get the inputs, join this with data services, join this with execution services, and complete the task. We have hundreds of thousands of tasks that Casey performs every single day. One of them is a very simple piece of functionality: the client wants a bank reference letter. Casey is called upon to do that thousands of times a month. And what used to take three or four hours to produce now takes a few seconds.

So, it suddenly changes the outcome, changes productivity, and changes the happiness of people who were doing things that they themselves felt were mundane. So, solving the pattern, again, is important. And last but not least, focusing on data is the other thing that's helped us. Nothing can be improved if you don't measure it. To give you an example with processes: the first thing we did was pick the most complex processes and map them out. We understood each step in the process, the purpose of each step, and the time taken in each step, and we started to question things. Do you really need this approval from this person? We observed that for the past six months, not one single thing has been rejected. So, is that even a meaningful approval to begin with?

We questioned whether a process could be enhanced with AI: could AI automatically say, "Yes, please approve," or "There's a risk in this, do not approve," or "It's okay, but it needs a human review"? Then we made those changes in our systems and flows, and obsessively measured the impact of those changes. All of this has given us a lot of benefits. And I would say we've made significant progress with just these three principles—focus on outcome, focus on solving the pattern, and focus on data and measurements—in areas like client onboarding and maintaining client data, et cetera. This has been very helpful for us, because in a bank like ours, scale is super important.
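The Casey pattern Menon describes—gather a few inputs, join them with data services, hand off to execution services—is generic enough to sketch. The functions below are hypothetical placeholders, not JPMorgan Chase's systems:

```python
def gather_inputs() -> dict:
    """Step 1: ask a few simple questions (hypothetical canned answers)."""
    return {"client_id": "C-1024", "request": "bank_reference_letter"}

def data_service(client_id: str) -> dict:
    """Step 2: join the request with client data (hypothetical lookup)."""
    return {"name": "Jane Doe", "client_since": 2012}

def execution_service(request: str, client: dict) -> str:
    """Step 3: complete the task (hypothetical document generation)."""
    return (f"Reference letter for {client['name']}, "
            f"a client since {client['client_since']}.")

def handle_task() -> str:
    """The whole pattern: inputs -> data services -> execution services."""
    inputs = gather_inputs()
    client = data_service(inputs["client_id"])
    return execution_service(inputs["request"], client)

print(handle_task())
```

Because each step is a generic join rather than letter-specific logic, the same skeleton can serve hundreds of different task types.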

Laurel: Yeah, that’s a really great explanation. So, when new challenges do come along, like moving to the public cloud, how do you balance the opportunities of that scale, but also computing power and resources within the cost of the actual investment? How do you ensure that the shifts to the cloud are actually both financially and operationally efficient?

Vrinda: Great question. Obviously every technologist in the world is super excited about the advent of the public cloud. It gives us agility and economies of scale. We at JPMorgan Chase are able to leverage world-class, evolving capabilities at our fingertips. We also have the ability to partner with talented technologists at the cloud providers and at the many service providers we work with, who have advanced solutions that are available first on the public cloud. We are eager to get our hands on those. But with that comes a lot of responsibility, because as a bank, we have to worry about security, client data, privacy, resilience, and how we are going to operate in a multi-cloud environment, because some data has to remain on-prem in our private cloud. So, there's a lot of complexity, and we have engineers across the board who think a lot about this; their day and night jobs are to figure this out.

As we think about moving to the public cloud in my area, I personally spend time thinking in depth about how we could build architectures that are financially efficient. The reason I bring that up is that, traditionally, in the data centers where our hardware and software have been hosted, developers and architects haven't had to worry about costs: you start by sizing the infrastructure, you order it, it's captive, it remains in the data center, and you can expand it, but it's a one-time cost each time you upgrade. With the cloud, that situation changes dramatically. It's both an opportunity and a risk. So, a financial lens becomes super important right at the outset. Let me give you a couple of examples of what I mean. Developers in the public cloud have a lot of power, and with that power comes responsibility.

So, I’m a developer and my application is not working right now because there’s some issue. I have the ability to actually spin up additional processes. I have the ability to spin up additional environments, all of which attract costs, and if I don’t control and manage that, the cost could quickly pile up. Data storage, again, we had fixed storage, we could expand it periodically in the data centers, but in the public cloud, you have choices. You can say data that’s going to be slowly accessed versus data that’s going to be accessed frequently to be stored in different types of storage with different costs as a result. Now think about something like a financial ledger where you have retention requirements of let’s say 20 years. The cost could quickly pile up if you store it in the wrong type of storage. So, there’s an opportunity to optimize cost there, and if you ignore it and you’ve not kept an eye on it, you could actually have costs that are just not required.

To do this right, we have to ask developers, architects, and our engineers to not just think about the best performance, the most optimal resilience, but also think about cost as a fundamental aspect of how we look at architectures. So, this for me is a huge area of focus, starting with awareness for our people, training our people, thinking about architecture patterns, solution patterns, tooling, measurements, so that we completely stay on top of this and become more effective and more efficient in how we get to the public cloud. While the journey is exciting, I want to make sure that as we land there, we land safely and optimally from a cost standpoint.
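To put rough numbers on the ledger example Menon gives: with illustrative per-gigabyte prices (not any provider's actual rates), the gap between hot and archive tiers over a 20-year retention window is dramatic. A back-of-the-envelope sketch:

```python
TB = 1.0          # ledger size in terabytes (illustrative)
YEARS = 20        # retention requirement from the example
MONTHS = YEARS * 12

# Illustrative per-GB monthly prices; real rates vary by provider and region.
PRICE_PER_GB_MONTH = {"hot": 0.023, "cool": 0.010, "archive": 0.001}

gb = TB * 1000
for tier, price in PRICE_PER_GB_MONTH.items():
    total = gb * price * MONTHS
    print(f"{tier:>7}: ${total:,.0f} over {YEARS} years")

# With these prices, rarely-read data parked in the hot tier costs
# roughly 23x what the archive tier would.
```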

Laurel: And especially in your position, thinking about how technology will affect the firm years into the future is critical. As emerging technologies like AI and machine learning become more commonplace across industries, could you offer an example of how you're using them in the areas that you cover?

Vrinda: Yeah, certainly. We use AI/ML at many levels of complexity. Let me start with the base case. AI/ML, especially in operations and client services, starts with: can I get data from documents? Can I run OCR—optical character recognition—on those documents? Can I get information out of them, can I classify it, can I perform analytics on it? That's the base case. On top of that, look at data such as payments or transactions, where human beings used to scan for issues; outlier detection techniques with AI/ML are also table stakes now, and many of our systems do that. As you move on to the next level, prediction, what we've been able to do is build models for situations like a client calling: the client has all these types of cases in progress right now—what else could they be calling about?

The client expressed sentiment about something they were not happy with two weeks ago. Is it likely that they're calling about this? Can I have that information at the fingertips of the client service agent so they can look at it and respond as soon as the client asks for something? And think about the next stage of evolution: the client comes to us and says, "Change my title because I just got married." Typically, in a transactional kind of activity, you would respond to the client and fix the title from, let's say, Ms. to Mrs. if that's what they asked you to do. But imagine if, when they came to do that, we said to them: here are 10 other things you should possibly think about now that you've said you got married. Congratulations. Do you want to update your beneficiaries? Do you want to change something in your tax planning? Do you want to change the type of tax calculations you do, to optimize now that you're married and you and your spouse could be filing jointly? Not that the client will necessarily choose to change those things, but you have an opportunity to be more proactive and think about it holistically, so you can address their needs before they even come to you asking for that level of engagement and detail.

We also exploit a lot of AI/ML capabilities in client onboarding, to get better data, to start to predict what data is right, and to predict risk. And for our next leap, I believe strongly in, and am super excited about, large language models, which I think are going to offer us exponential possibilities—not just at JPMorgan Chase but, as you can see, in the world right now, with technologies like ChatGPT and OpenAI's other technologies, as well as the other publicly available large language models being developed every single day.
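The outlier detection Menon calls table stakes can be surprisingly simple at its core. The sketch below flags anomalous payment amounts with a modified z-score based on the median absolute deviation, which stays robust even when the outlier itself inflates the spread; the amounts are made up:

```python
import statistics

amounts = [120.0, 95.5, 130.2, 110.8, 99.9, 10250.0, 105.4]  # made-up payments

med = statistics.median(amounts)
mad = statistics.median(abs(a - med) for a in amounts)

def modified_z(a: float) -> float:
    """Robust z-score: uses median/MAD instead of mean/stdev."""
    return 0.6745 * (a - med) / mad

# Flag anything whose robust score exceeds the conventional 3.5 cutoff.
outliers = [a for a in amounts if abs(modified_z(a)) > 3.5]
print(outliers)  # [10250.0]
```

Production systems would use richer features and learned models, but the shape—score every transaction, flag the extremes—is the same.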

Laurel: Well, it’s clear that AI offers great opportunities for optimizing platforms and transformations. Could you describe the process of how JPMorgan Chase decided to create dedicated teams for AI and machine learning, and how did you build out those teams?

Vrinda: Yeah, certainly. At JPMorgan Chase, we've been cultivating the mindset for some years now to think AI-first while hiring people. We leverage the best talent in the industry, and we've hired a lot of people in our research divisions as well to work on AI/ML. We've got several thousand technologists focused on AI. For me personally, in 2020, during the first months of the pandemic, I decided that I needed to see more AI/ML activity across my areas. So, I ran what I called the "Summer of AI/ML," a fully immersive program over 12 weeks with training for our people, and it was not full-time. They would dial in for a couple of hours, get trained on an AI/ML concept and some techniques, and then continue practicing that for the week.

Then we had ideation sessions with our users for a couple of weeks, and then a hackathon, and some brilliant ideas came out of it. But when I stepped back and looked at the results a few months later, I realized that many of the ideas had not reached their final destination in production. Thinking a little more deeply about that, I understood we had a problem. The problem was as follows: while AI is a great thing and everybody appreciates it, until AI becomes ingrained in everybody's brain as the first thing to think about, there's always going to be a healthy tension between choosing the next best feature on a product, which is very deterministic—say, add this button here, or add these features using conventional technologies like Java—and game-changing the product using AI, which is a bit more of a risk: the results are not always predictable, and it requires experimentation and R&D.

And so, when you have a choice between incremental changes that are deterministic and changes that are more probabilistic, people tend to take the most certain path. I decided that I needed to build a focused, dedicated team of data scientists who were just going to obsess over solving problems in the space of data science, and to embed them across the products we were building. And now the results are starting to speak for themselves: the work they've done is phenomenal, the demand on them is growing every single day—to the point where I've grown the team—and the value they provide is measured and visible to the broader organization.

Laurel: So, in JPMorgan Chase’s client services, customer experience is clearly a driving force. How do you ensure that your teams are providing clients, especially those high-net-worth private clients that have high expectations of service with services that then meet their banking and account management needs?

Vrinda: So, we obsess over customer experience, starting from the CEO down to every single employee. I have three tenets for my team: number one is client experience, the second is user experience, and third is engineering excellence. And they know that a lot of us are measured by how well we service our clients. In the private bank specifically, in addition to reviewing our core capabilities—our case management system, our voice recognition systems, our fraud capture systems, all of that—we continuously analyze data received from client surveys and from every single interaction we have with our clients across all channels, whether it's a voice channel, emails, or the things the client types on our websites and the places they access. And our models do not just look at sentiment; they also look at client experience.

As they look at experience, the things we are trying to understand are, first of all: how is the client feeling in this interaction? But more important: are client one and client two and client three feeling the same way about a particular aspect of our process? Do we need to change that process as a result, or does more training need to be provided to our agents because we are not able to fully satisfy this category of requests? By doing that continuously and analyzing it—and, back to the point I made earlier, by measuring it constantly—we are able to say how the experience was to begin with, how it is now, and, after making changes through training programs or fixes in our systems, how it is trending. Some of the other things we are able to do are look at experiences over a period of time.

So, for example, the client came to us last year, and their experience, based on the measurements we did, was at a certain level; they continue to interact with us over a period of months. Has it gone up? Has it gone down? How is that needle trending? How do we take that to superb? We've been able to use these insights to prevent complaints, for example, and to get things escalated to the right people in the organization—especially in the servicing space, where we can triage and manage issues more effectively, because we are a high-touch client business, and we need to make sure our clients are extremely happy with us.

Laurel: Oh yeah, absolutely. And there's another dimension to customer experience and customer service: building a workforce that can deliver it. So here we're going to talk a bit about promoting diversity, which has been a tenet of your career; you currently sit on the board of the Transition Network, a nonprofit that empowers a diverse network of women throughout career transitions. At JPMorgan Chase, how do you grow talent and improve representation across the company? And how does that help build better customer experiences?

Vrinda: Sure, that’s a great question. I certainly am very passionate about diversity, and during the past 15 years of my career, I’ve spent a lot of time supporting diversity. In my prior firm, I was co-head of the Asian Professional Network. Then subsequently for the past three years, I’ve been a board member at the Transition Network, which is all about women in transition. Meaning as they grow out of their careers into retirement and into other stages of life, how do we help them transition? And then here at JPMorgan Chase, I’m the sponsor for what is called the Take It Forward initiative, which is an initiative that supports 15,000 women technologists. JPMorgan Chase, as you know, does a broad range of activities in the area of diversity across all kinds of business resource groups, and we invest a lot of time and energy.

But specifically, the Take It Forward initiative that I sponsor plays a key role in helping these 15,000 women technologists continuously enhance their leadership skills, grow their technical skills, build their confidence, develop networks, learn from senior sponsors and mentors, and grow their careers, all of which makes their work experience very enriching. When I hear things like "I'm motivated," "I get new energy interacting with these senior women," "I trust my personal power more," "I'm confident to negotiate with my manager for a better role," or "I feel confident that I can discuss my compensation," it makes me really happy. And when they say, "I stay at JPMorgan Chase because of Take It Forward," it brings tears to my eyes. It's really one of the most amazing volunteer-driven initiatives in this organization. A lot of people pour passion, energy, and time into making it succeed, and the initiative has won many external awards as well.

I strongly believe all of these efforts are critical, because when people's experiences change and they're happy, what they do becomes that much more effective. It changes how we work internally and how we present ourselves externally, and it game changes our business outcomes. I've seen it in problem-solving meetings: when you evaluate risk and you bring in people from diverse backgrounds—some more risk-averse, some more risk-taking—you suddenly see that dynamic play out, and the outcome becomes much different from what it would have been without all those people in the mix. So overall, I strongly believe in this, and I've seen it play out in every single firm I've ever worked at. That's my take on diversity and how it helps us.

Laurel: Well, it certainly is important work, especially as it ties so tightly to the firm's own ethos. So, Vrinda, looking forward, how do you envision the future of private banking and client management as emerging technologies become more prevalent and enterprises shift their infrastructure to the public cloud?

Vrinda: As I mentioned earlier, I see the next set of emerging technologies taking the world on a super exciting ride, and I think it's going to be as transformational as the advent of the world wide web. Just take the example of large language models. The areas most likely to be disrupted first will be any work that involves content creation, because that is table stakes for a large language model. Expanding that to the rest of my work—client services and operations and many other areas that require repetitive work and large-scale interpretation and synthesis of information—that, again, is table stakes for large language models.

Expand that now to the next evolution, which is agents—the emerging technology built on large language models. When agents are provided with a suite of tools and can use reasoning, like humans, to decide which tool to execute based on the input, that will game change everything I was talking about earlier around workflow, task execution, and operationally intense activities in the organization. And when I look at myself as a software developer, areas like code generation, code testing, code correction, and test data generation, just to name a few, are all going to be game changed. So not just the work our users do in the private bank, but the work we do as technologists in the private bank—a lot of that is going to change dramatically.
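A toy illustration of the tool-using agent pattern Menon describes. Here the "reasoning" step is a trivial keyword match standing in for an LLM's decision over tool descriptions; the tools and task are invented:

```python
def generate_code(task: str) -> str:
    """Hypothetical code-generation tool."""
    return f"# generated code for: {task}"

def write_tests(task: str) -> str:
    """Hypothetical test-generation tool."""
    return f"# generated tests for: {task}"

TOOLS = {"generate": generate_code, "test": write_tests}

def agent(task: str) -> str:
    """Pick a tool based on the input. A real agent would let an LLM
    reason over the tool descriptions instead of matching keywords."""
    tool = TOOLS["test"] if "test" in task.lower() else TOOLS["generate"]
    return tool(task)

print(agent("write tests for the payment validator"))
```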

And then you add on the next level, which is problem solving. Large language models are continuously being trained on all subjects ever known to humans. That, for me, is the most fascinating part of this. It's like hundreds of thousands of brains working together on a diverse set of subjects. So, imagine a model that's been trained on a domain like medicine or aerospace or defense, and then bringing all of that brainpower together to solve a problem in finance. That, truly, to me is the ultimate gold standard of problem solving. We talked about diverse people in the room coming in with different experiences; now imagine models that have each been trained differently. You suddenly have a breadth, depth, and range of diverse knowledge that could never have been contemplated at that scale.

And to do all of this, one of the key underpinnings is obviously the public cloud: being able to spin up compute as quickly as possible to do complex calculations and then spin it down when you don't need it. That is where the public cloud becomes super important. So, all I can say in conclusion is that I think this is an amazing time to be in technology, and I just cannot wait to see how we further step up our game in the coming months and years. Things are moving almost at the speed of light now: every single day, new papers get published and new ideas come out, building on top of some of the exponential technologies we are seeing in the world today.
 
Laurel: Oh, that’s fantastic. Vrinda, thank you so much for being on the Business Lab today.

Vrinda: Thank you so much, Laurel. It’s my pleasure. I really enjoyed speaking with you and thank you for your thoughtful questions. They were super interesting.

Laurel: That was Vrinda Menon, the chief technology officer of Managed Accounts, Client Onboarding and Client Services at J.P. Morgan Private Bank, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review overlooking the Charles River.

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This podcast is for informational purposes only and it is not intended as legal, tax, financial, investment, accounting or regulatory advice. Opinions expressed herein are the personal views of the individual(s) and do not represent the views of JPMorgan Chase & Co. The accuracy of any statements, linked resources, reported findings or quotations are not the responsibility of JPMorgan Chase & Co.

Making sense of sensor data

Consider a supply chain where delivery vehicles, shipping containers, and individual products are sensor-equipped. Real-time insights enable workers to optimize routes, reduce delays, and efficiently manage inventory. This smart orchestration boosts efficiency, minimizes waste, and lowers costs.

Many industries are rapidly integrating sensors, creating vast data streams that can be leveraged to open profound business possibilities. In energy management, the growing use of sensors and drone footage promises efficient energy distribution, lower costs, and reduced environmental impact. In smart cities, sensor networks can enhance urban life by monitoring traffic flow, energy consumption, safety concerns, and waste management.

These aren’t glimpses of a distant future, but realities made possible today by the increasingly digitally instrumented world. Internet of Things (IoT) sensors have been rapidly integrated across industries, and now constantly track and measure properties like temperature, pressure, humidity, motion, light levels, signal strength, speed, weather events, inventory, heart rate and traffic.  

The information these devices collect—sensor and machine data—provides insight into the real-time status and trends of these physical parameters. This data can then be used to make informed decisions and take action—capabilities that unlock transformative business opportunities, from streamlined supply chains to futuristic smart cities.

John Rydning, research vice president at IDC, projects that sensor and machine data volumes will soar over the next five years, achieving a greater than 40% compound annual growth rate through 2027. He attributes that not primarily to an increasing number of devices—IoT devices are already quite prevalent—but rather to more data being generated by each one, as businesses learn to make use of their ability to produce real-time streaming data.

Meanwhile, sensors are growing more interconnected and sophisticated, while the data they generate increasingly includes a location in addition to a timestamp. These spatial and temporal features not only capture data changes over time, but also create intricate maps of how these shifts unfold across locations—facilitating more comprehensive insights and predictions.
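A small sketch of what pairing a timestamp with a location enables: the same readings can be rolled up along the temporal and spatial axes at once. The readings below are invented:

```python
from collections import defaultdict
from datetime import datetime

# Each reading carries a timestamp, a location, and a value.
readings = [
    ("2024-03-01T08:05", "warehouse-A", 4.2),  # e.g., temperature in °C
    ("2024-03-01T08:20", "warehouse-A", 4.6),
    ("2024-03-01T08:10", "truck-17", 7.9),
    ("2024-03-01T09:15", "truck-17", 9.4),
]

# Bucket by (location, hour): the spatial and temporal keys together.
buckets: dict[tuple[str, str], list[float]] = defaultdict(list)
for ts, location, value in readings:
    hour = datetime.fromisoformat(ts).strftime("%Y-%m-%d %H:00")
    buckets[(location, hour)].append(value)

for (location, hour), values in sorted(buckets.items()):
    print(f"{location} @ {hour}: avg={sum(values) / len(values):.1f}")
```

Grouping on the two keys together is what turns a plain time series into the intricate map of how readings shift across both time and place.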

But as sensor data grows more complex and voluminous, legacy data infrastructure struggles to keep pace. Continuous readings over time and space captured by sensor devices now require a new set of design patterns to unlock maximum value. While businesses have capitalized on spatial and time-series data independently for over a decade, their true potential is only realized when they are considered in tandem, in context, and with the capacity for real-time insights.

Download the report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.