Generative AI: Differentiating disruptors from the disrupted

Generative AI, though still an emergent technology, has been in the headlines since OpenAI’s ChatGPT sparked a global frenzy in 2023. The technology has rapidly advanced far beyond its early, human-like capacity to enhance chat functions. It shows extensive promise across a range of use cases, including content creation, translation, image processing, and code writing. Generative AI has the potential not only to reshape key business operations, but also to shift the competitive landscape across most industries.

The technology has already started to affect various business functions, such as product innovation, supply chain logistics, and sales and customer experience. Companies are also beginning to see positive return on investment (ROI) from deployment of generative AI-powered platforms and tools.

While any assessment of the technology’s likely business impact remains more forecast than empirical, it is necessary to look beyond the inevitable hype. To examine enterprises’ technological and business needs for effective implementation of generative AI, 300 senior executives across a range of regions and industries were surveyed. Respondents were asked about the extent of their corporate rollouts, implementation plans, and the barriers to deployment. Combined with insights from an expert interview panel, this global survey sheds light on how companies may or may not be ready to tackle the challenges to effective adoption of generative AI.

The overarching message from this research is that plans among corporate leaders to disrupt competition using the new technology, rather than being disrupted, may founder on a host of challenges that many executives appear to underestimate.

Executives expect generative AI to disrupt industries across economies. Overall, six out of 10 respondents agree that “generative AI technology will substantially disrupt our industry over the next five years.” Respondents that foresee disruption exceed those that do not across every industry.

A majority of respondents do not view AI disruption as a risk; they hope to be disruptors. 78% see generative AI as a competitive opportunity, while just 8% regard it as a threat, and 65% say their businesses are “actively considering new and innovative ways to use generative AI to unlock hidden opportunities from our data.”

Despite expectations of change, few companies went beyond experimentation with, or limited adoption of, generative AI in 2023. Although most (76%) companies surveyed had worked with generative AI in some way in 2023, few (9%) adopted the technology widely. Those that used the technology experimented with or deployed it in only one or a few limited areas.

Companies have ambitious plans to increase adoption in 2024. Respondents expect the number of functions where they aim to deploy generative AI to more than double in 2024. This will involve frequent application of the technology in customer experience, strategic analysis, and product innovation.

Companies need to address IT deficiencies, or risk falling short of their ambitions to deploy generative AI, leaving them open to disruption. For each of eight IT attributes, fewer than 30% of respondents rank their companies’ capabilities as conducive to rapid adoption of generative AI. Those with the most experience of deploying generative AI have less confidence in their IT than their peers.

Non-IT factors also undermine the successful use of generative AI. Respondents report impediments beyond technology, including regulatory risk, budgets, the competitive environment, culture, and skills.

Executives expect generative AI to provoke a wave of disruption. In many cases, however, their hopes to be on the right side of this innovation are endangered by impediments that their companies do not fully appreciate.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Conversational AI revolutionizes the customer experience landscape

In the ever-evolving landscape of customer experiences, AI has become a beacon guiding businesses toward seamless interactions. While AI was transforming businesses long before the latest wave of viral chatbots, the emergence of generative AI and large language models represents a paradigm shift in how enterprises engage with customers and manage internal workflows.

“We know that consumers and employees today want to have more tools to get the answers that they need, get things done more effectively, more efficiently on their own terms,” says Elizabeth Tobey, head of marketing, digital & AI at NICE.

Breaking down silos and reducing friction for both customers and employees is key to facilitating more seamless experiences. Just as customers loathe an unhelpful automated chatbot that keeps directing them to the same links or FAQ page, employees want digital solutions that point them to the right knowledge bases without excessive alt-tabbing or aimless searching.

“We’re seeing AI being able to help uplift that to make all of those struggles and hurdles that we are seeing in this more complex landscape to be more effective, to be more oriented towards actually serving those needs and wants of both employees and customers,” says Tobey.

Understanding sentiment and creating personalized answers is where most automated chatbots fail today. Enter conversational AI. Its recent progress holds the potential to deliver human-readable, context-aware responses that surpass traditional chatbots, says Tobey.

“We’re seeing even more gains that no matter how I ask a question or you ask a question, the answer coming back from self-service or from that bot is going to understand not just what we said but the intent behind what we said and it’s going to be able to draw on the data behind us,” she says.

Creating optimal customer experiences means walking the fine line between the automation that enables convenience and the human touch that builds relationships. Tobey stresses the importance of identifying gaps and optimal outcomes, and of using that knowledge to create purpose-built AI tools that can smooth processes and break down barriers.

Looking to the future, Tobey points to knowledge management—the process of storing and disseminating information within an enterprise—as the secret behind what will push AI in customer experience from novel to new wave.

“I think that for me, one of the exciting things and the challenging things is to explain how all of this is connected,” says Tobey.

This episode of Business Lab is produced in partnership with NICE.

Full Transcript

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Our topic is creating great customer experiences with AI, from the call center to online, to in-person. Building relationships with customers and creating data-driven but people-based support teams is critical for enterprises. And although the technology landscape is ever-changing, embracing what comes next doesn’t have to be a struggle.

Two words for you: foundational AI.

My guest is Elizabeth Tobey, head of marketing, digital and AI at NICE.

This podcast is produced in partnership with NICE.

Welcome Elizabeth.

Elizabeth Tobey: Happy to be here. Really excited to talk about this today.

Laurel: Great. Well, let’s go ahead and start. To set some context for our conversation, what is the customer experience landscape like now? And how has it and will it continue to change with AI?

Elizabeth: Well, to start, I think it’s important to note that AI isn’t a new technology, especially not in the customer experience (CX) era. One of the things that is quite new though is generative AI and the way we are using and able to use large language models in the CX paradigm. So we know that consumers and employees today want to have more tools to get the answers that they need, get things done more effectively, more efficiently on their own terms. So for consumers, we often hear that they want to use digital solutions or channels of their choice to help find answers and solve problems on their own time, on their own terms.

I think the same applies when we talk about either agents or employees or supervisors. They don’t necessarily want to be alt-tabbing or searching multiple different solutions, knowledge bases, different pieces of technology to get their work done or answering the same questions over and over again. They want to be doing meaningful work that really engages them, that helps them feel like they’re making an impact. And in this way we are seeing the contact center and customer experience in general evolve to be able to meet those changing needs of both the EX [employee experience] and the CX of everything within a contact center and customer experience.

And we’re also seeing AI being able to help uplift that to make all of those struggles and hurdles that we are seeing in this more complex landscape to be more effective, to be more oriented towards actually serving those needs and wants of both employees and customers.

Laurel: A critical element of great customer experience is building that relationship with your customer base. So then how can technologies, like you’ve been saying, AI in general, help with this relationship building? And then what are some of the best practices that you’ve discovered?

Elizabeth: That’s a really complicated one, and I think again, it goes back to the idea of being able to use technology to facilitate those effective solutions or those impactful resolutions. And what that means depends on the use case.

So I think this is where generative AI and AI in general can help us break down silos between the different technologies that we are using in an organization to facilitate CX, which can also lead to a Franken-stack of nature that can silo and fracture and create friction within that experience.

Another is to really be flexible and personalize to create an experience that makes sense for the person who’s seeking an answer or a solution. I think all of us have been consumers where we’ve asked a question of a chatbot or on a website and received an answer that either says they don’t understand what we’re asking or a list of links that maybe are generally related to one keyword we have typed into the bot. And those are, I would say, the infant notions of what we’re trying to achieve now. And now with generative AI and with this technology, we’re able to say something like, “Can I get a direct flight from X to Y at this time with these parameters?” And the self-service in question can respond back in a human-readable, fully formed answer that’s targeting only what I’ve asked and nothing else without having me to click into lots of different links, sort for myself and really make me feel like the interface that I’ve been using isn’t actually meeting my need. So I think that’s what we’re driving for.

And even though I gave a use case there as a consumer, you can see how that applies in the employee experience as well. Because the employee is dealing with multiple interactions, maybe voice, maybe text, maybe both. They’re trying to do more with less. They have many technologies at their fingertips that may or may not be making things more complicated while they’re supposed to make things simpler. And so being able to interface with AI in this way to help them get answers, get solutions, get troubleshooting to support their work and make their customer’s lives easier is a huge game changer for the employee experience. And so I think that’s really what we want to look at. And at its core that is how artificial intelligence is interfacing with our data to actually facilitate these better and more optimal and effective outcomes.

Laurel: And you mentioned how people are familiar with chatbots and virtual assistants, but can you explain the recent progression of conversational AI and its emerging use cases for customer experience in the call centers?

Elizabeth: Yes, and I think it’s important to note that so often in the Venn diagram of conversational AI and generative AI, we see an overlap because we are generally talking about text-based interactions. And conversational AI is that, and I’m being sort of high level here as I make our definitions for this purpose of the conversation, is about that human-readable output that’s tailored to the question being asked. Generative AI is creating that new and novel content. It’s not just limited to text, it can be video, it can be music, it can be an image. For our purposes, it is generally all text.

I think that’s where we’re seeing those gains in conversational AI being able to be even more flexible and adaptable to create that new content that is endlessly adaptable to the situation at hand. And that means in many ways, we’re seeing even more gains that no matter how I ask a question or you ask a question, the answer coming back from self-service or from that bot is going to understand not just what we said but the intent behind what we said and it’s going to be able to draw on the data behind us.

This is where the AI solutions are, again, more than just one piece of technology, but all of the pieces working in tandem behind the scenes to make them really effective. That data will also drive understanding my sentiment, my history with the company, if I’ve had positive or negative or similar interactions in the past. Knowing someone’s a new customer versus a returning customer, knowing someone is coming in because they’ve had a number of different issues or questions or concerns versus just coming in for upsell or additive opportunities.

That’s going to change the tone and the trajectory of the interaction. And that’s where I think conversational AI with all of these other CX purpose-built AI models really do work in tandem to make a better experience because it is more than just a very elegant and personalized answer. It’s one that also gets me to the resolution or the outcome that I’m looking for to begin with. That’s where I feel like conversational AI has fallen down in the past because without understanding that intent and that intended and best outcome, it’s very hard to build towards that optimal trajectory.

Laurel: And speaking of that kind of optimal balance between everything, trying to balance AI and the human touch that many customers actually want to get out of their experiences with companies like retail shopping or customer service interactions, when they lodge complaints, refunds, returns, all of these reasons. That’s a fine line to walk. So how do you strike the balance to ensure that customers enjoy the benefits of AI, automation, convenience, and availability, but without losing that human aspect to it?

Elizabeth: I think there’s many different ways to go about this, but I think it is again about connecting a lot of those touch points that historically companies have kept siloed or separate. The notion of a web presence and a marketing presence and a sales presence and a support presence or even an operations presence feels outdated to me. Those areas of expertise and even those organizations and the people working there do need to be connected. I feel in many ways we’ve gone down this rabbit hole where technology has advanced and we’ve added it on top of our old processes that sometimes date years or decades back that are no longer applicable.

And until we get to the root of rethinking all of those, and in some cases this means adding empathy into our processes, in some it means breaking down those walls between those silos and rethinking how we do the work at large. I think all of these things are necessary to really build up a new paradigm and a new way of approaching customer experience to really suit the needs of where we are right now in 2024. And I think that’s one of the big blockers and one of the things that AI can help us with.

Because some of the solutions and benefits we’ve been seeing are really about identifying gaps, identifying optimal flows or outcomes or employees who are generating great outcomes, and then finding a way to utilize that information to take action to better the business and better the flow. And I think that that’s something that we really want to hone in on because in so many ways we’re still talking about this technology and AI in general, in a very high level. And we’ve gotten most folks bought in saying, “I know I need this, I want to implement it.”

But they do need to take a step back and think about what are they looking for as a success metric when they do implement it, and how are they going to vet all of the different technologies and vendors and use cases to choose which one to go after first and how to implement it and how even to choose a partner. Because even if we say all solutions and technologies are created equal, which is a very generous statement to start with, that doesn’t mean they’re all equally applicable to every single business in every single use case. So they really have to understand what they’re looking for as a goal first before they can make sure whatever they purchase or build or partner with is a success.

Laurel: So how can companies take advantage of AI to tailor customer experiences on that individual level? And then what kind of major challenges are you advising that they may come across while creating these holistic experiences?

Elizabeth: I do think that change management within an organization, understanding that we’re going to have to change those muscles and those workflows is one of the biggest things you’ll see organizations grapple with. And that’s going to happen no matter what partner or vendor you choose. That’s something you’ll just have to embrace and run with and understand it’s going to happen. And I think also being able to take a step back and not assume you know the best use case, but let AI almost guide you in what will be the most impactful use case.

Some of the technologies and solutions we have can go in and find areas that are best for automation. Again, when I say best, I’m very vague there because for different companies that will mean different things. It really depends on how things are set up, what the data says and what they are doing in the real world in real time right now, what our solutions will end up finding and recommending. But being able to actually use this information to even have a more solid base of what to do next and to be able to fundamentally and structurally change how human beings can interface, access, analyze, and then take action on data. That’s I think one of the huge aha moments we are seeing with CX AI right now, that has been previously not available. And the only way you can truly utilize that is to have AI that is fully connected within all of your CX workflows, tools, applications and data, which means having that unified platform that’s connecting all of these pieces across all interactions across the entire customer journey.

And I think that’s one of the big areas that is possibly going to be the biggest hurdle to get your head wrapped around because it sounds enormous. But it’s actually a very fundamental and base level change that will then cascade out to make every action you take next far simpler and faster and will start to speed up the pace of the innovation and the change management within the organization.

Laurel: Since AI has become this critical tool across industries for customer interactions and experiences, how does generative AI now factor into a customer experience strategy? What are the opportunities here?

Elizabeth: We always go immediately to those chatbots and that self-service. And I think the applications there are wide and broad and probably fairly easy for us to conjure up. That idea of being able to on your own time in the channel of your choice, have a conversation in the future state, not know and not care if you are speaking to an artificial intelligence or a human led interaction because both are just as quick and just as flexible and just as effective for you. I think the ways that are more interesting to talk about now that maybe aren’t top of mind to everyone right now are around how we help agents and supervisors.

We hear a lot about AI co-pilots helping out agents, that by your side assistant that is prompting you with the next best action, that is helping you with answers. I think those are really great applications for generative AI, and I really want to highlight how that can take a lot of cognitive load off those employees that right now, as I said, are overworked. So that they can focus on the next step that is more complex, that needs a human mind and a human touch.

And they are more the orchestrator and the conductor of the conversation where a lot of those lower level and rote tasks are being offloaded to their co-pilot, which is a collaborator in this instance. And so they’re still in control of editing and deciding what happens next. But the co-pilot can even in a moment explain where a very operational task can happen and take the lead or something more empathetic needs to be said in the moment. And again, all of this information if you have this connected system on a unified platform can then be fed into a supervisor.

And we do now have a co-pilot in our ecosystem for supervisors who can then help them change from being more of a taskmaster of coming in and saying, “What do I need to do today? Who do I need to focus on?” Answer that question for the supervisors so they can become far more strategic and impactful into not diverting crises as they appear. But understanding the full context of what’s happening within their organization and with their teams to be able to build them up and better them and be far more strategic, proactive, and personalized in giving guidance or coaching or even figuring out how to raise information to leadership on what is going well.

So that again, they’re helping improve the pace of business, improve the quality of their employees’ lives and their consumers’ lives. Instead of feeling like they are almost triaging and trying to figure out even where to spend their energy. Their co-pilot can actually offload a lot of that for themselves. And this is always happening through generative AI because it is that conversational interface that you have, whether you’re pulling up data or actions of any sort that you want to automate or personalized dashboards.

All of this can be done without needing to know how to code, to have to write a SQL query, anything like that, that used to be a barrier to entry in the past.
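To make that idea concrete, here is a minimal, hypothetical sketch of the pattern Tobey describes: a user asks a question in plain English, a large language model drafts the SQL, and the platform runs it on the user’s behalf. The schema, model name, prompt, and database are illustrative assumptions for this sketch, not a description of NICE’s products.

```python
import sqlite3
from openai import OpenAI

# Toy schema the model is shown; a real deployment would expose governed views.
SCHEMA = """
interactions(id, customer_id, channel, sentiment, resolved, created_at)
customers(id, segment, tenure_days)
"""

client = OpenAI()  # assumes an OPENAI_API_KEY is set in the environment


def answer_question(question: str, db_path: str = "cx.db"):
    """Ask an LLM to translate a plain-English question into SQL, then run it."""
    prompt = (
        "You write SQLite queries. Given this schema:\n"
        f"{SCHEMA}\n"
        f"Write a single SELECT statement that answers: {question}\n"
        "Return only the SQL, with no explanation or formatting."
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    sql = resp.choices[0].message.content.strip()
    if not sql.lower().startswith("select"):
        raise ValueError("Refusing to run anything but a SELECT")  # basic guardrail
    # Open the analytics store read-only so the generated SQL cannot modify data.
    conn = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.close()


# Example: answer_question("Which channel had the lowest resolution rate last week?")
```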

Laurel: So this is sort of a follow-on to that, which is how can companies invest in generative AI as a way to support employees internally? There’s a learning curve there, as well as customers externally. And I know it’s early days, but what other benefits are possible?

Elizabeth: I think one of the “a-ha” moments for some of the technology we’re working on is really around, as I said, that conversational interface to tap into unstructured data. With the right knowledge management and with the right purpose-built AI, you’re going to be able to take a person like me. It’s been decades since I’ve written any code or done anything that complex, and you’re going to be able to have me be able to interface with the entirety of our CX data. Be able to pull it, ask questions of it through a conversational interface that looks a lot like a search engine we know and love today, and get back personalized reports or dashboards that will help inform me.

And then again, after seeing all of that information, I can continue the conversation that same way to drill down into that information and then maybe even take action to automate. And again, this goes back to that idea of having things integrated across the tech stack to be involved in all of the data and all of the different areas of customer interactions across that entire journey to make this possible. I think that’s a really huge moment for us. And I think that that’s where… At least I am still trying to help people understand how that applies in very tangible, impactful, immediate use cases to their business. Because it still feels like a big project that’ll take a long time and take a lot of money.

But actually this is just really new technology that is opening up an entirely new world of possibility for us about how to interact with data. That we just haven’t had the ability to have in the past before. And so again, I say this isn’t eliminating any data scientists or engineers or analysts out there. We already know that no matter how many you contract or hire, they’re already fully utilized by the time they walk in on their first day. This is really taking their expertise and being able to tune it so that they are more impactful, and then give this kind of insight and outcome-focused work and interfacing with data to more people.

So that they can all make better use of this information that before was just not able to be accessed and analyzed.

Laurel: So when you think about the future, Elizabeth, what innovations or developments in AI and customer experience are you most excited about and how do you anticipate these trends emerging?

Elizabeth: I think you’re going to hear from me and folks within our organization talking a lot about how knowledge management is at the core of artificial intelligence. Because your AI is only as good as the data that it is trained on and how your data is presented and accessible to AI is a huge game changer in whether your AI projects are going to really work for you or falter and not meet your goals. And so I think that for me, one of the exciting things and the challenging things is to explain how all of this is connected.

And that while in many ways we’re talking a lot about large language models and artificial intelligence at large. That sometimes some of the things that we’ve been discussing for a long time in CX, knowledge management is the secret behind all of this that’s going to take us from novel and interesting and a fun thing to demo to something that’s actually really impactful and revenue generating for your business.
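One common way to ground an assistant in managed knowledge is to retrieve the most relevant knowledge-base articles first and hand them to the model alongside the question. The sketch below illustrates that retrieve-then-prompt pattern with a toy article store and simple word-overlap scoring; it is an assumption-laden illustration of the general idea, not NICE’s implementation.

```python
from collections import Counter

# Toy knowledge base standing in for a managed, curated article store.
KNOWLEDGE_BASE = {
    "refund-policy": "Refunds are issued to the original payment method within 5 business days.",
    "reset-password": "Customers can reset their password from the login page via an emailed link.",
    "shipping-times": "Standard shipping takes 3-7 business days; expedited shipping takes 2.",
}


def retrieve(question: str, k: int = 2) -> list[str]:
    """Rank articles by word overlap with the question (real systems use embeddings)."""
    q_words = Counter(question.lower().split())
    scored = []
    for doc_id, text in KNOWLEDGE_BASE.items():
        overlap = sum((Counter(text.lower().split()) & q_words).values())
        scored.append((overlap, doc_id, text))
    scored.sort(reverse=True)
    return [f"[{doc_id}] {text}" for _, doc_id, text in scored[:k]]


def build_grounded_prompt(question: str) -> str:
    """Assemble what the language model would actually see: question plus context."""
    context = "\n".join(retrieve(question))
    return (
        "Answer using only the knowledge-base excerpts below and cite the article id.\n"
        f"{context}\n\nCustomer question: {question}"
    )


print(build_grounded_prompt("How long do refunds take?"))
```

The quality of the answer is bounded by what the retrieval step can find, which is why curating and structuring the knowledge base matters as much as the model itself.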

Laurel: Thank you so much Elizabeth for joining us today on the Business Lab.

Elizabeth: Thank you for having me. This was a great conversation.

Laurel: That was Elizabeth Tobey, head of marketing, digital and AI at NICE, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review.

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the global director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Data at the center of business

With more than 5,000 branches across 48 states serving 80 million customers, each with unique financial needs, a clear data strategy is key for JPMorgan Chase. According to Mark Birkhead, firmwide chief data officer at JPMorgan Chase, data analytics is the oxygen that breathes life into the firm, delivering growth and improving the customer experience.

The firm’s commitment to providing first-class business in a first-class way for clients and customers applies to every part of the organization, including its heavy investments in data analytics, machine learning, and AI. Using these advanced technologies, JPMorgan Chase can gain a deeper understanding of the breadth and specificity of the needs of the customers and communities it serves.

“It means using our data to drive positive outcomes for our customers and our clients and our business partners. And it means using this to actually help our customers and clients manage their daily lives in a better, simpler way,” says Birkhead.

At its best, a strong data strategy, along with AI and machine learning adoption, can free employees from tedious tasks to focus on high-value work. Reaching this extended intelligence, in which humans and machines work better together, means having the right deployment strategy. It’s key to understand both the potential and the limitations of these tools to make sure the enterprise invests wisely in the areas where technologies like AI and machine learning can offer the greatest value.

“At the end of the day, what we’re trying to do is build an analytic factory that can deliver AI/ML at scale,” says Birkhead. “And that type of a factory requires a really sound strategy, efficient platforms and compute, solid governance and controls, and incredible talent.”

Adopting this vision at scale is a long-term investment that requires strong conviction, adherence to governance and controls, and operationalizing data. One of the most challenging aspects of this, Birkhead says, is defining your data priorities.

“Everyone talks about data every minute of every day. However, data has been oftentimes, I think, thought of as exhaust from some product, from some process, from some application, from a feature, from an app, and enough time has not been spent actually ensuring that that data is considered an asset, that that data is of high quality, that it’s fully understood by humans and machines.”

This episode of Business Lab is produced in association with JPMorgan Chase.

Full Transcript

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Our topic is data and analytics. Building a global data strategy requires a strong understanding of governance, regulations, and customer experience for both internal and external customers. As technologies like AI emerge, the opportunity expands for real-time learnings and making better decisions.

Two words for you: data strategy.

My guest is Mark Birkhead, who is the firmwide chief data officer at JPMorgan Chase.

This podcast is produced in association with JPMorgan Chase.

Welcome, Mark.

Mark Birkhead: Thank you for having me, Laurel. It’s great to be here.

Laurel: Let’s start here. You were recently appointed to firmwide chief data officer for JPMorgan Chase. Previously you were the chief data and analytics officer at Chase and JPMorgan Wealth Management. Can you give us some insight into how your new role factors into the firm’s data strategy?

Mark: Absolutely. My new role as the firmwide chief data officer will be focused primarily on driving this strategy and solutions that maximize the impact that data can have on our clients and customers across the globe, and doing it in highly governed and controlled ways. Data plays a huge part in our firmwide strategy. It’s been described by several of our senior leaders as the oxygen that powers the firm. And I truly believe that. Data analytics has propelled so many of our businesses, including our consumer bank and business bank, our commercial bank, our wealth management businesses, and our payments business globally. And its impact continues to grow in more meaningful ways every single day and month.

Strong data analytics capabilities really do provide the foundational underpinnings for our core business activities, but it’s actually fueling the growth of our businesses in meaningful ways. In addition, it is driving productivity, delivering insights that help our customers grow their businesses, and enabling our bankers and advisors to deliver elevated customer experiences.

Laurel: Thank you Mark for giving that context. As a global firm, you talk about delivering first-class business in a first-class way for clients and customers. Could you tell us how data and analytics, AI and machine learning are used to improve outcomes for your customers?

Mark: Absolutely. When we talk about first-class business in a first-class way, it really applies to every part of our firm and we’re investing heavily in data analytics, machine learning, and AI. But this is not new to us. We’ve been utilizing AI and ML for many, many years in many different ways. The Chase Analytics team actually will celebrate a sixth anniversary next March with the same mission and objective. Again, this is not new to us, but when we think about applying first-class business in a first-class way to the new set of AI capabilities, the new set of LLMs [large language models], the new set of generative AI, it means to us really honoring our customers’ expectations when it comes to privacy. It means using our data to drive positive outcomes for our customers and our clients and our business partners. And it means using this to actually help our customers and clients manage their daily lives in a better, simpler way.

I’m going to actually spend most of my time talking about my former role as the chief analytics officer for Chase and JPMorgan Wealth Management, but really our AI efforts across the globe are very similar to what has been happening at Chase and JPMorgan Wealth Management. It’s really been focused on improving the financial health for our customers and our clients. Today, JPMorgan Chase serves over 80 million customers in the US and we use advanced analytics to deliver best-in-class experiences and to respond to the needs of our customers. And our customers have all kinds of situations at any given moment in time. And at one moment we’re planning for college and other times we’re dealing with some difficult times in a family situation. And being able to have the right tools for our bankers, for advisors, for our call center agents to utilize is really important to us.

And I mentioned the breadth of data analytics as being the oxygen of the firm, and that really is reflected in the Chase business. And one of the hardest parts of the CDAO [chief data and analytics officer] job is to determine what investments to make and where to focus our attention when it comes to solving data problems and also determining where we have to lead with AI/ML and when we actually don’t. For us, there’s a couple of things that we always have to lead in given the nature of our business. We have a branch network of 5,000 branches. It covers 48 states. And we’ve got to lead with geospatial analytics and that heavily utilizes AI and ML to determine the optimal placement of our branches, of our network, of our community centers, and for the staffing within those branches. We also have over 60 million digitally active customers.

We have to lead in product analytics, experimentation, understand the customer experience in a journey, and how they interact with our products across multiple channels. It might start in a branch, end up in a mobile app, and end up in a call center, but all that has to be stitched together. We also have to lead when it comes to preventing fraud, and it has really become a difficult task given what’s going on across the world. But protecting our customers from these types of acts is incredibly important to us.

And we also need to make sure that within our branches, within our customers, they get the best experience possible, which means really using data analytics, machine learning, AI to understand our customers and communities in deeper ways. And in fact, for our 5,000 branches, there’s not a lot of similarities. And we actually have to prepare playbooks for these branches to make sure our employees are trained on these types of situations, the needs of their customers and clients so they can actually produce the best possible service. The only way to deliver all that at our scale is through leveraging data analytics, machine learning, and AI.

Laurel: Touching on that, at its best artificial intelligence, machine learning, and a robust data strategy can automate those tedious tasks to free people up to focus on high-value work. How do you think about that as an ongoing effort?

Mark: We think about this a lot and innovation has cycles, and that includes my field as well. But those cycle times are really changing and becoming more compressed, and that’s drawn a lot of attention and scrutiny particularly to the field of AI. At the end of the day, with the emergence of LLMs and generative AI, there’s just more opportunities to enhance the work of our employees day-to-day. Sandy Pentland, who helped form your MIT Media Lab, really described a few years back to our employees, this interaction is extended intelligence, humans and machines working better together. And this is actually one of our highest priorities at JPMorgan Chase, leveraging machines to help our employees do their jobs better for our customers and for our clients. And today we’re exploring experimenting with LLMs in a number of capacities. But it’s really important to understand what these tools can do well and what they can’t, and then making sure that we’re actually organizing ourselves against them and making the right investments in people and resources in those areas where these actual tools can help us to the greatest extent.

It’s also important that we focus on the governance and controls around this. And all that comes into play when it comes to figuring out what we do with these tools and how we apply them. I was meeting with our global marketers a couple of weeks ago, and every time I do this and talk about our plans for generative AI across the firm or at Chase, talk about the impact it can have on JPMorgan Chase, I get two types of questions. One is, “What does this mean for me and my employees?” And I think the answer is, with any type of technology, it’s not exactly going to take your job, but people who do use this technology will. And that’s the same thing with AI. And the only caveat to all of this is I think when it comes to this type of technology and capability, particularly with generative AI, those that understand what this does well and what it doesn’t, will actually have a leg up and be better positioned to actually succeed.

The second question I always get is, “If we’re always using the same tool for every company, the same model, aren’t we all going to sound the same?” And that’s where I think the relationship of the business and our models and data scientists has to evolve. Every time we build a model or an AI solution, we always engage with the business. But I think given what’s going on now with LLMs and generative AI, it’s really important to mature that model. The thinking around design and analytics needs to change to ensure that we incorporate the brand voice, the marketers’ voice into these solutions to make sure that the content that we deliver using these tools reflects the brands that the customers have come to know is really important. And this entire operating model has to evolve. And I think it presents really exciting opportunities to go deeper with customers in meaningful ways, but it requires the model to change.

Laurel: Speaking of having a leg up, successfully deploying AI and machine learning has become a competitive differentiator for large enterprises. What are the challenges of deploying AI and machine learning at scale? And then a second big question is, as regulations for AI and machine learning evolve, how does the firm manage government regulations?

Mark: That’s a great question. And first, I would say across JPMorgan Chase, we do view this as an investment. And every time I talk to a senior leader about the work we do, I never speak of expenses. It is always investment. And I do firmly believe that. At the end of the day, what we’re trying to do is build an analytic factory that can deliver AI/ML at scale. And that type of a factory requires a really sound strategy, efficient platforms and compute, solid governance and controls, and incredible talent. And for an organization of any scale, this is a long-term investment, and it’s not for the faint of heart. You really have to have conviction to do this and to do this well. Deploying this at scale can be really, really challenging. And it’s important to ensure that as we’re thinking about AI/ML, it’s done with controls and governance in place.

We’re a bank. We have a responsibility to protect our customers and clients. We have a lot of financial data and we have an obligation to the countries that we serve in terms of ensuring that the financial health of this firm remains in place. And at JPMorgan Chase, we’re always thinking about that first and foremost, and about what we actually invest in and what we don’t, the types of things we want to do and the things that we won’t do. But at the end of the day, we have to ensure that we understand what’s going on with these technologies and tools and the explainability to our regulators and to ourselves is really, really high. And that really is the bar for us. Do we truly understand what’s behind the logic, what’s behind the decisioning, and are we comfortable with that? And if we don’t have that comfort, then we don’t move forward.

We never release a solution until we know it’s sound, it’s good, and we understand what’s going on. In terms of government relations, we have a large focus on this, and we have a large footprint across the globe. And at JPMorgan Chase, we really are focused on engaging with policymakers to understand their concerns as well as to share our concerns. And I think largely we’re united in the fact that we think this technology can be harnessed for good. We want it to work for good. We want to make sure it stays in the hands of good actors, and it doesn’t get used for harm for our clients or our customers or anything else. And it’s a place where I think business and policymakers need to come together and really have one solid voice in terms of the path forward because I think we’re highly, highly aligned.

Laurel: You did touch on this a bit, but enterprises are relying on data to do so many things like improving decision-making and optimizing operations as well as driving business growth. But what does it mean to operationalize data and what opportunities could enterprises find through this process?

Mark: I mentioned earlier that one of the hardest parts of the CDAO job is actually understanding and trying to determine what the priorities should be, what types of activities to go after, what types of data problems, big or small or otherwise. I would say with that, equally as difficult, is trying to operationalize this. And I think one of the biggest things that have been overlooked for so long is that data itself, it’s always been critical. It’s in our models. We all know about it. Everyone talks about data every minute of every day. However, data has been oftentimes, I think, thought of as exhaust from some product, from some process, from some application, from a feature, from an app, and enough time has not been spent actually ensuring that that data is considered an asset, that that data is of high quality, that it’s fully understood by humans and machines.

And I think it’s just now becoming even more clear that as you get into a world of generative AI, where you have machines trying to do more and more, it’s really critical that it understands the data. And if our humans have a difficult time making it through our data estate, what do you think a machine is going to do? And we have a big focus on our data strategy and ensuring that data strategy means that humans and machines can equally understand our data. And because of that, operationalizing our data has become a big focus, not only of JPMorgan Chase, but certainly in the Chase business itself.

We’ve been on this multi-year journey to actually improve the health of our data, make sure our users have the right types of tools and technologies, and to do it in a safe and highly governed way. And a lot of focus on data modernization, which means transforming the way we publish and consume data. The ontologies behind that are really important. Cloud migration, making sure that our users are in the public cloud, that they have the right compute with the right types of tools and capabilities. And then real-time streaming, enabling streaming, and real-time decisioning is a really critical factor for us and requires the data ecosystem to shift in significant ways. And making that investment in the data allows us to unlock the power of real-time and streaming.

Laurel: And speaking of data modernization, many organizations have turned to cloud-based architectures, tools, and processes in that data modernization and digital transformation journey. What has JPMorgan Chase’s road to cloud migration for data and analytics looked like, and what best practices would you recommend to large enterprises undergoing cloud transformations?

Mark: We’ve been on this journey for quite some time across JPMorgan Chase and globally. And we have a really solid relationship with our technology partners, with our cloud providers, and we really have ensured that as we move up to the cloud, we do it safely and thoughtfully with a sound strategy and governance and controls. And that’s been the first and foremost piece I would say with regard to a business like Chase and JPMorgan Wealth Management, which into itself is incredibly large and we’ve talked about this publicly many, many times. It is something that requires conviction and a sound data strategy, but at the end of the day, we are not just moving to the public cloud. We’re going to do that with modernized data, but we’re also going to improve governance and controls while improving the user experience.

And to do all of that, it’s a massive undertaking. And to ensure that our data is discoverable and easily usable where our analysts require us to make informed decisions when it comes to these investments, as well as these different types of choices and staging of the work product. And as we think about this, and my advice to others would be to do the same. If you look at the user experience when it comes to your data scientists and your modelers and how they spend their time, what their challenges are, what your analytic priorities are, all those have to be brought together before you actually start building out a data strategy. Otherwise, you’ll be building things that you may not need. And this is already hard enough, why not make it easier by understanding what you’re trying to build, what user population is looking for and then building to that specifically and then staging out in the right appropriate ways?

And that’s been our journey. And we have these milestones. We have goals and everything else. We have OKRs [objectives and key results], we have product teams, we have data engineers. Everyone is aligned and doing this, and we’re focused on doing this in the right way. We’re also focused on ensuring that we can do this in many cloud platforms, not just one. And that requires modern pipelines. It requires us to organize our data differently and inventory it in a certain way and describe it in ways that are easily understandable. This is really difficult work, but it’s well worth the investment. Even if you have to go slow and make little bits of progress year over year, this will absolutely pay off.

Laurel: Speaking of that payoff, working across the company is crucial to meet goals. What is your talent and skills strategy to mobilize cross-functional teams to ensure a data literate workforce that uses both domain and technical knowledge like data science?

Mark: Absolutely. And I’m really proud of our focus on talent, not only across JPMorgan Chase, but at Chase specifically. It has been really difficult to find great talent in this space. And once you have them, you want them to stay, you want them to grow, you want them to feel supported, and you want them to feel challenged, and you want them to be able to experiment and to work and design solutions that are elegant, that meet the needs of customers and that are advanced. And as we think about all of this, there’s a number of buckets that we’re really focused on.

First, in terms of attracting talent, we do have a very robust campus program. We have a very robust internship program, and we have a very robust rotational program that actually spans the firm. And in Chase, this rotational program has existed for many, many, many years. And it really gives data scientists and aspiring data scientists a chance to spend a couple of years with us, move across the bank and the firm, and to really understand what it’s like to work in various different types of settings and before they land in a job or land in a function or a field.

And that’s one piece. It’s really understanding the community, the new talent coming in at campuses, out of graduate programs, out of Ph.D. programs, and making sure that we have the right types of programs that meet their needs. And that’s one piece. We’re also really focused on our existing talent and our existing talent is absolutely incredible. And they come to us because they want to continue to grow. They want to continue to learn. And we’re heavily investing in training to make sure that learning development opportunities are available to our existing employees and designed for the different types of data users and the different types of career goals that they have. And that’s a great thing about our field today. There are so many avenues with which you can go.

And it’s really exciting to actually be able to pick an adventure, pick a career with a firm like ours, at JPMorgan Chase. And then as I mentioned before, we are really focused on our communities and giving back. And in addition to our campus programs, we also try and invest in talent that may not actually come to work for us ever. And we do have hackathons. We bring in hundreds of college students twice a year for our campuses. We pay for everything. And they go through a twenty-four-hour hackathon where they work with other teams, meet other students, work with JPMorgan Chase volunteers, and really try to solve problems for a local nonprofit. And those hackathons really are an investment in the next generation of analytic talent, but it also gives them an opportunity to work with real data, with real problems, and to learn a little bit and to help build the community.

And then lastly, we have programs around Data for Good, and our employees absolutely love this. We’ve partnered with over 30 nonprofits over the past two years to help them solve their needs. And nonprofits are amazing at serving their communities and finding needs. They’re not always great at bringing tech stacks or digital solutions or using data or analytics to help their nonprofits. And we have great partnership with them. All of this encompasses our talent strategy. It’s focused on engaging students early on in the process, experienced hires, developing our core talent, and giving them opportunities to do things beyond their core job, like giving back to their communities.

Laurel: Mark, thank you so much for joining us today on the Business Lab.

Mark: Laurel, thank you so much for having me. It was great to be here.

Laurel: That was Mark Birkhead, firmwide chief data officer at JPMorgan Chase, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review.

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the global director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This podcast is for informational purposes only and it is not intended as legal, tax, financial, investment, accounting or regulatory advice. Opinions expressed herein are the personal views of the individual(s) and do not represent the views of JPMorgan Chase & Co. The accuracy of any statements, linked resources, reported findings or quotations are not the responsibility of JPMorgan Chase & Co.

Transforming document understanding and insights with generative AI

At some point over the last two decades, productivity applications enabled humans (and machines!) to create information at the speed of digital—faster than any person could possibly consume or understand it. Modern inboxes and document folders are filled with information: digital haystacks with needles of insight that too often remain undiscovered.

Generative AI is an incredibly exciting technology that’s already delivering tremendous value to our customers across creative and experience-building applications. Now Adobe is embarking on our next chapter of innovation by introducing our first generative AI capabilities for digital documents and bringing the new technology to the masses.

AI Assistant in Adobe Acrobat, now in beta, is a new generative AI–powered conversational engine deeply integrated into Acrobat workflows, empowering everyone with the information inside their most important documents.

Accelerating productivity across popular document formats

As the creator of PDF, the world’s most trusted digital document format, Adobe understands document challenges and opportunities well. Our continually evolving Acrobat PDF application, the gold standard for working with PDFs, is already used by more than half a billion customers to open around 400 billion documents each year. Starting immediately, customers will be able to use AI Assistant to work even more productively. All they need to do is open Acrobat on their desktop or the web and start working.

With AI Assistant in Acrobat, project managers can scan, summarize, and distribute meeting highlights in seconds, and sales teams can quickly personalize pitch decks and respond to client requests. Students can shorten the time they spend hunting through research and spend more time on analysis and understanding, while social media and marketing teams can quickly surface top trends and issues into daily updates for stakeholders. AI Assistant can also streamline the time it takes to compose an email or scan a contract of any kind, enhancing productivity for knowledge workers and consumers globally.
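As a generic illustration of the summarize-a-document pattern described above, and not a description of Adobe’s AI Assistant implementation, the sketch below extracts a PDF’s text with the open-source pypdf library and asks a hosted language model for bulleted highlights. The file name, model choice, and naive truncation are placeholder assumptions.

```python
from pypdf import PdfReader
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY is set in the environment


def summarize_pdf(path: str, max_chars: int = 12000) -> str:
    """Extract a PDF's text and ask a language model for bulleted highlights."""
    reader = PdfReader(path)
    text = "\n".join(page.extract_text() or "" for page in reader.pages)
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{
            "role": "user",
            "content": "Summarize the key points of this document as short bullets:\n\n"
                       + text[:max_chars],  # naive truncation; real systems chunk long documents
        }],
    )
    return resp.choices[0].message.content


# Example: print(summarize_pdf("meeting-notes.pdf"))
```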

Innovating with AI—responsibly

Adobe has continued to evolve the digital document category for over 30 years. We invented the PDF format and open-sourced it to the world. And we brought Adobe’s decade-long legacy of AI innovation to digital documents, including the award-winning Liquid Mode, which allows Acrobat to dynamically reflow document content and make it readable on smaller screens. The experience we’ve gained by building Liquid Mode and then learning how customers get value from it is foundational to what we’ve delivered in AI Assistant.

Today, PDF is the number-one business file format stored in the cloud, and PDFs are where individuals and organizations keep, share, and collaborate on their most important information. Adobe remains committed to secure and responsible AI innovation for digital documents, and AI Assistant in Acrobat has guardrails in place so that all customers—from individuals to the largest enterprises—can use the new features with confidence.

Like other Adobe AI features, AI Assistant in Acrobat has been developed and deployed in alignment with Adobe’s AI principles and is governed by secure data protocols. Adobe has taken a model-agnostic approach to developing AI Assistant, curating best-in-class technologies to provide customers with the value they need. When working with third-party large language models (LLMs), Adobe contractually obligates them to employ confidentiality and security protocols that match our own high standards, and we specifically prohibit third-party LLMs from manually reviewing or training their models on Adobe customer data without their consent.

The future of intelligent document experiences

Today’s beta features are part of a larger Adobe vision to transform digital document experiences with generative AI. Our vision for what’s next includes the following:

  • Insights across multiple documents and document types: AI Assistant will work across multiple documents, document types, and sources, instantly surfacing the most important information from everywhere.
  • AI-powered authoring, editing, and formatting: Last year, customers edited tens of billions of documents in Acrobat. AI Assistant will make it simple to quickly generate first drafts and to help with copy editing, including instantly changing voice and tone, compressing copy length, and suggesting content design and layout options.
  • Intelligent creation: Key features from Firefly, Adobe’s family of creative generative models, and Adobe Express will make it simple for anyone to make their documents more creative, professional, and personal.
  • Elevating document collaboration with AI-supported reviews: Digital collaboration is how work gets from draft to done. And with a 75% year-over-year increase in the number of documents shared, more collaboration is happening in Acrobat than ever. Generative AI will make the process simple, analyzing feedback and comments, suggesting changes, and even highlighting and helping resolve conflicting feedback.

As we have with other Adobe generative AI features, we look forward to bringing our decades of experience, expertise, and customers along for the ride with AI Assistant.

This article contains “forward-looking statements” within the meaning of applicable securities laws, including those related to Adobe’s expectations and plans for AI Assistant in Reader and Acrobat, Adobe’s vision and roadmap for future generative AI capabilities and offerings and the expected benefits to Adobe. All such forward-looking statements are based on information available to us as of the date of this press release and involve risks and uncertainties that could cause actual results to differ materially. Factors that might cause or contribute to such differences include, but are not limited to: failure to innovate effectively and meet customer needs; issues relating to the development and use of AI; failure to realize the anticipated benefits of investments or acquisitions; failure to compete effectively; damage to our reputation or brands; service interruptions or failures in information technology systems by us or third parties; security incidents; failure to effectively develop, manage and maintain critical third-party business relationships; risks associated with being a multinational corporation and adverse macroeconomic conditions; failure to recruit and retain key personnel; complex sales cycles; changes in, and compliance with, global laws and regulations, including those related to information security and privacy; failure to protect our intellectual property; litigation, regulatory inquiries and intellectual property infringement claims; changes in tax regulations; complex government procurement processes; risks related to fluctuations in or the timing of revenue recognition from our subscription offerings; fluctuations in foreign currency exchange rates; impairment charges; our existing and future debt obligations; catastrophic events; and fluctuations in our stock price. For a discussion of these and other risks and uncertainties, please refer to Adobe’s most recently filed Annual Report on Form 10-K and other filings we make with the Securities and Exchange Commission from time to time. Adobe undertakes no obligation, and does not intend, to update the forward-looking statements, except as required by law.

This content was produced by Adobe. It was not written by MIT Technology Review’s editorial staff.

Responsible technology use in the AI age

The sudden appearance of application-ready generative AI tools over the last year has confronted us with challenging social and ethical questions. Visions of how this technology could deeply alter the ways we work, learn, and live have also accelerated conversations—and breathless media headlines—about how and whether these technologies can be responsibly used.

Responsible technology use, of course, is nothing new. The term encompasses a broad range of concerns, from the bias that might be hidden inside algorithms, to the data privacy rights of the users of an application, to the environmental impacts of a new way of work. Rebecca Parsons, CTO emerita at the technology consultancy Thoughtworks, collects all of these concerns under “building an equitable tech future,” where, as new technology is deployed, its benefits are equally shared. “As technology becomes more important in significant aspects of people’s lives,” she says, “we want to think of a future where the tech works right for everyone.”

Technology use often goes wrong, Parsons notes, “because we’re too focused on either our own ideas of what good looks like or on one particular audience as opposed to a broader audience.” That may look like an app developer building only for an imagined customer who shares his geography, education, and affluence, or a product team that doesn’t consider what damage a malicious actor could wreak in their ecosystem. “We think people are going to use my product the way I intend them to use my product, to solve the problem I intend for them to solve in the way I intend for them to solve it,” says Parsons. “But that’s not what happens when things get out in the real world.”

AI, of course, poses some distinct social and ethical challenges. Some of the technology’s unique challenges are inherent in the way that AI works: its statistical rather than deterministic nature, its identification and perpetuation of patterns from past data (thus reinforcing existing biases), and its lack of awareness about what it doesn’t know (resulting in hallucinations). And some of its challenges stem from what AI’s creators and users themselves don’t know: the unexamined bodies of data underlying AI models, the limited explainability of AI outputs, and the technology’s ability to deceive users into treating it as a reasoning human intelligence.

Parsons believes, however, that AI has not changed responsible tech so much as it has brought some of its problems into a new focus. Concepts of intellectual property, for example, date back hundreds of years, but the rise of large language models (LLMs) has posed new questions about what constitutes fair use when a machine can be trained to emulate a writer’s voice or an artist’s style. “It’s not responsible tech if you’re violating somebody’s intellectual property, but thinking about that was a whole lot more straightforward before we had LLMs,” she says.

The principles developed over many decades of responsible technology work still remain relevant during this transition. Transparency, privacy and security, thoughtful regulation, attention to societal and environmental impacts, and enabling wider participation via diversity and accessibility initiatives remain the keys to making technology work toward human good.

MIT Technology Review Insights’ 2023 report with Thoughtworks, “The state of responsible technology,” found that executives are taking these considerations seriously. Seventy-three percent of business leaders surveyed, for example, agreed that responsible technology use will come to be as important as business and financial considerations when making technology decisions. 

This AI moment, however, may represent a unique opportunity to overcome barriers that have previously stalled responsible technology work. Lack of senior management awareness (cited by 52% of those surveyed as a top barrier to adopting responsible practices) is certainly less of a concern today: savvy executives are quickly becoming fluent in this new technology and are continually reminded of its potential consequences, failures, and societal harms.

The other top barriers cited were organizational resistance to change (46%) and internal competing priorities (46%). Organizations that have realigned themselves behind a clear AI strategy, and that understand its industry-altering potential, may be able to overcome this inertia and indecision as well. At this singular moment of disruption, when AI provides both the tools and motivation to redesign many of the ways in which we work and live, we can fold responsible technology principles into that transition—if we choose to.

For her part, Parsons is deeply optimistic about humans’ ability to harness AI for good, and to work around its limitations with common-sense guidelines and well-designed processes with human guardrails. “As technologists, we just get so focused on the problem we’re trying to solve and how we’re trying to solve it,” she says. “And all responsible tech is really about is lifting your head up, and looking around, and seeing who else might be in the world with me.”

To read more about Thoughtworks’ analysis and recommendations on responsible technology, visit its Looking Glass 2024.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Providing the right products at the right time with machine learning

Whether your favorite condiment is Heinz ketchup or your preferred spread for your bagel is Philadelphia cream cheese, ensuring that all customers have access to their preferred products at the right place, at the right price, and at the right time requires careful supply chain organization and distribution. Amid the proliferation of e-commerce and shifting demand within the consumer-packaged goods (CPG) sector, AI and machine learning (ML) have become helpful tools in enabling efficiency and better business outcomes.

The journey toward successfully deployed machine learning operations (MLOps) starts with data, says Jorge Balestra, global head of machine learning operations and platforms at Kraft Heinz Company. Curating well-organized and accessible data means enterprises can leverage their data volumes to train and develop AI and machine learning models. A strong data strategy lays the foundation for these AI and machine learning tools to detect supply chain disruptions, identify and address cost inefficiencies, and predict demand for products.

“Never forget that data is the fuel, and data, it takes effort, it is a journey, it never ends, because that’s what is really what I would call what differentiates a lot of successful efforts compared to unsuccessful ones,” says Balestra.

This is especially crucial but challenging within the CPG sector where data is often incomplete given the inconsistent methods for consumer habit tracking among different retailers.

He explains, “We don’t know exactly and we don’t even want to know exactly what people are doing in their daily lives. What we want is just to get enough of the data so we can provide the right product for our consumers.”

To deploy AI and machine learning tools at scale, Kraft Heinz has turned to the flexibility of the cloud. Using the cloud allows for much-needed data accessibility while providing compute power that scales on demand. “The agility of the whole thing increases exponentially because what used to take months, now can be done in a matter of seconds via code. So, definitely, I see how all of this explosion around analytics, around AI, is possible, because of cloud really powering all of these initiatives that are popping up left, right, and center,” says Balestra.
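To make this concrete, here is a minimal sketch of what querying shared cloud data can look like in practice, assuming a BigQuery-style warehouse where storage and compute are separated; the project, dataset, and table names are hypothetical placeholders, not Kraft Heinz’s actual environment.

```python
# Minimal sketch: querying shared cloud data without provisioning servers.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-cpg-analytics")  # compute is allocated on demand

sql = """
    SELECT product_id, SUM(units_ordered) AS total_demand
    FROM `example-cpg-analytics.supply_chain.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY)
    GROUP BY product_id
    ORDER BY total_demand DESC
    LIMIT 10
"""

# Storage stays in the warehouse; only the query result comes back.
for row in client.query(sql).result():
    print(row.product_id, row.total_demand)
```

Because the warehouse allocates compute on demand, the same data can serve many teams at once without anyone provisioning servers or competing for them.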

While it may be challenging to predict future trends in a sector so prone to change, Balestra says that preparing for the road ahead means focusing on adaptability and agility.

“Our mission is to delight people via food. And the technology, AI or what have you, is our tool to excel at our mission. Being able to learn how to leverage existing and future [technology] to get the right product at the right price, at the right location is what we are all about.”

This episode of Business Lab is produced in partnership with Infosys Topaz and Infosys Cobalt.

Full Transcript

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma, and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Our topic is machine learning in the food and beverage industry. AI offers opportunities for innovation for customers and operational efficiencies for employees, but having a data strategy in place to capture these benefits is crucial.

Two words for you: global innovation.

My guest is Jorge Balestra, global head of machine learning operations and platforms at Kraft Heinz Company.

This episode of Business Lab is produced in partnership with Infosys Topaz and Infosys Cobalt.

Welcome, Jorge.

Jorge Balestra: Thank you very much. Glad to be here.

Laurel: Well, wonderful to have you. So people are likely familiar with Kraft Heinz since it is one of the world’s largest food and beverage companies. Could you talk to us about your role at Kraft Heinz and how machine learning can help consumers in the grocery aisle?

Jorge: Certainly. My role, I would say, has two major focuses. One of them is I lead the machine learning engineering operations of the company globally. And on the other hand, I provide all of the analytical platforms that the company is using, also on a global basis. So in role number one, my machine learning engineering and operations, what my team does is we take all of the models that our community of data scientists working globally are coming up with, and we strengthen them. Our major mission here is, first, to make sure that we are applying engineering practices so the models are production ready, can scale, and can run in a cost-effective manner, and from there, with my operations hat on, we ensure they are there when needed.

So a lot of these models, because they become part of our day-to-day operations, they’re going to come with certain specific service level commitments that we need to make, so my team makes sure that we are delivering on those with the right expectations. And on my other hand, which is the analytical platforms, is that we do a lot of descriptive, predictive, and prescriptive work in terms of analytics. The descriptive portion where you’re talking about just the regular dashboarding, summarization piece around our data and where the data lives, all of those analytical platforms that the company is using are also something that I take care of. And with that, you would think that I have a very broad base of customers in the company both in terms of geographies where they are from some of our businesses in Asia, all the way to North America, but also across the organization from marketing to HR and everything in between.

Going into your other question about how machine learning is helping our consumers in the grocery aisle, I’ll probably summarize that for a CPG it’s all about having the right product at the right price, at the right location for you. What that means is, on the right product, machine learning can help a lot of our marketing teams, for example, now that the latest generative AI capabilities are showing up for brainstorming and creating new content, all the way to R&D, where we’re trying to figure out the best formulas for our products; ML is definitely making inroads in that space. The right price is all about cost efficiencies throughout, from our plants to our distribution centers, making sure that we are eliminating waste. Leveraging machine learning capabilities is something that we are doing across the board, including our revenue management, which is about the right price for people to buy our products.

And then last but not least is the right location. So we need to make sure that when our consumers are going into their stores or are buying our products online that the product is there for you and you’re going to find the product you like, the flavor you like immediately. And so there is a huge effort around predicting our demand, organizing our supply chain, our distribution, scheduling our plants to make sure that we are producing the right quantities and delivering them to the right places so our consumers can find our products.

Laurel: Well, that certainly makes sense since data does play such a crucial role in deploying advanced technologies, especially machine learning. So how does Kraft Heinz ensure the accessibility, quality and security of all of that data at the right place at the right time to drive effective machine learning operations or MLOps? Are there specific best practices that you’ve discovered?

Jorge: Well, the best practice that I can probably advise people on is definitely data is the fuel of machine learning. So without data, there is no modeling. And data, organizing your data, both the data that you have internally and externally takes time. Making sure that it’s not only accessible and you are organizing it in a way that you don’t have a gazillion technologies to deal with is important, but also I would say the curation of it. That is a long-term commitment. So I strongly advise anyone that is listening right now to understand that your data journey, as it is, is a journey, it doesn’t have an end destination, and also it’s going to take time.

And the more you are successful in terms of getting all the data that you need organized and making sure that is available, the more successful you’re going to be leveraging all of that with models in machine learning and great things that are there to actually then accomplish a specific business outcome. So a good metaphor that I like to say is there’s a lot of researchers, and MIT is known for its research, but the researchers cannot do anything without the librarians, with all the people that’s organizing the knowledge around so you can go and actually do what you need to do, which is in this case research. Never forget that data is the fuel, and data, it takes effort, it is a journey, it never ends, because that’s what is really what I would call what differentiates a lot of successful efforts compared to unsuccessful ones.

Laurel: Getting back to that right place at the right time mentality, within the last few years, the consumer packaged goods, or you mentioned earlier, the CPG sector, has seen such major shifts from changing customer demands to the proliferation of e-commerce channels. So how can AI and machine learning tools help influence business outcomes or improve operational efficiency?

Jorge: I’ve got two examples that I can say. One is, well, obviously we all want to forget about what happened during the pandemic, but for us it was a key, very challenging time, because out of nowhere all of our supply chains got disrupted, our consumers needed our products more than ever because they were hunkered down at home. So one of the things I can tell you, at least for us, that was key was that through our modeling, through the data that we’ve had, we had some good early warning of certain disruptions in the supply chain and we were able to at least get… Especially when the outbreak started, a couple of weeks in advance, we were moving product, we were taking early actions in terms of ensuring that we were delivering an increased amount of product that was needed.

And that was because we had the data and we had some of those models that were alerting us about, “Hey, something is wrong here, something is happening with our supply chain, you need to take action.” And taking action at the right time, it’s key in terms of getting ahead of a lot of the things that can happen. And in our case, obviously we live in a competitive world, so taking actions before competition is important, that timing component. Another example I can give you, and it is something that we’re doing more and more nowadays, is this piece that I was referring to about the right location: product availability is key for CPG, and that is measured in something called the CFR, the customer fill rate, which means that when someone is ordering product from Kraft Heinz we are able to fulfill that order to 100%, and we expect to be really high, in the high 90s, in terms of how efficiently we are filling those orders.

We have developed new technology that I think we are pretty proud of because I think it is unique within CPG that allows us to really predict what is going to happen with CFR in the future based on the specific actions we’re taking today, whereas it’s changing our production lines, whereas changes in distribution, et cetera, we’re able to see not only the immediate effect, but what’s going to happen in the future with that CFR so we can really act on it and deliver actions right now that are in the benefit of our distribution in the future. So those are, I would call it, say, two examples in terms of how we’re leveraging AI and machine learning tools in our day-to-day operations.
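As a rough illustration of the metric Balestra describes, the sketch below computes a simple customer fill rate from hypothetical order data; it is only a toy calculation, not Kraft Heinz’s forward-looking CFR model.

```python
# Simplified illustration of a customer fill rate (CFR) calculation.
# The order data here is hypothetical; a real system would pull it from the supply chain platform.
orders = [
    {"sku": "ketchup-20oz", "ordered": 1200, "shipped": 1185},
    {"sku": "cream-cheese-8oz", "ordered": 800, "shipped": 800},
    {"sku": "mac-and-cheese", "ordered": 500, "shipped": 430},
]

total_ordered = sum(o["ordered"] for o in orders)
total_shipped = sum(o["shipped"] for o in orders)
cfr = total_shipped / total_ordered  # share of ordered units actually fulfilled

print(f"Customer fill rate: {cfr:.1%}")  # 96.6% for the sample data above
```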

Laurel: Are those examples, the CFR as well as the supply chain and making sure consumers had everything on demand almost, is this unique to the food and beverage industry? And what are perhaps some other unique challenges that the food and beverage industry faces when you’re implementing AI and machine learning innovations? And how do you navigate challenges like that?

Jorge: Yeah, I think something that is very unique for us is that we always have to deal with an incomplete picture in terms of the data that we have in our consumers. So if you think about it, when you go into a grocery store, a couple of things, well, you are buying from that store, the Kroger’s, Walmart’s, et cetera, and some of those will have you identified in terms of what is your consumption patterns, some will not. But also, in our case, if you are going to go buy a Philadelphia [cream cheese], for example, you may choose to buy your Philadelphia in multiple outlets. Sometimes you want more and you go to Costco, sometimes you need less, in my case, I live in the Chicago land area, I go to a Jewel supermarket.

We always have to deal with incomplete data on our customers, and that is a challenge because what we are trying to figure out is how to better serve our consumers based on what product you like, where you’re buying them, what is the right price point for you, but we’re always dealing with data that is incomplete. So in this case, having a clear data strategy around what we have there and a clear understanding of the markets that we have out there so we can really grab that incomplete data that we have out there and still come up with the right actions in terms of what are the right products to put, just to give you an example, a clear example of it is… And I’m going back to Philadelphia because, by the way, that’s my favorite Kraft product ever…

Laurel: Philadelphia cream cheese, right?

Jorge: Yes, absolutely. It’s followed by a close second with our ketchup. I have a soft spot for Philadelphia, pun intended.

Laurel: – and the ketchup.

Jorge: Exactly. No, but you have different presentations. You have the spreadable, you have the brick of cream cheese, within the brick you have some flavors, and what we want to do is make sure that we are providing the flavors that people really want, not producing the ones that people don’t want, because that’s just waste, without knowing specifically who is buying on the other side and you want to buy it in a supermarket, one or two, or sometimes you are shifting. But those are the things that we are constantly on the lookout for, and obviously dealing with the reality about, hey, data is going to be incomplete. We don’t know exactly and we don’t even want to know exactly what people are doing in their daily lives. What we want is just to get enough of the data so we can provide the right product for our consumers.

Laurel: And an example like cream cheese and ketchup probably, especially if a kid is in the house, it’s one of those products that you use on a fairly daily basis. So knowing all of this, how does Kraft Heinz prepare data for AI projects, because that in itself is a project? So what are the first steps to get ready for AI?

Jorge: One thing that we have been pretty successful on is what I would call the potluck approach for data. Meaning that individual projects, individual groups are focused on delivering a very specific use case, and that is the right thing to do. When you are dealing with a project in supply chain and you’re trying just to, for example, say, “Hey, I want to optimize my CFR,” you are really not going to be caring that much about what sales wants to do. However, if you implement a potluck approach, meaning that, okay, you need data from somebody else, and it’s very likely that you have data to offer because that’s part of your use case. So the potluck approach means that if you want to try out the food of somebody else, you need to bring your own to the table. So if you do that, what starts happening is your data, your enterprise data, becomes little by little more accessible, and if you do it right eventually you pretty much have a lot and almost everything in there.

That is one thing that I will strongly advise people to do. Think big, think strategically, but act tactically, act knowing that individual projects, they’re going to have more limited scope, but if you establish certain practices around sharing around how data should be managed, then each individual projects are going to be contributing to the larger strategy without the largest strategy being a burden for the individual projects, if that makes sense.

Laurel: Sure.

Jorge: So at least for us that has been pretty successful over time. So we have data challenges absolutely as everybody else has, but at least from what I’ve been able to hear from other people, but Kraft Heinz is in a good place in terms of that availability. Because once you reach a certain critical mass, what ends up happening is there’s no need to bring additional data, you are always now reusing it because data is large but it’s finite. So it’s not infinite. It’s not something that’s going to grow forever. If you do it right, you should see that eventually, you don’t need to bring in more and more data. You just need to fine-tune and really leverage the data that you have, probably be more granular, and probably get it faster. That’s a good signal. I have the data, but I need it faster because I need to act on it. Great, you’re on the right track. And also your associated cost around data should reflect that. It should not grow to infinity. Data is large but is finite.

Laurel: So speaking of getting data quickly and making use of it, how does Kraft Heinz use compute power and the cloud scaling ability for AI projects? How do you see these two strategies coming together?

Jorge: Definitely the technology has come a long way in the last few years, because what cloud is offering is more of that flexibility, and it’s removing a lot of the limitations, both in terms of the scale and performance we used to have. So to give you an example, a few years back I had to worry about “Do I have enough storage in my servers to host all the data that we are getting in?” And then if I didn’t, how long is it going to take for me to add another server? With the cloud as an enabler, that’s no longer an issue. It’s a few lines of code and you get what you need. Also, especially on the data side, some of the more modern technologies, talking about Snowflake or BigQuery, enable you to separate your compute from your storage. What it basically means in practical terms is you don’t have people fighting over limited compute power.

So data can be the same for everyone and everybody can be accessing the data without having to overlap each other and then fighting about, oh, if you run this, I cannot run that, and then we have all sorts of problems. So definitely what the cloud allowed us to do is get the technology out of the way as a limitation. And the great thing that has happened now with all the AI projects is that you can focus on actually delivering on the use cases that you have without limitations around “how am I going to scale?” That is no longer the case. You have to worry about costs, because it could cost you an arm and a leg, but not necessarily about how to scale and how long it’s going to take you to scale.

The agility of the whole thing increases exponentially because what used to take months, now can be done in a matter of seconds via code. So definitely I see how all of this explosion around analytics, around AI is possible, because of cloud really powering all of these initiatives that are popping up left, right, and center.

Laurel: And speaking about this, you can’t really go it alone, so how do partners like Infosys help bring in those new skills and industry know-how to help build the overall digital strategy for data, AI, cloud, and whatever comes next?

Jorge: Much in the same way that I think cloud has been an enabler in terms of this, I think companies and partners like Infosys are also that kind of enablers, because, in a way, they are part of what I would call an expertise ecosystem. I don’t think any company nowadays can do any of this on its own. You need partners. You need partners that both are bringing in new ideas, new technology, but also they are bringing in the right level of expertise in terms of people that you need, and in a global sense, at least for us, having someone that has a global footprint is important because we are a global company. So I will say that it’s the same thing that we talked about earlier about cloud being an enabler: that expert ecosystem represented by companies like Infosys is just another key enabler without which you will really struggle to deliver. So that’s what I’ll probably say to anyone that is listening right now, make sure that your ecosystem, your expert ecosystem is good and is thriving and you have the right partners for the right job.

Laurel: When you think about the future and also all these tough problems that you’re tackling at Kraft Heinz, how important will something like synthetic data be to your data strategy and business strategy as well? What is synthetic data? And then what are some of those challenges associated with using it to fill in the gaps for real-world data?

Jorge: In our case, we don’t use a lot of synthetic data nowadays, because the areas where we have holes to fill in terms of data are something we’ve been dealing with for a while. So we are, let’s put it this way, already familiar with how to fill in the gaps using some of the synthetic data techniques, but not really to the same extent as other organizations are. So we are still looking for opportunities where it makes sense to use and leverage synthetic data, but it’s not something that, at least for Kraft Heinz and CPG, we use as extensively or in as many places as other organizations do.

Laurel: And so, lastly, when you think ahead to the future, what will the digital operating model for an AI-first firm that’s focused on data look like? What do you see for the future?

Jorge: What I see for the future is, well, first of all, uncertainty, meaning that I don’t think we can predict exactly what’s going to happen because the area in particular is growing and evolving at a speed that I think is just honestly dazzling. I think at least what I would say is the real muscle that we need to be exercising and be ready for is adaptability. Meaning that we can learn, we can react, and apply all of the new things that are coming in hopefully at the same speed that they’re occurring and really leverage new opportunities when they present themselves in an agile way. But at least in terms of how to prepare for it, I think it’s more about preparing the organization, your team, to be ready for that, really act on it, and be ready also to understand the specific business challenges that are there, and look for opportunities where any of the new things, or maybe existing ones, can be applied to solve a specific problem.

We are a CPG company, and that means the right product, right price, right location, so anything boils down to how can I be better in those three dimensions leveraging whatever is available today, whatever’s going to be available tomorrow. But keep focusing on, at least for us, we are a CPG company, we manufacture Philadelphia [cream cheese], we manufacture ketchup, we feed people. Our mission is to delight people via food. And the technology, AI or what have you, is our tool to excel at our mission. Being able to learn how to leverage existing and future [technology] to get the right product at the right price at the right location is what we are all about.

Laurel: That’s fantastic. Thank you so much, Jorge. I appreciate you being with us today on the Business Lab.

Jorge: Thank you very much. Thank you for inviting me.

Laurel: That was Jorge Balestra, global head of machine learning operations and platforms at Kraft Heinz Company, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review.

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can also find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

 This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Unlocking the power of sustainability

According to UN climate experts, 2023 was the warmest year on record. This puts the heat squarely on companies to accelerate their sustainability efforts. “It’s quite clear that the sense of urgency is increasing,” says Jonas Bohlin, chief product officer for environmental, social, and governance (ESG) platform provider Position Green.

That pressure is coming from all directions. New regulations, such as the Corporate Sustainability Reporting Directive (CSRD) in the EU, require that companies publicly report on their sustainability efforts. Investors want to channel their money into green opportunities. Customers want to do business with environmentally responsible companies. And organizations’ reputations for sustainability are playing a bigger role in attracting and retaining employees.

On top of all these external pressures, there is also a significant business case for sustainability efforts. When companies conduct climate risk audits, for example, they are confronted with escalating threats to business continuity from extreme weather events such as floods, wildfires, and hurricanes, which are occurring with increasing frequency and severity.

Mitigating the risks associated with direct damage to facilities and assets, supply chain disruptions, and service outages very quickly becomes a high-priority issue of business resiliency and competitive advantage. A related concern is the impact of climate change on the availability of natural resources, such as water in drought-prone regions like the American Southwest.

Much more than carbon

“The biggest misconception that people have is that sustainability is about carbon emissions,” says Pablo Orvananos, global sustainability consulting lead at Hitachi Digital Services. “That’s what we call carbon tunnel vision. Sustainability is much more than carbon. It’s a plethora of environmental issues and social issues, and companies need to focus on all of it.”

Companies looking to act will find a great deal of complexity surrounding corporate sustainability efforts. Companies are responsible not only for their own emissions and fossil fuel usage (Scope 1), but also for the emissions tied to the energy they purchase (Scope 2) and to their wider value chain, including supply chain partners (Scope 3). New regulations require organizations to look beyond just emissions. Companies must ask questions about a broad range of environmental and societal issues: Are supply chain partners sourcing raw materials in an environmentally conscious manner? Are they treating workers fairly?

Sustainability can’t be siloed into one specific task, such as decarbonizing the data center. The only way to achieve sustainability is with a comprehensive, holistic approach, says Daniel Versace, an ESG research analyst at IDC. “A siloed approach to ESG is an approach that’s bound to fail,” he adds.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Building innovation with blockchain

In 2015, JPMorgan Chase embarked on a journey to build a more secure and open approach to wholesale banking. For Suresh Shetty, chief technology officer at Onyx by J.P.Morgan, investing in blockchain, a distributed ledger technology then in its early days, was about ubiquity.

“We actually weighted ubiquity in terms of who can use the technology, who was trying to use the technology over technology superiority,” says Shetty. “Because eventually, our feeling was that the network effect, the community effect of ubiquity, actually overcomes any technology challenges that a person or a firm might have.”

Years later, JPMorgan Chase has Onyx, a blockchain-based platform for leveraging innovations at scale and solving real-world banking challenges. Chief among them are global wholesale payment transactions. Far more complicated than simply moving money from point A to point B, Shetty says, wholesale transactions involve multiple hops and regulatory obligations.

Transferring money around the world requires several steps, including a credit check, sanctions check, and account validation. The process can lead to errors and hiccups. This is where blockchain comes in.

“Now, as you can imagine, because of the friction in this process and the multiple hops, it is a process that’s very prone to error. So this is one of the ideal use cases for a blockchain, where we try to take out that operational friction from any process.”

Although blockchain has the potential to cause major waves in financial services, from securing transactions to ensuring smooth operations, sustainability remains a major consideration with any technology deployed at this scale. The shift from proof-of-work to proof-of-stake systems, says Shetty, reduces emissions and energy consumption.

“The amount of energy that’s being used in a proof-of-stake system goes down to 1% of the overall carbon impact of a proof-of-work system, so thereby, that shift alone was very important from a carbon emission perspective.”

This episode of Business Lab is produced in association with JPMorgan Chase.

Full Transcript 

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma, and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Our topic today is blockchain. Technology has changed how money moves around the world, but the opportunity and value of distributed ledger technology are still in the early days. However, deploying it openly and securely at large scale should move things along quickly.

Two words for you: building innovation.

My guest is Suresh Shetty, who is the chief technology officer at Onyx by J.P.Morgan at JPMorgan Chase.

This podcast is produced in association with JPMorgan Chase.

Welcome, Suresh.

Suresh Shetty: Thank you so much, Laurel. Looking forward to the conversation.

Laurel: So to set the context of this conversation, JPMorgan Chase began investing in blockchain in 2015, which as we all know, in technology years is forever ago. Could you describe the current capabilities of blockchain and how it’s evolved over time at JPMorgan Chase?

Suresh: Absolutely. So when we began this journey, as you mentioned, in 2015, 2016, as any strategy and exploration of new technologies, we had to choose a path. And one of the interesting things is that when you’re looking at strategic views of five, 10 years into the future, inevitably, there needs to be some course correction. So what we did in JPMorgan Chase was we looked at a number of different lines of inquiry, and in each of these lines of inquiries, our focus was trying to be as inclusive as possible. So what we mean by that is that we actually weighted ubiquity in terms of who can use the technology, who was trying to use the technology over technology superiority. Because eventually, our feeling was that the network effect, the community effect of ubiquity, actually overcomes any technology challenges that a person or a firm might have.

Now, I think that a very relevant example is the Betamax-VHS example. It’s a bit dated but I think it really is important in this type of use case. So as many of you know, Betamax was a superior technology at the time and VHS was much more ubiquitous in the marketplace. And over time, what happened was that people gravitated, firms gravitated towards that ubiquity over the superiority of the technology that was in Betamax. And similarly, that was our feeling too in terms of blockchain in general and specifically the path that we took, which was in and around the Ethereum ecosystem. We felt that the Ethereum ecosystem had the largest developer community, and we thought over time, that was where we needed to focus in on.

So I think that that was our journey to date in terms of looking, and we continue to make those decisions in terms of collaboration, inclusiveness, as opposed to just purely looking at technology itself.

Laurel: And let’s really focus on those efforts. In 2020, the firm debuted Onyx by J.P.Morgan, which is a blockchain-based platform for wholesale payment transactions. Could you explain what wholesale payment transactions are and why they’re the basis of Onyx’s mission?

Suresh: Absolutely. Now, it was interesting. My background is that I came from the markets world and markets is really involved in front office trading, investment banking and so forth, and eventually, went over to the payments world. And if you juxtapose the two, it’s actually very interesting because initially, people feel that the market space is much more complicated, much more exciting than payments, and they feel that payments is a relatively straightforward exercise. You’re moving money from point A to point B.

What actually happens is actually, payments is much more complicated, especially from a transactional perspective. So what I mean by that is that if you look at markets, what happens is if you do a transaction, it flows through. If there’s an error, what you do is that you correct the initial transaction, cancel it, and put in a new transaction. So all you do is that there’s a series of cancel corrects, all of which are linked together by the previous transaction, so there’s a daisy chain of transactions which are relatively straightforward and easy to migrate upon.

But if you look at the payments world, what happens is that you have a transaction, it flows through. If there’s an error, you hold the transaction, you correct it, and then keep going. Now, if you think about it from a technology perspective, this is a lot more complicated because what you have to do is you have to keep in mind the state engine of the transactional flow, and you have to store it somewhere, and then you have to constantly make sure that as it flows to the next unit of work, it actually is not only referenced but it actually has the data and transactionality from the previous unit of work. So a lot more complicated.

Now, from a business perspective, what cross-border payments or wholesale payments involved is that, as I mentioned, you’re moving money from point A to point B. In an ideal fashion, and I’ll give you an example. Since I’m in India, in an ideal example, we would move money from JPMorgan Chase to State Bank of India, and the transaction is complete, and everybody is happy. And in between that transaction, we do things like a credit check to make sure that the money that is being sent, there’s money in the account of the sender. We need to make sure that the receiver of the account has a valid bank account, so you need to do that validation, so there’s a credit check. Then on top of that, you do a sanctions check. A sanctions check means that we are evaluating whether the money is being moved to a bad actor, and if it is, we stop the transaction and we inform the relevant parties. So it looks relatively straightforward in an idealized version.

Unfortunately, what happens is because of the fractured nature of banking across the world as well as regulatory obligations, what happens is that it’s never a single point to point movement. It involves multiple hops. So in that same example where I’m moving money from JPMorgan Chase to India, what usually happens is JPMorgan Chase sends it to, let’s say Standard Chartered in England. Standard Chartered then sends it to State Bank of India. State Bank of India then sends it to Bank of Baroda, and then Bank of Baroda eventually sends it to my bank, which is Vijaya Bank in India.

In each of those steps or hops, a credit check happens, a sanctions check happens, and the money moves forward. Also, there’s an account validation step that also happens to make sure that the payment transactional flow is correct, as well as the detail in the payment messages are correct as well. Now, as you can imagine, because of the friction in this process and the multiple hops, it is a process that’s very prone to error. So this is one of the ideal use cases for a blockchain, where we try to take out that operational friction from any process.
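To make the flow Shetty describes concrete, here is a deliberately simplified sketch of a multi-hop correspondent payment in Python; the banks, checks, and pass/fail logic are illustrative placeholders and do not represent Onyx’s or any bank’s actual implementation.

```python
# Illustrative sketch of a multi-hop correspondent payment.
# Bank names and check logic are placeholders, not any institution's real process.
from dataclasses import dataclass

@dataclass
class Payment:
    sender_account: str
    receiver_account: str
    amount: float

def credit_check(payment: Payment) -> bool:
    # Placeholder: confirm the sender's account holds sufficient funds.
    return True

def sanctions_check(payment: Payment) -> bool:
    # Placeholder: confirm the receiver is not a sanctioned party.
    return True

def validate_account(payment: Payment) -> bool:
    # Placeholder: confirm the receiving account exists and the message details match.
    return True

HOPS = ["Bank A", "Correspondent Bank B", "Correspondent Bank C", "Beneficiary Bank D"]

def route_payment(payment: Payment) -> bool:
    """Each hop repeats the same checks, so errors and delays can compound."""
    for bank in HOPS:
        for check in (credit_check, sanctions_check, validate_account):
            if not check(payment):
                print(f"{bank}: {check.__name__} failed; payment held for repair")
                return False
        print(f"{bank}: checks passed, forwarding payment")
    return True

route_payment(Payment("sender-001", "receiver-042", 10_000.00))
```

Because every hop repeats the same checks, a single failed validation anywhere in the chain holds the payment, which is the operational friction a shared ledger aims to remove.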

Laurel: That’s a really good illustrative example since one of the benefits of being a global firm is that JPMorgan Chase can operate at this massive scale. So what are some of the benefits and challenges that blockchain technology presents to the firm? I think you kind of alluded to some there.

Suresh: Absolutely, and it’s interesting, people sometimes conflate the technology innovation in the blockchain with a moonshot. Now, what’s interesting is that blockchain itself is based on very sound computing principles that have been around for a very long time, which are actually based on distributed computing. So at the heart of blockchain, it is a distributed computing system or platform. Now, all the challenges that you would have in a distributed computing platform, you would have within blockchain. Now this is further heightened by the fact that you have multiple nodes on the network. Each of those nodes has a copy of the data as well as the business logic. So one of the real challenges that we have is around latency in the network. So the number of nodes is directly correlated to the amount of latency that you have in the network, and that’s something that in a financial transaction, we have to be very cognizant about.

Secondarily is that there is an enormous amount of existing assets that are already in place from a code perspective within the enterprise. So the question is do we need to rewrite the entire code base in the languages that are supported by the various blockchains? So in Ethereum, do we need to rewrite all of this in Solidity, or can we somehow leverage the language or the code base that’s already been created? Now in our experience, we’ve had to actually do quite an extensive analysis on what needs to be on chain as opposed to what needs to be off chain. The off chain code base is something that we need to be able to leverage as we go forward because the business feels comfortable about that, and the question is why would we need to rewrite that? And the stuff that’s on the chain itself, that needs to be something that we really feel is important to be able to be distributed to the various nodes in the network itself. So I think that that’s some of the challenges that we’ve had in the blockchain space.

Now, in terms of benefits, I think that at the end of the day, we want to be able to have a cryptographically secure, auditable transactional record. And I think that there are many use cases within banking, especially those that are really within the sweet spot of the blockchain, such as those that require a lot of reconciliation among multiple actors on a distributed platform, regardless of whether it’s on blockchain or not.

Laurel: And cybersecurity is definitely one of those areas where blockchain can help, for example, transactions, improve transparency, et cetera. But how can organizations ensure safe and robust blockchain transactions and networks?

Suresh: Fantastic question. It’s interesting that JPMorgan Chase is a private permissioned network. Now what does that mean? That means that every actor within our blockchain network is actually known to us. Now, it’s also interesting that hand in hand with that security aspect are the operational considerations of actually running a network. So we would need to be able to not only ensure security across the network, but we need to also ensure that we have transactional flows that meet the service level agreements between the various actors. Now, in a centralized private permissioned network, which is what Onyx is, JPMorgan Chase has taken on the onus of running the network itself.

Now, people want to be able to say that they want to run their own nodes and they want to be able to ensure their own security, which is great if it’s unique and singular to themselves, but when you’re participating in a network, the weakest link in the chain actually becomes your greatest challenge. So all of the actors or all the nodes that are participating in our network would have to meet the same security and operational considerations that everyone else has. So when we pose that question to the participants in our network and say, “Listen, you have an opportunity to run your own node or you can have us do it for you,” most of them, 95% of them, want us to run their nodes for them. So I think that that’s one of the big challenges or one of the big aspects of a private permissioned network as opposed to a public network.

Now, we’re increasingly seeing that there needs to be some integration across private permissioned networks and public networks. And again, when we have to integrate between these, we again run into classical problems or classical challenges, I should say, with the interconnected distributed platforms. But I think that this goes directly to the level of scale that we need to be at, especially within JPMorgan Chase, in order to be successful.

Laurel: So there’s also the challenge of keeping up with emerging technologies. What are some of the emerging or advanced technologies that have enabled blockchain innovations? And this is clearly important to Onyx since it created a blockchain launch to focus on developing and commercializing those new applications and networks.

Suresh: Absolutely. So within Onyx, we have three main lines of businesses and then we have Blockchain Launch, which looks at the hill beyond the hill in terms of evaluating new technologies. And we’ve done everything from looking at zero-knowledge proofs to trying to beam payments through a satellite back down to Earth and all of those types of things to create business value for our clients.

So I would say that the two most exciting things, or there’s a third one which I think there’s a topic that we’ll broach later, but the two most exciting topics that we’ve talked about so far and we’re very excited about is around zero-knowledge proofs as well as artificial intelligence and machine learning. If you think about the network that we have right now within JPMorgan Chase for Onyx, the various participants within the network will eventually start to create enough data through the network effect that it might be very interesting to see what other data enrichment, data mining type use cases can come out of that, and we are only going to see an uptick in that as you start to expand the network and we start to get more scale as we add more use cases onto the Onyx network.

Laurel: And so while we’re on that topic of emerging technologies, how does quantum computing and blockchain, how do those two technologies work together?

Suresh: So the quantum computing piece and the blockchain piece are very collaborative and very symbiotic in nature. Now, if you think about the idea of utilizing quantum mechanics, it’s been around since the mid-1970s, when it was first proposed that there was an algorithm by which very large numbers could be factored using a theoretical quantum computer. It was pretty much in the background, and then suddenly in October 2019, Google announced that it achieved quantum supremacy by solving a problem in 200 seconds that would’ve taken thousands of years to be able to solve.

And although that specific use case was sort of not specific to a business use case, the impact of that is very far-reaching because all of a sudden, it demonstrated that you could use quantum computing to actually create a mechanism that would impact pretty much every cryptographically secure transactional flow.

So as we are looking through this, some of the things that we looked at in quantum computing were quantum key distribution, cryptographically secure vaulting, and distributed identity. All of these we believe are key to the future of blockchain and actually impact even things as mundane as the topic that we spoke about before, which is around the cross-border transactional flow as well.

Laurel: So while blockchain certainly seems to have the potential to shift the financial services industry, the need to focus on sustainability goals and follow regulations are also a major consideration. So how can innovations in blockchain be balanced with mitigating its emissions and environmental impact?

Suresh: This is a question that we’ve been asked many times by our businesses in terms of how environmentally conscious are we? I would say that one of the big advances recently, especially within the Ethereum space, was the shift from proof of work to proof of stake.

Now in proof-of-work systems, miners compete with one another to see who can problem solve the fastest in exchange for crypto rewards, and because of this, proof-of-work systems take up a large amount of energy. Juxtaposed with this are proof-of-stake systems, which rely on market incentives and validators, and in exchange for the right to add blocks, they remove the competition from the system. Now, because of this, the amount of energy that’s being used in a proof-of-stake system goes down to 1% of the overall carbon impact of a proof-of-work system, so thereby, that shift alone was very important from a carbon emission perspective.

Secondarily, within the Onyx system itself, we’ve shifted to a situation where we have set our gas fees to zero and the only compute is minimalistic in terms of just computing the business logic itself. And also, we’re using the BFT class of algorithms as well as Raft. Neither is compute intensive, aside from the business logic itself.

Laurel: Thank you, Suresh. You’ve certainly given us a lot to think about. So looking to the future, what are some trends in technology that you’re excited about in the next three to five years?

Suresh: So I think that we mentioned some of the topics before around quantum computing, artificial intelligence, machine learning. All of those I think are very important to us. Now, I would also say that the three to five year time horizon is probably too long. I think that when we speak in investment banking, we speak about an 18- to 24-month time horizon. We think that that’s probably a similar time horizon that we’re seeing in the blockchain space itself. So as we evolve, I think that the really interesting aspect of this is going to be where social networks and business networks overlap and how they organically evolve to support each other as we go forward, and how the payment space itself evolves in order to take advantage of this.

Laurel: Excellent. Thank you so much for joining us today on the Business Lab, Suresh.

Suresh: Fantastic. Thank you so much, Laurel.

Laurel: That was Suresh Shetty, chief technology officer at Onyx by J.P. Morgan, whom I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review.

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the global director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This podcast is for informational purposes only and is not intended as legal, tax, financial, investment, accounting, or regulatory advice. Opinions expressed herein are the personal views of the individual(s) and do not represent the views of JPMorgan Chase & Co. The accuracy of any statements, linked resources, reported findings, or quotations is not the responsibility of JPMorgan Chase & Co.

Actionable insights enable smarter business buying

For decades, procurement was seen as a back-office function focused on cost-cutting and supplier management. But that view is changing as supply chain disruptions and fluctuating consumer behavior ripple across the economy. Savvy leaders now understand procurement’s potential to deliver unprecedented levels of efficiency, insights, and strategic capability across the business.

However, tapping into procurement’s potential for generating value requires mastering the diverse needs of today’s global and hybrid businesses, navigating an increasingly complex supplier ecosystem, and wrangling the vast volumes of data generated by a rapidly digitalizing supply chain. Advanced procurement tools and technologies can support all three.

Purchasing the products and services a company needs to support its daily operations aggregates thousands of individual decisions, from a remote worker selecting a computer keyboard to a materials expert contracting with suppliers. Keeping the business running requires procurement processes and policies set by a chief procurement officer (CPO) and team who “align their decisions with company goals, react to changes with speed, and are agile enough to ensure a company has the right products at the right time,” says Rajiv Bhatnagar, director of product and technology at Amazon Business.

At the same time, he says, the digitalization of the supply chain has created “a jungle of data,” challenging procurement to “glean insights, identify trends, and detect anomalies” with record speed. The good news is that advanced analytics tools can tackle these obstacles and establish a data-driven, streamlined approach to procurement. Aggregating the copious data produced by enterprise procurement—and empowering procurement teams to recognize and act on patterns in that data—enables speed, agility, and smarter decision-making.

Today’s executives increasingly look to data and analytics to enable better decision-making in a challenging and fast-changing business climate. Procurement teams are no exception. In fact, 65% of procurement professionals report having an initiative aimed at improving data and analytics, according to The Hackett Group’s 2023 CPO Agenda report.

And for good reason—analytics can significantly enhance supply chain visibility, improve buying behavior, strengthen supply chain partnerships, and drive productivity and sustainability. Here’s how.

Gaining full visibility into purchasing activity

Just getting the full view of a large organization’s procurement is a challenge. “People involved in the procurement process at different levels with different goals need insight into the entire process,” says Bhatnagar. But that’s not easy given the layers upon layers of data being managed by procurement teams, from individual invoice details to fluctuating supplier pricing. Complicating matters further is the fact that this data exists both within and outside of the procurement organization.

Fortunately, analytics tools deliver greater visibility into procurement by consolidating data from myriad sources. This allows procurement teams to mine the most comprehensive set of procurement information for “opportunities for optimization,” says Bhatnagar. For instance, procurement teams with a clear view of their organization’s data may discover an opportunity to reduce complexity by consolidating suppliers or shifting from making repeated small orders to more cost-efficient bulk purchasing.

Identifying patterns—and responding quickly

When carefully integrated and analyzed over time, procurement data can reveal meaningful patterns—indications of evolving buying behaviors and emerging trends. These patterns can help to identify categories of products with higher-than-normal spending, missed targets for meeting supplier commitments, or a pattern of delays for an essential business supply. The result, says Bhatnagar, is information that can improve budget management by allowing procurement professionals to “control rogue spend” and modify a company’s buying behavior.
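
As a rough illustration of the kind of pattern-spotting Bhatnagar describes, the sketch below flags product categories whose latest monthly spend deviates sharply from their historical baseline. The data, column names, and two-standard-deviation threshold are all invented for the example; a real procurement analytics platform would work from consolidated invoice and purchase-order data.

```python
import pandas as pd

# Hypothetical procurement spend records; values and labels are illustrative.
spend = pd.DataFrame({
    "month": ["2024-01"] * 3 + ["2024-02"] * 3 + ["2024-03"] * 3,
    "category": ["IT hardware", "Office supplies", "Logistics"] * 3,
    "amount": [120_000, 8_000, 45_000,
               118_000, 9_500, 47_000,
               210_000, 8_800, 46_500],
})

# Baseline: mean and standard deviation of monthly spend per category.
history = spend[spend["month"] < "2024-03"]
baseline = history.groupby("category")["amount"].agg(["mean", "std"])

# Flag categories whose latest month deviates sharply from that baseline.
latest = spend[spend["month"] == "2024-03"].set_index("category")["amount"]
zscores = (latest - baseline["mean"]) / baseline["std"]
print(zscores[zscores.abs() > 2])  # IT hardware stands out as possible rogue spend
```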

In addition to highlighting unwieldy spending, procurement data can provide a glimpse into the future. These days, the world moves at a rapid clip, requiring organizations to react quickly to changing business circumstances. Yet only 25% of firms say they are able to identify and predict supply disruptions in a timely manner “to a large extent,” according to Deloitte’s 2023 Global CPO survey.

“Machine learning-based analytics can look for patterns much faster,” says Bhatnagar. “Once you have detected a pattern, you can take action.” By detecting patterns in procurement data that could indicate supply chain interruptions, looming price increases, or new cost drivers, procurement teams can proactively account for market changes. For example, a team might enable automatic reordering of an essential product that is likely to be impacted by a supply chain bottleneck.
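
A simple way to picture the proactive reordering Bhatnagar mentions is a rule that compares how long current stock will last against how long deliveries are currently taking. The sketch below is a hypothetical heuristic, not a description of Amazon Business functionality; the numbers and safety margin are made up for illustration.

```python
from statistics import mean

def should_reorder(recent_lead_times_days, on_hand_units, daily_usage,
                   safety_days=5):
    """Hypothetical rule: reorder early when observed lead times are trending up
    and current stock would run out before a delayed order could arrive."""
    expected_lead_time = mean(recent_lead_times_days[-4:])  # recent trend
    days_of_cover = on_hand_units / daily_usage
    return days_of_cover < expected_lead_time + safety_days

# Example: lead times have crept from 7 to 14 days for an essential product.
if should_reorder([7, 9, 12, 14], on_hand_units=600, daily_usage=40):
    print("Trigger automatic purchase order")  # stand-in for an ERP or marketplace call
```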

Sharing across the partner ecosystem

Data analysis allows procurement teams to “see some of the challenges and react to them in real time,” says Bhatnagar. But no organization acts alone. Today’s supplier ecosystems are deeply interconnected networks of supply-chain partners with complex interdependencies.

For this reason, sharing data-driven insights with suppliers helps organizations better pinpoint causes for delays or inaccurate orders and work collaboratively to overcome obstacles. Such “discipline and control” over data, says Bhatnagar, not only creates a single source of truth for all supply-chain partners, but helps eliminate finger-pointing while also empowering procurement teams to negotiate mutually beneficial terms with suppliers.

Improving employee productivity and satisfaction

Searching for savings opportunities, negotiating with suppliers, and responding to supply-chain disruptions—these time-consuming activities can negatively impact a procurement team’s productivity. However, by relying on analytics to discover and share meaningful patterns in data, procurement teams can shift focus from low-value tasks to business-critical decision-making.

Shifting procurement teams to higher-impact work results in a better overall employee experience. “Using analytics, employees feel more productive and know that they’re bringing more value to their job,” says Bhatnagar.

Another upside of heightened employee morale is improved talent retention. After all, workers with a sense of value and purpose are likelier to stay with an employer. This is a huge benefit at a time when nearly half (46%) of CPOs cite the loss of critical talent as a high or moderate risk, according to Deloitte’s 2023 Global CPO survey.

Meeting compliance metrics and organizational goals

Procurement analytics can also deliver on a broader commitment to changing how products and services are purchased.

According to a McKinsey Global Survey on environmental, social, and governance (ESG) issues, more than nine in ten organizations say ESG is on their agenda. Yet 40% of CPOs in the Deloitte survey report their procurement organizations need to define or measure their own set of relevant ESG factors.

Procurement tools can bridge this gap by allowing procurement teams to search for vendor or product certifications and generate credentials reports to help them shape their organization’s purchases toward financial, policy, or ESG goals. They can develop flexible yet robust spending approval workflows, designate restricted and out-of-policy purchases, and encourage the selection of sustainable products or preference for local or minority-owned suppliers.
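
A spending approval workflow of the kind described above can be pictured as a small policy check that every purchase request passes through. The categories, threshold, and credential labels below are placeholders invented for the example, not actual Amazon Business policy settings.

```python
from dataclasses import dataclass

# Illustrative policy values; real thresholds and categories would be set by the CPO team.
APPROVAL_THRESHOLD = 5_000                      # orders above this need manager approval
RESTRICTED_CATEGORIES = {"gift cards", "alcohol"}
PREFERRED_CREDENTIALS = {"small business", "minority-owned", "sustainability certified"}

@dataclass
class PurchaseRequest:
    category: str
    amount: float
    seller_credentials: set

def evaluate(request: PurchaseRequest) -> str:
    """Route a purchase request according to the illustrative policy above."""
    if request.category in RESTRICTED_CATEGORIES:
        return "blocked: out-of-policy category"
    if request.amount > APPROVAL_THRESHOLD:
        return "routed for approval"
    if request.seller_credentials & PREFERRED_CREDENTIALS:
        return "auto-approved: preferred seller"
    return "auto-approved"

print(evaluate(PurchaseRequest("office supplies", 800, {"small business"})))
```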

“A credentials report can really allow organizations to improve their visibility into sustainability [initiatives] when they’re looking for seller credentials or compliant credentials,” says Bhatnagar. “They can track all of their spending from diverse sellers or small sellers—whatever their goals are for the organization.”

Delivering the procurement of tomorrow

Advanced analytics can free procurement teams to glean meaningful insights from their data—information that can drive tangible business results, including a more robust supplier ecosystem, improved employee productivity, and a greener planet.

As supply chains become increasingly complex and the ecosystem increasingly digital, data-driven procurement will become critical. In the face of growing economic instability, talent shortages, and technological disruption, advanced analytics capabilities will enable the next generation of procurement.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Learn how Amazon Business is leveraging AI/ML to offer procurement professionals more efficient processes, a greater understanding of smart business buying habits and, ultimately, reduced prices.

Start with data to build a better supply chain

In business, the acceleration of change means enterprises have to live in the future, not the present. Having the tools and technologies to enable forward-thinking and underpin digital transformation is key to survival. Supply chain procurement leaders are tasked with improving operational efficiencies and keeping an eye on the bottom line. For Raimundo Martinez, global digital solutions manager of procurement and supply chain at bp, the journey toward building a better supply chain starts with data.

“So, today, everybody talks about AI, ML, and all these tools,” says Martinez. “But to be honest with you, I think your journey really starts a little bit earlier. When we go out and think about these advanced technologies, which obviously have their place, I think in the beginning, what you really need to focus on is your foundational [layer], and that is your data.”

In that vein, all of bp’s data has been migrated to the cloud and its multiple procurement departments have been consolidated into a single global procurement organization. Having a centralized, single data source can reduce complexities and avoid data discrepancies. The biggest challenge to changes like data centralization and procurement reorganization is not technical, Martinez says, but human. Bringing another tool or new process into the fold can cause some to push back. Making sure that employees understand the value of these changes and the solutions they can offer is imperative for business leaders.

Supply chain visibility, where an enterprise keeps track of its logistics, inventory, and processes, can be a costly investment, and it goes hand in hand with honesty toward both employees and end users. For a digital transformation journey of bp’s scale, an investment in supply chain visibility is an investment in customer trust and business reputability.

When users are brought into the journey, “they feel part of it. They’re more willing to give you feedback. They’re also willing to give you a little bit more leeway. If you say that the tool, or some feature, is going to be delayed a month, for example, but you don’t give the reasons and they don’t have that transparency and visibility into what is driving that delay, people just lose faith in your tool,” says Martinez.

Looking to the future, Martinez stresses the importance of a strong data foundation as a precursor to taking advantage of emerging technologies like AI and machine learning that can work to create a semi-autonomous supply chain.

“Moving a supply chain from a transactional item to a much more strategic item with the leverage of this technology, I think, that, to me, is the ultimate vision for the supply chain,” says Martinez.

This episode of Business Lab is produced in partnership with Infosys Cobalt.

Full Transcript

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma. And this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Our topic is building a better supply chain. AI can bring efficiencies to many aspects of an enterprise, including the supply chain. And where better to start than internal procurement processes? With better data, better decisions can be made more quickly, both internally and by customers and partners. And that is better for everyone.

Two words for you: automating transformation.

My guest is Raimundo Martinez, who is the global digital solutions manager of procurement and supply chain at bp.

This episode of Business Lab is produced in partnership with Infosys Cobalt.

Welcome, Raimundo.

Raimundo Martinez: Hi, Laurel. Thanks for having me today.

Laurel: So, let’s start with providing some context to our conversation. bp has been on a digital transformation journey. What spurred it, and how is it going?

Raimundo: I think there are many factors spurring digital transformation. But if I look at all of them, probably the key one is the rate of change in the world today. Instead of slowing down, I think the rate of change is accelerating, and that makes quick access to data a matter of business survival: you almost have to live not in the present but in the future, and have tools and technologies that allow you to see what is coming, what routes of action you can take, and then to enact those mitigation plans faster.

And I think that’s where digital transformation is the key enabler. I would say that’s on the business side. The other factor is the people mindset change, and that ties into how things are going. I think things are going pretty well. Technology-wise, I’ve seen a large number of tools and technologies adopted. But probably the most important thing is this mindset in the workforce and the adoption of agile. The rate of change that we just talked about in the first part can probably only be tamed when the whole workforce has this agile mindset to react to it.

Laurel: Well, supply chain procurement leaders are under pressure to improve operational efficiencies while keeping a careful eye on the bottom line. What is bp’s procurement control tower, and how has it helped with bp’s digital transformation?

Raimundo: Yeah, sure. In a nutshell, think of it as a myriad of systems of record where you have your data, and users having to go to all of them. So, our control tower consolidates all the data in a single platform. And what we have done is not just present the data, but truly configure the data in the form of alerts. The idea is to tell our user, “This is what’s important. These are the three things that you really need to take care of now.” And not stopping there, but then saying, “Look, in order to take that action, we’re giving you summary information so you don’t have to go to any other system to understand what is driving that alert.” And then on top of that, we’re integrating the platform with those systems of record so that requests can be completed in seconds instead of weeks.

So, that in a nutshell is the control tower platform. And the way it has helped comes back, again, to tools and people. On the tool side, being able to demonstrate how this automation is done and the value of it, and enabling other units to reuse the work that you have done, accelerates and inspires other technical teams to take advantage of it. And on the user side, one of the effects has been, again, this agile mindset: everything that we’ve done in the tool development is agile. So, bringing the users into that journey has actually helped us accelerate that aspect of our digital transformation.
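
To illustrate the alerting idea Martinez describes, the sketch below takes records consolidated from several systems and surfaces the few items a user should act on first, each with enough summary context to act without opening another system. The records, fields, and scoring rule are invented for the example and are not bp’s actual control tower logic.

```python
# Hypothetical consolidated records from several systems of record.
open_items = [
    {"id": "PO-881", "type": "purchase order", "days_overdue": 12, "value": 40_000},
    {"id": "INV-204", "type": "invoice", "days_overdue": 2, "value": 3_000},
    {"id": "PO-915", "type": "purchase order", "days_overdue": 30, "value": 250_000},
]

def priority(item):
    # Simple illustrative scoring: older and higher-value items surface first.
    return item["days_overdue"] * item["value"]

# Present the user with a short, prioritized list of alerts plus summary context.
top_alerts = sorted(open_items, key=priority, reverse=True)[:3]
for item in top_alerts:
    print(f"{item['id']}: {item['type']} overdue {item['days_overdue']} days, "
          f"value ${item['value']:,}")
```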

Laurel: On that topic of workplace agility: in 2020, bp began a reorganization that consolidated its procurement departments into a single global procurement organization. What were the challenges that resulted from this reorganization?

Raimundo: Yeah. To give you more context on that: bp is a really large global organization divided into business units, and before the reorganization, every one of these business units had its own procurement department, which handled literally billions of dollars; that’s how big they were. And each business had its own ERP systems, its own contract repository, its own processes and process deviations, but only managed its own portfolio. Once you integrate all of those organizations into a single one, your responsibility now spans multiple business units, with your data sitting in all of these business systems.

So, if you want to create a report, it’s really complicated, because not only do you have to go to these different systems, but the taxonomy of the data is different. For example, one business will call its territory North America, while another will call it east and west coast. So, if you want a report for a new business owner, it becomes really, really hard, and the reports might not be as complete as they should be. That really calls for some tools that we need to put in place to support that. And on top of that, the volume of requests is now so much greater that just changing and adding steps to the process isn’t going to be enough. You really need to look into automation to satisfy this higher demand.

Laurel: Well, speaking of automation, it can leverage existing technology and build efficiencies. So, what is the role of advanced technologies, like AI, machine learning and advanced analytics in the approach to your ongoing transformation?

Raimundo: So, today, everybody talks about AI, ML, and all these tools. But to be honest with you, I think your journey really starts a little bit earlier. When we go out and think about these advanced technologies, which obviously have their place, I think in the beginning, what you really need to focus on is your foundational [layer], and that is your data. So, you asked about the role of the cloud. For bp, all of the data used to reside in multiple different sites out there. What we have done is migrate all of that data to the cloud. And what the cloud also allows is doing transformations in place that help us really homogenize what I just described before. For North America versus east and west coast, you can create another column and say, okay, now call it whatever you want, United States, or however you want to label it.

So, all of this data transformation happens in a single spot. And that also allows our users who need this data to go to a single source of truth and not pull data from multiple systems. An example of the chaos that creates: somebody will be pulling invoice data from a spend system, somebody else will pull payment data, and then you already have data discrepancies in the reporting. Having a centralized place where everybody goes for the data removes so much complexity from the system.
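
The homogenization Martinez describes can be pictured as a single canonical mapping applied where the data lands, so every downstream report uses the same labels. The sketch below uses invented records and mirrors his North America versus east/west coast example; it is not bp’s actual transformation pipeline.

```python
import pandas as pd

# Records from two business units that label the same territory differently.
unit_a = pd.DataFrame({"territory": ["North America"], "spend": [1_200_000]})
unit_b = pd.DataFrame({"territory": ["East Coast", "West Coast"], "spend": [500_000, 650_000]})

combined = pd.concat([unit_a, unit_b], ignore_index=True)

# Canonical mapping applied once, in one place, so every report uses the same labels.
CANONICAL_REGION = {
    "North America": "United States",
    "East Coast": "United States",
    "West Coast": "United States",
}
combined["region"] = combined["territory"].map(CANONICAL_REGION)

print(combined.groupby("region")["spend"].sum())  # a single, consistent rollup
```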

Laurel: And speaking of that kind of complexity, it’s clear that multiple procurement systems made it difficult to maintain quality compliance as well as production tracking in bp’s supply chain. So, what are some of the most challenging aspects of realizing this new vision with a centralized, one-stop platform?

Raimundo: Yeah, we have a good list there. So, let me break it into technical and people, because I think people is something that we should talk about. On the technical side, one of the biggest things is working with your technical team to find the right architecture: how your vision fits into the architecture in a way that creates less, let’s say, confusion and complexity. The other side of the technical challenge is finding the right tools. I’ll give you an example from our project. Initially, I thought, okay, RPA [robotic process automation] will be the technology to do this. So, we ran an RPA pilot. And obviously, RPA has incredible applications out there. But at this point, RPA really wasn’t the tool for us, given the changes that could happen on the screens of the system we were using. So, then we decided, instead of going with RPA, to go with APIs.

So, that’s an example of the challenge of finding exactly the right tool. But to be honest with you, I think the biggest challenge is not technical, but human. Like I mentioned before, people are immersed in a sea of change, and here you come with yet another tool. Even if the tool you’re giving them might be a lot more efficient, people still want to cling to what they know. If they say, “Look, if I have to spend another two hours extracting data, putting it in Excel, collating and running a report…” some people may rather do that than go to a platform where all of that is done for them. So, I think change management is key in these transformations, to make sure that you’re able to sell, or make people understand, what the value of the tool is and overcome that challenge, which is the normal human aversion to change. Especially when people are already immersed in the sea of change that came with the reorganization.

Laurel: Yeah. People are hard, and tech can be easy. So, just to clarify, RPA is the robotic process automation in this context, correct?

Raimundo: Yeah, absolutely. Yeah. Sorry about the pretty layered… Yeah.

Laurel: No, no. There’s lots of acronyms going around.

So, conversely, we were just discussing the challenges; what are the positive outcomes from making this transformation? And could you give us an example or a use case of how the updated platform boosted efficiency across existing processes?

Raimundo: Absolutely. Just a quick generic point first. Generically, you find yourself a lot in this cycle with data: the users look at the application and say the data’s not correct, and they lose the appetite for using it. But the problem is that they own the data, and the process to change the data is so cumbersome that people don’t really want to take ownership of it, because they say, “Look, I have 20 things to do. The last thing on my list is updating that data.”

So, we were in this cycle of trying to put tools out for the users, the data is not correct, but we’re not the ones who own the data. The specific example of how we broke that cycle is using automation. Before we created the automation, if you needed to change any contract data, you had to find what the contract was, then go to a tool like Salesforce and create a case. That case went to our global business support team, and then they had to read the case, open the system of record, and make the change. And that could take days or weeks. Meanwhile, the user is thinking, “Well, I requested this thing, and it hasn’t even happened.”

So, what we did is leverage internal technology. We already had a large investment in Microsoft, as you can imagine. And we said, “Look, from Power BI, you can look at your contract and click on the record you want to change. A Power App comes up and asks what you want to do.” Say I want to change the contract owner, for example. It opens a window and asks, “Who’s the new person you want to put in?” And as soon as you submit it, literally within less than a second, the API goes to the system of record, changes the owner, and creates an email that notifies everybody who is a stakeholder in that contract, which then increases visibility into changes across the organization.

And at the same time, it leaves you an audit trail. So, if somebody wants to challenge that, you know exactly what happened. That has been an incredible outcome: reducing cycle time from days and weeks to mere seconds while, at the same time, increasing communication and visibility into the data. It has proved to be one of our greatest achievements.
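
The end-to-end flow Martinez walks through (update the system of record, notify stakeholders, and keep an audit trail, all from one request) might look something like the sketch below. The data structures, names, and addresses are hypothetical stand-ins; the real implementation sits behind Power BI, Power Apps, and bp’s internal APIs.

```python
from datetime import datetime, timezone

# Hypothetical in-memory stand-ins for the system of record and audit log.
contracts = {"C-1042": {"owner": "alice@example.com", "stakeholders": ["ops@example.com"]}}
audit_log = []

def change_contract_owner(contract_id: str, new_owner: str, requested_by: str) -> None:
    """Sketch of the automated flow: update the record, write an audit entry,
    and notify stakeholders, all in one call instead of a manual case."""
    contract = contracts[contract_id]
    old_owner = contract["owner"]
    contract["owner"] = new_owner

    audit_log.append({
        "contract": contract_id,
        "field": "owner",
        "from": old_owner,
        "to": new_owner,
        "by": requested_by,
        "at": datetime.now(timezone.utc).isoformat(),
    })

    # Stand-in for the notification email sent to every stakeholder.
    for recipient in contract["stakeholders"] + [old_owner, new_owner]:
        print(f"notify {recipient}: owner of {contract_id} changed to {new_owner}")

change_contract_owner("C-1042", "bob@example.com", requested_by="carol@example.com")
```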

Laurel: Well, I think you’ve really outlined this challenge. So, investing in supply chain visibility can be costly, but often bolsters trust and reputability among customers. What’s the role of transparency and visibility in a digital transformation journey of this size?

Raimundo: I keep talking about agile, and I think that’s one of its tenets. And to transparency and visibility, I would actually add honesty. It’s very, very easy to learn from success. Everybody wants to tout the great things that they have done, but people may be a little bit less inclined to speak out about their mistakes. I’ll just give you the example of our situation with RPA. We don’t feel bad about it. We feel that the more we share that knowledge with the technical teams, the more value it has, because then people will learn from it and not make the same mistake.

But I think what honesty also does for visibility is that when you bring your users into the development team, they have that visibility. They feel part of it. They’re more willing to give you feedback. And they’re also willing to give you a little bit more leeway. If you say that the tool, or some feature, is going to be delayed a month, for example, but you don’t give the reasons and they don’t have that transparency and visibility into what is driving that delay, people just lose faith in your tool.

I think the more open and visible you are, and also, again, the more honest, the more well received the product is, and everybody feels part of the tool. In every training, at the end, I just say, “By the way, this is not my tool. This is your tool. And the more engaged you are with us, the better outcome you’re going to have.” And that’s achieved through transparency and visibility.

Laurel: So, for other large organizations looking to create a centralized platform to improve supply chain visibility, what are some of the key best practices that you’ve found that leadership can adopt to achieve the smoothest transition?

Raimundo: So, I would think about three things. One thing leadership really, really needs to do is understand the project. And when I say understand the project, I mean really understanding the technical complexity and the human aspect of it, because that’s where leadership has a big role to play. They’re able to influence their teams on this project that you’re trying to… And then they really need to understand the risks associated with the project, and that this could be a very lengthy journey. Hopefully, obviously, there will be results and milestones along the way, but they need to feel comfortable with this agile mentality that we’re going to build features, fail, adapt, and they really need to be part of that journey.

The second most important thing, I think, is having the right team. And in that, I’ve been super fortunate. We have a great partnership with Infosys. I’ve got one of the engineers, named Sai. What the Infosys team and my technical team say is, “Look, do not shortchange yourself on the ideas that you bring from the business side.” A lot of times, we might think about something as impossible. They really encourage me to come up with almost crazy ideas: just come with everything that you can think about. And they’re really, really incredible at delivering all the resources to bring a solution to that. We almost end up using each other’s phrases. So, having a team that is really passionate about change, about being honest, about working together is the key to delivery. And finally, the data foundation. I think we get so stuck looking at the shiny tools out there that seem like science fiction, and they’ll be great, and we forget that the outcomes of those technologies are only as good as the data supporting them.

And data, a lot of times, is seen as, I don’t know, I don’t want to call it the ugly sister, the ugly person in the room. People are like, “Oh, I don’t want to deal with that. I just want to do AI.” Well, your AI is not going to give you what you want if it doesn’t understand where you’re at. So, the data foundation is key. Having the perfect team and technology partners, understanding the project length and the risks, and being really engaged are, for me, the key items there.

Laurel: That’s certainly helpful. So, looking ahead, what technologies or trends do you foresee will enable greater efficiencies across supply chain and operations?

Raimundo: Not to sound like a broken record, but I really think that technologies that look at our data and help us clean the data, foresee what issues we’re going to have with the data, and give us a data set that is really, really powerful, that is easy to use, and that reflects exactly our situation, are the key to the next step, which is all of these amazing technologies. If I think about our vision for the platform, it is to create a semi-autonomous supply chain. And the vision is: imagine having, again, first the right data, and now you have AI/ML and all these models that look at that internal data and compare it with external factors.

And what it does, instead of presenting us alerts, is go to the next level and basically present scenarios, and say, “Look, based on the data that I see in the market and what you have had in your history, these are the three things that can happen, these are the plans that the tool recommends, and this is how you interact with or affect that change.” So, moving the supply chain from a transactional item to a much more strategic item with the leverage of this technology, to me, is the ultimate vision for the supply chain.

Laurel: Well, Raimundo, thank you so much for joining us today on the Business Lab. This has been very enlightening.

Raimundo: Thank you. It’s been a pleasure. And I wish everybody a great journey out there. It’s definitely a very exciting moment right now.

Laurel: Thank you.

That was Raimundo Martinez, global digital solutions manager of procurement and supply chain at bp, whom I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review.

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the global director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.