Shaping the future with adaptive production

Adaptive production is more than a technological upgrade: it is a paradigm shift. This new frontier enables the integration of cutting-edge technologies to create an increasingly autonomous environment, where interconnected manufacturing plants go beyond the limits of traditional automation. Artificial intelligence, digital twins, and robotics are among the powerful tools manufacturers are using to create dynamic, intelligent systems that not only perform tasks, but also learn, make decisions, and evolve in real-time.

Taking this kind of adaptive approach can transform a manufacturer’s productivity, efficiency, and innovation. But beyond the factory, it also has the potential to deliver society-wide benefits, by bolstering economic growth locally, creating more attractive and accessible employment opportunities, and supporting a sustainability agenda.

As efforts to revive and modernize local manufacturing accelerate in regions around the world, including North America and Europe, adaptive production could help manufacturers overcome some of their biggest obstacles—firstly, attracting and retaining talent. Nearly 60% of manufacturers cited this as their top challenge in a 2024 US-based survey. Highly automated, technology-led adaptive production methods hold new promise for attracting talent to roles that are safer, less repetitive, and better paid. “The ideal scenario is one where AI enhances human capabilities, leads to new task creation, and empowers the people who are most at risk from automation’s impact on certain jobs, particularly those without college degrees,” says Simon Johnson, co-director of MIT’s Shaping the Future of Work Initiative.

Secondly, the digitalization of manufacturing—embedded in the very foundation of adaptive production technologies—allows companies to better address complex sustainability challenges through process and resource optimization and a better understanding of data. “By integrating these advanced technologies, we gain a more comprehensive picture across the entire production process and product lifecycle,” explains Jelena Mitic, head of technology for the Future of Automation at Siemens. “This will provide a much faster and more efficient way to optimize operations and ensure that all the necessary safety and sustainability requirements are met during quality control.”

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

Finding value with AI automation

In June 2023, technology leaders and IT services executives had a lightning bolt headed their way when McKinsey published its report “The economic potential of generative AI: The next productivity frontier.” It echoed a moment from the 2010s, when Amazon Web Services launched an advertising campaign aimed at Main Street’s C-suite: Why would any fiscally responsible exec allow their IT teams to spend capex on servers and software when AWS cost only 10 cents per virtual machine?

Vendors understood that reports like these, along with aggressive advertising about competitive risks in a given sector, would drive calls from boards to their C-suites, and from C-suites to their staffs, all asking, “What are we doing with AI?” Told to “do something with AI,” technical leaders and their organizations responded promptly, sometimes begrudgingly and sometimes excitedly, to work-sanctioned opportunities to get their hands on a new technology. At that point, there was no time to separate use cases with actual business returns from “AI novelty” use cases that were more Rube Goldberg machines than tangible breakthroughs.

Today’s opportunity: Significant automation gains 

When leaders respond to immediate panic, new business risks, and the need to mitigate them, often emerge. Two recent examples highlight the consequences of rushing to implement AI and publish positive results. The Wall Street Journal reported in April 2025 on companies struggling to realize returns on AI. Just weeks later, it covered MIT’s retraction of a technical paper about AI whose published results could not be substantiated.

While these reports demonstrate the pitfalls of over-reliance on AI without common-sense guardrails, not all is off track in the land of enterprise AI adoption. Incredible results are being found from judicious use of AI and related technologies to automate processes across industries. Now that we are through the “fear of missing out” stage and can get down to business, where are the best places to look for value when applying AI to the automation of your business?

While chatbots are almost as pervasive as new app downloads for mobile phones, the AI applications that deliver real automation and productivity gains align with the specific purpose and architecture of the underlying AI system they are built on. The dominant patterns where AI gains are realized currently boil down to two things: language (translation and pattern recognition) and data (creating new formats and searching existing data).

Example one: Natural language processing  

Manufacturing automation challenge: Failure Mode and Effects Analysis (FMEA) is both critical and labor-intensive. It is not always performed before manufacturing equipment fails, so FMEA very often happens in a stressful, line-down scenario. In Intel’s case, a global footprint of manufacturing facilities separated by large distances, time zones, and preferred languages makes finding the root cause of a problem even harder. Weeks of engineering effort are spent on each FMEA, repeated across large fleets of tools spread among these facilities.

Solution: Leverage already-deployed CPU compute servers for natural language processing (NLP) across the manufacturing tool logs, where observations about the tools’ operations are maintained by local manufacturing technicians. The analysis included sentiment analysis to classify words as positive, negative, or neutral. The new system performed FMEA on six months of data in under one minute, saving weeks of engineering time and allowing the manufacturing line to service equipment on a pre-emptive schedule rather than incurring unexpected downtime.
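Intel’s production system is not public, but the log-mining pattern is straightforward to sketch. The minimal example below uses NLTK’s VADER sentiment analyzer over technician log lines; the sample entries and the ±0.05 thresholds are illustrative assumptions, not details from Intel’s deployment.

```python
# Sentiment-tagging technician tool logs as a first pass at FMEA triage.
# Log lines and thresholds are illustrative; this is not Intel's system.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

def classify_log_lines(lines):
    """Tag each log entry as positive, negative, or neutral."""
    tagged = []
    for line in lines:
        score = sia.polarity_scores(line)["compound"]  # -1.0 (negative) to +1.0 (positive)
        label = "negative" if score <= -0.05 else "positive" if score >= 0.05 else "neutral"
        tagged.append((label, score, line))
    return tagged

# Entries tagged negative become candidate failure-mode signals for engineering review.
sample = [
    "Chamber pressure stable after preventive maintenance.",
    "Repeated arcing faults on electrode 3; operator aborted the run.",
]
for label, score, line in classify_log_lines(sample):
    print(f"{label:8s} {score:+.2f}  {line}")
```

In practice, the negative-leaning entries would feed an FMEA ranking alongside tool telemetry rather than replace engineering judgment.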

Financial institution challenge: Programming languages commonly used by software engineers have evolved. Mature bellwether institutions were often formed through a series of mergers and acquisitions over the years, and they continue to rely on critical systems that are based on 30-year-old programming languages that current-day software engineers are not familiar with. 

Solution: Use NLP to translate between the old and new programming languages, giving software engineers a needed boost to improve the serviceability of critical operational systems. Use the power of AI rather than doing a risky rewrite or massive upgrade. 
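The article does not name a specific toolchain, but the translation pattern can be sketched with a placeholder model call. Here `call_llm` is a hypothetical stand-in for whatever approved model endpoint an institution uses, and the COBOL fragment and prompt wording are invented for illustration.

```python
# Sketch of NLP-assisted legacy code translation. call_llm() is a hypothetical
# placeholder for an approved model endpoint; nothing here is a vendor API.
LEGACY_SNIPPET = """
       IDENTIFICATION DIVISION.
       PROGRAM-ID. DAILY-INTEREST.
       PROCEDURE DIVISION.
           COMPUTE WS-INTEREST = WS-BALANCE * WS-RATE / 365.
"""

PROMPT_TEMPLATE = (
    "Translate the following COBOL routine into idiomatic Java. "
    "Preserve the business logic exactly and add comments mapping each "
    "Java method back to the original COBOL paragraph:\n\n{code}"
)

def call_llm(prompt: str) -> str:
    """Placeholder: route the prompt to your organization's approved LLM service."""
    raise NotImplementedError("Wire this to an internal or vendor model endpoint.")

def translate_legacy(code: str) -> str:
    draft = call_llm(PROMPT_TEMPLATE.format(code=code))
    # The draft is a starting point: engineers still review, test, and
    # regression-check it against the legacy system before anything ships.
    return draft
```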

Example two: Company product specifications and generative AI models 

Sales automation challenge: The time it takes to reformat a company’s product data into a specific customer’s RFP format has been an ongoing challenge across industries. Teams of sales and technical leads spend weeks across different accounts reformatting the same root data into each customer’s preferred PowerPoint or Word format. Customer response times are measured in weeks, especially if the RFPs require legal review.

Solution: By using generative AI combined with a data extraction and prompting technique called retrieval-augmented generation (RAG), companies can rapidly reformat product information between different customer-required RFP response formats. The time spent moving data between documents and document types, only to find an unforced error in the move, drops from weeks to hours.
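As a rough illustration of the RAG pattern described above, the sketch below retrieves the product passages most relevant to each RFP question and hands them to a generative step. TF-IDF retrieval stands in for a production vector store, `generate_answer` is a hypothetical placeholder for the model call, and the product snippets are invented.

```python
# Minimal RAG sketch for drafting RFP responses from product documentation.
# TF-IDF stands in for a vector store; generate_answer() is a placeholder.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

PRODUCT_DOCS = [
    "Model X200 supports 48-port 10GbE with redundant hot-swappable power supplies.",
    "Firmware updates are signed and can be rolled back within a 30-day window.",
    "Standard warranty covers parts and labor for three years from date of shipment.",
]

vectorizer = TfidfVectorizer().fit(PRODUCT_DOCS)
doc_matrix = vectorizer.transform(PRODUCT_DOCS)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k product passages most relevant to an RFP question."""
    scores = cosine_similarity(vectorizer.transform([question]), doc_matrix)[0]
    return [PRODUCT_DOCS[i] for i in scores.argsort()[::-1][:k]]

def generate_answer(question: str, passages: list[str]) -> str:
    """Placeholder generative step: the prompt combines the question and retrieved context."""
    context = "\n".join(passages)
    return f"[Draft answer to '{question}' grounded in:\n{context}]"

rfp_question = "What warranty terms apply to the proposed hardware?"
print(generate_answer(rfp_question, retrieve(rfp_question)))
```

Formatting the drafted answers into a customer’s PowerPoint or Word template then becomes a mechanical step rather than weeks of copy-and-paste.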

HR policy automation challenge: Navigating internal processes can be time-consuming and confusing for both HR and employees. The consequences of misinterpretation, access outages, or exposure of personal or private data are enormous for the company and the individual alike.

Solution: Combine generative AI, RAG, and an interactive chatbot that uses employee-assigned assets to determine identity and access rights, giving employees an interactive, query-based chat interface that answers their questions in real time.
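What distinguishes the HR case from the sales case is the access-control layer. The sketch below shows one way that layer might sit in front of retrieval: identity is resolved from a company-issued asset, and only documents the employee is entitled to see ever reach the model. The asset directory, role tags, and policy snippets are illustrative assumptions, not a description of any specific deployment.

```python
# Access-control gate for an HR policy chatbot: filter documents by entitlement
# *before* retrieval or generation. Directory, roles, and policies are invented.
POLICY_DOCS = [
    {"text": "All employees accrue 20 days of paid leave per year.", "audience": "all"},
    {"text": "Managers must approve overtime requests within 48 hours.", "audience": "manager"},
]

ASSET_DIRECTORY = {  # maps a company-issued device to an identity and role
    "LAPTOP-4821": {"employee_id": "E1007", "role": "manager"},
}

def authorized_corpus(asset_id: str) -> list[str]:
    """Resolve identity from the assigned asset and return only permitted documents."""
    profile = ASSET_DIRECTORY.get(asset_id)
    if profile is None:
        return []  # unknown device: expose nothing
    return [doc["text"] for doc in POLICY_DOCS
            if doc["audience"] in ("all", profile["role"])]

# A RAG pipeline like the previous example then runs over this pre-filtered
# corpus, so role-restricted or private policies never reach the model.
print(authorized_corpus("LAPTOP-4821"))
```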

Finding your best use cases for AI 

In a world where 80% to 90% of all AI proofs of concept fail to scale, now is the time to develop a framework grounded in caution. Consider starting with a data strategy and governance assessment. Then compare notes with peer companies on their successful AI-based automation efforts. Clear, rules-based policies and processes offer the best opportunities to begin a successful AI automation journey in your enterprise. Where you encounter disparate data sources (e.g., unstructured data, video, structured databases) or unclear processes, maintain tighter human-in-the-loop decision controls to avoid unexpected data or token exposure and cost overruns.

As the AI hype cycle cools and business pressure mounts, now is the time to become practical. Apply AI to well-defined use cases and begin unlocking the automation benefits that will matter not just in 2025, but for years to come.

This content was produced by Intel. It was not written by MIT Technology Review’s editorial staff.

Battling next-gen financial fraud 

From a cluster of call centers in Canada, a criminal network defrauded elderly victims in the US out of $21 million in total between 2021 and 2024. The fraudsters used voice over internet protocol technology to dupe victims into believing the calls came from their grandchildren in the US, customizing conversations using banks of personal data, including ages, addresses, and the estimated incomes of their victims. 

The proliferation of large language models (LLMs) has also made it possible to clone a voice with nothing more than an hour of YouTube footage and an $11 subscription. And fraudsters are using such tools to create increasingly sophisticated attacks that deceive victims with alarming success. But phone scams are just one way that bad actors are weaponizing technology to refine and scale attacks.

Synthetic identity fraud now costs banks $6 billion a year, making it the fastest-growing financial crime in the US. Criminals exploit personal data breaches to fabricate “Frankenstein IDs.” Cheap credential-stuffing software can be used to test thousands of stolen credentials across multiple platforms in a matter of minutes. And text-to-speech tools powered by AI can bypass voice authentication systems with ease.

“Technology is both catalyzing and transformative,” says John Pitts, head of industry relations and digital trust at Plaid. “Catalyzing in that it has accelerated and made more intense longstanding types of fraud. And transformative in that it has created windows for new, scaled-up types of fraud.” 

Fraudsters can use AI tools to multiply many times over the number of attack vectors—the entry points or pathways that attackers can use to infiltrate a network or system. In advance-fee scams, for instance, where fraudsters pose as benefactors gifting large sums in exchange for an upfront fee, scammers can use AI to identify victims at a far greater rate and at a much lower cost than ever before. They can then use AI tools to carry out tens of thousands, if not millions, of simultaneous digital conversations. 

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

Building an innovation ecosystem for the next century

Michigan may be best known as the birthplace of the American auto industry, but its innovation legacy runs far deeper, and its future is poised to be even broader. From creating the world’s largest airplane factory during World War II at Willow Run to establishing the first successful polio vaccine trials in Ann Arbor to the invention of the snowboard in Muskegon, Michigan has a long history of turning innovation into lasting impact.

Now, with the creation of a new role, chief innovation ecosystem officer, at the Michigan Economic Development Corporation (MEDC), the state is doubling down on its ambition to become a modern engine of innovation, one that is both rooted in its industrial past and designed for the evolving demands of the 21st century economy.  

“How do you knit together risk capital, founders, businesses, universities, and state government, all of the key stakeholders that need to be at the table together to build a more effective innovation ecosystem?” asks Ben Marchionna, the first to hold this groundbreaking new position.

Leaning on his background in hard tech startups and national security, Marchionna aims to bring a “builder’s thinking” to the state government. “I’m sort of wired for that—rapid prototyping, iterating, scaling, and driving that muscle into the state government ecosystem,” he explains.

But these efforts aren’t about creating a copycat Silicon Valley. Michigan’s approach is uniquely its own. “We want to develop the thing that makes the most sense for the ingredients that Michigan can bring to bear to this challenge,” says Marchionna. 

This includes cultivating both mom-and-pop businesses and tech unicorns, while tapping into the state’s talent, research, and manufacturing DNA. 

In an era where economic development often feels siloed, partisan, and reactive, Michigan is experimenting with a model centered on long-term value and community-oriented innovation. “You can lead by example in a lot of these ways, and that flywheel really can get going in a beautiful way when you step out of the prescriptive innovation culture mindset,” says Marchionna.

This episode of Business Lab is produced in partnership with the Michigan Economic Development Corporation.

Full Transcript 

Megan Tatum: From MIT Technology Review. I’m Megan Tatum, and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace. 

Today’s episode is brought to you in partnership with the Michigan Economic Development Corporation. 

Our topic today is building a statewide innovation economy. Now, the U.S. state of Michigan has long been recognized as a leader in vehicle and mobility innovation. Detroit put it on the map, but did you know it’s also the birthplace of the snowboard? Or that the University of Michigan filed more than 600 invention disclosures in 2024, second only to the Massachusetts Institute of Technology? Or that in the past five years, 40% of the largest global IPOs have been Michigan-built companies?

Two words for you: innovation ecosystem. 

My guest is Ben Marchionna, chief innovation ecosystem officer at the Michigan Economic Development Corporation, the MEDC. 

Ben, thank you ever so much for joining us.

Ben Marchionna: Thanks, Megan. Really pleased to be here.

Megan: Fantastic. And just to set some context to get us started, I wondered if we could take a kind of high-level look at the economic development landscape. I mean, you joined the MEDC team last year as Michigan’s first chief innovation ecosystem officer. In fact, you were the first to hold such a role in the country, I believe. I wondered if you could talk a bit about your unique mission and how this economic development approach differs from efforts in other states.

Ben: Yeah, sure, would love to. Probably worth pointing out that while I’ve been in this role for about a year now, it was indeed a first-of-its-kind role in the state of Michigan and first of its kind in the country. The terminology, chief innovation ecosystem officer, differs a little bit from what folks might think of as a chief innovation officer. I’m not all that focused on driving innovation within government, which is what some other chief innovation officers around the country would be focused on. Instead, you can think of my role as Michigan’s chief architect for innovation, if you will. So, how do you knit together risk capital, founders, businesses, universities, and state government, all of the key stakeholders that need to be at the table together to build a more effective innovation ecosystem? I talk a lot about building connective tissues that can achieve one-plus-one-equals-three outcomes.

Michigan’s got all kinds of really interesting ingredients and has the foundation to take advantage of the moment in a really interesting way over the next decades as we look to supercharge some of the growth of our innovation ecosystem development.

My charter is relatively simple. It’s to help make sure that Michigan wins in a now hyper-competitive global economy. And to do that, I end up being super focused on orienting us towards a growth- and innovation-driven economy. That can mean a lot of different things, but I ultimately came to the MEDC and the role within the state with a builder’s mindset. My background is not in traditional economic development; it’s not in government at all. I spent the last 10 years building hard tech startups, one in Ann Arbor, Michigan, and another one in the Northern Virginia area. Before that, I spent a number of years at, think of it like, an innovation factory at Lockheed Martin Skunk Works in the Mojave Desert, working on national security projects.

I’m sort of wired for that, builder’s thinking, rapid prototyping, iterating, scaling, and driving that muscle into the state government ecosystem. I think it’s important that the government also figure out how to pull out all the stops and be able to move at the speed that founders expect. A bias towards action, if you will. And so this is ultimately what my mission is. There are a lot of real interesting things that the state of Michigan can bring to bear to building our innovation ecosystem. And I think, tackling it with this sort of a mindset, I am absolutely optimistic for the future that we’ve got ahead of us.

Megan: Fantastic. It almost sounds like your role is sort of building a statewide startup incubator of sorts. As we mentioned in the opening, Michigan actually has a really interesting innovation history even in addition to the advances in the automotive industry. I wondered if you could talk a bit more about that history and why Michigan, in particular, is poised to support that sort of statewide startup ecosystem.

Ben: Yeah, absolutely. And I would even broaden it. Building the startup ecosystem is one of the essential layers, but to be able to successfully do that, we have to bring in the research universities, we have to bring in the corporate innovation ecosystem, we have to bring in the risk capital, et cetera. So yes, absolutely, startups are important. And equally as important are all of these other elements that are necessary for a startup ecosystem to thrive, but are also the levers that are just sitting there waiting for us to pull them.

And we can get into some of the details over the course of our chat today on the auto industry and how this fits into it, but Michigan does a lot more than just automotive stuff. And you noted, I think, the snowboard as an example in the intro. Absolutely correct. We have a reputation as Motor City, but Michigan’s innovation record is a lot weirder, in a fun way, and richer than just cars.

Early 20th century, mostly industrial moonshot innovation. So the first paved mile of concrete was in Detroit in 1909. A few years later, this is when the auto sector started to really come about with Henry Ford’s moving assembly line. Everyone tends to know about those details. But during World War II, Willow Run Airport, sort of smack between Detroit and Ann Arbor, Michigan, had the biggest airplane factory in the world. They were cranking out B-24 bombers once every 63 minutes, and I’ve actually been to the office that Henry Ford and Charles Lindbergh shared. It’s still at the airport. And it was pretty cool because Henry Ford had a window built into the office that looked sort of around the corner so that he could tick off airplanes as they rolled out of the hangar and make sure they were following the same high-rate production mentality that the auto sector had developed over the decades prior.

And so they came in to help make sure that you could leverage that industrial sector to drive very rapid production, the at-scale mentality, which is also a really important part of the notion of re-industrialization that is taking hold across the country now. Happy to get into that a bit, but yeah, Willow Run, I don’t think most folks realize that that was the biggest airplane factory in the world sitting right here in Michigan.

And all of this provided the mass production DNA that was able to help build the statewide supplier base. And today, yes, we use that for automotive, EVs, space hardware, batteries, you name it. But this is the foundation, I think, that we’ve got to be able to build on in the future. In the few decades since, you saw innovations in sports, space, advanced materials; that’s like the sixties to the eighties. You said the snowboard. That was invented in Muskegon, on the west side of the state, in 1965.

Dow Chemical’s here in a really big way. They pioneered silicone and advanced plastics in Michigan. The University of Michigan’s Dr. Thomas Francis led the world’s first successful polio vaccine trials, pioneered out of Ann Arbor. And there’s that Big Ten research horsepower that we’ve got in the state, between the University of Michigan and Michigan State University. We also have Wayne State University in Detroit, which is a powerhouse. And then Michigan Tech University in the Upper Peninsula just recently became an R1 research institution, which essentially means it joins those top-tier research powerhouses. That culture of tinkering matters a lot today.

I think in more recent history, you saw design and digital innovations emerge. I don’t think a lot of people appreciate that Herman Miller and Steelcase reinvented office ergonomics on the west side of the state, or that Stryker is based in Kalamazoo. They became a global medical device powerhouse over the last couple of decades, too. Michigan’s first unicorn, Duo Security, which does two-factor authentication among many other things, was sold to Cisco in 2018 for $2.35 billion.

Like I said, that was the first unicorn; in the few years since, we’ve had another 10 unicorns. And I think what would probably be surprising to a lot of people is that they’re in sectors well beyond mobility: marketplaces like StockX, FinTech, logistics, cybersecurity, of course. It’s a little bit of everything, and I think that goes to show that some of the fabric that exists within Michigan is a lot richer than what people think of as Motor City. We can scale software, we can scale life sciences innovation. It’s not just metal bending, and I talked about re-industrialization earlier. So I think about where we are today: there’s a hard tech renaissance and a broad portfolio of other high-growth sectors that Michigan’s poised to do really well in, leveraging all of that industrial base that has been around for the last century. I’m just super excited about the future and where we can take things from here.

Megan: I mean, genuinely, a really rich and diverse history of innovation that you’ve described there.

Ben: That’s right.

Megan: And last year, when Michigan’s Governor Whitmer announced this new initiative and your position, she noted the need to foster this sort of culture of innovation. And we hear that term a lot in the context of company cultures. It’s interesting to hear it in the context of a U.S. state’s economy. I wonder what your strategy is for building out this ecosystem, and how do you foster a state’s innovation culture?

Ben: Yeah, it’s an awesome point, and I think I mentioned earlier that I came into the role with this builder’s mentality. For me, this is how I am wired to think. This is how a lot of the companies and other founders I’ve spent a lot of time with think. And so bringing this to the state government, I think of Blue Origin, Jeff Bezos’ space company; their motto, the English translation of it at least, is “Step by Step, Ferociously.” And I think about that a lot as a proxy for how I do that within the state government. There’s a lot of iterative work that needs to happen, a lot of coaching and storytelling to help folks understand how to think with that builder’s mindset. The wonderful news is that when you start having that conversation, even in these complicated political times, this is a pretty bipartisan thing, right?

The notion of how to build small businesses that create thriving main street communities while also supporting high-growth, high-tech startups that can drive prosperity for all, and population growth, while also being able to cover corporate innovation and technology transfer out of universities. All of these things touch every corner of the state.

And Michigan’s a surprisingly large and very geographically diverse state. Most of the things that we tend to be known for outside the state are in a pretty small corner of Southeast Michigan. That’s the Motor City part, but we do a lot, and we have a lot of really interesting hubs for innovation and hubs for entrepreneurship, like I said, from the small mom-and-pop manufacturing shop or clothing business all the way through to these insane life sciences innovations being spun out of the university. Being able to drive this culture of innovation ends up being applicable really across the board, and it just gets people really fired up when you start talking about this, fired up in a good way, which is, I think, what’s really fantastic.

There’s this notion of accelerating the talent flywheel and making sure that the state can invest in the cultivation of really rich communities and connections, and this founder culture. That stuff happens organically, generally, and when you talk about building startup ecosystems, it’s not like the state shows up and says, “Now you’re going to be more innovative and that works.” That is not the case.

And so to be able to develop those things, it’s much more about this notion of ecosystem building and getting the ingredients and puzzle pieces in the right place, applying a little bit of funding here and there, or loosening a restriction here or there, and then letting the founders do what they do best, which is build. And so this is what I think I end up being super passionate about within the state. You can lead by example in a lot of these ways, and that flywheel that I mentioned really can get going in a beautiful way when you step out of the prescriptive innovation culture mindset.

Megan: And given that role, I wonder what milestones the campaign has experienced in your first year? Could you share some highlights and some developing projects that you’re really excited about?

Ben: We had a recent one that I think was pretty tremendous. Just a couple of months ago, Governor Whitmer signed into law bipartisan legislation called the Michigan Innovation Fund. This was a multi-year effort that resulted in the state’s biggest investment in innovation ecosystem development in over two decades. A lot of this funding is going to early-stage venture capital firms that will be able to support the broad seeding of new companies and ideas, keep talent from some of those top-tier research institutions within the state, bring in really high-quality early-stage and growth-stage companies from out of state, and then develop or supercharge some of that innovation ecosystem fabric that ties those things together. So that connective tissue that I talked about; and that was an incredible win to launch the year with.

This was just back in January, and now we’re working to get some of those funds out over the course of the next month or two so we can put them to use. What was really interesting about that was that it wasn’t just a top-down thing. This was supported at the top, all the way up to and including Governor Whitmer. I mentioned bipartisan support within Michigan’s legislature, and then bottom-up support from all of the ecosystem partners, the founders, the investors advocating as a whole bloc, which I think is really powerful. Rather than trying to go for one-off things, this huge coalition of the willing got together organically and advocated: hey, this is why this is such a great moment. This is the time to invest. And Governor Whitmer and the legislators heard that call, and we got something done, and so that happened relatively quickly. Like I said, it’s the biggest investment in the last two decades, and I think we’re poised to have some really great successes in the coming year as well.

Another really interesting one that I haven’t seen other states do yet: around a year ago, Governor Whitmer signed an executive order called Infrastructure for Innovation. Essentially, what it does is open up state department and agency assets to startups in the name of moving the ball forward on innovation projects. And so if you’re a startup and you need access to something very hard-to-find and very expensive, maybe a test facility, you can use something that the state has, and all of the processes to get that done are streamlined so that you’re not beating your head against a wall. Similarly with universities and even federal labs and corporate resources: while an executive order can’t compel those folks to participate, we’ve been finding tremendous buy-in from stakeholders who want to volunteer access to their resources.

That does a lot of really good things, certainly for the founders, that provides them the launchpad that they need. But for those corporations and universities, and whatnot, a lot of them have these very expensive assets sitting around wildly underutilized, and they would be happy to have people come in and use them. That also gives them exposure to some of the bleeding-edge technology that a lot of these startups today are developing. I thought that was a really cool example of state government leadership using some of the tools that are available to a governor to get things moving. We’ve had a lot of early wins with startups here that have been able to leverage what that executive order was able to do for them.

Since we’re talking with MIT Technology Review, to tie in an MIT piece here: we also started a Team Michigan for MIT’s REAP program. It’s the Regional Entrepreneurship Acceleration Program, and it’s one of the global thought leaders on best practices for innovation ecosystem development. And so we’ve got a cohort of about a dozen key leaders from across all of those different stakeholders who need to have a seat at the table for this ecosystem development.

We go out to Cambridge twice a year for a multi-day workshop, and we get to talk about what we’ve learned as best practices, and then also learn from other cohorts from around the world on what they’ve done that is great. And then also get to hear some of the academic best practices that the MIT faculty have discovered as part of this area of expertise. And so that’s been a very interesting way for us to be able to connect outside of the state government boundaries, if you will. You sort of get out there and see where the leading edge is and then come back and be able to talk about the things that we learned from all of these other global cohorts. So always important to be focused on best practices when you’re trying to do new things, especially in government.

Megan: Sounds like there are some really fantastic initiatives going on. It sounds like a very busy first year.

Ben: It’s been a very busy first year; couldn’t be more thrilled about it.

Megan: Fantastic. And in early 2023, I know that Newlab partnered with Michigan Central to establish a startup incubator too, which brought in more than a hundred startups just in its first 14 months. I wonder if you could talk a bit about how the incubator fits in with the statewide startup ecosystem and the importance of partnerships, too, for innovation.

Ben: Yeah, a key element, and I think the partnerships piece is essential here. Newlab is one of the larger components of the Southeast Michigan, and especially the Detroit, innovation ecosystem development. They will hit their two-year launch anniversary in just a couple of weeks here, I think. By mid-May it will be two years, and in that time they’ve now got 140-plus startups all working out of their space. Newlab is actually headquartered in Brooklyn, New York, but they run this big startup accelerator and incubator out of Detroit as well, so this is sort of their second flagship location. They’ve been a phenomenal partner, and so, speaking of the partnerships, what do those do?

They de-risk the technologies to help enable broader adoption. Corporations can provide early revenues, the state can provide non-dilutive grant matching, and universities can bring IP and a renewable source of talent. Being able to stitch together all of those pieces can create some really interesting unlocks for startups to grow, but also for this broader entrepreneurship and innovation ecosystem to really be able to thrive.

Newlab has been thrilled with their partnership in Southeast Michigan, and I think it’s a model that can be tailored across the state so that, depending on what assets are available in your backyard, you can make sure that you can best harness those for future growth.

Megan: Fantastic. What’s the long-term vision for the state’s innovation landscape when you think about it in five, 10 years from now? What do you envisage?

Ben: Amazing question. This is probably what I get most excited about. I think earlier we talked about the Willow Run B-24 bomber plant. That is what made Michigan known as the arsenal of democracy back in the day. I want Michigan to be the arsenal of innovation. We’re not trying to recreate a Silicon Valley. Silicon Valley does certain things, not trying to recreate what El Segundo wants to do in hard tech or New York City in FinTech, and all of these other things. We want to develop the thing that makes the most sense for the ingredients that Michigan can bring to bear to this challenge.

I think that becoming the Midwest arsenal of innovation is something that Michigan is very well poised to use as a springboard for the decades to come. I want us to be the default launch pad for building a hard tech company, a life sciences company, an agricultural tech company. You name it. If you’ve got a design prototype and want to mass-produce something, don’t want to hop coasts, and want to be somewhere that has a tremendous quality of life, an affordable place, somewhere where government is at the table and willing to move fast, this is the place to do that.

That can be difficult to do in some of the more established ecosystems, especially post-covid, as a lot of them are going through really big transition periods. Michigan’s already a top 10 state for business. In the next 10 years, I want us to be a top 10 state for employment, for household median income, for post-secondary education attainment, and for net talent migration. Those are my four top tens that I want to see in the next 10 years. And we covered a lot of topics today, but I think those are the reasons that I am super optimistic about being able to accomplish those.

Megan: Fantastic. Well, I’m tempted to move to Michigan, so I’m sure plenty of other people will be now, too. Thank you so much, Ben. That was really fascinating.

Ben: Thanks, Megan. Really delighted to be here.

Megan: That was Ben Marchionna, chief innovation ecosystem officer at the Michigan Economic Development Corporation, whom I spoke with from Brighton, England. 

That’s it for this episode of Business Lab. I’m your host, Megan Tatum. I’m a contributing editor and host for Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts, and if you enjoy this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks ever so much for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

Producing tangible business benefits from modern iPaaS solutions

When a historic UK-based retailer set out to modernize its IT environment, it was wrestling with systems that had grown organically for more than 175 years. Prior digital transformation efforts had resulted in a patchwork of hundreds of integration flows spanning cloud, on-premises systems, and third-party vendors, all communicating across multiple protocols. 

The company needed a way to bridge the invisible seams stitching together decades of technology decisions. So, rather than layering on yet another patch, it opted for a more cohesive approach: an integration platform as a service (iPaaS) solution, i.e. a cloud-based ecosystem that enables smooth connections across applications and data sources. By going this route, the company reduced the total cost of ownership of its integration landscape by 40%.

The scenario illustrates the power of iPaaS in action. For many enterprises, iPaaS turns what was once a costly, complex undertaking into a streamlined, strategic advantage. According to Forrester research commissioned by SAP, businesses modernizing with iPaaS solutions can see a 345% return on investment over three years, with a payback period of less than six months.

Agile integration for an AI-first world

In 2025, the business need for flexible and friction-free integration has new urgency. When core business systems can’t communicate easily, the impacts ripple across the organization: Customer support teams can’t access real-time order statuses, finance teams struggle to consolidate data for monthly closes, and marketers lack reliable insights to personalize campaigns or effectively measure ROI.

A lack of high-quality data access is particularly problematic in the AI era, which depends on current, consistent, and connected data flows to fuel everything from predictive analytics to bespoke AI copilots. To unleash the full potential of AI, enterprises must first solve for any bottlenecks that prevent information from flowing freely across their systems. They must also ensure data pipelines are reliable and well-governed; when AI models are trained on inconsistent or outdated data, the insights they generate can be misleading or incomplete—which can undermine everything from customer recommendations to financial forecasting.

iPaaS platforms are often well-suited for accomplishing this across dynamic, distributed environments. Built as cloud-native, microservices-based integration hubs, modern iPaaS platforms can scale rapidly, adapt to changing workloads, and support hybrid architectures without adding complexity. They also help simplify the user experience for everyday business users via low-code functionalities that allow both technical and non-technical employees to build workflows with simple drag-and-drop or click-to-configure interfaces.

This self-service model has practical, real-world applications across business functions: For instance, customer service agents can connect support ticketing systems with real-time inventory or shipping data, finance departments can link payment processors to accounting software, and marketing teams can sync CRM data with campaign platforms to trigger personalized outreach—all without waiting for IT to come to the rescue.

Architectural foundations for fast, flexible integration

Several key architectural elements make the agility associated with iPaaS solutions possible:

  1. API-first design that treats every connection as a reusable service
  2. Event-driven capabilities that enable real-time responsiveness
  3. Modular components that can be mixed and matched to address specific business scenarios

These principles are central to making the transition from “spaghetti architecture” to “integration fabric”—a shift from brittle point-to-point connections to intelligent, policy-driven connectivity that spans multidimensional IT environments.
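As a rough, vendor-neutral illustration of how those principles combine, the sketch below wires reusable connectors to business events through a tiny in-process event bus. The event names, payloads, and downstream systems are invented for illustration and do not represent any particular iPaaS product’s API.

```python
# Miniature "integration fabric": reusable connectors subscribe to business
# events instead of being hard-wired point to point. All names are illustrative.
from collections import defaultdict
from typing import Callable

subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

def on(event_type: str):
    """Register a reusable connector for a given business event."""
    def register(handler: Callable[[dict], None]):
        subscribers[event_type].append(handler)
        return handler
    return register

def publish(event_type: str, payload: dict) -> None:
    """Fan an event out to every connector subscribed to it."""
    for handler in subscribers[event_type]:
        handler(payload)

@on("order.created")
def sync_to_erp(order: dict) -> None:
    print(f"ERP connector: booking order {order['id']}")

@on("order.created")
def notify_crm(order: dict) -> None:
    print(f"CRM connector: updating account {order['account']}")

# Adding a new downstream system later means registering one more handler;
# existing integrations are untouched.
publish("order.created", {"id": "SO-1042", "account": "ACME"})
```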

This approach means that when a company wants to add a new application, onboard a new partner, or create a new customer experience, it can do so by tapping into existing integration assets rather than starting from scratch, which can lead to dramatically faster deployment cycles. It also helps enforce consistency and, in some cases, security and compliance across environments (role-based access controls and built-in monitoring capabilities, for example, can allow organizations to apply standards more uniformly).

Further, studies suggest that iPaaS solutions enable companies to unlock new revenue streams by integrating previously siloed data and processes. Forrester research found that organizations adopting iPaaS solutions stand to generate nearly $1 million in incremental profit over three years by creating new digital services, improving customer experiences, and automating revenue-generating processes that were previously manual.

Where iPaaS is headed: convergence and intelligence

All this momentum is perhaps one of the reasons why the global iPaaS market, valued at approximately $12.9 billion in 2024, is projected to reach more than $78 billion by 2032—with growth rates exceeding 25% annually.

This trajectory is contingent on two ongoing trends: the convergence of integration capabilities into broader application development platforms, and the infusion of AI into the integration lifecycle.

Today, the boundaries between iPaaS, automation platforms, and AI development environments are blurring as vendors create unified solutions that can handle everything from basic data synchronization to complex business processes. 

AI and machine learning capabilities are also being embedded directly into integration platforms. Soon, features like predictive maintenance of integration flows or intelligent routing of data based on current conditions are likely to become table stakes. Already, integration platforms are becoming smarter and more autonomous, capable of optimizing themselves and, in some cases, even initiating self-healing actions when problems arise.

At the same time, this shift is transforming how businesses think about integration as a dynamic enabler of AI strategy. In the near future, robust integration frameworks will be essential to operationalize AI at scale and feed these systems the rich, contextual data they need to deliver meaningful insights.

Building integration as competitive advantage

In addition to the retail modernization story detailed earlier, a few more real-world examples highlight the potential of iPaaS:

  • A chemicals manufacturer migrated 363 legacy interfaces to an iPaaS platform and now spins up new integrations 50% faster.
  • A North American bottling company reduced integration runtime costs by more than 50% while supporting 12 legal entities on a single cloud ERP instance through common APIs.
  • A global shipping-technology firm connected its CRM and third-party systems via cloud-based iPaaS solutions, enabling 100% touchless order fulfillment and a 95% cut in cost centers after a nine-month rollout in its first region.

Taken together, these examples make a compelling case for integration as strategy, not just infrastructure. They reflect a shift in mindset, where integration is democratized and embedded into how every team, not just IT, gets work done. Companies that treat integration as a core capability versus an IT afterthought are reaping tangible, enterprise-wide benefits, from faster go-to-market timelines and reduced operational costs to fully automated business processes.

As AI reshapes business processes and customer standards continue to climb, enterprises are realizing that integration architecture determines not only what they can build today, but how quickly they can adapt to whatever comes tomorrow.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

Scaling integrated digital health

Around the world, countries are facing the challenges of aging populations, growing rates of chronic disease, and workforce shortages, leading to a growing burden on health care systems. From diagnosis to treatment, AI and other digital solutions can enhance the efficiency and effectiveness of health care, easing the burden on straining systems. According to the World Health Organization (WHO), spending an additional $0.24 per patient per year on digital health interventions could save more than two million lives from non-communicable diseases over the next decade.

To work most effectively, digital solutions need to be scaled and embedded in an ecosystem that ensures a high degree of interoperability, data security, and governance. If not, the proliferation of point solutions—where specialized software or tools focus on just one specific area or function—could lead to silos and digital canyons, complicating rather than easing the workloads of health care professionals, and potentially impacting patient treatment. Importantly, technologies that enhance workforce productivity should keep humans in the loop, aiming to augment their capabilities rather than replace them.

Through a survey of 300 health care executives and a program of interviews with industry experts, startup leaders, and academic researchers, this report explores the best practices for success when implementing integrated digital solutions into health care, and how these can support decision-makers in a range of settings, including laboratories and hospitals. 

Key findings include: 

Health care is primed for digital adoption. The global pandemic underscored the benefits of value-based care and accelerated the adoption of digital and AI-powered technologies in health care. Overwhelmingly, 96% of the survey respondents say they are “ready and resourced” to use digital health, while one in four say they are “very ready.” However, 91% of executives agree interoperability is a challenge, with a majority (59%) saying it will be “tough” to solve. Two in five leaders say balancing security with usability is the biggest challenge for digital health. With the adoption of cloud solutions, organizations can enjoy the benefits of modernized IT infrastructure: 36% of the survey respondents believe scalability is the main benefit, followed by improved security (28%). 

Digital health care can help health care institutions transform patient outcomes—if built on the right foundations. Solutions like AI-powered diagnostics, telemedicine, and remote monitoring can offer measurable impact across the patient journey, from improving early disease detection to reducing hospital readmission rates. However, these technologies can only support fully connected health care when scaled up and embedded in ecosystems with robust data governance, interoperability, and security. 

Health care data has immense potential—but fragmentation and poor interoperability hinder impact. Health care systems generate vast quantities of data, yet much of it remains siloed or unusable due to inconsistent formats and incompatible IT systems, limiting scalability. 

Digital tools must augment, not overload, the workforce. With global health care workforce shortages worsening, digital solutions like clinical decision support tools, patient prediction, and remote monitoring can be seen as essential aids rather than threats to the workforce. Successful deployment depends on usability, clinician engagement, and training. 

Regulatory evolution, open data policies, and economic sustainability are key to scaling digital health. Even the best digital tools struggle to scale without reimbursement frameworks, regulatory support, and viable business models. Open data ecosystems are needed to unleash the clinical and economic value of innovation. Regulatory and reimbursement innovation is also critical to transitioning from pilot projects to high-impact, system-wide adoption.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

Powering next-gen services with AI in regulated industries 

Businesses in highly regulated industries like financial services, insurance, pharmaceuticals, and health care are increasingly turning to AI-powered tools to streamline complex and sensitive tasks. Conversational AI-driven interfaces are helping hospitals to track the location and delivery of a patient’s time-sensitive cancer drugs. Generative AI chatbots are helping insurance customers answer questions and solve problems. And agentic AI systems are emerging to support financial services customers in making complex financial planning and budgeting decisions.

“Over the last 15 years of digital transformation, the orientation in many regulated sectors has been to look at digital technologies as a place to provide more cost-effective and meaningful customer experience and divert customers from higher-cost, more complex channels of service,” says Peter Neufeld, who leads the EY Studio+ digital and customer experience capability at EY for financial services companies in the UK, Europe, the Middle East, and Africa. 

For many, the “last mile” of the end-to-end customer journey can present a challenge. Services at this stage often involve much more complex interactions than the usual app or self-service portal can handle. This could be dealing with a challenging health diagnosis, addressing late mortgage payments, applying for government benefits, or understanding the lifestyle you can afford in retirement. “When we get into these more complex service needs, there’s a real bias toward human interaction,” says Neufeld. “We want to speak to someone, we want to understand whether we’re making a good decision, or we might want alternative views and perspectives.” 

But these high-cost, high-touch interactions can be less than satisfying for customers when handled through a call center if, for example, technical systems are outdated or data sources are disconnected. Those kinds of problems ultimately lead to complaints and lost business. Good customer experience is critical for the bottom line: customers are 3.8 times more likely to make return purchases after a successful experience than after an unsuccessful one, according to Qualtrics. Intuitive AI-driven systems—supported by robust data infrastructure that can efficiently access and share information in real time—can boost the customer experience, even in complex or sensitive situations.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

Shoring up global supply chains with generative AI

The outbreak of covid-19 laid bare the vulnerabilities of global, interconnected supply chains. National lockdowns triggered months-long manufacturing shutdowns. Mass disruption across international trade routes sparked widespread supply shortages. Costs spiraled. And wild fluctuations in demand rendered tried-and-tested inventory planning and forecasting tools useless.

“It was the black swan event that nobody had accounted for, and it threw traditional measures for risk and resilience out the window,” says Matthias Winkenbach, director of research at the MIT Center for Transportation and Logistics. “Covid-19 showed that there were vulnerabilities in the way the supply chain industry had been running for years. Just-in-time inventory, a globally interconnected supply chain, a lean supply chain—all of this broke down.”

It is not the only catastrophic event to strike supply chains in the last five years either. For example, in 2021 a six-day blockage of the Suez Canal—a narrow waterway through which 30% of global container traffic passes—added further upheaval, impacting an estimated $9.6 billion in goods each day that it remained impassable.

These shocks have been a sobering wake-up call. Now, 86% of CEOs cite resilience as a priority issue in their own supply chains. Amid ongoing efforts to better prepare for future disruptions, generative AI has emerged as a powerful tool, capable of surfacing risk and solutions to circumnavigate threats.

Download the full article.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

Fueling seamless AI at scale

From large language models (LLMs) to reasoning agents, today’s AI tools bring unprecedented computational demands. Trillion-parameter models, workloads running on-device, and swarms of agents collaborating to complete tasks all require a new paradigm of computing to become truly seamless and ubiquitous.

Three kinds of progress will make that paradigm possible. First, technical progress in hardware and silicon design is critical to pushing the boundaries of compute. Second, advances in machine learning (ML) allow AI systems to achieve increased efficiency with smaller computational demands. Finally, the integration, orchestration, and adoption of AI into applications, devices, and systems is crucial to delivering tangible impact and value.

Silicon’s mid-life crisis

AI has evolved from classical ML to deep learning to generative AI. The most recent chapter, which took AI mainstream, hinges on two phases, training and inference, that are data- and energy-intensive in terms of computation, data movement, and cooling. At the same time, Moore’s Law, the observation that the number of transistors on a chip doubles roughly every two years, is reaching a physical and economic plateau.

For the last 40 years, silicon chips and digital technology have nudged each other forward—every step ahead in processing capability frees the imagination of innovators to envision new products, which require yet more power to run. That is happening at light speed in the AI age.

As models become more readily available, deployment at scale puts the spotlight on inference and the application of trained models for everyday use cases. This transition requires the appropriate hardware to handle inference tasks efficiently. Central processing units (CPUs) have managed general computing tasks for decades, but the broad adoption of ML introduced computational demands that stretched the capabilities of traditional CPUs. This has led to the adoption of graphics processing units (GPUs) and other accelerator chips for training complex neural networks, due to their parallel execution capabilities and high memory bandwidth that allow large-scale mathematical operations to be processed efficiently.

But CPUs remain the most widely deployed processors and can act as companions to accelerators such as GPUs and tensor processing units (TPUs). AI developers are also hesitant to adapt software to fit specialized or bespoke hardware, and they favor the consistency and ubiquity of CPUs. Chip designers are unlocking performance gains through optimized software tooling, adding novel processing features and data types specifically to serve ML workloads, integrating specialized units and accelerators, and advancing silicon chip innovations, including custom silicon. AI itself is a helpful aid for chip design, creating a positive feedback loop in which AI helps optimize the chips it needs to run. These enhancements, together with strong software support, make modern CPUs a good choice for handling a range of inference tasks.

Beyond silicon-based processors, disruptive technologies are emerging to address growing AI compute and data demands. The unicorn start-up Lightmatter, for instance, introduced photonic computing solutions that use light for data transmission to generate significant improvements in speed and energy efficiency. Quantum computing represents another promising area in AI hardware. While still years or even decades away, the integration of quantum computing with AI could further transform fields like drug discovery and genomics.

Understanding models and paradigms

Developments in ML theory and network architectures have significantly enhanced the efficiency and capabilities of AI models. Today, the industry is moving from monolithic models to agent-based systems characterized by smaller, specialized models that work together to complete tasks more efficiently at the edge—on devices like smartphones or modern vehicles. This approach extracts greater performance, such as faster model response times, from the same or even less compute.

Researchers have developed techniques, including few-shot learning, to train AI models using smaller datasets and fewer training iterations. AI systems can learn new tasks from a limited number of examples, reducing dependency on large datasets and lowering energy demands. Optimization techniques like quantization, which lowers memory requirements by selectively reducing numerical precision, are helping shrink model sizes without sacrificing performance.
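
To make the quantization idea concrete, here is a minimal sketch using PyTorch’s post-training dynamic quantization; the toy model, layer sizes, and input are illustrative assumptions rather than anything described in this article.

```python
# Minimal sketch of post-training dynamic quantization (PyTorch).
# The model, sizes, and input are illustrative placeholders.
import torch
import torch.nn as nn

# Stand-in for a trained model; in practice this would be a real network.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
model.eval()

# Swap Linear weights for 8-bit integer equivalents at inference time,
# trading a small amount of precision for a smaller memory footprint.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
with torch.no_grad():
    print(quantized(x).shape)  # same interface, reduced memory use
```

The quantized model exposes the same interface as the original, which is one reason this kind of optimization is commonly applied before deploying to memory-constrained hardware.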

New system architectures, like retrieval-augmented generation (RAG), have streamlined data access during both training and inference to reduce computational costs and overhead. DeepSeek-R1, an open source LLM, is a compelling example of how more output can be extracted from the same hardware. By applying reinforcement learning techniques in novel ways, R1 has achieved advanced reasoning capabilities while using far fewer computational resources in some contexts.
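
As a rough illustration of the retrieval step that RAG adds in front of generation, the sketch below ranks a tiny in-memory corpus by similarity to a query and assembles a grounded prompt. The embed() function and the corpus are hypothetical stand-ins for a real embedding model and document store, not components named in this article.

```python
# Minimal sketch of the retrieval step in a RAG pipeline.
# embed() and the corpus stand in for a real embedding model and document store.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder: a real system would call an embedding model here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=64)
    return v / np.linalg.norm(v)

corpus = [
    "Quantization reduces model memory by lowering numerical precision.",
    "Retrieval-augmented generation grounds answers in external documents.",
    "Heterogeneous computing spreads work across CPUs, GPUs, and accelerators.",
]
doc_vectors = np.stack([embed(d) for d in corpus])

def retrieve(query: str, k: int = 2) -> list[str]:
    q = embed(query)
    scores = doc_vectors @ q            # cosine similarity (vectors are unit length)
    top = np.argsort(scores)[::-1][:k]  # indices of the k most similar documents
    return [corpus[i] for i in top]

query = "How does RAG reduce computational overhead?"
context = "\n".join(retrieve(query))
prompt = f"Answer using this context:\n{context}\n\nQuestion: {query}"
# A real pipeline would now pass `prompt` to an LLM for generation.
print(prompt)
```

A production system would replace embed() with a trained embedding model and hand the assembled prompt to an LLM, but the retrieval logic follows the same shape: fetch only the relevant context rather than encoding everything into the model itself.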

The integration of heterogeneous computing architectures, which combine various processing units like CPUs, GPUs, and specialized accelerators, has further optimized AI model performance. This approach allows for the efficient distribution of workloads across different hardware components to optimize computational throughput and energy efficiency based on the use case.
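
A minimal sketch of this kind of workload placement follows, assuming PyTorch on a machine that may or may not have a GPU attached; the split between CPU-side preparation and accelerator-side inference is an illustrative choice, not a prescribed architecture.

```python
# Minimal sketch of distributing work across heterogeneous hardware (PyTorch).
# The device routing and workload split are illustrative assumptions.
import torch
import torch.nn as nn

# Route the compute-heavy model to an accelerator when one is present,
# and keep lightweight pre- and post-processing on the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(128, 64).to(device)

def preprocess(batch: torch.Tensor) -> torch.Tensor:
    # CPU-side preparation (e.g., normalization) before handing off to the accelerator.
    return (batch - batch.mean()) / (batch.std() + 1e-6)

batch = preprocess(torch.randn(32, 128))   # stays on the CPU
with torch.no_grad():
    output = model(batch.to(device))       # heavy matrix math on the accelerator
result = output.cpu()                      # bring results back for CPU-side logic
print(result.shape, device)
```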

Orchestrating AI

As AI becomes an ambient capability humming in the background of many tasks and workflows, agents are taking charge and making decisions in real-world scenarios. These range from customer support to edge use cases, where multiple agents coordinate and handle localized tasks across devices.

With AI increasingly used in daily life, user experience becomes critical for mass adoption. Features like predictive text on touch keyboards and adaptive gearboxes in vehicles offer glimpses of AI as a vital enabler that improves how users interact with technology.

Edge processing is also accelerating the diffusion of AI into everyday applications, bringing computational capabilities closer to the source of data generation. Smart cameras, autonomous vehicles, and wearable technology now process information locally to reduce latency and improve efficiency. Advances in CPU design and energy-efficient chips have made it feasible to perform complex AI tasks on devices with limited power resources. This shift toward heterogeneous compute enhances the development of ambient intelligence, where interconnected devices create responsive environments that adapt to user needs.

Seamless AI naturally requires common standards, frameworks, and platforms to bring the industry together. Contemporary AI also brings new risks. For instance, by adding more complex software and personalized experiences to consumer devices, it expands the attack surface for hackers. That requires stronger security at both the software and silicon levels, including cryptographic safeguards and a rethinking of the trust model of compute environments.

More than 70% of respondents to a 2024 Darktrace survey reported that AI-powered cyber threats significantly impact their organizations, while 60% said their organizations are not adequately prepared to defend against AI-powered attacks.

Collaboration is essential to forging common frameworks. Universities contribute foundational research, companies apply findings to develop practical solutions, and governments establish policies for ethical and responsible deployment. Organizations like Anthropic are setting industry standards by introducing frameworks, such as the Model Context Protocol, to unify the way developers connect AI systems with data. Arm is another leader in driving standards-based and open source initiatives, including ecosystem development to accelerate and harmonize the chiplet market, where chips are stacked together through common frameworks and standards. Arm also helps optimize open source AI frameworks and models for inference on the Arm compute platform, without needing customized tuning. 

How far AI goes toward becoming a general-purpose technology, like electricity or semiconductors, is being shaped by technical decisions taken today. Hardware-agnostic platforms, standards-based approaches, and continued incremental improvements to critical workhorses like CPUs all help deliver the promise of AI as a seamless and silent capability for individuals and businesses alike. Open source contributions also allow a broader range of stakeholders to participate in AI advances. By sharing tools and knowledge, the community can cultivate innovation and help ensure that the benefits of AI are accessible to everyone, everywhere.

Learn more about Arm’s approach to enabling AI everywhere.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

Building customer-centric convenience

In the U.S., two-thirds of the country’s 150,000 convenience stores are run by independent operators. Mom-and-pop shops, powered by personal relationships and local knowledge, are the backbone of the convenience sector. These neighborhood operators have long lacked the resources needed to compete with larger chains when it comes to technology, operations, and customer loyalty programs. 

As consumer expectations evolve, many small business owners find themselves grappling with outdated systems, rising costs, and limited digital tools to keep up.

“What would happen if these small operations could combine their knowledge of their market, of their neighborhood, with the state-of-the-art technology?” asks GM of digital products, mobility, and convenience for the Americas at bp, Tarang Sethia. That question is shaping a years-long, multi-pronged initiative to bring modern retail tools, like cloud-connected point-of-sale systems and personalized AI, into the hands of local convenience store operators, without stripping their independence. 

Sethia’s mission is to close the digital gap. bp’s newly launched Earnify app centralizes loyalty rewards for convenience stores across the country, helping independent stores build repeat business with data-informed promotions. Behind the scenes, a cloud-based operating system can proactively monitor store operations and infrastructure to automate fixes to routine issues and reduce costly downtime. This is especially critical for businesses that double as their own IT departments. 

“We’ve aggregated all of that into one offering for our customers. We proactively monitor it. We fix it. We take ownership of making sure that these systems are up. We make sure that the systems are personalizing offers for the customers,” says Sethia. 

But the goal isn’t to corporatize corner stores. “We want them to stay local,” says Sethia. “We want them to stay the mom-and-pop store operator that their customers trust, but we are providing them the tools to run their stores more efficiently and to delight their guests.”

From personalizing promotions to proactively resolving technical issues to optimizing in-store inventory, the success of AI should be measured, says Sethia, by its ability to make frontline workers more effective and customers more loyal.

The future, Sethia believes, lies in thoughtful integration of technology that centers humans rather than replacing them. 

“AI and other technologies should help us create an ecosystem that does not replace humans, but actually augments their ability to serve consumers and to serve the consumers so well that the consumers don’t go back to their old ways.”

This episode of Business Lab is produced in association with Infosys Cobalt.

Full Transcript 

Megan Tatum: From MIT Technology Review, I’m Megan Tatum, and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace. 

This episode is produced in partnership with Infosys Cobalt. 

Our topic today is innovating with AI. As companies move along in their journey to digitalization and AI adoption, we’re starting to see real-world business models that demonstrate the innovation these emerging technologies enable. 

Two words for you: ecosystem innovation. 

My guest today is Tarang Sethia, the GM of digital products, mobility, and convenience for the Americas at bp.

Welcome, Tarang.

Tarang Sethia: Thank you.

Megan: Lovely to have you. Now, for a bit of context just to start with, could you give us some background about the current convenience store and gas station landscape in the United States and what the challenges are for owners and customers right now?

Tarang: Absolutely. What is important to understand is, what is the state of the market? If you look at the convenience and mobility market, it is a very fragmented market. The growth and profitability are driven by consumer loyalty, store experience, and also buying power of the products that they sell to the customers that come into their stores.

And from an operations perspective, there is a vast difference. If you take the bucket of these single-store, smaller operators, they are very well run; they are in the community, and they know their customers. Sometimes they even know the frequent buyers who are coming in, address them by name, and keep their product ready. They know their communities and customers, and they have a personal affinity with them. They also know their likes and dislikes. But they also need to adapt rapidly to the changing needs of their customers. These mom-and-pop stores represent the core of the convenience market, and they constitute about 60% of the entire market.

Now, where the fragmentation lies is, there are also larger operations that are equally motivated to develop strong relationships with customers and they have the scale. They may not match the personal affinity of these mom-and-pop store operators, but they do have the capital to actually leverage data, technology, AI, to personalize and customize their stores for the consumers or the customers that come to their stores. 

And this is maybe 25% or 30% of the market. Just to put that number in perspective, out of the 150,000 convenience stores in the US market, the 60% that are mom-and-pop operated constitute almost 100,000 stores. The rest are run through organized retail.

Now let me talk about the problems that they face. In today’s day and age, these mom-and-pop stores don’t have the capital to create a loyalty program and the offers that make customers choose to come to their store instead of going to somebody else. They also don’t have simpler operations technology or an operations ecosystem. What I mean is that they don’t have systems that stay up; these are still legacy POS systems running their stores. So they spend a lot of time just making the transaction happen.

Finally, what they pay for, say, a bottle of soda, compared to the larger operation, because of the lack of buying power, also eats into their margin. So overall, the problems are that they’re not able to delight their guests with loyalty. Their operations are not simple, and so they do a lot of work to keep their operations up to date and pay a lot more for their operations, both technology and convenience operations. That’s kind of the summary.

Megan: Right, and I suppose there’s a way to help them address these challenges. I know bp has created this new way to reach convenience store owners to offer various new opportunities and products. Could you tell us a bit about what you’ve been working on? For example, I know there’s an app, point of sale and payment systems, and a snack brand, and also how these sort of benefit convenience store owners and their customers in this climate that we’re talking about.

Tarang: So bp is in pursuit of digital-first customer experiences that don’t replace the one-on-one human interactions of mom-and-pop store operators, but amplify them by providing an ecosystem that helps operators delight their guests, run their stores simply and more efficiently, and also reduce their cost while doing so. What we have done as bp is launch a suite of customer solutions and an innovative retail operating system experience. We’ve branded it Crosscode so that it works from the forecourt to the backcourt; it works for the consumers, it works for the stores to run more efficiently, and we can leverage all kinds of technologies, like AI, to personalize and customize for the customers and the stores.

The reason why we did this is, we asked ourselves, what would happen if these small operations could combine their knowledge of their market, of their neighborhood, with the state-of-the-art technology? That’s how we came up with a consumer app called Earnify. It is kind of the Uber of loyalty programs. We did not name it BPme. We did not name it BP Rewards or ampm or Thorntons. We created one standardized loyalty program that would work in the entire country to get more loyal consumers and drive their frequency, and we’ve scaled it to about 8,000 stores in the last year, and the results are amazing. There are 68% more active, loyal consumers that are coming through Earnify nationally. 

And the second piece, which is even more important, and which a lot of companies haven’t taken care of, is a simple-to-operate, cloud-based retail operating system: the POS, or point of sale, plus the ecosystem of the products they sell to customers and the payment systems. We have applied AI to automate a lot of tasks in this retail operating system.

What that has led to is a 20% reduction in operating costs for these mom-and-pop store operators. That 20% reduction in operating costs goes directly to the bottom line of these stores. So now, the mom-and-pop store operators are going to be able to delight their guests, keeping their customers loyal. Number two, they’re able to spend less money on running their store operations. And number three, very, very important, they are able to spend more time serving the guests instead of running the store.

Megan: Yeah, absolutely. Really fantastic results that you’ve achieved there already. And you touched on a couple of the sort of technologies you’ve made use of there, but I wondered if you could share a bit more detail on what additional technologies, like cloud and AI, did you adopt and implement, and perhaps what were some of the barriers to adoption as well?

Tarang: Absolutely. I will first start with how did we enable these mom-and-pop store operators to delight their guests? The number one thing that we did was we first started with a basic points-based loyalty program where their guests earn points and value for both fueling at the fuel pump and buying convenience store items inside the store. And when they have enough points to redeem, they can redeem them either way. So they have value for going from the forecourt to the backcourt and backcourt to the forecourt. Number one thing, right? Then we leveraged data, machine learning, and artificial intelligence to personalize the offer for customers.

If you’re on Earnify and I am in New York, and if I were a bagel enthusiast, then it would send me offers of a bagel plus coffee. And say my wife likes to go to a convenience store to quickly pick up a salad and a diet soda. She would get offers for that, right? So personalization. 

What we also applied is, now these mom-and-pop store operators, depending on the changing seasons or the changing landscape, could create their own offers and they could be instantly available to their customers. That’s how they are able to delight their guests. Number two is, these mom-and-pop store operators, their biggest problem with technology is that it goes down, and when it goes down, they lose sales. They are on calls, they become the IT support help desk, right? They’re trying to call five different numbers.

So we first provided a proactively monitored help desk. We leveraged AI technology to monitor what is working in their store and what is not, and to look at patterns to find out what may be going down, like a PIN pad. We would know, hours before and just by looking at the patterns, that the PIN pad may have issues. We proactively call the customer or the store to say, “Hey, you may have some problems with the PIN pad. You need to replace it, or you need to restart it.”

What that does is, it takes away the six to eight hours of downtime and lost sales for these stores. That’s a proactively monitored solution. And if they ever do have an issue, they need to call only one number, and we take ownership of solving the store’s problems for them. Now, it’s almost like they have an outsourced help desk, which is leveraging AI technology to proactively monitor, resolve, and fix issues faster, because we now know that store X also had this issue and what it took to resolve it, instead of spending hours trying to work it out from scratch.

The third thing that we’ve done is we have put in a cloud-based POS system so we can constantly monitor their POS. We’ve connected it to their back office pricing systems so they can change the prices of products faster and [monitor] how they are performing. This actually helps the store to ask, “Okay, what is working, what is not working? What do I need to change?” in near real-time, instead of waiting hours or days or weeks to react to changing customer needs. And now they don’t need to weigh the decision of whether they have the capital to invest in this technology. The scale of bp allows them to get in and to leverage technology that is 20% cheaper and works much better for them.

Megan: Fantastic. Some really impactful examples of how you’ve used technology there. Thank you for that. And how has bp also been agile or quick to respond to the data it has received during this campaign?

Tarang: Agility is a mindset. What we’ve done is bring in a customer-obsessed mindset. As our leader Greg Franks talks about, we have put the customer at the heart of everything that we do. For us, customers are the people who come to our stores and the people on the frontline who serve them. Their needs are of the utmost importance. What we did was change how we went about the business around them. Instead of going to vendors and putting vendors in charge of the store technology and consumer technology, we took ownership. We built out a technology team that was trained in the latest tools and technologies like AI, POS, and APIs.

Then we changed the processes of how quickly we go to market. Instead of waiting two years on an enterprise project and then delivering it three years later, what we said was, “Let’s look at an MVP experience, most valuable experience delivered through a product for the customers.” And we started putting it in the stores so that the store owners could start delighting their guests and learning. Some things worked, some didn’t, but we learned much faster and were able to react almost on a weekly basis. Our store owners now get these updates on a biweekly basis instead of waiting two years or three years.

Third, we’ve applied an ecosystem mindset. Companies like Airbnb and Uber are known for their aggregator business models. They don’t do everything themselves, and we don’t do everything ourselves. But what we have done is, we’ve become an aggregator of all the capabilities, like consumer app, like POS, like back office or convenience value chain, like pricing, like customer support. We’ve aggregated all of that into one offering for our customers. We proactively monitor it. We fix it. We take ownership of making sure that these systems are up. We make sure that the systems are personalizing offers for the customers. So the store owner can just focus on delighting their guests.

We have branded this as the Crosscode Retail Operating System, and we are providing it as a SaaS service. You can see in the name that there’s no bp in it because, unlike the very big convenience players, we are not trying to fold them into a particular brand that we want them to be. We want them to stay local. We want them to stay the mom-and-pop store operator that their customers trust, but we are providing them the tools to run their stores more efficiently and to delight their guests.

Megan: Really fantastic. And you mentioned that this was a very customer-centric approach that you took. So, how important was it to focus on that customer experience, in addition to the technology and all that it can provide?

Tarang: The customer experience was the most important thing. We could have started with a project and determined, “Hey, this is how it makes money for bp first.” But we said, “Okay, let’s look at solving the core problems of the customer.” Our customer told us, “Hey, I want to pay frictionlessly at the pump when I come to the pump.” So what did we do? We launched a pay-for-fuel feature, where they can come to the pump and they don’t need to take their wallet out. They just take their app out and choose which pump and which payment method.

Then they said, “Hey, I don’t get any value from buying fuel every week and going inside. These are two different stores for me.” So what did we do? We launched a unified loyalty program. Then the store owner said, “Hey, my customers don’t like the same offers that you do nationally.” So what did we do? We created both personalized offers and build-your-own offers for the store owner. 

Finally, to be even more customer-obsessed, we said that being customer-obsessed doesn’t just happen; we have to measure it. We are constantly measuring how consumers are rating the offers in our app and how they are rating that experience. And we made a dramatic shift. If you go to the Earnify app in the app store, consumers are rating it 4.9.

We have 68% more loyal consumers. We are also measuring these loyal consumers, how often they are coming and what they are buying. Then we said, “Okay, from a store owner perspective, their satisfaction is important.” We are constantly measuring the satisfaction of these store operators and the frontline employees who are operating the systems. Customer satisfaction used to be three out of 10 when we first started, and now, it has reached an 8.7 out of 10, and we are constantly monitoring. Some stores go down because we haven’t paid enough attention. We learn from it and we apply.

Finally, with this Earnify app, instead of a local store operator having their own loyalty program with a few hundred customers (after all, how many people are going to download that app?), we’ve given them a network of millions of consumers nationwide that can be part of the ecosystem. The technologies that we are using are helping the stores delight consumers, helping the stores provide the value and the experience that those consumers expect, and also helping bp provide a seamless experience to the frontline employees.

Megan: Fantastic. There are some incredible results there in terms of customer satisfaction. Are there any other metrics of success that you’re tracking along the way? Any other kind of wins that you can share so far in the implementation of all of this?

Tarang: We are tracking very important, deeper metrics so that we can hold ourselves accountable: the uptime of the store, the mean time to resolve issues, the sales uplift of the stores, and the transaction uplift of the stores. Are the consumers buying more? Are the consumers rating their experience higher? Are they engaging with different offers? Because we may do hundreds of offers, and if consumers don’t like them, then they are just offers.

On this journey, we are measuring every metric, and we are making it transparent. That entire team is on the same scorecard of metrics that the customers or the store owners have for the performance of their business. Their performance and the consumer delight are embedded into the metrics on how all of us digital employees are measured.

Megan: Yes, absolutely. It sounds like you’re measuring success through several different lenses, so it’s really interesting to hear about that approach. Given where you are in your journey, as many companies struggle to adopt and implement AI and other emerging technologies, is there any advice that you’d offer, given the lessons you’ve learned so far?

Tarang: On AI, we have to keep it very, very simple. Instead of saying, “Hey, we are going to use AI technology for the sake of it,” we have to tie the usage of AI technology to the impact it has on the customers. I’ll use a few examples of how we are doing that.

When we say we are leveraging AI to personalize the offers, leveraging data for consumers, what are we measuring, and what are we applying? We are looking at the data of consumer behavior and applying AI models to see, based on the current transactions, how would they react, what would they buy? People living in Frisco, Texas, age, whatever, what do they buy, when do they come, and what are they buying other places?

So let’s personalize offers so that they make that left turn. And we are measuring whether personalization is driving delight enough that consumers come back to the store and don’t go back to their old ways. That’s number one. Number two, like I mentioned earlier, we are leveraging data and AI technologies to constantly monitor trends in the marketplace, and we’ve created some automation to act on those trends quickly, which also leads to some level of personalization. It’s more regionalization.

Now, as we do that, we also look at the patterns of what equipment or what transactions are slowing down, and we proactively monitor and resolve them. So if the store has issues, whether with payment, loyalty, POS, or back office, we proactively work to resolve them.

Number three, we are looking at the convenience market, at what is selling and what is in stock, and we are optimizing our supply chain, pricing, and inventory so that store owners can cater to the consumers who come to their stores. This is really helping us have the product in the store that the customer actually came for.

Megan: Absolutely. Looking ahead, when you think about the path to generative AI and other emerging technologies, is there something that excites you the most in the years to come?

Tarang: That’s a great question, Megan. I’m going to answer that question a little bit philosophically because as technologists, our tendency is, whenever there is a new technology like generative AI, to create a lot of toys with it, right? But I’ve learned through this experience that whatever technology we use, like generative AI, we need to tie it to the objectives and key results for the consumer and the store. 

As an example, if we are going to leverage generative AI to create personalized offers and personalized creative, then we need to be able to create frameworks to measure the impact on the store and on the consumer, and tie that directly to the use of the technology. Are we making the consumers more loyal? Are they coming more often? Are they buying more? Because only then will we have adopters of that technology: both the stores and, through the stores, the consumers.

Number two, AI and other technologies should help us create an ecosystem that does not replace humans, but actually augments their ability to serve consumers and to serve the consumers so well that the consumers don’t go back to their old ways. That’s where we have to stay very, very customer-obsessed instead of just business-obsessed.

When I say ecosystem, what excites me the most is, think about it. These small mom-and-pop store operators, these generational businesses, which are the core of the American dream or entrepreneurialism, we are going to enable them with an ecosystem like an Airbnb of mobility and convenience, where they get a loyalty program with personalization, where they can delight their guests. They get technology to run their stores very, very efficiently and reduce their cost by 20%.

Number three, and very important, their frontline employees look like heroes to the guests walking into the store. If we achieve these three things and create an ecosystem, then that will drive prosperity by leveraging technology. And bp, as a company, would love to be part of that.

Megan: I think that’s fantastic advice. Thank you so much, Tarang, for that.

Tarang: Thank you.

Megan: That was Tarang Sethia, the GM of digital products, mobility, and convenience for the Americas at bp, whom I spoke with from Brighton, England.

That’s it for this episode of Business Lab. I’m your host, Megan Tatum. I’m a contributing editor and host for Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts, and if you enjoy this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks ever so much for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.