Building supply chain resilience with AI

If the last five years have taught businesses with complex supply chains anything, it is that resilience is crucial. In the first three months of the covid-19 pandemic, for example, supply-chain leader Amazon grew its business 44%. Its investments in supply chain resilience allowed it to deliver when its competitors could not, says Sanjeev Maddila, worldwide head of supply chain solutions at Amazon Web Services (AWS), increasing its market share and driving profits up 220%. A resilient supply chain ensures that a company can meet its customers’ needs despite inevitable disruption.

Today, businesses of all sizes must deliver to their customers against a backdrop of supply chain disruptions, with technological changes, shifting labor pools, geopolitics, and climate change adding new complexity and risk at a global scale. To succeed, they need to build resilient supply chains: fully digital operations that prioritize customers and their needs while establishing a fast, reliable, and sustainable delivery network.

The Canadian fertilizer company Nutrien, for example, operates two dozen manufacturing and processing facilities spread across the globe and nearly 2,000 retail stores in the Americas and Australia. To collect underutilized data from its industrial operations and gain greater visibility into its supply chain, the company relies on a combination of cloud technology and artificial intelligence/machine learning (AI/ML) capabilities.

“A digital supply chain connects us from grower to manufacturer, providing visibility throughout the value chain,” says Adam Lorenz, senior director for strategic fleet and indirect procurement at Nutrien. This visibility is critical when it comes to navigating the company’s supply chain challenges, which include seasonal demands, weather dependencies, manufacturing capabilities, and product availability. The company requires real-time visibility into its fleets, for example, to identify the location of assets, see where products are moving, and determine inventory requirements.

Currently, Nutrien can locate a fertilizer or nutrient tank in a grower’s field and determine what Nutrien products are in it. By achieving that “real-time visibility” into a tank’s location and a customer’s immediate needs, Lorenz says the company “can forecast where assets are from a fill-level perspective and plan accordingly.” In turn, Nutrien can respond immediately to emerging customer needs, increasing company revenue while enhancing customer satisfaction, improving inventory management, and optimizing supply chain operations.

“For us, it’s about starting with data creation and then adding a layer of AI on top to really drive recommendations,” says Lorenz. In addition to improving product visibility and asset utilization, Lorenz says that Nutrien plans to add AI capabilities to its collaboration platforms that will make it easier for less-tech-savvy customers to take advantage of self-service capabilities and automation that accelerates processes and improves compliance with complex policies.

To meet and exceed customer expectations with differentiated service, speed, and reliability, all companies need to similarly modernize their supply chain operations. The key to doing so—and to increasing organizational resilience and sustainability—will be applying AI/ML to their extensive operational data in the cloud.

Resilience as a business differentiator

Like Nutrien, a wide variety of organizations from across industries are discovering the competitive advantages of modernizing their supply chains. A pharmaceutical company that aggregates its supply chain data for greater end-to-end visibility, for example, can provide better product tracking for critically ill customers. A retail startup undergoing meteoric growth can host its workloads in the cloud to support sudden upticks in demand while minimizing operating costs. And a transportation company can achieve inbound supply chain savings by evaluating the total distance its fleet travels to reduce mileage costs and CO2 emissions.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Transforming the energy industry through disruptive innovation

In the rhythm of our fast-paced lives, most of us don’t stop to think about where electricity comes from or how it powers homes, industries, and the technologies that connect people around the world. As populations and economies grow, energy demands are set to increase by 50% by 2050, challenging century-old energy systems to adapt with innovative and agile solutions. This comes at a time when climate change is making its presence felt more than ever: 2023 marked the warmest year since records began in 1850, crossing the 1.5-degree global warming threshold.

Nadège Petit of Schneider Electric confronts this challenge head-on, saying, “We have no choice but to change the way we produce, distribute, and consume energy, and do it sustainably to tackle both the energy and climate crises.” She explains that digital technologies are key to navigating this path, and that Schneider Electric’s AI-enabled IoT solutions can empower customers to take control of their energy use, enhancing efficiency and resiliency.

Petit acknowledges the complexity of crafting and implementing robust sustainability strategies. She highlights the importance of taking an incremental, stepwise approach and adopting open standards to drive near-term impact while laying the foundation for long-term decarbonization goals.

Because the energy landscape is evolving rapidly, it’s critical not just to keep pace but to anticipate and shape the future. Much as people actively manage their health through food and fitness regimes, they need to monitor their energy habits as well. This can transform passive consumers into energy prosumers: those who produce, consume, and manage energy. Petit’s vision is one where “buildings and homes generate their own energy from renewable sources, use what’s needed, and feed the excess back to the grid.”

To catalyze this transformation, Petit underscores the power of collaboration and innovation. For example, Schneider Electric’s SE Ventures invests in startups to provide new perspectives and capabilities to accelerate sustainable energy solutions. 

“It’s all about striking a balance to ensure that our relationships with startups are mutually beneficial, knowing when to provide guidance and resources when they need it, but also when to step back and allow them to thrive independently,” says Petit.

This episode of Business Lab is produced in partnership with Schneider Electric. 

Full transcript 

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma, and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace. 

Our topic today is disruptive innovation in the energy industry and beyond. We use energy every day. It powers our homes, buildings, economies, and lifestyles. But where it comes from and how our use affects the global energy ecosystem are changing, and our energy systems need to change too.

 My guest is Nadège Petit, the chief innovation officer at Schneider Electric. 

This podcast is produced in partnership with Schneider Electric. 

Welcome, Nadège. 

Nadège Petit: Hi, everyone. Thank you for having me today. 

Laurel: Well, we’re glad you’re here. 

Let’s start off with a simple question to build that context around our conversation. What is Schneider Electric’s mission? And as the chief innovation officer leading its Innovation at the Edge team, what are some examples of what the team is working on right now? 

Nadège: Let me set the scene a little bit here. In recent years, our world has been shaped by a series of significant disruptions. The pandemic has driven a sharp increase in demand for digital tools and technologies, with a projected 6x growth in the number of IoT devices between 2020 and 2030, and a 140x growth in IP traffic between 2020 and 2040.

Simultaneously, there has been a parallel acceleration in energy demands. Electrical consumption has been increasing by 5,000 terawatt-hours every 10 years over the past two decades. This is set to double in the next 10 years and then quadruple by 2040. All of this is amplified by the most severe energy crisis we have faced since the 1970s. Over 80% of carbon emissions come from energy, so electrifying the world and decarbonizing [the] energy sector is a must. We cannot overlook the climate crisis while meeting these energy demands. In 2023, the global average temperature was the warmest on record since 1850, surpassing the 1.5-degree global warming limit. So, we have no choice but to change the way we produce, distribute, and consume energy, and do it sustainably to tackle both the energy and climate crises. This gives us a rare opportunity to reimagine and create the clean energy future we want.

Schneider Electric, as an energy management and digital automation company, aims to be the digital partner for sustainability and efficiency for our customers. With end-to-end experience in the energy sector, we are uniquely positioned to help customers digitize, electrify, and deploy sustainable technologies as they progress toward net-zero.

As for my role, we know that innovation is pivotal to drive the energy transition. The Innovation at the Edge team leads the way in discovering, developing, and delivering disruptive technologies that will define a more digital, electric, and sustainable energy landscape. We function today as an innovation engine, bridging internal and external innovation, to introduce new solutions, services and businesses to the market. Ultimately, we are crafting the future businesses for Schneider Electric in this sector. And to do this, we nourish a culture that recognizes and celebrates innovation. We welcome new ideas, consider new perspectives inside and outside the organization, and seek out unusual combinations that can kindle revolutionary ideas. We like to think of ourselves as explorers and forces of change, looking for and solving new customer problems. So curiosity and daring to disrupt are in our DNA. And this is the true spirit of Innovation at the Edge at Schneider Electric. 

Laurel: And it’s clear that urgency certainly comes out, especially for enterprises, because they’re trying to build strong sustainability strategies not just to reach those environmental, social, and governance, or ESG, goals and targets, but also to improve resiliency and efficiency. What’s the role of digital technologies, when we think about this all together, in enabling a more sustainable future?

Nadège: We see a sustainable future, and our goal is to enable the shift to an all-electric and all-digital world. That kind of transition isn’t possible without digital technology. We see digital as a key enabler of sustainability and decarbonization. The technology is already available; it’s a matter of accelerating its adoption. And all of us have a role to play here.

At Schneider Electric, we have built a suite of solutions that enable customers to accelerate their sustainability journey. Our flagship suite of IoT-enabled solutions, EcoStruxure, empowers customers to monitor energy, carbon, and resource usage, and enables them to implement strategies for efficiency, optimization, and resiliency. We have seen remarkable success stories of clients leveraging EcoStruxure in buildings, utilities, data centers, hospitality, healthcare, and more. To take one example: PG&E, a leading California utility that everybody knows, is using our EcoStruxure distributed energy resources management system, which we call DERMS, to manage grid reliability more effectively. That is crucial in the face of extreme weather events impacting the grid and consumers.

Schneider has also built an extensive ecosystem of partners, because we need to work together at scale to accelerate digital transformation for customers. We also invest in cutting-edge technologies that make need-based collaboration and co-innovation possible. It’s all about working together toward one common goal. Ultimately, the companies that embrace digital transformation will be the ones that thrive on disruption.

Laurel: It’s clear that building a strong sustainability strategy and then following through on the implementation does take time, but addressing climate change requires immediate action. How does your team at Schneider Electric as a whole work to balance those long-term commitments and act with urgency in the short term? It sounds like that internal and external innovation opportunity really could play a role here. 

Nadège: You’re absolutely right. We already have many of the technologies that will take us to net-zero. For example, 70% of CO2 emissions can be removed with existing technologies. By deploying electrification and digital solutions, we can get to our net-zero goals much faster. We know it’s a gradual process, and as we discussed previously, we need to accelerate adoption. By taking an incremental, stepwise approach, we can drive near-term impact while laying the foundation for long-term decarbonization goals.

Building on the same example of PG&E, which I referenced earlier: through our collaboration, piece by piece, we are building the backbone of a sustainable, digitized, and reliable energy future in California with the deployment of EcoStruxure DERMS. As grid reliability and flexibility become more important, DERMS enables us to keep pace with 21st-century grid demands as they evolve.

Another critical component of moving fast is embracing open systems and platforms, creating an interoperable ecosystem. By adopting open standards, you empower a wide range of experts to collaborate, including startups, large organizations, senior decision-makers, and those on the ground. This future-proof investment ensures flexible and scalable solutions that avoid expensive upgrades and obsolescence. That is why at Innovation at the Edge we’re creating win-win partnerships to push market adoption of the innovative technology available today while laying the foundation of an even more innovative tomorrow. Innovation at the Edge provides the space to nurture those ideas, collaborate, iterate, learn, and grow at pace.

Laurel: What’s your strategy for investing in, and then adopting, those disruptive technologies and business models, especially when you’re trying to build that kind of innovation for tomorrow?

Nadège: I strongly believe innovation is a key driver of the energy transition. It’s very hard to create the right conditions for consistent innovation, balancing the short term and the long term as we’ve discussed. I want to quote again the famous book from Clayton Christensen, The Innovator’s Dilemma, about how big organizations can get so good at what they are already doing that they struggle to adapt as the market changes. And we are in this dilemma. So we need to stay ahead. Leaders need to grasp disruptive technology, put customers first, foster innovation, and tackle emerging challenges head-on. The phrase “that’s no longer how we do it” really resonates with me as I look at the role of innovation in the energy space.

At Schneider, innovation is more than just a buzzword. It’s our strategy for navigating the energy transition. We are investing in truly new and disruptive ideas, technologies, and business models, taking on the risk and the challenge. We constantly complement our current offering, including the new prosumer business we’re building, which is pivotal to accelerating the energy transition. We foster open innovation through investment in, and incubation of, cutting-edge technology in energy management, electric mobility, industrial automation, cybersecurity, artificial intelligence, sustainability, and other areas. I can also point to joint ventures we have created with partners, like GreenStruxure and AlphaStruxure. Those offer energy-as-a-service solutions, a new business model that enables organizations to leverage existing technology to achieve decarbonization at scale. As an example, GreenStruxure is helping Bimbo Bakeries USA move closer to net-zero with microgrid systems at six of its locations. These will provide 20% of Bimbo Bakeries USA’s energy usage and save an estimated 1,700 tons of CO2 emissions per year.

Laurel: Yeah, that’s certainly remarkable. Following up on that, how does Schneider Electric define prosumer, and how does that audience fit into Schneider Electric’s strategy as you develop these new models?

Nadège: Prosumer is my favorite word. Let’s define it again. Everybody’s speaking of prosumers, but what is a prosumer? A prosumer is a consumer who is actively involved in energy management, producing and consuming their own energy using digitally enabled technologies like solar panels, EV chargers, and battery storage. Everybody now, including industrial customers, wants to understand their energy. Becoming a prosumer comes with perks like lower energy bills, increased independence, clean energy use, and potential compensation from utility providers. Fantastic, right? It’s beneficial to all of us, to our planet, and to the decarbonization of the world. Imagine a future where buildings and homes generate their own energy from renewable sources, use what’s needed, and feed the excess back to the grid. This is a fantastic opportunity, and the interest in it is massive.

To give you some figures: in 2019 we saw 100 gigawatts of new solar PV capacity deployed globally, and by last year this number had nearly quadrupled. So the transformation is happening now. Electric vehicle sales have been soaring too, with a projected 14 million sales in 2023, six times the 2019 number. These technologies are already making a dent in emissions and the energy crisis.

However, the journey to becoming a prosumer is complex. It’s all about scale and adoption, and it involves challenges with asset integration, grid modernization, and regulatory compliance. We are all part of this ecosystem, and it takes a lot of leadership to make it happen. So at Innovation at the Edge, we’re creating an ecosystem of solutions to streamline the prosumer journey, from education to purchasing, installation, management, and maintenance of these new distributed resources. We are bringing together internal innovations that we already have in-house at Schneider Electric, like microgrids, EV charging solutions, battery storage, and more, with external innovation from portfolio companies such as Qmerit, EnergySage, EV Connect, Uplight, and AutoGrid, to deliver end-to-end solutions from grid to prosumer.

I want to stress one more time how important it is to accelerate, and to be part of, this adoption. These efforts are not just about strengthening our business; they’re about simplifying the energy ecosystem and moving the industry toward greater sustainability. It’s a collaborative journey that’s shaping the future of energy, and I’m very excited about it.

Laurel: Focusing on that kind of urgency: innovation in large companies can be hampered by bureaucracy and move slowly. What are some best practices for innovating without all of those delays?

Nadège: At Schneider Electric, we are no strangers to innovation, specifically in the energy management and industrial automation space. But to really push the envelope, we look beyond our walls for fresh ideas and expertise. And this is where SE Ventures comes in. It’s our one-billion-euro venture capital fund, from which we make bold bets and bring disruptive ideas to life by supporting and investing in startups that complement our current offering and explore future businesses. Based in Silicon Valley but with a global reach, SE Ventures leverages our market knowledge and customer proximity to drive near-term value and commercial relationships with our businesses, customers, and partners.

We also focus on partnership and incubation. Through partnerships with startups, we accelerate time to market and the R&D roadmap, and we explore new products and new markets. When it comes to incubation, we seek out game-changing ideas and entrepreneurs, providing mentorship, resources, and market insight at every stage of their journey. As an example, we have invested in funds like E14, the fund that started out of the MIT Media Lab, to gain early insight into disruptive trends and technology. It’s very important to be involved at the early stage.

SE Ventures has already developed multiple unicorns in our portfolio, and we’re working with several other high-growth companies targeted to become future unicorns in key strategic areas. That is totally consistent with Schneider’s mission.

It’s all about striking a balance to ensure that our relationships with startups are mutually beneficial, knowing when to provide guidance and resources when they need it, but also when to step back and allow them to thrive independently.

Laurel: With that future lens on, what kinds of trends or developments are you seeing in the energy industry, and how are you preparing for them? Are you getting a lot of that excitement from those startups and venture fund ideas?

Nadège: Yeah, absolutely. There are multiple trends, and you need to listen to startups, to innovators, to people coming up with bold ideas. I want to highlight a couple of them. The energy industry is set to see major shifts. We know it, and we want to be part of it. We discussed prosumers; they are very important. A lot of people now understand their bodies, exercising and monitoring their health; tomorrow, people will all monitor their energy in the same way. Those are prosumers. We believe that prosumers, both individuals and businesses, are central to the energy transition. And this is a key focal point for us.

Another trend we discussed is digital, and AI in particular. AI has the potential to be transformative as we build the new energy landscape. One example is AI-powered virtual power plants, or VPPs, which can optimize a large portfolio of distributed energy resources to ensure greater grid resiliency. Increasingly, AI can be at the heart of the modern electrical grid. So at Schneider Electric, we are watching those trends very carefully. We are listening to the external world and to our customers, and we are positioning our solutions and global hubs to best serve their needs.

Laurel: Lastly, as a woman in a leadership position, could you tell us how you’ve navigated your career so far, and how others in the industry can create a more diverse and inclusive environment within their companies and teams? 

Nadège: An inclusive environment starts with us as leaders. Establishing a culture where we value differences and different opinions, believe in equal opportunity for everyone, and foster a sense of belonging is very important in this environment. It’s also important for organizations to create commitments around diversity, equity, and inclusion, communicate them publicly to drive accountability, and report on the progress and how we make it happen.

I was truly fortunate to have started and grown my career at a company like Schneider Electric, where I was surrounded by people who empowered me to be my best self. That is something that should drive all women: to be their best selves. It wasn’t always easy. I have learned how important it is to have a voice and to be bold, to speak up for what you are passionate about, and to use that passion to drive impact. These are values I also work to instill in my own teenage daughters, and I’m thrilled to see them finding their own passions within STEM. The next generation is the driving force in shaping a more sustainable world, and it’s crucial that we focus on leaving the planet a better and more equal place where they can thrive.

Laurel: Words to the wise. Thank you so much, Nadège, for joining us today on the Business Lab.

Nadège: Thank you. 

Laurel: That was Nadège Petit, the chief innovation officer at Schneider Electric, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review. 

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the global director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com. 

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening. 

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Unlocking secure, private AI with confidential computing

All of a sudden, it seems that AI is everywhere, from executive assistant chatbots to AI code assistants.

But despite the proliferation of AI in the zeitgeist, many organizations are proceeding with caution, owing to the perceived security quagmires AI presents. For the emerging technology to reach its full potential, data must be secured through every stage of the AI lifecycle, including model training, fine-tuning, and inferencing.

This is where confidential computing comes into play. Vikas Bhatia, head of product for Azure Confidential Computing at Microsoft, explains the significance of this architectural innovation: “AI is being used to provide solutions for a lot of highly sensitive data, whether that’s personal data, company data, or multiparty data,” he says. “Confidential computing is an emerging technology that protects that data when it is in memory and in use. We see a future where model creators who need to protect their IP will leverage confidential computing to safeguard their models and to protect their customer data.”

Understanding confidential computing

“The tech industry has done a great job in ensuring that data stays protected at rest and in transit using encryption,” Bhatia says. “Bad actors can steal a laptop and remove its hard drive but won’t be able to get anything out of it if the data is encrypted by security features like BitLocker. Similarly, nobody can run away with data in the cloud. And data in transit is secure thanks to HTTPS and TLS, which have long been industry standards.”

But data in use, when data is in memory and being operated upon, has typically been harder to secure. Confidential computing addresses this critical gap—what Bhatia calls the “missing third leg of the three-legged data protection stool”—via a hardware-based root of trust.

Essentially, confidential computing ensures that the only things customers need to trust are the workload running inside a trusted execution environment (TEE) and the underlying hardware. “The concept of a TEE is basically an enclave, or I like to use the word ‘box.’ Everything inside that box is trusted, anything outside it is not,” explains Bhatia.

Until recently, confidential computing worked only on central processing units (CPUs). However, NVIDIA has now brought confidential computing capabilities to the H100 Tensor Core GPU, and Microsoft has made this technology available in Azure. This has the potential to protect the entire confidential AI lifecycle—including model weights, training data, and inference workloads.

“Historically, devices such as GPUs were controlled by the host operating system, which, in turn, was controlled by the cloud service provider,” notes Krishnaprasad Hande, Technical Program Manager at Microsoft. “So, in order to meet confidential computing requirements, we needed technological improvements to reduce trust in the host operating system, i.e., its ability to observe or tamper with application workloads when the GPU is assigned to a confidential virtual machine, while retaining sufficient control to monitor and manage the device. NVIDIA and Microsoft have worked together to achieve this.”

Attestation mechanisms are another key component of confidential computing. Attestation allows users to verify the integrity and authenticity of the TEE and the user code within it, ensuring the environment hasn’t been tampered with. “Customers can validate that trust by running an attestation report themselves against the CPU and the GPU to validate the state of their environment,” says Bhatia.

Additionally, secure key management systems play a critical role in confidential computing ecosystems. “We’ve extended our Azure Key Vault with Managed HSM service which runs inside a TEE,” says Bhatia. “The keys get securely released inside that TEE such that the data can be decrypted.”
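
Conceptually, attestation and key release chain together. The sketch below is illustrative only: the endpoints, claim names, and JSON shapes are hypothetical placeholders rather than a real Azure SDK or API, and in practice the evidence is generated and signed by the TEE hardware itself.

```python
import requests

# Hypothetical endpoints for illustration; not real Azure service URLs.
ATTESTATION_URL = "https://attestation.example.net/attest"
KEY_RELEASE_URL = "https://hsm.example.net/release-key"

def attest_and_get_key(evidence: bytes) -> str:
    """Verify the TEE before any secret is released to the workload."""
    # 1. Send hardware-signed evidence from the CPU and GPU TEEs to an
    #    attestation service, which checks it against known-good states.
    resp = requests.post(ATTESTATION_URL, data=evidence)
    resp.raise_for_status()
    token = resp.json()["attestation_token"]

    # 2. Present the attestation token to a key-management service (an HSM
    #    running inside its own TEE); the decryption key is released only
    #    if the environment proved authentic and untampered.
    resp = requests.post(KEY_RELEASE_URL, json={"token": token})
    resp.raise_for_status()
    return resp.json()["wrapped_key"]
```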

Confidential computing use cases and benefits

GPU-accelerated confidential computing has far-reaching implications for AI in enterprise contexts. It also addresses privacy issues that apply to any analysis of sensitive data in the public cloud. This is of particular concern to organizations trying to gain insights from multiparty data while maintaining utmost privacy.

Another key advantage of Microsoft’s confidential computing offering is that it requires no code changes on the part of the customer, facilitating seamless adoption. “The confidential computing environment we’re building does not require customers to change a single line of code,” notes Bhatia. “They can redeploy from a non-confidential environment to a confidential environment. It’s as simple as choosing a particular VM size that supports confidential computing capabilities.”

Some industries and use cases that stand to benefit from confidential computing advancements include:

  • Governments and sovereign entities dealing with sensitive data and intellectual property.
  • Healthcare organizations using AI for drug discovery while preserving doctor-patient confidentiality.
  • Banks and financial firms using AI to detect fraud and money laundering through shared analysis without revealing sensitive customer information.
  • Manufacturers optimizing supply chains by securely sharing data with partners.

Further, Bhatia says confidential computing helps facilitate data “clean rooms” for secure analysis in contexts like advertising. “We see a lot of sensitivity around use cases such as advertising and the way customers’ data is being handled and shared with third parties,” he says. “So, in these multiparty computation scenarios, or ‘data clean rooms,’ multiple parties can merge their data sets, and no single party gets access to the combined data set. Only the code that is authorized will get access.”

The current state—and expected future—of confidential computing

Although large language models (LLMs) have captured attention in recent months, enterprises have found early success with a scaled-down approach: small language models (SLMs), which are more efficient and less resource-intensive for many use cases. “We can see some targeted SLM models that can run in early confidential GPUs,” notes Bhatia.

This is just the start. Microsoft envisions a future that will support larger models and expanded AI scenarios—a progression that could see AI in the enterprise become less of a boardroom buzzword and more of an everyday reality driving business outcomes. “We’re starting with SLMs and adding in capabilities that allow larger models to run using multiple GPUs and multi-node communication. Over time, [the goal is eventually] for the largest models that the world might come up with to run in a confidential environment,” says Bhatia.

Bringing this to fruition will be a collaborative effort. Partnerships among major players like Microsoft and NVIDIA have already propelled significant advancements, and more are on the horizon. Organizations like the Confidential Computing Consortium will also be instrumental in advancing the underpinning technologies needed to make widespread and secure use of enterprise AI a reality.

“We’re seeing a lot of the critical pieces fall into place right now,” says Bhatia. “We don’t question today why something is HTTPS. That’s the world we’re moving toward [with confidential computing], but it’s not going to happen overnight. It’s certainly a journey, and one that NVIDIA and Microsoft are committed to.”

Microsoft Azure customers can start on this journey today with Azure confidential VMs with NVIDIA H100 GPUs. Learn more here.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Housetraining robot dogs: How generative AI might change consumer IoT

As technology goes, the internet of things (IoT) is old: internet-connected devices outnumbered people on Earth around 2008 or 2009, according to a contemporary Cisco report. Since then, IoT has grown rapidly. By the early 2020s, researchers’ estimates of the number of devices ranged anywhere from the low tens of billions to over 50 billion.

Currently, though, IoT is seeing unusually intense new interest for a long-established technology, even one still experiencing market growth. A sure sign of this buzz is the appearance of acronyms, such as AIoT and GenAIoT, or “artificial intelligence of things” and “generative artificial intelligence of things.”

What is going on? Why now? Examining potential changes to consumer IoT could provide some answers. Consumer IoT spans a vast range of home and personal uses, from smart home controls through smartwatches and other wearables to VR gaming, to name just a handful. The underlying technological changes sparking interest in this specific area mirror those in IoT as a whole.

Rapid advances converging at the edge

IoT is much more than a huge collection of “things,” such as automated sensing devices and the attached actuators that take limited actions. These devices, of course, play a key role. A recent IDC report estimated that all edge devices—many of them IoT ones—account for 20% of the world’s current data generation.

IoT, however, is much more. It is a huge technological ecosystem that encompasses and empowers these devices. This ecosystem is multi-layered, although no single agreed taxonomy exists.

Most analyses include among the strata the physical devices themselves (sensors, actuators, and other machines with which these immediately interact); the data generated by these devices; the networking and communication technology used to gather the generated data, send it to other devices or central data stores, and receive information in return; and the software applications that draw on such information and other possible inputs, often to suggest or make decisions.

The inherent value of IoT lies not in the data itself but in the capacity to use it: to understand what is happening in and around the devices and, in turn, to apply those insights, where necessary, to recommend that humans take action or to direct connected devices to do so.
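
As a minimal illustration of that sense-to-insight-to-action loop, consider the following sketch; the device, metric, threshold, and action names here are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorReading:
    device_id: str
    metric: str    # e.g., "temperature_c" from a refrigerated unit
    value: float

def recommend_action(reading: SensorReading) -> Optional[str]:
    """The value lies not in the data point itself but in the decision it
    enables: recommend that a human act, or direct a connected device to."""
    if reading.metric == "temperature_c" and reading.value > 8.0:
        return f"increase_cooling:{reading.device_id}"
    return None

# A reading flows up from the device layer through the network layer into
# application logic that closes the loop.
print(recommend_action(SensorReading("fridge-042", "temperature_c", 9.3)))
```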

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Scaling green hydrogen technology for the future

Unlike conventional energy sources, green hydrogen offers a way to store and transfer energy without emitting harmful pollutants, positioning it as essential to a sustainable and net-zero future. By converting electrical power from renewable sources into green hydrogen, these low-carbon-intensity energy storage systems can release clean, efficient power on demand through combustion engines or fuel cells. When produced emission-free, hydrogen can decarbonize some of the most challenging sectors, such as steel and cement production, other industrial processes, and maritime transport.

“Green hydrogen is the key driver to advance decarbonization,” says Dr. Christoph Noeres, head of green hydrogen at global electrolysis specialist thyssenkrupp nucera. This promising low-carbon-intensity technology has the potential to transform entire industries by providing a clean, renewable fuel source, moving us toward a greener world aligned with industry climate goals.

Accelerating production of green hydrogen

Hydrogen is the most abundant element in the universe, and its availability is key to its appeal as a clean energy source. However, hydrogen does not occur naturally in its pure form; it is always bound to other elements in compounds like water (H2O). Pure hydrogen can be extracted from water through an energy-intensive process called electrolysis.

Most hydrogen today is produced via steam-methane reforming, in which high-temperature steam is used to produce hydrogen from natural gas. The emissions from this process weigh heavily on hydrogen’s overall carbon footprint: worldwide hydrogen production is currently responsible for as many CO2 emissions as the United Kingdom and Indonesia combined.
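
The underlying chemistry makes the difference plain: steam-methane reforming (taken together with the downstream water-gas shift reaction) carries the carbon in methane through to CO2, whereas electrolysis splits water with no carbon involved:

$$\mathrm{CH_4 + 2\,H_2O \rightarrow CO_2 + 4\,H_2} \quad \text{(steam-methane reforming, overall)}$$

$$\mathrm{2\,H_2O \rightarrow 2\,H_2 + O_2} \quad \text{(electrolysis)}$$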

A solution lies in green hydrogen—hydrogen produced by electrolysis powered by renewable sources. This unlocks the benefits of hydrogen without the dirty fuels. Unfortunately, very little hydrogen production is currently powered by renewables: less than 1% came from non-fossil-fuel sources in 2022.

A massive scale-up is underway. According to McKinsey, an estimated 130 to 345 gigawatts (GW) of electrolyzer capacity will be necessary to meet green hydrogen demand by 2030, with 246 GW of this capacity already announced. This stands in stark contrast to the current installed base of just 1.1 GW. And for green hydrogen to constitute at least 14% of total energy consumption by 2050, the share the International Renewable Energy Agency (IRENA) estimates is required to meet climate goals, 5,500 GW of cumulative installed electrolyzer capacity will be needed.
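
A rough sense of the required build-out follows directly from those figures. The sketch below simply divides the cited capacity estimates by the current installed base, using the midpoint of the 130 to 345 GW range for 2030:

```python
# Back-of-the-envelope scale-up factors, using figures cited in this article.
installed_gw = 1.1              # current installed electrolyzer base
need_2030_gw = (130 + 345) / 2  # midpoint of McKinsey's 2030 estimate
announced_gw = 246              # capacity already announced for 2030
need_2050_gw = 5_500            # IRENA's cumulative 2050 requirement

print(f"2030 midpoint vs. installed: {need_2030_gw / installed_gw:.0f}x")
print(f"Announced vs. installed:     {announced_gw / installed_gw:.0f}x")
print(f"2050 target vs. installed:   {need_2050_gw / installed_gw:,.0f}x")
# Roughly a 200x build-out this decade and a 5,000x build-out by 2050.
```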

However, scaling up green hydrogen production to these levels requires overcoming cost and infrastructure constraints. Becoming cost-competitive means improving and standardizing the technology, harnessing the scale efficiencies of larger projects, and encouraging government action to create market incentives. Moreover, the expansion of renewable energy in regions with significant solar, hydro, or wind energy potential is another crucial factor in lowering renewable power prices and, consequently, the costs of green hydrogen.

Electrolysis innovation

While electrolysis technologies have existed for decades, scaling them up to meet the demand for clean energy will be essential. Alkaline water electrolysis (AWE), the dominant and most developed electrolysis method, is poised for this transition. It has been used for decades, demonstrating efficiency and reliability in the chemical industry. Moreover, it is more cost-effective than other electrolysis technologies and is well suited to run directly on fluctuating renewable power input. Especially for large-scale applications, AWE offers significant advantages in investment and operating costs. “Transferring small-scale manufacturing and optimizing it towards mass manufacturing will need a high level of investment across the industry,” says Noeres.

Industries that already practice electrolysis, as well as those that already use hydrogen, such as fertilizer production, are well poised for conversion to green hydrogen. For example, thyssenkrupp nucera benefits from a decades-long heritage of using electrolyzer technology in the chlor-alkali process, which produces chlorine and caustic soda for the chemical industry. The company “is able to use its existing supply chain to ramp up production quickly, a distinction that not all providers share,” says Noeres.

Alongside scaling up existing solutions, thyssenkrupp nucera is developing complementary techniques and technologies. Among these are solid oxide electrolysis cells (SOEC), which perform electrolysis at very high temperatures. While the need for high temperatures means this technique isn’t right for all customers, in industries where waste heat is readily available—such as chemicals—Noeres says SOEC offers up to 20% enhanced efficiency and reduces production costs.

Thyssenkrupp nucera has entered into a strategic partnership with the renowned German research institute Fraunhofer IKTS to move the technology toward applications in industrial manufacturing. The company envisages SOEC as a complement to AWE in areas where it is cost-effective, reducing overall energy consumption. “The combination of AWE and SOEC in thyssenkrupp nucera’s portfolio offers a unique product suite to the industry,” says Noeres.

While advancements in electrolysis technology and the diversification of its applications across various scales and industries are promising for green hydrogen production, a coordinated global ramp-up of renewable energy sources and clean power grids is also crucial. Although AWE electrolyzers are ready for deployment in large-scale, centralized green hydrogen production facilities, these must be integrated with renewable energy sources to truly harness their potential.

Making the green hydrogen market

Storage and transportation remain obstacles to a larger market for green hydrogen. While hydrogen can be compressed and stored, its low density presents a practical challenge: for the same energy content, hydrogen occupies nearly four times the volume of natural gas, and storage requires either ultra-high compression or costly refrigeration. Overcoming the economic and technical hurdles of high-volume hydrogen storage and transport will be critical to its potential as an exportable energy carrier.

In 2024, several high-profile green hydrogen projects launched in the U.S., advancing the growth of green hydrogen infrastructure and technology. The landmark Inflation Reduction Act (IRA) provides tax credits and government incentives for producing clean hydrogen and the renewable electricity used in its production. In October 2023, the Biden administration announced $7 billion for the country’s first clean hydrogen hubs, and the U.S. Department of Energy further allocated $750 million for 52 projects across 24 states to dramatically reduce the cost of clean hydrogen and establish American leadership in the industry. The potential economic impact from the IRA legislation is substantial: thyssenkrupp nucera expects the IRA to double or triple the U.S. green hydrogen market size.

“The IRA was a wake-up call for Europe, setting a benchmark for all the other countries on how to support the green hydrogen industry in this startup phase,” says Noeres. Germany’s H2Global scheme was one of the first European efforts to facilitate hydrogen imports with the help of subsidies, and it has since been followed up by the European Hydrogen Bank, which provided €720 million for green hydrogen projects in its pilot auction. “However, more investment is needed to push the green hydrogen industry forward,” says Noeres.

In the current green hydrogen market, China has installed more renewable power than any other country. With lower capital expenditure costs, China produces 40% of the world’s electrolyzers. Additionally, state-owned firms have pledged to build an extensive 6,000-kilometer network of pipelines for green hydrogen transportation by 2050.

Coordinated investment and supportive policies are crucial to creating incentives that can bring green hydrogen from a niche technology to a scalable global solution. The Chinese green hydrogen market, along with those of other regions such as the Middle East and North Africa, has advanced significantly, garnering global attention for its competitive edge through large-scale projects. To compete effectively, the EU must create a global level playing field for European technologies through attractive investment incentives. Supportive policies must also ensure that green products made with hydrogen, such as steel, are sufficiently incentivized and protected against carbon leakage.

A comprehensive strategy, combining investment incentives, open markets, and protection against market distortions and carbon leakage, is crucial for the EU and other countries to remain competitive in the rapidly evolving global green hydrogen market and achieve a decarbonized energy future. “To advance several gigawatt-scale or multi-hundred-megawatt projects,” says Noeres, “we need significantly more volume globally and comparable funding opportunities to make a real impact on global supply chains.”

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

The data practitioner for the AI era

The rise of generative AI, coupled with the rapid adoption and democratization of AI across industries this decade, has emphasized the singular importance of data. Managing data effectively has become critical to this era of business—making data practitioners, including data engineers, analytics engineers, and ML engineers, key figures in the data and AI revolution.

Organizations that fail to use their own data will fall behind competitors that do and miss out on opportunities to uncover new value for themselves and their customers. As the quantity and complexity of data grow, so do its challenges, forcing organizations to adopt new data tools and infrastructure that, in turn, change the roles and mandate of the technology workforce.

Data practitioners are among those whose roles are experiencing the most significant change as organizations expand these practitioners’ responsibilities. Rather than working in a siloed data team, data engineers are now developing platforms and tools whose design improves data visibility and transparency for employees across the organization, including analytics engineers, data scientists, data analysts, machine learning engineers, and business stakeholders.

This report explores, through a series of interviews with expert data practitioners, key shifts in data engineering, the evolving skill set required of data practitioners, options for data infrastructure and tooling to support AI, and data challenges and opportunities emerging in parallel with generative AI. The report’s key findings include the following:

  • The foundational importance of data is creating new demands on data practitioners. As the rise of AI demonstrates the business importance of data more clearly than ever, data practitioners are encountering new data challenges, increasing data complexity, evolving team structures, and emerging tools and technologies—as well as establishing newfound organizational importance.
  • Data practitioners are getting closer to the business, and the business closer to the data. The pressure to create value from data has led executives to invest more substantially in data-related functions. Data practitioners are being asked to expand their knowledge of the business, engage more deeply with business units, and support the use of data in the organization, while functional teams are finding they require their own internal data expertise to leverage their data.
  • The data and AI strategy has become a key part of the business strategy. Business leaders need to invest in their data and AI strategy—including making important decisions about the data team’s organizational structure, data platform and architecture, and data governance—because every business’s key differentiator will increasingly be its data.
  • Data practitioners will shape how generative AI is deployed in the enterprise. The key considerations for generative AI deployment—producing high-quality results, preventing bias and hallucinations, establishing governance, designing data workflows, ensuring regulatory compliance—are the province of data practitioners, giving them outsize influence on how this powerful technology will be put to work.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

AI-readiness for C-suite leaders

Generative AI, like predictive AI before it, has rightly seized the attention of business executives. The technology has the potential to add trillions of dollars to annual global economic activity, and its adoption for business applications is expected to improve the top or bottom lines—or both—at many organizations.

While generative AI offers an impressive and powerful new set of capabilities, its business value is not a given. Some powerful foundation models are open to public use, but these alone do not serve as a differentiator for organizations looking to get ahead of the competition and unlock AI’s full potential. To gain those advantages, organizations must enhance AI models with their own data to create unique business insights and opportunities.

Preparing an organization’s data for AI, however, presents a new set of challenges and opportunities. This MIT Technology Review Insights survey report investigates whether companies’ data foundations are ready to garner benefits from generative AI, as well as the challenges of building the necessary data infrastructure for this technology. In doing so, it draws on insights from a survey of 300 C-suite executives and senior technology leaders, as well as on in-depth interviews with four leading experts.

Its key findings include the following:

Data integration is the leading priority for AI readiness. In our survey, 82% of C-suite and other senior executives agree that “scaling AI or generative AI use cases to create business value is a top priority for our organization.” The number-one challenge in achieving that AI readiness, survey respondents say, is data integration and pipelines (45%). Asked about challenging aspects of data integration, respondents named four: managing data volume, moving data from on-premises to the cloud, enabling real-time access, and managing changes to data.

Executives are laser-focused on data management challenges—and lasting solutions. Among survey respondents, 83% say that their “organization has identified numerous sources of data that we must bring together in order to enable our AI initiatives.” Though data-dependent technologies of recent decades drove data integration and aggregation programs, these were typically tailored to specific use cases. Now, however, companies are looking for something more scalable and use-case agnostic: 82% of respondents are prioritizing solutions “that will continue to work in the future, regardless of other changes to our data strategy and partners.”

Data governance and security is a top concern for regulated sectors. Data governance and security concerns are the second most common data readiness challenge (cited by 44% of respondents). Respondents from highly regulated sectors were two to three times more likely to cite data governance and security as a concern, and chief data officers (CDOs) say this is a challenge at twice the rate of their C-suite peers. And our experts agree: Data governance and security should be addressed from the beginning of any AI strategy to ensure data is used and accessed properly.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Download the full report.

Industry- and AI-focused cloud transformation

For years, cloud technology has demonstrated its ability to cut costs, improve efficiencies, and boost productivity. But today’s organizations are looking to cloud for more than simply operational gains. Faced with an ever-evolving regulatory landscape, a complex business environment, and rapid technological change, organizations are increasingly recognizing cloud’s potential to catalyze business transformation.

Cloud can transform business by making it ready for AI and other emerging technologies. The global consultancy McKinsey projects that a staggering $3 trillion in value could be created by cloud transformations by 2030. Key value drivers range from innovation-driven growth to accelerated product development.

“As applications move to the cloud, more and more opportunities are getting unlocked,” says Vinod Mamtani, vice president and general manager of generative AI services for Oracle Cloud Infrastructure. “For example, the applications of AI and generative AI are transforming businesses in deep ways.”

No longer simply a software and infrastructure upgrade, cloud is now a powerful technology capable of accelerating innovation, improving agility, and supporting emerging tools. In order to capitalize on cloud’s competitive advantages, however, businesses must ask for more from their cloud transformations.

Every business operates in its own context, and so a strong cloud solution should have built-in support for industry-specific best practices. And because emerging technology increasingly drives all businesses, an effective cloud platform must be ready for AI and the immense impacts it will have on the way organizations operate and employees work.

An industry-specific approach

The imperative for cloud transformation is evident: In today’s fast-paced business environment, cloud can help organizations enhance innovation, scalability, agility, and speed while simultaneously alleviating the burden on time-strapped IT teams. Yet most organizations have not fully made the leap to cloud. McKinsey, for example, reports a broad mismatch between leading companies’ cloud aspirations and realities—though nearly all organizations say they aspire to run the majority of their applications in the cloud within the decade, the average organization has currently relocated only 15–20% of them.

Cloud solutions that take an industry-specific approach can help companies meet their business needs more easily, making cloud adoption faster, smoother, and more immediately useful. “Cloud requirements can vary significantly across vertical industries due to differences in compliance requirements, data sensitivity, scalability, and specific business objectives,” says Deviprasad Rambhatla, senior vice president and sector head of retail services and transportation at Wipro.

Health-care organizations, for instance, need to manage sensitive patient data while complying with strict regulations such as HIPAA. As a result, cloud solutions for that industry must ensure features such as high availability, disaster recovery capabilities, and continuous access to critical patient information.

Retailers, on the other hand, are more likely to experience seasonal business fluctuations, requiring cloud solutions that allow for greater flexibility. “Cloud solutions allow retailers to scale infrastructure on an up-and-down basis,” says Rambhatla. “Moreover, they’re able to do it on demand, ensuring optimal performance and cost efficiency.”

Cloud-based applications can also be tailored to meet the precise requirements of a particular industry. For retailers, these might include analytics tools that ingest vast volumes of data and generate insights that help the business better understand consumer behavior and anticipate market trends.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Unlocking the trillion-dollar potential of generative AI

Generative AI is poised to unlock trillions in annual economic value across industries. This rapidly evolving field is changing the way we approach everything from content creation to software development, promising never-before-seen efficiency and productivity gains.

In this session, experts from Amazon Web Services (AWS) and QuantumBlack, AI by McKinsey, discuss the drivers fueling the massive potential impact of generative AI. Plus, they look at key industries set to capture the largest share of this value and practical strategies for effectively upskilling their workforces to take advantage of these productivity gains. 

Watch this session to:

  • Explore generative AI’s economic impact
  • Understand workforce upskilling needs
  • Integrate generative AI responsibly
  • Establish an AI-ready business model

Learn how to seamlessly integrate generative AI into your organization’s workflows while fostering a skilled and adaptable workforce. Register now to learn how to unlock the trillion-dollar potential of generative AI.

Register here for free.

Optimizing the supply chain with a data lakehouse

When a commercial ship travels from the port of Ras Tanura in Saudi Arabia to Tokyo Bay, it’s not only carrying cargo; it’s also transporting millions of data points across a wide array of partners and complex technology systems.

Consider, for example, Maersk. The global shipping container and logistics company has more than 100,000 employees and offices in 120 countries, and it operates about 800 container ships, each able to hold 18,000 tractor-trailer containers. From manufacture to delivery, the items within these containers carry hundreds or thousands of data points, highlighting the amount of supply chain data organizations manage on a daily basis.

Until recently, access to the bulk of an organization’s supply chain data has been limited to specialists, with the data itself distributed across myriad systems. Constrained by traditional data warehouses, organizations must commit considerable engineering effort, heavy oversight, and substantial financial resources to maintain that data. Today, a huge amount of data—generated by an increasingly digital supply chain—languishes in data lakes without ever being made available to the business.

A 2023 Boston Consulting Group survey notes that 56% of managers say that although investment in modernizing data architectures continues, managing data operating costs remains a major pain point. The consultancy also expects data-deluge issues to worsen as the volume of data generated grows at a rate of 21% annually from 2021 to 2024, to 149 zettabytes globally.

“Data is everywhere,” says Mark Sear, director of AI, data, and integration at Maersk. “Just consider the life of a product and what goes into transporting a computer mouse from China to the United Kingdom. You have to work out how you get it from the factory to the port, the port to the next port, the port to the warehouse, and the warehouse to the consumer. There are vast amounts of data points throughout that journey.”

Sear says organizations that manage to integrate these rich sets of data are poised to reap valuable business benefits. “Every single data point is an opportunity for improvement—to improve profitability, knowledge, our ability to price correctly, our ability to staff correctly, and to satisfy the customer,” he says.

Organizations like Maersk are increasingly turning to a data lakehouse architecture. By combining the cost-effective scale of a data lake with the capability and performance of a data warehouse, a data lakehouse promises to help companies unify disparate supply chain data (structured, semi-structured, and unstructured) and open access to a far larger group of users. Building analytics on top of the lakehouse advances supply chain efficiency through better performance and governance, supports easy and immediate data analysis, and helps reduce operational costs.
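
To make this concrete, here is a minimal sketch of what lakehouse-style analytics might look like, assuming a Spark session configured with Delta Lake; the table path and column names are hypothetical, not Maersk’s actual schema.

```python
from pyspark.sql import SparkSession, functions as F

# Assumes a Spark environment with the Delta Lake extension configured.
spark = SparkSession.builder.appName("supply-chain-lakehouse").getOrCreate()

# One governed copy of the data, open to analysts as well as specialists.
port_calls = spark.read.format("delta").load("/lakehouse/shipping/port_calls")

# Example analysis: average dwell time per route and port, to flag the
# bottlenecks hiding in those millions of data points.
dwell = (
    port_calls
    .groupBy("route_id", "port")
    .agg(F.avg("dwell_hours").alias("avg_dwell_hours"))
    .orderBy(F.desc("avg_dwell_hours"))
)
dwell.show(10)
```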

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.