Inside the quest for unbreakable encryption

When we check email, log in to our bank accounts, or exchange messages on Signal, our passwords and credentials are protected through encryption, a locking scheme that uses secrets to disguise our data. It works like a cyber padlock: with the right key someone can unlock the data. Without it, they’ll have to resort to laborious brute-force methods, the digital equivalent of hacksaws and blowtorches.

Our trust in online security is rooted in mathematics. Encryption schemes are built on families of math problems called one-way functions—calculations that are easy to carry out in one direction but almost impossible to solve efficiently from the other, even with a powerful computer. They’re sort of a computational equivalent of those road spikes found at the exits of airport car rental agencies. Drive in one direction and you barely notice. Hit reverse and you won’t get far (and will need new tires).

There’s a problem, however. Although mathematicians suspect true one-way functions exist, they have yet to prove it. They haven’t proved that the thorny problems we do use are impossible, or even extremely impractical, to solve. Instead, it could just be that we haven’t yet found the appropriate mathematical means to take the problems apart. This conundrum haunts all encryption. Our data is secured by the fact that no one knows how to crack the schemes that protect it—at least not yet.

It’s not just today’s hackers we may need to worry about. Security experts have long warned of a threat that hasn’t yet materialized: quantum computers. In the future these machines could execute a program that quickly solves the math problems behind today’s state-of-the-art encryption. That threat puts personal financial, medical, and other information at risk. Hackers could steal today’s encrypted data and store it away, just waiting for the arrival of new technological lockpicks.

Computer scientists, mathematicians, and cryptographers are on a quest to find new encryption algorithms that can withstand attacks not only from today’s conventional computers but also from tomorrow’s quantum machines. What they want is a big, sticky math problem—something that’s robust enough to withstand attacks from classical and quantum computers but can still be easily implemented in cyberspace. 

Unfortunately, no one has yet found a single type of problem that is provably hard for computers—classical or quantum—to solve. (In the world of cryptography, “hard” describes a problem whose solution requires an unreasonable number of steps or amount of computing power.) If one-way functions don’t exist, then cryptographers’ whack-a-mole process of finding flaws and developing ever stronger schemes to block clever hackers will persist indefinitely. 

“The question of whether one-way functions exist is really the most important problem,” says Rafael Pass, a theoretical computer scientist at Tel Aviv University in Israel. It’s a conundrum that dates to the 1970s and the dawn of a research area now known as computational complexity theory. Over five decades, theorists and cryptographers have been looking for ways to establish whether such functions do exist. Perhaps the problems we hope or suspect are one-way are just easier, breakable ones in disguise. 

Pass is exploring how one-way functions are connected to a raft of other open problems, a promising line of research that has drawn other theorists into the quest. At the same time, people focused on the practical side of cryptography are plowing ahead, hunting for new schemes that are—if not provably hard—seemingly strong enough to hold up against quantum computers. 

For the last seven years, the job of finding the best candidates has been spearheaded by the National Institute of Standards and Technology (NIST), the US government body charged with collecting, testing, and standardizing cryptographic algorithms for public use. NIST has been running dozens of potential “post-­quantum” algorithms through a gauntlet of tests and making them available for outside testing. The process has winnowed the field to a few finalists, and in August NIST announced that one called CRYSTALS-Kyber, which takes an approach believed to be robust enough to counter quantum attacks, will be the first to be officially recommended for public use by 2024. After that, companies and governments will adopt the algorithm for encrypting data. 

Will it hold up? The answer will help determine the trajectory of cybersecurity in the near term. But it’s far from settled: history suggests that our faith in unbreakability has often been misplaced, and over the years, seemingly impenetrable encryption candidates have fallen to surprisingly simple attacks. Computer scientists find themselves at a curious crossroads, unsure of whether post-quantum algorithms are truly unassailable—or just believed to be so. It’s a distinction at the heart of modern encryption security. 

The myth and reality of unbreakability

Securing secret messages hasn’t always been tied to difficult math problems; until recently, cryptography was barely mathematical at all. In ancient Greece, military leaders encoded messages using a scytale, a cylindrical device that revealed a hidden message when a strip of seemingly jumbled text was wound around it. Centuries later, Roman historians described a code, often attributed to Julius Caesar, that involved shifting letters in a message three spots back in the alphabet; for example, a d would be written as an a.
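For the curious, a Caesar-style shift is simple enough to sketch in a few lines of Python; the three-letter shift comes from the account above, while the sample phrase is just an illustration:

```python
# A toy sketch of the Caesar-style shift described above: every letter moves
# three places back in the alphabet, so a "d" is written as an "a".

def shift_letters(text, shift):
    out = []
    for ch in text.lower():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord("a") + shift) % 26 + ord("a")))
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return "".join(out)

encoded = shift_letters("attack at dawn", -3)   # "xqqxzh xq axtk"
decoded = shift_letters(encoded, 3)
assert decoded == "attack at dawn"

# With only 25 possible shifts, this cipher falls to trivial brute force --
# nothing like the hard math problems modern encryption relies on.
```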

In history as in our modern world, secret codes were frequently broken. In the 16th century, during the decades she spent imprisoned by her cousin Queen Elizabeth I, Mary, Queen of Scots, used elaborate, symbol-based ciphers to encode hundreds of letters, most of which were aimed at securing her freedom and regaining the throne. She didn’t prevail: Elizabeth I’s team of spies and codebreakers intercepted, decoded, and copied the letters. In the one that sealed her fate, Mary approved of a plan to assassinate Elizabeth with six words: “sett the six gentlemen to woork.” In response, Elizabeth eventually ordered her cousin beheaded in 1587.

In 1932, codebreakers in Poland cracked the code for Germany’s early Enigma machine, invented at the end of World War I. They later shared their intel with British codebreakers, who cracked a more advanced version of Enigma during World War II.

Pass, the theoretical computer scientist in Tel Aviv, half-jokingly refers to all time before the 1970s as the “dark age of cryptography.”

“Cryptography wasn’t really a scientific field,” he says. “It was more like artist versus attackers. You needed to have [artistic] skills to invent an encryption scheme. And then it would get deployed until some clever person would figure out how to break it. And it was just going on and on like that.” 

That changed, Pass says, in November 1976, when cryptographers Whitfield Diffie and Martin Hellman, at Stanford, described a novel way for two people to devise a key that only they knew—one they could then use to pass secret messages. Crucially, they wouldn’t have to meet to do it. This was a groundbreaking notion. Previously, both sender and receiver had to physically possess a key for encoding and decoding. To decrypt a message encoded with the Enigma machine, for example, a recipient needed a key sheet that revealed the initial encryption settings.

The secret to the Diffie-Hellman strategy was for two people to build the key using a straightforward mathematical problem that’s easy to compute in one direction and laborious in the other. Here’s how it works: The two people who want to communicate secretly, usually designated Alice and Bob in these setups, each pick a secret number. Then, together, they agree on a pair of numbers that they share publicly (one is a big prime, and the other is called the base). Each of them next carries out a series of mathematical operations to combine those private numbers with the prime and the base. 

Then they exchange the results, and they each carry out another series of mathematical operations on the new numbers. In the end, both Alice and Bob will have done the same operations on the same numbers—just not in the same order—and arrived at the same answer. The digits of that answer become the shared encryption key. And an eavesdropper who intercepts the transmission—often nicknamed Eve—won’t be able to easily unravel the mathematical jumble without knowing at least one of the private numbers. She could start testing numbers in a brute-force approach, but that would require an unreasonable amount of calculation.

The complicated problem that Eve would have to solve is called finding a discrete logarithm. The Diffie-Hellman approach is still used today—to secure some VPNs, for example—and is integral to some post-quantum schemes.
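To make the exchange concrete, here is a minimal Python sketch with deliberately tiny numbers. Real deployments use primes hundreds of digits long, so these values illustrate the mechanics but offer no security:

```python
# A toy Diffie-Hellman key agreement. The public prime and base, and each
# party's secret number, are made-up values chosen only for illustration.

p = 23          # the public prime (the "modulus")
g = 5           # the public base

alice_secret = 6    # Alice's private number
bob_secret = 15     # Bob's private number

# Each party combines their secret with the public values and shares the result.
alice_public = pow(g, alice_secret, p)   # 5^6 mod 23 = 8
bob_public = pow(g, bob_secret, p)       # 5^15 mod 23 = 19

# Each raises the other's result to their own secret. Both land on the same number.
alice_key = pow(bob_public, alice_secret, p)   # 19^6 mod 23
bob_key = pow(alice_public, bob_secret, p)     # 8^15 mod 23

assert alice_key == bob_key == 2   # the shared secret key

# Eve sees p, g, 8, and 19. Recovering either secret exponent from those
# values is the discrete logarithm problem described above.
```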

In their paper, Diffie and Hellman noted that there was no existing algorithm capable of solving the discrete log problem in a reasonable amount of time. There still isn’t. They went on to introduce, for the first time, the notion of one-way functions as a basis for secure cryptography.

Today, secure online interactions that involve authentication or digital signatures, for example, are based on that general idea. But without mathematical proof that the problems they rely on are one-way functions, the possibility remains that someone might discover an efficient scheme for cracking them. 

The quantum menace

Today, online transactions begin with a kind of digital handshake, and the security of that handshake is often guaranteed by another math problem that’s presumed to be difficult. The most popular encryption scheme used today was introduced in 1977 by a trio of young computer scientists who were energized by Diffie and Hellman’s 1976 paper. They called their approach RSA, after the last names of the scientists (Ron Rivest, Adi Shamir, and Leonard Adleman). 

RSA, which is based on the difficulty of finding prime factors relative to the ease of multiplying them together, is a bit different from the Diffie-Hellman approach. Diffie-Hellman is a shared secret: it allows two users to devise a key over an insecure channel (like the internet), and that key is used to disguise messages. In RSA, Alice uses Bob’s key—based on big prime numbers—to encrypt a message that only he can unlock. RSA can secure data sent from one person to another.  

It quickly became one of the most popular public-key encryption methods. It’s easy to use and adapt. Over time, as new algorithms have emerged that can factor faster, and computers have become more powerful, NIST has recommended using larger and larger numbers for security. The numbers are represented in binary form with 1s and 0s, and these binary digits are better known as “bits.” The number 13, for example, is written in binary as 1101, which has four bits. NIST currently recommends using a key represented by at least 2,048 bits—which corresponds to a number with over 600 digits. (To date, the largest number that has been factored into two primes was made up of 250 digits, and the process took nearly 3,000 hours of computing time.) That’s a strength of RSA—even if it’s not uncrackable, it’s been easy to keep upping the ante, making it computationally impractical to break. 
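A quick back-of-the-envelope check of those figures in Python, plus a toy illustration of the multiply-versus-factor asymmetry (the primes below are tiny and chosen purely for illustration):

```python
import math

# 13 in binary is 1101: four bits.
assert bin(13) == "0b1101"

# A 2,048-bit key corresponds to a decimal number of roughly 617 digits.
print(math.floor(2048 * math.log10(2)) + 1)   # 617

# Multiplying two primes is instant...
p, q = 104723, 104729        # small primes, for illustration only
n = p * q

# ...while recovering them by trial division already takes far more steps,
# and the search becomes hopeless once the primes have hundreds of digits.
def trial_factor(n):
    f = 3
    while f * f <= n:       # n is odd here, so checking odd divisors suffices
        if n % f == 0:
            return f, n // f
        f += 2
    return None

print(trial_factor(n))   # (104723, 104729)
```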

In 1994, however, a threat of a different type emerged when the American mathematician Peter Shor, then at Bell Labs, devised an algorithm for quantum computers that could solve the factoring problem in a reasonable amount of time. (It was a double threat: his approach could also conquer the discrete log problem in the Diffie-Hellman approach.) 

Shor’s paper ignited excitement and anxiety among those who wanted to build quantum computers and those who recognized the threat it posed to cybersecurity. Fortunately for cryptographers, not just any quantum computer would do. 

A few years back, researchers at Google and the KTH Royal Institute of Technology, in Sweden, estimated that it would take a quantum computer composed of 20 million quantum bits, or qubits, some eight hours to break today’s 2,048-bit RSA security. Current state-of-the-art machines are nowhere close to that size: the largest quantum computer to date, built by IBM, debuted last year with 433 qubits.

Whether or not RSA can be considered at immediate risk of a quantum attack depends largely on whom you ask, says computer scientist Ted Shorter, who cofounded the cybersecurity company Keyfactor. He sees a cultural divide between the theorists who study the mathematics of encryption and the cryptographers who work in implementation.

To some, the end seems nigh. “You talk to a theoretical computer scientist and they’re like, Yes, RSA is done, because they can imagine it,” Shorter says. For them, he adds, the existence of Shor’s algorithm points to the end of encryption as we know it. 

Many cryptographers who are implementing real-world security systems are less concerned about the quantum future than they are about today’s cleverest hackers. After all, people have been trying to factor efficiently for thousands of years, and now the only known method requires a computer that doesn’t exist. 

Thomas Decru, a cryptographer at KU Leuven in Belgium, says the quantum threat must be taken seriously, but it’s hard to know if RSA will fall to quantum computers in five years or longer—or never. “As long as quantum computers do not exist, everything you say about them is speculative, in a way,” he says. Pass is more certain about the threat: “It’s safe to say that the existence of this quantum algorithm means there are cracks in the problem, right?” 

The thorns of implementation

But we have to be ready for anything, says Lily Chen, a mathematician who manages NIST’s Cryptographic Technology Group and works on the ongoing effort to produce post-quantum encryption standards. Whether they arrive in three years or 30, quantum computers loom on the horizon, and RSA, Diffie-Hellman, and other encryption schemes may be left vulnerable. 

Finding a quantum-resistant cryptographic scheme isn’t easy. Without a mathematical problem that is computationally hard, the last three decades of cybersecurity have played out like an increasingly intricate game, with researchers perpetually building and breaking—or attempting to break—new candidates. 

This push and pull has already emerged in the NIST post-quantum program. In February 2022, cryptographers found a fatal flaw in Rainbow, an algorithm that had survived three rounds of NIST’s analysis. A few months later, after the NIST list had been winnowed again, Decru and his KU Leuven colleague Wouter Castryck announced that they’d broken another finalist, an algorithm called SIKE. 

SIKE, which was developed a few years ago, was the brainchild of a collaboration among researchers and engineers at Amazon, Microsoft, the University of Versailles, and elsewhere. It is based on a special mathematical map, called an isogeny, that is made up of connections between elliptic curves. These maps can be turned into an encryption for communication, and outsiders can’t eavesdrop without knowing the maps.

At Leuven, Decru and Castryck devise ways to use these so-called isogenies to build new, faster encryption approaches. They broke the most difficult version of SIKE in just a few hours of computing time using an ordinary desktop computer. (Since then, other groups have found ways to do it even faster.) What’s more, Decru and Castryck did it almost accidentally, and only a few weeks after SIKE had been declared an alternate NIST finalist. “We weren’t trying to break it at all,” insists Decru. “We just tried to generalize it.” 

Chen says the case of SIKE—and Rainbow before it—illustrates a real-world tension that drives efforts to find quantum-proof algorithms. On one hand, she says, “you have to find a problem which is hard for both quantum computers and classical computers.” On the other is implementation: transforming that hard problem into one that can be used in a real-world cryptographic system. Even with today’s well-defined problems, Shorter says, it’s very difficult to predict and prevent every loophole in every operating system and device on the market today. “And then there’s interoperability testing and certifications and other tests,” he says, “to make sure they are not only implemented correctly, but also securely.”  

The mathematical problem SIKE is based on seems computationally hard because there are so many different maps that could be constructed between curves. It may even be a one-way problem—and therefore quantum-proof. The flaw was in the design, which revealed too much of the transmitted information. Decru and Castryck cracked it because they inadvertently found a way to expose enough connecting points to give away the entire thing. 

Other schemes have fared better. The first post-quantum encryption algorithm to be standardized, CRYSTALS-Kyber, delivers security through an approach that involves problems on lattices, mathematical objects that can be modeled as arrays of points. (There are five main families of post-quantum cryptographic methods. Isogeny and lattice approaches are two of them.)
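To get a rough feel for what a lattice problem looks like, here is a heavily simplified "learning with errors" sketch in Python. Kyber itself uses a structured, module-lattice variant with carefully chosen parameters, so this toy version only captures the flavor of the hardness assumption:

```python
import numpy as np

q, n = 97, 8                      # tiny modulus and dimension, for illustration
rng = np.random.default_rng(seed=1)

s = rng.integers(0, q, size=n)           # the secret vector
A = rng.integers(0, q, size=(n, n))      # public random matrix
e = rng.integers(-2, 3, size=n)          # small random "noise"

b = (A @ s + e) % q                      # published alongside A

# Without the noise e, solving A @ s = b (mod q) for s is routine linear algebra.
# With the noise, recovering s becomes a lattice problem that is believed to be
# hard for classical and quantum computers alike.
```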

CRYSTALS-Kyber is a general encryption scheme, like RSA, that can be used for tasks like securing online communication. Three other approved algorithms are designed to authenticate digital signatures, ensuring that digital documents haven’t been fraudulently signed. NIST plans to standardize these by spring 2024. Another three (it was four until SIKE was broken) could also be standardized in the next few years, as long as they survive further rounds of scrutiny.

But unless mathematicians can prove whether one-way functions exist, says Pass, the patterns that have always characterized cryptography will continue. “We’re back to this cat-and-mouse game, where it’s a game between algorithm designers proposing new candidate constructions and other designers trying to break them,” he says. Unless, of course, he—or someone in his field—can come up with an implementable, provably one-way function to settle the matter of encryption forever. 

Until that time, cryptographers will remain in a messy limbo in which convincingly robust encryption schemes can be trusted—but only until they can’t. 

The perfect math problem could take us out of this limbo, but it can’t be some sticky mess cooked up by an armchair algebraist over a long weekend. It must strike a balance between math and cryptography, with computational hardness on one side and easy implementation on the other. Stray too far from either of those properties, and it becomes vulnerable—if not now, then in the future. Hanging in the balance is the past, present, and future security of everyone’s data, everywhere. No pressure. 

Stephen Ornes is a science writer based in Nashville.

Decarbonizing your data strategy

Posting just a six-second video on social media uses the same amount of power as boiling 22 gallons of water. This staggering statistic encapsulates just how intertwined data management is with sustainability. And as companies look to become data-driven and to gain insights from vast data streams, it’s also crucial to keep an eye on the environmental cost of those efforts.

The road toward decarbonization is daunting, especially while trying to keep pace with innovation. According to Ian Clatworthy, director of data platform product marketing at Hitachi Vantara, companies need to take the initiative on sustainability measures and just start somewhere.

“When we look at integrating the carbonization goals into budgets, the agenda is crucial. Start where you can actually have some impact,” says Clatworthy.

Making data hardware and infrastructure more sustainable begins with understanding the value of what you’re storing, Clatworthy says. From there, companies can invest in the most data- and energy-efficient servers, storage devices, and networking equipment.

“Look at data flows, adopt energy-efficient technologies, and that’s going to really align your data processing capabilities with those goals,” says Clatworthy.

Although many companies have made strict commitments to become emissions-free internally and within their own operations, decarbonizing entirely throughout a supply chain is exceedingly challenging. Clatworthy says that it comes down to transparency. A company can be cognizant of the emissions released in its own operations, but an outside manufacturer or supplier farther down the supply chain may not be as forthcoming about the scope of its footprint.

Data storage technology is always evolving, but the technology needs to be right for your company, he says. The shiniest or newest tool may be faster or more efficient, but it’s important to keep in mind its energy consumption and impact on emissions.

“You need to adopt a multifaceted approach that combines energy-efficient infrastructure, renewable energy sourcing, optimize the data management practices, but commit to that transparency and sustainability reporting,” says Clatworthy. “The environmental concerns will continue to grow, and these trends will play a critical role in shaping the future of data management.”

This episode of Business Lab is produced in partnership with Hitachi Vantara.

Related resources

Making sustainability achievable with data

Hitachi Vantara’s CO2 estimator

Full transcript

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma, and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Our topic today is embracing sustainability initiatives in data storage and management, as well as throughout the supply chain. Enterprises need to act now to realize decarbonization goals and meet environmental, social, and corporate governance deadlines.

Two words for you: carbon reduction.

My guest is Ian Clatworthy, who is the director of data platform product marketing at Hitachi Vantara.

This podcast is produced in partnership with Hitachi Vantara.

Welcome, Ian.

Ian Clatworthy: Hey, Laurel, thank you so much for having me.

Laurel: Well, great to have you, and I think we’ll just dive right in. So most enterprises are prioritizing sustainability and reducing their overall carbon footprint as climate changes persist globally. Data management and storage is a critical factor in an enterprise’s carbon emissions. So could you describe the state of data management right now? Is there a prevailing understanding of how to make data centers more energy efficient during this transitional period, or are the methods evolving?

Ian: Great question, Laurel. And many enterprises have indeed been prioritizing sustainability and looking to reduce their carbon footprint. We know now our impact on the environment is much higher than we ever thought it was. We know that data centers contribute more CO2 than the airline industry. That’s massive. So there is this effort to try and consolidate what we’re doing and reduce our carbon emissions. So, therefore, as data centers, we really need to understand the circular economics of the products we’re putting in there for data management. And that really starts at how I’m really falling into the scope one, two, and three emissions. Scope one, stuff I can control, stuff I’m producing. “Am I burning gas to heat my offices? Are my sales team using electric cars or petrol cars?” That’s scope one. Scope two being much more of the “where am I buying my energy from? Is that renewable? Or where’s my data center located?” Data center locality has a huge impact on carbon footprint. A great example of that is what’s the greenest state in the US to hold your data, do you know, Laurel?

Laurel: I don’t, what is it?

Ian: It’s Texas and Austin specifically. They have the highest renewables rate in the United States. So therefore my carbon footprint can be lowered massively by putting my data in Texas. So there are elements like that that really, from Scope 2, can make a difference. But the key one is Scope 3. This is your other emissions. And when we get to that circular economics, it’s well, actually, who’s the vendor you can trust to provide you true transparency on carbon footprint? So I’m not talking about just running it in your data center. I’m talking, well, where have I sourced my metals from? How have I produced it? Have I developed my software in a way that’s sufficient and carbon neutral? How have I shipped it to you? How am I going to recycle the product? So actually I think we very quickly, and a lot of customers get into, is the need to think about advanced cooling techniques or more monitoring and management.

That’s purely the efficiencies of running the box. Absolutely, it needs to happen. But the challenge coming in, and this is where regulations are going to force our hand in some ways, is that they’re going to say, well, you have to have a carbon reduction throughout the supply chain. And from suppliers, like us as Hitachi Vantara, we have to declare what’s the carbon footprint of my product? And a great example of that is we’ve shifted. We do manufacture absolutely in Japan, but again, how do we shift software development, are we doing it closer to the API? Are we doing it closer to the box? Is it efficient? All those things really matter. So these methods and technologies are evolving to meet these challenges. But I think what the enterprises need to do is really open their mind as to what data they’re storing, how they’re storing it, but also where their suppliers are providing it, where they’re storing their data. Data locality matters, but at the end of the day, you’ve got to understand that data, so important.

Laurel: That’s really interesting that the tech industry, well, through data centers are contributing more to greenhouse emissions than the airline industry, which that’s astonishing. So how should leaders integrate decarbonization goals into their budgets and agendas? I mean, you mentioned a bit about data locality, but are there proactive steps they can start thinking about?

Ian: Absolutely. This is also quite overwhelming, and I speak to customers a lot on this. They’re like, where do I start? This isn’t a competitive thing. And we see this a little bit in our industry as like, oh, well, who is more greener than who? It doesn’t matter. We’re doing this as a collective. You’ve got to start somewhere. So making a step, be that as small as it is, is a massive thing, and it shouldn’t be underestimated. When we look at integrating the carbonization goals into budgets, the agenda is crucial. Start where you can actually have some impact. Don’t try and bite off more than you can chew as the saying says. Set some targets. Say, “Well, look, I’m just going to understand what my CO2 footprint is.” First of all, “what am I consuming day-to-day? How much renewable energy sources am I using?” And set some targets around that based upon where you can get to.

Allocate some resources to it. Allow that from financials, human resources, sustainability projects and initiatives across your organizations. This isn’t just IT. Invest in energy efficient technologies. Go and ask your vendors to help you. As I say, it’s not a competitive thing, come and ask people like us and say, “Well, what would you recommend? Where would I start? What’s a good thing?” We’re not here to cram things in your face, we’re here to help you make those steps. And to your point around the airline industry, there are so many stats that are just eye watering that really kind of change thinking. And one that really brings to mind for me is a six second social media video uses the same amount of power as you would use to boil 22 gallons of water, a six second video.

And think about it, that’s because I’m powering the phone, I’m powering the cell mass, I’m powering the data centers, I’m replicating the data centers, presenting that back out. That’s exactly what we’re doing. So this idea of… just start somewhere. Before we get to regulation, and that will happen. That’s a given. But have regular reviews, look at what you’re doing, take a step, understand what you’re doing today, and make it a part of your agenda, of your business’s agenda. Because you know what? I say this isn’t competitive, but reality is, you need to be competitively sustainable to exist in your industry. Customers will choose someone different. That’s why this is really important.

Laurel: To sort of put this into a real stake, how does a company like Hitachi Vantara measure its own consumption of power and emissions? And what’s the company’s approach to reducing its own carbon footprint?

Ian: It’s constantly evolving. It is absolutely. The first thing is constant audits. We have a team dedicated to looking at sustainability and to be key here, Hitachi Vantara is a wholly-owned subsidiary of Hitachi Limited. And Hitachi Limited, our executives are entirely incentivized on our green capabilities. That’s absolutely key. This idea of executives making bonuses because they’ve sold stuff, no. Hitachi executives are bonused if we meet our green goals. So there’s a complete mindshift in us as a company. So that changes everything. The first thing is when we say we want to do energy audits and we’re looking at emissions inventories and data center monitoring, this is key to who we are as a company, and that’s really important. We’ve made a massive shift change as a company, but we’re not talking about that. For customers listening to this podcast, what’s their approach and how can they make that change?

Where we’ve seen value is, look at efficiency improvements, energy efficiency improvements. Now, we’re very biased. We have our own power generation company, which not many people do, but it means that I can go, hey, where’s the best renewable energy at the minute from the grid to be able to supply us in Hitachi Vantara? Equally, we can offset that with carbon offsetting that we’re doing with Hitachi power grids, for example, where they’re investing in green technology and projects that reduce or capture emissions. There’s a lot of thought process, carbon offsetting is a very sensitive subject because a lot of the time it’s, hey, we’re going to plant a tree for everything. It’s outsourced to a third world country and there’s corruption, there’s issues, and those things never happen. There’s issues around that. But again, it’s finding the actions that you are taking, engaging your employees, and getting them involved.

We know that this thought process and the value of sustainability is key to everybody. This is for us as a collective. So get people involved, get them helping, get their ideas together, because they may see something because doing it every day that you just never would. And I think that’s really important as part of your employee resource groups, get them involved. And then longer term, look for where I can get more environmental certifications. We know that pursuing environmental certifications, such as ISO 14001… can we demonstrate regularly that we have this commitment to having our solutions and products certified externally by a third party, it just validates the efforts that we’re doing. And continuous improvement, just make sure you have a cadence of improvement and adjust the strategies accordingly.

Laurel: So as you mentioned, many companies have their own ESG goals and commitments to reducing emissions, but some differentiate between becoming carbon free internally with operations by a certain deadline like 2050, and then they would become carbon free throughout their entire supply chain at some other later point in time. Could you describe why this gap exists and what changes enterprises face when trying to decarbonize their supply chain?

Ian: Well, this ties back into our first comment, and I discussed that. Difference of the circular economics and Scope 3 emissions, and this really is a wide range of indirect emissions. So this isn’t power that I’m buying, this is the power that my suppliers are buying. And it’s really difficult to get transparency, and that’s why someone like Hitachi Vantara has made it our mission, that we make ourselves exceedingly transparent and try and measure that in a way that then we can pass on our customers. So the gap here is, a great example would be, I’m a vendor. I talk about sustainability in regards to how much power I’m saving you daily, but, and here is a but, I manufacture that box in China where it’s coal-fired and I’m using typically high CO2 processes to manufacture the metals and materials I’m using in my storage platform.

I’m not going to tell you that because I don’t want to tell you that. And that’s where it starts to become difficult, because I need that vendor to be open and wanting to share that information with us. By contrast, if I say by comparison, I manufacture in somewhere with more renewables in Asia and I’m manufacturing somewhere where I’m conscious of where my supply chain is in regard to my materials, we’ve done some analysis on what we do versus in Japan versus some of the others in Asia. And we can see the boxes we’re producing have 38% less CO2. Now that’s before I’ve even turned it on. If I put it side-by-side to the one that’s manufactured in that coal-fired environment for power and manufacturing, that’s 38%. The efficiencies of running the box, whatever, when you recycle it, there is still 38% less CO2, because of that Scope 3 emissions of where it’s manufactured, how I am choosing my metals.

So this is why this gap exists, because actually I’m going to be carbon free from my operations. That’s my decision. But carbon free from the supply chain means I need to choose the vendors that are going to be able to help me achieve that. And the vendors themselves need to change to do that. It’s really fascinating. I mean, push your vendors, ask them and say, “Look, I’ve got to get to this. I’ve got to get to this point. I may not have a decision from my executives today, but I need to get there. I need to know where you’re manufacturing. Where are your materials coming from? How are you shipping that? Are you using last mile electric solutions to deliver? Recyclable packaging?” All these things matter in regards to the overall carbon footprint to a product, and also getting to that carbon neutrality from the supply chain to the customers.

Laurel: Well, that transparency certainly helps when choosing a vendor to work with. What are some other kind of tangible changes that companies can invest in to make their hardware and infrastructure more sustainable and environmentally friendly?

Ian: So those sort of changes can really help reduce energy consumption. So we’re getting into the efficiencies of data storage and data management. So to lower the carbon emissions and minimize that impact of IT ops, you really need to understand the data you’ve got, first of all. So understand you could put the latest and greatest storage solution in, but actually if you’re storing stuff that you just don’t use or has no value to your company, what’s the point? You could halve what you’re putting in there and save even more. So there’s this element of understanding what you’ve got today, and understanding its value to your business. That’s really key. Once you know that, now you can say, that gives me efficient hardware. I’ve got my data efficient hardware. And also choose stuff that is energy efficient, upgrade to energy efficient servers, storage devices, networking equipment. Look for products with Energy Star ratings or carbon footprint for product ratings.

Continue that journey of virtualization and reducing overall hardware footprint in your data center. The second is cooling. A lot of the cooling we see, and certainly being from the United Kingdom, I don’t necessarily need to cool my data center as much as say someone would need to do in say, Arizona, because the ambient air is typically cooler, but there’s more we can do with liquid cooling. There’s a great article recently of an MSP [managed service provider] in the United Kingdom that took over a, I don’t know what you say in the United States, but a swimming pool, what would you call that? A leisure center?

Laurel: Yeah.

Ian: Yeah, cool. Okay. But they’re actually heating the pool with their data center so they get free cooling and they’re charging people to come in and enjoy the swimming pool. And I was like, this is genius. That’s real social engineering around carbon footprint, and I think it’s going to need more. I mean, that’s a very extreme example, but that clever energy management and temperature management is really exciting, and that essentially results in that kind of greener data center. Look, and I’ve mentioned exciting things there, but really this is all about monitoring, reporting, understanding what you’ve got, making sure that you’re getting your employees engaged. These are the key things that are going to make an impact quickly to your business.

Laurel: And those are sort of the basics that you kind of have to do first, right? Understanding what your data is and where it’s stored. But are there emerging opportunities for data storage and management technologies that can help improve efficiency?

Ian: Do you know what? There's so much going on at the minute, and these innovations are going to help reduce the carbon footprint, but we've got to be really careful. There are technologies coming out where, if I compare NVMe [nonvolatile memory express] as a storage technology to SAS [Serial Attached SCSI], the NVMe SSDs and SCM [storage class memory] drives we have as flash drives actually consume a third more power than our traditional SAS SSDs. So when you say, hey, I'm going to go all NVMe and it's all exciting and it's super fast, that could actually have a massive impact on your carbon footprint. So think about using the technology that's right for you. Don't tick a blanket box and say this is going to be everything. No, be more granular about what you actually need and the performance you need for your data center.
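To make the power point concrete, here is a minimal back-of-the-envelope sketch in Python. The drive count and per-drive wattages are illustrative assumptions, not vendor figures; the only number taken from Ian's comment is the roughly one-third increase for NVMe.

```python
# Rough comparison of array power draw for SAS SSDs vs. NVMe SSDs.
# The wattages and drive count are illustrative assumptions, not vendor specs;
# the only figure taken from the conversation is "about a third more power."

SAS_SSD_WATTS = 9.0                      # assumed average draw per SAS SSD
NVME_SSD_WATTS = SAS_SSD_WATTS * 4 / 3   # "a third more power" per drive

def array_power_kw(drive_count: int, watts_per_drive: float) -> float:
    """Total drive power for an array, in kilowatts."""
    return drive_count * watts_per_drive / 1000

DRIVES = 240  # hypothetical all-flash array
sas_kw = array_power_kw(DRIVES, SAS_SSD_WATTS)
nvme_kw = array_power_kw(DRIVES, NVME_SSD_WATTS)
extra_kwh_per_year = (nvme_kw - sas_kw) * 24 * 365
print(f"SAS: {sas_kw:.2f} kW, NVMe: {nvme_kw:.2f} kW, "
      f"~{extra_kwh_per_year:.0f} extra kWh/year before cooling overhead")
```

The absolute numbers will vary by drive and workload; the point is that a per-drive difference compounds across a full array and a full year.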

And that then moves into the technology piece. You want to look into data compression and deduplication, which are very much table-stakes technologies these days. Everybody has something, and algorithms to reduce data are fairly common, but you need to use them on more data sets than you necessarily do today. Equally, from a technology perspective, we need to be able to actively switch from, say, inline compression to post-process. For example, when there's tons going on and loads of data being stored, I don't want to impact performance, so when I'm writing data I'm compressing it inline and dealing with it there and then, amazing. But when the array is not busy, I want to be able to switch to post-process and save some power. And the same can be true for the CPUs themselves.
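A minimal sketch of the dynamic switching Ian describes, assuming a hypothetical load metric, threshold, and mode names; real arrays would expose this through their own management interfaces rather than a standalone function like this.

```python
# Sketch of dynamically switching compression mode with array load, mirroring
# the policy described above: compress inline while handling live writes, and
# drop to post-process when the array goes quiet. The IOPS metric, threshold,
# and mode names are hypothetical.

BUSY_IOPS_THRESHOLD = 50_000  # assumed boundary between "busy" and "quiet"

def choose_compression_mode(current_iops: int) -> str:
    """Return the compression mode for the current load level."""
    return "inline" if current_iops >= BUSY_IOPS_THRESHOLD else "post-process"

for load in (80_000, 5_000):
    print(f"{load:>6} IOPS -> {choose_compression_mode(load)}")
```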

Rather than overclocking, which we've talked about over the years, we need to underclock those CPUs. Make them slower, because if we're making them slower, they're going to consume less power. But equally, I want to be able to turn them up and make them fast as and when I need that, without any impact to my company and my business. So this idea of taking technology that's inside our solutions today and making it dynamic, giving it the ability to reduce its footprint, all without any impact to customers, is so, so important. Look, there's much more than that going on. There's tons around DNA storage and other things, which really is the next generation, and I think that's going to, again, fundamentally change this conversation entirely. But what we're seeing today is how we can take the technology we've got, make it dynamic, and make it accessible to align with your data management practices and sustainability goals.
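On the underclocking point, one concrete mechanism is CPU frequency governors. The sketch below assumes a Linux host exposing the cpufreq sysfs interface and root access; the governors available depend on the platform's driver, and the busy/idle signal is left as a placeholder.

```python
# Sketch of "underclocking" via the Linux cpufreq sysfs interface: slow the
# CPUs when idle and restore full speed when work arrives. Requires root, and
# the available governors ("powersave", "performance", etc.) depend on the
# driver in use. The busy/idle signal is a placeholder assumption.

from pathlib import Path

def set_governor(governor: str) -> None:
    """Apply the requested frequency governor to every CPU exposing cpufreq."""
    for gov_file in Path("/sys/devices/system/cpu").glob(
            "cpu[0-9]*/cpufreq/scaling_governor"):
        gov_file.write_text(governor)

def adjust_for_demand(busy: bool) -> None:
    # Underclock (powersave) when quiet; turn the speed back on when needed.
    set_governor("performance" if busy else "powersave")
```

Modern drivers already do much of this automatically; the sketch only illustrates that slowing cores when idle and restoring speed on demand can be scripted against interfaces that already exist.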

Laurel: And Ian, just quickly, what was that acronym, MBNE?

Ian: Oh, NVMe.

Laurel: Got it, NVMe. And could you explain that, what that is to us?

Ian: Yeah, sure. Nonvolatile memory express. This is a language that we use for talking about storage, and apologies for the jargon. We have SAS as well, Serial Attached SCSI. They're purely languages, ways of talking to a type of media. NVMe is the latest language that we have, and what that means is we can have even faster flash drives. Fantastic. We're talking with a flash language, but it uses more power. So great, I've got something faster. But that could mean you've put the latest technology in and consolidated, and yet, why is my power consumption higher? So the key thing to take away is that just because it's the latest and greatest technology doesn't necessarily mean it's going to lower your carbon footprint.

Laurel: Well, and I think that's a good analogy for what we may have in our own homes with dishwashers or washing machines, where it's now this longer eco cycle. Yeah, that makes sense. So in addition to demands for reducing carbon emissions among enterprises, there are also demands for more immediate and transparent data access to run business applications and power AI and machine learning tools. I mean, we haven't even touched on that. We know there's such great demand; every other conversation is about generative AI. So how do you meet those demands for greater data access while mitigating environmental impacts?

Ian: It presents a challenge. How do we do that while mitigating environmental impacts? I've spoken about starting somewhere and making small changes, but then generative AI comes in and I need to replace all my servers with the latest and greatest, and that's a huge carbon issue. How do I make those changes sustainable? Well, it's about employing different strategies and balancing them. What's right for some might be different for others. One example, as I mentioned before: optimize your storage and retrieval. Employ advanced data management techniques, and understand where you're storing data and how you're retrieving it, how you're tiering it, along with compression, deduplication, and caching strategies. Understand your data lifecycle and where you are storing it. And I don't mean the application itself; I'm talking about the physical data and how you are moving it through its lifecycle in your data center.
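As one illustration of moving physical data through its lifecycle, here is a minimal tiering sketch that demotes files untouched for 90 days to a cheaper tier. The mount points and the 90-day cutoff are assumptions; storage platforms typically do this with built-in policies rather than a script.

```python
# Sketch of a lifecycle/tiering pass: demote files not read for 90 days to a
# cheaper, lower-power tier. The mount points and cutoff are illustrative
# assumptions; real platforms use built-in tiering policies, not a script.

import shutil
import time
from pathlib import Path

HOT_TIER = Path("/mnt/hot")    # hypothetical fast NVMe/SSD tier
COLD_TIER = Path("/mnt/cold")  # hypothetical capacity or archive tier
COLD_AFTER_DAYS = 90

def demote_cold_files() -> None:
    cutoff = time.time() - COLD_AFTER_DAYS * 24 * 3600
    for path in HOT_TIER.rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            target = COLD_TIER / path.relative_to(HOT_TIER)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), target)  # untouched since the cutoff: move it
```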

Look at the edge. There's a lot happening at the edge. Data's coming in at the edge, so how do we process it closer to its source? That's much more efficient and reduces the need to transmit large volumes of data over long distances, so you minimize network latency, usage, and energy consumption by doing much more at the edge, where you're receiving data. It's different for customers depending on the apps they're using, and different industries need different things, so take what applies to you. Then look at your cloud computing. For a lot of the data centers the hyperscalers use, what's their CO2 impact? What's their footprint? I don't think there's enough clarity there right now. So what can you do on-premises? Let's say I'm in the Nordics and I have a data center there. Well, that's 100% renewables.
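A small sketch of the edge idea: aggregate raw readings locally and send only a compact summary upstream. The sensor name, window size, and payload shape are all illustrative assumptions.

```python
# Sketch of edge-side aggregation: collapse a window of raw samples into one
# compact summary and transmit only that, instead of streaming every sample
# over the WAN. Sensor name, window, and payload shape are assumptions.

import json
import statistics

def summarize_window(readings: list[float], sensor_id: str) -> bytes:
    """Reduce a window of raw samples to a single small record."""
    summary = {
        "sensor": sensor_id,
        "count": len(readings),
        "mean": round(statistics.fmean(readings), 3),
        "min": min(readings),
        "max": max(readings),
    }
    return json.dumps(summary).encode()

raw = [21.2, 21.4, 22.0, 21.9, 21.7] * 720  # e.g. an hour of per-second samples
payload = summarize_window(raw, "rack-42-inlet-temp")
print(f"{len(raw)} samples reduced to {len(payload)} bytes on the wire")
```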

So I make a huge saving by saying I'm not going to run it in the public cloud, because I can run it on-prem. So again, that balance, understanding what you have and where, is really key. But finally, understand that AI element. What can AI and machine learning algorithms do to dynamically allocate resources based upon workload demand? Make it so that you only use resources when you need them, rather than taking them and then sitting on them. Balancing that need for immediate and transparent data access with environmental sustainability in mind delivers a real holistic approach. So look at data flows and adopt energy-efficient technologies, and that's going to align your data processing capabilities with those goals.
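Where Ian mentions AI and machine learning allocating resources to demand, a production system might use ML forecasting; the sketch below deliberately substitutes a plain moving-average heuristic to show the shape of the idea, so the class, thresholds, and capacity figure are all assumptions.

```python
# Sketch of demand-driven allocation: estimate near-term demand from recent
# utilization samples and keep only as many nodes powered on as that estimate
# needs. A simple moving average stands in for an ML forecaster here; the
# window size and per-node capacity are illustrative assumptions.

from collections import deque

class DemandScaler:
    def __init__(self, window: int = 12, per_node_capacity: float = 70.0):
        self.samples = deque(maxlen=window)         # recent load samples (% of one node)
        self.per_node_capacity = per_node_capacity  # target load per powered-on node

    def observe(self, load_pct: float) -> int:
        """Record a sample and return how many nodes to keep powered on."""
        self.samples.append(load_pct)
        forecast = sum(self.samples) / len(self.samples)  # naive demand estimate
        return max(1, round(forecast / self.per_node_capacity))

scaler = DemandScaler()
for load in (20, 25, 180, 260, 240):
    print(f"load {load}% of one node -> keep {scaler.observe(load)} node(s) on")
```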

Laurel: So thinking ahead, what are some of the trends in data management that you're thinking about and anticipate enterprises will adopt to reduce their own carbon footprints while still being able to deploy all these advanced technologies and innovations?

Ian: Gosh, yeah. That's such a large question. Let me try to summarize. There are so many trends in data management, and sustainability is almost like a cloud hanging over them, because people think, "Oh, how am I going to do this and meet those goals?" It shouldn't be seen like that. The focus should be on sustainability and how we align our strategies to it. So again, start somewhere. Start with some level of reporting and understand the power that you're using. And there's going to be more coming from vendors to provide software that gives you more metrics and reporting. That's really key. Stakeholders will expect clear information on carbon emissions and energy usage. That's absolutely key. The second has to be renewable energy procurement. So make sure that you are investing in renewable energy sources. IT typically doesn't have control over where power comes from, but that includes onsite generation too.

They may have backup generators. Are they generators, or are they using PDUs [power distribution units]? So power purchase agreements, renewable energy credits, and how you're actually using those for backup power really, really matter. And again, there's that element of energy-efficient hardware. As you're looking to invest, ask the difficult questions: where are you producing this? Where is it manufactured? How are you shipping it? These are really, really difficult questions. And, oh, by the way, I want to see how you're doing that. Actually, I want to see you certified externally to meet those goals. All that collaboration around supply chain sustainability will really help. Understanding how you're sourcing and responsibly manufacturing becomes integral to that data management strategy. And finally, there's ongoing innovation and research into how we use technologies within data solutions to actively and dynamically turn features on and off with complete transparency to you as a customer.

I think that's so important. So you need to adopt a multifaceted approach that combines energy-efficient infrastructure, renewable energy sourcing, and optimized data management practices, and commits to transparency and sustainability reporting. Environmental concerns will continue to grow, and these trends will play a critical role in shaping the future of data management.
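For the reporting Ian recommends starting with, one common approach is the location-based method: multiply metered energy use by a grid emission factor. The consumption figure and factor in this sketch are illustrative assumptions, not real data.

```python
# Sketch of a basic emissions report: metered energy times a grid emission
# factor (location-based approach). Both numbers below are illustrative
# assumptions; real reporting uses your meter data and the published factor
# for your grid region.

MONTHLY_KWH = 42_000          # assumed metered data-center consumption
GRID_KG_CO2E_PER_KWH = 0.35   # assumed local grid emission factor

emissions_tonnes = MONTHLY_KWH * GRID_KG_CO2E_PER_KWH / 1000
print(f"~{emissions_tonnes:.1f} tonnes CO2e this month")
```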

Laurel: Well, Ian, this has been a fantastic conversation on the Business Lab. Thank you so much for joining us.

Ian: Appreciate it. Thanks a lot, Laurel.

Laurel: That was Ian Clatworthy, director of data platform product marketing at Hitachi Vantara, whom I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review.

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the global director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology. And you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.