Developing climate solutions with green software

After years of committing to sustainable practices in his personal life, from recycling to using cloth-based diapers, Asim Hussain, currently the director of green software and ecosystems at Intel, began to ask questions about the practices in his work: software development.

Developers often asked if their software was secure enough, fast enough, or cost-effective enough, but, Hussain says, they rarely considered the environmental consequences of their applications. Hussain went on to work at Intel and to become the executive director and chairperson of the Green Software Foundation, a nonprofit aiming to create an ecosystem of people, tooling, and best practices around sustainable software development.

“What we need to do as software developers and software engineers is we need to make sure that it is emitting the least amount of carbon for the same amount of value and user functionality that we’re getting out of it,” says Hussain.

The three pillars of green software are energy efficiency, hardware efficiency, and carbon awareness. Making more efficient use of hardware and energy when developing applications can go a long way toward reducing emissions, Hussain says. And carbon-aware computing involves shifting computation toward times and places where electricity comes from cleaner sources, reducing emissions without compromising performance.

Often, when something is dubbed “green,” there is an assumption that the product, application, or practice functions worse than its less environmentally friendly version. With software, however, the opposite is true.

“Being green in the software space means being more efficient, which translates almost always to being faster,” says Hussain. “When you factor in the hardware efficiency component, oftentimes it translates to building software that is more resilient, more fault-tolerant. Oftentimes it also translates then into being cheaper.”

Instituting green software necessitates not just a shift in practices and tooling but also a culture change within an enterprise. While regulations and ESG targets help to create an imperative, says Hussain, a shift in mindset can enable some of the greatest strides forward.

“If there’s anything we really need to do is to drive that behavior change, we need to drive behavior change so people actually invest their time on making software more energy efficient, more hardware efficient, or more carbon aware.”

This episode of Business Lab is produced in partnership with Intel.

Full Transcript

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Our topic is green software, from apps to devices to the cloud. Computing runs the world around us. However, there is a better way to do it with a focus on sustainability.

Two words for you: sustainable code.

My guest is Asim Hussain, who is the director of the Office of Green Software and Ecosystems at Intel, as well as the chairperson of the Green Software Foundation.

This podcast is produced in partnership with Intel.

Welcome, Asim.

Asim Hussain: Hi Laurel. Thank you very much for having me.

Laurel: Well, glad you’re here. So for a bit of background, you’ve been working in software development and sustainability advocacy from startups to global enterprises for the last two decades. What drew you into sustainability as a focus and what are you working on now?

Asim: I’ve personally been involved and interested in the sustainability space for quite a while, on a very personal level. Then around the birth of my first son, about five years ago now, I started asking myself this one question: how come I was willing to do all these things for sustainability, to recycle, to use cloth-based nappies, all sorts of these different things, yet I could not remember one single moment in my entire career, in any technical discussion, in any architectural meeting, in any discussion about how we’re going to build this piece of software, where the environment came up. I mean, people oftentimes raise points around, is this secure enough? Is this fast enough? Does this cost too much? But at no point had I ever heard anybody ask the question, is this emitting too much carbon? This solution that we’re talking about right now, what kind of environmental impacts does it have? I’ve never, ever heard anybody raise that question.

So I really started to ask that question myself. I found other people who were like me. Five years ago, there weren’t many of us, but we were all asking the same questions. I joined, and then started to become a co-organizer of, a community called ClimateAction.Tech. Then the community just grew. A lot of people were starting to ask themselves these questions, and some answers were coming along. At the time, I used to work at Microsoft, and I pitched and formed something called the green cloud advocacy team, where we talked about how to actually build applications in a greener way on the cloud.

We formed something called the Green Software Foundation, a consortium of now 60 member organizations, which I am a chairperson of. Over a year ago I joined Intel, because Intel has been heavily investing in the sustainable software space. If you think about what Intel does, pretty much everything Intel produces, developers use: developers write software and code on Intel’s products. So it makes sense for Intel to have a strong green software strategy. That’s why I was brought in, and since then I’ve been working on Intel’s green software strategy internally.

Laurel: So a little bit more about that. How can organizations make their software greener? Then maybe we should take a step back and define what green software actually is.

Asim: Well, I think we have to define what green software actually is first. The way the conversation’s landed in recent years (and the Green Software Foundation has been a large part of this) is that we’ve coalesced around this idea of carbon efficiency. Everything we do emits carbon; this tool we’re using right now to record this session is emitting carbon right now. What we need to do as software developers and software engineers is we need to make sure that it is emitting the least amount of carbon for the same amount of value and user functionality that we’re getting out of it. That’s what we call carbon efficiency.

What we say is there are three pillars underneath; there are really only three ways to make your software green. The first is to make it more energy efficient, to use less energy. Most electricity is still created through the burning of fossil fuels, so just using less electricity is going to emit fewer carbon emissions into the atmosphere. So the first is energy efficiency. The second is hardware efficiency, because all software runs on hardware. If you’re talking about a mobile phone, typically people are forced to move on from their phones because the software just doesn’t run on older models. In the cloud, the conversation tends to be more around utilization: making more use of the servers you already have, making more efficient use of the hardware. The third one is a very interesting, very new space called carbon awareness, or carbon-aware computing. You are going to be using electricity anyway; can you architect your software in such a way that it does more when the electricity is clean and does less when the electricity is dirty?

So, for instance, it does more when there’s more renewable energy on the grid right now, and it does less when more coal or gas is getting burnt. There have been some very interesting, very high-profile projects in this space, and carbon-aware computing is an area where there’s a lot of interest because it’s a stepping stone. It might not get you your 50, 60, 70% carbon reductions, but it will get you your 1, 2, 3, and 4% carbon reductions, and it’ll get you that with very minimal investment. But those are basically the three areas, what we call the three pillars of green software: energy efficiency, hardware efficiency, and carbon awareness.
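Carbon-aware scheduling of the kind Hussain describes can be sketched in a few lines of Python. This is a minimal illustration, not a production scheduler: the forecast numbers are hypothetical, and a real system would query a live grid-intensity service rather than a hard-coded table.

```python
# Hypothetical forecast of grid carbon intensity (gCO2e/kWh) by hour of day.
# A real system would pull this from a live grid-data API.
FORECAST = {0: 450, 1: 430, 2: 410, 3: 390, 12: 220, 13: 210, 14: 200, 15: 230}

def pick_greenest_hour(forecast, window_hours):
    """Return the forecast hour inside the window with the lowest intensity."""
    candidates = {hour: gco2e for hour, gco2e in forecast.items()
                  if hour < window_hours}
    return min(candidates, key=candidates.get)

def schedule_batch_job(job_name, forecast, window_hours=24):
    """Defer a time-flexible job (a nightly build, an ML training run)
    to the hour when the grid is expected to be cleanest."""
    hour = pick_greenest_hour(forecast, window_hours)
    print(f"{job_name}: run at hour {hour} "
          f"({forecast[hour]} gCO2e/kWh forecast)")
    return hour

schedule_batch_job("ml-training", FORECAST)  # picks hour 14 in this forecast
```

Shifting only the time-flexible work this way is what makes carbon awareness a cheap stepping stone: the code change is small, and the latency-sensitive path is left untouched.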

Laurel: So another reason we’re talking about all of this is that technology can contribute to the environmental issues that it is trying to actually help. So for example, a lot of energy is needed to train AI models. Also, blockchain was key in the development of energy-efficient microgrids, but it’s also behind the development of cryptocurrency platforms, some of which consume more energy than that of a small country. So how can advanced technologies like AI, machine learning, and blockchain contribute positively to the development of green software?

Asim: That’s an interesting question, because the focus oftentimes is how do we actually make that technology greener? But I don’t believe that is necessarily the whole story. The broader story is: how can we use that technology to make software greener? I think there are many ways you can tackle that question. One thing that’s been interesting for me, since my journey as a software developer joining Intel, is realizing how little I knew about hardware. I describe it as the gap between software and silicon, and the gap is quite large right now. If you’re building software these days, you have very little understanding of the silicon that’s running it. Through a greater understanding of exactly how your software is executed by the silicon, we’re seeing a lot of great opportunities to reduce emissions and to make that software more energy efficient and more hardware efficient.

I think that’s where places like AI can really help out. Developer productivity has been the buzzword in this space for a very long time. Developers are extremely expensive, and getting to market fast and beating your competition is the name of the game these days. So it’s always been about how to implement the functionality we need as fast as possible, make sure it’s secure, and get it out the door. But oftentimes the only way to do that is to increase the gap between the software and the silicon and make things a little bit more inefficient. I think AI can really help there. There are copilot-style solutions that can help as you’re developing code, suggesting that if you were to write your code in a slightly different way, it could be more efficient. So that’s one way AI can help out.

Another way that I’m seeing AI utilized in this space is when you deploy. The silicon and the products that we produce come out of the box configured in a certain way, but they can actually be tuned to execute a particular piece of software much more efficiently. So if you have a data center running just one type of software, you can tune the hardware so that software runs more efficiently on it. We’re seeing AI solutions come on the market these days that can automatically figure out: what type of application are you, how do you run, how do you work? We have a solution called Granulate, which does part of this as well. It can figure out how to tune the underlying hardware in such a way that it executes that software more efficiently. So those are a couple of ways this technology could actually be used to make software itself greener.

Laurel: To bridge that gap between software and silicon, you must be able to measure the progress and meet targets. So what parameters do you use to measure the energy efficiency of software? Could you talk us through the tenets of actually measuring?

Asim: So measuring is an extremely challenging problem. When we first launched the Green Software Foundation three years ago, I remember asking all the members, what is your biggest pain point? Almost all came back with measuring. Measuring is very, very challenging. It’s so nuanced; there are so many different levels to it. For instance, at Intel, we have technology in our chips to measure the energy of the whole chip; there are counters on the chip that measure it. Unfortunately, that only gives you the energy of the entire chip. So it does give you a measurement, but if you are a developer, there are maybe 10 processes running on that chip and only one of them is yours. You need to know how much energy your process is consuming, because that’s what you can optimize for; that’s what you can see. Currently, the best way to measure at that level is using models, models which are generated either through AI or through other processes where you effectively run large amounts of data and generate statistical models.

Oftentimes the model that’s used is one based on CPU [central processing unit] utilization, how busy a CPU is, and it translates that into energy. So you can see my process is consuming 10% of the CPU, and there are models out there that can convert that into energy. But again, all models are wrong; some models are useful. There’s so much nuance to this whole space, because how have you tweaked your computer? What else is running on your computer? That can also affect those numbers. So, unfortunately, this is a very, very challenging area.

But this is the really big area that a lot of people are trying to resolve right now. We are not at the perfect solution, but we are way, way better than we were three, four, or five years ago. It’s actually a very exciting time for measurement in this space.
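The utilization-based energy models Hussain describes are often simple linear interpolations between a machine’s idle and full-load power draw. Here is a minimal sketch of that idea, with illustrative coefficients rather than measured values:

```python
def estimated_power_watts(cpu_utilization, idle_watts=50.0, max_watts=200.0):
    """Linear power model: interpolate between idle and full-load draw.
    The 50 W / 200 W coefficients are illustrative, not measured values."""
    if not 0.0 <= cpu_utilization <= 1.0:
        raise ValueError("utilization must be a fraction between 0 and 1")
    return idle_watts + (max_watts - idle_watts) * cpu_utilization

def estimated_energy_kwh(cpu_utilization, hours):
    """Convert average utilization over a period into energy in kWh."""
    return estimated_power_watts(cpu_utilization) * hours / 1000.0

# A process averaging 10% CPU for one hour under this model:
print(f"{estimated_energy_kwh(0.10, 1.0):.3f} kWh")  # 0.065 kWh
```

As the "all models are wrong" caveat suggests, real chips are not linear in utilization, and attributing a share of whole-chip counters to one of ten processes adds further error; the sketch only shows the shape of such a model.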

Laurel: Well, and I guess part of it is that green software seems to be developed with greater scrutiny and higher quality controls to ensure that the product actually meets these standards to reduce emissions. Measurement is part of that, right? So what are some of the rewards beyond emissions reduction or meeting green goals of developing software? You kind of touched on that earlier with the carbon efficiency as well as hardware efficiency.

Asim: Yeah, so this is something I used to think about a lot because the term green has a lot associated with it. I mean, oftentimes when people historically have used the word green, you can have the main product or the green version of the product. There’s an idea in your mind that the green version is somehow less than, it’s somehow not as good. But actually in the software space it’s so interesting because the exact opposite. Being green in the software space means being more efficient, which translates almost always to being faster. When you factor in the hardware efficiency component, oftentimes it translates to building software that is more resilient, more fault-tolerant. Oftentimes it also translates then into being cheaper. So actually green has a lot of positive associations with it already.

Laurel: So in that vein, how can external standards help provide guidance for building software and solutions? I mean, obviously, there’s a need to create something like the Green Software Foundation, and with the focus that most enterprises have now on environmental, social, and governance goals or ESG, companies are now looking more and more to build those ideas into their everyday workflow. So how do regulations help and not necessarily hinder this kind of progress?

Asim: So standards are very, very important in this space. When we looked at the ecosystem about three or four years ago, a lot of enterprises were very interested in green software, but the biggest problem they had was what to trust. What can I trust? Whose advice should I take? That’s where standards come in; that’s where standards are most important. At least the way we develop standards inside the Green Software Foundation, they’re done via consensus. There are around 60 member organizations. So when you see a standard that’s been created by that many people, and that many people have been involved with it, it really builds up trust. So now you know what to do. Those standards give you that compass direction, a direction to go in that you can trust.

There are several standards we’ve been focusing on in the Green Software Foundation. One’s called the SCI, the software carbon intensity specification. To approve it as an ISO standard, you have to reach consensus among 196 countries, so you get even more trust in the standard and you can use it. So standards really help to build up trust, which organizations can use to guide the directions they take. There are a couple of other standards coming up in the foundation that I think are quite interesting. One is called Real-Time Cloud. One of the challenges right now, and it always comes back to measurement, is that measurement is very discrete: it happens oftentimes just a few times a year, and when you do get measurement data, it is often very delayed. So one of the specs being worked on right now is called Real-Time Cloud.

It’s trying to ask the question: is it possible to get data that is real-time? Oftentimes when you want to react and change behaviors, you need real-time data, so that when somebody does something, they know instantly the impact of that action and can make adjustments instantly. If they have to wait three months, that behavior change might not happen. Real-time data is oftentimes at loggerheads with regulations, because oftentimes you have to get your data audited, and auditing data that’s real-time is very, very challenging. So one of the questions we’re trying to ask is, is it possible to have data which is real-time and which then aggregates up over the course of a year? Can that aggregation provide enough trust that an auditor can say, actually, we now trust this information and we will allow it to be used in regulatory reporting?

That’s something that we’re very excited about because you really need real-time data to drive behavior change. If there’s anything we really need to do is to drive that behavior change, we need to drive behavior change so people actually invest their time on making software more energy efficient, more hardware efficient, or more carbon aware. So that’s some of the ways where standards are really helping in this space.
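The software carbon intensity specification Hussain mentions reduces, at its core, to a single rate equation, SCI = ((E × I) + M) / R. A minimal sketch of that calculation follows; the input numbers are purely illustrative, not real measurements.

```python
def sci_score(energy_kwh, grid_intensity_gco2e_per_kwh,
              embodied_gco2e, functional_units):
    """Software Carbon Intensity: ((E * I) + M) / R.
    E: operational energy (kWh); I: carbon intensity of the electricity
    (gCO2e/kWh); M: embodied hardware emissions amortized to this workload
    (gCO2e); R: the functional unit, e.g. API requests served."""
    operational = energy_kwh * grid_intensity_gco2e_per_kwh
    return (operational + embodied_gco2e) / functional_units

# Illustrative: 2 kWh at 400 gCO2e/kWh plus 200 g embodied, over 10,000 requests.
score = sci_score(2.0, 400.0, 200.0, 10_000)
print(f"{score} gCO2e per request")  # 0.1 gCO2e per request
```

Because the score is a rate per functional unit rather than a total, doing the same work with less energy, on less hardware, or on cleaner electricity all lower it, mirroring the three pillars described earlier.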

Laurel: I think it’s really helpful to talk about standards and how they are so ingrained with software development in general because there are so many misconceptions about sustainability. So what are some of the other misconceptions that people kind of get stuck on, maybe that even calling it green, right? Are there philosophies or strategies that you can caution against or you try to advocate for?

Asim: So there are a couple of things I talk about. One is that it takes everybody. I remember very early on when I was talking in this space, oftentimes the conversation went: oh, don’t bother talking to that person, don’t talk to this sector of developers, only talk to these people, the people who have the most influence to make the kinds of changes that make software greener. But it really takes a cultural change inside an organization. This is what’s very important. It takes everybody. You can’t really talk to one slice of the developer ecosystem; you need to talk to everybody. Every single developer or engineer inside an organization really needs to take this on board. So that’s one of the things I say: you have to speak to every single person. You cannot just speak to one set of people and exclude another.

Another misconception I often see is that people rank where effort should be spent in terms of the slice of the carbon pie a sector is responsible for. But really you should be focusing not on the size of the slice, but on the ability to decarbonize that slice. That’s why green software is so interesting and such a great place to spend effort and time. Depending on which academic paper you look at, software can be between 2 to 4% of global emissions. So some people might say, well, that’s not really worth spending the time on.

But my argument is that our ability to decarbonize that 2 to 4% is far greater than our ability to decarbonize other sectors like airlines or concrete. In the software space, we oftentimes know what we need to do; we know the choices. There doesn’t need to be new technology made, there just need to be decisions made to prioritize this work. That’s something I think is very, very important. We should rank everything in terms of the ease of decarbonization and then work down from the topmost item, rather than looking at things just in terms of tons of carbon, which I think leads to wrong decision-making.

Laurel: Well, I think you’re laying out a really good argument because green initiatives, they can be daunting, especially for large enterprises looking to meet those decarbonization thresholds within the next decade. For those companies that are making the investment into this, how should they begin? Where are the fundamental things just to be aware of when you’re starting this journey?

Asim: So the first step, I would say, is training. What we’re describing here, especially the green software space, is a very new movement, a very new field of computing. So a lot of the terms I talk about are just not well understood, and a lot of the reasons behind those terms are not well understood either. So the number one thing I always say is you need to focus on training. There’s loads of training out there. The Green Software Foundation’s got some training at learn.GreenSoftware.Foundation; it’s just two hours, and it’s free. We send that over to anybody who’s starting in this space, just to understand the language and terminology and get everybody on the same page. That is usually a very good start. Now, in terms of how you motivate change inside an organization, I think about this a lot.

If you’re the lead of an organization and you want to make a change, how do you actually make that change? I’m a big, big believer in trusting your team, trusting your people. If you give engineers a problem, they will find a solution to it. But what they oftentimes need is permission, a thumbs-up from leadership that this is a priority. So it’s very important for organizations to be very public about their commitments, the same way Intel has made public commitments. Be very vocal as a leader inside your organization, and be very clear that this is a priority for you and that you will listen to people and teams who bring you solutions in this space.

You will find that people within your organization are already thinking about this space, already have ideas, already probably have decks ready to present to you. Just create an environment where they feel capable of presenting it to you. I guarantee you, your solutions are already within your organization and already within the minds of your employees.

Laurel: Well, that is all very inspiring and interesting and so exciting. So when you think about the next three to five years in green software development and adoption, what are you looking forward to the most? What excites you?

Asim: I think I’m very excited right now, to be honest with you. I look back five years ago to the very, very early days, when we first looked at this, and I still remember that if there was one article mentioning green software, we would all lose our heads. We’d get so excited about it; we’d share it, we’d pore over it. Now I’m inundated with information. This movement has grown significantly. There are so many organizations deeply interested in this space, and so much academic research.

I have so many articles coming my way every single week that I do not have time to read them all. That gives me a lot of hope for the future. That really excites me. It might just be because I’m at the cutting edge of this space, so I see a lot of this stuff before anybody else, but I see a huge amount of interest and also a huge amount of activity. I see a lot of people working on solutions, not just talking about problems. That honestly just excites me. I don’t know where we’re going to end up in five years’ time, but given our growth so far, I think we’re going to end up in a very good place.

Laurel: Oh, that’s excellent. Awesome. Thank you so much for joining us today on the Business Lab.

Asim: Thank you very much for having me.

Laurel: That was Asim Hussain, the director of the Office of Green Software and Ecosystems at Intel, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review.

That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can also find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

The two words that pushed international climate talks into overtime

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

The annual UN climate negotiations at COP28 in Dubai have officially come to a close. Delegates scrambled to get a deal together in the early morning hours, and the meetings ended a day past their scheduled conclusion (as these things tend to). 

If you’ve tuned out news from the summit, I don’t really blame you. The quibbles over wording—“urges” vs. “notes” vs. “emphasizes”—can all start to sound like noise. But these talks are the biggest climate event of the year, and there are some details that are worth paying attention to. 

We’ve seen agreements on methane and renewables, and big progress on an international finance deal. And, of course, there was the high-profile fight about fossil fuels. As negotiators wrap up and start their treks home, let’s take a beat to sort through what happened at COP28 and why all these political fights matter for climate action.

What’s the point of these meetings anyway? 

The UN Conference of the Parties (COP) meetings are an annual chance for negotiators from nearly 200 nations to set goals and make plans to address climate change. 

You might be familiar with the outcome of one of these meetings: eight years ago, COP21 gave us the Paris Agreement, the international treaty that set a goal of limiting global warming to 1.5 °C (2.7 °F) over preindustrial levels.

This year’s meeting comes at a crucial time for the Paris Agreement. Part of that treaty requires the world to put together a progress report on climate change, called the global stocktake. It’s supposed to happen every five years, and the first one was scheduled to finish up at this year’s COP. 

What were the big agreements from the meetings? 

1. On the first day of the talks, there was a big announcement about a loss and damage fund. This is money that richer nations put into a pool to help pay for damages caused by climate change in more vulnerable nations. 

You may remember that the creation of this fund was a major topic at last year’s COP27 in Egypt. The urgency was spurred by a collection of climate disasters, including particularly devastating floods in Pakistan in August 2022. 

Now there’s some money going into the account: at least $700 million pledged by wealthy nations.

There are some caveats, of course. The agreement is still short on details, missing anything like financial targets or rules about how nations will put money in. In fact, there’s currently no requirement for wealthy nations to contribute at all, and the pledged money is a fraction of what many scientists say is really needed to pay for the damage caused by climate change. (Some estimates put that number at $100 billion annually.)

2. Over 100 countries pledged to triple renewable energy capacity and double energy efficiency by 2030. In addition, the US and 20 other countries signed a pledge to triple global nuclear capacity by 2050. 

3. Finally, 50 oil and gas companies pledged to virtually eliminate methane leaks from their operations by 2030. Methane is a powerful greenhouse gas, and plugging up accidental leaks from oil and gas production is seen as an easy way to cut climate pollution. 

The companies that signed this pledge, which included ExxonMobil and Saudi Aramco, represent 40% of global production. 

Some analysts have pointed out that the pledge will have a pretty limited effect. Most human-caused methane emissions come from agriculture, after all. And accidental methane emissions aren’t the biggest problem fossil-fuel companies cause, by a long shot. The majority of emissions from fossil-fuel companies comes not from their operations but from their products.

What was holding things up? 

In two words: fossil fuels. 

I wrote in the newsletter a couple of weeks ago about how fossil fuels were going to loom large over these talks, not least because they’re being hosted in the UAE, a nation whose wealth relies heavily on them. The leader of the talks (and head of the UAE’s national oil company) has lived up to that prediction, questioning the scientific reasoning behind the calls to eliminate fossil fuels.

As delegates worked to put the final agreement together, a sticking point in the debate was how fossil fuels would be represented. Earlier versions of the draft text called for phasing them out. But many nations, including the UAE, objected to this sort of language. And these meetings run by consensus: everybody has to sign off on the final agreement. 

So in the final version, the language was watered down. The pivotal paragraph now calls on parties to take a series of actions, including “transitioning away from fossil fuels in energy systems, in a just, orderly and equitable manner, accelerating action in this critical decade, so as to achieve net zero by 2050 in keeping with the science.”  

In a way, this bit is a win, since it’s the first COP agreement that even mentions fossil fuels by name. (The bar is truly on the floor.) 

Ultimately, the exact wording of a COP agreement probably won’t be the thing to spur anybody into real action. Rather, the state of the world’s attitude toward climate change is reflected in this agreement: there’s a growing acknowledgement that something needs to change in our relationship with fossil fuels. But there’s not a wide enough consensus yet on the speed of that change, or what that relationship should look like as we pursue ambitious climate goals. 

Maybe next year. 

Another thing

The carbon removal industry is starting to take off, but some experts are warning that it’s headed in the wrong direction. 

There’s a growing signal that the world may have to remove billions of tons of carbon dioxide from the atmosphere to limit global warming. But in a new essay, two former US Department of Energy staffers argue that the emergence of a for-profit sector could actually spell danger for the technology’s ability to help meaningfully address climate change. 

Get all the details in the latest story from my colleague James Temple.

Keeping up with climate  

Silicon powder could be the key to longer EV range and faster charging. Battery giant Panasonic will use silicon material from US-based startup Sila to build new EV batteries. (Wired)

→ Sila’s material debuted in a much smaller product in 2021. (MIT Technology Review)

Not the potatoes! Heavy rains have been bad news for European potato harvesting, sending prices soaring. Thanks, climate change. (Bloomberg)

Repairing EV batteries can be dangerous and difficult. But some mechanics want to do it anyway to save customers money and keep older EVs on the roads. (Grist)

This startup wants to sprinkle rock dust over farmland for carbon removal. (Wired)

Public (non-Tesla) EV chargers in the US can be unreliable, to put it lightly. Here’s how $7.5 billion in federal funding aims to change that. (Canary Media)

Two- and three-wheelers are going electric in nations across Asia and Africa. And these small vehicles are having a big impact, making up the majority of the reduction in oil demand as transportation goes electric. (New York Times)

→ Gogoro is building a massive network of battery-swappable electric scooters. (MIT Technology Review)

Animal agriculture is a big contributor to climate change, but convincing meat eaters to cut back isn’t easy. If you want to get more people to eat plant-based foods, don’t call them “plant-based.” Much less “vegan.” (Washington Post)

There was one permitted offshore wind farm in progress in the US Great Lakes. Now, the project is on hold. (Inside Climate News)

Two former Department of Energy staffers warn we’re doing carbon removal all wrong

The carbon removal industry is just starting to take off, but some experts are warning that it’s already headed in the wrong direction. Two former staffers of the US agency responsible for advancing the technology argue that the profit-driven industry’s focus on cleaning up corporate emissions will come at the expense of helping to pull the planet back from dangerous levels of warming.

Numerous studies have found that the world may have to remove tens of billions of tons of carbon dioxide from the atmosphere per year by around mid-century to keep global warming in check. These findings have spawned significant investments into startups developing carbon-sucking direct air-capture factories, and companies striving to harness the greenhouse gas-trapping potential of plants, minerals, and the oceans. 

But a fundamental challenge is that carbon dioxide removal (CDR) isn’t a product that any person or company “needs,” in the traditional market sense. Rather, carrying it out provides a collective societal good, in the way that waste management does, only with larger global stakes. To date, it’s largely been funded by companies that are voluntarily paying for it as a form of corporate climate action, in the face of rising investor, customer, employee or regulatory pressures. That includes purchases of future removal through the $1 billion Frontier effort, started by Stripe and other companies.

There’s also some growing government support, including in the US, which is funding carbon removal projects, offering a comparatively small amount of money to companies that provide the service and subsidizing those that store away carbon dioxide. 

But in a lengthy and pointed essay published in the new Carbon Management journal on Tuesday, researchers Emily Grubert and Shuchi Talati argue there are rising dangers for the field. Both previously worked for the Department of Energy’s Office of Fossil Energy and Carbon Management, which drove several of the recent US efforts to develop the industry.

They write that the emergence of a for-profit, growth-focused carbon removal sector selling a carbon removal product, instead of a publicly funded and coordinated effort more akin to waste management, “presents grave risks for the ability of CDR to enable net zero and net negative targets in general,” including keeping or pulling the planet back to 1.5 C of warming. 

“If we misallocate our limited CDR resources and end up not having access to the capacity that can help meet the needs we really have, climatically, that’s a problem,” says Grubert, now an associate professor of sustainable energy policy at the University of Notre Dame. “It means we’re never going to get there.”

One of their main concerns is that corporations have come to see carbon removal as a relatively simple and reliable way of canceling out ongoing climate pollution that they have other ways of cleaning up directly, which the authors refer to as “luxury” removals. The emergence of this market effectively grants a larger share of the world’s carbon removal capacity to profitable companies in rich nations, rather than reserving it for higher priority public goods, including: allowing developing nations more time to reduce emissions; balancing out emissions from sectors we still don’t have ways of cleaning up, like agriculture; and drawing down historic emissions enough to bring global temperatures to safer levels.

“You really need to save it for the stuff you can’t eliminate, not just the stuff that’s expensive to eliminate,” Grubert says. 

That means using carbon removal to address things like the emissions from the fertilizer used to feed populations in poor parts of the world, not for avoiding the hassle and expense of retrofitting a cement plant, she adds.

“CDR cannot succeed at restorative and reparative goals if it is controlled by the same forces that created the problems it is trying to solve,” write Grubert and Talati, executive director of the Alliance for Just Deliberation on Solar Geoengineering.

There is evidence that some companies have come to perceive carbon removal in the way that the authors describe. 

Earlier this year, Vicki Hollub, the chief executive of the oil and gas company Occidental, which recently acquired a direct-air capture company, told the audience at an energy conference: “We believe that our direct capture technology is going to be the technology that helps to preserve our industry over time. This gives our industry a license to continue to operate for the 60, 70, 80 years that I think it’s going to be very much needed.”

Part of the problem, the authors note, is that carbon removal is seen as “unconstrained,” easily scaled to meet industry goals and climate needs. But in fact, it’s hard and expensive to do reliably. Direct air-capture machines, for instance, require a lot of land and resources to build and a lot of energy to run, Talati says. That limits how big the sector can become and complicates the question of how much good the facilities do.

Last week, the Global Carbon Project reported that the world’s technology-based carbon removal only sucked down about 10,000 tons this year, “significantly less than one-millionth of our fossil-fuel emissions,” as MIT Technology Review reported.

Other means of carbon removal may be cheaper and more scalable, particularly methods that harness nature to do the job. But some of these approaches, including adding minerals to or sinking biomass in the oceans, also raise concerns about environmental side-effects or create added difficulties in certifying the climate benefits.

Grubert and Talati fear that growing market pressures, including the demand for high volumes of low-cost carbon removal, could undermine the quality of the measuring, reporting, and verification of such efforts over time. They add that the carbon removal market may simply replicate many of the problems in the traditional carbon offsets space, where researchers have found that efforts to plant trees or prevent deforestation often substantially exaggerate the amount of additional carbon trapped.

Ultimately, the authors argue that the global task of drawing down billions of tons of carbon dioxide should largely be a publicly funded, owned and managed enterprise, if we hope to achieve the global, common good of stabilizing and repairing the climate.

“There’s a role for the private sector, but our argument is that a purely profit-driven industry that’s currently operating with very little governance is going to go badly,” Talati says. “If we want to see this succeed, we can’t count on the self-governance of corporations, which we’ve seen fail over and over again, across every industry. The role of the public sector needs to be broadened and deepened.”

Stripe didn’t respond to an inquiry before press time. But executives there have argued that Frontier is marshaling corporate funds and expertise to help build up an essential industry that will be needed to combat the dangers of climate change, enabling startups to move ahead with early demonstration projects and to test a variety of approaches to carbon removal. Major investors in the space have also said that rising demand among corporations is helping to drive forward innovation and growth in the field. 

A spokesman for Heirloom, which is part of a team that recently secured Department of Energy funds to move ahead with a major direct-air capture project in Louisiana, said it recognizes some of the risks that the authors raise and has taken steps to address them by committing to follow a clear set of corporate principles: “We believe decarbonization should be the #1 goal of climate mitigation, and CDR should be used for residual and legacy emissions. We feel strongly that CDR is not used as a fig leaf for emitting industries.”

How carbon removal technology is like a time machine

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

If you could go back in time, what would you change about your life, or the world?

The idea of giving myself some much-needed advice is appealing (don’t cut your own bangs in high school, seriously). But we can think bigger. What about winding the clock back on the emissions that cause climate change? 

By burning fossil fuels, we’ve released greenhouse gases by the gigaton. There’s a lot we can (and need to) do to slow and eventually stop these planet-warming emissions. But carbon removal technology has a different promise: turning the clock back. 

Well, sort of. Carbon removal can’t literally take us back in time. But this time-machine analogy for thinking about carbon removal—specifically when it comes to the scale that will be needed to make a significant dent in our emissions—is a favorite of climate scientist David Ho, who I spoke to for my latest story. So for the newsletter this week, let’s consider what it might take for carbon removal to take us back far enough in time to reverse our mistakes (emissions-related ones, anyway). 

The world is on track to hit a new record for carbon dioxide emissions due to fossil fuels, with the global total expected to reach 36.8 billion metric tons this year, according to the newest edition of the Global Carbon Budget Report.

For the first time this year, the report included another total: how much carbon dioxide was sucked out of the atmosphere by carbon removal technologies. In 2023, carbon removal is expected to total around 10,000 metric tons. 

That’s obviously a lot less, but exactly how much less can be hard to grasp, as Ho points out. “I think humans (myself included) have a hard time with orders of magnitude, like the difference between thousands, millions, and billions,” he told me in an email. 

One solution Ho has come up with is putting things in terms of time. It’s something we intuitively have a handle on, which can make big numbers easier to understand. A thousand seconds is around 17 minutes. A million seconds is about 11 days. A billion seconds is nearly 32 years. 

Since time is a bit easier to grasp, when Ho talks about carbon removal, he often invokes the idea of a time machine. “My goal is to help people appreciate the scale of the problem, and put ‘solutions’ into context,” he says. 

Imagine all carbon removal technology as one big time machine, winding the clock back on emissions. If the world is emitting just under 40 billion metric tons of carbon dioxide in a year, how far back in time could this year’s total carbon removal take us? Right now, the answer is somewhere around 10 seconds. 
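Ho’s seconds arithmetic is easy to check yourself. Here is a minimal Python sketch using the emissions and removal totals cited in this piece; the exact “around 10 seconds” figure depends on rounding:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # roughly 31.6 million seconds

annual_emissions_tons = 36.8e9  # 2023 fossil-fuel CO2 emissions (Global Carbon Budget)
annual_removal_tons = 10_000    # 2023 technology-based carbon removal

# What fraction of a year's emissions does a year of removal "rewind"?
fraction_of_year = annual_removal_tons / annual_emissions_tons
seconds_rewound = fraction_of_year * SECONDS_PER_YEAR
print(f"{seconds_rewound:.1f} seconds")  # a bit under 10 seconds at current rates
```

The same division also yields the “less than one-millionth” comparison quoted elsewhere in this newsletter.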

We eventually need to reach net-zero emissions if we’re going to avoid the worst effects of climate change. And it’s clear that 10 seconds is a far cry from enough to zero out a year’s worth of emissions. There are two things we’d need to do for this time machine to be more effective: scale up carbon removal technology, and drastically scale back emissions. 

It’ll take time, and likely a lot of it, to get carbon removal technology to a point where it’s a more effective time machine. There are technical, logistical, and economic challenges to figure out. And early projects, like the Climeworks direct-air-capture plant in Iceland, are still getting their footing.

“It’s going to take many years to make significant progress, so we should start now,” Ho says. And while we figure all that out, it’s a good time to focus on decarbonization, he adds. Slashing our emissions is possible with tools we already have on the table. Doing so will make it a bit more feasible for carbon removal technologies to eventually play a significant role in cleaning up our emissions. 

If you’re curious to learn more, including how big a dent larger projects might make, check out David Ho’s article from earlier this year in Nature. You can also take a look back at some of our recent coverage of carbon removal below. 

Related reading

Carbon removal tech is vacuuming up significantly less than one-millionth of our fossil-fuel emissions. Get all the details in my latest story.

Startup Climeworks has been one of the major actors in putting direct air capture on the map. We put the company on our list of 15 businesses to watch in climate tech this year.

The US Department of Energy is committing big money to carbon removal. Earlier this year, the agency announced over $1 billion in funding for the technology, as my colleague James Temple covered.

Another thing

Around a decade ago, a huge wave of startups working on energy and climate-related technologies failed. This surge and crash in what’s often called cleantech 1.0 holds many lessons for innovators today. 

Now, as interest and funding in climate and energy technology companies again is surging, what should we take away from the previous generation of startups? My colleague David Rotman took a careful look for his latest story. Give it a read!

Keeping up with climate  

The University of California system is basically done with carbon offsets. While paying to balance out your own emissions sounds like a good deal, there are a host of problems with the practice. (MIT Technology Review)

Generating an image using AI can require as much energy as fully charging a smartphone. Smaller models doing other tasks (like generating text) can be significantly less energy intensive. (MIT Technology Review)

COP28 is in full swing. Here’s a quick roundup of a few of the headlines that have caught my eye so far. (If you need a catch-up on what’s happening at the UN climate talks and why fossil fuels are center stage, check out my story from last week here.)

  • The head of the conference has been criticized for his comments about fossil fuels. (Vox)
  • Over 20 countries pledged to triple the world’s nuclear energy by 2050. (Canary Media)
  • Nations committed over $400 million in funding to help vulnerable nations pay for climate damages. These are the first pledges to the loss and damage fund, created at last year’s talks. (NPR)

A rule change in California slashed the value of rooftop solar panels six months ago. New sales are (predictably) down since the change. (Canary Media)

The Salton Sea is a salt lake in California. It contains a fascinating ecosystem, and apparently a whole lot of lithium. There might be 18 million metric tons of the metal under the main lake, the equivalent of nearly 400 million EV batteries. (LA Times)

Congress set aside $7.5 billion for EV chargers. But there hasn’t been a single one installed with the money yet. (Politico)

Fossil-fuel emissions are over a million times greater than carbon removal efforts

Carbon dioxide emissions from fossil fuels are on track to reach a record high by the end of 2023. And a new report shows just how insignificant technologies that pull greenhouse gases out of the atmosphere are by comparison. 

Worldwide, those emissions are projected to reach 36.8 billion metric tons in 2023, a 1.1% increase from 2022 levels, according to this year’s Global Carbon Budget Report, released today. As delegates gather in Dubai for this year’s UN climate summit, a record-setting year for emissions underscores the need to make dramatic changes, and quickly. 

“There has been great progress in reducing emissions in some countries—however, it just isn’t good enough. We’re drastically off course,” Mike O’Sullivan, a lecturer at the University of Exeter and one of the authors of the report, said via email. 

Europe’s emissions dropped around 7% from last year, while the US saw a 3% reduction. But overall, coal, oil, and natural-gas emissions are all still on the rise, and nations including India and China are still seeing emissions growth. Together, those two nations currently account for nearly 40% of global fossil-fuel emissions, though Western nations including the US are still the greatest historical emitters.

“What we want to see is fossil-fuel emissions decreasing, fast,” said David Ho, a climate scientist at the University of Hawaii at Manoa and a science advisor at Carbon Direct, a carbon management company, via email. 

However, one technology sometimes touted as a cure-all for the emissions problem has severe limitations, according to the new report: carbon dioxide removal. Carbon removal technologies suck greenhouse gases out of the atmosphere to prevent them from further warming the planet. The UN panel on climate change has called carbon removal an essential component of plans to reach international climate targets of keeping warming at less than 1.5 °C (2.7 °F) above preindustrial levels. 

The problem is, there’s very little carbon dioxide removal taking place today. Direct air capture and other technological approaches collected and stored only around 10,000 metric tons of carbon dioxide in 2023. 

That means that, in total, emissions from fossil fuels were millions of times higher than carbon removal levels this year. That ratio shows that it’s “infeasible” for carbon removal technologies to balance out emissions, O’Sullivan says: “We cannot offset our way out of this problem.”

The report also had bad news about nature-based approaches. Efforts to pull carbon out of the atmosphere with methods like reforestation and afforestation (in other words, planting trees) accounted for more emissions removed from the atmosphere than their technological counterparts. However, even those efforts are still being canceled out by current rates of deforestation and other land-use changes.

“The only way to solve this crisis is with major changes to the fossil-fuel industry,” O’Sullivan says. Technologies like carbon removal “only become important if emissions are drastically cut as well.”

There are many tools available to start making more progress on emissions in the near term, as a UN climate report released earlier this year laid out: deploying renewables like wind and solar, preventing deforestation and cutting methane leaks, and increasing energy efficiency are all among the low-cost solutions that could cut emissions in half by 2030.  

Ultimately, carbon removal could also be part of the answer, but there’s a lot of work left to do, Ho says. Now is a good time to study and develop carbon removal technologies, figure out the risks and benefits of different approaches, and determine which ones can be scaled up while avoiding ecological and environmental-justice issues, he adds. 

None of that is likely to happen fast enough to achieve the progress needed on emissions cuts this decade. In the Global Carbon Budget report, researchers estimate how close we are to sailing past climate limits: about 275 billion metric tons of carbon dioxide remain to be emitted before we exceed 1.5 °C (2.7 °F) of warming. At this rate, the world is on track to blow that budget within about seven years, around the end of the decade. 
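The budget math behind that roughly-seven-years figure is a single division; a quick sketch using the report’s numbers:

```python
# Figures from the Global Carbon Budget report cited above
remaining_budget_tons = 275e9   # CO2 left before exceeding 1.5 °C of warming
annual_emissions_tons = 36.8e9  # projected fossil-fuel CO2 emissions for 2023

# Years left if emissions hold steady at the current rate
years_left = remaining_budget_tons / annual_emissions_tons
print(f"{years_left:.1f} years")  # about 7.5 years
```

If emissions keep rising rather than holding steady, the budget runs out even sooner.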

“We have agency, and nothing is inevitable,” O’Sullivan says. “The world will change and is changing—we just need to speed up.”

The University of California has all but dropped carbon offsets—and thinks you should, too

In the fall of 2018, the University of California (UC) tasked a team of researchers with identifying tree planting or similar projects from which it could confidently purchase carbon offsets that would reliably cancel out greenhouse gas emissions across its campuses. 

The researchers found next to nothing.

“We took a look across the whole market and did deeper dives into project types we thought were more promising,” says Barbara Haya, director of the Berkeley Carbon Trading Project, housed within UC Berkeley’s Center for Environmental Public Policy, who led the effort. “And we came up almost empty.”

The findings helped prompt the entire university system to radically rethink its sustainability plans. In July, UC announced it would nearly eliminate the use of third-party offsets, charge each of its universities a carbon fee for ongoing pollution, and focus on directly cutting emissions across its campuses and health facilities. 

Now the researchers are sharing the lessons they learned over the course of the project, in the hopes of helping other universities and organizations consider what role, if any, offsets should play in sustainability strategies, MIT Technology Review can report. On November 30, they will launch a website highlighting the array of problems they found, the strict standards they helped set for UC’s offset purchases, and the methods they developed for scrutinizing projects in voluntary carbon markets. 

The University of California is a huge and influential public research system encompassing three national labs and 10 campuses, including UC Berkeley, UC San Francisco, and UCLA. Its commitment to replacing natural gas plants and other polluting infrastructure across the state highlights a model that other universities, organizations, and even cities could and should follow, says Holly Buck, an environmental social scientist at the University at Buffalo. And the fact that it has taken such a strong stance on offsets marks another blow to battered carbon markets.

The basic promise of offsets is that individuals or organizations can balance out their own greenhouse gas pollution by paying others to grow trees, halt logging, or take other steps that may reduce emissions or pull carbon dioxide out of the atmosphere. But a mounting body of studies and investigative reports has found that these projects can dramatically exaggerate the climate benefits in a variety of ways, often amounting to little more than greenwashing. 

The growing criticism is taking a toll. Recent data shows that demand for offsets is falling, as are prices for futures contracts (commitments to buy offsets at a set price at a later date), as some companies rethink their reliance on them. But many corporations and nations alike continue to bank heavily on the promise of offsets. Indeed, the subject will be a hot point of debate at the COP28 climate conference that kicks off November 30 in Dubai, where national negotiators will haggle over the standards for a UN-run global carbon trading market.

Haya, who has highlighted issues with offsets for two decades, says she sees three main takeaways from the research project, which she lists in order of priority: Don’t buy carbon offsets; focus on cutting emissions instead. If you must use offsets, create your own. If you can’t create your own, scrutinize the options in the marketplace very carefully and commit to only buy trustworthy ones.

But that third option “is a hard path to take,” she says, “just because of the poor quality on the market today.”

Direct cuts

In 2013, the University of California pledged to achieve carbon neutrality across its campuses and health centers within 12 years by shifting to emissions-free vehicles, building renewables projects, and undertaking similar efforts. But reaching that goal would have also required significant purchases of offsets through carbon markets. 

Students, faculty, and campus budget officers raised concerns about the institution’s plan to rely on and invest so heavily in such an unreliable climate tool. In response, the UC’s Carbon Neutrality Initiative set up the UC Carbon Abatement Committee, which worked with staff, students, and faculty from each campus to establish the institution’s purchasing standards and to identify the types of projects that could meet them. The initiative also provided funding for a dedicated research effort, led by Haya, exploring these questions.

But finding projects that met even the basic standards of reliability proved so difficult that the researchers ultimately drew a larger lesson from the work, says Camille Kirk, who was previously the director of sustainability at UC Davis and co-directed the research effort along with staff at the UC Office of the President.

“You can’t buy your way out of this,” says Kirk, now head of sustainability at the J. Paul Getty Trust, one of the world’s richest arts institutions. “Ultimately, it’s just better if you invest in yourself, invest in your infrastructure, and do the direct work on decarbonization.” 

That philosophy is, more or less, what’s now playing out across the UC system.

Based on the Carbon Abatement Committee’s findings, increasingly pointed criticisms of offsets, and tightening California climate targets for state agencies, UC ultimately opted to rewrite its sustainability plan. 

This summer, the university system dropped its 2025 target, after concluding it would have needed to use offsets to address more than 50% of its emissions reductions. Those purchases would have cost the system $20 million to $30 million annually.

“We were not able to get to a point where we had enough confidence that we could procure the volume of offsets we would require to meet our goal, with offsets that would meet our minimum quality requirements,” says Matt St. Clair, the UC Office of the President’s chief sustainability officer.

UC’s goal now is to clean up its carbon footprint by 2045, almost entirely by directly cutting emissions. The system’s updated policy on sustainable practices notes that every campus will now need to charge itself a $25 carbon fee for every ton of ongoing carbon pollution.

That money must be used to cut greenhouse gas pollution, or to support climate justice or community benefits programs. The carbon price will tick up by 5% each year starting in 2026.
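For illustration, the fee schedule implied by those two numbers compounds like any escalating price. A small sketch, treating 2026 as the first escalation year (an assumption; the policy wording only gives the rate):

```python
base_fee = 25.0  # dollars per metric ton of ongoing carbon pollution
growth = 1.05    # 5% annual increase starting in 2026

# Fee in year 2025 + n, assuming simple annual compounding
fees = {2025 + n: round(base_fee * growth ** n, 2) for n in range(4)}
print(fees)  # {2025: 25.0, 2026: 26.25, 2027: 27.56, 2028: 28.94}
```

Even at this rate, the fee stays well below many estimates of the social cost of carbon, which is part of why UC pairs it with direct decarbonization spending.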

The University of California says it has already cut carbon pollution 30% below 2009 levels, through energy efficiency improvements, the construction of more than 100 on-campus solar projects, and similar steps. It has also set up its own utility to purchase clean electricity from solar, wind, and hydroelectric projects.

Funds from the carbon fee will be used to accelerate these efforts, with a particular focus on replacing on-campus natural gas turbines, which produce 80% of the system’s emissions.

Under UC’s revised plan, offsets can only account for up to 10% of the total reductions by 2045. In addition, any projects must adhere to the strict criteria the committee developed, and they must remove carbon from the atmosphere rather than simply prevent emissions.

One way the university system has opted to control quality is to develop its own offset projects, enabling it to direct university funds to faculty and students while ensuring greater confidence that the projects would meet the institution’s standards and values. Indeed, another goal of the Carbon Abatement Committee was to help kick-start UC-initiated projects, in part to explore and test new approaches. 

In March 2019, UC issued a request for ideas to students and researchers across its campuses. It received 80 proposals and has since provided pilot funding for 12 projects, including: a UC Santa Barbara effort to provide households in rural Rwanda with cookstoves that are cleaner and more efficient than their standard means of cooking, potentially cutting greenhouse gas and indoor air pollution; a UC Davis project designed to reduce methane emissions from rice farming in California’s Central Valley by draining the fields at certain points; and a UCLA effort to convert carbon dioxide captured from power plants or industrial facilities into concrete.

As an outside observer, the University at Buffalo’s Buck says she’s eager to see the results from these pilot projects, noting that rigorous, peer-reviewed studies of such efforts could help improve the field’s understanding of what works.

“It’s pretty well demonstrated that the open market approach isn’t generating that knowledge,” she adds.

Carefully vet

But not every organization has the reach and resources to build its own projects, and even UC may not be able to clean up all its lingering emissions through such efforts.

In its effort to identify more reliable project types, the UC research group formalized an approach that it calls “over/under crediting analyses.”

Here’s how it works: Methods for estimating climate benefits of projects will sometimes exaggerate and sometimes discount them. In practice, though, it’s far more often the former—and there are a common set of ways in which the problems frequently arise.

Take forestry offsets. Researchers have shown that the methods for awarding carbon credits often overestimate the levels of logging that would have occurred without the programs, as when conservation groups earn and sell carbon credits for preventing logging in forests they’ve already pledged to preserve.

Programs can also discount the degree to which a timber company may increase harvesting to make up for the supply-demand gaps when another landowner commits to halting logging for carbon credits. Or added carbon gains may simply not last long enough to matter much from a climate perspective, as when wildfires blaze through project areas.

The UC Berkeley group analyzes offset project types with those known problems and others in mind, then strives to calculate whether the offset program’s methods will undercount actual carbon benefits enough to more than make up for any likely overcounting. If the projects pass that test, the results must then be reviewed by at least two independent researchers or go through a formal peer review process at a scientific journal.
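The core of that test can be sketched in a few lines. The function name and the numbers below are purely illustrative assumptions, not the UC Berkeley group’s actual model:

```python
def passes_over_under_test(overcounting_tons, undercounting_tons):
    """Return True if a protocol's expected undercounting of carbon
    benefits more than makes up for its likely overcounting, so that
    credited tons are a conservative estimate of real climate benefit."""
    return sum(undercounting_tons) >= sum(overcounting_tons)

# Illustrative estimates for a hypothetical forestry protocol:
result = passes_over_under_test(
    overcounting_tons=[30_000, 15_000],  # e.g. inflated baselines, leakage
    undercounting_tons=[20_000],         # e.g. uncredited carbon gains
)
print(result)  # False: likely overcounting exceeds undercounting
```

In the group’s actual process, a passing result is only the first hurdle; the analysis must then survive independent review, as described above.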

In September the largest certifier of carbon offsets, Verra, provided a point-by-point response to a related study by Haya and colleagues that found four of its most widely used methods for forestry offsets dramatically overestimated the carbon benefits.

Verra stressed that it has spent the last two years updating its methods in ways that address most of the concerns and recommendations. The organization added it is committed to transparency and welcomes academic scrutiny. 

Passing the test

If offsets are so often so bad, why bother with them at all? Why would UC even use them to address up to 10% of its climate pollution in the end?

One argument is that some sources of emissions will continue to be difficult to eliminate directly for a long time to come, like those from air travel and cattle digestion. Offsets may create a mechanism for funding projects that do counteract such pollution, and even provide other important societal or ecological benefits, when they are done right.

Take the example of cookstoves. Using the over/under analysis, UC Berkeley researchers found the existing methods for evaluating the impact of giving households cleaner cooking devices exaggerate the climate benefits between six and nine times, on average. But they also noted that if the programs are carried out carefully, with conservative assumptions and stoves that run on very low-polluting fuels, they could both cut greenhouse gas emissions and help save some of the millions of people who die annually from household air pollution.

Haya hopes their work will encourage organizations that manage offset programs and the regulators who oversee them to embrace the sorts of assessment methods they have developed. After all, amid the widening criticism, it’s imperative that these groups transform their approach if they hope to restore faith in the market, she says.

But given the long track record of problems, she argues, it’s better at this point for universities and other potential buyers to spend their money on cutting emissions instead—and to think of purchasing offsets as an act of charity that might do some added good in the world.

“See it as a donation, as a contribution, but not as a quantified, certified ton of emissions reductions,” she says. “We need to move away from the whole idea of offsetting. You can’t fly and drive and burn fossil fuels, and then pay someone else to do something and say you didn’t have an impact.”

Four ways AI is making the power grid faster and more resilient

The power grid is growing increasingly complex as more renewable energy sources come online. Where once a small number of large power plants supplied most homes at a consistent flow, now millions of solar panels generate variable electricity. Increasingly unpredictable weather adds to the challenge of balancing demand with supply. To manage the chaos, grid operators are increasingly turning to artificial intelligence. 

AI’s ability to learn from large amounts of data and respond to complex scenarios makes it particularly well suited to the task of keeping the grid stable, and a growing number of software companies are bringing AI products to the notoriously slow-moving energy industry. 

The US Department of Energy has recognized this trend, recently awarding $3 billion in grants to various “smart grid” projects that include AI-related initiatives.

The excitement about AI in the energy sector is palpable. Some are already speculating about the possibility of a fully automated grid where, in theory, no humans would be needed to make everyday decisions. 

But that prospect remains far off; for now, the promise lies in the potential for AI to help humans, providing real-time insights for better grid management. Here are four of the ways that AI is already changing how grid operators do their work.

1. Faster and better decision-making

The power grid system is often described as the most complex machine ever built. Because the grid is so vast, it is impossible for any one person to fully grasp everything happening within it at a given moment, let alone predict what will happen later.

Feng Qiu, a scientist at Argonne National Laboratory, a federally funded research institute, explains that AI aids the grid in three key ways: by helping operators to understand current conditions, make better decisions, and predict potential problems. 

Qiu has spent years researching how machine learning can improve grid operations. In 2019, his team partnered with Midcontinent Independent System Operator (MISO), a grid operator serving 15 US states and parts of Canada, to test a machine-learning model meant to optimize the daily planning for a grid comparable in scale to MISO’s expansive network.

Every day, grid system operators like MISO run complex mathematical calculations that predict how much electricity will be needed the next day and try to come up with the most cost-effective way to dispatch that energy. 

The machine-learning model from Qiu’s team showed that this calculation can be done 12 times faster than is possible without AI, reducing the time required from nearly 10 minutes to 60 seconds. Considering that system operators do these calculations multiple times a day, the time savings could be significant.
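The calculation being sped up is, at its core, an optimization: meet tomorrow’s forecast demand at the lowest cost. The stripped-down sketch below uses a simple merit-order rule with an invented generator fleet; real day-ahead models like MISO’s handle thousands of units plus transmission and reliability constraints, and the names and costs here are purely hypothetical:

```python
# Toy version of the daily "economic dispatch" calculation: meet forecast
# demand at lowest cost by filling the cheapest generators first.
# (Invented fleet; real operators solve a far larger constrained problem.)

def dispatch(generators, demand_mw):
    """generators: list of (name, capacity_mw, cost_per_mwh) tuples."""
    plan, remaining = {}, demand_mw
    for name, cap, cost in sorted(generators, key=lambda g: g[2]):
        take = min(cap, remaining)
        if take > 0:
            plan[name] = take
            remaining -= take
    if remaining > 0:
        raise ValueError("demand exceeds available capacity")
    return plan

fleet = [
    ("wind", 300, 0),        # near-zero marginal cost, dispatched first
    ("coal", 500, 30),
    ("gas_peaker", 200, 80),
]
print(dispatch(fleet, 600))  # prints {'wind': 300, 'coal': 300}
```

In work like Qiu’s, machine learning typically accelerates such solves not by replacing the optimization but by predicting good starting points or which constraints will actually bind, so the solver has far less work to do.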

Currently, Qiu’s team is developing a model to forecast power outages by incorporating factors like weather, geography, and even income levels of different neighborhoods. With this data, the model can highlight patterns such as the likelihood of longer and more frequent power outages in low-income areas with poor infrastructure. Better predictions can help prevent outages, expedite disaster response, and minimize suffering when such problems do happen.

2. Tailored approach for every home

AI integration efforts are not limited to research labs. Lunar Energy, a battery and grid-technology startup, uses AI software to help its customers optimize their energy usage and save money. 

“You have this web of millions of devices, and you have to create a system that can take in all the data and make the right decision not only for each individual customer but also for the grid,” says Sam Wevers, Lunar Energy’s head of software. “That’s where the power of AI and machine learning comes in.”

Lunar Energy’s Gridshare software gathers data from tens of thousands of homes, collecting information on energy used to charge electric vehicles, run dishwashers and air conditioners, and more. Combined with weather data, this information feeds a model that creates personalized predictions of individual homes’ energy needs. 

As an example, Wevers describes a scenario where two homes on a street have identically sized solar panels but one home has a tall backyard tree that creates afternoon shade, so its panels generate slightly less energy. This kind of detail would be impossible for any utility company to manually keep track of on a household level, but AI enables these kinds of calculations to be made automatically on a vast scale. 

Services like Gridshare are mainly designed to help individual customers save money and energy. But in the aggregate, they also provide utility companies with clearer behavioral patterns that help them improve energy planning. Capturing such nuances is vital for grid responsiveness.

3. Making EVs work with the grid

While critical for the clean-energy transition, electric vehicles pose a real challenge for the grid. 

John Taggart, cofounder and CTO of WeaveGrid, says EV adoption adds significant energy demand. “The last time they [utility companies] had to handle this kind of growth was when air conditioners first took off,” he says.

EV adoption also tends to cluster around certain cities and neighborhoods, which can overwhelm the local grid. To relieve this burden, San Francisco–based WeaveGrid collaborates with utility companies, automakers, and charging companies to collect and analyze EV charging data. 

By studying charging patterns and duration, WeaveGrid identifies optimal charging times and makes recommendations to customers via text message or app notification about when to charge their vehicles. In some cases, customers grant companies full control to automatically charge or discharge batteries based on grid needs, in exchange for financial incentives like vouchers. This turns the cars themselves into a valuable source of energy storage for the grid. Major utility companies like PG&E, DTE, and Xcel Energy have partnered on the program.

DTE Energy, a Detroit-based utility company that serves southern Michigan, has worked with WeaveGrid to help improve grid planning. The company says it was able to identify 20,000 homes with EVs in its service region and is using this data to calculate long-term load forecasts.

4. Spotting disasters before they hit

Several utility companies have already begun integrating AI into critical operations, particularly inspecting and managing physical infrastructure such as transmission lines and transformers.

For example, overgrown trees are a leading cause of blackouts, because branches can fall on electric wires or spark fires. Traditionally, manual inspection has been the norm, but given the extensive span of transmission lines, this can take several months.

PG&E, covering Northern and Central California, has been using machine learning to accelerate those inspections. By analyzing photographs captured by drones and helicopters, machine-learning models identify areas requiring tree trimming or pinpoint faulty equipment that needs repairs.

Some companies are going even further, and using AI to assess general climate risks. 

Last month Rhizome, a startup based in Washington, DC, launched an AI system that takes utility companies’ historical data on the performance of energy equipment and combines it with global climate models to predict the probability of grid failures resulting from extreme weather events, such as snowstorms or wildfires.

There are dozens of improvements a utility company can make to improve resiliency, but it doesn’t have the time or funding to tackle all of them at once, says Rhizome’s cofounder and CEO, Mish Thadani. With software like this, utility companies can now make smarter decisions on which projects to prioritize.

What’s next for grid operators?

If AI is capable of swiftly making all these decisions, is it possible to simply let it run the grid and send human operators home? Experts say no. 

Several major hurdles remain before we can fully automate the grid. Security poses the greatest concern. 

Qiu explains that right now, there are strict protocols and checks in place to prevent mistakes in critical decisions about issues like how to respond to potential outages or equipment failures. 

“The power grid has to follow a very rigorous physical law,” says Qiu. While great at speeding up well-defined mathematical calculations, AI is not yet foolproof at incorporating the operating constraints and edge cases that come up in the real world. That poses too big a risk for grid operators, whose primary focus is reliability. One wrong decision at the wrong time could cascade into massive blackouts.

Data privacy is another issue. Jeremy Renshaw, a senior technical executive at the Electric Power Research Institute, says it’s crucial to anonymize customer data so that sensitive information, like what times of day people are staying home, is protected. 

AI models also risk perpetuating biases that could disadvantage vulnerable communities. Historically, poor neighborhoods were often the last to get their power restored after blackouts, says Renshaw. Models trained on this data might continue assigning them a lower priority when utilities work to turn the power back on.

To address these potential biases, Renshaw emphasizes the importance of workforce training as companies adopt AI, so staff understand which tasks are and aren’t appropriate for the technology to handle.

“You could probably pound in a screw with a hammer, but if you use the screwdriver, it would probably work a lot better,” he says.

Your guide to talking about climate tech over the holidays

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Ah, the holidays. Time for good food, quality moments with family, and hard questions about climate change … or is that last one just something that happens to me?

I’m a climate reporter, so at parties I’m often peppered with questions about my job, and more broadly about climate change and climate technology. Sometimes these questions can spark a heated conversation, and I have to admit, I often change the subject or sneak away for a cookie. But all these conversations have shown me that a lot of people have heard confusing things about climate change on cable news or the internet or from their friend in book club, and they want to know more. 

With Thanksgiving and other big holidays coming up, you might find yourself in a similar position. So grab some green bean casserole (made with canned green beans, of course) and let’s dig into a few questions about climate technology that might come up. 

Touchy Climate Topic #1: I’ve heard EVs are worse for the environment than regular cars—the power has to come from somewhere, after all. 

In almost every case today, battery-powered vehicles produce fewer emissions than those with internal-combustion engines. The exact size of those differences does depend on where you are in the world, what is powering the electrical grid, and what sort of vehicle you’re driving in the first place. 

Regional differences can be significant, as catalogued in a 2021 study from the International Council on Clean Transportation. In the US and Europe, an electric car will cut emissions by between 60% and 70% relative to a gas-powered one. In places like China and India, where the grid is powered by a higher fraction of fossil fuels like coal, the savings are lower—20% to 35% in India and 35% to 45% in China. 

Vehicle size matters here too. If you really stack the deck, it’s true that some vehicles with batteries in them can wind up being worse for the planet than some vehicles with combustion engines. Take, for instance, the Hummer EV, a monstrosity that is responsible for 341 grams of carbon dioxide per mile driven. That’s more than a Toyota Corolla running on gasoline (269 grams), according to a 2022 analysis by Quartz.
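The basic arithmetic behind comparisons like this: a gas car’s per-mile emissions are grams of CO2 per gallon divided by miles per gallon, while an EV’s operating emissions are grid carbon intensity times energy use per mile. A rough sketch with the EPA’s figure of roughly 8,887 g CO2 per gallon of gasoline; the mpg, grid intensity, and EV efficiency below are illustrative assumptions, and this covers operating emissions only, not the fuller lifecycle accounting behind the Quartz numbers:

```python
# Back-of-envelope operating-emissions comparison (illustrative assumptions;
# does not include manufacturing or other lifecycle emissions).

GAS_G_CO2_PER_GALLON = 8_887   # EPA estimate for burning a gallon of gasoline

def gas_g_per_mile(mpg):
    return GAS_G_CO2_PER_GALLON / mpg

def ev_g_per_mile(grid_g_per_kwh, kwh_per_mile):
    return grid_g_per_kwh * kwh_per_mile

gas_car = gas_g_per_mile(33)          # ~269 g/mile at an assumed 33 mpg
sedan_ev = ev_g_per_mile(390, 0.27)   # ~105 g/mile on an assumed ~390 g/kWh grid

print(f"{gas_car:.0f} vs {sedan_ev:.0f} g CO2 per mile")
```

The same formula also shows why the regional differences above are so large: swap in a coal-heavy grid at, say, 700 g/kWh and the EV’s per-mile figure nearly doubles.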

One crucial point to remember is that there’s a clear path for EVs to keep getting even better in the future. Batteries are getting more efficient. Recycling efforts are underway (more on this later). And grids around the world are seeing more power coming from low-carbon sources like wind, solar, hydro, and nuclear. That all adds up to EVs that will continue to get cleaner over time. 

Touchy Climate Topic #2: What about all that mining for the materials that make clean tech? Isn’t that going to destroy the planet? 

This one is tough, and there’s a lot of complexity when it comes to all the stuff (yes, that’s the technical term) that we need to address climate change. There are very real environmental and human rights issues around mining of all sorts. 

We’ll need to mine a lot in order to build all the technology required to address climate change: about 43 million metric tons of minerals by 2040 in order to be on track for net-zero goals, according to the International Energy Agency.

The volume of mining is even higher if you take into account that some minerals are present in pretty low concentrations. Take copper, for example—a common material used for everything from transmission lines to EV batteries. Getting one ton of copper can require moving over 500 tons of rock, since sites mined today tend to have concentrations of copper below 1%. 
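That 500-ton figure follows directly from ore grade: the rock processed per ton of metal is roughly the reciprocal of the grade. A quick sketch, with illustrative grades (real-world recovery losses push the totals higher):

```python
# How ore grade drives the amount of rock moved: at a 0.2% copper grade,
# one ton of copper means processing roughly 1 / 0.002 = 500 tons of ore.
# (Ignores recovery losses, which increase the real totals.)

def rock_per_ton_metal(grade_fraction):
    return 1.0 / grade_fraction

print(f"{rock_per_ton_metal(0.002):.0f} tons of rock at a 0.2% grade")  # 500
print(f"{rock_per_ton_metal(0.01):.0f} tons of rock at a 1% grade")     # 100
```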

However, even if you take into account all that waste rock, the energy transition is likely going to involve less mining than the fossil-fuel economy does today. The details will depend on how much recycling we can do, as well as how technologies evolve. If you want more details, I’d highly recommend this stellar analysis from Hannah Ritchie for a comparison.

Any mining can be harmful for the environment and for people living near mines. So it’s still worth paying careful attention to how these projects are progressing, and how we can lighten the burden of new technologies. But climate technology isn’t going to create a brand-new level of mining. 

Touchy Climate Topic #3: I heard they’re stacking wind turbine blades, solar panels, and EV batteries in landfills. Isn’t the waste from all this “clean” tech going to be a big problem?

Manufacturers are racing to get more clean energy technologies built and running, which means that in a few decades many will be reaching the end of their useful lives, and we’ll need to figure out what to do with them.

Take solar panels, for example. In 2050, we could see as much as 160 million metric tons of cumulative waste from solar panels. Sounds like a lot—and it is—but there’s a bigger problem. By then we’ll have generated a total of about 1.8 billion metric tons of e-waste, and plastic waste will top 12 billion metric tons. (For other comparisons, check out this Inside Climate News story, and the original article those numbers come from in Nature Physics.)

Overall, waste from climate tech is likely to be a facet of a much more substantial problem. Even so, there are still plenty of good reasons not to just throw old technology into the landfill. Many of the materials needed to make these items are expensive and could be reused to alleviate the need for more mining. 

The good news is, widespread efforts are underway to recycle solar panels, lithium-ion batteries, and even wind turbine blades. So yes, there’s a waste problem looming, but there’s plenty of opportunity to address that now and in the future. 

In the end, if you’re going to talk about climate tech at your holiday meal, remember that some people are more interested in fighting than in having a conversation, so it’s okay to just change the subject sometimes! If you’re looking for something else to talk about, I’d suggest you bring up the fact that crabs have evolved independently so many times there’s a word for the process. (It’s called carcinization.)

Enjoy your conversations about crabs and/or climate tech, and have some mashed potatoes for me!

Related reading

For more on EVs, and specifically the topic of hybrids, check out this story from last year. And for my somewhat conflicted defense of huge EVs, give this one a read.

On mining, I’d recommend this interview my colleague James Temple did with a Department of Energy official on the importance of critical minerals for clean energy. I’ll also point you to this newsletter I wrote earlier this year busting three myths about mining for clean energy. 

And if you’re curious to read about recycling, here are recent stories I’ve written about recycling wind turbine blades, solar panels, and batteries.

Another thing

The power grid is getting more complicated, but AI might be able to help. AI could make the grid faster and more resilient in a range of ways, from allowing operators to make faster decisions to making EVs part of the solution. Check out the latest from my colleague June Kim for more!

Keeping up with climate  

New York has purchased 30,000 heat pumps for public housing units. The appliances could help save energy, cut costs, and address climate change, and these and other trials will be key in finding units that work in rentals, a common barrier for the technology. (The Verge)

In related news, the US Department of Energy just announced $169 million in federal funding for domestic heat pump manufacturing. (Wired)

→ This is how heat pumps work. (MIT Technology Review)

A $100 billion promise from nearly 15 years ago is still having effects on climate negotiations, including the upcoming UN climate talks. (Grist)

How to get more people into EVs? Pay them to turn in their old gas-guzzlers. New programs in Colorado, Vermont, and California are testing out the approach. (Bloomberg)

Pumping water up and down hills can be a cheap and effective way to store energy. But there’s a growing question about where the water for new storage projects will come from. (Inside Climate News)

Electricity supplies are changing around the world, and these charts reveal how. I loved the world map showing where fossil fuels are declining (the US, most of Europe, Japan) and where they’re still growing. (New York Times)

→ Here’s which countries are most responsible for climate change. (MIT Technology Review)

Eat Just, a maker of vegan eggs and lab-grown meat, is in a tricky financial spot. The company has faced lawsuits and difficulties paying its vendors on time, according to a new investigation. (Wired)

The country of Portugal produced more than enough renewable electricity to serve all its customers for six straight days earlier this fall. (Canary Media)

What’s coming next for fusion research

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

We’ve covered the dream of fusion before in this newsletter: the power source could provide consistent energy from widely available fuel without producing radioactive waste. 

But making a fusion power plant a reality will require a huge amount of science and technology progress. Though some milestones have been reached, many are yet to come. At our EmTech MIT event this week, I sat down with Kimberly Budil, director of the Lawrence Livermore National Laboratory (LLNL). 

She was at the center of the science news world last year, when researchers from the national lab achieved what’s called net energy gain, finally demonstrating that fusion reactions can generate more energy than is used to start them up. 

During our conversation on stage, I asked her about this moment for fusion research, where the national labs fit in, and where we go from here. 

The moment

In December 2022, a group of scientists sat in a control room that looked like something out of a space mission. They focused 2 million joules of laser energy onto a target about the size of a pea. Hydrogen fuel inside that target began to compress, releasing energy as the atoms inside fused together. 

And this time, more energy came out than what went in—something that had never happened before.

“This was really just a moment of great joy and vindication for all of the thousands of people who have poured their heart and soul into this pursuit over many decades,” Budil told me on stage at the event. 

Many people thought it would never work, she explained—that the lab would never get to the level of precision needed with the lasers, or get the targets perfect enough to house the reaction. “The laser is a miracle, a modern engineering miracle,” she said during her talk. And “the targets are incredible, precision works of art.” 

It’s “very, very hard to make fusion work,” Budil told me. And the moment researchers achieved net energy didn’t represent the finish line, but one milestone in a series of many still to come. 

The aftermath

After the first successful demonstration of net energy gain, “the first priority was to repeat it,” Budil said. “But the next five shots were duds. They really did not work.” 

It seemed to be mostly a problem with the targets, those tiny fuel pellets that the lasers shoot at. The targets need to be virtually perfect, with no defects. Making one takes around seven months from start to finish. 

It wound up taking around six months to repeat the initial success, but over the summer, the lab achieved the highest energy gain to date. The team achieved net energy gain twice more in October. 

There’s still a lot to learn about fusion, and researchers are trying to do just that with these repeated attempts. On stage, Budil ticked through some of the questions they still had: Could the scientists make changes to the targets? Alter the laser pulse shape? Turn the energy up? 

There’s been steady progress on the science and engineering behind fusion energy for decades, Budil said, but new questions always come up as progress gets made. 

I asked her when she thought this energy source might be ready for prime time. “My best guess is that you could have a demo power plant in 20 years,” she told me. Some startups are making bolder claims than that, predicting a decade or less, “but I think the challenges are much more significant than people realized at the beginning. Plasmas are really complicated,” she said. 

Ultimately, researchers at the national lab won’t be the ones to build a power plant: that’s the role of the private sector, Budil says. But the researchers plan to keep working as part of the growing ecosystem of fusion. 

Budil counsels a bit of patience as researchers around the world work to reach the next big fusion milestone: “The fusion community is definitely known for its irrational exuberance. My job for the last year has been half to get people excited about big science and public science, and the other half is to manage expectations for fusion energy, because it’s going to be very hard.” 

Related reading

The road to this moment in fusion has been a long one. Check out some of our old magazine covers on the topic, from as early as 1972.  

The dream of fusion power isn’t going away, as I wrote in a newsletter earlier this fall.

The first net energy gain in a fusion reactor was a huge moment, but the ultimate application for energy is still many breakthroughs away.

Helion says its first fusion plant will be online as soon as 2028. Experts are skeptical of this and other ambitious timeline announcements, as my colleague James Temple covered earlier this year.

Keeping up with climate  

The US and China have agreed to work together to ramp up renewables and cut emissions. The agreement comes as President Biden and President Xi Jinping meet in person this week. (New York Times)

The first planned small-scale nuclear plant in the US is officially no more. Startup NuScale canceled plans for the project after it failed to line up enough customers willing to pay the rising cost of electricity. (Wired)

→ We were promised smaller nuclear reactors. Where are they? (MIT Technology Review)

A German flow-battery company, CMBlu, just pulled in $100 million in funding. The money is a big boon for a technology that has long struggled to bring the cost savings it’s promised. (Canary Media)

Car dealerships aren’t ready, or in some cases very willing, to sell electric vehicles. That could undermine progress on cleaning up transportation. (Washington Post)

Electrifying heating systems and other appliances in homes could be a major part of cleaning up emissions attributed to buildings. The problem is, renters might have trouble taking advantage of existing incentives for home electrification. (The Verge)

Exxon Mobil is setting up a facility to produce lithium, a key material for the batteries that power EVs. It’s a new foray for the fossil-fuel giant. (New York Times)

A new wave of startups is working to address the threat of wildfires. The field, increasingly termed “firetech,” can help prevent fires, or detect them once they start. (Canary Media)

Companies are racing to set up massive insect farms. The bugs can provide protein for animal feed, in a method that could help cut emissions from agriculture. (Washington Post)

Floods, heat waves, storms, and fires fueled by climate change are getting worse across the US. The hazards will increase unless greenhouse-gas emissions are cut quickly, according to a new report from the US government. (Bloomberg)

I tried lab-grown chicken at a Michelin-starred restaurant

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

The waiter lifted the lid with a flourish. Inside the gold-detailed ceramic container, on a bed of flower petals, rested a small black plate cradling two bits of chicken. Each was coated with a dark crust (a recado negro tempura, I later learned) and topped with edible flowers and leaves.  

A swanky restaurant in San Francisco isn’t my usual haunt for reporting on climate and energy. But I recently paid a visit to Bar Crenn, a Michelin-starred spot and one of two restaurants in the US currently serving up lab-grown meat. The two morsels on the plate in front of me were what I’d come for: a one-ounce sampling of cultivated chicken, made in the lab by startup Upside Foods. 

Small wisps of what looked like smoke rose from the dish mysteriously. I wondered if this was my imagination playing tricks on me, adding to the theatrics of the moment. I later discovered a small reservoir for dry ice inside the cylinder the meat was brought out in. As I pondered my plate, I wondered if this could be a future staple in my diet, or if the whole thing might turn out to be all smoke and mirrors.

Lab to table

Cultivated meat, also called cultured or lab-grown meat, is meat made using animal cells—but not animals themselves. Upside Foods, along with another US-based company called Good Meat, got the green light from regulators earlier this year to begin selling cultivated chicken products to consumers.

Both companies chose to roll out their products first in high-end restaurants. Good Meat, a subsidiary of Eat Just, is serving up its chicken in China Chilcano, a DC spot headed up by chef José Andrés. Upside Foods landed its products in Bar Crenn. 

Neither restaurant could be accused of being cheap, but the placement of these products on a commercial menu is still something of a milestone in affordability for cultivated meat. The world’s first lab-grown burger, served in 2013, cost hundreds of thousands of dollars to make. Upside hasn’t shared how much the chicken on my plate cost to grow and serve, but Bar Crenn sells the dish for $45 on an a la carte menu. 

I ordered a few other items, including a pumpkin tart topped with what appeared to be gilded pumpkin seeds and a grilled oyster dish comprising two oyster bellies with smoked cream and pickled tapioca. (Yes, apparently it’s possible to butcher an oyster.)

Bar Crenn removed most meat from its menu in 2018, a decision attributed to “the impact of factory farming on animals and the planet,” according to the restaurant’s website. It does still serve seafood, though (hence, the oyster bellies).

So Upside’s chicken is the only land-based meat available on the menu. It’s only served on a limited basis, though. Reservations are available once each month for a special Upside Foods night, and they sell out fast.

[Photo: a hand holds a piece of the cultivated chicken up to the camera, showing its texture. Credit: Casey Crownhart]

Tucking in

After I snapped a few photos, it was time to dig in. While we were given silverware, the servers encouraged us to pick up the chicken pieces with our fingers. The flavor was savory, a bit smoky from the burnt chili aioli. It was tough to discern, with all the seasonings, sauces, and greens on top, but I thought I detected a somewhat chicken-ish flavor too. 

More than the taste, I was intrigued by the texture. This is often what I find isn’t quite right in alternative meats—if you’ve ever tried a plant-based burger like the one from Impossible Foods, you might have had the experience of a slightly softer product than one made with conventional meat. I noticed the same thing when I tried a burger made with part plant-based and part cultivated ingredients earlier this year. 

And Upside Foods has taken on a difficult task where texture is concerned, aiming to make not a chicken nugget, burger, or other blended product, but a whole-cut chicken filet. 

Whole-cut meat like chicken breast or steak is made of complicated structures of protein and fat that form as muscles grow and work. That’s hard to replicate, which is why we see so many alternative-meat companies going after things like burgers or chicken nuggets. 

But Upside wanted its first offering to be a lab-grown chicken filet. And in my opinion, the result is at least partway there. Cutting into the Bar Crenn tasting portion showed some fibrous-looking structure. And while the bites I slowly chewed and considered were still softer than a chicken breast, they were definitely more chicken-like than other alternatives I’ve tried.

Washing up

The thing is, just because lab-grown meat has reached a few plates doesn’t mean it’ll make it to everyone anytime soon. 

One of the biggest challenges facing the industry is scaling up production: growing huge amounts of product in massive reactors. Upside has started work to get to these large scales. It has built a pilot facility in California, which it says has the capacity to produce 50,000 pounds of meat per year.

But for the products I tasted, things are much more small-scale right now. The Upside Foods products served at Bar Crenn are grown in small two-liter vessels, according to the company. A recent deep dive about the process from Wired described it as producing meat “almost by hand,” in a labor-intensive set of steps. 

Part of the difficulty is the decision to make a whole-cut product. In a blog post from September, Upside CEO Uma Valeti said, “We know that the whole-cut filet won’t be our first mass-market product.” The company will be working to scale easier-to-produce options over the next several years. So it’s not clear when, if ever, the chicken I tried will be widely available. 

I’ll be talking with Valeti about the road ahead for the company and the rest of the industry in a panel discussion next week at EmTech MIT, our signature event covering technology in business. We’ll also be joined by Shannon Cosentino-Roush, chief strategy officer for Finless Foods, another startup working to bring new versions of meat—in this case tuna—to our plates. 

There’s still time to register to join us on MIT’s campus or online, and we’ve got a special discount for newsletter readers at this link. Hope to see you there! 

Related reading

A green light from regulators is just the beginning. Read more about the milestone and what’s coming next for Upside Foods and Good Meat in this news story from earlier this year.

For more details on my first lab-grown meat tasting, check out this newsletter.

Finally, I took a close look at the data on just how much lab-grown meat could help climate change. It basically all comes down to scale.

Another thing

If you missed the last few editions of this newsletter, you should go back and give them a read! While I was away for a couple of weeks, my colleagues on the climate desk took on some fascinating topics.

June Kim, our editorial fellow, dug into the potential for heat batteries and shared some news from startup Antora Energy in her first appearance in The Spark. And James Temple, our senior editor for energy, took the opportunity to dive into one of his favorite topics, carbon offsets. What are you waiting for? Go read them! 

Keeping up with climate  

This startup took its electric plane from Vermont to Florida. Here’s what it might mean for the future of flight. (New York Times)

→ The runway for battery-powered planes might still be a long one. (MIT Technology Review)

There’s been lots of talk over the last few weeks about a slowdown in EV sales from legacy automakers like Ford and GM. Battery makers are grateful for the reprieve. (E&E News)

Meanwhile, the industry is still waiting for more details on EV tax credits, specifically related to China’s involvement in the supply chain. It’s a niche bit of rule-making that could have massive implications for the affordability of electric vehicles in the US. (Politico)

The US offshore wind industry is facing a moment of reckoning as rising costs and stalled supply chains put projects in jeopardy. (Canary Media)

Climate-change-fueled droughts and rising temperatures are messing with the fish, too. Smallmouth bass could soon wreak havoc on native fish in the Grand Canyon. (High Country News)

I loved this column on 10 controversial food truths from Tamar Haspel. (Washington Post)

→ Number five reminded me of this story that my colleague James Temple wrote a few years ago, which points out that unfortunately, organic farming is actually worse for climate change than the conventional route. (MIT Technology Review)

Hoboken, New Jersey, is something of a success story when it comes to managing flooding. But it’s nearly impossible to prepare for every storm. (New York Times)