
No, Machine Learning Does Not Have A Huge Carbon Debt

The increase in CPU cycles to advance machine learning has been accompanied by advances in efficiencies of computer technology and lower carbon electricity, but it’s worth paying attention to. It’s only going to increase. 

As part of the CleanTechnica series on the use of machine learning in advancing our low-carbon future, it would be remiss not to point out machine learning's own carbon debt. However, in my estimation, it's not as bad as was reported earlier this year.

Let’s talk about the study itself, and the assumptions it made. The paper that made some headlines was Energy and Policy Considerations for Deep Learning in NLP by Strubell, Ganesh, and McCallum of the University of Massachusetts Amherst, and it was published in June of 2019. Strubell and McCallum are part of the team that built a state-of-the-art natural language processing model, LISA. That stands for linguistically informed self-attention, and as followers of the series will remember, attention is core to machine learning.

Some of the CO2e emissions figures provided were quite large. One model, an advanced translation model called the Evolved Transformer, optimized via neural architecture search (NAS), had a calculated carbon debt of 626,155 lbs of CO2e to train and optimize. Roughly 300 tons of CO2e is quite a bit, but some context is required, and then a recalculation with likely better assumptions.
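As a quick sanity check on the headline number, here is the unit conversion behind that "roughly 300 tons" figure (just arithmetic, no assumptions beyond standard conversion factors):

```python
# Convert the paper's reported training emissions from pounds to tons.
LBS_CO2E = 626_155  # NAS Evolved Transformer, per Strubell et al.

short_tons = LBS_CO2E / 2_000       # US short tons (2,000 lbs each)
metric_tons = LBS_CO2E / 2_204.62   # metric tonnes (2,204.62 lbs each)

print(f"{short_tons:.0f} short tons, {metric_tons:.0f} metric tons")
# roughly 313 short tons, or 284 tonnes -- i.e., "roughly 300 tons"
```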

As a reminder, neural nets are trained occasionally but executed many times. Take Tesla's machine learning model: there are over 500,000 cars on the road with its neural net chips, and Tesla's Autopilot and Autosteer features are used by vastly more people than any competitor's. As a result, when thinking about the carbon debt of training a neural net, we have to weigh it against the number of times the net is actually executed and for what purpose. Given that each Tesla displaces an internal combustion vehicle, and that the cars are actually more efficient when using autonomous features, this is a highly virtuous use of machine learning.

As a different example, an earlier article in the series looked at the CoastalDem machine learning model. That use of machine learning took North American satellite radar coastal elevation data, trained it with ground truth from Lidar, validated it against Australian Lidar, and then ran it for the entire world. The model was executed a few times, but the end result is a static dataset of adjusted coastal elevations which is being referenced around the world for policy and climate action planning. In this instance, the understanding of actual threat from climate change and the multiple reuse of the outputs outweighs the carbon debt.

Not all examples are so beneficial, of course. Recently, an article in the series assessed the Heliogen improvements of focusing concentrated solar power (CSP), and found that while the machine learning portion was interesting and potentially reusable in other domains, the end results were very unlikely to be of any value. Certainly, the purported use cases for its higher heat CSP didn’t stand up to scrutiny.

Let’s look at the assumptions made by the researchers next. The key one I tested was the paper’s assumption of 0.954 lbs of CO2e per kWh for model training. That’s the US average, and as I looked at it, I hypothesized that it was likely overstated, given where most deep machine learning efforts are actually being performed.

To that end, I first pulled together the data on current state-by-state CO2e per kWh.

[Bar chart of lbs/kWh CO2e emissions by US state. Chart by author from IEA data.]

As can be seen, the US average conceals a wide variance of potential CO2e debts for compute power. A model which is trained in Washington State on compute resources that are powered off of straight grid electricity would have a tenth of the carbon debt of one trained in Wyoming.

My hypothesis was that many of the models in the report would be California-based. At 0.47 lbs CO2e per kWh, California’s grid carries only about half the carbon debt of the US average.
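Both intensity figures come straight from the numbers above, so the ratio is easy to check:

```python
# Compare California's grid carbon intensity to the paper's US-average
# assumption. Both values are from the article.
US_AVG_LBS_PER_KWH = 0.954  # paper's assumption (US average)
CA_LBS_PER_KWH = 0.47       # California grid

ratio = CA_LBS_PER_KWH / US_AVG_LBS_PER_KWH
print(f"A California-trained model carries {ratio:.0%} "
      f"of the US-average carbon debt")
```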

However, after determining this I then went deeper. I looked at each of the major models with a calculated carbon debt in the paper to see where they were actually trained, assuming that at least one or two would be trained in Google data centers, with Google’s 100% renewable commitment and offsets. The results were substantially at odds with my expectations.

[Table of neural net model training runs from the paper, extended with actual CO2e. Table by author.]

These are the models and their associated training CO2e burdens per the paper. When I dug into the compute resources used, I found that in all but one case the models were trained on Google or Azure compute resources. The 3rd through 6th columns show the variance between what the paper suggested and what was likely accurate. To be clear, the NAS Evolved Transformer model still comes in at 10 tons of CO2e, which is considerable, but also a tiny fraction of the study’s assertion.

Earlier this year, I performed a rough assessment based on publicly available data: What Is The Carbon Debt Of Cloud Computing? It found that of the biggest Cloud providers, Google and Microsoft Azure had the lowest carbon debt by far: not only had both committed to 100% carbon-neutral electricity that they were working to achieve, but both were also purchasing high-quality carbon offsets for their operations. That puts their CO2e per kWh down in the 0.033 lb range, given the full lifecycle emissions of wind, solar, and hydro. Amazon’s AWS wasn’t as good, but it had still achieved 50% renewables for its data centers in 2018, meaning its operations are far below the US average.
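The recalculation itself is straightforward: back out the training energy implied by the paper's figures, then re-price it at the carbon-neutral cloud intensity. A sketch, using only the numbers already cited in this article:

```python
# Back out the training energy from the paper's reported emissions, then
# re-price it at an offset-backed cloud intensity. The 0.033 lbs/kWh is
# the full-lifecycle renewables estimate discussed in the text.
PAPER_LBS_CO2E = 626_155  # NAS Evolved Transformer, per the paper
PAPER_INTENSITY = 0.954   # lbs CO2e/kWh, US average (paper's assumption)
CLOUD_INTENSITY = 0.033   # lbs CO2e/kWh, offset-backed renewables

energy_kwh = PAPER_LBS_CO2E / PAPER_INTENSITY   # implied training energy
revised_lbs = energy_kwh * CLOUD_INTENSITY      # re-priced carbon debt
revised_tonnes = revised_lbs / 2_204.62

print(f"{energy_kwh:,.0f} kWh -> {revised_lbs:,.0f} lbs "
      f"(~{revised_tonnes:.0f} tonnes CO2e)")
# ~656,000 kWh -> ~21,700 lbs, i.e. roughly 10 tons rather than 300
```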

The authors of the paper took a different approach to assessing data center loads. They started with a 2017 Greenpeace report on the subject, so the foundation was relatively solid; however, that report doesn’t cite CO2e per kWh at all. Instead, it reports the different mixes of electrical generation actually purchased and provides percentages for each. Unsurprisingly, all the major Cloud providers are buying a lot more low-carbon electricity than the grid average, but also unsurprisingly, they still have to purchase MWh generated from coal and gas. I won’t quibble with Greenpeace’s methodology, but I do find a substantial variance between the bulk purchasing of renewable electricity by Google and Microsoft and the claim that their data centers run in large part on gas- and coal-generated electricity. I suspect that Google and Microsoft are buying sufficient renewable electricity for their operations, but that Greenpeace isn’t choosing to credit them with it.

But that’s not the largest issue with the paper’s assumption. The authors reasoned that since Amazon’s AWS is the most popular Cloud compute platform, and its generation breakdown per Greenpeace was roughly the same as the US breakdown, the US average was appropriate to use. As can be seen from the resulting assessment in the table above, not one of the models assessed was trained on Amazon, so that’s a bit of a problem for the reliability of their results.

To be clear, I’ve taken an average full-lifecycle CO2e for renewables, and assumed that where Google and Microsoft are not directly purchasing renewables, they have purchased offsets to cover the gap; they may also be purchasing offsets covering the low full-lifecycle CO2e of the renewables themselves.

This isn’t to say we should disregard the study.

[Chart showing exponential growth in compute used for machine learning advances. Chart courtesy openai.com.]

Open AI — back to Elon Musk again — published an assessment of the compute required to train machine learning models over the years. What it found is that major advances in machine learning capabilities have required exponential growth in compute cycles, shown as a straight line on this logarithmic chart.

The increase in compute required to advance machine learning has been accompanied by efficiency gains in computing technology and lower-carbon electricity, but the trend is worth paying attention to: it’s only going to increase.

Note: I’ve reached out to the study’s lead author for comment. Should they get back to me, the article will be updated.

 


Written By

is Board Observer and Strategist for Agora Energy Technologies, a CO2-based redox flow startup; a member of the Advisory Board of ELECTRON Aviation, an electric aviation startup; Chief Strategist at TFIE Strategy; and co-founder of distnc technologies. He spends his time projecting scenarios for decarbonization 40-80 years into the future, and assisting executives, Boards and investors to pick wisely today. Whether it's refueling aviation, grid storage, vehicle-to-grid, or hydrogen demand, his work is based on fundamentals of physics, economics and human nature, and informed by the decarbonization requirements and innovations of multiple domains. His leadership positions in North America, Asia and Latin America enhanced his global point of view. He publishes regularly in multiple outlets on innovation, business, technology and policy. He is available for Board, strategy advisor and speaking engagements.

Copyright © 2021 CleanTechnica. The content produced by this site is for entertainment purposes only. Opinions and comments published on this site may not be sanctioned by and do not necessarily represent the views of CleanTechnica, its owners, sponsors, affiliates, or subsidiaries.