We started becoming aware of the AI boom in 2022. Sure, before then we had appreciated autocorrect on our smartphones. We liked (for the most part) Amazon’s list of suggestions as we shopped online. Some of us had even started using new platforms to create copyright-free works of art.
What we’re less aware of, however, is that AI technology uses a lot of electricity because it runs on thousands of specialized computer chips. It’s becoming apparent that the AI boom will dramatically increase electricity consumption over the next few years. As a result, the electricity needed to run AI will increase the world’s carbon emissions if the companies behind it draw their power from fossil fuels. If the power comes from renewable sources, the impact is drastically reduced.
This accelerated development should be raising concerns about the electricity consumption and potential environmental impact of AI and data centers.
AI refers to a range of technologies and methods that enable machines to exhibit intelligent behavior, as defined by Alex de Vries, author of an October 2023 study out of the VU Amsterdam School of Business and Economics. Generative AI tools like OpenAI’s ChatGPT and DALL-E use natural language processing to create text, images, or videos. Different phases of AI, such as the training phase versus the inference phase, vary in their electricity consumption.
Examining literature from the field, de Vries concludes that for years, data center electricity consumption had accounted for a relatively stable 1% of global electricity use, excluding cryptocurrency mining. Between 2010 and 2018, global data center electricity consumption grew by only about 6%. In 2022, the data centers that power all of our computing, including Amazon’s cloud and Google’s search engine, used about 1% to 1.3% of the world’s electricity.
“There is increasing apprehension,” de Vries cautions, “that the computational resources necessary to develop and maintain AI models and applications could cause a surge in data centers’ contribution to global electricity consumption.”
Digital innovations like AI are heavily reliant on data centers to process and store huge amounts of information. To mitigate the energy consumption Goliath, the industry needs to build its data centers to be energy efficient at scale, from both a resource and a cost perspective. Yet, as companies hurry to create competitive AI models, it becomes increasingly difficult to get them to consider electricity consumption as they design the next generation of AI hardware and software.
By 2027, AI servers could use between 85 and 134 terawatt-hours (TWh) annually, according to a moderately conservative estimate. More exact figures are difficult to deduce because companies like OpenAI are tight-lipped about details such as how many specialized chips they need to run their software.
Study author de Vries devised a method to estimate electricity consumption using projected sales of Nvidia A100 servers — the hardware estimated to be used by 95% of the AI market.
“Each of these Nvidia servers, they are power-hungry beasts,” de Vries told the New York Times. It could help if customers used the servers at less than 100% capacity, which would lower electricity consumption. Then again, server cooling and other infrastructure would push the total higher.
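The shipment-based approach boils down to back-of-envelope arithmetic: projected server units, times power draw per unit, times hours of operation per year. The sketch below is illustrative only; the unit count and per-server wattages are assumptions chosen to reproduce the article’s 85–134 TWh range, not figures confirmed here.

```python
# Illustrative sketch of a shipment-based electricity estimate.
# The server count and per-server power draws are assumptions,
# picked so the result lands in the article's 85-134 TWh range.
HOURS_PER_YEAR = 8760

def annual_twh(servers: float, kw_per_server: float) -> float:
    """Annual electricity use in terawatt-hours at 100% utilization."""
    kwh = servers * kw_per_server * HOURS_PER_YEAR
    return kwh / 1e9  # 1 TWh = 1e9 kWh

servers_2027 = 1.5e6                   # assumed AI server units by 2027
low = annual_twh(servers_2027, 6.5)    # assumed ~6.5 kW per server
high = annual_twh(servers_2027, 10.2)  # assumed ~10.2 kW per server
print(f"{low:.0f}-{high:.0f} TWh/yr")  # ~85-134 TWh
```

Note that the calculation assumes 100% utilization around the clock; running servers at lower capacity would pull the estimate down, while cooling and other infrastructure overhead would push it back up, as de Vries points out.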
California has taken the first steps toward AI accountability. Two major climate disclosure laws signed by Governor Gavin Newsom will require all large companies to become more transparent about their climate risks and impacts; that means AI-intensive companies like OpenAI and Google will have to divulge their carbon emissions to do business in California. More than 10,000 companies may be affected.
The Securities and Exchange Commission was expected to announce climate disclosure rules for public companies this year, but the initiative stalled after strong opposition from Republicans.
Sustainability Measures to Rein in the AI Boom
As Pankaj Sharma, Executive Vice President of Secure Power Division and Data Center Business at Schneider Electric, explains, “Achieving resiliency and sustainability in data center operations are now both business imperatives.” Sharma notes that just 10–15 years ago, a standard data center had a highly inefficient power usage effectiveness (PUE) of roughly 1.8. “Strong industry innovations like improvements in data center design, cooling, data center management, and uninterruptible power supplies led to an 80% reduction in energy loss, resulting in data centers today which can achieve a PUE of 1.17.” Further innovations, such as liquid cooling and data center grid interaction, can continue to enhance sustainability measures.
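The cited 80% figure follows directly from the two PUE values. PUE is total facility energy divided by IT equipment energy, so for every kWh of IT load, (PUE − 1) kWh is lost to cooling, power conversion, and other overhead. A quick check:

```python
# Back-of-envelope check on the PUE improvement Sharma describes.
# PUE = total facility energy / IT equipment energy, so every kWh of
# IT load carries (PUE - 1) kWh of overhead (cooling, conversion, etc.).
pue_then, pue_now = 1.8, 1.17  # figures quoted in the article

overhead_then = pue_then - 1   # 0.80 kWh of loss per kWh of IT load
overhead_now = pue_now - 1     # 0.17 kWh of loss per kWh of IT load

reduction = (overhead_then - overhead_now) / overhead_then
print(f"Energy-loss reduction: {reduction:.0%}")  # ~79%, i.e. the ~80% cited
```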
Schneider Electric is walking the talk with its June 2023 publication, “A Guide to Environmental Sustainability Metrics for Data Centers.” The document outlines how many companies are now reporting on sustainability as a supplement to financial reporting in order to demonstrate their commitment to Environmental, Social, and Governance (ESG) programs. For companies in the data center industry or wishing to report on their data center operations, Schneider Electric has proposed five categories of environmental sustainability reporting metrics.
- Energy: The projected future growth of total data center energy consumption, combined with growing distributed renewable energy supply, requires that data center operators have a better understanding of their energy sources. Reporting energy consumption, energy efficiency, and renewable energy use is important for data center operators to show their progress on efforts to minimize their carbon footprint.
- Greenhouse gas emissions: CO2 and other gases such as CH4, PFCs, and HFCs are classified as greenhouse gases (GHGs), also referred to as “carbon emissions.” There are three categories of GHG emissions: Scope 1, Scope 2, and Scope 3. Organizations can reduce their Scope 2 emissions by increasing their share of renewable energy. Replacing fossil fuel-based energy with renewables should be a key component of carbon-neutral strategies for data centers’ energy consumption.
- Water: Water shortages are becoming a serious problem in many regions, so it’s important for data centers to understand and decrease water usage. There are different types of technologies (e.g., dry cooler with adiabatic evaporation, liquid cooling) that are being implemented to reduce direct water usage.
- Waste: Data centers are challenged with a unique waste profile compared to other industrial operations. In order to meet circularity targets, data center operators need to understand their waste profile (especially E-waste and batteries) through tailored data center metrics. Circular economy design methodologies and processes support improvements in this area. Reporting waste generation and diversion is emerging in importance for data center operators and is likely to become commonplace in the near future.
- Local ecosystem: Data centers have direct and indirect impact on the local ecosystem including land, sound level, and species. Reporting the impact on the local ecosystem is also emerging in importance for data center operators and likely to become commonplace in the near future.