The potential of artificial intelligence (AI) and its many applications to profoundly change our lives is not yet fully understood.

However, one major impact of the technology is already clear: AI applications are escalating power consumption in data centers at a time when those facilities need to become more sustainable.

Power-hungry AI applications

Storing and processing the data used to train machine learning models and large language models drives up energy consumption. AI applications also require large amounts of processing power, supplied by GPUs or specialized AI accelerators.

I am sure many of you are familiar with generative AI applications like ChatGPT. Researchers estimated that training GPT-3 consumed 1,287 megawatt-hours of electricity and generated 552 tons of CO2, the equivalent of 123 gasoline-powered passenger vehicles driven for one year.
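As a quick sanity check on that equivalence, the sketch below divides the cited emissions by a typical per-vehicle annual figure. The roughly 4.6 metric tons of CO2 per gasoline passenger vehicle per year is my assumption (a commonly cited EPA average), not a number from the study.

```python
# Back-of-the-envelope check of the GPT-3 training figures cited above.
# Assumption: ~4.6 metric tons of CO2 per gasoline passenger vehicle per year
# (a commonly cited EPA average; not taken from the study itself).

training_energy_mwh = 1_287      # estimated electricity to train GPT-3
training_co2_tons = 552          # estimated CO2 from training GPT-3
co2_per_vehicle_tons = 4.6       # assumed annual emissions per passenger vehicle

vehicle_years = training_co2_tons / co2_per_vehicle_tons
grid_intensity = training_co2_tons * 1_000 / (training_energy_mwh * 1_000)  # kg CO2 per kWh

print(f"~{vehicle_years:.0f} vehicle-years of driving")        # ~120, close to the 123 cited
print(f"~{grid_intensity:.2f} kg CO2 per kWh of electricity")  # ~0.43, a plausible grid mix
```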

We estimate that AI power demand will grow at a compound annual growth rate (CAGR) of 26 to 36 percent, driving total demand to between 13.5 GW and 20 GW by 2028.
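For readers who want to see how the compounding works, here is a minimal sketch of that projection. Only the 26 to 36 percent CAGR range and the 2028 horizon come from the estimate above; the roughly 4.3 GW starting point in 2023 is an assumption chosen for illustration.

```python
# Minimal sketch of a compound-annual-growth projection for AI power demand.
# The 26-36 percent CAGR range and the 2028 horizon come from the text above;
# the ~4.3 GW starting point in 2023 is an assumption for illustration only.

base_year, target_year = 2023, 2028
base_demand_gw = 4.3                      # assumed starting demand
years = target_year - base_year

for cagr in (0.26, 0.36):
    projected_gw = base_demand_gw * (1 + cagr) ** years
    print(f"CAGR {cagr:.0%}: ~{projected_gw:.1f} GW by {target_year}")
# Prints roughly 13.7 GW at 26% and 20.0 GW at 36%, in line with the 13.5-20 GW range.
```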

According to another estimate, data center power consumption in the US market, which accounts for about 40 percent of the global total, could reach 35 GW by 2030, up from 17 GW in 2022.
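To put that trajectory in perspective, the snippet below backs out the implied annual growth rate from the two endpoints; it uses only the figures quoted above.

```python
# Implied compound annual growth rate for US data center power consumption,
# using only the endpoints quoted above: 17 GW in 2022 to 35 GW in 2030.

start_gw, end_gw = 17, 35
years = 2030 - 2022

implied_cagr = (end_gw / start_gw) ** (1 / years) - 1
print(f"Implied growth rate: {implied_cagr:.1%} per year")   # ~9.4%
```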

So, as operators design and manage data centers, they need to focus on energy-efficient hardware, such as high-efficiency power and cooling systems, and renewable power sources to reduce energy costs and carbon emissions.

White paper revisited: Getting the data center sustainability metrics right

The good news is that data center operators have options to reduce their carbon emissions in line with the UN’s net-zero goals. To help operators implement sustainability strategies, in 2021 Schneider Electric published White Paper 67: A guide to environmental sustainability metrics for data centers, which provides a framework to measure and improve sustainability.

I wrote a blog post about the white paper in 2022, and since then I've had follow-up questions from customers, partners, and the media about best practices. In working through those questions, I concluded that our original framework needed some tweaking based on customer feedback. So, we decided to revise the framework to help data center operators accelerate their path to sustainability.

For instance, customers said the framework was too general, so we made it more data center-specific. We learned that some framework categories had too many overlapping metrics while others had too few, so we revised those metrics.

Additionally, we added server utilization to the Energy category; land use, land use intensity, and noise metrics to the Local Ecosystem category; battery recycling and e-waste to the Waste category; and water replenishment metrics to the Water category.

We sought this feedback from customers and partners to enhance our original framework because it’s important to get the sustainability equation right. As an industry and global community, we must reduce carbon emissions for the planet’s future. Extreme weather events have become more frequent; they were a near-constant this summer.

Unfortunately, these events will become even more frequent unless we take decisive steps to reduce carbon emissions. For data center operators, this means driving efficiencies to reduce power consumption and carbon emissions.

Now is the time for data center decision-makers to implement more aggressive sustainability strategies through high-efficiency hardware and renewable energy sources. Otherwise, the data center industry risks rapidly falling behind in mitigating the adverse consequences of increased power consumption.

So be sure to check out our updated White Paper 67, tell us about your progress toward sustainability, and let us know how we can help.