It is no secret that we are becoming increasingly data-dependent as a society. In the space of just ten years, the amount of data being processed each year has increased more than tenfold, from just nine zettabytes in 2013 all the way up to 120 zettabytes in 2023. This is forecast to reach 147 zettabytes in 2024, before again jumping to 181 zettabytes in 2025, indicating a singular direction of travel.

Naturally, increased digital demand has a very real physical impact, with the world’s data centers now becoming more resource-hungry than ever before. The industry, to its credit, is in full recognition of this fact, and has long been at the forefront of driving sustainable practice in order to ensure that the environmental impact of this growth remains minimal.

This has manifested in the form of numerous innovations, including high-efficiency servers, liquid cooling systems, waste heat recovery systems, and more. Among all of these developments, there remains a common theme – energy efficiency. Like many industries, the data center sector has pursued this above all, with the crucial Power Usage Effectiveness (PUE) value regarded as the chief metric for measuring this.

However, it is important to recognize that it is not only the volume of data that is increasing, but also the way that data is used that is changing. This brings us back to a topic that currently seems inescapable – artificial intelligence.

AI acceleration

As any data center professional knows well, increasingly intense computational workloads have greater cooling requirements, which offers up immediate challenges where AI is concerned.

An article from The Telegraph earlier this year details the full extent of this problem, using the world’s most popular chatbot, ChatGPT, as a case study. Each command given to the chatbot causes it to ‘drink’ the equivalent of a sip of water. While this doesn’t sound like much at first, just 20 tasks consume around half a liter of water, demonstrating the rate at which this can escalate from a minor consideration into an environmental challenge.

This impact, while already significant, comes at a point when AI remains in its relative infancy. Projections from the International Monetary Fund state that as much as 40 percent of workers have jobs that will be affected by AI, with this figure rising to 60 percent for the world’s more advanced economies. Here, we can safely assume that both the uptake and the complexity of this technology are set to increase, which will only feed this challenge further.

Exacerbating this challenge, water shortages around the world are also becoming more frequent, with WWF projecting that two-thirds of the world’s population will face water shortages by 2025. While it is important to recognize that AI and data centers are not to blame for this, the bottom line is that their appetite for this resource is skyrocketing at a time when it is scarcer than ever.

Closing the loop

There is no one-size-fits-all solution to this challenge, but a starting point for data centers should be water usage effectiveness (WUE), a metric that should be balanced against the traditionally dominant PUE.
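Both metrics are simple ratios, which makes the balancing act easy to express. As a rough sketch, using made-up figures for a hypothetical site rather than real measurements: PUE divides total facility energy by IT energy, while WUE, as defined by The Green Grid, divides annual site water consumption in liters by IT energy in kWh.

```python
# Illustrative sketch of the two headline efficiency metrics, using
# hypothetical figures for an imaginary site (not real measurements).

def pue(total_facility_energy_kwh: float, it_energy_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy.
    An ideal site scores 1.0; lower is better."""
    return total_facility_energy_kwh / it_energy_kwh

def wue(site_water_liters: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness: liters of water consumed per kWh
    of IT energy. Lower is better."""
    return site_water_liters / it_energy_kwh

# Hypothetical annual figures for one facility:
it_energy = 10_000_000          # kWh delivered to IT equipment
facility_energy = 14_000_000    # kWh drawn by the whole site
water_used = 18_000_000         # liters consumed on site

print(f"PUE: {pue(facility_energy, it_energy):.2f}")   # 1.40
print(f"WUE: {wue(water_used, it_energy):.2f} L/kWh")  # 1.80 L/kWh
```

The point of putting the two side by side is that a design choice that improves one ratio can worsen the other, as the cooling-system examples below show.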

Putting this into practice, data center operators may wish to re-evaluate their choice of cooling system in order to find the optimum solution for the needs of their site. A traditional data center cooling tower, for instance, experiences significant water losses through both evaporation and the need for blowdown, compromising on WUE for the benefit of PUE.

Instead, data centers in particularly water-scarce areas, such as Italy and Spain, may wish to consider a closed-loop cooling system. Here, a chiller forms the primary cooling system: its refrigeration circuit, consisting of a compressor, evaporator, expansion valve, and condenser, chills water that is circulated to multiple Computer Room Air Handlers (CRAHs), each supplying cooled air to the racks.

From here, the ejected warm air is directed to a heat exchanger and cooled again, before being recirculated. Crucially, this method results in little to no water loss, as the system uses a fixed volume of liquid that recirculates. A decentralized energy and cooling specialist can help bridge the gap while the changeover between the two systems takes place, providing vital power and temperature control to keep the site online.

While this may not be the best solution for every site, the key takeaway is that the water consumption of data centers will only increase from this point onwards. As such, it is time to open up the conversation on WUE, and consider this alongside more traditional concerns such as PUE, in order to ensure that the sector continues to grow sustainably.