We live on a blue planet, yet only one percent of the world’s water is usable. Growing populations, intensive urbanisation and industrialisation are putting strain on this one percent like never before.

[Image: Earth and Moon – Arek Socha, Pixabay]

In 2015, the World Economic Forum in Davos listed 'water crises' for the first time as the world’s leading threat, and this is not a crisis limited to developing countries. Water is essential to achieving climate targets because climate change will affect the availability, quality and quantity of water for basic needs, as well as for industry and agriculture.

So as an industry we cannot ignore this.

Take the US, for example. Data centers throughout the United States consumed a combined total of 660 billion litres of water in 2020, according to the US Department of Energy. In 2018, Google used 15.79 billion litres of water [1]. Some went to offices, but most was consumed by its global fleet of data centers. Microsoft, meanwhile, used 3.5 billion litres in 2018, with most of that also going to its data centers [2].

A 1MW data center using traditional cooling can use around 25 million litres of water per year (Heslin, 2016). Yet less than one-third of data center operators track any water metrics, and water conservation ranks as a low priority among them (Heslin, 2016). The data center industry must take its environmental responsibility seriously, including water conservation, and in future water usage effectiveness (WUE) should be considered just as important as power usage effectiveness (PUE).
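To put those numbers in context, WUE (as defined by The Green Grid) divides a site’s annual water consumption by the energy its IT equipment uses, giving litres per kilowatt-hour. A minimal sketch of the arithmetic, assuming a constant, fully utilised 1MW IT load:

```python
# Back-of-envelope WUE (water usage effectiveness) for the figure above.
# WUE = annual site water usage (L) / annual IT energy (kWh), in L/kWh.
# Assumes a constant, fully utilised 1MW IT load, which is an idealisation.

annual_water_litres = 25_000_000        # ~25 million litres/year (Heslin, 2016)
it_load_kw = 1_000                      # 1MW IT load
hours_per_year = 8_760

annual_it_energy_kwh = it_load_kw * hours_per_year  # 8.76 million kWh
wue = annual_water_litres / annual_it_energy_kwh

print(f"WUE ≈ {wue:.2f} L/kWh")         # ≈ 2.85 L/kWh
```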

The evolution of data center cooling

Air-side optimisation

About ten years ago, optimisation of air temperatures emerged as the latest way for data centers to improve efficiency. At the time, many data centers ran at set points of between 20°C and 22°C.

The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) published guidelines for the temperature and humidity operating ranges of IT equipment, updating them in line with technology advancements and with a focus on sustainability.

Its updated guidance not only raised the ‘recommended’ supply temperatures, but also introduced a much wider operating band, the ‘allowable’ range, which represents the actual limits to which IT equipment should be subjected. This gave data center operators more flexibility in temperature setting, with the caveat that outside the recommended range there is an expected impact on reliability, defined as the ‘X factor’.

Running data centers at higher temperatures reduced the cooling requirement and provided more opportunities for free-air cooling. Airedale International pioneered this with chillers specifically designed to take advantage of enhanced free-cooling opportunities, resulting in a lower total cost of ownership for operators.

Adiabatic cooling systems

The rapid growth of the industry over the last ten years left the supply chain struggling to keep up. Operators required far more from their cooling systems than the market could offer: for some time, chiller operating limits and efficiencies prevented the industry from achieving the PUEs it strove for and, simply put, past efficiencies were no longer sufficient.

While chillers remained relatively inflexible, with limited operating envelopes, alternatives such as adiabatic dry air coolers took advantage of elevated supply air temperatures to provide cooling without the need for compressors.

In recent years, air-side optimisation has been built upon with the introduction of adiabatic cooling systems, which incorporate both evaporation and air cooling into a single system. The evaporation of water, usually in the form of a mist or spray, pre-cools the ambient air to within a few degrees of the wet-bulb temperature, allowing cooler and more efficient operation.
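The pre-cooling effect can be approximated with a standard evaporative-cooler effectiveness model; the sketch below is illustrative only, with the effectiveness value and ambient conditions assumed rather than taken from any particular product:

```python
# Approximate air-on temperature after adiabatic pre-cooling, using the
# standard effectiveness model: t_out = t_dry - eff * (t_dry - t_wet).
# The 0.9 effectiveness and the ambient conditions are illustrative assumptions.

def adiabatic_precool(t_dry_c: float, t_wet_c: float, effectiveness: float = 0.9) -> float:
    """Supply air temperature (°C) after evaporative pre-cooling."""
    return t_dry_c - effectiveness * (t_dry_c - t_wet_c)

# Example: a warm day at 30°C dry bulb / 20°C wet bulb.
print(adiabatic_precool(30.0, 20.0))  # 21.0°C, within ~1°C of the wet bulb
```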

The use of spray or mist means water consumption is significantly lower than with more traditional evaporative systems, but a conservative estimate for a modern data center employing an adiabatic cooling system would still be 500,000 litres per MW per annum. As data centers grow larger, this becomes a real concern, particularly in regions where water shortages have been identified as a threat.

Nor is the water usage itself the whole story. The water still has to be stored and treated, which increases capital costs; and, as with any mechanical equipment exposed to continuous water contact, cooling plant suffers increased degradation, putting strain on operating costs (OPEX) too.

Water-side optimisation

Having recognised the need for cooling systems that come close to the efficiencies achievable with adiabatic cooling while treating water conservation more sensitively, Airedale developed an innovative approach to data center cooling that takes the philosophy behind air-side optimisation and evolves it further. Airedale calls this water-side optimisation, and it is already proving its worth in many of the world’s leading data centers.

The philosophy of water-side optimisation is to take an optimised air environment and ask what other variables can be adjusted to deliver more free cooling. Assuming the air within the data center white space stays at the same temperature, the next step was to reduce the approach temperature whilst widening the difference between water supply and water return temperatures.

Implementing innovations within the plant equipment means the supply and return air temperatures remain as before, but the supply and return water temperatures are higher, and thus the approach temperature is reduced. We see a fixed temperature difference of 12°C on the air side, with the fluid side opened out to a matching 12°C and the approach temperature closing from 6°C to 4°C.
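To see why the tighter approach matters: a free-cooling coil can only bring the water down to its supply setpoint while the ambient air is at least the approach temperature below that setpoint, so higher water temperatures and a smaller approach both raise the ambient limit for compressor-free operation. A hypothetical sketch (all temperatures are illustrative assumptions, not Airedale design figures):

```python
# Ambient limit for full free cooling: the free-cooling coil can only
# cool the water to its supply setpoint while ambient air is at least
# the approach temperature below it. Temperatures are illustrative.

def full_free_cooling_limit(water_supply_c: float, approach_c: float) -> float:
    """Highest ambient dry bulb (°C) at which free cooling alone suffices."""
    return water_supply_c - approach_c

# Conventional design: 10°C water supply, 6°C approach.
print(full_free_cooling_limit(10.0, 6.0))   # 4.0°C ambient limit

# Water-side optimised: 20°C water supply, 4°C approach.
print(full_free_cooling_limit(20.0, 4.0))   # 16.0°C ambient limit
```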

We achieve this via physical changes in the equipment and environment in a number of ways:

  • Higher water temperatures, meaning less mechanical cooling;
  • Simplified air paths;
  • Ducted hot air return;
  • CRAC/CRAH with maximised coil surface area;
  • Free-cooling coils with maximised surface area;
  • Free-cooling chiller/dry air cooler (DAC);
  • Smart controls with dynamic operation.

Free-cooling chillers are matched to large surface area chilled water coils in either indoor CRAC units or fan walls. The air path is simplified using hot aisle containment, creating a pressure differential that draws cool air through the servers, out of the white space via ducts, and back to the air conditioning plant through a common plenum. The air is introduced directly to the space via side wall diffusion, minimising air-side pressure drops. This is all managed by an intelligent controls platform that monitors fluctuating demand within the white space and dynamically operates the system at its most efficient operating point.
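As an illustration of the kind of decision such a platform makes, the sketch below selects a cooling mode from ambient temperature; the thresholds and mode names are hypothetical, not the actual Airedale controls logic:

```python
# Hypothetical cooling-mode selection for a free-cooling chiller.
# Thresholds and mode names are illustrative assumptions only.

FULL_FREE_BELOW_C = 16.0    # at or below this ambient, free cooling alone suffices
CONCURRENT_BELOW_C = 28.0   # between the two, free cooling pre-cools, compressors trim

def cooling_mode(ambient_c: float) -> str:
    if ambient_c <= FULL_FREE_BELOW_C:
        return "free"        # compressors off
    if ambient_c <= CONCURRENT_BELOW_C:
        return "concurrent"  # free cooling plus partial mechanical
    return "mechanical"      # full compressor operation

for t in (5.0, 20.0, 32.0):
    print(t, cooling_mode(t))  # free, concurrent, mechanical
```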

The benefits of this are:

  • Lower waterside pressure drop;
  • Smaller pipework;
  • Wider free-cooling band;
  • No need for adiabatic spray & water storage;
  • Less mechanical cooling meaning more efficient chiller operation;
  • Lower fan speeds meaning more efficient indoor unit operation;
  • Lower pump power meaning more efficient water transfer;
  • Larger coil surface area delivers more cooling for less footprint (more cooling capacity per square metre).

Based on average temperatures for London, an extra 2°C of free-cooling headroom creates many more hours of free cooling. Water-side optimisation achieves 14 percent more free cooling (59 percent of the year in total), with all but one percent of the remaining hours covered by concurrent cooling (a combination of free and mechanical cooling), giving huge benefits in terms of chiller efficiency.

This system could provide free cooling for more than half of the year in all of Europe’s major data center hubs (London, Frankfurt, Amsterdam, Paris, Dublin).
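Claims like these can be checked in principle by binning a year of hourly ambient data against a system’s free-cooling threshold. A minimal sketch, where the sample list stands in for a real weather file (8,760 hourly dry-bulb readings) and the 16°C threshold is an assumption carried over from the sketches above:

```python
# Share of hours eligible for full free cooling, from hourly ambient data.
# `sample` stands in for a real annual weather file (8,760 readings);
# the 16°C threshold is an illustrative assumption.

def free_cooling_share(hourly_temps_c: list[float], threshold_c: float = 16.0) -> float:
    free_hours = sum(1 for t in hourly_temps_c if t <= threshold_c)
    return free_hours / len(hourly_temps_c)

sample = [4.0, 9.5, 14.0, 18.5, 23.0, 31.0]  # stand-in sample data
print(f"{free_cooling_share(sample):.0%} of sampled hours in full free cooling")
```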

References:

  1. Google Environmental Report 2019 [PDF]
  2. Microsoft 2019 Data Factsheet: Environmental Indicators [PDF]