Energy efficiency is a priority for continuous process improvement in every data center, so it is natural for designers and operators to focus on the functions that consume the most electricity when considering how to improve reliability and reduce costs.

A major “power hog” in any data center is the chilled water system, which typically consumes between 60 and 85 percent of the energy required by the overall cooling process, depending on equipment type and cooling architecture.

If one can improve the efficiency of water chillers, and/or devise architectures that maximise the amount of time chillers spend in economy mode (in which they are bypassed in favour of less power-hungry elements such as dry coolers or cooling towers), one can greatly reduce overall power consumption and therefore operating costs. A further benefit is the improvement in a data center’s PUE (power usage effectiveness) rating that follows from reducing the power consumed by the cooling effort.
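As a rough illustration of the PUE arithmetic (not figures taken from the study), the Python sketch below shows how a cut in cooling power lowers the ratio of total facility power to IT power; the load and overhead values are assumed.

# Hypothetical loads, for illustration only (not figures from the study)
it_load_kw = 1000.0          # IT equipment load
cooling_kw_before = 500.0    # cooling power before optimisation (assumed)
other_overhead_kw = 120.0    # UPS losses, lighting, etc. (assumed)

def pue(it_kw, cooling_kw, overhead_kw):
    # PUE = total facility power / IT power
    return (it_kw + cooling_kw + overhead_kw) / it_kw

cooling_kw_after = cooling_kw_before * (1 - 0.40)   # assume a 40 percent cooling saving

print(round(pue(it_load_kw, cooling_kw_before, other_overhead_kw), 2))  # 1.62
print(round(pue(it_load_kw, cooling_kw_after, other_overhead_kw), 2))   # 1.42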

One tactic, detailed in a recently published Schneider Electric research study, is to allow higher ambient temperatures inside the data center so that the chillers are not required to work as hard as initially planned, and therefore spend more time in economy mode, where free cooling is sufficient to maintain satisfactory temperatures.

Hotter is better

Aquaflair AC air-cooled chiller – Schneider Electric

Indeed, there is a concerted effort by many parties throughout the data center industry to permit higher operating temperatures, with server vendors designing their products to withstand higher temperatures than before, and with vendors of cooling equipment continuously striving to improve the efficiency of their own devices, for example by improving the control and responsiveness of fans and chillers so that the cooling effort better matches the load.

However, as with any complex system, attention must be paid to all elements in the process, because changing the parameters at one stage will inevitably have implications elsewhere.

There are also trade-offs to be made between capital and operating expenses: increasing the number of fans, for example, requires a capital outlay but allows them to operate at lower speeds, thereby reducing energy costs.
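To see why that trade-off can favour more fans, consider the fan affinity laws, under which airflow scales roughly with fan speed and power with the cube of speed. The Python sketch below uses assumed figures, not numbers from the white paper.

# Fan affinity laws (idealised): airflow scales with speed, power with speed cubed
rated_power_kw = 5.0      # assumed power of one fan at full speed
required_airflow = 1.0    # normalised airflow the room needs

def total_fan_power(n_fans):
    speed = required_airflow / n_fans          # each fan slows down as fans are added
    return n_fans * rated_power_kw * speed ** 3

for n in (1, 2, 3):
    print(n, "fan(s):", round(total_fan_power(n), 2), "kW")
# 1 fan(s): 5.0 kW
# 2 fan(s): 1.25 kW  (two fans at half speed use a quarter of the energy)
# 3 fan(s): 0.56 kW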

The research detailed in a new Schneider Electric white paper considered the effects of increased temperature on a data center in a temperate climate (in this case Frankfurt, Germany) with a given cooling architecture based on a very common model: a packaged air-cooled chiller with an economiser mode, a dry cooler for use during economiser mode, and an indoor Computer Room Air Handling (CRAH) system.

By allowing the chilled water (CHW) temperature leaving the chiller to rise in increments from its baseline of 7°C to 17°C, the study found that a 39 percent reduction in energy for the total cooling effort was possible.
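The main reason higher chilled water temperatures save energy is that they widen the window in which free cooling can carry the load. The sketch below illustrates the idea with a made-up outdoor temperature profile and an assumed dry cooler approach temperature; it is not based on Frankfurt weather data or the white paper's model.

# Illustrative only: a higher CHW set point widens the free-cooling window.
# The hourly temperatures are synthetic, not Frankfurt weather data.
import random

random.seed(1)
outdoor_temps_c = [random.gauss(10, 8) for _ in range(8760)]  # made-up hourly dry-bulb temps

dry_cooler_approach_c = 5.0   # assumed gap between outdoor air and achievable water temperature

def economiser_fraction(chw_setpoint_c):
    hours = sum(1 for t in outdoor_temps_c if t <= chw_setpoint_c - dry_cooler_approach_c)
    return hours / len(outdoor_temps_c)

for setpoint_c in (7, 12, 17):
    print(f"CHW {setpoint_c} C: {economiser_fraction(setpoint_c):.0%} of hours on free cooling")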

Further reductions in energy are possible, but they require additional capital investment. For CHW temperatures above 20°C, one must either increase the number of CRAH units inside the data center to boost cooling capacity, or redesign the cooling coils in the CRAH units to compensate for the higher water temperature. The latter option resulted in a total energy saving of 50 percent at a CHW temperature of 21°C, but the cost of redesigning the coil must be offset against the lower energy costs.
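A rough way to see why warmer water demands more coil capacity is to treat each CRAH unit's capacity as roughly proportional to the difference between the return air temperature and the CHW supply temperature. The Python sketch below uses assumed capacities and loads purely for illustration.

# Rough illustration of why warmer chilled water calls for more CRAH coil capacity.
# Treats coil capacity as proportional to (return air temp - CHW supply temp); all values assumed.
import math

return_air_c = 35.0             # air temperature entering the CRAH coil (assumed)
unit_capacity_at_7c_kw = 100.0  # capacity of one CRAH unit with 7 C supply water (assumed)
it_load_kw = 1000.0

def crah_units_needed(chw_supply_c):
    capacity_kw = unit_capacity_at_7c_kw * (return_air_c - chw_supply_c) / (return_air_c - 7.0)
    return math.ceil(it_load_kw / capacity_kw)

for chw_c in (7, 17, 21):
    print(f"CHW {chw_c} C -> about {crah_units_needed(chw_c)} CRAH units")
# 7 C -> 10 units, 17 C -> 16 units, 21 C -> 20 units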

Another approach is to increase the CHW deltaT, the difference between the chilled water supply set point and the temperature of the water returning from the IT space. As the deltaT increases, less water needs to be pumped to remove the same heat load, so the power consumption of chilled water pumps and CRAH unit fans decreases, producing a total energy saving of 26 percent for a CHW deltaT of 10°C.
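The underlying arithmetic is that the required water flow is the heat load divided by the product of the water's specific heat and the deltaT, while pump power falls roughly with the cube of flow. Here is a Python sketch with assumed design values, not the study's figures.

# Illustration: a larger CHW deltaT means less water flow, and therefore less pump power.
heat_load_kw = 1000.0              # heat to be removed from the IT space (assumed)
specific_heat_kj_per_kg_c = 4.19   # specific heat of water
design_delta_t_c = 6.0             # assumed design deltaT
pump_power_at_design_kw = 30.0     # assumed pump power at the design flow

def flow_kg_per_s(delta_t_c):
    return heat_load_kw / (specific_heat_kj_per_kg_c * delta_t_c)

def pump_power_kw(delta_t_c):
    # Affinity-law approximation: power scales with the cube of flow relative to design
    ratio = flow_kg_per_s(delta_t_c) / flow_kg_per_s(design_delta_t_c)
    return pump_power_at_design_kw * ratio ** 3

for dt in (6, 8, 10):
    print(f"deltaT {dt} C: flow {flow_kg_per_s(dt):.1f} kg/s, pump power ~{pump_power_kw(dt):.1f} kW")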

In addition, using adiabatic cooling to reduce the temperature of the air entering the chiller’s condenser cuts the chiller’s energy consumption and increases economiser hours. The approach works best in warm, dry climates that nonetheless have ample water supplies, because adiabatic coolers rely on an evaporative process. The study found that the use of adiabatic cooling produced total energy savings of 64 percent.
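A common way to model adiabatic (evaporative) pre-cooling is to assume the entering air is cooled towards its wet-bulb temperature with some effectiveness factor. The sketch below uses an assumed effectiveness and illustrative temperatures, not values from the study.

# Sketch of adiabatic pre-cooling: air entering the condenser or dry cooler is cooled
# towards its wet-bulb temperature. The effectiveness value is an assumption.
effectiveness = 0.8   # evaporative pads are often quoted at roughly 0.7-0.9

def precooled_air_c(dry_bulb_c, wet_bulb_c):
    return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)

print(round(precooled_air_c(35.0, 20.0), 1))   # warm, dry air: 23.0 C reaches the condenser instead of 35.0 C
print(round(precooled_air_c(30.0, 27.0), 1))   # humid air: only 27.6 C, so far less benefit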

For the Frankfurt data center, the study found that using a cooling system optimised to allow higher CHW temperatures, with additional adiabatic cooling, required a 13 percent increase in capital expenditure but resulted in a 64 percent reduction in the energy used on cooling. The payoff was a 16 percent improvement in PUE and a 16 percent reduction in TCO over three years.
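The shape of that capex-versus-opex calculation can be sketched with entirely hypothetical cost figures; the numbers below are assumptions for illustration and do not reproduce the study's 16 percent TCO result, which covers more than the cooling plant alone.

# Entirely hypothetical cost figures, used only to show the shape of the capex-versus-opex
# comparison; they do not reproduce the study's TCO result.
baseline_capex = 1_000_000.0               # cooling plant capital cost (assumed)
baseline_energy_cost_per_year = 300_000.0  # annual cooling energy cost (assumed)
years = 3

optimised_capex = baseline_capex * 1.13                                       # 13 percent more capital
optimised_energy_cost_per_year = baseline_energy_cost_per_year * (1 - 0.64)   # 64 percent less energy

def three_year_cost(capex, annual_energy_cost):
    return capex + years * annual_energy_cost

base = three_year_cost(baseline_capex, baseline_energy_cost_per_year)
optimised = three_year_cost(optimised_capex, optimised_energy_cost_per_year)
print(f"Change in three-year cooling cost: {(optimised - base) / base:.0%}")   # about -23% with these inputs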

The figures may vary from one data center to another depending on architecture and prevailing climate conditions. A comparison with another data center in Miami, Florida—where the prevailing climate is tropical monsoon, unlike Frankfurt’s temperate zone—also showed significant savings.

In Miami, deploying additional adiabatic cooling along with optimised higher CHW temperatures required an additional capital investment of 13 percent, similar to that for Frankfurt, and produced a 41 percent saving in total cooling energy and a 12 percent reduction in TCO over three years.

Clearly, an investment in infrastructure that permits operation at higher temperatures than before pays off very quickly, with some variation depending on the climatic environment in which the data center is located.

Paul Lin is senior research analyst at the Schneider Electric Data Center Science Center