Liquid cooling is inevitably coming, and data center operators should learn how to capitalize on its possibilities, a DCD online event has heard.

Chip densities will make liquid cooling the only option for powerful data centers, said Operational Intelligence's Robert Tozer in a keynote speech for DCD's Keeping IT Cool event, which addressed the major challenges in data center cooling. To make use of liquid cooling and other technologies, operators need to understand the science and engineering behind removing heat from IT equipment, and make realistic predictions of their needs.

Chip densities drive liquid cooling

Liquid cooling will inevitably come to the fore, said Tozer: "The driving force is chip densities. Businesses require the chip to do things - and if businesses require it, it will happen."

Powerful processors already operate at a high heat density, which demands a large heatsink, and densities will only increase, he said: "Look at the size of the chip compared to the size of the heatsink. As the density goes up, you literally cannot fit the heatsink into your 1U server."

But it won't be a one-size-fits-all movement, and Tozer said operators must find out how the new capabilities match their IT demands: "We hear that liquid cooling releases capex and opex, which is true, but the focus should be on learning and discovering what is coming along. There are loads of questions to be asked," he said, listing issues such as additives and other properties of the coolant. "I suggest you start to do something small scale to look at the options and the things that are required."

Operators need to apply this critical thinking to every other aspect of their systems, said Tozer, including the human element, and the way in which capacity demands will change over time as systems are built in phases.

They also need to understand current systems better: "There are still loads of data centers which are set up wrong," he said. "You need to do a study to see where you are first. Before you start doing things, then you can monitor improvement as you go through the process."

Data center builders should not be over-reliant on single-vendor systems, which may not cover all the options, and should be aware of technological issues which create trade-offs.

For instance, a heat exchanger with a large surface area will be more efficient: "If you double the heat exchange area, you can do free cooling at higher temperatures," he said, but pointed out that this will cost more.
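The trade-off Tozer describes follows from the basic heat-exchanger relation Q = U x A x deltaT: for a fixed heat load, doubling the surface area halves the approach temperature needed, so free cooling remains viable at higher outdoor temperatures. A minimal sketch, with all load, coefficient, and temperature figures assumed purely for illustration:

```python
# Illustrative only: basic heat-exchanger sizing, Q = U * A * dT.
# All numeric values below are assumptions, not figures from the talk.

def required_approach_dt(q_watts: float, u: float, area_m2: float) -> float:
    """Approach temperature difference (K) needed to reject q_watts."""
    return q_watts / (u * area_m2)

Q = 200_000.0    # heat load to reject, W (assumed)
U = 100.0        # overall heat-transfer coefficient, W/(m^2*K) (assumed)
SUPPLY_C = 24.0  # target coolant supply temperature, degC (assumed)

for area in (200.0, 400.0):  # doubling the heat-exchange area
    dt = required_approach_dt(Q, U, area)
    print(f"area={area:.0f} m^2 -> approach dT={dt:.1f} K, "
          f"free cooling viable up to ~{SUPPLY_C - dt:.0f} degC outdoors")
```

The larger exchanger buys a smaller approach temperature, and hence more free-cooling hours per year - at the capital cost Tozer flags.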

Fundamentally, there is still a big need to understand the basic science and engineering, he said: knowing the difference between the impact of air recirculation and that of bypass, and when refrigeration is necessary in a given climate. Psychrometric charts and the equations relating heat and power are important, as are basic elements such as blanking plates.
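One of the simplest such equations is the sensible heat relation Q = mdot x cp x deltaT, which links an IT load to the airflow a cooling system must move. A minimal sketch, assuming standard air properties and an example rack load not taken from the talk:

```python
# Illustrative only: sensible heat relation Q = mdot * cp * dT.
# The rack load and delta-T below are assumed example figures.

CP_AIR = 1005.0  # specific heat of air, J/(kg*K)
RHO_AIR = 1.2    # approximate air density at sea level, kg/m^3

def airflow_m3s(q_watts: float, delta_t: float) -> float:
    """Volumetric airflow (m^3/s) needed to remove q_watts at a given
    supply-to-return air temperature difference delta_t (K)."""
    mass_flow = q_watts / (CP_AIR * delta_t)  # kg/s
    return mass_flow / RHO_AIR                # m^3/s

# A 10 kW rack with a 12 K air-side delta-T (assumed figures):
flow = airflow_m3s(10_000.0, 12.0)
print(f"{flow:.2f} m^3/s")
```

The same relation shows why recirculation and bypass matter: air that short-circuits the IT equipment shrinks the effective delta-T, forcing the fans to move more air for the same heat load.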

As the IT becomes more specialized, these elements will become ever more important, as cooling must become more localized to high-density areas of the data center, he said.