This summer brought unusually warm weather to much of the northern hemisphere, ranging anywhere from clement to scorchio. Even in northern Canada and Scotland people were able to shed their overcoats, while across much of Europe, North Africa and parts of Asia there were heatwaves, even in the UK.
Sales of electric fans have gone through the roof, of course, but a far more effective way to chill in hot weather is by simply dipping into a pool of cool water – immersion cooling, in other words.
It’s much the same in the data center, where the traditional method of cooling heat-generating CPUs uses heat sinks and fans to extract and blow the hot air away, supplemented with aggressive server room air conditioning. But immersion cooling can cool equipment more directly, efficiently and cost-effectively. As a result, it also captures the excess heat more effectively, enabling it to be reused more efficiently.
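That efficiency gain is commonly expressed through Power Usage Effectiveness (PUE): total facility power divided by IT power. As a rough sketch of why cutting cooling overhead matters, here is a minimal calculation; the overhead figures below are purely hypothetical illustrations, not measured data for any particular facility:

```python
def pue(it_power_kw: float, cooling_kw: float, other_overhead_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    total = it_power_kw + cooling_kw + other_overhead_kw
    return total / it_power_kw

# Hypothetical figures for a 1 MW IT load: an air-cooled room spending
# 400 kW on CRAC units and fans, versus an immersion-cooled facility
# with a warm-water loop needing far less mechanical cooling.
air_cooled = pue(1000, cooling_kw=400, other_overhead_kw=100)  # 1.5
immersion = pue(1000, cooling_kw=50, other_overhead_kw=100)    # 1.15
print(f"air-cooled PUE: {air_cooled:.2f}, immersion PUE: {immersion:.2f}")
```

The lower the PUE, the closer the facility's total power draw is to the power actually consumed by the IT equipment itself.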
While liquid cooling in various forms is not new, the first immersion cooling systems, in which entire servers are immersed in a non-conductive liquid, were only introduced in 2008.
Yet, since then, a series of developments have brought immersion cooling into the mainstream, with more and more operators switching in order to wring greater compute power out of both their server estates and their real estate, while keeping the tightest possible lid on power and cooling costs.
More than one giant leap
Those developments have primarily focused on improvements to cooling capacity, enabling more compute power to be concentrated in the same physical space, and on refining the cooling fluids for better performance and safety.
GRC, the company that developed and introduced the first immersion cooling systems 13 years ago, now has 22 patents, with 15 more pending. “We’ve focused on improving the cooling capacity per rack, with our most recent system able to achieve well in excess of 100kW per rack with a warm water loop,” says Ben Smith, chief product officer at GRC.
“Traditional air-cooling solutions are realistically limited to perhaps 20kW per rack, but today’s servers are regularly challenging that limit. Mainstream applications are starting to use much higher-power CPUs and GPUs that are simultaneously more difficult to cool, and we see customers pushing rack densities far above what air-cooled data centers can support.
“At the same time, we have also used and tested many fluids for IT device compatibility, optimal thermal performance, safety, and cost,” he adds. The fluids are central not just to the efficiency of immersion cooling systems, but also to their safety and ease of maintenance. GRC’s ElectroSafe® family is a range of single-phase dielectric fluids, meaning they retain their liquid state rather than changing into a vapor. They are carefully screened to ensure safety, immersed device reliability and performance, and environmental friendliness.
On top of all that, GRC has been the first to introduce a number of innovations that have since been adopted industry-wide. These include rack-balancing technology, which lets customers attach multiple racks to a single coolant distribution unit (CDU), reducing both cost and space compared to designs that require a one-to-one ratio of racks to CDUs, each with its own piping.
In addition, GRC was also the first to design and deliver CDUs with redundant pumps, monitoring and control systems, and the first to design and deploy immersion cooling-based modular data centers, starting with an Edge-optimized two-rack system deployed by the US Air Force. A further high-resiliency design has also been deployed by the US Air Force, while a design optimized for disaster recovery has been deployed by a major financial services organization.
People in ice houses
GRC’s research and development center is based at its ICEhouse Lab, where engineers perform testing on new products, fluids, IT systems and even new cables.
“Our staff there are busy validating new fluids to determine where they fit on the scale of performance, safety and cost. We are currently testing a number of new systems being prepared for release to market soon, including enhancements to our current systems as well as entirely new ones,” says Smith.
“In addition, we are currently testing a number of servers from different IT OEMs to validate how they will perform in GRC’s cooling systems: the modifications that will ensure optimal performance and the kind of accessories they will need to ensure customers have a great experience in terms of installation, maintenance and so on.”
GRC also engages in longer-term research and development, such as testing entirely new concepts for single-phase immersion systems to see how they can be used to cool the high power CPUs, GPUs and other components coming to market in the coming years.
The company’s R&D effort has run alongside collaboration with industry partners, including Dell Technologies and Intel, focused on optimizing data center servers for immersion cooling. Design innovations have resulted in heat-sink optimizations, improvements to firmware, and better cables, cable management and even OEM server warranty and service delivery.
“GRC has also developed software and firmware solutions, as well as appliances, enabling customers to know what is happening in real time with their systems.
“These solutions help them to better understand fluid temperatures, flow rates and other important measurements, and allow them to view that data either directly, by connecting to the systems, or via their BMS or other aggregate management platforms,” says Smith.
Other partnerships include collaborations with data center equipment giant Vertiv and oil and chemicals companies such as CPChem, Lubrizol, Shell, SK Lubricants and Eneos.
“These are deep technical collaborations that include testing of systems and fluids, sharing of data for ongoing developments, collaboration on solutions and assuring best performance of immersed devices. All of this results in customers getting total solutions that work really well,” says Smith.
GRC’s total solutions involve engagement with all partners during the sales process to ensure that exactly the right solution, down to the most effective fluid formulation, is supplied.
The ICEraQ Series 10, GRC’s new flagship cooler, significantly increased rack density support, expanded redundancy options for 2N data center requirements, and introduced a protected ‘dry space’ for communications equipment.
Not only is it more compact, but the containment area is integrated, removing the need for external containment decks. Units can be placed end-to-end to pack them in more closely, with cables and plumbing neatly hidden behind cosmetic panels.
DCD reported on the new ICEraQ Series 10 last year: “The tank is filled with GRC's ElectroSafe coolant, and heat is removed by secondary water cooling. If warm (32 degrees Celsius) water is used in the secondary circuit, the racks can contain up to 200kW of IT power, handled by up to four power distribution units (PDUs) mounted at the rear of each rack.
“Networking and power connections are accessible simply by opening the lid [and] the capacity can be increased to 368kW per system, at an efficiency cost, if chilled water at 13C is used in the secondary circuit.”
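The two operating points DCD quotes (200kW with 32°C warm water, 368kW with 13°C chilled water) can be captured in a small sketch. Treating the capacity between those points as linear is purely our own illustrative assumption, not a published vendor derating curve:

```python
# Published ICEraQ Series 10 operating points (per the DCD report):
#   32 C warm-water loop    -> up to 200 kW per system
#   13 C chilled-water loop -> up to 368 kW per system
CHILLED_POINT = (13.0, 368.0)
WARM_POINT = (32.0, 200.0)

def capacity_kw(supply_temp_c: float) -> float:
    """Estimate system capacity by linear interpolation between the two
    published operating points. Intermediate values are an illustrative
    assumption only, not a vendor-specified derating curve."""
    (t1, c1), (t2, c2) = CHILLED_POINT, WARM_POINT
    if not t1 <= supply_temp_c <= t2:
        raise ValueError("temperature outside published operating range")
    return c1 + (c2 - c1) * (supply_temp_c - t1) / (t2 - t1)

print(capacity_kw(13.0))  # 368.0
print(capacity_kw(32.0))  # 200.0
```

The trade-off is the one DCD describes: warmer secondary water is cheaper to produce but supports less IT power per system.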
On top of all that, the system continually monitors for leaks and provides real-time data on water and coolant temperatures, as well as pressure, coolant pump power consumption, and coolant pump speed, and the system can be monitored via standard data center infrastructure management (DCIM) software.
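Telemetry of that kind (fluid temperatures, pressure, pump power and speed) might be consumed along the following lines; the field names and alarm thresholds here are hypothetical illustrations, not GRC's actual DCIM schema or limits:

```python
from dataclasses import dataclass

@dataclass
class CoolantTelemetry:
    # Hypothetical field names; real DCIM/BMS schemas will differ.
    coolant_temp_c: float
    water_temp_c: float
    loop_pressure_kpa: float
    pump_power_w: float
    pump_speed_rpm: float

# Illustrative alarm thresholds, not vendor-specified limits.
LIMITS = {
    "coolant_temp_c": 55.0,
    "loop_pressure_kpa": 300.0,
    "pump_power_w": 1500.0,
}

def check_alarms(reading: CoolantTelemetry) -> list:
    """Return the names of any metrics exceeding their alarm threshold."""
    return [name for name, limit in LIMITS.items()
            if getattr(reading, name) > limit]

reading = CoolantTelemetry(48.0, 31.5, 210.0, 900.0, 1450.0)
print(check_alarms(reading))  # [] -- all metrics within limits
```

A DCIM platform would typically poll readings like these on a schedule and raise alerts when any metric crosses its threshold.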
Not surprisingly, perhaps, the first customer for the new ICEraQ Series 10 was an organization that needs to max out its compute capacity: the Texas Advanced Computing Center (TACC). TACC deployed the ICEraQ Series 10 Quad system as part of its brand-new Lonestar 6 project, immersing two-thirds of the IT load in four GRC 42U racks running 70kW per rack.
We’ve only just begun
Air cooling has been the de facto standard for the computer industry since its inception and is arguably running out of time.
In contrast, immersion cooling is relatively new, with plenty of scope for further developments. “The engineering to optimize IT and immersion cooling systems is only just beginning compared to all the work that has been done to optimize air cooling for IT for decades,” says Smith.
After all, every animal on Earth knows that immersion cooling is far more effective than air cooling, and soon that will be true for data centers as well.
To learn more about immersion cooling technology and what it can do to drive lower PUEs while enhancing performance, check out The Definitive Guide to Immersion Cooling.