Cooling the data center computer room with large computer room air conditioners (CRACs) and a raised floor serving as the air plenum is an effective approach to equipment cooling. However, as rack densities grow and load diversity across the room increases, supplemental cooling should be evaluated for its impact on cooling system performance and efficiency.

As data center management explores newer cooling technologies, a growing number of liquid cooling solutions are being installed. This is because liquid cooling can remove far more heat than air cooling alone: water carries more than 3,000 times as much heat as air per unit volume.

One of the newest forms of liquid cooling is the rear-door heat exchanger (RDHx). An RDHx can be retrofitted to existing equipment racks or supplied as an integral part of new rack enclosures.

RDHx systems can provide supplemental cooling for a few high-density racks, or serve as the total cooling solution, with or without a raised floor. When building data centers in the future, IT leaders should examine whether a raised floor is needed at all. At a minimum, they should plan on multitiered cooling solutions that eliminate the need for a raised floor in at least part of the machine room.

The basic design of the RDHx is a door attached to the rear of an equipment rack that contains heat exchange coils filled with water or refrigerant. Cold air flows into the front of the rack, is heated as it passes over the running equipment, and then flows immediately through the RDHx, where it is cooled to a temperature equal to or lower than the temperature at which it entered the front of the rack. Either passive designs (no fans other than the existing server fans) or active designs (high-efficiency fans built into the RDHx) can be used.
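The underlying physics is simple sensible heating and cooling of the air stream. The following Python sketch illustrates the balance with assumed figures; the rack load and airflow are hypothetical examples, and the air properties are textbook values, not vendor specifications:

```python
# A minimal sketch of the sensible-heat balance behind an RDHx.
# All rack figures here are illustrative assumptions.

AIR_DENSITY = 1.2   # kg/m^3, air at roughly room conditions
AIR_CP = 1005.0     # J/(kg*K), specific heat capacity of air

def exhaust_temp_rise_k(it_load_w: float, airflow_m3_s: float) -> float:
    """Temperature rise of the air stream across the servers, from Q = m_dot * cp * dT."""
    mass_flow_kg_s = AIR_DENSITY * airflow_m3_s
    return it_load_w / (mass_flow_kg_s * AIR_CP)

# Hypothetical 20 kW rack moving about 1.1 m^3/s (~2,300 CFM) of air:
rack_load_w = 20_000.0
airflow_m3_s = 1.1

dt = exhaust_temp_rise_k(rack_load_w, airflow_m3_s)
print(f"Exhaust air is about {dt:.1f} K hotter than the inlet air.")

# For the air to leave the rear door at or below the inlet temperature,
# the door's coil must absorb at least the entire IT load:
print(f"Required coil duty: at least {rack_load_w / 1000:.0f} kW")
```

With roughly these numbers, a rack's exhaust runs about 15 K above its inlet, and the rear-door coil has to absorb the full rack load to return the air to inlet temperature.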

The RDHx provides a highly focused cooling solution by placing the cooling unit directly on an individual equipment rack. Because the heat exchanger is incorporated into the rear door, the design does not depend on room-level cooling to recool the hot air leaving the rack.

These systems require chilled water or a liquid refrigerant to be brought into the data center and distributed to the racks via piping and a coolant distribution unit (CDU). A typical CDU is approximately the size of a standard rack, depending on capacity.

Vendors offering RDHx units include:

- OptiCool — Cool Door
- Black Box Network Services — Cold Front
- Emerson-Liebert — DCD and XDR
- Vette — CoolCentric LiquiCool

Impacts and recommendations
Data center designers can use an RDHx to simplify airflow management by removing the need for hot-aisle/cold-aisle configurations and hot-aisle/cold-aisle containment.

Airflow management is the most important part of a data center's cooling design. The most common best practice for a data center that uses CRAC units and a raised floor as the cold-air plenum is to arrange the equipment racks in a hot-aisle/cold-aisle configuration: all cold-air inlets in a row face the same direction, and each successive row faces the opposite way (Figure 1). This arrangement limits how much the hot exhaust air mixes with the cold air before the cold air reaches the equipment.

Another best practice improves on this design by adding hot-aisle/cold-aisle containment. This involves either enclosing the cold aisle so that air from the hot aisle cannot contaminate it, or closing off the hot aisle (or the back of each rack) and venting the hot air outside or back to the CRAC units.

With RDHx units, these measures are unnecessary, because the air exiting the equipment rack is as cold as, or colder than, the air entering it. An RDHx design can also be installed on either a raised floor or a slab floor.

The RDHx can be used as the only cooling solution in a data center, or it can be used as part of a multitiered cooling solution, where it serves only the high-density equipment.

Recommendations
- When equipment in your data center varies greatly in cooling needs, implement a multitiered cooling strategy, with the RDHx serving the higher-heat-density racks as part of that strategy.

- Do not use raised floor space for equipment using an RDHx (unless the floor is pre-existing), because the RDHx eliminates the need for that added expense.

Data center designers can use liquid cooling to lower data center costs and support the scalability of computing infrastructures through higher densities.

Although new chip designs attempt to lower processors' heat footprint, growing demand for computing power is driving equipment densities higher. These densities, in turn, increase cooling requirements.

With the growing number of high-density servers on the market, infrastructure and operations leaders must provide adequate cooling levels for computer rooms.

Cooling can consume between 60% and 65% of a data center's total power. Higher-density racks of 15kW to 20kW can require more than 1.5kW of cooling load for every 1kW of IT load, just to create the cool airflow those racks need. A side benefit of the RDHx is that, in addition to more efficient racks, much of the power previously consumed by cooling becomes available for facilities to support other building systems, or can be rerouted as additional IT load.
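As a back-of-the-envelope illustration of what those ratios mean per rack, here is a short Python sketch; the 1.5kW-per-kW figure comes from the paragraph above, while the RDHx overhead used is purely an assumed placeholder, not a vendor specification:

```python
# Back-of-the-envelope view of the cooling overhead described above.
# The 1.5 kW-per-kW figure comes from the text; the RDHx overhead of
# 0.3 kW per kW is an assumed illustrative value only.

it_load_kw = 20.0             # one hypothetical high-density rack
conventional_overhead = 1.5   # kW of cooling per kW of IT load (from the text)
rdhx_overhead = 0.3           # assumed, for illustration only

conventional_total = it_load_kw * (1 + conventional_overhead)
rdhx_total = it_load_kw * (1 + rdhx_overhead)

print(f"Conventional: {conventional_total:.0f} kW total for {it_load_kw:.0f} kW of IT")
print(f"RDHx:         {rdhx_total:.0f} kW total for {it_load_kw:.0f} kW of IT")
print(f"Power freed per rack: {conventional_total - rdhx_total:.0f} kW")
```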

The use of liquid cooling can solve the high-density server-cooling problem because water (conductive cooling) carries more than 3,000 times as much heat as air per unit volume, and requires less energy to do so. Liquid cooling thereby enables the ongoing scalability of computing infrastructure to meet business needs.
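One way to sanity-check the 3,000x figure is to compare the volumetric heat capacities of water and air, as the sketch below does; this is an interpretation using textbook properties, not a calculation taken from the article:

```python
# Compare the heat a cubic metre of water can carry per kelvin of
# temperature rise with the same volume of air.
# Properties are textbook values at roughly room conditions.

WATER_DENSITY, WATER_CP = 1000.0, 4180.0  # kg/m^3, J/(kg*K)
AIR_DENSITY, AIR_CP = 1.2, 1005.0         # kg/m^3, J/(kg*K)

water_j_per_m3_k = WATER_DENSITY * WATER_CP   # ~4.18 MJ/(m^3*K)
air_j_per_m3_k = AIR_DENSITY * AIR_CP         # ~1.2 kJ/(m^3*K)

ratio = water_j_per_m3_k / air_j_per_m3_k
print(f"Water carries ~{ratio:,.0f}x as much heat per unit volume as air.")
# Prints about 3,466, consistent with the "more than 3,000 times" claim.
```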

The cost savings of an RDHx solution may not be obvious, so customers must be willing to build the business case. According to one vendor, the solution can show ROI in 12 months or less for new-build or facilities-expansion projects, compared with conventional cooling. Retrofits of existing data centers will take longer to pay back.
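A simple payback calculation usually anchors such a business case. The sketch below shows only the arithmetic; both inputs are hypothetical placeholders to be replaced with real quotes and measured energy savings:

```python
# A minimal simple-payback sketch for building the business case.
# Every input below is a hypothetical placeholder, not vendor data.

capex_usd = 250_000.0           # hypothetical installed cost: doors, CDU, piping
monthly_savings_usd = 22_000.0  # hypothetical cooling-energy savings

payback_months = capex_usd / monthly_savings_usd
print(f"Simple payback: {payback_months:.1f} months")

# Savings at this level would land near the vendor's "12 months or less"
# claim; a retrofit with higher installation costs or smaller savings
# pushes the payback out correspondingly.
```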

Recommendations
- Use RDHx cooling for dense to extremely dense, high-energy server racks (10kW to 30kW per rack), or for data centers in which computer room air conditioning and underfloor cooling are at their limits.

New RDHx designs give data center designers additional options and opportunities to address cooling needs.

Early RDHx units used water as the coolant in a passive design. A passive system has no moving parts in the rear door; it consists of just the heat exchange grid in the rear door of each rack, with pumps in a CDU circulating the liquid to each rack. The design relies on the equipment's own cooling fans to push the hot exhaust air through the grid.

Although this approach does a good job of cooling the equipment in the rack, these early solutions face several challenges.

First, many data center managers are hesitant to bring water into the data center for fear of leaks damaging equipment (even with the many advances made to piping products). Second, using the equipment cooling fans requires that they be highly efficient, and the unit must be well sealed so that air blown toward the cooling grid actually passes through it, not around it.

More recently, some vendors have begun offering refrigerant-based solutions instead of water. A low-pressure refrigerant can alleviate leakage concerns because, if a leak occurs, the refrigerant boils off as a nontoxic, noncorrosive gas. Although this adds cost, it may make an RDHx solution viable where the worry of water leaks damaging equipment would otherwise rule it out.

An active system, rather than a passive one, mounts high-efficiency fans on the outside of the heat exchange grid to pull the hot air through it. This both guarantees that high-efficiency fans are used and limits hot-air bypass around the heat exchange grid.

Some products provide extra redundancy by running multiple coolant lines to the units and by using three separate cooling units within a single rack door. This offers redundancy up to 20kW: each 10kW unit is independent, so if one fails, the other two units in the door remain available to remove heat.
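The arithmetic behind that redundancy claim is straightforward, as the small sketch below shows; the unit count and 10kW size are the figures quoted above:

```python
# The N+1 arithmetic behind the redundancy claim: three independent
# 10 kW units in one door (unit count and size taken from the text).

unit_capacities_kw = [10.0, 10.0, 10.0]

total_kw = sum(unit_capacities_kw)
worst_case_kw = total_kw - max(unit_capacities_kw)  # any single unit fails

print(f"All units healthy: {total_kw:.0f} kW of cooling")
print(f"One unit failed:   {worst_case_kw:.0f} kW of cooling")
# A rack drawing up to 20 kW therefore remains fully cooled through any
# single-unit failure.
```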


This article first appeared in FOCUS on Racks and Cabinets, in FOCUS magazine Issue 32. You can read the full digital edition here, or download a copy for the iPad from DCDfocus.