US lab cools servers with warm water

The US Department of Energy’s National Renewable Energy Laboratory (NREL) is taking liquid cooling to the next level, carrying out a retrofit that uses warm water for its Skynet high performance computing (HPC) cluster.

The retrofit of its air-cooled racks will be carried out as part of the relocation of the Skynet cluster into its new Energy Systems Integration Facility (ESIF) in Golden, Colorado, which has been designed to achieve a Power Usage Effectiveness (PUE) of 1.06.

NREL said this low PUE makes the Colorado facility the most energy-efficient data center in the world.
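
For context, PUE is the ratio of a data center’s total energy use to the energy delivered to its IT equipment, so a PUE of 1.06 means only six units of overhead energy for every hundred units that reach the servers. The short Python sketch below illustrates the arithmetic; the IT load and overhead figures are hypothetical, and only the 1.06 target comes from NREL.

    # Power Usage Effectiveness: total facility energy / IT equipment energy.
    # The energy figures are hypothetical; only the 1.06 target is NREL's.
    def pue(total_facility_kwh, it_equipment_kwh):
        return total_facility_kwh / it_equipment_kwh

    it_energy_kwh = 1_000_000   # hypothetical annual IT energy use
    overhead_kwh = 60_000       # hypothetical cooling, power distribution and lighting
    print(pue(it_energy_kwh + overhead_kwh, it_energy_kwh))  # -> 1.06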

The 182,500 sq ft facility has been designed to carry out research into distributed energy systems and the integration of renewable energy into the electricity grid.

It is used for large-scale modelling and simulation of material properties and processes that the laboratory said would be too dangerous to carry out using direct experimentation.

The liquid cooling retrofit will see the lab fit liquid cooling vendor Asetek’s RackCDU direct-to-chip hot-water cooling system, which uses warm water at 75F to cool servers, onto existing air-cooled servers and racks.

“Because of RackCDU’s design, these performance improvements will be achieved without the need for a customized server design,” Asetek said.

Waste heat recovered from the system will be used as a primary source of heat for the rest of the facility – buildings, offices and laboratories.

Steve Hammond, director of NREL’s Computational Science Center, said the system will help reduce the facility’s water use and allow it to increase server density within the cluster.

“Starting with warmer water on the inlet side can create an opportunity for enhanced waste-heat recovery and reduced water consumption, and in many locations can be accomplished without the need for active chilling or evaporative cooling, which could lead to dramatically reduced cooling costs,” Hammond said.

Asetek claims its RackCDU can deliver cooling energy savings of up to 80% and server densities 2.5 times those of modern air-cooled data centers.

“RackCDU removes heat from CPUs, GPUs, memory modules and other hot spots within servers and takes it out of the data center using liquid where it can be cooled for free using outside air, or recycled to generate building heat and hot-water,” Asetek said.
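
As a rough sense of scale for that heat recovery, the Python sketch below estimates how much reusable heat a single liquid-cooled rack might supply to the building; every figure in it is an illustrative assumption rather than an NREL or Asetek specification.

    # Back-of-the-envelope waste-heat estimate for one liquid-cooled rack.
    # All values are illustrative assumptions, not NREL or Asetek figures.
    rack_it_load_kw = 60.0         # assumed IT load per HPC rack
    liquid_capture_fraction = 0.6  # assumed share of heat captured by the liquid loop
    heating_hours = 4_000          # assumed hours of building heating demand per year

    recoverable_kw = rack_it_load_kw * liquid_capture_fraction
    recoverable_kwh = recoverable_kw * heating_hours
    print(f"~{recoverable_kw:.0f} kW reusable per rack, "
          f"~{recoverable_kwh:,.0f} kWh over a heating season")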

Related images

  • Asetek's RackCDU: Applying warm water to cool servers
