DatacenterDynamics held its inaugural awards for the North American data center industry in July. This is the first in a series of articles covering in detail the projects and people who won in each of the seven award categories.

Facebook’s data center in Prineville, Oregon, the first data center the company designed and built for its own use, won in two categories: Future Thinking and Design Concepts, and Innovation in the Mega Data Center. The ground-up approach Facebook took to designing the facility, and the company’s openness about the concepts it used, have sent ripples across the global data center industry that are sure to have a lasting effect on how people think about data center design.

When setting out to build its first wholly owned data center, the Facebook infrastructure team reconsidered the design of each functional component of a data center, from software to servers, equipment cabinets, electrical systems, mechanical systems and building form. The team’s goal was to build the most energy-efficient data center in the world.

Facebook Prineville aisle

Nearly 40 percent more efficient than leased facilities

The data hall suites and mechanical penthouse function as an “occupied air handler.” The mechanical penthouse has air intakes along the entire west face and exhausts along the east face, allowing for low-energy air distribution without the need for extensive ductwork.

The building expends no energy on mechanical cooling via refrigeration. Cooling, when required, is supplied first by a system capable of 100% outside-air economization and second by a high-pressure misting system.
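As a rough illustration of that two-stage approach, the Python sketch below shows how such a control decision might be staged. The function name, setpoints and humidity threshold are hypothetical and are not taken from Facebook’s actual control system:

    def select_cooling_mode(outside_temp_f, outside_rh_percent,
                            supply_max_f=81.0, rh_max_percent=65.0):
        # Stage 1: if outside air is already cool enough, distribute it directly
        # through the penthouse with no added cooling (100% outside-air economization).
        if outside_temp_f <= supply_max_f:
            return "outside-air economization"
        # Stage 2: otherwise run the high-pressure misting system to cool the
        # incoming air evaporatively, which works while the air is dry enough.
        if outside_rh_percent <= rh_max_percent:
            return "misting (evaporative cooling)"
        # There is no chiller or cooling tower to fall back on in this design;
        # a real controller would instead relax the supply-temperature envelope.
        return "economization with relaxed supply-temperature limits"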

The physical infrastructure, “from grid to gates,” was redesigned with a focus on an energy-efficient ecosystem. In addition to the mechanical systems, 277V AC distribution to custom-designed servers yields a data center that uses 38% less energy than Facebook’s previously leased facilities.

The data center has a design Water Usage Effectiveness (WUE) ratio of 0.31 L/kWh, much lower than the roughly 1.0 L/kWh of a typical chilled-water plant. A 10,000-gallon rainwater collection tank, which supplies water for the toilets and courtyard irrigation, was installed in the courtyard.
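For context, WUE is the annual site water used for cooling divided by the annual IT equipment energy. A minimal sketch of that arithmetic, using hypothetical annual figures chosen only to reproduce the 0.31 L/kWh ratio:

    def water_usage_effectiveness(site_water_liters, it_energy_kwh):
        # WUE (L/kWh) = annual site water use for cooling / annual IT equipment energy
        return site_water_liters / it_energy_kwh

    # Hypothetical year: 9.3 million liters of water against 30 GWh of IT load
    print(round(water_usage_effectiveness(9_300_000, 30_000_000), 2))  # 0.31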

Design ideas were validated via thorough engineering of the electrical, mechanical and hardware systems, including the use of computational fluid dynamics models and lab tests.

One aspect of the Facebook design that may gain popularity is running the data center at temperatures as high as 81F instead of the 68F commonly used today. Another innovation that may prove popular is the lack of a centralized uninterruptible power supply.

The facility realized a power usage effectiveness (PUE) ratio of 1.07. Total operating costs for the building are 24% lower than for Facebook’s leased facilities.
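PUE is the ratio of total facility energy to IT equipment energy, so a value of 1.07 means only about 7% of energy overhead beyond the IT load itself. A minimal sketch with hypothetical annual figures chosen to reproduce that ratio:

    def power_usage_effectiveness(total_facility_kwh, it_energy_kwh):
        # PUE = total facility energy / IT equipment energy; 1.0 is the theoretical floor
        return total_facility_kwh / it_energy_kwh

    # Hypothetical year: 32.1 GWh of total facility energy against 30 GWh of IT load
    print(round(power_usage_effectiveness(32_100_000, 30_000_000), 2))  # 1.07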

While typical buildings require energy-intensive cooling towers or chillers to keep their interiors at the right temperatures for servers, the Prineville data center has no cooling tower and no chiller.

Contrary to the typical secrecy surrounding data center technology, Facebook tries to develop servers and data centers following the model traditionally associated with open source software projects. Building drawings and specifications for mechanical and electrical systems and server hardware are available for download and critique via The Open Compute Project.

Facebook established this non-profit foundation to foster a collaborative dialogue on how to build the most efficient computing infrastructure possible.