Facebook has detailed how it has incorporated energy-efficiency technologies into its first data center, in Prineville in the US, in a video posted yesterday on its Facebook page.

The video outlines the facility's free-air cooling, how Facebook's Open Compute designs contribute to efficiency, how the site reclaims water, and how a solar farm on the site powers its office activities.

Facebook Prineville facility manager Ken Patchett leads the tour.

He says Facebook's energy efficiency efforts mean the company deals in what it calls 'Negawatts', as opposed to Megawatts, "because this is the power that we never had to generate to run this building – the carbon footprint in creating energy doesn't exist because we are not using it".

Facebook reclaims all water at the site, both rainwater runoff and condensation, and uses it for its bathrooms and gardens.

But most of its efficiency is gained through the use of free-air cooling.

“A standard data center with a chilled water system uses 100% of the water. We use only 30% of the water a normal data center uses,” Patchett said.

Patchett walks viewers through the entire cooling process at Facebook, starting with the vestibule where air comes down from the “penthouse”, the place Patchett says creates the real energy efficiency magic.

Facebook’s Prineville airflow vestibule

The vestibule contains a number of humidity and temperature sensors that take readings as air passes into the data center. The system can then respond by raising or lowering the humidity and temperature of the incoming air as required.

“I am standing in an air temperature of about 70 degrees Fahrenheit now, and about 30% relative humidity,” Patchett says from the data center floor.

The penthouse features hurricane-force louvers that bring cold air into the building. The air then passes through filter banks and a wall of pipes that spray pressurized water mist to humidify the air and cool it down to the required temperature.

Hurricane-force louvers which pass air from outside into Facebook’s facility in Prineville 

Air from the data center's hot aisle can be added to the mix when the incoming air needs to be warmed, while the humidification and evaporative cooling are handled by Facebook's MeeFog system.

Facebook’s MeeFog system - water from here is used to humidify and change the temperature of the air

A downdraft room contains fans that draw air into the data center, where it hits a concrete floor and is then distributed through the aisles and through the servers.

Facebook uses cold- and hot-aisle containment, with a plenum 14 ft high separating the aisles. Patchett said the difference in temperature between the two aisles can be as much as 30 to 35 degrees Fahrenheit.

“On a cold day the air is recirculated and brought back into the data center space to create a better temperature, in winter we are not cooling anything, we are heating,” Patchett said.
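For readers who want to see that logic in one place, here is a minimal, hypothetical sketch of the kind of decision the penthouse makes on each control cycle. The 70F and 30% set points are the figures Patchett quotes on the tour; the sensor names, actuators and mixing rule are our own illustrative assumptions, not Facebook's actual control software.

# Hypothetical sketch, not Facebook's control software: a simplified decision
# step for the penthouse walkthrough above. The 70F / 30% set points are the
# figures quoted in the tour; everything else is an illustrative assumption.

TARGET_TEMP_F = 70.0   # supply-air temperature quoted on the data center floor
TARGET_RH_PCT = 30.0   # relative humidity quoted on the data center floor

def control_step(outside_temp_f, outside_rh_pct, hot_aisle_temp_f):
    """Return illustrative actuator settings for one control cycle."""
    actions = {"louvers": "open", "misting": False, "hot_air_mix": 0.0}

    if outside_temp_f > TARGET_TEMP_F:
        # Warm intake air: the misting wall cools and humidifies it
        # evaporatively on the way to the servers.
        actions["misting"] = True
    else:
        # Cool intake air (e.g. winter): blend recirculated hot-aisle air
        # back in instead of running any mechanical cooling.
        deficit = TARGET_TEMP_F - outside_temp_f
        spread = max(hot_aisle_temp_f - outside_temp_f, 1.0)
        actions["hot_air_mix"] = min(deficit / spread, 1.0)

    if outside_rh_pct < TARGET_RH_PCT and not actions["misting"]:
        # Dry intake air: mist to add moisture even when no cooling is needed.
        actions["misting"] = True

    return actions

# Example: a dry 45F winter morning with an 80F hot aisle.
print(control_step(outside_temp_f=45.0, outside_rh_pct=20.0, hot_aisle_temp_f=80.0))

The example call models the winter behaviour Patchett describes: rather than any mechanical cooling, a fraction of hot-aisle air is recirculated to warm the intake, and the misting only runs to add humidity.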

Facebook Prineville and Open Compute
Patchett takes the audience through the benefits of Facebook’s Open Compute racks and servers at Prineville. “It helps to create a demand for the industry which overall drives energy efficiency in the data center space,” Patchett says.

Patchett and an Open Compute rack and battery bank (in blue)

He said Facebook uses a DC uninterruptible power supply (UPS), "in other words a battery bank", which is separated into microfailure domains. "This battery bank will support this rack, and this rack, with Open Compute servers, has a 277V power supply fed from a 480V bus." The power supply can accept either AC or DC power.
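As a side note not made in the video, the 277V figure follows directly from a standard US 480V three-phase bus: it is the line-to-neutral voltage, so each power supply simply runs from one phase to neutral:

277 V ≈ 480 V ÷ √3 (480 / 1.732 ≈ 277)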

He also removes an Open Compute server to highlight how simple it is to take out and how simple its design is, saying the server is 38% more efficient than any other machine on the market and costs 24% less.

An open Open Compute server at Facebook

You can view the full video on Facebook’s Prineville Data Center page here.