
Facebook has placed the first of about 250 modules for its second data center in Luleå, Sweden, known as Luleå 2, marking a milestone for the social media company's new data center design, which focuses on pre-manufactured and modular components.

Facebook announced at the Open Compute Summit in January this year that it would use its Rapid Deployment Data Center (RDDC) design method in the 125,000 sq ft data center in Sweden.

Today, at DatacenterDynamics Converged Santa Clara, Facebook design engineer Marco Magarelli said that after many iterations, its modules, which come flat-packed, are making their way into Facebook's architecture.

“We are pretty excited,” Magarelli said.

“Instead of defining a space with a chassis we can now define space with the panels we have.”

Facebook has worked with Emerson Network Power to design and manufacture its new data center modules, drawing on the vendor's experience working with other sectors, such as healthcare.

“We found that in any industry the crew on site gets better as they start doing more of each job. The more repetitive it is, the more streamlined it becomes,” Magarelli said.

The modules include power skids, evaporative air handlers, water treatment plants and overall data center building structures.

All are essentially ready to install and, where possible, use simple off-the-shelf materials and components that can easily be altered to meet evolving data center practices and designs.

Facebook has, for example, used a storm door for its hot aisle, custom ordered at a quarter of the cost it would previously have paid.

The clear panel can easily be cut down, or have more panels added, to accommodate variations in cabinet height when required in future.

The new building method reduces the amount of haulage required to get building materials to the site and the amount of waste generated by the build.

Facebook also expects the approach to cut the time it takes to construct its data centers by up to ten months.

Magarelli said Facebook has already reduced the cost per data hall by about £21,000 in materials alone.

Kits simply arrive at the data center site and are installed directly onto the floor (Facebook does not use a raised floor).

“We eliminated any unnecessary details or joins during our mock-up. Each panel can be handled by two people in the field.”

The idea was born out of Facebook's stick-built cold storage facilities in Prineville and Forest City, and is being put to use in the construction of its Altoona data center.

“We used a lot of principles of pre-engineering for the assembly of panels and systems, including hot aisle support and using off-the-shelf air handling units,” Magarelli said.

Facebook also took guidance from local building practices in Sweden, using insulated metal panels and precast panels, which removed the need for interior finishes and minimized both the labour required for the build and the resulting waste.

“This design process also meant we could expand long bays and close in the facility very quickly,” Magarelli said.

“One challenge we gave ourselves was to deliver twice the amount of data hall space in the amount of time it takes to deliver one.”

Facebook originally considered moving to containers, but “we looked at various modular data center options and determined our pitch is just too large”.

“A container would not meet our scale so we really picked up a framework you see with hospitals where you have a series of chassis for different systems just like a car manufacturing line. We simply take a chassis 12 ft wide, 40 ft long and start deploying that,” Magarelli said.

You can read more about Facebook’s Luleå 2 data center in the FOCUS special on Luleå – Part 1 and Part 2.