Because of the design work that went into its data center in Phoenix, Arizona, codenamed Project Mercury, eBay can claim energy efficiency well above the industry average.
The Green Grid examined the facility and released a case study describing the design process and its stellar efficiency results. In the worst-case scenario the site showed a PUE of 1.43; its best result so far is 1.26. Much of this comes down to eBay’s use of free cooling year-round. But how is eBay getting free cooling in the desert? Here’s how:
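For reference, PUE (power usage effectiveness) is simply total facility power divided by IT power, so 1.0 is the theoretical ideal. A minimal sketch of the calculation, using made-up load figures purely for illustration:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_load_kw

# Illustrative numbers only -- not eBay's actual measurements.
print(pue(total_facility_kw=1430.0, it_load_kw=1000.0))  # 1.43, the worst case cited
print(pue(total_facility_kw=1260.0, it_load_kw=1000.0))  # 1.26, the best result so far
```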
First, the background
The data center was built to accommodate future cooling technologies as they develop, including direct-to-chip water cooling if it becomes necessary. According to the whitepaper, eBay required its design team to ensure the facility could handle three generations of spot-cooling technologies.
The systems were designed to handle four hardware refresh cycles over the facility’s lifespan, doubling power and cooling capacity with each refresh without requiring any major construction. Rack density can scale from 4kW to 40kW per rack, and the cooling capacity can easily be upgraded to match.
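To make that scaling requirement concrete, here is a rough sketch of how per-rack density might climb toward the 40kW ceiling if it doubled with each refresh. Only the 4kW starting point and the 40kW ceiling come from the article; the doubling schedule is an assumption for illustration.

```python
# Rough illustration of per-rack density growth across refresh cycles.
# Assumes density doubles each refresh and is capped at the 40 kW/rack
# design ceiling; the exact schedule is an assumption, not eBay's plan.
density_kw = 4.0
for refresh in range(1, 5):  # four hardware refresh cycles
    density_kw = min(density_kw * 2, 40.0)
    print(f"after refresh {refresh}: {density_kw:.0f} kW per rack")
# after refresh 1: 8 kW ... after refresh 4: 40 kW per rack
```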
Container efficiency
There is a raised floor inside, and infrastructure on the roof supports up to 12 40ft containers. IT racks both on the raised floor and inside the containers are highly dense, and that density actually improves cooling efficiency, according to The Green Grid’s whitepaper.
There are currently two different types of data center modules on the roof: one Dell module and two HP PODs. Cooling systems on the two types of modules are completely different from each other. The roof’s total capacity is 6MW of containerized IT load.
IT equipment in containerized data centers is generally closer to the cooling systems than it is on a traditional data center floor. This means it requires less air and can tolerate higher temperatures and humidity. Containers deployed at Project Mercury were designed to tolerate 50C summer temperatures continuously, using adiabatic cooling or water from the building’s hot-water cooling loop.
eBay’s senior director of global foundation services Dean Nelson told us the Dell module showed some impressive efficiency results on one hot day in Phoenix. Using adiabatic cooling and running 26kW per rack, the unit showed a PUE of 1.046 at one point, when the temperature outside was 48C.
Cooling with hot water
Summer temperatures in Phoenix peak at 49C, but The Green Grid’s paper estimates that about 6,000 hours of free air cooling are available in the region per year. Water at 31C can be delivered year-round without any mechanical chilling.
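That 6,000-hour figure is roughly two-thirds of the 8,760 hours in a year. As a hedged sketch of how such an estimate is typically made, the snippet below counts the hours in which cooling towers could hit the 31C supply target; the tower approach temperature and the weather-data input are assumptions for illustration, not figures from the whitepaper.

```python
# Rough sketch: estimate annual free-cooling hours by counting hours in which
# cooling towers can produce water at or below the target supply temperature.
TARGET_SUPPLY_C = 31.0   # supply water temperature cited in the article
TOWER_APPROACH_C = 4.0   # assumed cooling-tower approach to wet-bulb temperature

def free_cooling_hours(hourly_wet_bulb_c: list[float]) -> int:
    """Count hours where tower water (wet bulb + approach) meets the target."""
    return sum(1 for wb in hourly_wet_bulb_c
               if wb + TOWER_APPROACH_C <= TARGET_SUPPLY_C)

# Fed a full year of hourly wet-bulb readings for Phoenix, this would return
# something on the order of the ~6,000 hours cited in the whitepaper.
```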
There are two parallel water-cooling systems in the data center, and both the racks on the raised floor and the containers can use one or both cooling loops. One is a traditional chilled-water loop; the other is a hot-water cooling loop. The latter uses cooling-tower water passed through a waterside economizer, delivering water at 30C to the IT load. Since the local climate allows water to be free-cooled down to 31C year-round, the hot-water loop is a year-round supply of free cooling capacity.
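A minimal sketch of that dual-loop choice, with hypothetical loop names, an assumed chilled-water temperature and a simplified selection rule; the article only states that any given load can draw on either loop or both:

```python
# Hypothetical sketch of the dual-loop choice described above.
# Loop names, the chilled-water temperature and the rule are illustrative assumptions.
HOT_LOOP_SUPPLY_C = 30.0      # waterside-economizer (free-cooling) loop
CHILLED_LOOP_SUPPLY_C = 18.0  # traditional chilled-water loop (assumed value)

def select_loop(max_acceptable_water_c: float) -> str:
    """Pick which loop can serve a load, given the warmest water it can accept."""
    if max_acceptable_water_c >= HOT_LOOP_SUPPLY_C:
        return "hot-water loop (year-round free cooling)"
    return "traditional chilled-water loop"

print(select_loop(30.0))  # hardware designed for 30C water -> free cooling
print(select_loop(20.0))  # a load needing colder water -> chilled loop
```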
Servers and the cooling system were designed to use water at a maximum temperature of 30C and a maximum IT inlet-air temperature of 36C. This was possible because eBay pressed its server manufacturers to design equipment that runs reliably under hot conditions.
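Expressed as a trivial check, that design envelope looks like this; the function is hypothetical, but the two limits are the ones quoted above:

```python
# Design envelope quoted above: water up to 30 C, IT inlet air up to 36 C.
MAX_WATER_C = 30.0
MAX_INLET_AIR_C = 36.0

def within_design_envelope(water_c: float, inlet_air_c: float) -> bool:
    """True if supplied water and inlet-air temperatures meet the design limits."""
    return water_c <= MAX_WATER_C and inlet_air_c <= MAX_INLET_AIR_C

print(within_design_envelope(30.0, 36.0))  # True: at the limit, still in spec
print(within_design_envelope(31.0, 36.0))  # False: water too warm
```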
From 4kW to 40kW per rack
The initial installation uses in-row cooling. Once a rack reaches 28kW, a passive rear door can be added. As rear doors are added, in-row units can be removed, leaving room for more IT racks. If racks reach 40kW, direct-to-chip cooling can be added so the hot-water loop can continue to do the job. If densities rise further or the weather turns hotter than expected, valves can be switched to add chilled water to the mix.
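That progression reads naturally as a staged decision per rack. A hedged sketch of the idea, where the stage names follow the article but the hard cutoffs are a simplification:

```python
# Sketch of the staged cooling progression described above.
# Thresholds follow the article; treating them as hard cutoffs is a simplification.
def cooling_stage(rack_kw: float, hotter_than_expected: bool = False) -> str:
    """Pick the cooling approach for a rack at a given density."""
    if rack_kw < 28:
        stage = "in-row cooling"
    elif rack_kw < 40:
        stage = "passive rear-door heat exchanger"
    else:
        stage = "direct-to-chip cooling on the hot-water loop"
    if hotter_than_expected:
        stage += " + chilled water blended in via switched valves"
    return stage

print(cooling_stage(10))        # in-row cooling
print(cooling_stage(30))        # passive rear-door heat exchanger
print(cooling_stage(40, True))  # direct-to-chip + chilled-water trim
```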
The roof modules use a plate-type heat exchanger. Chilled water runs through it to trim the temperature of the hot-water loop if outside temperatures climb higher than expected.
All of this amounts to a data center that will stay efficient as its density goes up. Ultimately, it owes its success to eBay pushing its vendors and engineers away from business as usual.
This article originally appeared in the DatacenterDynamics FOCUS magazine special supplement FOCUS on Cooling