Published on 26th December 2012 by Mark Monroe
There still seems to be a lot of Fear, Uncertainty, and Doubt around the ASHRAE Thermal Guidelines and data center operating temperatures. An audience poll in December 2012 at the Gartner Data Center conference in Las Vegas revealed that 59% of attendees run at 72F (22C) or less (including 7% who said "it feels nice and cold"), and only 1 in 10 run as warm as 77F (25C) or above. Even though it’s been four years since ASHRAE published its expanded guidelines in 2008, bringing the term “Allowable” into the vocabulary of data center equipment vendors, very few data center managers venture out of the old, cold habits of the past.
At one time, people around the IT industry said, “No one ever got fired for buying IBM.” That is, until about 1998 when Y2K and the dotcom boom were in full swing, and business needs changed much faster than the industry giant could handle. Then people did get fired for buying IT kit that cost more, performed less, and wasn’t flexible enough to adapt to the speed of business.
Similarly, one might say, “No one ever got fired for running a data center at 22C (72F).” If it was good enough for the previous generation of ITC equipment, why should we change? The lowest risk thing to do is keep the status quo, don’t question the old adages, and don’t get blamed for the next failure. Except…
Except, I believe that we should fire people for wasting millions in Capital Expense (CAPEX) and Operating Expense (OPEX) by running their data centers at temperatures and humidities below the full Recommended ranges. Data centers that don’t take advantage of the full range of operating conditions are forced to implement hugely expensive mechanical and electrical solutions to fight the environments that surround them. The internet hyperscale operators all run their data center set point temperatures at the high end of Recommended, because they can save money by doing so, in both existing facilities and new construction.
Running the data center below the Recommended ranges is like owning a fleet of delivery vans and running them below the posted speed limit. There are great benefits from running below the speed limit: better fuel economy, lower accident rates, lower insurance costs. But I’ve yet to see the FedEx or DHL truck that is running at 40MPH on a U.S. interstate highway.
There are other parallels between speed limits and Recommended ranges. A friend of mine who grew up in France once said (insert your own outrageous French accent), “We view those speed limit signs as, how you say, a good ‘suggestion’.” Everyone knows that the signs define the Recommended limit, but that police have discretion to “allow” higher speeds. Not that I am encouraging anyone to break the law, but it’s clear that America’s roads operate much closer to the “allowable” speed limits than the “recommended.”
There are other examples where we, as a society, follow Allowable instead of Recommended guidelines without impact to reliability or operation. Some examples:
- Automobile motor oil changes: Recommended every 3,000 miles by oil change companies; average Allowable interval of 7,800 miles according to Edmunds.com (Reed 2010)
  - Cost impact: $1,847 and 125 quarts of wasted oil over 5 years/60,000 miles of driving, a 260% increase
- Furnace air filter changes: Recommended monthly by service technicians; Allowable at longer intervals per the National Air Filtration Association manuals
  - Cost impact: $310 per 5 years, a 300% increase
- Alcohol consumption: Alcohol is not included in the USDA’s MyPlate™ nutritional recommendations, but the National Institute on Alcohol Abuse and Alcoholism says that up to two drinks per day are generally allowable (NIAAA 1992)
The impact of running at wider temperature and humidity ranges will vary from one data center to another. The Green Grid published a case study in 2011 that tracked the savings for one member company as the set point temperature was raised, 1F at a time, from 68F to 71F. Cooling system energy dropped about 3% for each degree F the set point was raised, yielding more than US$100,000 per year in savings at essentially no cost. (The Green Grid 2011)
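The per-degree figure makes it easy to run your own back-of-the-envelope estimate. Here is a minimal sketch using the roughly 3%-per-degree-F result above; the baseline cooling energy and electricity rate are illustrative assumptions, not figures from the study:

```python
def cooling_savings(baseline_kwh_per_year, degrees_raised, pct_per_degree=0.03):
    """Estimate cooling energy saved by raising the set point.

    Compounds the per-degree reduction, so each degree saves ~3% of
    what remains. Returns (kWh saved per year, fraction saved).
    """
    remaining = (1 - pct_per_degree) ** degrees_raised
    saved_fraction = 1 - remaining
    return baseline_kwh_per_year * saved_fraction, saved_fraction

# Hypothetical facility: 10 GWh/yr of cooling energy, set point raised 68F -> 71F
kwh_saved, frac = cooling_savings(10_000_000, degrees_raised=3)
print(f"{frac:.1%} of cooling energy saved, {kwh_saved:,.0f} kWh/yr")
# At an assumed $0.10/kWh, three degrees is roughly $87,000 per year
print(f"~${kwh_saved * 0.10:,.0f}/yr at an assumed $0.10/kWh")
```

Whether the savings compound or add linearly per degree matters little over a few degrees; the point is that the dollars scale directly with how far below the Recommended ceiling you are sitting today.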
In another paper, Dell Product Technologist David Moss tested a rack of ITC kit in a Schneider Electric test chamber to determine the optimal temperature set point for that equipment/infrastructure combination. The results showed that for ITC equipment from 2007, the “sweet spot” of lowest total energy consumption occurred between 75F and 79F (24C and 26C). Moss estimates that updating the test with current generation equipment would move the optimal point up to about 80F. (Moss 2011)
Of course, the biggest gains come from allowing the IT kit to run at higher temperatures for a few hours a day, a few days per year, and building a facility without mechanical cooling. By some estimates, the 25-year Net Present Value (NPV) of installing a packaged chiller system can be US$1.2M-$1.8M per MW of capacity. That NPV includes capital, energy, personnel, maintenance, water, and other expenses that contribute to the operation of a chiller plant.
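An NPV like that is just the up-front capital plus the discounted stream of annual operating costs. This sketch shows the shape of the calculation; the capital cost, annual operating cost, and discount rate below are illustrative assumptions of mine, chosen only to land inside the US$1.2M-$1.8M range the estimates quote:

```python
def npv(annual_cost, years, rate, capital=0.0):
    """Present value of capital spent today plus a flat annual cost stream."""
    return capital + sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))

# Hypothetical inputs per MW of chiller capacity (assumptions, not sourced):
capital = 500_000   # packaged chiller plant capex
annual = 80_000     # energy, water, maintenance, staffing per year
total = npv(annual, years=25, rate=0.06, capital=capital)
print(f"25-yr NPV per MW: ${total:,.0f}")
# ~US$1.5M per MW with these inputs -- money a chiller-less design avoids
```

Even modest-looking annual costs dominate the capital over 25 years, which is why eliminating the chiller plant entirely is worth far more than trimming its run hours.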
Pretty much anywhere in the world north of the 40th parallel, or anywhere the climate has low relative humidity, a system can be designed to operate without a chiller and keep the data center in the Recommended range all but a small percentage of the hours in a year, while keeping those “excursion” hours within the Allowable ranges.
So, just like driving below the Recommended speed limit may extend the life of your car but impact other Key Performance Indicators (like timeliness, schedule flexibility, and incidents of road rage), I will continue encouraging CxOs to examine what they are overspending when they operate well below the Allowable limits.
Reed, Phillip. "Stop Changing Your Oil: Breaking the 3,000-Mile Habit." Edmunds.com. Edmunds.com, Inc., 24 Aug 2010. Web. 22 Dec 2012. http://www.edmunds.com/car-care/stop-changing-your-oil.html
MyPlate, USDA nutritional recommendations. ChooseMyPlate.gov. Web. 22 Dec 2012. http://www.choosemyplate.gov/images/MyPlateImages/JPG/myplate_green.jpg
"Alcohol Alert No. 16, PH 315, April 1992." NIAAA. Web. 22 Dec 2012. http://pubs.niaaa.nih.gov/publications/aa16.htm
Brey, T. et al. "Case Study 2 - The ROI of Cooling System Energy Efficiency Upgrades." TheGreenGrid.org. The Green Grid, June 2011. Web. 22 Dec 2012. http://www.thegreengrid.org/en/Global/Content/case-studies/CaseStudyROIofCoolingSystemEnergyEfficiencyUpgrades
Moss, David. "Data Center Operating Temperature: The Sweet Spot." Dell.com. Dell, Inc., 21 June 2011. Web. 22 Dec 2012. http://goo.gl/RQ7ll .