Why are cold aisles so darn cold?

Why might you find yourself warming up in a hot aisle because you forgot to wear a sweater into the data hall?

More to the point, why haven’t data center operators embraced expanded temperature and humidity ranges that were shared nearly a decade ago?

[Image: temperature chart]

When ASHRAE issued the revised environmental standards in 2008, and then bumped them again in 2011, those of us in the Mission Critical HVAC business expected IT spaces to start warming up. Heck, I can recall trying to talk down electrical engineers who were kvetching because they were convinced power components would start to fry due to the higher temps.

But while the Web 2.0 crowd has moved closer to the metaphorical equator, in general, the enterprise data center of 2017 still feels pretty much like it did in 2007.

In part, this disparity can be attributed to the difference in their business priorities. The business of the ISPs and the Keepers of the Cloud is data center-centric. So their bottom line gets boosted when they aggressively standardize, virtualize and optimize the IT environment and everything within it. Simply put, if job one is to run the most efficient data center, then you exercise every option in your arsenal, and that includes pushing the environmental envelope.

But if your enterprise is something else… like manufacturing widgets, trading securities or monitoring ISIS… then your data center plays a support role within the enterprise, and it receives only as much attention as that role warrants. It’s just the price you pay when you are not sitting at the head of the table.

However, I would posit that the primary obstacle to raising temperatures is more corporal than corporate. Specifically, the reason data centers aren’t allowed to get uncomfortable is that people are stationed inside them, and people expect to be comfortable. Put another way, too many people are taking up residence within spaces that should be set aside exclusively for equipment.

The first time I stepped into a data center I was struck by the cold and the brightness. I wasn’t surprised, however, because it was a data center for goodness’ sake, and I had some expectation of what a high-tech space should be like. Geez, you wouldn’t expect to find HAL-9000 in a parking garage, would you? And at the time it made sense. Computers were as finicky as Morris the Cat, and operators were close-coupled to the equipment, with monitors to monitor and dot matrix printers at the ready.

The Vehicle Assembly Building, Kennedy Space Center – Wikimedia Commons

Later, I was fortunate enough to visit the data center located deep within the bowels of the Vehicle Assembly Building at the Kennedy Space Center (that’s the big one with the US flag and NASA logo painted on its side). The machines and people in that room had guided the Space Shuttle Program successfully for decades. But there was no common cabinetry, no air management of any kind, and the number of rolling chairs in the space was rivaled only by the number of black rotary dial phones.

And this was in 2010.

While this may have been understandable for an aging program weeks away from its last mission, it shouldn’t be the case in today’s data center. But too often it is: whether it’s habit, or laziness or poor space management, too many people are still working in the wrong place at the wrong time…all the time.

There is no good reason to still be residing within the confines of an IT space. Most of the time, a data center should be dark and warm…feeling more like a warehouse than a surgical suite. Air management should be driven by the needs of the equipment, not by maintaining a tech’s oasis somewhere out on the raised floor.

But what makes this problem an opportunity is that it can be solved almost unilaterally. Most supervisors can manage where their folks are stationed without having to go to Corporate. And of all industries, you would think technology practitioners would be the most adept at implementing work-from-home strategies, and reimagined office paradigms utilizing hoteling, hot desks and the like.

Once a permanent human presence is eliminated, the energy saving opportunities stack one on top of another.

Lighting policies can be implemented immediately. Since egress and life safety requirements are accounted for in the original lighting design, lights that have to stay on will stay on. This means you can simply flip switches and still maintain a safe environment. Motion sensors can be installed, or you can simply post a sign that says TURN OFF THE LIGHTS. Heck, you can even laminate it.

A bit trickier, but no less doable, is to start a temperature and humidity reset program where ambient conditions are adjusted and monitored. Adjustments continue until hot spots or other issues arise; at that point you back down a step or two, make corrections, and restart the process until it plateaus. It’s your call how you implement it and what is acceptable, but the key is that all stakeholders need to be advised of and invested in the effort. Nothing poops a party faster than an application engineer with a blinking red light who wasn’t invited to the thermostat reset fiesta.
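For the procedurally minded, the reset loop above can be sketched in a few lines. This is a purely illustrative simulation, not a BMS integration: the function name, step size, ceiling and hot-spot check are all assumptions I've made up for the sketch, and in practice "has a hot spot" means readings from your monitoring system, not a lambda.

```python
def reset_setpoint(current_f, step_f, max_f, has_hot_spots):
    """Raise the ambient setpoint one step at a time until a hot spot
    (or other issue) appears or the allowable ceiling is reached, then
    back off one step and hold -- the 'plateau' the article describes.

    has_hot_spots: callable taking a candidate setpoint (deg F) and
    returning True if problems would surface at that temperature.
    """
    setpoint = current_f
    while setpoint + step_f <= max_f:
        candidate = setpoint + step_f
        if has_hot_spots(candidate):
            # An issue surfaced: stay backed down a step and stop here.
            return setpoint
        setpoint = candidate  # no issues observed: hold the new setpoint
    return setpoint

# Simulated run: pretend hot spots appear above 78 F.
final = reset_setpoint(current_f=68, step_f=2, max_f=80,
                       has_hot_spots=lambda t: t > 78)
print(final)  # plateaus at 78
```

The real-world version is slower and messier (you dwell at each step long enough for the room to settle, and you log who signed off), but the control logic is this simple: step, observe, back off, hold.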

In the end it’s just common sense to separate the people from the processors. Even if the data center is not the primary business of your enterprise, wasting energy and money is never a good idea. So to you white space squatters we simply say: Get out already.

Kevin Dickens is a design principal with Jacobs Engineering Group - Mission Critical