I was recently asked why our cooling optimization consultants find it necessary to use an IoT platform that monitors the thermally sensitive equipment in a data center. It’s true that as subject matter experts we are able to build up a mental picture of the dynamic behavior of any cooling system over time but, from experience, we’re careful not to rely just on mental pictures when it comes to cooling.
Our cooling optimization team is trained to use a sensor and software platform to support decision-making that leads to risk-free energy savings. This gives us the real-time data insight that is essential for effective thermal optimization. However, this doesn’t mean we don’t still take manual readings to support our overall understanding. In fact, we take great pride in our experimental skills and are constantly confirming and calibrating our IoT data against other manually derived data sources.
Granularity is key
If organizations are serious about achieving new levels of thermal efficiency in their data centers but can’t devote constant attention to this one aspect of their operation, they absolutely need access to meaningful real-time cooling data that they can apply to their space, power, cooling and airflow dynamics.
With cooling representing around 30 percent of a data center’s operating cost, it’s clearly critical for organizations to focus on thermal optimization, particularly as cooling issues still account for almost a third of unplanned data center outages.
For true cooling optimization, however, we believe it’s necessary for data centers to get much more granular. That means monitoring and reporting temperature and cooling loads far more actively. ASHRAE recommends a minimum of three temperature sensors per rack; achieving this would likely require around ten times more sensors than are deployed in today’s typical data center.
Realistically, data centers aren’t going to switch to this level of sensing immediately, and until now traditional sensor costs would have made it financially unviable. What is clear, though, is that from a risk management perspective, temperature sensing is now a critical requirement, particularly if you want to be certain that all your essential servers are actually ASHRAE compliant. Are you prepared to take the risk that they’re not?
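To make the compliance question concrete, here is a minimal sketch of the kind of check rack-level sensing enables. It assumes the ASHRAE recommended inlet envelope for standard equipment classes (18–27 °C); the rack labels, sensor readings and function name are invented for illustration and are not part of any particular vendor’s software.

```python
# Hypothetical sketch: flag racks whose inlet temperatures fall outside the
# ASHRAE recommended inlet envelope of 18-27 C. Rack names and readings
# are invented for illustration.

ASHRAE_MIN_C = 18.0  # recommended lower inlet temperature bound
ASHRAE_MAX_C = 27.0  # recommended upper inlet temperature bound

def non_compliant_racks(readings: dict[str, list[float]]) -> dict[str, list[float]]:
    """Return racks with any inlet sensor outside the recommended envelope.

    readings maps a rack label to its inlet sensor temperatures in Celsius,
    e.g. three sensors per rack: bottom, middle, top.
    """
    flagged = {}
    for rack, temps in readings.items():
        out_of_range = [t for t in temps if not ASHRAE_MIN_C <= t <= ASHRAE_MAX_C]
        if out_of_range:
            flagged[rack] = out_of_range
    return flagged

sample = {
    "rack-A01": [21.4, 23.0, 24.1],  # compliant
    "rack-A02": [22.0, 26.5, 28.3],  # top-of-rack sensor runs hot
    "rack-B07": [17.2, 19.0, 20.5],  # bottom sensor overcooled
}
print(non_compliant_racks(sample))  # rack-A02 and rack-B07 are flagged
```

With only one sensor per rack, the hot top-of-rack reading in the second example would likely go unnoticed, which is the practical argument for the three-sensors-per-rack minimum.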
Measurements alone aren’t enough, but access to real-time, rack-level data provides exactly the data platform needed for software-enabled real-time decision-making and scenario planning, the capabilities that organizations increasingly require to optimize their critical facilities.
It’s only when you combine this level of granular cooling and thermal data with smart software that you can start to track cooling loads in real time. True thermal optimization requires a proven, safe process based on thousands of real-time sensors and expert spatial models that combine to remove the uncertainty from data center cooling.
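As a generic illustration of what tracking cooling load from live sensor data can look like, the sketch below estimates a cooling unit’s sensible load from its airflow and supply/return temperatures using the standard relation Q = ṁ · cp · ΔT. The unit, readings and function name are assumptions for the example, not a description of any specific product’s model.

```python
# Hypothetical sketch: estimate a cooling unit's sensible load from
# real-time airflow and temperature readings, via Q = m_dot * cp * dT.
# Figures are illustrative; this is a generic textbook calculation.

AIR_CP_KJ_PER_KG_K = 1.006  # specific heat of air at typical room conditions
AIR_DENSITY_KG_M3 = 1.2     # approximate density of air

def cooling_load_kw(airflow_m3_s: float, return_c: float, supply_c: float) -> float:
    """Sensible cooling load in kW from volumetric airflow and air temperatures."""
    mass_flow = airflow_m3_s * AIR_DENSITY_KG_M3  # kg/s
    return mass_flow * AIR_CP_KJ_PER_KG_K * (return_c - supply_c)

# A unit moving 4.5 m^3/s of air, cooling it from 32 C back down to 18 C:
load = cooling_load_kw(4.5, return_c=32.0, supply_c=18.0)
print(f"{load:.1f} kW")  # roughly 76 kW
```

Repeating this calculation per unit on every sensor update, and comparing the summed load against the installed cooling capacity, is one simple way granular data turns into an actionable real-time picture.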
Dr Stu Redshaw is the chief technology officer of EkkoSense