Ian Bitterlin’s opinion article raises very interesting issues around computational fluid dynamics (CFD), topics that the industry has been debating for some time. While we don’t agree with the conclusion, it is great that the subject is again under scrutiny and being discussed in this way. It’s important that everyone understands the huge variety of CFD technology available for the data center, and is aware of the benefits and shortcomings of the science.
One of the main tenets of the piece is absolutely correct – CFD models are only as good as the information that is fed into them, and a user should be fully trained and understand what is necessary to get a useful model.
Lack of understanding
Let’s consider the suggestion that CFD is good for electronics, but of no use for a data center, or more specifically, that the models are too simple because a data center (which is in essence just a box of electronics) is too complicated to model accurately.
I would say there is a deeper problem: CFD for data centers has been applied successfully, but because it is not well understood by the wider market, people have come to rely upon simple and inadequate models.
You could argue this is even worse than doing nothing at all. People may believe what the pretty picture is telling them when in reality it is wrong, leading them to make poor decisions that adversely impact their organization.
However, I can’t agree with the suggestion that simplified CFD models for data centers have no value.
Nor can I agree with the suggestion that modern CFD models are unable to capture sufficient detail of a live data center, leaving the model as little more than a “pretty picture”. The best CFD technology, used in the right way, is an extremely powerful tool that can help improve efficiency and reduce risk in the data center in a way that real-time monitoring can’t predict.
There is a lot of empirical data to support this and organizations like The Green Grid recommend using it, alongside measurement, to achieve the highest level of its Performance Indicator evaluation.
Similarly, research organizations such as the National Science Foundation ES2 project use it as the only practical way to cover a plethora of possible configurations that could not viably be investigated using experiment.
In fact, in some of the research where simulation and experiment have been running side by side, simulation has shown up significant flaws in the measurements that otherwise would have gone undetected.
Good simulations give results
Let’s take a look at simple models first. CFD has long been used across many industries as a tool for sensitivity studies. Whenever anything has variability, designers run parametric studies and try to select designs whose behavior is insensitive to changes in the parameters that are uncertain. So yes, a tick-box simulation of the data center as a conceptual model at 100 percent design load probably isn’t very useful – not least, as Ian says, because the data center in reality will never look like that.
However, parametric studies looking at the response to changes in heat load and different airflow requirements can allow the designer to test the data center’s response to a wide range of likely configurations. These simulations, on modern computers, can be very quick without having to adopt a simplified physics model. They can also accommodate things like the control responses typical of cooling systems or IT. In fact, those are just the sorts of things that can be optimized – where should my pressure and temperature sensors be for controls? What pressure should I control to? How critical is the way in which I seal containment? What happens if some servers go to a high fan flow rate? These are practical questions the designer has to ask, and a CFD model can answer them. The “pretty pictures” are simply used to communicate the findings to stakeholders who may be less technical.
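To make the idea of a parametric study concrete, here is a deliberately crude sketch. It is not a CFD solver – the energy-balance formula, the fixed recirculation fraction, and every number in it are hypothetical illustrations – but it shows the shape of the workflow: sweep the uncertain parameters (heat load, delivered airflow) and shortlist the configurations worth modeling in full detail.

```python
# Toy parametric sweep - an illustration of the workflow, not a CFD model.
# All physics is reduced to a single assumed energy balance; a real CFD
# simulation replaces exactly this simplification with a full flow field.

from itertools import product

INLET_LIMIT_C = 27.0  # assumed rack-inlet temperature ceiling


def estimated_inlet_temp(load_kw, airflow_m3s, supply_c=18.0):
    """Crude estimate of rack inlet temperature.

    Assumes a fixed fraction of hot exhaust air recirculates to the
    inlet - a hypothetical stand-in for what CFD actually resolves.
    """
    rho, cp = 1.2, 1005.0  # air density (kg/m^3) and specific heat (J/kg.K)
    exhaust_rise = load_kw * 1000.0 / (rho * cp * airflow_m3s)
    recirc_fraction = 0.5  # assumed, constant across configurations
    return supply_c + recirc_fraction * exhaust_rise


# Sweep the uncertain parameters and keep configurations within limits.
loads = [50, 100, 150, 200]  # kW per row (hypothetical)
flows = [5.0, 10.0, 15.0]    # m^3/s delivered (hypothetical)

viable = [
    (load, flow)
    for load, flow in product(loads, flows)
    if estimated_inlet_temp(load, flow) <= INLET_LIMIT_C
]

for load, flow in viable:
    print(f"{load} kW @ {flow} m3/s -> "
          f"{estimated_inlet_temp(load, flow):.1f} C inlet")
```

The point is not the arithmetic but the structure: once the sweep is automated, the same loop can drive full CFD runs, and the designer reads off which combinations of load, airflow, sealing and control setpoints keep the facility inside its envelope.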
In terms of modeling real, live data centers: of course, creating a good, tolerant design can’t defend you against decisions to deploy equipment of a type, or in a way, that wasn’t considered. That’s where CFD in an operational planning mode is of value and is being used. Whilst I would agree that data center CFD and computer hardware were not ready 10 years or so ago for the detail required to include today’s equipment specifics and control behavior, continuous improvement of the best tools to keep pace with data center technology changes means that CFD is now effectively used for real data center management, both with and without DCIM.
CFD allows the business to test proposed configurations – accounting for IT and infrastructure control systems – and thereby avoid configurations that will break the intended operation. The business can choose a configuration that will work effectively while also being efficient. In fact, just recently I was asked to look at a model where the results were substantially different to the measured data. Look as I might, I couldn’t see anything that accounted for the significant discrepancy. In this case it turned out the measurement system itself was faulty – it’s good to have simulation and measurement side by side, because differences can be used to identify when either has a problem.
Finally, you will notice that I specified that “The best” CFD technology is very useful. The truth is, as with many industries, there is a wide range of technologies and applications available that offer varying levels of insight. Again, understanding the capabilities and limitations of the tool you are using is essential.
Perhaps the conversation should not be “Is CFD for data centers just pretty pictures?” but rather “Given there are CFD tools with the capability of modelling current data centers, what does a practitioner, in data center design or operation, need to do to make CFD deliver the ugly truth?”
Mark Seymour is CTO at Future Facilities