Data centers are designed to support business requirements at a particular moment in time, but those requirements rarely remain static. Business goals change, technology evolves, and new regulatory and compliance frameworks are introduced, all at a seemingly ever-increasing rate. CIOs and CTOs must adapt to these shifting requirements in myriad ways, and this means that although the mechanical and electrical architecture of a data center may change little over its lifecycle, the IT configuration remains in a state of constant flux.
Consequently, owner-operators are faced with making alterations to a live environment, often without the ability to accurately predict how the facility will react. This poses a serious risk: the wrong decision could inhibit business processes or, in a worst-case scenario, even lead to failure. Fortunately, there is a way to test proposed changes in a safe environment before implementation.
Enter the digital twin
The idea of a ‘digital twin’ is well established in the manufacturing and aerospace industries. A Gartner survey found that 48 percent of organisations introducing IoT were already using, or planning to use, digital twins by the end of 2018. But what is a digital twin, and what benefits can it offer the data center industry?
A digital twin is a dynamic, digital representation of a real object, linked to it by measured data such as temperature, pressure, and vibration. This digital copy is updated as the real object ages, giving operators and engineers continuous information about its performance.
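The definition above can be reduced to a simple pattern: a software object that ingests telemetry from its physical counterpart and compares recent behaviour against an as-designed baseline. The following is a minimal, illustrative sketch of that pattern; the class, metric names and baseline values are hypothetical, not part of any real digital twin product.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class DigitalTwin:
    """Minimal sketch of a sensor-linked digital twin (illustrative only)."""
    asset_id: str
    readings: dict = field(default_factory=dict)

    def ingest(self, metric: str, value: float) -> None:
        # Each telemetry sample "ages" the twin alongside the real asset.
        self.readings.setdefault(metric, []).append(value)

    def drift(self, metric: str, baseline: float) -> float:
        # How far recent behaviour has moved from the as-designed baseline;
        # a real twin would apply far richer pattern detection than a mean.
        samples = self.readings.get(metric, [])
        return mean(samples) - baseline if samples else 0.0

twin = DigitalTwin("engine-001")
twin.ingest("temperature", 652.0)
twin.ingest("temperature", 658.0)
print(twin.drift("temperature", baseline=650.0))  # 5.0
```

A persistent drift away from baseline is the kind of pattern that, in the aircraft-engine case below, would trigger a maintenance dispatch.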
A good example is an aircraft engine. On a flight between Frankfurt and London, a General Electric engine will transmit around two terabytes of information to a digital twin in a General Electric data center. If the digital twin detects patterns in the data that indicate a potential fault, a maintenance team is dispatched to the plane’s destination airport with the correct parts to make the repair. Data on real-world operating conditions are also fed back to the design team to drive simulations and subsequent design improvements for future versions.
According to Boeing CEO Dennis Muilenburg, data-driven digital twins such as this will be “the biggest driver of production efficiency improvements for the world’s largest airplane maker over the next decade.” For data centers, digital twins can be equally impactful – although the type of model used is slightly different.
Digital twins for data centers
The constant flux of IT configurations means that a data center’s design is never fixed, and data-driven models suffer when the data describing the facility are inconsistent: past patterns do not necessarily map onto future ones, so future behavior cannot be predicted from past configurations.
For example, if a colocation provider moves a row of cabinets two cells to the left, this completely changes the data center environment. Previous data will not help engineers predict the future behavior of the facility once the new cabinets are installed and the IT is switched on. This demonstrates a clear gap in the effectiveness of data-driven models: while they have been used successfully in the data center industry, they remain subject to many of the same shortcomings as human operators.
To fill these gaps, rather than relying exclusively on a data-driven model, data center digital twins are also physics-based, with the ability to simulate the performance of a new configuration. A physics-based digital twin comprises a full 3D representation of the data center: the space and architecture, the mechanical and electrical systems, cooling, power connectivity, and the weight-bearing capability of the raised floor. This model allows you to predict, visualize and quantify the impact of any change in your data center prior to implementation, empowering you to make decisions with confidence.
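The elements listed above can be thought of as a structured model of the facility against which constraints are checked. The sketch below is a deliberately simplified stand-in: the class names and the one-kilowatt-of-cooling-per-kilowatt-of-IT rule are assumptions for illustration, whereas a genuine physics-based twin would run airflow (CFD) and power simulations over the 3D model.

```python
from dataclasses import dataclass

@dataclass
class Cabinet:
    name: str
    power_kw: float    # IT load drawn by the cabinet
    weight_kg: float   # load placed on the raised floor

@dataclass
class Hall:
    cooling_capacity_kw: float
    floor_limit_kg_per_tile: float
    cabinets: list

    def cooling_headroom_kw(self) -> float:
        # Simplifying assumption: 1 kW of IT load needs 1 kW of cooling;
        # a physics-based twin would simulate airflow instead.
        return self.cooling_capacity_kw - sum(c.power_kw for c in self.cabinets)

    def floor_ok(self) -> bool:
        # Crude structural check: treat each cabinet as resting on one tile.
        return all(c.weight_kg <= self.floor_limit_kg_per_tile
                   for c in self.cabinets)

hall = Hall(cooling_capacity_kw=500.0, floor_limit_kg_per_tile=1200.0,
            cabinets=[Cabinet("A1", 8.0, 900.0)])
print(hall.cooling_headroom_kw())  # 492.0
print(hall.floor_ok())             # True
```

Even this toy model shows the value of the approach: a proposed layout can be rejected on paper, before any hardware is moved.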
Forward-looking facility mapping
Physics-based digital twins remove the risk associated with making alterations to an operational data center by providing a safe space in which to accurately test the impact of changes before implementation.
Take the example of installing higher-density hardware: CIOs can use a single platform to simulate this change and understand how it would affect the entire data center environment. A digital twin enables accurate capacity planning and forecasting, and makes it easy to quantify the associated costs. Potential issues can therefore be identified early in the capacity planning stage, allowing organisations to foster collaboration, make better decisions in less time, and prevent relationships between IT and engineering teams from becoming fraught under pressure. The result is a more resilient, efficient data center infrastructure supported by effective, responsive and informed teams.
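As a back-of-envelope illustration of such a what-if exercise, the function below trials a density upgrade against a fixed cooling budget. All figures and names are hypothetical, and the check is far coarser than what a real digital twin platform would compute from a physics simulation.

```python
def assess_density_upgrade(rack_count: int,
                           kw_per_rack_now: float,
                           kw_per_rack_new: float,
                           cooling_capacity_kw: float) -> dict:
    """Rough what-if: can the cooling plant absorb a density upgrade?
    Illustrative bookkeeping only; a real twin would simulate airflow."""
    load_now = rack_count * kw_per_rack_now
    load_new = rack_count * kw_per_rack_new
    headroom = cooling_capacity_kw - load_new
    return {
        "added_load_kw": load_new - load_now,
        "headroom_kw": headroom,
        "safe": headroom >= 0,
    }

# Hypothetical scenario: 40 racks going from 5 kW to 12 kW each.
result = assess_density_upgrade(rack_count=40, kw_per_rack_now=5.0,
                                kw_per_rack_new=12.0,
                                cooling_capacity_kw=600.0)
print(result["added_load_kw"], result["safe"])  # 280.0 True
```

The point is not the arithmetic but the workflow: the question is answered in the model first, so the live environment is only touched once the answer is known.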
With a physics-based digital twin of the data center in place, IT deployment and facilities management are free to collaborate and experiment with alterations safe in the knowledge that critical business functions will be entirely unaffected – providing more opportunities to innovate and improve performance than ever before.