The digital economy is expanding at an unprecedented pace, driven by advancements in artificial intelligence (AI).

This rapid growth has heightened the demand for robust infrastructure and spurred a surge in data center construction, contributing to a significant rise in US energy consumption of a kind not seen in decades. In fact, the International Energy Agency (IEA) projects that global energy demand for AI-driven data centers will more than double by 2026.

This creates a central tension of the AI era: balancing the environmental impact of data centers against the need to expand infrastructure to support burgeoning AI technologies.

To support AI advances, companies are rapidly building new data centers in diverse locations, from metropolitan hubs to rural communities across the US. However, growing societal pressure over the enormous energy and water consumption of these facilities is pushing data center owners to step up efforts to make them more sustainable.

Industry projections indicate an annual development rate of 120-130 new hyperscale data centers, driven primarily by AI, suggesting that this expansion trend will continue well into the future.

The AI boom and its surge in energy demand

The surge in energy demand is vividly illustrated by the energy intensity of AI applications, with tools like ChatGPT consuming up to 10 times more energy per request than a standard Google search.

Training and running inference on AI models requires significant computing power, which is where new, power-hungry GPU-based servers come into play. The higher power draw of these servers generates more heat, which in turn drives up the energy consumed by the data center's cooling systems. Those systems are essential for maintaining optimal operating temperatures, preventing overheating and hotspots caused by the concentrated heat load. This is why the power supply and heat dissipation required for AI computing differ from those of traditional data centers.

The growing energy intensity of AI applications tied to these GPU-based servers adds another layer of complexity to the sustainability challenges facing the data center industry and underscores the urgent need for innovative, environmentally friendly approaches to cooling and energy management. Developing more efficient cooling strategies is therefore becoming increasingly critical.

Virtual Twins: Transforming sustainability in AI-driven data centers

To tackle the challenge of meeting the growing global demand for sustainable data centers, it is essential to adopt a holistic approach by modeling and simulating the behavior of the data center as a whole, taking into account:

  • The different servers and their associated application load
  • The cooling system and the cold air patterns it produces
  • The direct liquid cooling (DLC) systems, if any

This is where virtual twins come in. Virtual twins allow data center stakeholders and operators to model and analyze real-world phenomena virtually throughout the data center's lifecycle, and they enable scenario-based simulations to predict and address potential issues before they arise.

The strength of the virtual twin lies in its ability to analyze the data center's energy behavior holistically, aggregating the behavior of each subsystem: the servers and their application load, the air cooling and/or DLC systems, the energy management systems, and ultimately the grid that supplies the facility with energy.

By adjusting the parameters of these subsystem models, stakeholders can orchestrate how applications are distributed across the servers and fine-tune the settings of the cold air production and DLC systems, with the aim of significantly reducing energy consumption. This control can also be automated through an AI-based application. The goal is to coordinate the fast-paced changes in compute allocation with the slowly reacting cooling system, ensuring optimal cooling at all times.
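As a rough illustration of how such orchestration could work, the sketch below models a toy data center in Python: server power depends on how load is split across racks, cooling power depends on the resulting heat load and a supply-air setpoint, and a simple search finds the combination that minimizes total power. The rack names, coefficients, and the cooling-efficiency relationship are illustrative assumptions, not figures from any real virtual twin product.

```python
# Toy "virtual twin" energy model: aggregate server power and cooling power,
# then search over workload placement and a cooling setpoint to minimize
# total facility power. All names and coefficients are illustrative assumptions.

from itertools import product

SERVERS = {
    # name: (idle power in kW, additional power per unit of load in kW)
    "rack-a": (0.4, 1.2),
    "rack-b": (0.5, 1.0),   # assumed slightly more efficient under load
}

def server_power(allocation):
    """IT power (kW) for a given fractional load per rack (0.0 to 1.0)."""
    return sum(idle + slope * allocation[name]
               for name, (idle, slope) in SERVERS.items())

def cooling_power(it_power_kw, setpoint_c):
    """Cooling power (kW) needed to remove the IT heat load.
    Assumption: cooling efficiency improves with a warmer supply-air setpoint."""
    cop = 2.0 + 0.15 * (setpoint_c - 18.0)   # crude, assumed relationship
    return it_power_kw / cop

def total_power(allocation, setpoint_c):
    it = server_power(allocation)
    return it + cooling_power(it, setpoint_c)

# One "unit" of compute demand, split between the two racks.
splits = [round(x * 0.1, 1) for x in range(11)]   # share of load on rack-a
setpoints = [18.0, 21.0, 24.0, 27.0]              # candidate setpoints (deg C)

best = min(
    (({"rack-a": s, "rack-b": round(1.0 - s, 1)}, sp)
     for s, sp in product(splits, setpoints)),
    key=lambda cand: total_power(*cand),
)
alloc, sp = best
print(f"best allocation: {alloc}, setpoint: {sp} C, "
      f"total power: {total_power(alloc, sp):.2f} kW")
```

In a real deployment this search would be replaced by the virtual twin's physics-based models and, potentially, an AI-based controller, but the principle is the same: evaluate candidate compute placements and cooling settings against a holistic energy model before applying them.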

Additionally, virtual twins can simulate the behavior of electrical, cooling, and solar panel systems under various conditions, including equipment failure, offering valuable insights into the data center's performance at every level, from individual assets to entire systems. This capability allows companies to ensure their facilities meet design specifications while reducing power usage and operating costs. By facilitating more precise planning and operational adjustments, virtual twins can enhance sustainability and minimize the environmental impact of data centers.
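For instance, a scenario-based "what-if" check might ask whether the remaining cooling capacity still covers the IT heat load when any single cooling unit fails. The sketch below uses assumed, purely illustrative capacities and loads.

```python
# Illustrative failure-scenario check (assumed figures): does cooling capacity
# still cover the IT heat load if any one cooling unit fails?

COOLING_UNITS_KW = [300.0, 300.0, 300.0, 300.0]   # assumed capacity per unit
IT_HEAT_LOAD_KW = 800.0                            # assumed facility heat load

def remaining_capacity(failed_index):
    """Total cooling capacity (kW) with one unit out of service."""
    return sum(c for i, c in enumerate(COOLING_UNITS_KW) if i != failed_index)

for i in range(len(COOLING_UNITS_KW)):
    margin = remaining_capacity(i) - IT_HEAT_LOAD_KW
    status = "OK" if margin >= 0 else "SHORTFALL"
    print(f"unit {i} failed: remaining {remaining_capacity(i):.0f} kW, "
          f"margin {margin:+.0f} kW -> {status}")
```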

Finally, the virtual twin makes it possible to capture real-world operating data and compare it with simulation results, informing future upgrades that improve energy efficiency while reducing operating costs.

Given that cooling systems and servers account for about 80 percent of data center energy consumption, simulating and analyzing these systems together represents a transformative approach, enabling data center owners to cut cooling energy costs by 20 to 30 percent and server energy costs by up to 10 percent.
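As a back-of-the-envelope illustration of what those figures could mean at the facility level, assuming purely for illustration that servers account for roughly 50 percent and cooling for roughly 30 percent of total energy (consistent with the combined 80 percent above):

```python
# Back-of-the-envelope facility-level savings under ASSUMED energy shares:
# servers 50% and cooling 30% of total energy (together ~80%, per the article).
server_share, cooling_share = 0.50, 0.30
cooling_savings = (0.20, 0.30)   # 20-30% of cooling energy (article figure)
server_savings = 0.10            # up to 10% of server energy (article figure)

low = cooling_share * cooling_savings[0] + server_share * server_savings
high = cooling_share * cooling_savings[1] + server_share * server_savings
print(f"total facility energy reduction: {low:.0%} to {high:.0%}")
# -> roughly 11% to 14% under these assumed shares
```

Under those assumed shares, the per-system savings quoted above would translate into a reduction of roughly 11 to 14 percent of total facility energy; the exact figure depends on how a given facility's consumption actually breaks down.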

Charting the path forward: The future of sustainable innovation

Overall, the environmental and operational impacts of a data center’s material, space, energy, and water consumption are significant. Whether developing a hyperscale data center in a remote location or an edge data center closer to end users, owner-operators must implement innovative strategies to minimize their carbon footprint while addressing increasing market demands.

Striking the right balance among these factors is crucial for achieving sustainability goals and ensuring operational efficiency. Embracing advanced technologies and sustainable practices will be key to building a more eco-friendly and efficient data center future.