Traditional data center designs have remained largely consistent over the last few decades, typically supporting around eight to 10 kilowatts (kW) per rack, with design features like raised floors and perimeter cooling. However, with the growing demands of AI and accelerated high-performance computing, there's a noticeable shift toward creating data centers capable of handling much higher power densities – up to 100kW per rack.
This allows for significantly more processing power and the potential for higher revenues, whether from colocation tenants or AI model users. While the core structure of legacy data centers may persist, key components will need to evolve to meet the changing needs of today and future demands.
In a conversation with DCD, Steven Carlini, VP of innovation and data center for Schneider Electric, shares insights on how emerging technology trends are set to challenge traditional infrastructure.
Challenges of legacy data centers
As demands for higher power and cooling efficiency grow, legacy designs struggle to keep pace. For example, the raised floors found in older facilities, originally introduced to house wiring and cooling systems, were practical at the time. However, Carlini explains why this design is likely to be phased out in future data centers:
“I don’t believe the raised floor will be incorporated in new accelerated computing AI data centers because the sheer weight of thousands of kilograms of high-power GPU servers – and the fact that they’re likely to be liquid-cooled – means raised floors just can’t support it.” He adds: “Much of the power distribution and cooling piping will actually be above the IT racks, and data centers will go more vertical.”
As data centers require ever-greater power density, the limitations of raised floors become more apparent, particularly under the weight of denser, heavier racks. This makes retrofitting legacy data centers especially challenging, as power distribution must become denser to meet modern needs. The future of data center design will look quite different: rows of thousands of racks will no longer be impressive or desirable.
Trends in future-proofing data centers
As the demand for high-density computing continues to rise, data center designs are evolving to accommodate greater power requirements. Carlini highlights this shift by noting:
“As racks are designed to support increasing power demands, we’re shifting from deploying 100 racks at 10kW to more like 10 racks of 100kW per rack for AI clusters. This significantly reduces the IT floor space required and the associated power distribution.”
Carlini continues: “For instance, consider the next-generation Nvidia Rubin architecture, which suggests a capacity of 240kW per rack. It’s impractical to build for 240kW per rack and operate at only 40kW; this approach is pushing the limits of densification and cooling capabilities.”
This transformation reflects a broader trend in data center architecture, where efficiency and space optimization are becoming paramount. Carlini further explains:
“The data centers of the future will feature much less white space or IT room space, with a greater emphasis on external equipment.”
This shift means more chillers will be placed outside the facility, while the need for traditional power distribution systems and medium- and low-voltage switchgear will shrink as the load becomes more concentrated. Such changes promise not only to streamline operations but also to enhance the overall efficiency of data center management in an increasingly power-hungry landscape.
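The densification shift Carlini describes can be sketched with simple arithmetic: the same 1MW of IT load delivered as 100 racks at 10kW each, or as 10 racks at 100kW each. The per-rack footprint figure below is an assumed value for illustration only, not a Schneider Electric specification:

```python
# Illustrative comparison of legacy vs AI-cluster rack deployments.
# The footprint per rack (including aisle share) is an assumed figure.
RACK_FOOTPRINT_M2 = 2.5

def deployment(num_racks: int, kw_per_rack: int) -> dict:
    """Summarise total power and floor space for a rack deployment."""
    return {
        "racks": num_racks,
        "total_kw": num_racks * kw_per_rack,
        "floor_m2": num_racks * RACK_FOOTPRINT_M2,
    }

legacy = deployment(100, 10)      # traditional design: 100 racks at 10kW
ai_cluster = deployment(10, 100)  # AI cluster: 10 racks at 100kW

# Same total IT load, one-tenth the racks and floor space
assert legacy["total_kw"] == ai_cluster["total_kw"] == 1000
print(f"Legacy:     {legacy['racks']} racks, {legacy['floor_m2']:.0f} m2")
print(f"AI cluster: {ai_cluster['racks']} racks, {ai_cluster['floor_m2']:.0f} m2")
```

The point of the sketch is that total power stays constant while IT floor space and the length of power-distribution runs fall by an order of magnitude, which is the efficiency gain Carlini highlights.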
Application to key verticals: Healthcare and finance
AI applications are increasingly influencing a range of critical industries, including healthcare. With vast amounts of medical data stored online, the potential for AI to improve diagnostics is significant. Carlini notes:
“It’s really interesting to look at diagnosis – if you’re a doctor, you know there’s no way to read everything that’s coming out or to review every successful and unsuccessful case. Savvy doctors may not use AI directly for diagnosing, but they leverage it to gather historical data and identify treatment trends with higher success rates.”
This concept extends to other sectors as well, highlighting the promising future of AI in personalized services, such as tailored healthcare and individualized financial advice. To effectively manage these advanced workloads, Carlini outlines three potential options for legacy data centers to implement AI models:
- Off-the-shelf models: Leverage pre-built AI models from cloud providers and pay based on usage.
- Private cloud: Deploy AI models in a private cloud environment, enabling customization to enhance efficiency and optimize power usage.
- In-house development: Build custom AI models from scratch. This is the most time-consuming option, requiring specialized teams to develop, train, and optimize models for specific applications.
Practical solutions for legacy data centers
As companies with older facilities face the challenge of modernizing their data processing capabilities, a critical question arises: how do they determine the optimal approach for managing their data to meet both current and future needs?
To address these challenges, companies typically begin by evaluating their specific application requirements. Carlini explains:
“For instance, a manufacturing facility aiming to transform into a smart factory would need to integrate various IT systems, such as point-of-sale, CRM, and supply chain management. Once they identify their necessary IT infrastructure, they would approach Schneider for advice on their desired setup, focusing on power, cooling, and, particularly, backup time.”
He adds that while larger cloud providers offer extensive backup options, organizations deploying locally must carefully consider their power availability and the costs associated with potential outages.
In the current landscape, many enterprises are not in the data center business and instead seek off-the-shelf solutions to streamline their operations.
“We provide modular, prefabricated systems that are built and tested in the factory, then deployed on-site with the required IT equipment,” Carlini notes.
These systems come equipped with remote monitoring and management software, which is crucial for companies lacking in-house data center managers. As a result, organizations can increasingly rely on Schneider to address their power needs, negotiate utility contracts, and deploy power and cooling systems, particularly for high-density AI applications.
However, alongside operational considerations, companies must also confront the complexities of deploying an AI application. Cybersecurity measures such as network segmentation, isolation, or air gapping protect sensitive data effectively, but they can also hinder an AI application's ability to pull and report on that information.
It is also essential that managed IoT devices on enterprise networks do not provide a path to sensitive data. Carlini emphasizes Schneider’s proactive role in this area, stating:
“We use our DCIM software to not only monitor all critical equipment, but make sure they have the latest and most secure encryption – including UPS units, power distribution, and cooling systems – to ensure that everything, including network access cards, is equipped with up-to-date encryption and is safeguarded against potential security breaches.”
Moreover, he highlights the often-overlooked vulnerabilities that can arise from seemingly unrelated systems. “For instance, air conditioning units used for maintenance can inadvertently provide access points for attackers, thereby expanding the company’s ‘attack surface’,” Carlini explains. Schneider assists organizations in assessing these vulnerabilities, ensuring that various parts of the infrastructure, including edge AI data centers, are securely connected without exposing other systems to risk.
In conclusion, as data centers confront growing demands, organizations should embrace innovative solutions to evolve from legacy designs into high-functioning facilities that support AI while prioritizing cybersecurity. Adopting these changes will improve operational efficiency and unlock AI's potential in critical sectors like healthcare and finance.