Artificial intelligence (AI) has transformed how we live, work, play, and do business. However, it has also created a myriad of headaches for data center operators as they cope with exponential demand for water, power, and compute. For this DCD>Talk, Stephen Worn interviewed Tuan Hoang, modular data center manager at Schneider Electric, about the benefits modular solutions hold for the deployment of AI in data centers.

Hoang cites the immediate problem as one of densification – finding the space for all the new infrastructure and IT required to handle AI workloads. He warns that this will require drastic changes to the white space in the data hall to accommodate new appliances, whilst simultaneously ensuring that the infrastructure can be upgraded to handle the higher power demands and increases in network traffic.

Turning this challenge into an opportunity, Schneider recently announced a modular AI solution, which offers data center operators a simplified entry into the AI marketplace, boasting scalability, end-to-end support, and a consistent experience across deployments.

At the heart of the deployment strategy is ‘right-sizing’: ensuring that the system is resourced correctly, with enough power and a little redundancy, but without ‘stranded capacity’. Hoang explains:

“Just like there is a shortage of components for the IT space, there will continue to be strains on the supply chain for infrastructure. What we want to do is maximize that, and with Schneider, we have the end-to-end solution from the infrastructure side, because we want to put together a solution that maximizes all the components and works well together, not stranding capacity but also looking at where our portfolio is maximized. We want to make sure we have the supply chain that supports it, and we maximize it consistently – then we can look at warehousing and manufacturing, to make sure that all these components line up and that we're not wasting anything.”

The Schneider modular AI solution is based on a series of reference designs, ensuring that when it is deployed across multiple locations, anywhere in the world, over 80 percent of the components are fixed, with only localization added bespoke. This keeps designs as uniform as possible while still allowing for the differing regulations across regions. It also means that Schneider can fulfill demand faster and speed up on-site deployment.

The result is a product that can provide a common starting point, but which can be scaled up as demand dictates, and as burgeoning AI technologies mature. It will also allow for any common standards that may come along in the future. Or as Hoang summarizes: “Making sure we don't have multiple sets of equipment, and deliver the simplest solution, in an environment that allows for the fact that the load and the future can be ambiguous, so we must ensure that the platform is robust enough to accept the changes today but also tomorrow.”

Because the modular solution is based on a scalable approach, Schneider is keen to create “common building blocks” in the form of agreed standards for things such as site feeders. By creating common standards, a knowledge base is created that allows for simpler optimization. “We want to make sure we optimize that block so when we scale it to 10, 100 or even 200-gigawatt sites, the building blocks stay the same, but still have the economy of scale to support a gigawatt versus 100-megawatt deployment,” adds Hoang.

There are two distinct challenges surrounding the deployment of AI in the data center: training and inference. In the case of the former, there is a distinct start and end point to the process, with the main challenges coming from preparing the floorspace, power provision, and connectivity to the module. Inference is more challenging because it can run into issues such as data sovereignty and intellectual property, which vary in scope and rigidity between territories. Hoang therefore points to the importance of choosing the right location for your AI module.

And with the sheer amount of power high-density workloads require, Schneider is keen to roll out AI in a sustainable way. Revisiting the topic of ‘right-sizing’, Hoang highlights how avoiding stranded capacity can pay dividends not only for a data center’s energy bill, but for a facility's sustainability credentials as well:

“Stranded capacity has cascading effects on raw material, operational efficiency, and also capacity which is not being used in other places for the actual workload. This is where we want to make sure that we are responsible for resources. When we do all those things, the results will be speed to market, optimized costs, and high efficiency that will lower your operating costs.”

Luckily, Schneider can help, as he goes on to explain:

“It is happening everywhere. Over the last year, AI has put a lot of strain on the community, with resources constrained as sites are looking at redesigning to accept AI workloads. That means we want them to have all the tools possible to ensure that it is effective right from the first draft of the design.”

This process is simplified by offering a consistent solution such as prefabricated modules: “How do we make sure we align and optimize along the way? This is more integrated delivery, which includes the design and manufacturing up front, not after, to ensure that we achieve just that.”

Key drivers for the AI modular solution include being able to scale up and down as necessary, whilst working to a set of reference designs that bring much-needed consistency but, as he points out, don’t constrict customization:

“Standard designs don't mean rigid designs. Late differentiation on gear can still be available, but we want to make sure the building blocks are standard and we allow localization or late differentiation in the designs, to a certain extent, while still maintaining all the benefits of scalability, predictability, and cost optimization,” says Hoang.

The overarching message is that modular solutions are not just a deployment method but an end-to-end strategy, one that requires forward planning and partnering with people who hold the skills and experience to make your AI deployment a success, for today, but also for tomorrow.

For more information on Schneider's modular solutions click here and watch the full DCD>Talk here.