Data center operators experienced tremendous growth in 2023. As the world re-emerged from the pandemic, construction and other projects that had been on hold since 2019 resumed. At the same time, technological innovation never stopped, with new capabilities like generative AI demanding more power than ever.

But even if 2023 has felt fever-pitched, it’s nothing compared to the whirlwind coming for data centers in 2024. Here’s what we can expect in 2024, and how organizations can best prepare themselves.

The AI scramble

Enterprise organizations have recognized the need for more resources to bring AI capabilities to fruition. Further data center expansion has been driven by AI-focused business requirements, ranging from software companies looking to rapidly introduce or expand AI functionality in their product lines to hyperscale organizations looking to bring more AI offerings to market in their cloud portfolios. However, technology is ahead of construction – and data center managers are finding themselves retrofitting data centers before they have even been commissioned. 2024 will be a sprint to keep up with technological demand, including rapid – and rapidly shifting – data center construction as well as rethinking and reorganizing current infrastructure.

Powering up in real-time

We’ll continue to see conversations across the industry and media about energy consumption and power generation in 2024. The processing power required by generative AI is phenomenal, especially for hyperscale customers – and it’s only growing. From 2022 to 2023, data center power consumption doubled. This requires complete redesigns of power and cooling in data centers currently being commissioned and built, creating major upheaval in these projects. With supporting technology lagging behind demand, we shouldn’t expect much relief in 2024.

Modularity and agility

Hyperscalers are making do until they can get more power to the site by deploying half as many cabinets per row – a huge sacrifice in real estate, but one that still delivers the same amount of processing power. We’ll continue to see data centers shrink to a degree in 2024. The silver lining is more agility, with local jurisdictions more likely to approve data centers with smaller footprints.
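The arithmetic behind that trade-off is straightforward. Here is a minimal sketch in Python; the row power budget and per-rack figures below are purely illustrative assumptions, not vendor or site data:

# Back-of-the-envelope view of the "half as many cabinets per row" trade-off.
# All figures are illustrative assumptions, not measurements from any site.
row_power_budget_kw = 300        # assumed fixed power feed available per row
legacy_rack_kw = 15              # assumed draw of a traditional rack
ai_rack_kw = 30                  # assumed draw of a dense AI/GPU rack

legacy_racks_per_row = row_power_budget_kw // legacy_rack_kw   # 20 racks
ai_racks_per_row = row_power_budget_kw // ai_rack_kw           # 10 racks

print(f"Legacy racks per row: {legacy_racks_per_row}")
print(f"AI racks per row:     {ai_racks_per_row}")
# Half the cabinets draw the same row power – and because each dense rack
# packs roughly twice the compute, processing power per row stays level.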

In the pursuit of fast, inexpensive production, we’ll also see a growing focus on modularity in 2024. Rather than building power modules, cooling modules, or even data centers themselves onsite, we’ll see projects breaking these builds into bite-size pieces, constructed by partners and shipped to the site. While these pre-configured, plug-and-play models put pressure on manufacturers, they allow a data center to be built in a year rather than two to three – shortening both time to market and time to commissioning.

Getting data closer

While AI growth will continue in 2024, companies are still learning how best to leverage their data for training purposes, and how to weave AI into their future business models to improve their customers’ experiences. Enterprises and data center operators will need to get data closer to their AI compute farms while ensuring their training and validation data remains usable and up-to-date. This focus on data forces consideration of the entire data center stack, including network performance and tail latency, compute cluster expansion and parallelization, and solid-state storage to serve up data as quickly as possible.
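Tail latency is the metric that most often reveals whether data is truly close enough to the compute. Below is a minimal measurement sketch in Python; read_sample() is a hypothetical stand-in for whatever storage or network call actually feeds a training job in a given environment:

import random
import time
from statistics import quantiles

def read_sample():
    # Hypothetical placeholder: simulate a data read with occasional slow outliers.
    time.sleep(random.choice([0.002] * 95 + [0.050] * 5))

latencies_ms = []
for _ in range(1000):
    start = time.perf_counter()
    read_sample()
    latencies_ms.append((time.perf_counter() - start) * 1000)

cuts = quantiles(latencies_ms, n=100)   # 99 percentile cut points
p50, p99 = cuts[49], cuts[98]
print(f"p50 = {p50:.1f} ms, p99 = {p99:.1f} ms")
# A wide gap between p50 and p99 is the tail-latency problem: average reads
# look fine, but stragglers stall tightly parallelized training clusters.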

Keeping cool

Power and cooling are increasingly important components of any data center build, and that will continue into 2024. In-rack power requirements have grown exponentially, with power-hungry compute and AI platforms driving the need for cooling alternatives. Air-to-liquid or liquid-to-liquid heat exchangers, component-level cooling, and immersion systems are all considered reasonable options in highly dense computing environments. Understanding power budgets and calculating current and future compute requirements will provide a baseline framework to build on when evaluating cooling solutions.
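As a rough illustration of that baseline exercise, the sketch below estimates per-rack load and flags when air cooling is likely to run out of headroom. The per-node wattages and the 25 kW air-cooling ceiling are assumptions chosen for illustration; a real assessment would use measured figures:

def rack_power_kw(nodes_per_rack, watts_per_node):
    """Estimated steady-state IT load for one rack, in kW."""
    return nodes_per_rack * watts_per_node / 1000

AIR_COOLING_LIMIT_KW = 25   # assumed practical ceiling for conventional air cooling

for label, nodes, watts in [
    ("general-purpose rack", 40, 500),       # assumed ~20 kW
    ("dense GPU training rack", 8, 10_000),  # assumed ~80 kW
]:
    kw = rack_power_kw(nodes, watts)
    verdict = "air cooling may suffice" if kw <= AIR_COOLING_LIMIT_KW else "plan for liquid or immersion cooling"
    print(f"{label}: {kw:.0f} kW per rack -> {verdict}")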

Sustainability and efficiency

Sustainability is as much about waste as it is about efficiency. While more power is needed for newer servers, storage, and network infrastructure in the data center, using that power budget in optimal ways is just as important. Multi-tenant data center providers are already introducing renewable energy options and higher-density cooling platforms, and removing excess infrastructure such as generators. While this doesn’t reduce demand, it does enable more efficient distribution and implementation models that make it easier for hyperscale and enterprise customers to meet their compute demands in more locations worldwide.

Big winners

Given AI capabilities, power consumption, cooling, real estate, and more, enterprise or customer-owned data centers are becoming increasingly impractical to build. Hyperscalers are positioned to win out, simply because many smaller and multi-tenant data centers are dealing with decade-old technology and cooling and don’t have the deep pockets to adjust. We will likely see many of these smaller and multi-tenant operations absorbed by hyperscalers. Similarly, demand for the cloud will continue to grow in 2024.

With data centers, the future was yesterday

From liquid cooling to high-processing servers, the futuristic innovations we used to dream about are already happening today. As data center managers grapple with the heat loads and power requirements of new tech, they should also look five or more years down the line to prepare for what’s coming next.

Everything from cryptocurrency to electric cars is pushing the grid to the brink, and it might not be long before power regulations and restrictions force data centers to devise even more efficient processing methods and pursue new breakthroughs in power generation. Hold on tight: as 2024 approaches, the data center future is already roaring past.