The proliferation of AI and other high-density workloads is significantly reshaping the European data center market. This shift, driven largely by GPU-intensive computing, has prompted companies to reevaluate their designs, models, and growth projections to become more AI-ready.
Adam Levine, CCO at Data4 – a European designer, builder, and operator of data centers – discussed with DCD’s Alex Dickins how the company is staying ahead in the AI race.
Closing the latency gap
A persistent challenge in the industry has been the time lag between technological advancements and their physical implementation. Levine elaborates on this with an example from real estate:
“With real estate, you need to buy land, build power networks, create power generation, transmit power to sites, install network infrastructure, and get permits to eventually build the site. So, there’s always a lag between demand and supply in this area.”
Today, however, the industry is taking a more proactive approach, aiming to predict and intercept future growth:
“What we’re seeing now is a desire from our industry to intercept and predict future growth, and go big with gigawatt campuses and huge infrastructure, reminiscent of the Google campuses of the US,” Levine notes.
Speed bumps for colocation operators
Colocation (colo) operators face significant challenges in upgrading their infrastructure to support the diverse and demanding workloads of AI and GPU-driven applications, especially in multi-tenant environments. The strategic dilemma lies in choosing between serving traditional enterprise customers and scaling up for hyperscale operations. Levine highlights that as the industry evolves, maintaining both models becomes increasingly difficult, often forcing operators to focus on one:
“We’re focusing on larger-scale, gigawatt-level campuses that are truly the Nirvana of our industry. The really important thing to focus on is maintaining the quality of delivery, but scaling that up.”
The question remains whether the colo market is best positioned to benefit from the AI boom, especially given the increasing demand for GPUs and power, often secured through large power purchase agreements (PPAs). Levine hints that the success of AI in this market will likely depend on factors beyond just resource availability:
“The real challenge lies in making informed, strategic decisions about where and how to invest in AI infrastructure, as these investments are now substantial and critical to their future success.”
Levine suggests a need for a better understanding and categorization of workloads to ensure infrastructure is tailored to specific needs:
“One thing that we can do better as an industry is define and create a taxonomy of workloads to align infrastructure with specific requirements. This will inform the design, infrastructure, network capabilities, and required skill sets to ensure new developments are optimally designed and equipped for the tasks they will handle.”
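A workload taxonomy of the kind Levine describes could be sketched as a simple lookup from workload class to infrastructure profile. The classes, density figures, cooling choices, and network labels below are illustrative assumptions for the sake of the sketch, not Data4 specifications:

```python
from dataclasses import dataclass
from enum import Enum

class Cooling(Enum):
    AIR = "air"
    LIQUID = "direct-to-chip liquid"

@dataclass(frozen=True)
class WorkloadProfile:
    name: str
    rack_density_kw: float  # typical power draw per rack (assumed figures)
    cooling: Cooling
    network: str            # interconnect requirement

# Hypothetical taxonomy mapping workload classes to infrastructure needs
TAXONOMY = {
    "enterprise": WorkloadProfile("enterprise colocation", 8, Cooling.AIR,
                                  "standard Ethernet"),
    "cloud": WorkloadProfile("general-purpose cloud", 15, Cooling.AIR,
                             "high-bandwidth Ethernet"),
    "ai_inference": WorkloadProfile("AI inference", 40, Cooling.LIQUID,
                                    "high-bandwidth Ethernet"),
    "ai_training": WorkloadProfile("AI/GPU training", 80, Cooling.LIQUID,
                                   "low-latency fabric"),
}

def infrastructure_for(workload: str) -> WorkloadProfile:
    """Look up the infrastructure profile for a named workload class."""
    return TAXONOMY[workload]
```

Even a toy classification like this makes the design consequences explicit: an "AI-ready" build targeting 80 kW racks and liquid cooling is a materially different facility from one sized for 8 kW enterprise colocation.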
AI-ready data centers: Truth or lore?
There is a lot of speculation about the nature of future workloads in data centers being built today. Levine points out that while these data centers are often touted as “AI-ready,” there is little certainty about what workloads they will actually support in three to five years:
“Will it be cloud? AI? Machine learning? Or some application that hasn’t even been invented yet?” he asks.
Levine also advocates for increasing density and full utilization of capacity in data centers as a path to sustainability:
“We’re seeing a shift towards full life cycle analysis of data centers. We as a company do full life cycle analyses, including customer equipment and networking. The focus is on maximizing the efficiency and sustainability of everything we build.”
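The sustainability case for density and full utilization can be made concrete with a back-of-envelope cradle-to-grave calculation. The function and figures below are a hypothetical sketch (the embodied-carbon, PUE, and grid-intensity numbers are placeholders, not Data4 data): a site's embodied emissions are fixed, so spreading them over more delivered compute lowers the footprint per unit of useful work.

```python
def emissions_per_mwh(
    embodied_tco2e: float,      # construction + IT equipment, cradle-to-grave (assumed)
    it_capacity_mw: float,      # IT load at full utilization
    pue: float,                 # power usage effectiveness
    grid_tco2e_per_mwh: float,  # carbon intensity of the power supply
    years: float,               # operating lifetime
    utilization: float,         # fraction of capacity actually used
) -> float:
    """Lifecycle emissions per MWh of useful IT energy delivered."""
    useful_mwh = it_capacity_mw * utilization * years * 8760
    operational_tco2e = useful_mwh * pue * grid_tco2e_per_mwh
    return (embodied_tco2e + operational_tco2e) / useful_mwh

# Placeholder figures: a 10 MW site over 15 years on a 0.25 tCO2e/MWh grid
lightly_used = emissions_per_mwh(50_000, 10, 1.2, 0.25, 15, utilization=0.3)
well_used = emissions_per_mwh(50_000, 10, 1.2, 0.25, 15, utilization=0.9)
# Higher utilization amortizes the fixed embodied carbon over more delivered energy
```

Under these assumed numbers, operational emissions per MWh are unchanged, but the embodied share falls threefold at 90 percent utilization versus 30 percent, which is the arithmetic behind treating full utilization as a sustainability lever.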
Where we go, the fiber will follow
Traditionally, data centers followed fiber infrastructure for connectivity. However, Levine notes a significant shift in priorities:
“It feels like right now we’re at the cusp of a big shift to follow the power. Power is the critical path. So wherever the power is, that’s where we will go, and the fiber will follow.”
Additionally, there is an emphasis on bringing production closer to the data center to shorten the supply chain. Levine acknowledges the industry’s impact on local communities, emphasizing the need for a balanced approach that contributes positively to the areas in which they operate:
“One of the things that we’ve always focused on since day one is our ‘Data4Good’ program. One of the pillars of that is ‘Data4Communities’, and we give back to the communities, whether it’s building roads, or engaging with the local universities to create courses and training programs to ensure that we facilitate the skills of our potential future employees.”
Walk before you run with AI
Levine’s key message is a call for thoughtful and discerning use of AI, warning against frivolous applications that consume significant resources:
“Before running an AI model for trivial purposes, consider the huge energy, land, and resource consumption. AI can be a positive force if leveraged responsibly,” he concludes, citing Jim Lovelock’s work as a strong analysis of AI’s potential benefits for society and the world.
To find out more, you can watch the full DCD>broadcast with Alex Dickins and Adam Levine here.
More from Data4
- Sponsored: Revealing full data center environmental impacts thanks to life cycle analysis
  Analyzing the lifecycle of data centers “from cradle to grave” is crucial for pinpointing how the industry can minimize its environmental impacts
- Sponsored: How innovative power sourcing can propel data centers toward sustainability
  A proactive approach to securing sustainable power sources can position the industry as a leader in energy efficiency without compromising affordability
- Sponsored: Striking a balance for sustainable growth in the AI-driven data center
  As AI causes demand for space and power to ramp up, new data center builds need careful planning to prepare for the surge