It’s hard to escape the endless talk of artificial intelligence (AI) and machine learning (ML) right now. While many buzz phrases come and go, there’s little doubt that AI puts us on the cusp of one of the most significant changes that society has ever faced – it will change the way we work and play alike, and most importantly, it’s here to stay.
What that means for the data center is a one-two punch – it vastly increases the capacity the industry will need to offer and introduces the challenge of supporting new AI-capable infrastructure in older facilities.
Align has been working in the industry for over 36 years, offering complete lifecycle services from the data center to the desktop. Its data center services include strategy, design/build, migration, decommissioning, and refresh, alongside the ever-present need for expansion and consolidation of existing facilities.
Who better, then, to discuss the oncoming advances in AI in the data center industry? We sat down with some of Align’s key stakeholders – Tom Weber, managing director for Data Center Solutions; Rodney Willis, managing director for business development, Data Center Solutions; Simon Eventov, assistant director for Data Center Design & Build; and Tyler Miller, regional director of Data Center Sales, Texas.
The first stop in our exploration is an analysis by Tom Weber of why the AI revolution, alongside other high-compute applications, is not just leading to exponential growth in data center construction, but also the consolidation of existing facilities to increase capacity.
“In three to five years, we've gone from cabinets running at five, eight, or 10kW to commonly seeing 40kW and beyond, and it is driven by the gear and the more powerful applications running on it. This may be elementary, but it's denser loads in smaller spaces, which creates issues. In the old days, we would have a megawatt covering 10,000 square feet; in the past few years, it's six megawatts and maybe 15,000 square feet.”
“Now we're seeing companies running a 10,000 square foot space capable of supporting a megawatt, with not even a tenth of the space being used. So, we have an open hockey rink with 20 cabinets in a checkerboard configuration, all running 50kW.”
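The density shift Weber describes can be sanity-checked with a little arithmetic. The sketch below uses only the figures quoted above; the watts-per-square-foot framing is an illustrative simplification, not Align's methodology:

```python
# Rough power-density comparison using the figures Weber cites.
# All inputs are taken from the interview quotes; the framing is illustrative.

def density_w_per_sqft(total_kw: float, sqft: float) -> float:
    """Average power density in watts per square foot."""
    return total_kw * 1000 / sqft

old = density_w_per_sqft(total_kw=1_000, sqft=10_000)     # 1 MW over 10,000 sq ft
recent = density_w_per_sqft(total_kw=6_000, sqft=15_000)  # 6 MW over 15,000 sq ft

# The "hockey rink": 20 cabinets at 50 kW each in a 10,000 sq ft room,
# with the load concentrated in well under a tenth of the floor.
ai_total_kw = 20 * 50  # 1,000 kW, i.e. a full megawatt
ai_room = density_w_per_sqft(ai_total_kw, 10_000)

print(f"old build:    {old:.0f} W/sq ft")     # 100 W/sq ft
print(f"recent build: {recent:.0f} W/sq ft")  # 400 W/sq ft
print(f"AI room avg:  {ai_room:.0f} W/sq ft, concentrated in just 20 cabinets")
```

The point the numbers make: the room-level average of the AI build looks like the old build, but the same megawatt now lands on a handful of cabinets rather than being spread across the floor, which is exactly the cooling problem Weber describes next.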
“One of the big challenges is fitting all that power into a cabinet and how you get cooling to that cabinet. You're not just blowing cold air in the front anymore. The biggest thing we see from a physical layer is fitting more into a smaller space and there are pros and cons to that.”
“Getting direct cooling at the cabinet level is a huge challenge. Tapping into existing cooling infrastructure often requires shutting down a portion of that infrastructure, which typically supports other critical customers that cannot afford any downtime. This is a real problem when trying to convert a portion of an existing facility to support these higher densities. Today, new builds and retrofits have started including cooling designs to better incorporate standard cooling technologies with the ability to provide direct cooling to areas that require it.”
Simon Eventov added, "There will be specific segments, particularly in high-compute tasks, that will remain essential. A prime illustration of this is autonomous vehicles, where proximity is crucial. There will also be a vast volume of data, continuous learning, and outcomes that can be distributed across diverse locations.”
Edge of Reason?
So, does that mean that the future lies in Edge? Weber thinks not.
“It will always go back to a central data center. Take Ashburn, Virginia. When I first went there, 12 years ago, I was on a cow farm. Now you drive down the road, and there are data centers stacked up next to each other like apartment buildings in Brooklyn.
"Because of this dynamic, in many of our data center regional clusters, we have seen many areas simply running out of electrical capacity to support further growth. I have this vision of open land, in the middle of the country where there are solar panels and acres of wind farms, and that drives the data center. Otherwise, how do you get self-sustaining within the data center, outside of being in Iceland, and using geothermal?”
Rodney Willis adds: “Or do we adopt micro-nuclear facilities faster? It’s already being talked about. If it can power an aircraft carrier or submarine, it can power a small data center. But that’s a long-term look, there’s the regulatory issues around that, and then societal issues. Will society accept small nuclear reactors close to their environments?”
Powering the AI revolution is an important consideration – but is AI destined to go the way of 3D TVs and Google Glass? It certainly doesn’t look that way for what is shaping up to be a revolution, rather than a fad.
“Why would you not try to utilize machine learning and artificial intelligence to help make decisions?” asks Willis. “The real key here is interpreting that data: letting the technology learn about data interpretation with some manual help, but keeping manual processes as a backup. AI just makes it more efficient.”
Keeping the lights on is one thing – but anyone who has seen the 1983 film WarGames might question the wisdom of putting all our technology in the hands of AI, which is why, as Willis explains, we can’t afford to press “immature” neural networks into service.
“There is a reason we don't send a 10-year-old out to the workforce. He’s still got learning and training to do. We're basically parents of AI and machine learning. We're educating and teaching and monitoring and correcting where we need to go through that process before we feel comfortable. We have a responsibility for AI machine learning to make sure it goes down the path and delivers the outcome we want. Otherwise, what happens when that 10-year-old becomes a 30-year-old and locks you out of the house?”
Dark side of the data center
Miscreant robot security aside, it certainly seems unlikely that we’re approaching an age of ‘dark’ data centers, operated purely by machines – as Weber puts it.
“Will AI go in and teach robots? A lot of the things that need to be done in a data center can be done by a machine. It takes a lot of knowledge to do it, but it's something that could be done by a robot – but robots need human supervisors, mechanics, and programmers.”
People will remain integral to data centers, although AI facilities present a unique set of difficulties: they depend heavily on GPUs and require constant cooling, producing significant side effects that must be mitigated.
“Have you been into a data center running pure AI?” asks Willis. “We completed one a few weeks ago, and the background noise was over 100dB once the equipment was up and running. The sound pollution coming into this environment is incredible.”
“Absolutely!” adds Weber. “There is a big market for headphones with microphones that have channels, just so you can talk to project teams. It is mind-boggling how loud these GPUs, all crammed into small spaces, can be. Working for long periods within these environments is extremely stressful. I think headcount will go down due to the operational predictability provided by AI, which will allow fewer and shorter periods inside these spaces.”
Overall, then, while AI is likely to make humans more efficient, it’s not likely to replace them anytime soon. For a start, there’s never been a better time to get into the construction industry, where data centers are concerned – and that’s something that will remain very much human-led.
“From a building standpoint, AI will require more data center infrastructure, but AI is not going to put up walls, and floors and install the critical infrastructure,” says Weber. “They may tell you the best way to do it, but they're not going to do it for you. So, from a construction standpoint, AI doesn't affect that, it could even make more jobs available. Where I see the labor force changing is in the clerical aspect, like a legal assistant sorting through previous court papers and historical documents – that type of role is going to be history, it’s already starting.”
Is it time to embrace AI?
Before your company rushes out to embrace AI, consider that there are many challenges to retrofitting your existing facility with AI-capable technologies, Miller explains:
“If you look at industry verticals and the AI marketplace, there's a limited number of companies that are going to be deploying this type of technology – utilizing the power that it takes or putting in the infrastructure and the capital investment.
“How do you future-proof? That's a challenge, unless you're building a greenfield and planning for a 20-plus year facility. Right now, most companies are signing three-to-five-year leases, maybe 10 at most, and then they're getting out and moving their data center to somebody who has the capability to deploy these higher-density solutions.”
In other words, AI facilities are likely to become a sub-class of data centers, with more conventional workloads housed in more conventional facilities. That means the AI switch is certainly not right for everyone, at least not right now.
“For the smaller facility, going all-in on AI is likely to lead to a high turnover of clients. A way to future-proof your building is to leave vast open spaces for newer mechanical and electrical gear and kit as things change and get more efficient, but then you lose that price per kW war with the guy next door who is lean and mean and doesn't leave a square inch unbuilt,” says Weber.
All of this means that for most small and medium-sized operations, it’s likely to be business as usual for now. The challenge isn’t how to embrace AI; it’s how to make the most of the space you have to cope with ever-growing workloads. Weber tells us:
“I don't think every enterprise and every data center is going to go to 50kW cabinets; it's going to be the select major players that are adopting AI. If one of the smaller players in the AI world gets really good at it, somebody's going to buy them and they're going to become part of the major players, just like every other technology we've seen, where it gets swallowed up by the companies with the deep pockets, so it becomes consolidated that way.
“We see that being a big problem when it comes to densifying existing rooms or data centers. We have to shut down to move, to deliver more power and cooling to this condensed area – whereas if it was built on day one, it wouldn't be a problem – but even that has its pros and cons, as making such considerations will increase the cost of your initial build.”
The message is that these are pioneer days for AI – and while businesses should feel empowered to embrace AI infrastructure, you want to be sure that you are doing so with partners that can aid in making informed decisions. Technology is moving fast, and you don’t want to get left behind, as Eventov puts it:
“The car that you're going to be driving five years from now is probably going to look completely different from the car that you're driving today. You’ve still got to make a regular car with a nice GPS for the next five years before you get to that self-driving electric vehicle. I think it's a similar thing for IT. Businesses are going to have to keep running and changing and making their existing infrastructure work while we figure out the next frontier.”
Revisiting your data center strategy or have questions regarding your critical infrastructure options? Learn more about Align’s Data Center Solutions at www.align.com/solutions/data-center-solutions.
For over three decades, the world's leading enterprises have relied on the unparalleled expertise, proven methodology, and innovative tools from Align. Our teams have helped small regional businesses, leading financial firms, and global SaaS providers successfully inventory, design and build, and migrate their data centers to improve performance, reliability, productivity, reduce risk, and save money.
You can also find full profiles of the interviewees on Align’s website.