For years, artificial intelligence was the future, and it looked like that future would never arrive. But when OpenAI released GPT-4 earlier this year, it seemed as if AI had finally reached the mainstream.

– Gerd Altmann, Pixabay

On the one hand, ChatGPT’s ability to provide intelligent-looking answers to a wide range of questions, at the very least, elevated it well above the intelligence of the average Facebook or Twitter user.

On the other hand, more cynical commentators have suggested that it is not nearly as intelligent as it appears. Take, for example, the lawyer who used ChatGPT to draft a defence, only for it to turn out to be built on completely made-up case law.

Whatever the shortcomings of current-generation AI, there has nevertheless been an explosion of AI-based projects, chip designers like Nvidia have been scaling up their plans, and data center operators are already enjoying the first signs of an AI-led boom.

DCD’s Dan Loosemore spoke to Schneider’s Elliott Turek about the AI paradigm shift about to hit the data center – and society – and how data center operators ought to prepare.

“I’ve been talking to a lot of companies and they’ve already been building up capacity, from small deployments to large data centers. So they’ve now got the capacity to start doing these projects. And it’s going to change how they approach capital expenditure; maybe changing rack configurations or changing their cooling set-up,” Turek told DCD.

A bigger issue right now, perhaps, is power: the simple question of whether grid capacity is available to feed the CPUs and GPUs already being installed to support AI applications.

Indeed, in many popular data center locations around the world, expansion has been hampered by a lack of available power. Now, grid operators will have to decide between existing operators that want to expand their AI-friendly services and new operators who may complain that they are being denied the opportunity to set up new facilities and compete.

This may affect the enterprise data center most of all, suggests Turek.

“Enterprise customers may have to leverage other third parties and colocation providers to help them achieve their AI goals initially, until you reach a point where it becomes more balanced. Yet I would say that over the next couple of years, AI deployments are probably going to be rolled out in centralized data centers, or with companies that have a data center practice that can deploy at the Edge, really fast, rather than enterprises doing it themselves.

“But I do think that, maybe over a five, six or seven year period of time, you're going to see the power quality change the amount that's coming from the grid and going to these enterprise customers, and they're going to be able to take advantage of better setups and better planning to then deploy these types of instances on their distributed network,” says Turek.

To hear the discussion in full, check out the DCD>Talks interview on DCD’s Edge Computing Channel.