Artificial intelligence is a tempting proposition, offering a vision of algorithms trawling through petabytes of data, making connections and assisting their human masters in their decision-making, perhaps even making some decisions on their own, without any oversight. It sounds like a perfect tool to help manage complex data center systems, since modern facilities are chock-full of probes and sensors, and already produce too much information to be processed by people. To put it simply, we could really use some help.

One of the companies working in this direction is American software vendor Nlyte, which is collaborating with an unnamed partner to integrate AI capabilities into its data center infrastructure management (DCIM) products in either 2018 or 2019. Doug Sabella, who has been CEO of Nlyte since the middle of 2012, thinks this will considerably expand the functionality of DCIM software into areas like workload management and cost analysis. Data centers of the future will require both, but there are serious technical hurdles to overcome.

This feature appeared in the April/May issue of DCD Magazine.

Beyond prevention


“The simple things are around preventive maintenance,” he told DCD. “But moving beyond predictive things, you’re really getting into workloads, and managing workloads. Think about it in terms of application performance management: today, you select where you’re going to place a workload based on a finite set of data. Do I put it in the public cloud, or in my private cloud? What are the attributes that help determine the location and infrastructure?

“There’s a whole set of critical information that’s not included in that determination, but from an AI standpoint, you can feed it in to actually reduce and optimize your workloads, and lower the risk of workload failure, because of where you’re putting things. There’s a whole set of AI play here that we see and our partner sees - that we’re working with on this - that is going to have a big impact.”
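Nlyte has not published what such a placement model would look like, but the idea Sabella describes - scoring candidate locations on facility attributes as well as cost and latency - can be sketched roughly. The attribute names, figures and weights below are entirely hypothetical, invented only to illustrate how facility-level data such as power headroom and cooling margin could join the usual inputs:

```python
# Hypothetical illustration only: the candidates, attributes and weights
# are invented, not taken from Nlyte's (unpublished) model.
CANDIDATES = {
    "public-cloud": {"cost_per_hr": 0.45, "latency_ms": 40, "power_headroom": 1.00, "cooling_margin": 1.0},
    "private-dc-a": {"cost_per_hr": 0.30, "latency_ms": 5,  "power_headroom": 0.15, "cooling_margin": 0.6},
    "private-dc-b": {"cost_per_hr": 0.32, "latency_ms": 8,  "power_headroom": 0.55, "cooling_margin": 0.8},
}

# Negative weights penalise cost and latency; positive weights reward
# facility headroom, which lowers the risk of workload failure.
WEIGHTS = {"cost_per_hr": -1.0, "latency_ms": -0.02, "power_headroom": 0.5, "cooling_margin": 0.3}

def score(attrs):
    """Weighted sum over a candidate's attributes; higher is better."""
    return sum(WEIGHTS[k] * v for k, v in attrs.items())

def best_placement(candidates):
    """Return the name of the highest-scoring candidate location."""
    return max(candidates, key=lambda name: score(candidates[name]))

print(best_placement(CANDIDATES))
```

With these made-up numbers, the on-premise facility with spare power and cooling capacity wins out over both the public cloud and the nearly-full private site - the kind of trade-off a finite set of IT-only data would miss.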

Other potential applications of AI in DCIM include identification of critical alerts, continuous optimization of various types of cooling equipment, and automated management of simple edge computing environments. To make this a reality, DCIM will require additional features that were historically associated with big data analytics, rather than data center software.

“We produce more and more data, but we have to become more intelligent about how we manage it,” Sabella said. “Think in terms of high velocity, real-time data streaming, think [Apache] Kafka – those are the kind of things we will be introducing, so people can stream data into a Hadoop environment, for example, and create the analytics they want.”
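What streaming facility telemetry into a pipeline like that might look like can be sketched in a few lines. The topic name, record fields and rack identifiers below are assumptions for illustration, not Nlyte's schema; the sketch builds the message as it might be published, and only the commented-out lines touch an actual broker:

```python
import json
import time

def telemetry_record(rack_id, sensor, value, unit):
    """Build a message as it might be published to a hypothetical
    Kafka topic such as "dc-telemetry" (names are assumptions)."""
    return {
        # Keying by rack keeps each rack's readings in partition order.
        "key": rack_id,
        "value": json.dumps({
            "sensor": sensor,
            "reading": value,
            "unit": unit,
            "ts": time.time(),
        }),
    }

msg = telemetry_record("rack-42", "inlet-temp", 24.6, "C")
print(msg["key"], msg["value"])

# With a broker available, kafka-python could then publish it, e.g.:
#   from kafka import KafkaProducer
#   producer = KafkaProducer(bootstrap_servers="localhost:9092")
#   producer.send("dc-telemetry",
#                 key=msg["key"].encode(),
#                 value=msg["value"].encode())
```

From there, a consumer on the Hadoop side would deserialize the same JSON and land it for batch analytics - the high-velocity path Sabella describes.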

In many ways, AI is a natural extension of what the DCIM vendors have been trying to achieve since the very beginning - to connect physical properties of the data center environment to the performance of the workload, and help eliminate the divide between facilities teams, responsible for things like space, power and cooling, and IT teams, responsible for servers and applications.

Sabella suggested this divide is to blame for the fact that it is currently easier to establish the cost of a workload hosted in a public cloud environment than a workload in a large in-house enterprise data center: “Because of the wall between facilities and IT, they have not invested the money or the time to actually understand [their infrastructure].”

“With DCIM deployment, there are a lot of companies that have CFOs and CIOs saying ‘we don’t need that, because everything is going to the cloud.’ Guys, you spent billions of dollars in your data centers, and you don’t even know what you have.

“The awakening that we’ve seen and that we’ve really been very successful on, is all these large enterprises realizing that they need to start understanding what’s going on in their data centers if they’re truly going to build a hybrid cloud.

“Without the information we provide, you can’t actually do the analytics to understand what the cost of the workload is.”
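A back-of-the-envelope version of the analytics Sabella alludes to - attributing facility cost to a workload from its measured power draw - could look like the sketch below. All figures are invented, and a real model would also fold in space, amortized capital and staffing; this covers only the electricity component:

```python
# Hypothetical cost attribution: all numbers are invented for illustration.
def workload_cost_per_month(server_watts, pue, tariff_per_kwh, hours=730):
    """Monthly electricity cost of a workload, grossed up by PUE to
    include the facility overhead (cooling, power distribution)."""
    facility_watts = server_watts * pue
    kwh = facility_watts / 1000 * hours  # ~730 hours in a month
    return kwh * tariff_per_kwh

cost = workload_cost_per_month(server_watts=350, pue=1.6, tariff_per_kwh=0.12)
print(f"${cost:.2f}/month")
```

Without metered power per rack and a facility-level PUE - exactly the data locked behind the facilities/IT wall - none of these inputs exist, which is why the public cloud's line-item bill is currently easier to come by.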

From a broader perspective, DCIM is changing the power dynamics within data centers, and that’s not to everyone’s liking. “It’s not a peaceful process,” Sabella told DCD, adding that embracing new tools like DCIM and automation through AI might require a generational change. “At least in the US, a lot of the traditional facilities folk, they are baby boomers, so they are actually moving into retirement. You have young and smart facilities people who don’t think of themselves as facilities people, they are engineers. This will be driven by millennials,” he said of the technology’s adoption.

In terms of business goals, the next challenge for Nlyte is to bring its software to the masses. Historically, the company had focused on very large customers – but Sabella promised a renewed effort to target SMBs, especially through DCIM as-a-service, which Nlyte launched way back in 2014. He said he hoped to grow this part of Nlyte to eventually be responsible for half of all revenue.

“At some level, it’s as challenging to close a sale of 1,000+ racks as 100+ racks - it takes as much time,” he said. “But, by the same token, we have a great play for small business - a lot of our SaaS customers today are small businesses.”