The deployment of AI in data centers will be an evolution rather than a revolution. It is coming, but we can’t be sure at this point how fast, and how far, it will penetrate the sector.

The business case is clear, but will itself evolve. Very large and hyperscale facilities are already achieving major savings in power consumption and operational efficiency. As AI matures, the technology options and pricing levels will develop, and the benefits will start to be seen in smaller multi-tenant data centers (MTDCs) and enterprise facilities.

A more detailed picture

There is more detail to this picture. Right now, AI fits well with the hyperscale operating model, but recent DCD research suggests that not all such facilities have yet introduced it. Penetration is lower among MTDCs and lower still among enterprise facilities.

Some analysts make a case that AI can be deployed at a reasonable cost in older facilities, but there is a question here: will the improved operation of these dated, legacy environments simply throw their limited performance into sharper relief? Ultimately, the industry may reach a stage where the older and smaller facilities that can't viably deploy AI may not be viable to operate at all.

Some have predicted that AI will replace or supplant human skills, and even drive the industry towards 'lights out' facilities without human operators. In fact, in the past few years there has already been a trend for staff to move from working inside the data center to working across data infrastructure and interpreting business requirements. There are still major local shortages of skills in both these areas of activity.

Unless the data center industry changes the way it sources and upskills its staff, AI may simply keep the industry moving rather than representing a threat to employment.

Soft issues

Beyond the 'hard' issues of finance, equipment and staff, the deployment of AI will have large effects on critical 'soft' issues such as risk management, performance, data and privacy practices, quality standards, compliance, and accreditation.

Some of these factors have become far less clear-cut, partly because the sheer volume of data and analytics makes the tasks of overseeing the processes more complex. It seems possible that the creation of large amounts of data for AI systems will result in a need for AI systems to oversee other AI systems, and check them for compliance.

New cloud-based and AI-driven services such as data center management as a service (DMaaS) take huge amounts of data from data centers and apply the shared learnings to individual situations and customers. But this creates a new legal conundrum over who owns the data, particularly as the latest data legislation, such as GDPR in the European Union, shifts the ownership principle from the company or individual harvesting or collecting the information to the company or individual providing it. The major case that prevented Facebook from transferring data held on EU systems out of the EU indicates the extent to which the balance has changed.

AI to manage AI

The degree of self-learning and automation associated with AI may make visibility more difficult. Deployments of AI and deep learning systems are most successful when the output is defined as part of the deployment process rather than after it.

The most significant long-term trend in AI adoption in data centers will be away from single task/purpose applications towards deeper learning and overall management systems. In some ways, this is not actually a new situation: already data center operators are frustrated by the need to deal with solutions and systems from multiple suppliers, and this issue will continue as those suppliers embed AI into their products or systems.

Overcoming the issue of integrating multiple AI systems will require an over-arching AI system to which all the others are accountable. This will enable 'task-driven' AI in the data center to evolve towards more 'general' AI applications.

One step towards this could be the latest version of Huawei's PostgreSQL-based GaussDB, which introduces the idea of an AI-native database. It adds a host of AI-based capabilities such as the ability to self-tune, automatic fault diagnosis, and some self-healing capabilities.

AI improves the performance of GaussDB, with tests indicating that an AI-tuned GaussDB configuration can perform up to 50 percent better than a manually tuned configuration, or one produced by popular machine learning-based tools such as OtterTune, for both OLAP and OLTP databases. GaussDB supports multiple deployment scenarios, including deployment on a standalone server, as part of a private cloud, or on Huawei's public cloud. On Huawei Cloud, GaussDB is used to power various data warehousing services for its cloud customers.
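To make the idea of self-tuning concrete, the sketch below shows the general pattern behind machine learning-driven configuration tuning: propose candidate settings for a handful of database 'knobs', score each candidate against a benchmark, and keep the best performer. It is a minimal illustration only; the knob names and the run_benchmark() scoring function are hypothetical stand-ins and do not describe GaussDB's or OtterTune's actual algorithms, which rely on far more sophisticated learned models than the random search used here.

```python
# Minimal sketch of configuration tuning: search over database "knobs",
# score each candidate, and keep the best configuration found.
# The knobs and run_benchmark() are hypothetical stand-ins for illustration.
import random

KNOB_RANGES = {
    "shared_buffers_mb": (128, 8192),   # buffer pool size
    "work_mem_mb": (4, 512),            # per-operation sort/hash memory
    "max_parallel_workers": (1, 16),    # parallel query workers
}

def sample_config() -> dict:
    """Draw a random configuration from the allowed knob ranges."""
    return {k: random.randint(lo, hi) for k, (lo, hi) in KNOB_RANGES.items()}

def run_benchmark(config: dict) -> float:
    """Hypothetical stand-in for running an OLTP/OLAP benchmark and
    returning a throughput score; a real tuner would measure the database."""
    return (
        config["shared_buffers_mb"] * 0.01
        + config["work_mem_mb"] * 0.05
        + config["max_parallel_workers"] * 2.0
    )

def tune(iterations: int = 50) -> tuple[dict, float]:
    """Keep the best-scoring configuration seen across the trials."""
    best_config, best_score = None, float("-inf")
    for _ in range(iterations):
        config = sample_config()
        score = run_benchmark(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

if __name__ == "__main__":
    config, score = tune()
    print(f"best configuration: {config} (score {score:.1f})")
```

In practice, learned tuners replace the random search with models that predict which knob settings will perform well for a given workload, which is what allows them to outperform manually tuned configurations.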

Future developments in AI will include better natural language processing to support ‘conversational’ AI, and advances in ‘self-healing’ data centers. Data throughput within the data center will need to deal with the added requirements of the AI system and this will require increased bandwidth and speed specifications.

AI at the Edge

Any future deployment of AI and machine learning will need to accommodate the major emerging trend towards more distributed and 'Edge' computing. This model requires processing capacity close to where the data is collected and a scalable source of shared information and learnings at the 'core'.

Edge systems will need to decide what needs to be transferred to the core and what can be discarded. This curation process will involve such a vast and changing set of data that only a self-learning, self-programming system will be able to do the job; both requirements can realistically only be fulfilled by AI.
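As a rough illustration of this curation idea, the sketch below keeps a short history of recent sensor readings at the Edge and forwards only statistically unusual values to the core, discarding the rest. The window size, z-score threshold, and forward_to_core() stub are assumptions made for the example; a production Edge system would use learned models rather than a fixed statistical rule.

```python
# Minimal sketch of edge-side data curation: model "normal" from recent
# readings and forward only unusual values to the core, discarding the rest.
from collections import deque
from statistics import mean, stdev

WINDOW = 100        # recent readings used to model "normal"
THRESHOLD = 3.0     # forward readings more than 3 standard deviations out

recent = deque(maxlen=WINDOW)

def forward_to_core(reading: float) -> None:
    """Hypothetical uplink to the core data platform."""
    print(f"forwarding anomalous reading: {reading}")

def curate(reading: float) -> None:
    """Forward a reading only if it looks anomalous against recent history."""
    if len(recent) >= 2:
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(reading - mu) / sigma > THRESHOLD:
            forward_to_core(reading)
    recent.append(reading)

if __name__ == "__main__":
    for value in [20.1, 20.3, 19.9, 20.0, 20.2, 35.7, 20.1]:
        curate(value)
```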

The roll-out of Edge computing will add further urgency to the deployment of AI, since Edge deals with the complexity of data collection, analytics and intelligent decision-making - jobs which AI is taking on within data centers and across networks.

The volume and complexity of data that Edge will generate means that legacy statistical systems will not be capable of the analytics required. A system that can develop an adaptable learning capacity will be the only practical solution.

Maintenance

Finally, AI networks can also help improve equipment maintenance.

Many experienced data center engineers will tell stories of how they could tell that a piece of equipment was faulty just by the way it sounded or smelled. What if a trained AI network could predict those failures long before they were detectable by engineers?

Scheduling maintenance according to manufacturer guidelines is effective but costly. Major data center operators are beginning to apply deep learning networks today to monitor different types of machines.
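As a simplified illustration of the principle, the sketch below scores new acoustic and vibration readings against a baseline of 'healthy' measurements and recommends maintenance when the deviation grows too large. It is a basic statistical stand-in for the deep learning models described above; the feature names, sample readings, and threshold are illustrative assumptions rather than values from any real deployment.

```python
# Minimal sketch of condition-based maintenance scoring: compare new sensor
# features against a baseline of "healthy" measurements and flag the machine
# when the combined deviation exceeds a threshold.
from statistics import mean, stdev

# Hypothetical healthy-state measurements: (fan vibration mm/s, bearing noise dB)
HEALTHY = [(1.1, 52.0), (1.0, 51.5), (1.2, 52.4), (1.1, 51.8), (0.9, 52.1)]
THRESHOLD = 4.0  # combined z-score above which maintenance is recommended

def anomaly_score(sample: tuple[float, float]) -> float:
    """Sum of per-feature z-scores against the healthy baseline."""
    score = 0.0
    for i, value in enumerate(sample):
        feature = [h[i] for h in HEALTHY]
        mu, sigma = mean(feature), stdev(feature)
        score += abs(value - mu) / sigma if sigma > 0 else 0.0
    return score

def needs_maintenance(sample: tuple[float, float]) -> bool:
    """Recommend maintenance when the deviation from healthy is large."""
    return anomaly_score(sample) > THRESHOLD

if __name__ == "__main__":
    for reading in [(1.1, 52.0), (2.3, 58.5)]:
        verdict = "schedule maintenance" if needs_maintenance(reading) else "ok"
        print(reading, "->", verdict)
```

Scoring condition rather than waiting for a fixed service interval is what allows maintenance to be scheduled only when the data suggests it is needed.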

How fast is it growing?

Although AI is clearly growing, the rate of growth of any new market is difficult to chart. In December 2018, the OECD reported considerable growth in private investment from 2017 onwards, and Gartner's 2019 CIO Survey indicates a growth rate of 270 percent in implementation since 2015, and 37 percent from 2018 to 2019. Gartner estimates that the enterprise AI market will be worth US$6.14 billion by 2022.

Almost 20 percent of the 600 data center owners and operators in a 2019 DCD survey indicated that they are currently deploying AI in their data centers, and a similar proportion have this as a future priority.
