At some point in the not-too-distant future, artificial intelligence (AI) will drive our cars, write our programming code, and optimize how we do business. Data centers, too, will be unable to escape this trend. Thanks to machine learning technology, companies and data center operators will be able to coordinate and manage increasingly complex machines, infrastructures, and data more effectively than ever before, even as their numbers and data volumes continue to rise. Are completely autonomous, self-repairing data centers on the horizon?

The data center is the backbone of the digital revolution. Until now, it has largely remained out of the public eye, yet without server virtualization, cost-effective storage, and increasingly powerful hardware, digitalization and the new business models associated with it would be nothing more than a pipe dream.

The time has come to pair automation concepts – such as those used in Industry 4.0 and the Internet of Things (IoT) – with artificial intelligence in data centers. The task of analyzing the enormous volumes of data generated by our ever-more complex systems and structures plays to the unique strengths of artificial intelligence.

BIM and process automation

The challenge for data centers now is to catch up with the trend towards robotic process automation, which restructures and optimizes traditional processes using AI algorithms.

These algorithms assume responsibility for entire processes, requiring humans to intervene only to make decisions in exceptional circumstances. Robotic process automation relies on machine learning and deep learning technology. We must now extend the thinking and methodologies that we currently apply to factory automation and digital twin technology to data centers.

In Industry 4.0, production revolves around the principle of each element being visible in digital form, enabling it to be controlled from a central platform as part of the overall process. The “digital factory” – the digital planning of machines and systems within the plant – is also a key part of this approach.

This is one of the parallels between the two fields: building information modeling (BIM) is currently a subject of much discussion in relation to data centers. BIM can be used to create virtual, walk-through 3D models of entire buildings and data centers, and the concept can be taken one step further to forge a link between the physical and the digital world.
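
As a thought experiment, such a digital twin can be as simple as a data structure that mirrors a physical asset's position in the BIM model together with its live sensor readings. The sketch below is purely illustrative; every class name and field in it is a hypothetical assumption, not an established BIM schema.

```python
from dataclasses import dataclass, field

@dataclass
class AssetTwin:
    """Hypothetical digital-twin record linking a physical data center
    asset to its digital representation (all names are illustrative)."""
    asset_id: str
    asset_type: str          # e.g. "rack", "CRAC unit", "PDU"
    location: tuple          # (x, y, z) position in the BIM model
    telemetry: dict = field(default_factory=dict)  # latest sensor readings

    def update(self, sensor_name: str, value: float) -> None:
        # Mirror a physical sensor reading into the digital model.
        self.telemetry[sensor_name] = value

# Usage: a rack's inlet temperature sensor feeds its twin in the 3D model.
rack = AssetTwin("rack-a-17", "rack", (12.0, 3.5, 0.0))
rack.update("inlet_temp_c", 24.6)
print(rack.telemetry)
```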

The self-repairing data center

In the IoT, every server, hard drive, storage unit, and rack becomes an “element”; the equipment is fitted with low-cost sensors so that every temperature change and vibration is recorded. If this telemetry is merged with log data, other sensor readings from across the system, and historical empirical values, then fed into self-learning deep neural networks, AI analyses will be able to provide the information needed to keep the system at 100 percent availability at a significantly reduced cost.
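
The article envisions self-learning deep neural networks; as a much simpler stand-in, the following sketch shows the same pattern with an isolation forest from scikit-learn: merge per-device features (temperature, vibration, log error rate), learn what “healthy” looks like, and flag outliers. All feature names and values are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Merged features per drive: temperature (C), vibration (g), log error rate.
# Synthetic "healthy" history stands in for real fleet telemetry.
healthy = rng.normal([35.0, 0.02, 0.1], [2.0, 0.005, 0.05], size=(500, 3))
model = IsolationForest(contamination=0.01, random_state=0).fit(healthy)

# A hot, vibrating drive with a rising log error rate scores as anomalous.
suspect = np.array([[48.0, 0.09, 1.4]])
print(model.predict(suspect))  # -1 flags a likely impending failure
```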

Following the principles of predictive maintenance, this concept allows the operator to manage maintenance requirements with greater flexibility and to predict system faults before they occur. The precisely structured environment of a data center is the perfect setting to deploy robotics. In the future, autonomous robots will be able to replace hardware before it fails, at any time of the day or night, and flexibly adjust the configuration of the system as needs change. With the help of intelligent cyber security analyses of network logs, usage data, and countless other system and log files – and particularly of combinations of these sources – it will become possible to identify threats and malfunctions at an earlier stage and trigger an automatic response.
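
Automated response can start far simpler than full AI. The sketch below illustrates the principle on combined log files: count suspicious events per source and trigger an action once a threshold is crossed. The log format, the threshold, and the isolate() function are all hypothetical placeholders.

```python
from collections import Counter

LOG_LINES = [
    "2024-05-01T12:00:01 auth fail src=10.0.0.5",
    "2024-05-01T12:00:02 auth fail src=10.0.0.5",
    "2024-05-01T12:00:03 auth ok   src=10.0.0.9",
    "2024-05-01T12:00:04 auth fail src=10.0.0.5",
]
THRESHOLD = 3  # illustrative; a real system would learn this from data

def isolate(src: str) -> None:
    # Placeholder for an automated response, e.g. pushing a firewall rule.
    print(f"isolating {src}")

# Count authentication failures per source address across the merged logs.
failures = Counter(
    line.split("src=")[1] for line in LOG_LINES if "auth fail" in line
)
for src, count in failures.items():
    if count >= THRESHOLD:
        isolate(src)
```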

An example from Google serves as further evidence that AI can take systems to a whole new level of resource efficiency: Using DeepMind's machine learning, Google reduced the energy used for cooling in parts of its data centers by 40 percent, cutting the overall power usage effectiveness (PUE) overhead by 15 percent.
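
DeepMind's actual system is not public, but the underlying pattern can be caricatured in a few lines: learn a model of energy use as a function of a control variable, then choose the setting the model predicts is cheapest. The data points and the quadratic model below are invented purely for illustration.

```python
import numpy as np

# Invented history: cooling setpoint (C) versus measured PUE.
setpoints = np.array([18, 20, 22, 24, 26, 28], dtype=float)
pue = np.array([1.31, 1.24, 1.19, 1.18, 1.22, 1.30])

# Fit a quadratic model of PUE as a function of the setpoint.
coeffs = np.polyfit(setpoints, pue, deg=2)

# Search the operating range for the setting with the lowest predicted PUE.
candidates = np.linspace(18, 28, 101)
predicted = np.polyval(coeffs, candidates)
best = candidates[np.argmin(predicted)]
print(f"model suggests a setpoint near {best:.1f} C")
```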

AI harbors immense potential

Up to now, there has been some skepticism about what AI can achieve, as well as about the risks connected to it. But the automotive industry is already proving the potential of this technology. Kitted out with sensors, radar and lidar technology, image recognition systems, graphics chips (GPUs), and computing power, self-driving cars – which were the stuff of fiction not long ago – are now taking to the roads.

In car production, too, tasks that were previously completed exclusively by human hands are now being transferred to robots. Some car manufacturers no longer have employees conducting visual checks to ensure that the paint has been applied flawlessly, opting instead for a technological solution that relies on a neural network which has been “taught,” using images, to identify how a perfectly applied coat of paint should or should not look. Robots are being trained to work alongside people; they observe humans at work to determine what action to perform next, and learn in conjunction with other robots.
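
Schematically, such an inspection system is a supervised classifier trained on labeled images. The toy sketch below uses a small scikit-learn neural network on synthetic 8x8 “images”; a real system would use convolutional networks on camera data, and every value here is a stand-in.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

# 200 flawless finishes: uniform reflectance across 64 pixels (8x8).
flawless = rng.normal(0.8, 0.05, size=(200, 64))
# 200 flawed finishes: the same images with a few dark blemish pixels.
flawed = flawless.copy()
flawed[:, rng.integers(0, 64, size=5)] = 0.1

X = np.vstack([flawless, flawed])
y = np.array([0] * 200 + [1] * 200)  # 0 = pass, 1 = defect

# A small neural network "taught" from the labeled images.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                    random_state=1).fit(X, y)
print(clf.score(X, y))  # training accuracy on the toy data
```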

Experiments by Google have shown that robots are capable of developing knowledge independently: In a trial in which fourteen robotic arms learned to remove differently shaped objects from a box, one of the arms moved another object out of the way to gain better access to the object it required. This action – completely natural and logical from a human perspective – had not been programmed into the robot. Experts are convinced that it is just a matter of time before tried-and-tested methods such as just-in-time and just-in-sequence production are abandoned in favor of evolutionary algorithms with the ability to completely reinvent the production sequence. Similar results are expected for the arrangement of assets in autonomous data centers.
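
To make the idea of evolutionary algorithms concrete, the sketch below evolves an ordering of assets to minimize an invented cost: heavily communicating assets placed far apart cost more. The cost function, population size, and mutation scheme are all illustrative assumptions, not a production layout optimizer.

```python
import random

random.seed(42)
N = 8  # number of assets to arrange in a row
# Invented traffic weights between each pair of assets.
TRAFFIC = {(i, j): random.random() for i in range(N) for j in range(i + 1, N)}

def cost(order):
    # Sum of traffic weight times distance for every asset pair.
    pos = {asset: idx for idx, asset in enumerate(order)}
    return sum(w * abs(pos[i] - pos[j]) for (i, j), w in TRAFFIC.items())

def mutate(order):
    # Swap two positions to produce a slightly different arrangement.
    a, b = random.sample(range(N), 2)
    child = order[:]
    child[a], child[b] = child[b], child[a]
    return child

# Evolve: keep the cheapest arrangements, refill with mutated survivors.
population = [random.sample(range(N), N) for _ in range(20)]
for _ in range(200):
    population.sort(key=cost)
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

print(population[0], cost(population[0]))
```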

Data centers move closer to the edge

One of the most significant consequences of Industry 4.0 and the Internet of Things is decentralization – a phenomenon which is increasingly set to take hold in data centers too. Edge computing records and evaluates huge volumes of sensor data on site, where real-time requirements rule out a round trip to the cloud; this is fueling the creation of smaller computing units sited in the immediate vicinity of the machines themselves.
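
The edge pattern is easy to sketch: reduce a high-rate sensor stream locally and forward only compact summaries upstream. In the toy example below, the window size and the upload() function are placeholders for whatever a real deployment would use.

```python
from statistics import mean

WINDOW = 100  # raw readings per summary; illustrative

def upload(summary: dict) -> None:
    # Placeholder for the (occasional) transfer to a central data center.
    print("upload:", summary)

buffer = []

def on_reading(value: float) -> None:
    # Called for every raw sensor reading at the edge; only a compact
    # summary of each window ever leaves the site.
    buffer.append(value)
    if len(buffer) >= WINDOW:
        upload({"mean": mean(buffer), "max": max(buffer), "n": len(buffer)})
        buffer.clear()

# Simulated high-rate sensor stream.
for i in range(250):
    on_reading(20.0 + (i % 7) * 0.1)
```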

In self-driving cars, too, the critical functions are always accommodated inside the vehicle itself, without relying on any form of cloud connection. As a result, we may see an increased trend towards mini data centers. The industry is already looking at solutions such as data center containers, which would be much easier to use in conjunction with autonomous concepts.

Although AI is an opportunity, it also brings real challenges for data centers. “After ten years of IT infrastructure consolidation, the landscape is starting to look a little more varied for company data centers and their service and cloud providers. Diversity, complexity, and hardware expertise will be key concepts over the next five to ten years,” concludes market research company Crisp Research. In the field of machine learning in particular, the tried-and-tested x86 standard infrastructures will be incapable of producing a high-performance solution on a large scale; the digitalization sector will need to rethink its approach.

In parallel to these developments, the hardware deployed in data centers is also changing. The use of flash storage and GPUs is on the rise. Over the course of the next ten years, quantum computing will start to dominate the landscape, giving a further push to many applications, including artificial intelligence. Companies like IBM, Intel, and Microsoft are working feverishly on quantum solutions; D-Wave has already launched the first commercial quantum computer. Like mainframe machines, the technology is still subject to a unique set of complex location requirements: A quantum processor must be cooled to near absolute zero – colder even than outer space. In light of these developments, engineers will need to internalize the DevOps mindset and focus on the automation and elasticity of these systems.

How companies are preparing for new technologies

In recent years, one major change stands out as particularly significant for companies: New disruptive technologies are being created faster than ever before. The rise of cloud and big data technology serves to highlight this accelerated pace of development. While many companies spent years doubting the security of cloud solutions, competitors plowed ahead, backing up their approach with hard facts. Both of these trends are predecessors to the AI technology emerging today. Speculation about artificial intelligence has been rife since the 1950s, but the technologies needed to realize the idea have only recently become available.

“For me, there are no limits to what can be achieved with AI technology,” says Prof. Christian, a researcher at the Fraunhofer Institute for Intelligent Analysis and Information Systems (IAIS) and an expert in deep learning. “Even I am astounded by what has been achieved in the last five years. In the twenty years I’ve been working in this field, I never thought I would see this kind of progress in my lifetime.”

AI is not going anywhere – in fact, it will affect all medium-sized and large businesses. For many companies, the challenge will be how to exploit this new technology in the most profitable way and use it to digitalize their business. The IT systems deployed to implement the technology, with the data center as the nerve center of all activity, are inextricably linked to this process. CEOs and CIOs must assume the role of visionary and anticipate the impact of new technologies – and they will only be able to do so by planning and consulting with people who truly understand the new technologies.

Ravin Mehta is the founder and managing director of The unbelievable Machine Company (*um), which was recently acquired by Basefarm, a service provider based in Norway.