If we were to trace the evolution of the computing industry, two major leaps have taken place so far, with the third happening right now. Some of our readers might remember the first one – the jump from vacuum tubes to transistors and microprocessors. During this period, computers, previously useful only in a business environment, invaded our homes, and Moore’s law started taking shape.

The second leap can be seen in the shrinking of personal computers to a size that would comfortably fit in a pocket – i.e. a smartphone. This liberated us from having to do certain things at our desks and gave us a persistent connection to the Internet.

IBM Q System One, an early quantum computer available as a cloud-based service – IBM

Today, information travels unhindered across the world and access to it is easier than ever, to the point where digital infrastructure can be seen as the circulatory system of the planet. However, we are starting to hit processing limits when it comes to complex problems. Problems in protein folding, advanced encryption and quantitative finance are some of the reasons computer scientists started looking beyond existing computing architectures.

The third leap is the shift towards quantum computing. Applying quantum mechanics to data processing seems to be the way forward when it comes to tackling these problems in a time-efficient manner. While the underlying physics is complex, the main practical difference between a traditional computer and a quantum computer is that the latter makes its calculations using qubits.

Qubits differ from bits by having a theoretically infinite number of states - one, zero, and any quantum superposition of the two. And, as with many other quantum phenomena, a qubit's actual state is only decided when you measure or observe it. This opens up new computational possibilities, allowing a quantum machine to work with exponentially more states at once.
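For readers who want a more concrete picture, the sketch below simulates a single qubit as a two-element state vector in Python. The amplitudes, the Hadamard gate and the simulated measurement are illustrative choices for this toy example, not a description of how any particular quantum machine is programmed.

```python
# A minimal sketch of the idea behind a qubit, using a plain state-vector
# simulation in Python/NumPy (illustrative only -- real quantum hardware
# does not work like a classical simulation).
import numpy as np

# A classical bit is either 0 or 1. A qubit is described by two complex
# amplitudes (a, b) with |a|^2 + |b|^2 = 1; |a|^2 and |b|^2 are the
# probabilities of reading 0 or 1 when the qubit is measured.
zero = np.array([1, 0], dtype=complex)    # the |0> state

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

qubit = H @ zero                          # amplitudes (1/sqrt(2), 1/sqrt(2))
probabilities = np.abs(qubit) ** 2        # [0.5, 0.5]

# Measurement "decides" the state: the superposition collapses to 0 or 1.
outcome = np.random.choice([0, 1], p=probabilities)
print(f"P(0) = {probabilities[0]:.2f}, P(1) = {probabilities[1]:.2f}, measured: {outcome}")
```

Describing n qubits classically requires 2^n such amplitudes - with 50 qubits that is already around 10^15 numbers - which is where the "exponentially more" claim above comes from.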

We are on the brink of a new era in computing, with large enterprises investing in R&D and several quantum computers already in operation. General purpose quantum machines might be decades away, but businesses are already scoping out the changes their arrival will bring. And yet, this is a data center publication, and what we care about most is this: what will the facilities hosting these computers look like? How will a quantum computing data center work?

The quantum data center

While quantum computers will provide more processing power for certain tasks, they are not expected to replace conventional servers, but rather to complement them. Hence, colocation facilities are likely to be among the first existing data centers to house this new generation of machines. At the same time, a quantum computer and a traditional server operate in very different ways, and deploying quantum will not be as easy as fitting a blade into a rack.

To begin with, quantum processors are expected to draw 1.5kW of power at most, compared to the 5kW to 10kW needed by an average server rack. What is even more interesting is that the 1.5kW goes almost entirely to cooling, as the processor itself requires almost no power for computation.
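As a rough back-of-the-envelope illustration of that gap, the sketch below compares how many conventional racks versus quantum systems a fixed power envelope could feed. The 1.5kW and 5-10kW figures are the ones quoted above; the 1MW facility budget is an assumed example, not a real deployment.

```python
# Back-of-the-envelope comparison of power budgets.
# Figures from the article: ~1.5 kW per quantum system (mostly cooling),
# 5-10 kW per conventional server rack. The 1 MW facility budget is an
# assumed, illustrative number.
FACILITY_BUDGET_KW = 1_000        # hypothetical 1 MW of usable IT power
QUANTUM_SYSTEM_KW = 1.5           # quoted upper bound per quantum processor
SERVER_RACK_KW = (5, 10)          # quoted range per conventional rack

racks_low = FACILITY_BUDGET_KW // SERVER_RACK_KW[1]   # at 10 kW per rack
racks_high = FACILITY_BUDGET_KW // SERVER_RACK_KW[0]  # at 5 kW per rack
quantum_systems = FACILITY_BUDGET_KW / QUANTUM_SYSTEM_KW

print(f"Conventional racks supported: {racks_low:.0f}-{racks_high:.0f}")
print(f"Quantum systems supported:   {quantum_systems:.0f}")
```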

Qubits are still in their early days: they are not very stable and they do not hold their state for long. For these reasons, most quantum processors need to be supercooled to temperatures very close to absolute zero and kept there - otherwise the qubit's superposition, in which it can exist in multiple states at once, is destroyed.

Data center cooling will need to be completely re-imagined in order to keep up with quantum computing demands and the instability of qubits. To put this in perspective, the cores of D-Wave's machines, among the first commercially available quantum computers, operate at around -460ºF (-273ºC), roughly 0.02 degrees above absolute zero. As a result, cooling will need to be based on technologies that can reach those temperatures - liquid nitrogen, which only gets down to about -196ºC, is nowhere near cold enough, making helium-based dilution refrigeration the obvious choice.
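The figures above can be cross-checked with a simple unit conversion. The snippet below assumes an operating point of about 0.02 kelvin above absolute zero, as quoted above, and converts it into Celsius and Fahrenheit.

```python
# Cross-check the quoted operating temperature.
# Assumed operating point: ~0.02 K (20 millikelvin) above absolute zero.
ABSOLUTE_ZERO_C = -273.15          # absolute zero in degrees Celsius
operating_kelvin = 0.02            # the figure quoted above

celsius = ABSOLUTE_ZERO_C + operating_kelvin        # ~ -273.13 C
fahrenheit = celsius * 9 / 5 + 32                   # ~ -459.6 F

print(f"{operating_kelvin} K above absolute zero = {celsius:.2f} C = {fahrenheit:.1f} F")
```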

So, power requirements will drop massively compared to today’s data centers, and the facility’s cooling will need to be redesigned in order to accommodate the extremely low temperatures required to keep qubits stable.

The instability of qubits forces another major modification to the layout of the traditional data center. Being so fragile, qubits are affected by even small disturbances around the quantum processor. These systems will need to be kept in an electromagnetically isolated space, and the data center rack itself will need to be changed completely – to the point where it effectively becomes a Faraday cage.

There is one realm of computing that is wary of the appearance of true quantum computers - cybersecurity. Today, if I were to try and brute-force a password encrypted with the 256-bit AES standard, successfully cracking it would take longer than the current age of the universe, even using all the processing power available in the world. Given that this brute-force attack has to work through keys one binary combination at a time, imagine how much faster a quantum computer, exploiting the exponentially larger state space of qubits, could achieve the same result.
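To put a number on that claim, the sketch below estimates the classical brute-force time for a 256-bit key under an assumed (and generous) rate of 10^18 guesses per second, and compares it with the roughly square-root number of steps a quantum search such as Grover's algorithm would need. Both figures are illustrative assumptions, not measured results.

```python
# Rough brute-force timing estimate for a 256-bit key.
# Assumption (illustrative): 1e18 classical guesses per second, i.e. a
# very generous estimate of the planet's combined computing power.
KEY_BITS = 256
GUESSES_PER_SECOND = 1e18              # assumed aggregate classical rate
AGE_OF_UNIVERSE_YEARS = 13.8e9

keyspace = 2 ** KEY_BITS               # ~1.16e77 possible keys
seconds = keyspace / GUESSES_PER_SECOND
years = seconds / (365.25 * 24 * 3600)

print(f"Classical search: ~{years:.2e} years "
      f"({years / AGE_OF_UNIVERSE_YEARS:.2e} x the age of the universe)")

# A quantum search such as Grover's algorithm needs on the order of
# sqrt(2^256) = 2^128 steps -- a dramatic speed-up, though still an
# enormous number of operations in absolute terms.
grover_steps = 2 ** (KEY_BITS // 2)
print(f"Grover-style search: ~2^128 = {grover_steps:.3e} steps")
```

In practice this is usually framed as Grover's algorithm halving the effective key length, which is why preparations for a post-quantum world tend to involve longer symmetric keys and entirely new public-key schemes.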

To counter this effectively, current encryption and decryption algorithms will need to be replaced or reinforced with quantum-resistant alternatives. Like any other new technology, this will take time and investment, as well as broader access to quantum computers. Colocation data centers are again likely to see minimal impact on expenditure, since they will already have the relevant equipment in place and will be able to absorb the cost of implementing such algorithms.

Changes to the data center space will arrive alongside the mass availability of quantum computers. Racks will be redesigned; power consumption will shift rather than soar, with far more of it devoted to cooling, which will play an even more important role than it currently does. We are quite a long way from having quantum computers in our homes, or widely available within data centers, but planning for this future starts now.