In part one of this blog I explored the last ten years of data center evolution and argued that, while there have been many developments in data center technology and design, those changes were not fundamental enough to warrant the word ‘revolution’.

Liberty Leading the People – Pixabay / WikiImages

I went on to postulate that there was still an actual revolution this industry would need to go through in order to enable and support the whole new gamut of technology- and data-enabled applications we’re only just beginning to experience.

Scale is the real challenge and the game-changer for the future of this field. We’re still largely a cottage industry, heavily reliant on human labor, and, exactly as in the first great industrial revolution, it is automation that will allow us to scale up – not by double-digit percentages, but by factors of many thousands, which is the scale we need to achieve.

What technology is driving this need for scale?

Investors take note – AI and blockchain are neither the big drivers nor are they the solution to everything technology-wise, now or in the future.

Both have their place and their part to play, but both (blockchain more so than AI) are technologies looking for problems to solve right now. AI has already advanced technology in many areas, and it is good to see people testing its application broadly, but right now the hype is bigger than the tangible benefit in at least half the areas where it’s being applied.

This will change with time, as the scenarios where each of these technologies is best applied to deliver true incremental value shake out.

Here at Romonet we use AI/ML in our analytics platform, but only to perform very specific functions within a bigger system. Those functions lend themselves well to the high value AI can provide when applied in the right context – situations where its application is surgically precise, rather than broad contexts where the output is of much lower accuracy and hence much lower value.
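
To make that pattern concrete, here is a minimal, purely illustrative sketch – it is not Romonet’s platform or code, and the telemetry values are invented. It shows a small ML model (a simple regression over hypothetical cooling data) doing one narrowly scoped job, while the surrounding pipeline stays deterministic and rule-based:

```python
# Illustrative sketch only -- not Romonet's actual platform.
# Pattern: ML does one narrow, well-bounded prediction; the rest of
# the pipeline remains deterministic rules.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical historical telemetry: outside-air temperature (C),
# IT load (kW), and measured cooling energy (kWh) per interval.
X = np.array([[18.0, 410.0], [22.5, 432.0], [27.0, 455.0], [31.5, 470.0]])
y = np.array([95.0, 118.0, 141.0, 169.0])

# The ML component: a small, surgically scoped model for one metric.
cooling_model = LinearRegression().fit(X, y)

def analyse_interval(outside_temp_c: float, it_load_kw: float,
                     measured_cooling_kwh: float) -> dict:
    """Deterministic pipeline step that uses the model's prediction
    to flag intervals where cooling energy deviates from expectation."""
    expected = float(cooling_model.predict([[outside_temp_c, it_load_kw]])[0])
    deviation = measured_cooling_kwh - expected
    return {
        "expected_kwh": round(expected, 1),
        "deviation_kwh": round(deviation, 1),
        "flag": abs(deviation) > 0.15 * expected,  # plain rule, not ML
    }

print(analyse_interval(25.0, 440.0, 160.0))
```

The design choice is the point: the model’s scope is narrow enough that its accuracy can be measured and trusted, and everything downstream of it stays explainable.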

My caution to anyone investing or deploying these technologies today is to make sure you know someone who actually understands what AI/ML is and is not, as well as how and where it’s best applied for maximum value.

Okay, so some of the drivers of this increased need – and speed – to scale are:

General-purpose compute and deep data storage – these absolutely need to, and will, continue down the path of becoming a utility with almost completely ubiquitous availability. Today’s cloud and hyperscale guys are doing a great job of turning general-purpose compute into a utility, with one of the benefits for you and me being a marginal cost that is trending towards near-zero.

Application-specific compute – already a rapidly expanding market, with everyone from Google to Microsoft developing and using their own ASICs - be they GPU, TPU or some RISC variant - to increase speed and productivity of the more ‘exotic’ workloads. The tuning of hardware to much more closely match the demands of specific workload types is an obvious evolution of a general-purpose capability. If the Ford Model T was a general-purpose CPU then the motorbike, pick-up truck, tank, van and large articulated lorry are all examples of application-specific processing units.

The real challenge is that progress cannot happen in just one layer of the technology stack without impacting the layers above and below it. From application software to the delivery of power and cooling, everything is impacted and needs to work and develop more or less in sync.

Today, it’s still largely humans that provide the ‘management’ between the technology layers, mostly because we see them as different industries, disciplines, even different skillsets. With humans as ‘glue’ between the layers, overall system-level (macro) management grows in complexity and productivity becomes an inverse function of scale.

Scaling with humans is possible up to a point – then you have to break the management of those humans down into smaller, more manageable teams, and this inevitably creates unintended communication silos. That’s simply human nature, and it can’t easily be overcome.

Thus cross-layer automation and controls are critical to achieving real scale and economies of scale; without this, scale is your enemy.
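
As a purely hypothetical illustration of what ‘automation as the glue between layers’ can look like, the sketch below wires an IT-layer telemetry feed directly to a facility-layer cooling setpoint through a simple feedback loop. The function names and interfaces are assumptions, standing in for whatever DCIM or BMS APIs a real site exposes:

```python
# Minimal cross-layer automation sketch under assumed interfaces:
# read_rack_telemetry() and set_crah_setpoint() are hypothetical
# stand-ins for a real site's DCIM / BMS integration points.
import time

def read_rack_telemetry() -> dict:
    """Hypothetical IT-layer feed: rack power draw and worst inlet temp."""
    return {"rack_power_kw": 8.4, "max_inlet_temp_c": 26.1}

def set_crah_setpoint(temp_c: float) -> None:
    """Hypothetical facility-layer actuator (e.g. a BMS write)."""
    print(f"CRAH supply setpoint -> {temp_c:.1f} C")

def control_loop(target_inlet_c: float = 25.0, setpoint_c: float = 20.0,
                 interval_s: int = 300, iterations: int = 3) -> None:
    """Replace the human 'glue' between layers with a feedback loop:
    nudge the cooling setpoint based on what the IT layer actually needs."""
    for _ in range(iterations):
        telemetry = read_rack_telemetry()
        error = target_inlet_c - telemetry["max_inlet_temp_c"]
        setpoint_c += 0.5 * error                      # proportional-only step
        setpoint_c = max(16.0, min(27.0, setpoint_c))  # safety clamp
        set_crah_setpoint(setpoint_c)
        time.sleep(interval_s)

if __name__ == "__main__":
    control_loop(interval_s=0)  # zero interval so the demo runs instantly
```

Trivial as it is, the loop never gets tired, never mis-hears an instruction, and costs the same whether it runs on one rack or ten thousand – which is exactly the property human ‘glue’ lacks.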

Technology to watch

I’m often asked by investors and vendors alike, “What companies and technologies should we keep an eye on?” Well, I’ve already mentioned, in general terms, some of the technology that will fuel the revolution, but it’s not just compute or software companies that are worth watching. It’s also worth watching those putting together system-level solutions: integrating existing sub-system technologies and components and ‘gluing’ them together with automation and technologically scalable solutions (not humans, as I mentioned earlier!).

There are many examples of such integration and automation innovation in the market (and I could name several more), but below are two that I’ve known well for a while now, from their early R&D days through to the fully production-ready solutions they deploy today:

Vapor – I met the infectiously passionate Cole Crawford while Vapor was still in stealth mode, and he showed me a picture (an ‘artist’s impression’, at the time, of what the Vapor solution would look like when scaled) of the new world he was about to build. It certainly caught my imagination, and he was the first person I’d met who was intent on breaking the evolutionary process with a revolutionary product.

Early prototype of the Vapor Chamber – Vapor IO

What Cole foresaw, and I hadn’t at the time, was that the Edge was going to play a leading role in driving the technology revolution that I (and many others) had been predicting for some time. While the rest of the industry is busy debating how to define ‘Edge’ and what it means, Cole and the team at Vapor are out there building, deploying and enabling it. Too many of the people I introduced Cole to saw, through blinkered eyes, just a new rack configuration with a novel cooling solution. So many missed the point, and the opportunity to get on board or ahead of the curve.

All credit to Cole and the team: if you know anything about where they are now and where they are headed, you’ll know this is one solution you’ll wish you’d paid more attention to… Watch this (vapor.io) space, as they say!

Iceotope – I’ve known the guys in Sheffield, UK, for many years, having first modelled their technology (via Romonet) to help them validate its applicability within the data center market.

There are many liquid cooling solutions out there today, both old and new, and Romonet has modelled many of them either directly or indirectly. Honestly (and I’ll get hate-mail for saying this), hybrid solutions – meaning part air, part liquid – don’t have a strong financial story compared to traditional air-based cooling systems today. They do have specific use cases where there is a strong benefit, but those use cases represent 10% of the total data center market.

That doesn’t mean they are bad, but their commercial viability is somewhat more limited than that of the fully immersed solutions on the market, which can all but eliminate the need to deliver air as a cooling medium.

I’ve watched Iceotope’s technology and products evolve from science experiments, over twelve years ago, into what now looks and behaves like a scalable, production-ready solution – one I’d deploy if I were personally responsible for standing up data centers and compute capacity.

Liquid cooling is not just for HPC anymore, and I wouldn’t be surprised if – in fact, I fully expect that – its adoption becomes mainstream as soon as the next generation of CTOs, CIOs, engineers and decision-makers takes over from the current one. That might sound harsh, but as I pointed out in Part 1 of this blog, adoption of new technology is a function of perceived risk, however the term ‘risk’ is defined in your own job function.

There’s no logical or rational reason to delay adoption of these more mature technologies and solutions. Automation is required to get us past the scaling crux point many businesses are now hitting, and integration across the layers of the technology stack is what underpins that automation.

If you’re reading this and have a new technology or approach you feel is part of the revolution, do feel free to share and comment here. I’d love to see and learn more – innovation is what keeps this industry an interesting and exciting place to work.