IT strategies have been transformed over the past decade, oscillating between keeping resources strictly in-house and meeting some or all computing needs in the cloud.

Sold on the promise of cost savings and greater efficiency, organizations have shifted workloads onto public platforms run by the likes of AWS, Google and Microsoft. In doing so, they have put their faith in the data centers where those cloud services reside. But many IT decision-makers, it seems, are beginning to suspect that this faith may have been misplaced.

A recent report by Sapio Research, conducted on behalf of London-based Volta Data Centres, raised a wealth of concerns about the quality of data center experiences, and identified what appears to be a disconnect between infrastructure performance and the business objectives of those relying on it.


Numbers don't lie

Some 56 percent of IT decision-makers questioned by Sapio last year said they had suffered downtime in the previous six months, with 46 percent saying they had experienced data loss in the preceding 12 months. A particularly unfortunate 4 percent said they experienced data loss five times over six months. Little wonder then that many of those questioned are keeping a foot in both outsourced and insourced camps.

A popular model is to split the overall IT load between in-house server infrastructure and the public cloud, or, alternatively, to hedge bets by spreading it across multiple public cloud providers.

There are even organizations sufficiently disenchanted with their public cloud experience to be actively considering building their own, brand-new enterprise-class data center, so they can take back full control while enjoying cloud-like access to resources.

“An enterprise probably wants to have at least part of what they do in their own data center,” said Kevin Deierling, vice president of marketing at Mellanox, a supplier of interconnect solutions. “We’ve seen companies who move 100 percent into the cloud. Then they realize this can be very, very expensive, and that they can be more cost-effective with a hybrid cloud solution.”

Deierling said that heavily web-focused businesses in particular often find value in having their own data center space, both in terms of operational cost and control. But he doubted whether building, owning and managing everything from the ground up is necessarily the right response. Better, he said, to call on one of the various options on the market for a hybrid cloud strategy that doesn't break the bank.

“Digital Realty, and other such companies, can offer you an environment that you can just go put your infrastructure in, and you're responsible for your servers and your dashboard and your storage, but they provide heating, cooling, the security guard,” Deierling pointed out.

“It’s not like 10 or 15 years ago, where building your own data center meant investing in real estate. Today, it's about using somebody else's colo facility, but you're going to own the architecture of that data center. And you're going to be able to control everything you're doing and control your costs.”

He added that there are software-led solutions for reducing the operational expense of having a private cloud set-up: “A good example is Nutanix, which builds hyper-converged infrastructure,” he said. “They basically deliver a set of integrated solutions that provide compute, storage and all of the virtualization. So instead of having to go off and start configuring virtual machines, they provide a turnkey solution.

"We partner with them to do the networking bit. It’s about the enterprise focusing on the business application that they are running and not needing to be an expert in storage and virtualization networking.”

Deierling said that these types of solutions have traction across a range of verticals: “We have many customers in the government sector throughout the world that want to keep data in their own data center.

“We also have a number of customers in the financial community, as well as the health care and health provider market where, for regulatory reasons, they need to be able to ensure that they are offering security and that they're not sharing your information. These organizations may be able to host some services in the public cloud, like HR functions or payroll. But their core business needs to be protected.”

Don't do it yourself


Michael Wood, CMO and VP of product at Apstra, a provider of intent-based data center automation solutions, is enthusiastic about the growing number of automation tools designed to make a hybrid cloud strategy work.

“New designs are now available that enable you to create a level of availability, reliability and redundancy that is unprecedented,” he said.

“One of our customers is a financial services firm that is implementing around 200 micro data centers within a campus location.

"This gives them greater efficiencies with building maintenance, but also the ability for employees to seamlessly move around these campus buildings and be more productive and efficient. But it’s a model that demands that the compute and the processing and the storage be as close as possible to where the data is originating, at scale, there in that location.”

What organizations crave, Wood said, is a solution based on cloud principles that lets them gain greater control – over policy, over how lines of business consume services, over the way DevOps teams use services, and over the way application teams implement services. Automation, he argued, is the only answer: “You want cloud efficiencies, to be able to push up a workload, have new services and operate them within a few hours, rather than waiting weeks. That's where the automation model comes in. It creates that cloud-like experience.”

The other advantage of the automated model that excites Wood is the freedom it offers from getting locked into a single way of doing things: “Those organizations that build data center infrastructure without automation may well find that they’ve committed themselves, without necessarily having intended to,” he said. “The guy who built everything from scratch five years ago has now left the company. So how do they change anything without spending fresh capex? Without the freedom that automation gives, they are little better off than those locked into a public cloud contract.”

The risk of either trusting everything to the public cloud or maintaining everything on in-house servers in the old way is about more than just cost, he argued: “We've seen times where a line of business decides that they're going to go set up a service out in the cloud.

“Then they have no visibility, no control, no knowledge of what's going on, and a year later, they end up discovering that there's a compliance gap that's been created, a regulatory hole that's been established, or even worse, a security vulnerability. By having an automated cloud-like experience in their own private data center, they can provide uniform policy and control and mitigate security vulnerabilities.”

Wood envisions, and welcomes, a future where every aspect of a private and public cloud strategy is completely automated: “People will look back on this era and say ‘I can't believe that people ran their data centers and their campuses the way they did back in 2019.’”