Has the data center reached the pinnacle of automation? There is no doubt that facilities are more “robotic” and self-managing than ever before, but there could be more to come. Software is continuing its quest to become “the great definer” at the heart of the software-defined data center (SDDC), and the data center itself is taking on an increasingly chameleon-like shape.
To Matt Altieri, marketing director at Device42, the SDDC means software-defined everything: the idea of virtualization extended to all parts of the IT stack, enabling the delivery of infrastructure as a service (IaaS).
“Software-defined data centers are not only simpler and more manageable,” Altieri told us, “but they also more easily align with companies’ business needs. The speed of deploying systems is dramatically increased.”
He says SDDCs have the flexibility to take on many configurations, capabilities, and forms, and can support companies ranging from the size and complexity of the FANG companies (Facebook, Amazon, Netflix, and Google) down to those with far simpler operations.
Well defined
The increase in simplicity realized from “converged infrastructure,” as Altieri put it, paved the way for the delivery of IaaS: Amazon Web Services changed the world of IT by pioneering a new delivery model, and other public cloud providers such as Microsoft, Google, and IBM soon followed.
Altieri said these providers let you use application programming interfaces (APIs) to create the infrastructure you need to run your business on demand. An SDDC differs from a private cloud, which offers only virtual-machine self-service and may rely on traditional provisioning and management underneath. SDDC concepts instead describe a data center that can span private, public, or hybrid clouds.
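To make that concrete, here is a minimal sketch of API-driven provisioning using AWS’s boto3 Python SDK. The region, AMI ID, and sizes are placeholder values, and a real deployment would add networking details, IAM, and error handling:

```python
# Minimal sketch of API-driven infrastructure: network, compute and
# storage requested on demand through one programmatic interface.
# Assumes AWS credentials are configured; region, AMI ID and sizes
# below are placeholder values.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Network: carve out a private address space for the application
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")

# Compute: request a single small virtual machine
instances = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)

# Storage: a 100 GB block volume, ready to attach
volume = ec2.create_volume(AvailabilityZone="us-east-1a", Size=100)

print(instances["Instances"][0]["InstanceId"], volume["VolumeId"])
```

The point is that network, compute, and storage all come from the same programmatic interface, on demand, rather than from separate hardware workflows.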
Torsten Volk, managing research director with Enterprise Management Associates, and no stranger to the SDDC arena, believes that only in an SDDC can customers achieve policy-driven and fully automated application provisioning and management.
Volk said: “Today, when the request comes in to host a specific enterprise application, IT teams (for storage, networking equipment and servers) have to crack open multiple hardware-vendor-specific command-line interfaces (CLIs) and control panels to serve up the storage, network and compute resource pools needed by the virtual machine administrator to create the virtual application environment.”
But Volk said this approach is error-prone, requires vendor-specific skills, and can take days or weeks. He said these issues are the reason for the rapid growth of Microsoft Azure and Amazon Web Services, where line-of-business (LoB) developers already have fully programmatic access to everything they need.
Volk’s long-held vision is of a “business logic layer”: artificial intelligence/machine learning-driven (AI/ML) intelligence that manages hybrid IT infrastructure in a manner that accounts for the business importance of each application, as well as for the organization’s constantly changing strategic priorities. In short, the business logic layer turns the SDDC into the business-defined data center (BDDC).
Volk elaborated: “The crux is that we can only make the step to the business-defined data center if we make IT operations management sensitive to the business context, instead of rigidly enforcing best practices for application management. For example, in a business-defined data center, the business logic layer makes the painful decisions in terms of which tradeoffs have to be made for IT to be optimally aligned with the business.”
This could mean accepting the slow performance of a specific application as the price of optimizing the user experience of another, much more business-critical application. Volk said these decisions cannot be made effectively in today’s siloed IT, as individual operators do not have access to the business context.
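As an illustration only (not Volk’s implementation), a hypothetical business logic layer might allocate a scarce resource in strict business-priority order, letting less critical applications absorb the shortfall. All names and numbers here are invented:

```python
# Hypothetical sketch of a "business logic layer": a scarce resource
# (storage IOPS) goes to the most business-critical applications first,
# and less critical apps absorb the tradeoff. All values are invented.
from dataclasses import dataclass

@dataclass
class App:
    name: str
    business_priority: int  # higher = more critical to the business
    iops_requested: int

def allocate_iops(apps: list[App], iops_available: int) -> dict[str, int]:
    """Grant IOPS in strict priority order until the pool runs dry."""
    grants: dict[str, int] = {}
    for app in sorted(apps, key=lambda a: a.business_priority, reverse=True):
        grant = min(app.iops_requested, iops_available)
        grants[app.name] = grant
        iops_available -= grant
    return grants

apps = [
    App("order-processing", business_priority=10, iops_requested=60_000),
    App("internal-wiki", business_priority=2, iops_requested=30_000),
]
print(allocate_iops(apps, iops_available=70_000))
# {'order-processing': 60000, 'internal-wiki': 10000}
# The wiki runs slower so the business-critical app stays fast.
```

The toy policy makes exactly the kind of tradeoff Volk describes: the internal wiki gets a third of the IOPS it asked for, so order processing keeps its full allocation.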
A changing landscape
For Michael Letschin, field CTO at Nexenta, SDDCs are changing the entire data center landscape. In the past, facilities were built from big-block systems from legacy players, purpose-built to support a narrow, predefined set of workloads.
“This limited the end user’s choice for technology and usage,” Letschin said. “The newer software-defined storage model is shifting this paradigm and giving organizations the ability to run an agile, scalable and cost-effective SDDC.”
Additionally, the selection of vendors for the data center has expanded, challenging the old adage that “nobody ever got fired for buying IBM,” Letschin said. In today’s SDDC, management and administrative staff are encouraged to look for innovative software-defined solutions on the market to address their data center pain points.
He continued: “The benefits of transitioning to an SDDC also extend beyond the core technology to deliver significant space, power and cooling reductions. As a result of the clear business benefits that software-defined storage (SDS) provides, an increasing number of enterprises are shifting to the SDDC model.”
According to Letschin, if the last five to ten years are any indication, the growth of the software-defined data center will be almost impossible to predict, but it will include integration with public cloud solutions to deliver even more “just in time” capabilities. He said: “We will see the integration of multi-data center capabilities.
“On the compute side, the rise of containers will bring the application back to the forefront, and the idea of software-defined compute running solely on full virtual machines will become obsolete. This increase in application-based compute will lead to more automation, self-service and self-supporting infrastructure, with AI taking a bigger role in actively managing the SDDC.”
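To illustrate what application-based compute looks like in practice, here is a minimal sketch using the Docker SDK for Python that launches an application directly as a container rather than provisioning a full virtual machine for it. The image and port mapping are arbitrary examples:

```python
# Minimal sketch of application-based compute: the application itself is
# the unit of deployment, not a full virtual machine.
# Assumes the Docker SDK for Python (pip install docker) and a running
# Docker daemon; the image and port mapping are arbitrary examples.
import docker

client = docker.from_env()  # connect to the local Docker daemon
container = client.containers.run(
    "nginx:alpine",          # the application image, pulled if absent
    detach=True,             # run in the background
    ports={"80/tcp": 8080},  # expose the app on host port 8080
)
print(container.short_id, container.status)
```

A few lines like these, driven by an orchestrator rather than a person, are the kind of automation-friendly, self-service building block Letschin is pointing to.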
Letschin said storage will most likely undergo some of the biggest changes and, at the same time, remain the most consistent, and data integrity will continue to be key for compliance and security purposes. The use of commodity hardware will become more prevalent, with the ability to cram more capacity into less space while delivering higher performance. “One thing that will start to become obsolete in the data center,” Letschin said, “will be ‘performance-only’ solutions, and general-purpose storage will likely make a comeback because of the flexibility inherent to SDS.”
Overall, Letschin said the data center will continue to operate as the nervous system of the business, but it will be much more agile than the data center we recognize today.
A version of this article appeared in the April/May 2017 issue of DCD magazine