The advent of data center service management (DCSM) solutions is giving enterprises the information they need, driving greater transparency into the cost of personnel time and activities – one of the largest operational expenses in the data center.

Recently, I was talking to the data center director of a large financial services organization. When I asked him what motivated his organization to implement DCSM, his answer was surprising: “The public cloud.”

Knowing the costs

Because his company knows exactly how much it costs to run applications on an Amazon AWS instance, his upper management is now asking what it costs to run applications in the company’s own data center.

This particular data center director said he was confident that, after evaluating internal operations, he would find he could deliver compute capacity to the business more cost-effectively than AWS – even if he ignored the potential data privacy and compliance issues that might be raised by running his company’s apps in the public cloud.

But assessing all the costs of the internal option was a challenge. His team had rolled out a data center infrastructure management (DCIM) solution 18 months before our conversation. Since then, however, he had realized there was a huge hole in determining the true cost of operating the data center: they were not measuring the time and activities of personnel.

Comparing internal costs with AWS

The DCIM solution gave them a very good handle on capacity planning, energy costs and how much space was being used by assets in the data center. But it did not give a clear understanding of the cost of people, one of the largest items in the data center’s operating budget.

When the CIO asked how internal costs compared with Amazon AWS, it prompted the company to expand its DCIM implementation to perform what is being described as data center service management – DCSM.

The company started using workflows to track the changes that data center personnel were making. This not only ensured consistency in how people performed adds, moves and changes; it also gave them granular insight into what work was being done, and on which assets. From there, they could readily attribute costs to specific applications.
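To make the mechanics concrete, here is a minimal sketch of how personnel hours captured by such a workflow might be rolled up into a cost per application. The record fields, application names and hourly rate are illustrative assumptions, not details from this company’s tooling or any particular DCSM product.

```python
from collections import defaultdict

# Hypothetical work-order records exported from a DCSM workflow system.
# The schema and values are assumptions for illustration only.
work_orders = [
    {"task": "rack new server", "application": "trading-platform", "hours": 3.0},
    {"task": "cable move",      "application": "trading-platform", "hours": 1.5},
    {"task": "decommission",    "application": "risk-engine",      "hours": 2.0},
]

HOURLY_RATE = 85.0  # assumed fully loaded labor rate, USD per hour

def labor_cost_by_application(orders, rate):
    """Roll tracked personnel hours up into a labor cost per application."""
    costs = defaultdict(float)
    for order in orders:
        costs[order["application"]] += order["hours"] * rate
    return dict(costs)

if __name__ == "__main__":
    for app, cost in sorted(labor_cost_by_application(work_orders, HOURLY_RATE).items()):
        print(f"{app}: ${cost:,.2f}")
```

Combined with the energy and space figures a DCIM tool already provides, per-application labor numbers like these yield an internal cost that can be lined up against a public cloud bill.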

We need transparency

The moral of this story is that whether or not organizations choose to use public clouds from any of the myriad vendors on the market, those clouds have changed the way data centers operate, and how we evaluate and understand their granular costs. We are seeing a shift from a time when managing infrastructure efficiently was good enough to a new world in which the data center delivers a service that is compared directly with those offered by third-party vendors.

Traditional DCIM solutions simply don’t go far enough to help businesses gain a clear understanding of all the costs in the data center, which is why we now see many progressive organizations moving to DCSM to ensure they have full transparency of their operations.

Robert Neave is co-founder, CTO and vice president of product management at Nlyte Software.

This article appeared in the December/January issue of Datacenter Dynamics Magazine.