
For all the talk of increasing data center efficiency and bringing down costs, there is usually a gap between the two concepts. Most data centers are like a child who gets pocket money but has no idea what the family’s actual costs are.

The organization’s financials are carefully maintained in purpose-built applications, while the IT resources are parcelled out, managed and orchestrated through automation and control software that usually has no concept of cash.

The physical data center infrastructure may be designed to deliver the most efficient service possible, and the business may have machinery to bill resources to departments or customers.


Financially illiterate servers

But how do you decide what to bill your clients? Can your finance software see where any expense is actually incurred? These answers are not just good to have, they are essential.

“The emergence of increasingly low cost cloud services, coupled with the ever more competitive market for commercial data centers, means that those investing in data centers need to have a clear understanding of their costs and be able see and model the implications of their business and technical decisions,” says Andy Lawrence, VP of research for data center technologies at 451 Research.

The fact that they don’t could simply be because the IT industry is not yet mature. Data center infrastructure management (DCIM) vendors promise cost savings, but like all too many other IT movements, those promises often have to be taken on trust, and then (hopefully) demonstrated somewhere down the line.

The cloud makes this more urgent. It’s the ultimate maturation of the IT market and changes everything by commoditization, according to Zahl Limbuwala of Romonet: “In a commoditized world, it’s critical to see costs, to control them and use them as a tool within your business.”

If data centers weren’t a commodity, providers could differentiate themselves, perhaps on the basis of service levels, or on their understanding of a particular geography or market. But data center services are now less differentiated than mid-sized Fords, Citroens and Renaults, he says. Customers will choose whichever is cheapest, and that is just a sign of maturity: “People will say IT and computing and software have been around for 50 years, but in economic terms the market is still extremely immature.”

Supermarkets know how much profit they make on every tin of beans. Data centers need the same level of visibility.

Large IT firms can’t differentiate around hardware; the margins moved to service and consultancy. But now IT is bought as a service, and those margins are disappearing too. As services get easier to use, even the consulting gravy train is hitting the buffers.

“Many people simply don’t understand cost as well as they must, to survive in a truly commoditized data center market,” says Limbuwala. “The data center is an asset, but they don’t understand the costs - either in total or at a granular level - of delivering services from that huge, energy-intensive building.”


With big companies plugging DCIM and its offshoots, data center service optimization (DCSO) and data center predictive modelling (DCPM), it’s a surprise to find a tiny software firm based in South London making as much noise as Romonet has - while resolutely refusing any attempt to lump it in with the DCIM crowd.

451 Research’s Lawrence sees prediction and cost analysis as crucial, and seems taken by Romonet’s approach, saying “its introduction is timely and addresses a clear need.”

Energy is the main cost

Romonet uses an energy model - and has a patent granted in Japan and pending elsewhere. Energy is the main operating cost of a data center, so Limbuwala’s idea is to follow the electrical and mechanical supply chain all the way up from the utility feed to the application software.

It’s basically a data center version of “activity-based costing”, the approach factories and supermarkets have used for many years to understand exactly what it costs to produce and deliver each of those tins of beans.

By understanding every component of the cost of a cloud service, a service provider can compute exactly how much each user should be charged, says Limbuwala: “It’s important because cost is not one-dimensional.” Cost includes fixed capital written off over time, and operating expenditure, which affects cash flow and is treated differently.

He compares colocation services to rented office space: “If I have two equal-sized rooms, I can rent them for the same price to two different companies. But if one fills the room with ten people 24-by-seven, while the other has one person in it for one hour per month, the revenue is identical, but the cost and margin is different.”

The office rental firm needs to account for consumption of electricity and coffee, as well as cleaning and toilets. In the data center, the provider will need to address rack density, usage and other factors.

If 5kW racks are rented for a fixed cost, some customers may use all of the power allocation, while others might just put in a switch and use 100W for three years.

Consumption-based billing is almost unknown in the US, and still rare in Europe, but without it the provider has no idea what margin he or she is getting on each customer.
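The fixed-price rack scenario above can be sketched in a few lines. This is an illustrative-only calculation - the rack price, energy rate and customer draws are invented numbers, not figures from the article - but it shows why identical revenue can hide very different margins:

```python
# Hypothetical sketch: why fixed-price rack billing hides per-customer margin.
# All prices and power figures below are invented for illustration.

RACK_PRICE_PER_MONTH = 1500.0   # fixed price for a 5 kW rack (assumed)
ENERGY_COST_PER_KWH = 0.12      # provider's blended energy cost (assumed)
HOURS_PER_MONTH = 730

def monthly_margin(avg_draw_kw: float) -> float:
    """Margin on one fixed-price rack, given the customer's average draw."""
    energy_cost = avg_draw_kw * HOURS_PER_MONTH * ENERGY_COST_PER_KWH
    return RACK_PRICE_PER_MONTH - energy_cost

# A customer running the rack flat out vs. one running a single 100 W switch:
full_rack = monthly_margin(5.0)   # uses the whole 5 kW allocation
idle_rack = monthly_margin(0.1)   # 100 W, around the clock

print(f"full rack margin: {full_rack:.2f}")   # 1062.00
print(f"idle rack margin: {idle_rack:.2f}")   # 1491.24
```

Without metering, both racks look like the same 1,500-a-month product; only consumption data reveals that one customer is roughly 430 a month more expensive to serve than the other.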

“Whether you are an internal data center or Equinix, you need that knowledge, because you are faced with competition from everyone who can sell the same product,” says Limbuwala. “There is nothing special about your data center.”

What is needed is to treat capital costs and variable costs separately. The system has to know what capacity is provisioned, and what is actually used - and where it is used. That way, the amortized capital and the operating costs can be allocated to each customer in proportion to their use of the resources - and those costs related to the charges levied.

To work properly, the energy costs have to be measured and apportioned on an hourly basis. This could be done by hand using reports and spreadsheets, but it is far better to have it in real time and automatic, says Limbuwala: “The difficult bit is not doing it once, but doing it continuously.”
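A minimal sketch of that allocation might look like the following. The split chosen here - amortized capital by provisioned capacity, energy by metered hourly consumption - is one plausible reading of the approach described above, and every number is invented:

```python
# Illustrative-only allocation: capital split by provisioned kW,
# energy (opex) split by metered hourly kWh. All figures are invented.

monthly_capital = 100_000.0    # amortized build cost per month (assumed)
energy_cost_per_kwh = 0.12     # blended utility rate (assumed)

provisioned_kw = {"cust_a": 50.0, "cust_b": 50.0}   # equal capacity sold
hourly_kwh = {
    "cust_a": [45.0] * 730,    # heavy, round-the-clock user
    "cust_b": [2.0] * 730,     # lightly loaded racks
}

def allocate_costs() -> dict:
    """Return each customer's share of capital plus metered energy cost."""
    total_provisioned = sum(provisioned_kw.values())
    allocation = {}
    for cust, capacity in provisioned_kw.items():
        capital_share = monthly_capital * capacity / total_provisioned
        energy_share = sum(hourly_kwh[cust]) * energy_cost_per_kwh
        allocation[cust] = round(capital_share + energy_share, 2)
    return allocation

print(allocate_costs())
```

Both customers carry the same capital share, because they bought the same capacity, but the hourly meter readings push nearly all the variable cost onto the heavy user - exactly the visibility a flat charge-back scheme cannot provide.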

Data center interdependencies

Put that way, it sounds simple, but data centers contain interdependencies: “In the data center flow chart, there are loops, things cross over, and multiple paths. Complexity is inevitable when electrical and mechanical systems are put together.”

It’s worth getting right, because other things “fall out” of the energy model, including operational figures, Limbuwala continues, allowing efficiency measurement and a lot of the things that DCIM has been promising.

For instance, a data center may have a PUE of 1.2, but what if one of the chillers is using more power than it should? The data center manager may see that a chiller is using (say) 1.2MW - perhaps more than its neighbor - but how much power should it be using?

Understanding that requires knowing what is happening in the part of the data center served by that chiller. “Unless you have an energy model, you don’t know,” says Limbuwala.

It’s difficult for humans to spot anomalies, but a predictive energy model will highlight them “like a highlighter pen”, prompting activities such as cleaning the filters in a laboring chiller: “If you can’t see this, it can go unnoticed for six months.”
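The “highlighter pen” idea can be sketched as a comparison between metered power and a model prediction. The toy model below - chiller draw as a fixed fraction of the IT load on its cooling loop - is a crude stand-in invented for illustration, not Romonet’s actual model, and all readings and thresholds are assumptions:

```python
# Hedged sketch: flag chillers whose metered draw exceeds what a simple
# energy model predicts. The model and all numbers are invented stand-ins.

def predicted_chiller_kw(it_load_kw: float, cooling_overhead: float = 0.25) -> float:
    """Toy model: chiller draw as a fixed fraction of the IT load it cools."""
    return it_load_kw * cooling_overhead

def flag_anomalies(chillers, tolerance: float = 0.15) -> list:
    """Return names of chillers drawing more than `tolerance` above prediction."""
    flagged = []
    for name, it_load_kw, actual_kw in chillers:
        expected = predicted_chiller_kw(it_load_kw)
        if actual_kw > expected * (1 + tolerance):
            flagged.append(name)
    return flagged

readings = [
    ("chiller_1", 4000.0, 1050.0),   # ~1000 kW expected: within tolerance
    ("chiller_2", 4000.0, 1200.0),   # same load, 20% over the model: flagged
]
print(flag_anomalies(readings))   # → ['chiller_2']
```

The point is that the absolute reading (1.2MW) is meaningless on its own; only the model’s expectation for that chiller’s actual load turns it into an anomaly worth investigating.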

As well as with DCIM, this kind of software has to integrate with the building management system - and Romonet’s strategy is to partner with the leaders in those fields, says the CEO.

Beyond this, Limbuwala wants to step beyond the data center field to other energy intensive sectors, and help organisations like hospitals take a more efficient approach. If he can pull that off, it might just be a sign that the IT industry is finally maturing.