As a data center operator, you may have recently received marching orders to optimize your government-owned or large enterprise facility. For some, this may be a daunting proposition, particularly for those whose data centers are a bit older, are not operating efficiently, and offer little visibility into real-time performance.

Are you feeling overwhelmed by the task at hand? That’s understandable, as any data center is a complex environment. The last thing you’ll want to do is introduce risk, which could easily do more harm than good.

Why is optimization necessary?


Data centers are absorbing more and more of the power grid’s capacity every year. Letting legacy data centers run wild is a thing of the past, since it is now quite clear that energy costs represent an increasing percentage of an IT facility’s operating expenses. According to the Natural Resources Defense Council (NRDC), data center electricity consumption will increase to about 140 billion kilowatt-hours annually by 2020, the equivalent output of 50 power plants, costing American businesses a projected $13 billion a year in electricity bills.

Fortunately, times have changed, with both government and enterprise data centers becoming more reliant on cloud applications, virtualization and converged infrastructure. Data center managers are being asked to slash costs and maximize efficiencies wherever possible. Energy performance optimization is considered low-hanging fruit because substantial improvements can be realized with relatively small investments.

Nowhere is this happening more aggressively than in the public sector. Back in 2010, the U.S. Office of Management and Budget (OMB) launched the Federal Data Center Consolidation Initiative (FDCCI) to reduce overall reliance on expensive and inefficient legacy data center technologies. The goals of the FDCCI are to promote green IT, reduce the cost of hardware, software and maintenance, and shift investment to more efficient computing platforms and technologies.

The FDCCI was followed by the Federal Information Technology Acquisition Reform Act (FITARA) and the Data Center Optimization Initiative (DCOI), both of which aim to accelerate the FDCCI’s progress.

Unfortunately, the transformation has been slower than expected. Federal agencies, for instance, were asked to close 40 percent of their data centers by the end of 2015, yet many have fallen behind. At the end of last year, the Pentagon had closed only 18 percent of its data centers.

In the next few years, you can expect to see a much greater optimization and consolidation effort in the public sector as agencies look to stay on track and maintain regulatory compliance. And you can bet this effort will spill over into the private sector, too.

How to get the ball rolling in your data center

There are several actions you can take to reduce the overall energy usage in your data center facility. Virtualization and consolidation are two techniques that tend to “densify” your infrastructure, and they can lead to energy savings by eliminating zombie and underutilized servers. Coupling this with better thermal management enables operators to shrink their footprints and boost the efficiency of precision cooling systems.
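To make the “zombie server” idea concrete, here is a minimal sketch of how a consolidation review might flag candidates from utilization data you already collect. The data structure, field names and thresholds are illustrative assumptions, not taken from any particular monitoring or DCIM product.

```python
# Sketch: flag likely consolidation candidates from utilization data.
# Fields and thresholds below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ServerStats:
    hostname: str
    avg_cpu_pct: float      # 30-day average CPU utilization
    peak_cpu_pct: float     # 30-day peak CPU utilization
    avg_power_watts: float  # average power draw over the same window

def consolidation_candidates(servers, avg_threshold=5.0, peak_threshold=20.0):
    """Return servers whose sustained and peak utilization are both low --
    likely zombie or underutilized hosts worth virtualizing or retiring."""
    return [
        s for s in servers
        if s.avg_cpu_pct < avg_threshold and s.peak_cpu_pct < peak_threshold
    ]

if __name__ == "__main__":
    fleet = [
        ServerStats("app-01", avg_cpu_pct=42.0, peak_cpu_pct=88.0, avg_power_watts=310),
        ServerStats("legacy-07", avg_cpu_pct=1.2, peak_cpu_pct=6.5, avg_power_watts=240),
        ServerStats("batch-03", avg_cpu_pct=3.8, peak_cpu_pct=15.0, avg_power_watts=195),
    ]
    candidates = consolidation_candidates(fleet)
    wasted = sum(s.avg_power_watts for s in candidates)
    print(f"{len(candidates)} candidates, roughly {wasted:.0f} W of reclaimable draw")
```

Even a rough pass like this gives you a defensible shortlist to review with application owners before any workloads are moved or retired.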

Perhaps the most important thing you can do is benchmark your progress as you go along. This is particularly important for government-run data centers. Executive Order 13693, “Planning for Federal Sustainability in the Next Decade,” requires all federal agencies to have advanced energy meters in place in government data centers by September 30, 2018.

This is important for enterprise data centers, too. After all, you can’t optimize your data center if you don’t know how it’s performing. By tracking your daily power usage, specifically at the rack level, you will have a much easier time tweaking your setup. Just about every major equipment manufacturer has developed its version of a DCIM (Data Center Infrastructure Management) platform to do just that. The secret is to understand what you actually need, and not over-purchase a system with more bells and whistles than you will ever use.
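As a rough illustration of the kind of rack-level tracking a DCIM platform automates, the sketch below rolls periodic power readings into daily energy totals and a simple PUE figure. The reading format, sample interval, rack names and values are all hypothetical; in practice these would come from your metering or DCIM system.

```python
# Sketch: roll up rack-level power readings into daily energy use and PUE.
# Reading format and sample values are illustrative assumptions.

from collections import defaultdict

def daily_energy_kwh(readings, interval_minutes=15):
    """Sum per-rack energy (kWh) from power samples (watts) taken at a fixed interval.

    readings: iterable of (rack_id, power_watts) pairs.
    """
    energy = defaultdict(float)
    hours = interval_minutes / 60.0
    for rack_id, watts in readings:
        energy[rack_id] += (watts / 1000.0) * hours
    return dict(energy)

def pue(total_facility_kwh, total_it_kwh):
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    return total_facility_kwh / total_it_kwh

if __name__ == "__main__":
    samples = [("rack-A1", 4200), ("rack-A2", 3900), ("rack-A1", 4350), ("rack-A2", 4010)]
    per_rack = daily_energy_kwh(samples)
    it_kwh = sum(per_rack.values())
    print(per_rack)
    # Facility total here is assumed at 1.6x IT load purely for the example.
    print(f"PUE: {pue(total_facility_kwh=it_kwh * 1.6, total_it_kwh=it_kwh):.2f}")
```

The point is not the arithmetic, which is trivial, but the habit: once daily rack-level numbers exist, trends and anomalies become obvious, and optimization decisions stop being guesswork.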

Perhaps the most critical thing to remember is that data center optimization is not a project you can complete and simply check off your list. It is an ongoing process, and one that requires constant vigilance.

Want to learn more about how you can enhance the performance of your data center? Compu Dynamics recently released a white paper titled “Essential Elements for Data Center Optimization.” It is a must-read for every government and enterprise data center operator.

To access this white paper, click here.

Stephen B. Altizer is president of Compu Dynamics, an American company that designs, builds and operates data centers for enterprise customers.