The Office of Management and Budget has updated the US government's Data Center Optimization Initiative (DCOI), the latest move in a long-running effort to reduce the federal government's sprawling web of data centers.
But the change, released this week, was critiqued by members of the Government Operations Subcommittee for allowing too much "wiggle room" for agencies to evade closure targets.
The new DCOI rules reduce the number of data centers tracked by the OMB, remove some recommendations, and change the targets agencies must meet.
"After eight years of work in consolidating and closing Federal data centers, OMB has seen diminishing returns from agency data center closures," the report, written by Suzette Kent, OMB's Federal Chief Information Officer, said.
"The Government has picked off much of the low-hanging fruit of easily-consolidated infrastructure. OMB now will focus on targeted improvements in key areas where agencies can make meaningful improvements and achieve further cost savings through optimization and closures, as well as driving further maturity in IT modernization."
The update imposes a development freeze on both new data centers and on significant expansions to existing facilities. Agencies that wish to build despite this moratorium "must submit a written justification that includes an analysis of alternatives, including opportunities for cloud services, shared services, and third party colocation."
The OMB also made changes to what information it will request from agencies, and most notably will "no longer require agencies to consolidate server closets, meet optimization targets, or include them in their inventory submissions." The removal of smaller facilities, defined by the body as non-tiered data centers, marks a huge change for the DCOI, and one that was criticized by the Government Operations Subcommittee (see below).
Various performance metrics were also changed, with the OMB updating how it prioritized DCIM use, virtualization, advanced energy metering and energy efficiency.
No more PUE targets
In its most significant change to metric tracking, the OMB said that it "will no longer set an overall target for PUE." Previously, the target power usage effectiveness - the ratio of the total energy used by a facility to the energy delivered to computing equipment - was set at 1.5 for existing facilities, and 1.4 for new builds.
But, the OMB states: "Differences in factors such as level of redundancy, geographic location, weather, time of year, and even building construction can have an impact on the measured PUE of a facility. For instance, an extremely efficient data center in a warmer part of the country can easily register a higher PUE than a less efficient data center in a colder climate, but agency mission may determine the location of that facility."
The OMB will still collect PUE data, but will not use it "in isolation as an indicator of good management practices."
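The PUE calculation behind the old targets is straightforward; a minimal sketch of the ratio described above (the figures below are illustrative, not from the OMB report):

```python
# Power usage effectiveness (PUE) = total facility energy / IT equipment energy.
# A PUE of 1.5 means 0.5 kW of overhead (cooling, power distribution,
# lighting) for every 1 kW delivered to computing equipment.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the facility's PUE; values approach 1.0 as overhead shrinks."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,200 kW in total to support an 800 kW IT load
# sits exactly at the old DCOI target for existing facilities:
print(pue(1200, 800))  # 1.5
```

As the OMB's caveat suggests, the same IT load in a hotter climate needs more cooling energy, inflating the numerator and hence the PUE, regardless of how well the site is run.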
In its stead, the OMB is introducing a new metric: Availability. "In the commercial space, the most critical element for an infrastructure provider is facility availability. Most service level agreements contain explicit discussion of service availability guarantees. At a minimum, the Federal Government should be prepared to deliver the comparable level of service as that provided by private sector data centers and cloud services."
OMB will require agencies to report the planned hours of availability for each data center, along with any unplanned outages for that data center.
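The report does not spell out a formula, but the standard way to turn those two reported figures into an availability percentage is (assumed here, not taken from the OMB document):

```python
# Availability = (planned hours - unplanned outage hours) / planned hours.
# This is the conventional uptime calculation; the OMB report only specifies
# the two inputs agencies must report, not the formula itself.

def availability(planned_hours: float, unplanned_outage_hours: float) -> float:
    """Return availability as a fraction of planned service hours delivered."""
    if planned_hours <= 0:
        raise ValueError("planned hours must be positive")
    return (planned_hours - unplanned_outage_hours) / planned_hours

# 8,760 planned hours a year with ~8.76 hours of unplanned downtime
# is the "three nines" level common in commercial SLAs:
print(f"{availability(8760, 8.76):.3%}")  # 99.900%
```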
With some of the targets for agency data center operators clashing - redundant equipment for increased availability may impact efficiency, for example - the OMB has set the priorities in the following order:

1) Consolidation and Closure
2) Energy Metering
3) Server Utilization
The OMB also noted that the amount of data gathered by the federal government will continue to grow, increasing agency compute demand. With this in mind, and many data centers already closed, the government "should not expect to see continued dramatic savings or large-scale closures from ongoing data center consolidation and optimization efforts as agency needs grow."
These changes to the DCOI were not welcomed by all, with a Government Operations Subcommittee hearing on the latest scorecard on compliance with FITARA raising several concerns.
Democratic Congressman Gerry Connolly, a co-sponsor of the initial 2014 legislation, chaired the hearing and was particularly vocal in his criticism of the changes.
"There is nothing more important to him than getting rid of data centers," Republican Congressman Mark Meadows warned at the hearing. "And if you're messing up on data centers, you're going to have a problem."
Speaking to Kent, a Trump administration appointee, Connolly opened the proceedings with a reference to the wider political whirlwind surrounding the current administration: "When you go back to OMB, you're going to be able to say 'I'm the one person in this White House who went to a hearing, and impeachment and subpoenas - nothing like that was discussed at all.'"
Of particular concern to Connolly was the removal of non-tiered data centers from OMB's tracking requirements; such facilities, he said, "make up about 80 percent of the government's facilities."
This was echoed by Carol Harris, director of IT management issues at the Government Accountability Office, who said: "If these changes are implemented as is, the committee will lose the ability to track and measure progress in this area since the initial scorecard, because the baseline for comparison will have changed. Moreover the changes will likely slow down or halt important progress agencies should be making to consolidate, optimize, and secure their data centers."
Connolly's other worry was that the update no longer appeared to prioritize consolidation. "Our concern is that when OMB gives guidance on optimization and exempts 80 percent of the data centers from specific inventory plans, you are skirting the intent of the law," he said. "The intent of the law was always to identify how many data centers we had - which was a struggle - and then cut them in half, and then cut them in half again.
"That was the goal, it was set by your predecessor in the early years of the Obama administration. In those days, we thought we had 1,600 [data centers]. So the goal by the administration was, initially, cut it to 800, and my bill said no, cut it to 400 - that is what we incorporated into FITARA. Of course, what we got really good at was identifying more - and so we didn't have 1,600, we had 12,000, then 14,000 [laughs]."
Kent countered that previous efforts to count non-tiered data centers were flawed: "There were things that had been included [such as] printers, weather stations and MRI machines - things that weren't actually classified as a data center. So... we are trying to address what actually operates as a data center and we intend to close them."
She added that, after talking to agencies, the OMB also came to the belief that there are sometimes reasons not to close data centers, particularly when they are supercomputing sites.
This mindset, Connolly said, allowed for circumvention and a dilution of the goal, adding "we're nervous 'optimization' gives a lot of wiggle room."
He said: "It's easy for somebody to say I have 3,420 of them and I need every one of them. Every one is precious. And we're not going to change a thing. And because you have used this weaker word optimization which doesn't really require me to do something specific. And so I know that's not your intent. But you hear my concern. And my experience is, sometimes you've got to give very clear direction and set very explicit metrics in order to accomplish something."
Kent retorted: "I think we were being extremely explicit."
The task ahead
While Kent pointed to the closure of 150 'enterprise' data centers in the past two years, and the increasing adoption of commercial cloud email from 45 percent to 72 percent - "that's 1.8m mailboxes," she said - the US government's IT infrastructure remains immense.
"We spend about $90 billion each year on IT," GAO's Harris said. "80 percent of that spend is on legacy IT - we need to focus on decreasing that number and reinvesting that money into modernizing our aging systems."
Congressman Meadows added: "It's probably up to $110/$120 billion when you count in some of the agencies we can't talk about. I'm amazed at how archaic our IT system is. I mean, we're spending more than any Fortune 500 company would spend on it."
Talking to assembled agency CIOs, he said: "We continue to spend operational money for Cobol and Fortran programmers and legacy systems, it is just mind boggling that we would do it and we continue to do it... so in terms of action items, if you would get back to this committee on what your plan is to get rid of legacy systems, and what is the cost to do it... I need a plan, and I guess the only frustration you will find is that the next FITARA hearing, if there is not a plan on how we're going to get rid of that, there's going to be a problem.
"I'm tired of hearing about it."