Data centers are an energy-intensive industry, but fears over the energy use of the cloud have been overstated, according to a meeting in Brussels last week.
Data usage is exploding, with more people consuming more video and online services than ever before, but IT energy use is not growing to match, industry bodies said at a meeting held during the EU’s Sustainable Energy Week, hosted by DigitalEurope, the umbrella body for European IT industry associations. After years of guesswork, experts said, information is now being gathered that contradicts fears that energy use by data centers and the cloud is out of control.
Colocation cuts costs
The UK’s colocation sector consumed 2.15TWh of electrical energy during 2015, according to data gathered by the government’s Climate Change Agreement (CCA), a scheme that gives tax breaks to colocation providers who sign up to improve their efficiency.
“We have robust data on the UK colocation market because everyone who is anyone in that market participates in an energy efficiency scheme (the Climate Change Agreement) that requires detailed data on energy use,” said Emma Fryer. She added that the smaller colocation providers in the country probably use less than 0.25TWh per year between them.
There are also enterprise data centers, but they are smaller, said Fryer. Even over-estimating the total at 6TWh, that is still less than 2 percent of the UK’s total electricity use. Given that the UK has Europe’s largest colocation sector by far, the figure will be smaller in other European countries.
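As a rough sanity check of that "2 percent" claim, the figures above can be compared against the UK's total electricity consumption. The 300TWh total used below is an assumed round figure for illustration, not a number from the article; the colocation figures are those reported.

```python
# Sanity check of the "less than 2 percent" over-estimate.
colocation_twh = 2.15       # UK colocation sector, 2015 (CCA data, per the article)
small_providers_twh = 0.25  # upper bound for smaller providers, per Fryer
overestimate_twh = 6.0      # deliberate over-estimate including enterprise data centers

uk_total_twh = 300.0        # ASSUMED round figure for UK annual electricity use

share = overestimate_twh / uk_total_twh
print(f"Over-estimated data center share: {share:.1%}")
```

Even with the deliberately generous 6TWh figure, the share comes out around 2 percent of an assumed 300TWh national total, which is consistent with the claim.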
Figures from Sweden show lower energy usage, again coupled with a massive expansion in the data stored and carried, according to Sweden’s Jens Malmodin. While data traffic has expanded exponentially, the power used by data centers has leveled off, and the demand from end-user equipment has actually shrunk, as people have moved from large screens to smaller devices such as tablets and phones.
Energy usage grew rapidly in the years before 2008, and then fell because of the recession brought on by the financial crisis. The current fall in energy use is down to improved technology, where servers can handle far more work for the same amount of electrical energy, and greater efficiency, as workloads move from standalone data centers into colocation spaces and the cloud, where resources can be used more effectively.
Server virtualization and the cloud have promised efficiency savings for a long time, but fresh data is showing that these savings are real, said Steve Strutt, cloud advisor for IBM.