

Balancing the data vs. power equation


Data is growing faster than ever before.

According to EMC, by 2020 about 1.7 megabytes of new information will be created every second for every human being on the planet. That equates to a digital universe of around 44 zettabytes (or 44 trillion gigabytes) of data – by comparison, a typical high-end computer today holds just 8GB of memory.
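As a quick sanity check on those units (the figures are EMC's; the 8GB comparison point comes from the paragraph above), the conversion works out as follows:

```python
# Unit conversion for the "digital universe" estimate quoted above.
# Decimal (SI) units: 1 ZB = 10^21 bytes, 1 GB = 10^9 bytes.
ZB = 10**21
GB = 10**9

digital_universe_bytes = 44 * ZB
digital_universe_gb = digital_universe_bytes // GB  # 44 trillion GB

# How many 8GB machines would it take to hold it all?
machines_needed = digital_universe_gb // 8  # 5.5 trillion computers

print(f"{digital_universe_gb:,} GB")
print(f"{machines_needed:,} machines")
```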

And it’s not just the volume of data that’s growing at an alarming rate; the need to analyse all this information is following suit. In today’s data-driven economy, business innovation depends on capturing and analysing data insights in real time, when they are most valuable, making quick access to this information more important than ever.

Yet, businesses are quickly finding that the power required to support their high-performance computing (HPC) clouds, clusters, and supercomputers isn’t growing at the same pace as the need to process, store, access, and gain insight from the vast amounts of data they house.

The power problem

Power grids have become a problematic stumbling block for companies that want to expand their HPC, Big Data, and other compute-intensive programmes. Under increasing strain, the world’s ageing power infrastructure is unable to keep up with the electricity demand in many developed countries.

The UK power grid, for example, is extremely vulnerable. In May, the National Grid issued its first summer-time ‘Notification of Inadequate System Margin’ (NISM) alert in eight years – indicating that even in warm weather, when power demand is typically lower, the UK grid is suffering.

The reason: reserve power margins in the UK are expected to fall below four per cent for the next 12 months, making it a realistic possibility that data centre operators or businesses looking to locate resources within the region simply will not have the power available to do so. In short, there won’t be enough power to go around – meaning outages will be more common.

The situation across the pond is no brighter. The American Society of Civil Engineers gave the US power grids a D+ rating (labelling infrastructure “mostly below standard” with “a strong risk of failure”) in its latest evaluation, noting that it would take an investment of $17 billion to bring them up to international standards. Perhaps even more shocking is the fact that many of the US grids in use today were designed by Thomas Edison and his contemporaries nearly 150 years ago.


Affording the risk

While power supplies will remain limited – delivered by ageing and increasingly fragile grids – accelerating amounts of data will continue to increase electricity demands on data centres. If the grid goes down, so too can data centres – and this can have an immediate impact for many companies.

Last year the UK suffered 640 data centre outages, up 23.5% from 2014 and 84% from 2010. These outages lasted an average of 50 minutes each – a combined total of 32,032 minutes’ (more than 22 days’) downtime – and affected over 2.5 million people.

But network outages like these are far more than a brief inconvenience for those involved – the cost to business can be crippling. Taking into account process-related expenditures – such as detection, containment and recovery – as well as the lost opportunity costs associated with these outages, a recent report by Emerson put the average cost of an outage at £6,000 per minute.
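Putting the UK outage statistics and Emerson's per-minute figure together gives a sense of scale (illustrative arithmetic only, using the numbers quoted above):

```python
# Rough cost arithmetic from the UK outage figures and Emerson's estimate above.
outages = 640                 # UK data centre outages last year
total_downtime_min = 32_032   # combined downtime in minutes
cost_per_minute_gbp = 6_000   # Emerson's average outage cost per minute

avg_outage_min = total_downtime_min / outages                # ~50 minutes each
avg_outage_cost_gbp = avg_outage_min * cost_per_minute_gbp   # ~£300,000 per outage
total_cost_gbp = total_downtime_min * cost_per_minute_gbp    # ~£192 million in total

print(f"average outage: {avg_outage_min:.1f} min, ~£{avg_outage_cost_gbp:,.0f}")
print(f"annual total: £{total_cost_gbp:,.0f}")
```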

With figures like these, it’s clear that no company can afford to risk the revenue and economic loss of brand reputation, customer churn, and lost business opportunities.

Location, location, location

Now, thanks to the cloud, companies no longer need to keep their data on-premises. Information can be housed in locations around the globe with little (if any) impact on latency or security. This means businesses can release their data from poor-performing (and expensive) grids and take advantage of some of the world’s most reliable energy infrastructure, fuelled by abundant, clean, and affordable power.

For compute-intensive applications, regions with hydroelectric and geothermal energy are optimal, so it’s no coincidence that the industry has seen a steady migration of data centres to countries like Iceland, Norway, Sweden, and the Province of Quebec over the past few years.

To put the difference in context: while the UK grid runs at 96 per cent capacity, in Iceland – thanks to its sub-350,000 population and long-term focus on sustainable power resources – that figure drops to just 10 per cent. That leaves far more spare power capacity than any data centre operator could possibly need.
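The gap in spare capacity implied by those two utilisation figures is worth spelling out (a simple illustration of the numbers above, not a grid model):

```python
# Spare grid capacity implied by the utilisation figures quoted above.
uk_utilisation = 0.96       # UK grid running near full capacity
iceland_utilisation = 0.10  # Iceland's grid utilisation

uk_headroom = 1 - uk_utilisation            # ~4% spare capacity
iceland_headroom = 1 - iceland_utilisation  # ~90% spare capacity

# Relative headroom: Iceland's grid has roughly 22x the UK's spare margin.
ratio = iceland_headroom / uk_headroom
print(f"Iceland has ~{ratio:.1f}x the UK's relative spare capacity")
```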

For this reason, these near-Arctic locations may soon become the pre-eminent destinations for the next generation of high-performance data centres, which will increasingly connect our world and add enormous value to the global economy.

Heading north may just be the solution to balancing the data versus power equation.

Jorge Balcells is director of Technical Services at Verne Global.
