It was good to hear more about Microsoft’s underwater data center recently. It provoked a lot of interest earlier in the year when the results of the first test were announced, and the project’s leader Ben Cutler gave plenty more details at the DCD Enterprise event in New York.
Cutler made it clear that Project Natick was not (just) a science experiment. Last year, Microsoft put servers in a thick steel vessel and operated them from the bottom of the sea. Now it's planning to do it again with more servers, and a commercial roll-out is a real possibility.
It’s not really about the water
So far, the headlines have focused on the use of sea water to cool the IT systems, and the possibility of using tidal energy to power them. These are important attacks on the problems of data center power and waste energy.
But the real breakthrough here is not what the system uses, but what it excludes. The whole idea would be completely impossible if it didn't take an extreme approach that simply excludes the most troublesome environmental factor in a data center: human beings.
A large part of the cost of building a data center goes on walls and floors for halls that can accommodate engineers. And data center technology has reached a stage where the needs of servers and people are starting to diverge.
Humans need oxygen and humidity, which can cause corrosion and electrical problems in servers. Humans like to operate at around 18C (64F), while servers can operate safely well above that. Webscale companies have successfully gone up to 30C (86F), and every degree warmer can save around four percent on energy bills.
The only thing stopping data centers getting hotter is the need to let people in to work on them.
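To see what that four-percent-per-degree figure implies, here is a rough back-of-envelope sketch. The compounding model (each extra degree saves ~4% of the remaining cooling bill) is an assumption for illustration, not a claim from the article:

```python
# Back-of-envelope estimate: cumulative cooling-energy savings if each
# extra degree Celsius of operating temperature saves ~4% of the bill.
# The compounding model is an illustrative assumption.

def cooling_savings(base_temp_c: float, new_temp_c: float,
                    saving_per_degree: float = 0.04) -> float:
    """Fraction of cooling energy saved by raising the setpoint."""
    degrees = new_temp_c - base_temp_c
    return 1 - (1 - saving_per_degree) ** degrees

# Raising the hall from 18C to the 30C webscale operators have reached:
print(f"{cooling_savings(18, 30):.1%}")  # roughly 39%
```

Under these assumptions, the jump from a human-comfortable 18C to a servers-only 30C cuts the cooling bill by well over a third, which is why excluding people matters so much.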
Lights out, people out
There is a move towards lights-out facilities, and Cutler says Microsoft has sites that are only visited once a year. Take that a little further, and you can envisage a submerged unit, which gets hauled up from the sea bed every two years for an upgrade.
Unattended operation is the only way to go if your data center is 600ft down, where scuba divers cannot reach it. And if no-one is going in, who needs oxygen? Natick breathes pure nitrogen, with no dust.
Hardware needs will also fit this change. As Moore's Law winds down, servers will be refreshed less often, and we'll move to a model where more reliable hardware is kept in use for longer. There will be less need to nurture the tech.
Consider telephone exchanges, the facilities which spread across the world a century ago. These started out with human intervention on every call, but gradually became facilities where people had little to do, and then were excluded altogether.
Today’s facilities are early versions of what will ultimately handle the world’s electronic nervous system, a vast undertaking which can only be achieved by more minimalist structures.
The limited number of people available to do the work, and the expense of engineers, also argue for taking people out of the data center.
A version of this article appeared on Green Data Center News