

Microsoft serves Azure from the sea bed

Project Natick puts servers in a rack in a can at the bottom of the deep blue sea

Microsoft Research has deployed a small data center in a watertight container at the bottom of the ocean - and even delivered its Azure cloud from the sea bed.

Underwater data centers can be cooled by the surrounding water, and could also be powered by wave or tidal energy, says the Project Natick group at Microsoft Research, which deployed an experimental underwater data center one kilometer off the Pacific coast of the US between August and November of 2015.  


Leona Philpot after three months underwater 

Source: Microsoft

Azure sea  

The project aims to show it is possible to quickly deploy “edge” data centers in the sea, close to people who live near the coast: “Project Natick is focused on a cloud future that can help better serve customers in areas which are near large bodies of water,” says the project site, noting that half the world’s population lives near coastlines.

“The vision of operating containerized datacenters offshore near major population centers anticipates a highly interactive future requiring data resources located close to users,” the project explains. “Deepwater deployment offers ready access to cooling, renewable power sources, and a controlled environment.”
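The case for “data resources located close to users” comes down to propagation delay over fiber. A back-of-envelope sketch, with all distances chosen for illustration (not Natick figures):

```python
# Back-of-envelope: latency benefit of coastal "edge" deployment.
# Light in optical fiber travels at roughly 2/3 of c, i.e. ~200,000 km/s.
# Distances below are hypothetical assumptions, not Project Natick figures.

FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s expressed as km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Idealized round-trip propagation delay over fiber (no routing overhead)."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

inland_dc_km = 2000   # hypothetical distant inland data center
coastal_dc_km = 100   # hypothetical offshore unit near a coastal city

saving = round_trip_ms(inland_dc_km) - round_trip_ms(coastal_dc_km)
print(f"Propagation RTT saved: {saving:.0f} ms")  # ~19 ms on these assumptions
```

Real-world round trips are longer (routing, queuing, indirect paths), so the propagation figure is a lower bound on the potential saving.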

The group put one server rack inside a cylindrical vessel, named Leona Philpot after a character in the Halo Xbox game. The eight-foot (2.4m) diameter cylinder was filled with unreactive nitrogen gas, and sat on the sea bed 30 ft (9m) down, connected to land by a fiber optic cable.

According to a report in the New York Times, the vessel was loaded with sensors to prove that the servers would keep working, that the vessel wouldn’t leak, and that it would not affect local marine life. These aims were achieved, and the group went on to deploy commercial Azure web loads on the system.

The sea provided passive cooling, but further tests this year might place a vessel near hydroelectric power sources off the coast of Florida or Europe.
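As a rough illustration of why seawater is attractive as a coolant, the flow needed to carry away a given IT load follows from Q = ṁ·c_p·ΔT. The load and temperature rise below are assumed for illustration only, not Natick specifications:

```python
# Rough estimate of seawater flow needed to absorb server heat.
# Assumed figures (illustrative, not from Project Natick): 100 kW of IT
# load, with seawater warming by 5 degrees C as it passes the vessel.

C_P_SEAWATER = 3.99e3   # J/(kg*K), approximate specific heat of seawater
DENSITY = 1025.0        # kg/m^3, typical seawater density

def required_flow_lps(heat_w: float, delta_t_k: float) -> float:
    """Volume flow (litres per second) needed to absorb heat_w with a
    delta_t_k temperature rise, from Q = mass_flow * c_p * delta_T."""
    mass_flow = heat_w / (C_P_SEAWATER * delta_t_k)  # kg/s
    return mass_flow / DENSITY * 1000                # litres per second

print(f"{required_flow_lps(100e3, 5.0):.1f} L/s")   # ~4.9 L/s
```

A few litres per second of ambient seawater is a trivial flow at the bottom of the ocean, which is the core of the passive-cooling argument.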


Source: Microsoft

The Natick team (left to right): Eric Peterson, Spencer Fowers, Norm Whitaker, Ben Cutler, Jeff Kramer

From concept to seabed

The project had its origins in a 2013 white paper, which won backing in 2014; a year later the team deployed Leona Philpot. This version of the concept uses standard computer hardware, but if it reaches wider deployment, the equipment would most likely be redesigned for the demands of underwater operation. Most importantly, it would have to run for years without any physical attention, which the team believes could become the norm.

“With the end of Moore’s Law, the cadence at which servers are refreshed with new and improved hardware in the datacenter is likely to slow significantly,” says the project site. ”We see this as an opportunity to field long-lived, resilient datacenters that operate ‘lights out’ – nobody on site – with very high reliability for the entire life of the deployment, possibly as long as ten years.”

The first commercial version - if one is ever made - would have a 20-year lifespan and would be restocked with fresh equipment every five years, the current useful lifespan for data center hardware.


Putting the vessel together

Source: Microsoft

Mass produced underwater data centers could be deployed in 90 days, and would most likely have a redesigned rack system to make better use of an unattended cylindrical space. Widely-deployed Natick units would be made from recycled material, says the group, also making the point that using the water for energy and cooling would effectively make this a “zero emission” site as the energy taken from the environment would be returned there. 

But will it be commercialized? “Project Natick is currently at the research stage,” the group comments. “It’s still early days in evaluating whether this concept could be adopted by Microsoft and other cloud service providers.”

Update: Today, Microsoft published a long blog post about Project Natick, along with a video, embedded below.

Readers' comments (4)

  • George Rockett

    How very Jules Verne. Interesting to see Microsoft getting the more outlandish data center innovation upper hand over Google. Maybe worth a feature in the next magazine comparing some of these crazy ideas and how they have helped shift thinking on infrastructure forward.


  • I can't see how this makes sense. The costs of deploying and retrieving these systems at the end of their lives will alone outweigh the benefits from passive cooling. Surely they are not planning to just litter the seabed with these things?


  • Given that we are already seeing rising sea levels owing to the rapid melting of the poles which in large is attributed to rising ocean temperatures, how is the deployment of such Data centers going to help the cause of reducing the footprint or doing good to the environment...?

    Really wonder who comes up with these, with no thought of what comes next...


  • An undersea cloud? I don't get it - why not just suck the cooling water out of the ocean in a pipe?


