

Microsoft serves Azure from the sea bed

Project Natick puts servers in a rack in a can at the bottom of the deep blue sea

Microsoft Research has deployed a small data center in a watertight container at the bottom of the ocean - and even delivered its Azure cloud from the sea bed.

Underwater data centers can be cooled by the surrounding water, and could also be powered by wave or tidal energy, says the Project Natick group at Microsoft Research, which deployed an experimental underwater data center one kilometer off the Pacific coast of the US between August and November of 2015.  


Leona Philpot after three months underwater 

Source: Microsoft

Azure sea  

The project aims to show it is possible to quickly deploy “edge” data centers in the sea, close to people who live near the coast: “Project Natick is focused on a cloud future that can help better serve customers in areas which are near large bodies of water,” says the site, pointing out that half the world’s population lives near coastlines.

“The vision of operating containerized datacenters offshore near major population centers anticipates a highly interactive future requiring data resources located close to users,” the project explains. “Deepwater deployment offers ready access to cooling, renewable power sources, and a controlled environment.”

The group put one server rack inside a cylindrical vessel, named Leona Philpot after a character in the Halo Xbox game. The eight-foot (2.4m) diameter cylinder was filled with unreactive nitrogen gas, and sat on the sea bed 30 ft (9m) down, connected to land by a fiber optic cable.

According to a report in the New York Times, the vessel was loaded with sensors to prove that the servers would keep working, that the vessel wouldn’t leak, and that the deployment would not affect local marine life. These aims were achieved, and the group went ahead and deployed commercial Azure web loads on the system.

The sea provided passive cooling, but further tests this year might place a vessel near hydroelectric power sources off the coast of Florida or Europe.


Source: Microsoft

The Natick Team: Eric Peterson, Spencer Fowers, Norm Whitaker, Ben Cutler, Jeff Kramer. (left to right)

From concept to seabed

The project had its origins in a 2013 white paper, which got backing in 2014; a year later, Leona Philpot was deployed. This version of the concept uses current computer hardware, but if it makes it into wider deployment, the equipment would most likely be redesigned to fit the requirements of underwater operation. Most importantly, the equipment would have to run for years without any physical attention, but the team thinks this could become the norm in future.

“With the end of Moore’s Law, the cadence at which servers are refreshed with new and improved hardware in the datacenter is likely to slow significantly,” says the project site. “We see this as an opportunity to field long-lived, resilient datacenters that operate ‘lights out’ – nobody on site – with very high reliability for the entire life of the deployment, possibly as long as ten years.”

The first commercial version - if one is made - would have a 20-year lifespan, and would be restocked with equipment every five years, as that’s the current useful lifespan for data center hardware.
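That schedule can be sketched out with a little arithmetic (a minimal illustration, assuming the refresh falls exactly on the five-year marks, with no mid-cycle swaps):

```python
# Rough sketch of the refresh schedule described above: a 20-year
# vessel lifespan with equipment restocked every five years.
VESSEL_LIFESPAN_YEARS = 20
REFRESH_INTERVAL_YEARS = 5

# Refresh visits fall at years 5, 10 and 15; at year 20 the vessel
# itself reaches end of life rather than being restocked again.
refresh_years = list(range(REFRESH_INTERVAL_YEARS,
                           VESSEL_LIFESPAN_YEARS,
                           REFRESH_INTERVAL_YEARS))
hardware_generations = len(refresh_years) + 1  # initial fit-out plus each refresh

print(refresh_years)         # [5, 10, 15]
print(hardware_generations)  # 4
```

In other words, one sunken vessel would host four generations of server hardware over its life, with a surfacing-and-restock operation every five years.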


Putting the vessel together

Source: Microsoft

Mass-produced underwater data centers could be deployed in 90 days, and would most likely have a redesigned rack system to make better use of an unattended cylindrical space. Widely deployed Natick units would be made from recycled material, says the group, which also points out that using the water for energy and cooling would effectively make this a “zero emission” site, as the energy taken from the environment would be returned there.

But will it be commercialized? “Project Natick is currently at the research stage,” the group comments. “It’s still early days in evaluating whether this concept could be adopted by Microsoft and other cloud service providers.”

Update: Microsoft has today published a long blog post about Project Natick, and a video, embedded below.

Readers' comments (4)

  • George Rockett

    How very Jules Verne. Interesting to see Microsoft getting the more outlandish data center innovation upper hand over Google. Maybe worth a feature in the next magazine comparing some of these crazy ideas and how they have helped shift thinking on infrastructure forward.


  • I can't see how this makes sense. The costs of deploying and retrieving these systems at the end of their lives will alone outweigh the benefits from passive cooling. Surely they are not planning to just litter the seabed with these things?


  • Given that we are already seeing rising sea levels owing to the rapid melting of the poles which in large is attributed to rising ocean temperatures, how is the deployment of such Data centers going to help the cause of reducing the footprint or doing good to the environment...?

    Really wonder who comes up with these, with no thought of what comes next...


  • An undersea cloud? I don't get it - why not just suck the cooling water out of the ocean in a pipe?




