
Microsoft serves Azure from the sea bed

Project Natick puts servers in a rack in a can at the bottom of the deep blue sea

Microsoft Research has deployed a small data center in a watertight container at the bottom of the ocean - and even delivered its Azure cloud from the sea bed. 

Underwater data centers can be cooled by the surrounding water, and could also be powered by wave or tidal energy, says the Project Natick group at Microsoft Research, which deployed an experimental underwater data center one kilometer off the Pacific coast of the US between August and November of 2015.  


Leona Philpot after three months underwater 

Source: Microsoft

Azure sea  

The project aims to show it is possible to quickly deploy “edge” data centers in the sea, close to people who live near the coast: “Project Natick is focused on a cloud future that can help better serve customers in areas which are near large bodies of water,” says the site, pointing out that half the world’s population lives near coastlines.

“The vision of operating containerized datacenters offshore near major population centers anticipates a highly interactive future requiring data resources located close to users,” the project explains. “Deepwater deployment offers ready access to cooling, renewable power sources, and a controlled environment.”

The group put one server rack inside a cylindrical vessel, named Leona Philpot after a character in the Halo Xbox game. The eight-foot (2.4m) diameter cylinder was filled with unreactive nitrogen gas, and sat on the sea bed 30 ft (9m) down, connected to land by a fiber optic cable.

According to a report in the New York Times, the vessel was loaded with sensors to prove that the servers would continue working, that the vessel wouldn’t leak, and that it would not affect local marine life. These aims were achieved, and the group went ahead and deployed commercial Azure web loads on the system. 

The sea provided passive cooling, but further tests this year might place a vessel near hydroelectric power sources off the coast of Florida or Europe.  


Source: Microsoft

The Natick Team: Eric Peterson, Spencer Fowers, Norm Whitaker, Ben Cutler, Jeff Kramer. (left to right)

From concept to seabed

The project had its origins in a 2013 white paper, which got backing in 2014 and, a year later, deployed Leona Philpot. This version of the concept uses current computer hardware, but if it makes it into wider deployment, the equipment would most likely be redesigned to fit the requirements of underwater operation. Most importantly, the equipment would have to run for years without any physical attention, but the team thinks this could be the norm in future.

“With the end of Moore’s Law, the cadence at which servers are refreshed with new and improved hardware in the datacenter is likely to slow significantly,” says the project site. “We see this as an opportunity to field long-lived, resilient datacenters that operate ‘lights out’ – nobody on site – with very high reliability for the entire life of the deployment, possibly as long as ten years.”

The first commercial version - if a commercial version is made - would have a 20-year lifespan, and would be restocked with equipment every five years, as that’s the current useful lifespan for data center hardware. 


Putting the vessel together

Source: Microsoft

Mass produced underwater data centers could be deployed in 90 days, and would most likely have a redesigned rack system to make better use of an unattended cylindrical space. Widely-deployed Natick units would be made from recycled material, says the group, also making the point that using the water for energy and cooling would effectively make this a “zero emission” site as the energy taken from the environment would be returned there. 

But will it be commercialized? “Project Natick is currently at the research stage,” the group comments. “It’s still early days in evaluating whether this concept could be adopted by Microsoft and other cloud service providers.” 

Update: Today, Microsoft published a long blog post about Project Natick, and a video, embedded below. 

Readers' comments (4)

  • George Rockett

    How very Jules Verne. Interesting to see Microsoft getting the more outlandish data center innovation upper hand over Google. Maybe worth a feature in the next magazine comparing some of these crazy ideas and how they have helped shift thinking on infrastructure forward.


  • I can't see how this makes sense. The costs of deploying and retrieving these systems at the end of their lives will alone outweigh the benefits from passive cooling. Surely they are not planning to just litter the seabed with these things?


  • Given that we are already seeing rising sea levels owing to the rapid melting of the poles which in large is attributed to rising ocean temperatures, how is the deployment of such Data centers going to help the cause of reducing the footprint or doing good to the environment...?

    Really wonder who comes up with these, with no thought of what comes next...


  • An undersea cloud? I don't get it - why not just suck the cooling water out of the ocean in a pipe?

