Microsoft’s Project Natick, which tested a data center on the ocean floor, is not a one-off engineering folly, but a real program now proceeding to the next steps towards commercialization, the project’s leader told a New York conference this week.

The original test put one rack in a sealed steel container and operated it in shallow (9m) waters off the Pacific coast of the US for three months, tethered to shore by power and fiber. Follow-up projects will use larger racks, place them up to 200m (600ft) below sea level, and develop the use of renewable energy from tides and currents, Microsoft program manager Ben Cutler told the DCD Enterprise event in New York.

Project Natick – Microsoft

Race to the bottom?

Although the ocean is seen as inaccessible, half the world’s population lives within 90 miles of the coast, said Cutler. And the difficulty of maintaining the unit would not be a problem: this would be a more extreme version of Microsoft’s current “lights-out” data centers, some of which operate unattended for a year, he said.

Production units developed from Natick would be designed to operate unattended for two years, at which point they would be hauled up from the sea bed and fitted with pre-assembled replacement racks.

Although Natick’s 3/4-inch steel shell looks exotic, the components inside are mostly standard: a regular rack, modified slightly to attach to the shell, and standard cooling units that interface with heat exchangers on the outside.

Ben Cutler, Microsoft Natick – DCD / Peter Judge

One major worry about Natick was that these units might suffer “bio-fouling”, becoming encrusted with organisms and eventually barnacles, but the test ruled that out. Last year’s project used rigid heat exchangers designed for boat keels, and was deployed in shallow waters where high levels of light and nutrients increased the risk of fouling. The next test would most likely sit at 600ft, in darker and less fertile waters, and would use a flexible “spaghetti-like” heat exchanger designed by the Microsoft team, which is less likely to be colonized.

Cutler revealed that the motivation for Natick when it started in 2014, as the Snowden revelations were being made public, was to satisfy calls for data to be located in different jurisdictions, a demand that intensified with the collapse of the Safe Harbor agreement. Rights to deploy equipment in coastal waters are actually far easier to negotiate, and the terms more uniform, than on land, he said. The other main motivation was to allow faster deployment.

“The ocean is more of a standard place,” he said. “It’s more consistent, both physically, and in the laws in the ocean, which are more consistent.” Temperatures at the ocean floor are rarely over 59°F (15°C), even off the coast of Florida, and storms and currents create little motion on the sea bed.

A future generation of Natick might have eight pods and hold a total of 100 racks on a sea bed 600ft deep. Power distribution would be simplified if the energy were generated locally at the right voltage for the equipment, and a 40kW unit could operate at a PUE of 1.1, he said. The surrounding water would also shield the equipment from any adverse electromagnetic effects.
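For context, PUE (power usage effectiveness) is the ratio of total facility power to the power delivered to the IT equipment. As a rough illustration only (the breakdown is an assumption, not a figure Microsoft has published, and it reads the 40kW as the IT load): a pod feeding 40kW to its racks at a PUE of 1.1 would draw about 44kW in total, losing only around 4kW to cooling and power conversion.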

The potential for renewable energy from water currents is high: the Gulf Stream alone could generate more than the entire power demand of the US, said Cutler.

Inaccessibility will become less of an issue as servers reach the end of Moore’s Law, he added. Once the processing power of new systems no longer doubles every two years, there is less to gain from swapping hardware out, and it makes sense to leave servers installed on the sea bed for years.