Chip startup Cerebras Systems is leasing space at Nautilus Data Technologies’ floating barge data center in California.

Nautilus recently announced that it secured a 2.5MW agreement with Cerebras late last year to bring the chip firm's AI hardware into Nautilus’ Stockton data center.

The Stockton data center – Nautilus Data Technologies

“It was Nautilus’ ability to scale its services to meet Cerebras’ 30kW per rack criteria quickly (within 60 days) and without reservation that sealed the deal and allowed Cerebras to begin offering its AI services faster than anyone could have imagined,” Nautilus said this week.

Nautilus first announced late last year that an unnamed ‘AI company’ had signed on to lease capacity at the Stockton site.

“In so many ways, the rapid jump in AI truly validates our approach and business model. We designed for high-density AI applications as part of the EcoCore platform. It’s all about addressing the future need in the most efficient, scalable, and sustainable way. That’s what our Stockton data center is about and what our EcoCore solution promises to do for data center operators worldwide. Everyone else is playing catch up,” said Patrick Quirk, Nautilus CTO.

Nautilus is best known for its floating barge data center concept, but is also trying to push into near-water data center deployments. Its colocation facilities are cooled by the body of water they float on or sit beside: water drawn through an open loop cools purified water in a secondary closed loop that runs through the data halls.

The company currently operates only one floating barge, in Stockton, California, but plans to build a land-based facility in Maine. It also has more water-borne projects in the works in the US, as well as in Ireland and mainland Europe, and has MoUs to explore business opportunities in Thailand and the Philippines.

“Our Cerebras WSE-3, the world’s fastest AI chip which powers our CS-3 AI supercomputers, requires cutting-edge technology to perform at its best. Nautilus provides exactly that with their advanced data center infrastructure and zero water consumption cooling. They enable us to push the boundaries of AI while staying true to our sustainability goals,” added Dhiraj Mallick, COO of Cerebras.

California-based Cerebras develops wafer-sized semiconductors. Its latest Wafer Scale Engine 3 boasts four trillion transistors and 900,000 'AI cores,' alongside 44GB of on-chip SRAM.

Packaged in the Cerebras CS-3 system, the chip is claimed to be capable of 125 peak AI petaflops. Up to 2,048 CS-3s can be linked in a single cluster, for a purported 256 exaflops of AI compute.
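
The cluster-level figure is simply the per-system number multiplied out; a minimal sketch of that arithmetic, assuming the 125 petaflops per CS-3 and 2,048-system ceiling quoted above:

```python
# Back-of-the-envelope check of the cluster figure quoted above,
# assuming 125 peak AI petaflops per CS-3 and a maximum cluster
# size of 2,048 systems (both taken from Cerebras' own claims).

PETAFLOPS_PER_CS3 = 125        # claimed peak AI petaflops per CS-3
MAX_CS3_PER_CLUSTER = 2_048    # claimed maximum systems in one cluster

cluster_petaflops = PETAFLOPS_PER_CS3 * MAX_CS3_PER_CLUSTER
cluster_exaflops = cluster_petaflops / 1_000   # 1 exaflop = 1,000 petaflops

print(f"{cluster_petaflops:,} petaflops = {cluster_exaflops:.0f} exaflops")
# -> 256,000 petaflops = 256 exaflops, matching the headline claim
```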

Cerebras’ first Condor Galaxy supercomputer, Condor Galaxy 1, went live at Colovore’s liquid-cooled data center in Santa Clara, California, in 2023. The company is building eight more in partnership with UAE-based G42.

The company said two more systems will come online at data centers in Austin, Texas, and Asheville, North Carolina, in the first half of 2024, with a further six planned for later in the year.

Nautilus’ 7MW data center can support 20kW+ racks with a combination of rear-door heat exchangers and direct liquid cooling. Backblaze contracted for 1MW at the site in 2022.