SambaNova has announced its next-generation DataScale system, the DataScale SN30.

DataScale is an integrated hardware-software system that enables companies to run artificial intelligence, deep learning, and foundation model workloads.

“The new DataScale SN30 system achieves world record-breaking performance when compared to the latest DGX A100 systems,” said Marshall Choy, SVP of Product at SambaNova Systems. “With this release, SambaNova is also offering subscription pricing for DataScale and Dataflow-as-a-Service, enabling organizations to achieve ROI faster, reduce risk, and scale more cost-effectively than with any other AI infrastructure.”


Each rack system can hold up to three DataScale SN30-8 systems, each of which packs eight of the company's custom Cardinal SN30 RDUs, 8TB of total memory, and a high-performance 400/200GbE data switch. The company claims the new-generation DataScale system offers GPT training six times faster than the Nvidia DGX A100 system (on a 13-billion-parameter GPT model), along with 12.8 times the memory capacity.
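The 12.8x memory figure is consistent with a simple back-of-the-envelope calculation, assuming (the article does not say) that the comparison is against the 640GB DGX A100 configuration (8 x 80GB A100 GPUs) and that the SN30-8's 8TB is counted in binary units:

```python
# Sanity check on the claimed 12.8x memory-capacity multiplier.
# Assumptions (not stated in the article): DGX A100 in its largest
# 640GB configuration, and "8TB" meaning 8 x 1024 GB.
sn30_memory_gb = 8 * 1024      # 8TB total memory per DataScale SN30-8
dgx_a100_memory_gb = 8 * 80    # 8 x 80GB A100 GPUs = 640GB

ratio = sn30_memory_gb / dgx_a100_memory_gb
print(f"{ratio:.1f}x")  # 12.8x
```

Under those assumptions, 8,192GB divided by 640GB works out to exactly 12.8.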

Several organizations have committed to deploying the next-generation DataScale system, including Livermore Computing, OTP Group, and Argonne National Laboratory.

“The Argonne Leadership Computing Facility (ALCF) is deploying a multirack system of the next generation of SambaNova’s DataScale systems in the ALCF AI test bed, which provides an infrastructure for the next-generation of AI-accelerator machines,” said Rick Stevens, associate laboratory director and Argonne distinguished fellow at the US Department of Energy’s Argonne National Laboratory.

“Scientists at Argonne and other research institutions will test this next-generation platform for a variety of use cases, including large language models like GPT for gene generation, 3D convolutional networks for neutrino physics, and prediction of tumor response to single and paired drugs. We expect a two to six times performance increase across a broad spectrum of applications.”

Last year, OTP Bank announced its intention to partner with SambaNova to build the ‘fastest AI supercomputer in Europe.’ The supercomputer is set to be built on SambaNova’s DataScale and Dataflow-as-a-Service offerings.

Founded in 2017, Palo Alto-based SambaNova has raised more than $1 billion from investors including Google Ventures, Intel, SoftBank, and Singaporean wealth fund GIC. Co-founder Christopher Ré previously founded data company Lattice, which was acquired by Apple in 2017.

The company launched its previous system iteration, the DataScale SN10-8, around 2020.

According to EETimes, a “foundation model” is a large language model that can be trained on huge amounts of diverse data to perform language-based tasks such as question answering, summarization, and sentiment analysis.
