Cerebras Systems and Aleph Alpha are teaming up to develop sovereign artificial intelligence (AI) solutions.
The two will first focus on creating generative AI models trained to a high degree of accuracy. The work will be carried out alongside BWI GmbH, the German public IT services provider owned by the country's Ministry of Defence, and the resulting models will be used by the German Armed Forces.
“This partnership with Cerebras marks Aleph Alpha’s commitment to building a new class of compute-efficient AI models that are sovereign and trustworthy for use by governments and citizens,” said Jonas Andrulis, founder and CEO of Aleph Alpha. “We chose Cerebras because of their world-class AI expertise and peerless wafer-scale technology that enables us to train state-of-the-art AI models with high efficiency.”
Aleph Alpha will also be the first European organization to deploy the Cerebras CS-3 AI supercomputer at its data center, Alpha ONE.
Alpha ONE is used for the research and development of AI applications for both the public sector and private enterprises. Here, the two companies will create foundation models, multimodal models, and new model architectures.
“We are honored that Aleph Alpha selected us to advance state-of-the-art AI using our industry-leading hardware and ML expertise,” said Andrew Feldman, CEO of Cerebras. “With Aleph Alpha’s impressive track record of creating sovereign and secure AI solutions, we believe this new partnership will produce a new class of AI model architectures that governments and enterprises worldwide can benefit from, in a safe and secure way.”
The Alpha ONE data center was launched in 2022 and is located at the GovTech Campus in Germany.
Aleph Alpha was founded in 2019 with a team of international scientists, engineers, and researchers to develop generative AI-enabling technology.
Cerebras is a chip company known for its CS-2 system, powered by the company's Wafer-Scale Engine. It recently revealed that it would be leasing 2.5MW of capacity at Nautilus' floating data center in California to deploy its AI hardware.
The company introduced the latest version of its wafer-scale processor in March 2024, featuring four trillion transistors, 900,000 "AI cores," and 44GB of on-chip SRAM.
The company claims that, packaged into its CS-3 system, the chip is capable of 125 peak AI petaflops. Up to 2,048 CS-3s can be linked in a single cluster, for a purported 256 exaflops of AI compute.
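That cluster-level figure follows directly from the per-system claim: 2,048 systems × 125 petaflops comes to 256,000 petaflops, or 256 exaflops.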
The company is reportedly exploring an initial public offering in the second half of 2024.