Cerebras Systems has announced a multi-year partnership with Mayo Clinic that will see the two organizations collaborate on the development of large language models (LLMs) for medical applications.

Mayo Clinic is a nonprofit organization that operates one of the largest integrated health systems in the US and funds research and education in the medical field.


The partnership will see the clinic use computing chips and systems provided by Cerebras to develop AI models using anonymized medical records and patient data, including diagnostics, treatments, outcomes, imaging, and molecular research.

According to a report from Reuters, some of the models will be used to summarize important parts of lengthy medical records, while others will be trained to look for patterns in medical images or analyze genome data. However, it was noted that all medical decisions will still be made by doctors and not the new AI systems.
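
To give a sense of what record summarization with an off-the-shelf model looks like, the sketch below uses the open-source Hugging Face transformers library and a generic pretrained summarizer (facebook/bart-large-cnn). It is an illustration only, not Mayo Clinic's or Cerebras' actual models or pipeline, and the clinical note is invented for the example.

```python
# Illustrative sketch only: not the models being developed under the partnership.
# Condenses a (fictional) lengthy clinical note into a short summary using a
# generic pretrained summarization model from the Hugging Face hub.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

clinical_note = (
    "Patient presented with intermittent chest pain over two weeks. "
    "ECG unremarkable; troponin within normal limits. Stress test scheduled. "
    "History of hypertension managed with lisinopril 10 mg daily."
)

# Produce a short abstract of the note; the length limits here are arbitrary.
summary = summarizer(clinical_note, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```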

The models will eventually be made available on the Mayo Clinic Platform, a data network that is accessible to various healthcare systems in the US, Canada, Brazil, and Israel.

“It is an honor to collaborate with Mayo Clinic, the top-ranked hospital in the nation. With its recognized leadership in delivering medical outcomes, we are uniquely positioned to combine AI and medicine. The state-of-the-art AI models we are developing together will work alongside doctors to help with patient diagnosis, treatment planning, and outcome estimation,” said Andrew Feldman, CEO and co-founder of Cerebras.

“To create the first truly patient-centric healthcare AI, Mayo Clinic selected Cerebras for its proven experience in designing and training large-scale, domain-specific generative AI models,” the company wrote on LinkedIn. “Together, Cerebras and Mayo Clinic seek to combine AI and domain expertise to produce better patient outcomes.”

Neither organization disclosed the financial terms of the partnership, but multiple outlets reported that Cerebras described it as a multimillion-dollar deal.

Cerebras develops giant wafer-scale chips. Its Wafer Scale Engine 2 is the world's largest semiconductor, with 2.6 trillion transistors. Built on TSMC's 7nm process, it has 850,000 'AI optimized' cores, 40GB of on-chip SRAM, 20 petabytes per second of memory bandwidth, and 220 petabits per second of aggregate fabric bandwidth. The WSE-2 chip is sold packaged in the Cerebras CS-2, a 15U system; in some deployments, CS-2s are paired with HPE's SuperDome Flex servers.

This is not the first time Cerebras has made forays into the health tech sphere. In May 2022, biopharmaceutical company AbbVie said that it used a Cerebras CS-2 system to train biomedical natural language processing (NLP) models.

The six-billion-parameter model translates libraries of biomedical literature and makes them searchable across 180 languages, drawing on large Transformer models such as BERT, BERT LARGE, and BioBERT.
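
As a rough illustration of how BioBERT-style models can make biomedical text searchable, the sketch below embeds a toy corpus of abstracts with the public dmis-lab/biobert-base-cased-v1.1 checkpoint and ranks them against a query by cosine similarity. This is an assumption-laden example, not AbbVie's actual system; the model choice, pooling strategy, and corpus are all illustrative.

```python
# Hedged illustration, not AbbVie's production pipeline: semantic search over a
# toy set of biomedical abstracts using the public BioBERT checkpoint.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dmis-lab/biobert-base-cased-v1.1")
model = AutoModel.from_pretrained("dmis-lab/biobert-base-cased-v1.1")

def embed(texts):
    """Return mean-pooled BioBERT embeddings for a list of strings."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state      # (batch, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1)        # mask out padding tokens
    return (hidden * mask).sum(1) / mask.sum(1)

corpus = [
    "TNF inhibitors in the treatment of rheumatoid arthritis.",
    "BRCA1 mutations and hereditary breast cancer risk.",
    "Statin therapy and LDL cholesterol reduction.",
]
query = "genetic risk factors for breast cancer"

# Rank abstracts by cosine similarity to the query embedding and print the best match.
scores = torch.nn.functional.cosine_similarity(embed([query]), embed(corpus))
print(corpus[int(scores.argmax())])
```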