Amazon Web Services (AWS) has partnered with Hugging Face to integrate the startup's software development hub into its cloud.

Hugging Face has quickly established itself as one of the main places for AI developers to share open-source code and models.

It also develops its own models, including the GPT-3 rival BLOOM. Originally developed on the Jean Zay supercomputer, the model will now be trained on AWS' proprietary Trainium chips.

Users will be able to move models available for free on Hugging Face to SageMaker, AWS' managed machine learning service.

Hugging Face hosts more than 100,000 free machine learning models, which are collectively downloaded more than one million times a day.

“Generative AI has the potential to transform entire industries, but its cost and the required expertise puts the technology out of reach for all but a select few companies,” said Adam Selipsky, CEO of AWS.

“Hugging Face and AWS are making it easier for customers to access popular machine learning models to create their own generative AI applications with the highest performance and lowest costs. This partnership demonstrates how generative AI companies and AWS can work together to put this innovative technology into the hands of more customers.”

Clement Delangue, CEO of Hugging Face, added: “The future of AI is here, but it’s not evenly distributed. Accessibility and transparency are the keys to sharing progress and creating tools to use these new capabilities wisely and responsibly. Amazon SageMaker and AWS-designed chips will enable our team and the larger machine learning community to convert the latest research into openly reproducible models that anyone can build on.”

Hugging Face noted that the deal is not exclusive. While it has made AWS its preferred cloud provider, it may work with other cloud companies in the future.

Rival OpenAI offers its services exclusively through Microsoft Azure, following billions of dollars in investment from Microsoft. Google, meanwhile, invested $300 million in Anthropic, which committed to using Google Cloud.
