Google’s Hugging Face deal puts ‘supercomputer’ power behind open-source AI
Illustration by Alex Castro / The Verge
Google Cloud’s new partnership with AI model repository Hugging Face lets developers build, train, and deploy AI models without a Google Cloud subscription. Outside developers using Hugging Face’s platform will now have “cost-effective” access to Google’s tensor processing units (TPUs) and GPU supercomputers, which will include thousands of Nvidia’s in-demand and export-restricted H100s.
Hugging Face is one of the more popular AI model repositories, hosting open-source foundation models like Meta’s Llama 2 and Stability AI’s Stable Diffusion. It also offers many datasets for model training.
There are over 350,000 models hosted on the platform for developers to work with, and developers can also upload their own models to Hugging…