Amazon Web Services (AWS) has announced a new partnership with AI company Hugging Face, with the aim of creating a best-in-class environment for AI development.
The non-exclusive partnership is designed to make it easier to run AI workloads on AWS, the world's biggest cloud computing provider.
Unlike Google, Microsoft, and OpenAI, which are all focusing their attention on public-facing chatbots, AWS is hoping to work behind the scenes to provide the fundamental tools required to create this sort of technology.
AWS, Hugging Face, and AI
Hugging Face CEO Clem Delangue told Reuters that the next generation of the open-source AI model BLOOM will run on Trainium, an AWS-designed AI chip.
Chips like Trainium are purpose-built for AI workloads, making them more efficient and helping developers save time and money. In turn, AWS hopes to open up AI workloads to even more developers.
“Building, training, and deploying large language and vision models is an expensive and time-consuming process that requires deep expertise in machine learning (ML),” an AWS statement said. “Since the models are very complex and can contain hundreds of billions of parameters, generative AI is largely out of reach for many developers.”
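To give a rough sense of the developer workflow the partnership targets, the sketch below loads a small open BLOOM checkpoint with Hugging Face's transformers library and generates text. It is a minimal, generic example: the model name is just one publicly available BLOOM variant, and the snippet does not show any AWS- or Trainium-specific integration.

```python
# Minimal sketch: load a small public BLOOM checkpoint with Hugging Face's
# transformers library and generate text. This illustrates the general
# Hugging Face workflow only; it does not use AWS- or Trainium-specific APIs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom-560m"  # a small public BLOOM variant, chosen for illustration

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Generative AI is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```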
Beyond this, AWS has an extensive history in artificial intelligence, largely centered around its Alexa voice assistant but also including a large AI model designed to make Amazon search more effective. The company hopes to bring this experience to the table in its new partnership.
Whether other companies like Google and Microsoft should be concerned about this partnership remains to be seen, but it's clear that Big Tech is gearing up for an AI-centric year ahead.