Meta has released a massive new artificial intelligence (AI) model capable of translating between more than 200 languages.
Trained using one of the world’s most powerful AI supercomputers, the No Language Left Behind (NLLB) model is already supporting advanced translation features across Meta’s suite of apps and services.
“The advances here will enable more than 25 billion translations every day across our apps,” said Mark Zuckerberg, Meta CEO. “The AI modeling techniques we used are helping make high-quality translations for languages spoken by billions of people around the world.
“Communicating across languages is one superpower that AI provides, but as we keep advancing our AI work it’s improving everything we do – from showing the most interesting content on Facebook and Instagram, to recommending more relevant ads, to keeping our service safe for everyone”.
Vaulting the language barrier
In addition to supporting the Meta family of apps, the company has released the model under an open source licence, setting the stage for a new wave of third-party services designed to tear down the language barrier.
In a promo video, Meta gestured towards the possibility of augmented reality tools that translate foreign-language recipe books and apps capable of unlocking the literature of “low-resource languages”, such as Icelandic or Occitan.
“Language is the key to inclusion; if you don’t understand what people are saying or writing, you can be left behind,” said Jean Maillard, Meta Research Engineer, in a piece to camera.
Naturally, there was also mention of potential metaverse applications for the technology; Meta envisions a future in which people of all backgrounds can converse freely in virtual reality, with the support of live translation.
“The idea is to get rid of the language barrier altogether – for everyone to understand your experience without you changing how you communicate,” added Necip Fazil Ayan, Research Director at Meta. “I think the metaverse will be the place that all of these things come together.”
The key to achieving these objectives is a single AI model’s ability to cover a large number of languages. Comprising 50 billion parameters (broadly speaking, the more parameters, the more capable the model), NLLB can translate almost twice as many languages as the next best-performing models, but Meta is aiming for even loftier heights.
The techniques employed to achieve this level of performance are detailed in a research paper published by Meta, which describes “multiple architectural and training improvements” that the company hopes will “lay important groundwork towards realizing a universal translation system”.
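For developers curious about what a single model covering 200 languages looks like in practice, the sketch below shows one way a translation call might be made, assuming the openly distributed NLLB-200 checkpoints on Hugging Face (here the facebook/nllb-200-distilled-600M variant) and the transformers library; the checkpoint name, language codes and API calls are illustrative assumptions rather than details taken from Meta’s announcement.

```python
# A minimal sketch: translating English into Icelandic (one of the
# "low-resource languages" mentioned above) with an openly released
# NLLB-200 checkpoint. Assumes `pip install transformers torch` and the
# facebook/nllb-200-distilled-600M checkpoint hosted on Hugging Face.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "facebook/nllb-200-distilled-600M"  # assumed distilled variant

# NLLB uses FLORES-200 language codes, e.g. eng_Latn (English), isl_Latn (Icelandic)
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

text = "Language is the key to inclusion."
inputs = tokenizer(text, return_tensors="pt")

# Force the decoder to begin with the target-language token
translated_tokens = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("isl_Latn"),
    max_length=64,
)
print(tokenizer.batch_decode(translated_tokens, skip_special_tokens=True)[0])
```

Because the same model serves every language pair, swapping the target code (for instance, oci_Latn for Occitan) is, in principle, all that is needed to reach another of the 200 supported languages.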