What is Nvidia ACE? | Trusted Reviews



Nvidia ACE has the potential to revolutionise how we interact with virtual characters, whether they’re a digital waiter in a restaurant or an NPC in a video game. 

But what exactly is this technology, and how will it benefit you in the future? We’ve assembled this guide to explain everything you need to know about Nvidia ACE. 

What is Nvidia ACE?

Nvidia ACE stands for Avatar Cloud Engine, a technology that uses generative AI to enable virtual characters to converse with a real person without the need for a pre-written script. 

When it was first shown off in 2022, Nvidia ACE was used to create interactive avatars that could take restaurant orders or answer questions about a shopping order. 

These avatars use a 3D model, with their animations designed to match their speech. Nvidia is encouraging companies to sign up if they fancy the prospect of creating their own interactive avatars. 

What is Nvidia ACE for Games?

During a Computex 2023 keynote, Nvidia announced ACE for Games, which adopts the technology for video game characters. 

Nvidia has essentially developed its own version of ChatGPT, but instead of integrating it into your web browser, as Microsoft has done with Bing, it’s allowing developers to add the AI technology to NPCs (non-playable characters) in future video games and customise it to their requirements.

This would allow the player to talk to NPCs via speech instead of pre-selected dialogue trees, as characters will be able to respond to anything you say, making conversations far more dynamic. 

Unlike ChatGPT, characters developed via Nvidia ACE will still have their own personality and background. Nvidia showed a demonstration in which the player spoke to a worker at a ramen restaurant. Nvidia said that it provided the character with a background – a little like a character profile in Dungeons and Dragons – but left it up to the AI to generate dialogue in real time. That means the conversation was not pre-written and recorded, as it traditionally is in existing video games. 

Nvidia has collaborated with a company called Convai to create this technology. The founder of Convai, Purnendu Mukherjee, said: “With NVIDIA ACE for Games, Convai’s tools can achieve the latency and quality needed to make AI non-playable characters available to nearly every developer in a cost-efficient way.” 

Nvidia ACE uses multiple AI foundation models to create such experiences. For example, Nvidia NeMo is used to build and deploy language models, with customisation options for adding in lore and character backstories. Meanwhile, Nvidia Riva provides automatic speech recognition and text-to-speech functionality to make live spoken conversations possible. 

ACE isn’t limited to dialogue: it also uses Nvidia Omniverse Audio2Face to create expressive facial animations that match the AI-generated speech. This should ensure the NPC won’t be smiling when discussing an upsetting topic, or frowning when telling a joke – which matters, since the dialogue isn’t pre-written. All of this technology combines to create a realistic conversation. 
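To make the flow of that pipeline concrete, here is a minimal Python sketch of how the pieces described above fit together: speech recognition, a backstory-conditioned language model, text-to-speech, and audio-driven facial animation. Every function here is a labelled stand-in – the names, signatures, and return values are illustrative assumptions, not Nvidia’s actual Riva, NeMo, or Audio2Face APIs.

```python
from dataclasses import dataclass

# Hypothetical character profile, mirroring the "backstory" Nvidia says
# developers can feed to a NeMo-built language model.
@dataclass
class CharacterProfile:
    name: str
    backstory: str

def speech_to_text(audio: bytes) -> str:
    """Stand-in for automatic speech recognition (Riva's role)."""
    return "What's good here?"  # placeholder transcription

def generate_reply(profile: CharacterProfile, player_line: str) -> str:
    """Stand-in for a language model conditioned on the character's backstory."""
    return f"{profile.name} says: Try the tonkotsu, it's our speciality."

def text_to_speech(reply: str) -> bytes:
    """Stand-in for text-to-speech (also Riva's role)."""
    return reply.encode("utf-8")

def animate_face(audio: bytes) -> str:
    """Stand-in for Audio2Face, which derives facial animation from audio."""
    return "facial-animation-curves"

def npc_turn(profile: CharacterProfile, player_audio: bytes):
    """One full conversational turn: player audio in, animated reply out."""
    text = speech_to_text(player_audio)
    reply = generate_reply(profile, text)
    audio = text_to_speech(reply)
    animation = animate_face(audio)
    return reply, animation

jin = CharacterProfile(name="Jin", backstory="Owner of a ramen shop.")
reply, animation = npc_turn(jin, b"raw-player-audio")
print(reply)
```

The point of the sketch is the ordering: animation is driven by the generated audio, not by a pre-authored script, which is why matching expressions to unscripted dialogue is possible at all.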

Convai and Nvidia are hoping that game developers from across the globe will start making use of the ACE for Games foundry service. The tools can be used via Nvidia DGX Cloud or with a GeForce RTX PC. 

Since Nvidia has only just released this technology, it’s going to take a few years until we start seeing it used in full-length games. But it’s a first step towards an exciting future that could very well reshape how we currently interact with virtual characters. 


