NVIDIA recently introduced production microservices for the NVIDIA Avatar Cloud Engine (ACE) that allow developers of games, tools, and middleware to integrate state-of-the-art generative AI models into the digital avatars in their games and applications.
The new ACE microservices let developers build interactive avatars using AI models such as NVIDIA Omniverse Audio2Face (A2F), which creates expressive facial animations from audio sources, and NVIDIA Riva automatic speech recognition (ASR), for building customizable multilingual speech and translation applications using generative AI.
Create lifelike digital characters for games and applications with NVIDIA Avatar Cloud Engine (ACE) Gen AI models
Developers embracing ACE include Charisma.AI, Convai, Inworld, miHoYo, NetEase Games, Ourpalm, Tencent, Ubisoft, and UneeQ.
“Generative AI technologies are transforming virtually everything we do, and that also includes game creation and gameplay. NVIDIA ACE opens up new possibilities for game developers by populating their worlds with lifelike digital characters while removing the need for pre-scripted dialogue, delivering greater in-game immersion,” said Keita Iida, vice president of developer relations at NVIDIA.
Top game and interactive avatar developers embrace NVIDIA ACE
Top game and interactive avatar developers are pioneering ways ACE and generative AI technologies can transform interactions between players and non-playable characters (NPCs) in games and applications.
“For years NVIDIA has been the pied piper of gaming technologies, delivering new and innovative ways to create games. NVIDIA is making games more intelligent and playable through the adoption of gaming AI technologies, which ultimately creates a more immersive experience,” said Zhipeng Hu, senior vice president of NetEase and head of LeiHuo business group.
“This is a milestone moment for AI in games. NVIDIA ACE and Tencent Games will help lay the foundation to bring digital avatars with individual, lifelike personalities and interactions to video games,” said Tencent Games.
NVIDIA ACE brings game characters to life
NPCs have historically been designed with predetermined responses and facial animations. This limited player interactions, which tended to be transactional and short-lived and were, as a result, skipped by most players.
“Generative AI-powered characters in virtual worlds unlock various use cases and experiences that were previously impossible. Convai is leveraging Riva ASR and A2F to enable lifelike NPCs with low-latency response times and high-fidelity natural animation,” said Purnendu Mukherjee, founder and CEO at Convai.
To showcase how ACE can transform NPC interactions, NVIDIA worked with Convai to expand the NVIDIA Kairos demo, which debuted at Computex, with several new features and the inclusion of ACE microservices.
In the latest version of Kairos, Riva ASR and A2F are used extensively, improving NPC interactivity. Convai’s new framework allows NPCs to converse among themselves and gives them awareness of objects, enabling them to pick up and deliver items to desired areas. Furthermore, NPCs gain the ability to lead players to objectives and traverse worlds.
The Audio2Face and Riva automatic speech recognition microservices are available now. Interactive avatar developers can incorporate the models individually into their development pipelines.