Nvidia recently showed how developers can integrate its AI "digital human" tools to enhance voice, animation, and dialogue generation for video game characters.
At the 2024 Game Developers Conference (GDC), the tech giant introduced Covert Protocol, a playable tech demo highlighting the capabilities of its AI tools, which allow non-playable characters (NPCs) to respond dynamically to player interactions with dialogue tailored to live gameplay.
Covert Protocol: A unique gameplay experience
(Photo: NVIDIA)
Nvidia has released a demo of its latest AI tools for creating AI-driven game characters, letting developers give them backstories and adjust their characteristics to shape these digital interactions.
In Covert Protocol, players assume the role of a private investigator and engage in tasks determined by interactions with AI-driven NPCs.
Anyone familiar with games knows how such characters typically work: they offer only a limited set of scripted interactions. Incorporating generative AI changes that, allowing NPCs to respond beyond a fixed script.
Nvidia emphasizes that each playthrough provides a unique experience, as the player's real-time engagement influences the outcome of the game.
John Spitzer, vice president of development and performance technology at Nvidia, emphasized the potential of the company's AI technology to drive the subtle animations and conversational voices essential to lifelike digital interactions.
Related article: NVIDIA's flagship AI chip unveiled along with the latest AI-powered robots
Collaboration between Inworld AI and ACE technology
Covert Protocol is a collaboration between Nvidia and AI gaming startup Inworld AI, and is powered by Nvidia's Avatar Cloud Engine (ACE) technology. ACE was previously featured in Nvidia's futuristic ramen shop demo and forms the backbone of Covert Protocol.
According to The Verge, while the demo primarily showcases NPC voice lines, Inworld AI also plans to release the Covert Protocol source code to encourage wider adoption of Nvidia's ACE digital human technology among developers.
Advances in Audio2Face technology
Nvidia also demonstrated its Audio2Face technology during a sneak peek of the MMO World of Jade Dynasty, showing character lip-syncing to both English and Mandarin voices and highlighting Audio2Face's potential to facilitate multilingual game development without the need to manually reanimate characters. Additionally, a video snippet from the action melee game Unawake showed how Audio2Face can enhance facial animation in both cinematic and gameplay sequences.
Impact on game developers and voice actors
While these technology demonstrations may interest game developers, especially in the areas of more varied character interactions and broader language support, the conversational aspect remains a challenge: Covert Protocol's NPCs bear only a limited resemblance to "real people," reminiscent of the early Kairos demos.
This aspect may raise concerns among video game voice actors about the potential impact of AI on their roles and livelihoods, particularly given the ongoing debate over AI's effect on the voice acting (VA) field.
Nvidia's advances in AI-driven NPC technology certainly offer promising prospects for enriching the gameplay experience. However, the balance between innovation and preserving the human element in games remains a topic of debate, especially as the VA landscape in the industry continues to evolve.
Related article: Nvidia GTC 2024: Automotive technology and breakthroughs unveiled in California
ⓒ 2024 TECHTIMES.com All rights reserved. Please do not reproduce without permission.