
NVIDIA announced that leading AI application developers across industries are using its digital human technologies to create lifelike avatars for commercial applications and dynamic characters for games. These developments are being showcased at GTC, the global AI conference being held in San Jose, California, with demonstrations from companies including Hippocratic AI, Inworld AI, and UneeQ.
The foundation for these lifelike avatars and characters is built on NVIDIA Avatar Cloud Engine (ACE) for audio and animation, NVIDIA NeMo for language processing, and NVIDIA RTX for ray-traced rendering. Together, these technologies allow developers to create digital humans capable of AI-driven natural language interactions, making conversations more realistic and immersive.
NVIDIA ACE includes a suite of technologies that make it easy to create digital humans, including facial animation with NVIDIA Audio2Face™ and speech recognition and synthesis with NVIDIA Riva. These microservices are flexible and run across cloud and PC environments, ensuring optimal performance based on local GPU capabilities.
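To make the microservice model concrete, here is a minimal sketch of transcribing a voice clip with NVIDIA Riva's speech recognition service from Python. It assumes the nvidia-riva-client package is installed and a Riva server is reachable at localhost:50051; the server address, configuration values, and file name are illustrative assumptions, not details from the article.

```python
# Minimal sketch: transcribing an audio clip with NVIDIA Riva ASR.
# Assumes a Riva server running locally at port 50051 (an assumption,
# not a detail from the article).
import riva.client

auth = riva.client.Auth(uri="localhost:50051")
asr = riva.client.ASRService(auth)

config = riva.client.RecognitionConfig(
    encoding=riva.client.AudioEncoding.LINEAR_PCM,
    sample_rate_hertz=16000,
    language_code="en-US",
    max_alternatives=1,
    enable_automatic_punctuation=True,
)

# "player_question.wav" is a placeholder file name.
with open("player_question.wav", "rb") as f:
    audio_bytes = f.read()

response = asr.offline_recognize(audio_bytes, config)
print(response.results[0].alternatives[0].transcript)
```

In a full digital human pipeline, the resulting transcript would typically feed a language model, whose reply is then synthesized to speech and animated with Audio2Face.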
NVIDIA NeMo provides developers with an end-to-end platform for building custom generative AI models, offering data curation, model customization, retrieval-augmented generation, and accelerated performance.
NVIDIA RTX is a suite of rendering technologies, such as RTX Global Illumination (RTXGI) and DLSS 3.5, that enable real-time path tracing in games and applications.
To showcase the capabilities of these digital human technologies, NVIDIA has collaborated with leading developers across a variety of sectors. For example, Hippocratic AI is developing AI-powered, safety-focused healthcare agents that provide patients with critical medical instructions and follow-up care. Similarly, UneeQ specializes in AI-powered avatars for customer service and is integrating the NVIDIA Audio2Face microservice to enhance customer engagement.
In the gaming space, Inworld AI's tech demo Covert Protocol uses NVIDIA ACE to transform non-playable characters (NPCs). The demo combines NVIDIA Riva for accurate speech-to-text and NVIDIA Audio2Face for realistic facial performances to deliver immersive in-game character interactions.
Thousands of developers in industries such as healthcare, gaming, financial services, media and entertainment, and retail have adopted NVIDIA ACE to revolutionize user interactions. Notable adopters include miHoYo, Ubisoft, Tencent, Deloitte, and more.
Developers interested in taking advantage of NVIDIA ACE have a variety of resources available, including an early access program and NVIDIA NIM microservices for deploying generative AI models. These initiatives aim to accelerate the integration of digital human technologies into various products and platforms.
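As one illustration of how a deployed NIM microservice is consumed, the sketch below calls a locally running LLM NIM through its OpenAI-compatible endpoint. The base URL, port, API key, and model identifier are assumptions for illustration and are not specified in the article.

```python
# Minimal sketch: querying a locally deployed NIM LLM microservice
# via its OpenAI-compatible API. Endpoint, key, and model name are
# illustrative assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")

response = client.chat.completions.create(
    model="meta/llama3-8b-instruct",  # example NIM model identifier
    messages=[
        {"role": "system", "content": "You are a helpful in-game shopkeeper."},
        {"role": "user", "content": "What potions do you have for sale?"},
    ],
    max_tokens=128,
)
print(response.choices[0].message.content)
```

Because the endpoint follows the OpenAI API convention, existing application code can often be pointed at a NIM deployment by changing only the base URL and model name.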