Have you ever gotten tired of hearing NPCs in video games spout the same repeated lines of dialogue? What if they responded like real people and could hold a conversation? That's something gamers have long asked for, and they're about to receive it thanks to Nvidia ACE.
Nvidia demoed its AI platform Avatar Cloud Engine (ACE) at CES 2024. Said demo took place in a dingy-looking cyberpunk cafe. Two NPCs, a chef called Jin and a patron called Nova, were present. What’s interesting about both of these characters is that they can have full conversations with you (and each other). This could be a glimpse of NPC behavior in future games.
As we previously reported, the platform works by capturing your speech and converting it into text, which a large language model (LLM) processes to generate an NPC's response. The process then runs in reverse so that you hear the game character speaking, while an animation model creates realistic lip movements.
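To make that loop concrete, here's a minimal sketch of the speech-to-text-to-LLM-to-speech flow the article describes. Every function name below is a hypothetical placeholder standing in for a real ASR, LLM, TTS, or animation service; none of it is Nvidia ACE's actual API.

```python
# Hypothetical sketch of the conversation loop described above.
# All functions are placeholders, not Nvidia ACE APIs.

def speech_to_text(audio: bytes) -> str:
    """Placeholder automatic speech recognition (ASR) step."""
    return "What ramen do you recommend?"  # dummy transcription

def generate_reply(character_persona: str, player_text: str) -> str:
    """Placeholder LLM step: the character's backstory conditions the response."""
    return f"As {character_persona}, I'd suggest the spicy miso ramen."

def text_to_speech(reply_text: str) -> bytes:
    """Placeholder text-to-speech (TTS) step."""
    return reply_text.encode()  # dummy audio buffer

def animate_lips(audio: bytes) -> None:
    """Placeholder facial-animation step driven by the generated audio."""
    print(f"[lip-sync driven by {len(audio)} bytes of audio]")

# One turn of the conversation loop
player_audio = b"..."                          # captured microphone input
text = speech_to_text(player_audio)            # 1. speech is converted to text
reply = generate_reply("Jin the chef", text)   # 2. an LLM generates the NPC's response
npc_audio = text_to_speech(reply)              # 3. the reply is converted back to speech
animate_lips(npc_audio)                        # 4. an animation model syncs the lips
print(reply)
```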
For this demo, Nvidia collaborated with Convai, a generative AI NPC-creation platform. Convai lets game developers give a character a backstory and a voice so it can take part in interactive conversations. The latest features include real-time character-to-character interaction, scene perception, and actions.
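As a rough illustration (and not Convai's actual API), a character definition along these lines boils down to a backstory that conditions the LLM, a voice profile for speech output, and a list of in-game actions the NPC is allowed to trigger:

```python
# Generic illustration of an NPC character definition; names and fields are
# assumptions for the sake of example, not Convai's real schema.
from dataclasses import dataclass, field

@dataclass
class NPCCharacter:
    name: str
    backstory: str                # conditions the LLM's responses
    voice: str                    # selects a text-to-speech voice profile
    actions: list[str] = field(default_factory=list)  # in-game actions the NPC may trigger

jin = NPCCharacter(
    name="Jin",
    backstory="Owner and chef of a cyberpunk ramen shop; passionate about cooking.",
    voice="male_warm",
    actions=["serve_ramen", "wave", "point_to_menu"],
)
print(jin.name, "->", jin.backstory)
```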
I got to try Nvidia ACE for myself when I met with the company, and it was both intriguing and a little scary. One of the folks in my demo group asked the NPC Jin about ramen. Jin's response was somewhat flat, but the information satisfied the person who asked. Another person in the group held a conversation with Jin in Italian. All of this worked rather flawlessly.
When I tried the demo, I asked Nova — who is a techy — what components she would recommend I buy to build one of the best gaming PCs. Not surprisingly, she recommended an Nvidia GeForce RTX 40-series GPU. When I asked her what was in her rig, she said an RTX 4080 with 32GB of RAM. She got cheeky with me when I asked why she didn’t have 64GB instead — saying that much RAM was overkill.
Outlook
I have mixed feelings about AI NPCs. Not having to suffer through the same boring dialogue sounds great, but it could also become a huge time-sink. Since the conversations are dynamic, it’s possible to spend hours talking to an NPC and not even play the main game. That might sound awesome to some people, but I personally don’t want that kind of distraction in a game. But I suppose an NPC can tell you it doesn’t want to talk anymore if a conversation goes on too long or veers off-topic.
Speaking of staying on topic, both NPCs in Nvidia's demo answered any questions people threw at them. Granted, Jin mostly knew about cooking and Nova was a techy, but for dialogue with NPCs to feel realistic, conversations should conform to a game's world. It would be weird to have an NPC tell you about tech products in a medieval game, for example. That said, Jin didn't know who Nvidia co-founder and CEO Jensen Huang was, but Nova did. This shows that game developers can set limits on what an NPC will discuss.
Nvidia said it’s creating digital avatars that use ACE technologies in collaboration with top game developers including Charisma.AI and NetEase Games. AI NPCs might not become ubiquitous for years, but it’s cool (and weird) to see what might become a standard for games sometime in the future.
Check out our CES 2024 hub for all the latest news from the show as it happens. Follow the Tom’s Guide team in Las Vegas as we cover everything AI, as well as the best new TVs, laptops, fitness gear, wearables and smart home gadgets at the show.
And be sure to check out the Tom’s Guide TikTok channel for all the newest videos from CES!