Nvidia is bringing generative AI to video games. Announced during its Computex 2023 keynote, Nvidia ACE is a new platform that allows developers to use generative AI to power conversations with characters in games.
Think ChatGPT, but instead of a general-purpose chatbot, you get a chatbot with a specific backstory and lore. Nvidia is stressing that flexibility is one of the most important aspects of ACE: developers can give each character a fleshed-out backstory that informs its responses and keeps it from wandering too far off-topic. The company's recently announced NeMo Guardrails toolkit plays a role here, steering conversations away from topics the developer doesn't intend.
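NeMo Guardrails already exists as an open-source toolkit, so it's possible to get a rough feel for what that topic-steering looks like in code, even though Nvidia hasn't said how ACE actually wires it into a game. The sketch below uses the toolkit's published Python API; the config folder, the character, and the off-limits topic are hypothetical stand-ins for illustration only.

```python
# Minimal sketch using the open-source NeMo Guardrails toolkit (pip install nemoguardrails).
# The "jin_config" folder and its contents are hypothetical; ACE's real integration isn't public.
from nemoguardrails import LLMRails, RailsConfig

# The config folder would hold the LLM settings, the character's backstory prompt,
# and Colang rules defining off-limits topics and the deflections the bot should use.
config = RailsConfig.from_path("./jin_config")
rails = LLMRails(config)

# The guardrails sit between the player's input and the language model,
# nudging the reply back toward the character's lore if the topic strays.
response = rails.generate(messages=[
    {"role": "user", "content": "Forget the ramen shop. What do you think of real-world politics?"}
])
print(response["content"])
```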
ACE isn’t just built to generate text the way Square Enix’s The Portopia Serial Murder Case did; it’s a full AI system. Nvidia says ACE not only generates responses for characters, but also uses AI to animate the character models to match those responses.
It’s not hard to imagine how cool this could be — the recent Shadows of Doubt has me excited about the possibilities of emergent gameplay from AI. But there’s also a lot that could go wrong.
For starters, Nvidia says the system is designed in such a way that characters can talk among themselves; you don’t have to be the one initiating the conversation. That’s great, but Nvidia hasn’t tested that. I would be shocked if we didn’t see two AI-driven characters fall down some unhinged rabbit hole, à la the early days of Bing Chat.
There’s also the possibility that these AI characters just aren’t interesting. Sure, generative AI tools can be a lot of fun to mess around with, but dialogue and character interactions in games are determined by the developers, with conversations curated for a specific reason. If suddenly the dialogue is bad in a game, who’s to blame? The writers, or the AI? Maybe there’s a rich, detailed backstory for each character, but the AI might not surface that, leading to dull, uneventful interactions.
That was certainly the case with The Portopia Serial Murder Case from Square Enix, which promised unique dialogue with each interaction. Instead, it largely pushed players toward a specific path without any of the flair that comes from written dialogue. We’re bound to see some horrific facial animations, too, which is something Square Enix’s demo didn’t need to worry about.
It seems this is the future for video games, though. Ubisoft has a generative AI tool it’s leveraging for some dialogue, and leaders at game engine companies like Unity are already seeing developers start to take advantage of AI frameworks. I’m sure we’ll see interesting uses of generative AI in games eventually, but there’s probably a long road of weird, unsettling, and hilarious hiccups to get through first.
Nvidia is taking the first step down that road with ACE. It’s currently in development, and Nvidia was careful not to share any promises about when we’ll see ACE in action. The company showed a demo at Computex, built in Unreal Engine, suggesting we could see a plug-in for that engine in the near future. As for the demo, the AI-generated voice sounded a bit robotic, but there could be a future here.
We don’t know when ACE is coming or exactly how it works, but Nvidia says that most of the system runs in the cloud. That hopefully means you don’t need any specific graphics card to use ACE, as most of the AI processing isn’t happening on your computer.
Even with those concerns, the ball is ultimately in game developers’ court. Nvidia’s tech simply enables more sophisticated AI in game characters; it’s up to developers to figure out how best to leverage it, if at all.