Replica Studios is unveiling AI-powered smart NPCs (non-player characters) for the Unreal Engine platform.
The company released a demo highlighting the software, which combines Replica Studios' AI voice technology with OpenAI's language models, allowing players to have dynamic conversations with NPCs that respond in real time.
The demo in the video looks kind of rough, as it uses Epic Games' footage of Unreal Engine 5 tech based on The Matrix Awakens demo as well as Epic's Digital Humans tech. But Replica said it has given the NPCs the ability to respond to prompts from human players with smart conversations.
Again, the responses from the NPCs are kind of slow, but it is just a demo and the real plugin is coming to Unreal Engine later this year. It will have a rival in Inworld AI and other AI companies targeting games.
The smart NPCs are powered by OpenAI or the user's own AI language model, along with Replica's library of more than 120 ethically licensed AI voices, allowing game developers to build games at scale and create new dynamic gaming experiences.
To showcase the technology, Replica has released a smart NPC demo experience built on a modified version of Unreal Engine's Matrix Awakens sample project, previewing some of the features that will be available with the plugin.
“At Replica, we believe AI Voice technology has two-fold potential for furthering narrative-heavy games: eliminating critical bottlenecks in the development process, enabling studios to tell ever bigger and immersive stories, and allowing for the creation of never seen before living-world gaming experiences that shape around the player in real-time,” said Shreyas Nivas, CEO of Replica Studios, in a statement. “Smart NPCs will allow smaller studios who don’t have the resources to script and vocalize on scale to dream big and scale their games 10 times or 100 times into the epic adventure they’ve always wanted to make.”
In Replica’s smart NPC experience, AI-powered NPCs will dynamically respond to the player’s in-game voice in real time, the company said. Characters will change their dialogue, emotional tone and body gestures in reaction to how the player speaks to them.
Replica Studios’ NPCs now have a broader range of emotions thanks to Replica’s recent Style Morphing update.
Game designers can also alter the information provided to the AI characters, instructing them to behave differently or adopt certain personalities, or incorporate their own AI language models to suit their project.
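Replica has not published the plugin's API, so as a purely hypothetical sketch, designer-authored character information could be assembled into a system prompt for whatever language model the project uses. The function and field names below are illustrative assumptions, not Replica's actual interface.

```python
# Hypothetical example: turning designer-supplied personality and lore
# into a system prompt for an LLM-driven NPC. Not Replica's real API.

def build_npc_prompt(name: str, personality: str, lore: str, motivations: str) -> str:
    """Combine designer-authored fields into a single system prompt."""
    return (
        f"You are {name}, a character in a game.\n"
        f"Personality: {personality}\n"
        f"World lore: {lore}\n"
        f"Motivations: {motivations}\n"
        "Stay in character and answer the player's spoken lines briefly."
    )

# A designer tweaks these fields to change how the character behaves.
prompt = build_npc_prompt(
    name="Mara",
    personality="wry, cautious street vendor",
    lore="a rain-soaked megacity patrolled by agents",
    motivations="protect her stall; avoid attention",
)
```

Swapping in a different model, as the article notes developers can, would only change where this prompt is sent, not how it is authored.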
“Just as live service games and battle royale have emerged as popular gaming experiences in the last decade, we see AI creating a new category of virtual experiences known for the depth of narrative and dynamic storytelling with hundreds of characters, ever-expanding quests and stories,” said Nivas. “When teams begin adopting AI to create these new experiences, we will see the role of writers and narrative designers evolve. Rather than writing complete narrative arcs, they’ll create and fine-tune the conditions that power these AI Smart NPCs (game lore, backstories and motivations) to add a human element that will lead to an engaging live player experience.”
When Replica’s smart NPC plugin launches for Unreal Engine later this year, game developers will be able to integrate Smart NPCs into their own projects to begin scaling up characters in their games. The plugin integrates AI language model smarts along with Replica’s text-to-speech models and animates the characters in real-time by outputting sound phonemes and timelines to match an audio stream while using a customized blend space for facial animations to power accurate lip sync, and custom animation blueprints to send body gesture messages for NPCs during their listening, thinking and speaking phases.
The result is a natural-looking and behaving NPC that can surprise and delight players with thoughtful, funny, and provocative responses, depending on their personal contexts as well as the voice input directly captured from players’ microphones.
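The listening/thinking/speaking phases and the phoneme-driven lip sync described above can be pictured as a simple state machine. The sketch below is an illustrative assumption of that flow in Python, not Replica's actual Unreal plugin code; the class, phase names, and the tiny phoneme-to-viseme table are all invented for the example.

```python
# Illustrative sketch of a smart NPC's conversational phases and lip-sync
# mapping, per the article's description. Names and tables are hypothetical.
from enum import Enum, auto

class NPCPhase(Enum):
    LISTENING = auto()   # capturing the player's microphone input
    THINKING = auto()    # waiting on the language model's response
    SPEAKING = auto()    # playing TTS audio with synced facial animation

# Minimal phoneme-to-viseme table; a real system maps many more phonemes
# onto blend-shape targets for accurate lip sync.
PHONEME_TO_VISEME = {"AA": "open", "M": "closed", "F": "teeth_on_lip", "OW": "round"}

def visemes_for(phonemes):
    """Map a timed phoneme stream [(time, phoneme), ...] to viseme targets."""
    return [(t, PHONEME_TO_VISEME.get(p, "neutral")) for t, p in phonemes]

class SmartNPC:
    def __init__(self):
        self.phase = NPCPhase.LISTENING

    def on_player_speech_end(self):
        self.phase = NPCPhase.THINKING       # transcript goes to the LLM

    def on_response_audio_ready(self, phonemes):
        self.phase = NPCPhase.SPEAKING       # start TTS playback + lip sync
        return visemes_for(phonemes)

    def on_speech_finished(self):
        self.phase = NPCPhase.LISTENING      # back to listening for the player
```

Each phase would also drive its own body-gesture animations, mirroring the "listening, thinking and speaking" blueprint messages the article mentions.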
The demo will initially run as a cloud-based solution, but Replica is working toward a local solution to deliver a low-latency, concurrent multi-user live experience.
“Within the next technology hop, we will see AI-powered smart NPC live games and virtual experiences with hundreds of live concurrent players and NPCs all together, with no more latency than a Zoom video call that has hundreds of participants,” said Nivas.
Since launching the platform in 2019, Replica has been at the forefront of the AI voice revolution. Replica has been utilized by thousands of game developers to enrich their projects with licensed voices. It should not be confused with the similar company Replika.
The company has 12 employees and it has raised $4.6 million to date. It started in 2018. Asked what the inspiration was, Nivas said in an email to GamesBeat, “We’re heavily inspired by games and storytelling and saw the potential for AI voice technology to enable more and more creators to make games with voice acting, as well as make it possible for the bigger games companies to make much bigger games and incorporate dynamic elements to their storytelling ability.”
He added, “Our goal since launching our platform has been to improve the voice quality to the point where it’s indistinguishable from a real voice actor, and to build tools and services specifically for game developers so that they can seamlessly adapt Replica tech into any game engine or any project they are working on throughout their production cycles, while also building the most ethical licensing models to attract the world’s best voice talent.”
I asked about the quality of the demo and what was unique about the tech.
Nivas said, “Back in 2019 we were already using GPT-2 for certain creative tasks, and while the tech was early we could already see the potential for systems like that to improve and complement creators and writers and game designers in terms of ideation and script and dialog generation.”
He also said, “However, as the systems have gotten better we felt the need to develop our own SDK that delivers the best-in-class AI voices for games with the latest GPT models and new large language models (LLMs) so that developers can experiment with these systems. Our playable demo with The Matrix Awakens is a great example of the possibilities available to game developers using Unreal Engine 5. We’re now working on improvements to latency, improvements to voice quality and adding support for Unity and other major game engines.”