[AI IN ACTION] AI technology bringing game characters to life
Published: 16 Oct. 2023, 18:19
Updated: 23 Oct. 2023, 11:09
- LEE JAE-LIM
- lee.jaelim@joongang.co.kr
The day may soon arrive when game worlds become enmeshed with AI-powered characters, advancing the characters' cognition to a level where they can spontaneously interact with their human players.
Currently, in-game characters exist merely as non-player characters (NPCs). They are limited in their tasks — notifying players of certain missions or instructing them to acquire items — or serve as low-level enemies in cookie-cutter missions with zero suspense, existing only so the player can move on to the next stage of the game.
With the infusion of generative AI technology, such characters will function more like humans: an AI trained through machine learning deciphers the data to respond with the appropriate manner of speech, tone, emotions and facial expressions, and ultimately generates proper responses to the players' needs.
The goal the global game industry envisions is for these NPCs to become digital humans programmed with their very own, distinct personas. U.S. software company Unity Technologies launched Unity Sentis — albeit in a beta version for now — to embed generative AI models into games without cloud-computing costs or latency issues. This means that machine learning models can be realized as dimensional NPCs on any platform, from mobile and console to websites and PCs.
At Unity's APAC Industry Summit 2023, hosted in southern Seoul on Oct. 5, Unity introduced its own digital human dubbed "Orb," which was able to respond on the spot to basic questions, such as those about its identity.
Within Korea, the two game publishers on the front line of this technology are NCSoft and Netmarble.
NCSoft garnered worldwide attention by unveiling its own digital human dubbed "TJ Kim" at the Game Developers Conference 2023 in March. The digital human is modeled after NCSoft founder Kim Taek-jin, closely resembling his facial expressions and manner of speech, and explained a new console game that the company is developing.
TJ Kim showcased how far the company has come in AI technologies such as speech synthesis, which converts written text into spoken words, and in synchronizing a character's facial expressions with dialogue or a certain voice tone.
Kim and Chief Strategy Officer Yoon Song-yee have been eyeing the merging of AI and the game industry for some time, forming a separate AI lab in 2011, five years before the famous AlphaGo versus Lee Sedol match took place in 2016. A separate division dedicated to natural language processing for the Korean language was formed within the lab in 2015.
NCSoft believes the basis for developing its in-house AI technology is establishing its own large language model, which it did in August. The model was named VARCO, short for Via AI, Realize your Creativity and Originality. It is serviced under four categories — foundation, prompt, dialogue and generative models — and offered bilingually in Korean and English.
The number of parameters VARCO is trained on is admittedly small compared to OpenAI's models, but VARCO received the highest proficiency rating among Korean language models of similar parameter sizes on KoBEST, a benchmark for Korean language reasoning.
VARCO is the steppingstone for what the company ultimately aims to do. Collectively dubbed VARCO Studio, NCSoft offers three AI tools to improve efficiency in game development: VARCO Art, which generates images from text instructions; VARCO Text, which generates extensive plot settings and character dialogue from simple text; and VARCO Human, an integrated tool for creating, editing and managing digital humans.
All of the data the VARCO model is trained on derives from NCSoft’s intellectual properties (IPs) and open-source data free from copyright issues, according to Lee Yeon-soo, head of NCSoft’s Natural Language Processing (NLP) Center.
“What sets VARCO Studio apart from other similar AI tools of its kind is that users can easily create high-quality game content on one single platform,” Lee told the Korea JoongAng Daily.
VARCO Studio will be used by employees for a test run starting in November and is scheduled to be commercialized next year.
Netmarble, on the other hand, focuses on developing AI-powered technologies that enhance user-centric entertainment, having established an AI center in 2018.
It is developing a series of game bots, including one based on reinforcement learning, through which an AI learns to play a game autonomously so that developers can test and balance the level of challenge in a particular stage during the development process.
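The idea behind such a balance bot can be illustrated with a minimal sketch: a toy agent learns to clear a simplified stage via tabular Q-learning, and its resulting clear rate becomes the difficulty signal a designer reads. All names, actions and probabilities below are illustrative assumptions, not Netmarble's actual system.

```python
import random

# Toy stage: at each step the bot picks "attack" or "defend"; each action
# clears the step with a different probability, and failing any step ends
# the run. Q-learning lets the bot discover the stronger action, and the
# clear rate it converges to tells a designer how hard the stage is.
def train_and_measure(p_attack=0.9, p_defend=0.6, n_steps=3,
                      episodes=3000, alpha=0.1, eps=0.1, seed=42):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(n_steps) for a in ("attack", "defend")}
    clears = 0
    for _ in range(episodes):
        cleared = True
        for s in range(n_steps):
            # epsilon-greedy: mostly exploit the best-known action
            if rng.random() < eps:
                a = rng.choice(("attack", "defend"))
            else:
                a = max(("attack", "defend"), key=lambda x: q[(s, x)])
            p = p_attack if a == "attack" else p_defend
            reward = 1.0 if rng.random() < p else 0.0
            q[(s, a)] += alpha * (reward - q[(s, a)])
            if reward == 0.0:
                cleared = False
                break
        if cleared:
            clears += 1
    return clears / episodes

rate = train_and_measure()
print(f"bot clear rate: {rate:.2f}")
```

In this framing, designers would tune stage parameters until the bot's clear rate lands in a target band, rather than relying on manual playtesting alone.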
Another AI-powered game bot is already in use in Netmarble games, where players can compete against the AI to practice and strengthen their characters’ items.
The main AI technology that Netmarble champions is emotion-guided, speech-driven facial animation, in which an AI analyzes a person’s voice, detects the emotion and matches that data to a character’s facial expression.
“Currently, we have developed the technology to go beyond just syncing the voice with the character’s mouth, so that the AI generates an aggressive expression after detecting a voice containing urgency,” said Oh In-soo, the head of Netmarble’s AI Center.
“If fast, sophisticated graphic rendering skills once determined game quality, we believe this AI technology will become a turning point that boosts game quality within the industry to another level. It will enable game characters to speak more and more naturally, like humans, and diversify their personas in a way that couldn’t be done before.”
According to Oh, voice data contains an array of information, such as language, tone, accent, emotion and gender. Netmarble is feeding the AI such data, which it learns to decipher into those categories to generate sophisticated mouth shapes and facial expressions.
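The pipeline Oh describes — voice features in, emotion label out, facial expression driven by that label — can be sketched as follows. The feature names, thresholds, emotion set and blendshape names here are illustrative assumptions standing in for a trained model; they are not Netmarble's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class VoiceFeatures:
    pitch_hz: float   # average fundamental frequency of the voice
    energy: float     # loudness, normalized to 0..1
    rate_wps: float   # speaking rate, words per second

def classify_emotion(f: VoiceFeatures) -> str:
    # Hand-written rules standing in for a learned emotion classifier.
    if f.energy > 0.7 and f.rate_wps > 3.0:
        return "urgent"
    if f.pitch_hz > 220 and f.energy > 0.5:
        return "happy"
    if f.energy < 0.3:
        return "calm"
    return "neutral"

# Each emotion maps to blendshape weights (0..1) that drive the face rig.
BLENDSHAPES = {
    "urgent":  {"brow_lower": 0.9, "eye_wide": 0.8, "jaw_open": 0.6},
    "happy":   {"mouth_smile": 0.9, "cheek_raise": 0.7},
    "calm":    {"eye_relax": 0.6},
    "neutral": {},
}

def drive_face(features: VoiceFeatures) -> dict:
    return BLENDSHAPES[classify_emotion(features)]

shout = VoiceFeatures(pitch_hz=180, energy=0.9, rate_wps=4.2)
print(classify_emotion(shout), drive_face(shout))
```

A production system would replace the rules with a model trained on labeled audio, but the mapping from detected emotion to facial-rig parameters follows the same shape.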
Netmarble also incorporated its voice AI software, dubbed Monica, short for mobile neural voice command assistant for mobile games, into its mobile game A3: Still Alive in 2020. When a player activates the game and orders Monica to start the game’s main quest, it obliges. Monica can also carry out simple requests, such as bringing up maps or calling up appropriate skills or items for a character.
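Once speech is transcribed to text (the recognition step is omitted here), routing a command to a game action can be as simple as keyword matching. A minimal sketch, with command phrases and action names that are illustrative assumptions rather than the actual A3: Still Alive interface:

```python
# Monica-style dispatcher: map a transcribed player utterance to a game
# action. Real systems would use intent classification; keyword matching
# is enough to show the shape of the mechanism.
def dispatch(command: str) -> str:
    command = command.lower()
    if "main quest" in command:
        return "start_main_quest"
    if "map" in command:
        return "open_map"
    if "skill" in command:
        return "use_skill"
    if "item" in command:
        return "use_item"
    return "unknown_command"

print(dispatch("Monica, start the main quest"))  # start_main_quest
```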
“Monica holds significance because we accomplished this on a mobile game, to interpret a player’s instructions and carry them out appropriately,” Oh said. “A day might soon arrive when a player may not have to touch the remote or keyboard at all, but only command the gameplay with their voice.”
Oh believes that AI goes beyond simply cutting down costs or human resources, truly enhancing the level of game entertainment and immersion for players.
“Game engines, when they first rolled out publicly, faced similar issues as well, but now they’re compulsory to create high-quality games with immersive plots reminiscent of a film,” Oh said. “The ultimate purpose of Netmarble’s AI Center is to create AI technologies that are critical to creating a high-quality game and its entertainment.”
BY LEE JAE-LIM [lee.jaelim@joongang.co.kr]
with the Korea JoongAng Daily