How human-like avatars animate online experiences

Tomorrow's online helpers could offer seamless and hyper-realistic interactions.

By Danny Bradbury, contributor

Imagine that you’re in a luxury dealership sizing up the vehicle of your dreams. You ask for help, only to find a professional racing driver standing beside you, answering your questions. “How does it drive?” you ask him. “Let’s find out,” he says, and beckons you as he hops in the passenger seat. The whole conversation is happening in a virtual reality environment, and he’s talking you through the experience as you venture out on the open road. Not just any road, though; you’ve driven into your favorite video game, and now you’re with him on the racing track.

According to some, experiences like these are not so far-fetched. Thanks to the rapidly evolving concept of virtual humans and the metaverse, they could be with us sooner than we think.

What is a virtual human?

Image of a virtual human courtesy of UneeQ.

Virtual humans are the next step in digitizing and automating customer interactions, explains Victor Yuen, chief metaverse officer at New Zealand-based digital human creator UneeQ. Powered by artificial intelligence (AI), the company's animated digital personas can hold conversations with visitors online.

“User interfaces were becoming increasingly digitized, but they didn’t have the emotional component,” Yuen explains. People want easy, fast ways to interact with service providers and vendors, while companies want to improve customer service and reduce costs.

“The most natural way for us to communicate and interact is through natural language and verbal communication,” he adds. The visual element—a face that you can talk to, rather than a disembodied chatbot—is also key.

UneeQ’s virtual agents are evolving rapidly. It has delivered them to clients such as healthcare organization Multaq and is also working with Dell Technologies on creating a customer service agent modeled after a popular sci-fi character.

Bridging the uncanny valley

Companies have been modeling digital humans for years now with varying degrees of success. The video games industry has led the charge with realistic 3D, computer-controlled non-player characters (NPCs). However, their behavior and dialogue are narrowly scripted, as are their body and facial movements. After a while, they repeat the same lines and the same stilted gestures.

That limited movement breaks the spell, explains Yuen, landing designers in unwelcome territory: an area called the uncanny valley. “This is where you’ve got somebody that looks really realistic, but they move in ways that are not realistic,” he explains. Interacting with digital humans that look real but move like robots leaves people feeling unsettled.

Moving past that uncanny valley means thinking beyond this scripted behavior to make digital humans more fluid.

“We’re trying to understand how human behavior works in terms of movement. We’re reconstructing that and simulating it in a rendering engine,” Yuen explains. UneeQ’s digital humans, which are currently head-and-shoulders representations, choose from a complex set of movements in real time based on multiple factors, including what is being said and when. Discussing a difficult topic or dealing with an angry visitor would cause a digital human to animate differently than if they were talking about something positive.

The digital human also uses machine learning to synthesize human speech. That enables it to say the most appropriate thing verbally in response to user input rather than sticking to a set of prerecorded phrases. “All these things create an animation that is not scripted and not repetitive,” says Yuen.

As virtual humans become more visually detailed and their speech more versatile, designers must contend with new challenges. Synthesizing lip movements to match speech is critical, Yuen explains. Even when people aren't looking directly at a speaker's lips, they remain aware of the movement; if it doesn't fit the speech, they'll sense that something is off. It's a difficult problem to solve, and the company is constantly improving.

Making way for the metaverse

Image of a virtual human courtesy of UneeQ.

Digital humans build on existing work in chatbots, explains Deloitte, which is an implementation partner for UneeQ. A digital human is built on four layers: a foundational chatbot layer handles natural language processing, a visual layer creates the visual and verbal experience, and separate hosting and integration layers run the digital human's software and connect it to the customer's online platform.
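For readers who think in code, the four-layer stack described above could be sketched roughly as a set of interfaces. This is a minimal, hypothetical sketch — the names and methods below are illustrative assumptions, not UneeQ's or Deloitte's actual APIs:

```typescript
// Hypothetical sketch of the four-layer digital human stack.
// All interface and method names here are illustrative, not a real API.

interface ChatbotLayer {
  // Foundational layer: natural language processing
  respond(userInput: string): string;
}

interface VisualLayer {
  // Creates the visual and verbal experience (face, voice, animation)
  speak(text: string): void;
}

interface HostingLayer {
  // Operates the digital human's software
  start(): void;
}

interface IntegrationLayer {
  // Connects the digital human to the customer's online platform
  embed(siteUrl: string): void;
}

// A digital human composes these layers: the chatbot layer produces
// the answer, and the visual layer voices and animates it.
class DigitalHuman {
  constructor(
    private chatbot: ChatbotLayer,
    private visual: VisualLayer,
  ) {}

  handle(userInput: string): string {
    const reply = this.chatbot.respond(userInput);
    this.visual.speak(reply);
    return reply;
  }
}
```

The separation matters in practice: the conversational "brain" can be improved or swapped out independently of the rendering front end, and the hosting and integration layers keep the whole thing deployable on an ordinary website.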

In most cases, these online platforms are websites. Digital humans might act as customer service agents in these environments, helping to put users at ease. The next part of their evolution will see them transfer into the metaverse, says Yuen.

The metaverse is an evolution of traditional online 3D environments like Second Life, which will be two decades old next year. Those environments are walled gardens, explains Yuen. You can’t move easily between them or take items with you. The metaverse will change that by making these 3D worlds interoperable. That’s how a vehicle from an auto maker’s online world could drive seamlessly into another company’s online video game.

From e-commerce to education and beyond

The metaverse offers some exciting possibilities for digital humans, says Guo Freeman, assistant professor in the School of Computing at Clemson University, who studies how humans and avatars interact in these online 3D environments.

“They will probably be useful in a human-AI collaboration context,” Freeman says. A digital human assistant could help a human visitor to navigate their new 3D environment and learn basic tasks. “Because the assistant is more like a human teammate, that probably would help build trust.”

Yuen also sees possibilities for digital humans in everything from commerce to education when operating in the metaverse. Digital sales or support agents could offer an even more immersive experience for customers in a 3D world. On the education side, he envisages digital recreations of historical figures that visitors can learn about through conversation.

“It would be like walking into a period role play like we do now in real life where there are people that act out those roles,” he says. “But now, you’re not acting. It’s a history lesson.”

Creating these experiences means reconstructing the personality and life story of the figure being portrayed, either by interviewing academics on the topic or, for more recent subjects, interviewing someone with a first-hand account. Yuen raises the possibility of virtually attending a Martin Luther King rally and talking to a digital representation of someone who was actually there.


These experiences could also improve commercial training outcomes. A study by the University of Canterbury tested the use of virtual humans to help teach leadership skills. Students in role-playing exercises with virtual human subordinates significantly improved their performance, the study found.

Making digital humans even more convincing

The race is now on to make these digital humans more convincing than ever, explains Yuen. UneeQ is already experimenting with full-body systems that animate these AI-powered characters in virtual environments. He gives the example of interacting with a character around a virtual kitchen island. Traditional video game characters often walk stiffly and unconvincingly around such obstacles, he says. It’s his job to make them act more naturally in these environments, pulling them out of the uncanny valley and making interactions with them more seamless and believable for human visitors.

As these digital humans become more lifelike, Yuen says the industry will have to address potential ethical issues. How much advice should a virtual human give to a real one? How extensive should their relationship be? Freeman's research has found that people feel uneasy interacting with hyper-realistic virtual humans when they can't tell clearly whether they're talking to an AI-powered character. How do we label participants in a metaverse environment clearly as either virtual or real humans?

We are already seeing institutions like the University of Granada’s Mind, Brain and Behaviour Research Centre exploring the ethical issues surrounding virtual humans. “There’s still a long way to go with those discussions,” concludes Yuen. Having these conversations now will pave the way for a harmonious future in which digital humans and real ones interact positively in the metaverse and beyond.

Lead image courtesy of UneeQ

Want to read more stories like this? Check out Will the metaverse meet the megahype?