How Facebook is building safety and empathy into AR/VR avatars and shared spaces

Avatars for VR will get more realistic in the future.

Image Credit: Facebook/VentureBeat

Facebook has made both augmented reality and virtual reality experiences that reach a lot of people — more than a billion users in the case of its Spark AR smartphone apps. But the realism of the avatars and the perceived safety of those experiences still need improvement.

So the company showcased what it’s doing on those fronts to make VR and AR avatars more expressive and shared social environments safer. Those efforts followed the day-two theme of “responsible innovation” at Facebook’s F8 conference in San Jose, California.

Ronald Mallet, a researcher at Facebook Reality Labs, said during the keynote speech that the company is using the relatively simple sensors on its Oculus VR headsets to capture users’ facial expressions and translate them into movements for their avatars, the virtual faces that others see.

In videos, Mallet showed how Facebook models virtual skeletons of people to capture their movements, layering muscle detection on top to capture nonverbal cues, such as the victory dances people do when they score in a sports game.

“Motion tells us about engagement, agreement or empathy,” he said. “Muscle movement is key to intention.”

Above: Lindsay Young demos watching movies together, remotely, in VR.

Image Credit: Facebook/VentureBeat

The sensors in today’s headsets aren’t accurate enough to capture those movements, but Mallet is looking at technologies that could be used in future products. Those sensors should not be as burdensome as today’s motion-capture rigs, which weigh mocap actors down. Today, such mocap procedures can occupy a team of hardware specialists and artists for months.

“Nothing is scalable,” he said.

Eventually, he said, the tech needs to become automated and work with off-the-shelf sensors.

On top of that, the tech has to be secure and authenticated, via things like fingerprint and facial recognition, so that other people can’t steal your virtual avatar and do bad things with it. And it needs to work with a variety of body types, genders, skin tones, and hair styles.

That’s something VR has in common with AR technologies like the Portal social camera. Facebook had to work on Portal’s ability to automatically focus on people in a room using artificial intelligence, and it had to train the camera to recognize all kinds of people so it could focus on them as they moved or spoke. That way, it can be used for things like celebrating birthdays even when loved ones are traveling.

Above: Playing soccer in VR.

Image Credit: Facebook/VentureBeat

Lindsay Young, Oculus VR product manager, said that “VR is the next frontier of human interaction.” She said she uses the Facebook social VR app to watch a TV show with her mother, who lives on the other side of the country.

“Anyone who has this can tell you it is transformative,” she said.

But to shift the market from early adopters to everyone, “safety and integrity have to make VR inclusive for everyone,” she said.

As people gather in social VR, they should feel safe and be able to count on basic etiquette, she said.

From Facebook’s experience with its Venues, Spaces, and Rooms VR apps, the company has learned how to improve some things. In Harmonix’s new Dance Central VR app, for instance, users can make a “two thumbs down” gesture to another avatar who is being disrespectful, and that avatar will disappear from the social space. If that person starts behaving, you can reverse that with a “two thumbs up” gesture.
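The reversible thumbs-down mechanic can be pictured as per-viewer visibility state. The sketch below is purely illustrative; the class and method names are invented for this article and are not Facebook’s or Harmonix’s actual API.

```python
# Hypothetical sketch of gesture-based moderation: a "two thumbs down"
# hides a disruptive avatar from the viewer, and "two thumbs up" reverses it.
# All names here are illustrative, not a real Facebook/Oculus API.

class SocialSession:
    def __init__(self):
        # Maps each viewer to the set of users that viewer has hidden.
        self.hidden = {}

    def thumbs_down(self, viewer, target):
        """A 'two thumbs down' gesture hides the target from the viewer."""
        self.hidden.setdefault(viewer, set()).add(target)

    def thumbs_up(self, viewer, target):
        """A 'two thumbs up' gesture restores the target's visibility."""
        self.hidden.get(viewer, set()).discard(target)

    def is_visible(self, viewer, target):
        """True if the viewer has not hidden the target."""
        return target not in self.hidden.get(viewer, set())


session = SocialSession()
session.thumbs_down("lindsay", "troll")
print(session.is_visible("lindsay", "troll"))  # False: hidden after thumbs down
session.thumbs_up("lindsay", "troll")
print(session.is_visible("lindsay", "troll"))  # True: visible again
```

Note that visibility is per viewer: hiding an avatar affects only what you see, not what the rest of the room sees.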

There is also a social “bubble” that forms around each avatar so another avatar can’t invade someone’s personal space. If one avatar does enter another’s bubble, the two may become invisible to each other. You can “pause” or “mute” someone, or even block or report them.
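At its core, the bubble amounts to a proximity check: if two avatars are closer than some radius, they are hidden from one another. The snippet below is a minimal sketch of that idea, assuming a simple Euclidean distance test and a made-up radius value; it is not Facebook’s actual implementation.

```python
import math

# Illustrative personal-space radius in meters; the real value is unknown.
BUBBLE_RADIUS = 1.0

def distance(a, b):
    """Euclidean distance between two 3D positions given as (x, y, z) tuples."""
    return math.sqrt(sum((ax - bx) ** 2 for ax, bx in zip(a, b)))

def mutually_visible(pos_a, pos_b, radius=BUBBLE_RADIUS):
    """Avatars inside each other's bubble are hidden from one another."""
    return distance(pos_a, pos_b) > radius


print(mutually_visible((0, 0, 0), (2, 0, 0)))    # True: outside the bubble
print(mutually_visible((0, 0, 0), (0.5, 0, 0)))  # False: too close, hidden
```

A per-frame check like this would run symmetrically for every nearby pair, so neither avatar can see the other while the bubble is breached.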

“These apps are a test bed for safety tools,” Young said. “Our best practices will grow at the same pace the technology evolves.”