Above: Rick Thompson (left) and Sunny Dhillon are bullish on VR.

Image Credit: Dean Takahashi

Thompson: It’s like that little ditty from Avenue Q. “The internet is for porn.” It got a lot of play in the ‘90s.

Dhillon: When people were getting online for the first time.

Thompson: “VR is for porn” is going to be the next one.

Dhillon: Yeah, but it’s not a space that — I think a lot of the headsets also have walled gardens. Part of owning the hardware means owning the software store as well, the distribution store. Iribe and the Oculus guys have said that they’re not going to be closed to adult content. Samsung has closed it off. It’s very family-friendly, with their Milk VR store. Vive is also more open.

GamesBeat: It looks like this is an opportunity for some alternative store just like with Google in China and all that. The different categories within VR, do you see anything taking off earlier than either games or medical and education and whatever? Or is it all taking off at once?

Dhillon: There are huge enterprise applications, huge medical applications, knowledge management applications. Without getting into specifics, if we believe this is going to become the next major computing platform — let’s say with augmented reality taking the lion’s share, if you go off Digi-Capital’s market size numbers — then every business that currently exists to facilitate our daily lives will have a counterpart here, and new ones will emerge that are native to this platform.

Let’s talk about VR ad networks for a second. We haven’t made an investment there, nor do we have an intent in the near term to invest in that space, though it’s raised significant venture funding. Immersive, Vertebrae, one or two others. When people pitched it to me, like Immersive, they said, “Oh, it’s one of the founders of Tapjoy that’s doing this. They obviously get ad tech.” I said, “Yeah, they get mobile ad tech.”

When it comes to what defines attention, what are the metrics that matter, what is the defensible tech — if anything — within a 360 video versus a volumetric light-field room-scale experience versus a console cone-of-vision, limited-range-of-motion experience, that kind of stuff has yet to be defined. It’s exciting, but I don’t necessarily think that previous platform experience in a business like an ad network necessarily puts someone in a better position.

Above: Otoy’s OctaneRender, a complex Nvidia CUDA application, automatically runs on an AMD FirePro GPU (W9100) without changing a single line of code.

Image Credit: Otoy

Thompson: An easy example is print and television advertising. It did absolutely nothing to help someone in online advertising. A lot of executives from the ad agencies got hired at big salaries in the early days of internet advertising, but those guys were left at the door. They did not add any value. Which was interesting, but it also points to how these new green fields give fresh starts to people.

Dhillon: Lifeboats for a lot of people.

Thompson: Yeah, and an even playing field. But the problem is, these guys in mobile may not have any advantage over somebody just coming out of school, or just thinking about it fresh, someone spending time with their headset.

Dhillon: Let’s think about filmmaking. When you think about the Spielbergs, the Lucases, the real visionary filmmakers of our time, and then you look at VR as an emerging medium, Spielberg has gone public several times saying it’s a confusing and scary medium. He’s an advisor to a couple of VR companies that he’s helping promote content through. What’s to say that young kids out of USC’s cinematic and writing programs are not going to be the ones to create that amazing first-time “my God you have to see it” experience?

Right now, the dollars to fund that kind of content are coming from brands, for the most part — experiential marketing dollars. If there’s a movie, there will be some P&A advertising budget that will be used to finance a VR experience. The new talent that’s unencumbered by prior platform biases and norms and stereotypes — I’m really excited to see what they’ll be doing. That’s the new crop of creators, to build on what Rick was saying.

The same goes for games as much as anything else. But experience when it comes to building games is different from experience for doing a five-minute short for 360 video.

GamesBeat: I remembered something Oculus’ Jason Rubin said, that you have to fail a few times at a VR project before you might figure out what the real thing you need to build is.

Thompson: That’s a great thing to tell investors! [laughter]

Above: In Lone Echo, I had to rescue my commander, who had caught her foot on a railing.

Image Credit: Ready At Dawn Studios/Oculus

GamesBeat: Part of what was interesting — I’ve seen some of the learnings side by side. One game used analog sticks to maneuver around in zero gravity. It was good, but it made me seasick.

Dhillon: I got sick on that one too.

GamesBeat: I thought about why I got sick. At Oculus Connect, Ready At Dawn had a game called Lone Echo. You’re using Touch and pulling yourself forward on things you can grab in space. You propel yourself, instead of just spinning all over. I didn’t get sick at all. It was a great experience — like playing through Ender’s Game zero-gravity training. And it was exactly the same kind of game, just with a slightly different way to control things. I thought that was interesting, relating back to what Jason said. You can learn how to make something good here. It won’t be the same rules you had to follow the last time, but you can find them.

Dhillon: When you think of what input is, and the immersion that VR can create — when it comes to pupil tracking, bringing your hands into something, haptics, the sense of touch for example, it all adds — I think my first go-to genre is always horror games. I loved growing up playing Resident Evil and Silent Hill. I love horror movies.

Imagine you’re in a haunted house in a first-person kind of experience, like that game F.E.A.R. I remember playing that in college with the lights completely off, with the subwoofer turned up. You’d hear the pitter-patter in the surround sound of that The Ring-type girl all contorted running around behind you. She’s not there when you turn around. Then she’s there right in front of you, and it’s terrifying.

Imagine this in VR now, when she knows where you’re looking through pupil tracking. She’ll react differently when she knows — you can’t ignore her. If you’re looking at her, she knows it. Imagine a sense of haptics, so that you’re feeling minute differences of pressure in your hand, for example. You’re holding something, getting ready to cast a spell, whatever. You can feel hot and cold. We’re seeing startups dabble in this sort of stuff. You can imagine, in a full-on haptic experience — it’s not out of the realm of possibility that this could happen on a five-to-ten-year horizon.

In the near term you’ll have hacked-together solutions that will be for purists and hardcore fans that will add to the level of immersion. But it really is a new input mechanism — using your eyes to control your character, using gestures to create a spell in mid-air. This hasn’t been done before.

Thompson: This all sounds very expensive. We’re talking about $100 million for a console game now, but we’ll be talking about billion-dollar developers eventually.