GamesBeat: I’ve talked to some people who were wondering if Magic Leap will spend a lot of money on developers to get things going, to commission whatever projects, the same way Facebook has poured so much money into Oculus content. Have you figured out what kind of content strategy you might have in the context of that sort of thing, to get people to your platform?
Laidacker: There was the indie fund that was announced last week. That was something that came out of what myself and a few others have been saying for the last year or so. All these other companies have amazing funding to get indie devs building on their platform. To me, indies are the ones who really focus on innovation. Triple-A is where they can iterate on that innovation and bring it to the masses. That’s why it was so important, and I’m happy that we’ve announced it. We’ll be giving hardware, funding, marketing, engineering and design support, the whole caboodle.
At the same time, we want to be mindful with the creators that will get chosen. If I think about the last two years, our early access partner pool is very diverse as far as types of content. We want to continue to spread our content across the different types of initiatives we’re doing. Hopefully we’ll reach out to indies across a number of different spectrums.
GamesBeat: The other thing developers seem to ask for–what’s the road map to the consumer market?
Laidacker: During the keynote we saw, at a very high level, what’s coming. When I’ve talked about this with other developers, because I worked on the session side of things–there was a lot of information shared during the con. Some of it might have fallen through the cracks. For example, what was talked about in the keynote and one of our really large features is persistence and our passable world technology.
What that’s starting to enable for developers is, one, thinking about how content can persist over multiple sessions. Create just came out with an update last week so that now, when people play Create and restart a session, they can build on that and have content that persists in the world around them. But what that technology also enables, and what’s coming online with the passable world tech, is being able to have shared experiences. That’s very important for developers.
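The persistence model described above can be sketched in miniature: content is stored against a stable spatial anchor ID so that a restarted session can rebuild the scene in place. This is an illustrative sketch only; the anchor IDs, storage format, and function names are assumptions, not the actual Magic Leap SDK.

```python
import json
import os
import tempfile

def save_session(path, placed_objects):
    """Persist placed content. Each entry records a (hypothetical)
    spatial anchor ID plus what to rebuild and where, relative to it."""
    with open(path, "w") as f:
        json.dump(placed_objects, f)

def restore_session(path):
    """On restart, reload the saved content so it can be re-bound
    to its anchors and appear where the user left it."""
    if not os.path.exists(path):
        return []  # first session: nothing to restore
    with open(path) as f:
        return json.load(f)

path = os.path.join(tempfile.gettempdir(), "create_session.json")
save_session(path, [
    {"anchor_id": "pcf-1234", "prefab": "blocks", "offset": [0.0, 0.1, 0.0]},
])
restored = restore_session(path)
print(restored[0]["anchor_id"])  # pcf-1234
```

The key design point is that objects are stored relative to anchors rather than in raw device coordinates, which is what lets the same content reappear in the same physical spot across sessions.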
Whenever we talk about mixed reality, we always talk about how the real magic of why people do MR is to experience it in the real world, but also with people around them. Everything coming online with passable world and persistence enables that–I don’t want to mix anything up as far as what quarter it’s coming out. All of this is also working to inform our object recognition technology as well, which I know is something that’s extremely important for developers.
Why it’s important is because object recognition starts to provide more semantic recognition about the real world around us. There are short-term things we’re working on that weren’t in the keynote, but were announced in the sessions. For example, there’s Environment Toolkit. That’s the first step of what will lie on top of object recognition. It’s starting to provide semantic info about–instead of just saying, “This is a plane and this is mesh,” it’s actually saying, “This is a table. These are seating locations. Here are hiding locations. Here’s a corner of a room.”
Instead of focusing on placement of content, developers can use that contextual information to create content that feels way more intelligent and alive. That’s a stepping stone toward the object recognition that’s coming online after, which was talked about in the road map. Voice commands, natural language processing–that’s something we also talked about in the experimental inputs talk, where they dove into the use cases of what we’re doing with that. We’re doing a lot of experimentation and debugging on our side, and that’s an API that will come online as well.
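The idea of semantic labels driving placement, as described for the Environment Toolkit, can be sketched as follows. All names here are illustrative assumptions, not the actual toolkit API: the point is that a developer queries labeled surfaces ("table", "seat") instead of generic planes and meshes.

```python
from dataclasses import dataclass

@dataclass
class SemanticSurface:
    label: str       # hypothetical label, e.g. "table", "seat", "wall_corner"
    position: tuple  # (x, y, z) in world space

def pick_spawn_point(surfaces, preferred=("table", "seat", "floor")):
    """Choose where to spawn content, preferring semantically labeled
    surfaces over falling back to raw plane/mesh placement."""
    by_label = {}
    for s in surfaces:
        by_label.setdefault(s.label, []).append(s)
    for label in preferred:
        if label in by_label:
            return by_label[label][0]
    return None  # no semantic match: caller falls back to generic geometry

room = [
    SemanticSurface("wall_corner", (0.0, 1.0, 3.0)),
    SemanticSurface("seat", (1.0, 0.5, 2.0)),
    SemanticSurface("table", (2.0, 0.8, 1.0)),
]
print(pick_spawn_point(room).label)  # "table" outranks "seat" in the preference order
```

With this kind of query, a character can sit on an actual seat or hide behind an actual corner, rather than being dropped onto whichever plane the user happened to tap.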
Right now there’s a lot of work. I know it was not all talked about in depth in the keynote. But within the sessions, pretty much everything that was mentioned on that road map, we did a one-hour deep dive on exactly what’s going to be coming for developers.
GamesBeat: Oculus now has this low-middle-high set of developer targets. They have Rift, Quest, and Go, and even the phones. Do you have a feel yet for what your high is? There’s a middle and a low in AR, these hundred-dollar AR experiences or smartphone experiences, but you guys aren’t doing that. If someone comes to you with an idea, do you have a sense as to whether that’s a Magic Leap idea, as opposed to someone else’s AR platform?
Laidacker: It’s hard to say as far as comparing different types of AR content. Definitely, on the VR side–there’s a difference between the VR and MR sides of things. With the different AR platforms today it depends on the type of content you want to create. Do you need to be hands-free? Do you want to have something where you can walk around the world and be able to touch things and pick up things? If that’s the case, an HMD is the way you want to go.
If that isn’t as important, or having as much depth information about the environment around you, yes, there are more lightweight AR capabilities. But I know that for us, for Magic Leap, we really want to be pushing the full spectrum of having the environmental information, having access to your hands, focusing on the HMD side of things. But as far as different types of Magic Leap platforms, that I can’t comment on.
GamesBeat: Is this a real 6DOF experience now, do you think, or is it some kind of subset?
Laidacker: As far as the hand interactions–this is feedback we see from developers, and even internally. It’s the beginning days. Right now, yes, we have Totem, which is full 6DOF, and does enable really novel interactions. That’s one example that I really love, fully using the control to be able to control flying things in the air. That’s much more novel and intuitive, because with the full 6DOF–with gestures, I think what people are really interested in is not necessarily just the eight exact gestures we’re supporting today. But developers are excited that we have keypoint tracking. Developers can use that keypoint tracking for whatever is specific to their application.
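The point about building app-specific gestures on top of keypoint tracking can be sketched like this. The keypoint names, coordinate units, and threshold below are assumptions for illustration, not the Magic Leap SDK: a developer reads per-finger keypoints each frame and derives whatever gesture their application needs, here a pinch.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points (meters, by assumption)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_pinch(keypoints, threshold=0.02):
    """Treat thumb tip and index tip closer than ~2 cm as a pinch.
    `keypoints` maps hypothetical keypoint names to world positions."""
    return distance(keypoints["thumb_tip"], keypoints["index_tip"]) < threshold

frame = {
    "thumb_tip": (0.100, 0.200, 0.300),
    "index_tip": (0.105, 0.205, 0.300),  # fingertips ~7 mm apart
}
print(is_pinch(frame))  # True
```

Because the gesture is computed from raw keypoints rather than chosen from a fixed set, each application can tune thresholds and define interactions specific to its own content.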
In terms of where we know we need to go in the future–we want to be able to do gestures that are outside of that tracking zone, using gestures that are natural, instead of having my hands in front. Though we are providing developers with a lot of best practices about how to do more natural things within the field of view. Haptic feedback is also a big one that we’re looking into for future forms of technology.
There are definitely things we’ve worked on–how do we trick people, almost, into thinking that interacting with something digital has that feedback? Thinking about grounding your digital content to something that’s physical. Maybe having a digital button on a table. That feels much more tactile when you press it, compared to poking into the void and wondering if you really interacted with something. We’re doing research there, at least in terms of sharing best practices. But we know that haptic feedback is something we really want to focus on for the future.
GamesBeat: After the convention, when you’re going around to visit people now, do you have a very different agenda?
Laidacker: We saw a bit about this in the keynote, this focus on the Magicverse and what that is. Our focus is twofold. We still want to work a lot with our developers to see what features, APIs, and technologies we need to provide to them to create compelling experiences for indoors, for living room use, for offices and all of that. It comes down to that semantic information. But at the same time, we’re also starting to ask–what does it mean to have these experiences out in the real world?
When I think of it from a use case perspective–think about being at home. What do I like to do? I like to relax. I like to play with my kids. I like to cook. A lot of those experiences are centered around the types of interactions I do in my home. But when I think about going out into the real world, the use cases are very different. I’m often just trying to get from point A to point B. I’m shopping for groceries. I’m hailing a cab. I’m flying somewhere.
We’re starting to look at those use cases, still taking the same philosophy that got Magic Leap One out the door: use cases inform features, features inform systems, systems inform technologies. Especially for me, coming from a video game background where my whole focus was open world AI systems, it makes me excited to see the similarities in the technologies that are going to transcend how we bring experiences to the real world around us. That’s why so much of the work we’re doing now with passable world, persistence, and shared experiences is key to how we’ll be able to bring those experiences to the real world.