Adobe has a long history of pioneering tools and workflows that professionals and creatives use to build for digital media. The rise of virtual reality has led Adobe to build the platforms creators need to transition into the world of 3D spatial computing. I’ve interviewed four innovators from Adobe Research and Adobe’s Design Lab who are building the company’s VR technologies, to get their insights on the future of VR and 3D immersive technologies.
3D experiences can be built on 2D devices
Experts in fields as diverse as architecture and game design often agree that building 3D experiences on 2D screens is a challenge. But is the problem that 2D screens are fundamentally insufficient for creating 3D experiences? Or is it that the user experience for these workflows has never been perfected?
Patrick Palmer is a Senior Product Manager for Premiere Pro CC at Adobe. He says that “the focus has shifted from dimensionality to creating an immersive and interactive feeling that consumers can experience in everyday applications on any device.”
Palmer points to Adobe’s Project Clover, now fully integrated into the new immersive environment in Premiere Pro CC, as an example of bringing high-end VR to the masses. The immersive environment “makes the experience a lot less tedious as designers switch between 2D and headset editing,” he says. “It also improves the VR editing workflow by allowing users to access familiar Premiere tools within VR, with the ability to perform edits using an interface optimized for motion controllers. Ultimately, such tools can make 2D and 3D content that co-exists naturally.”
Audio makes immersive experiences complete
Without proper audio design, VR experiences cannot truly be immersive. However, it is a huge challenge to design sound that matches the screen in a virtual environment. Yaniv De Ridder, Senior Experience Developer at Adobe, explains that “audio is critical because sound needs to do more than enhance the mood or fill out the visual experience. Sound cues help orient users, so they know where they are in the virtual space and where they should be looking.”
Yet while VR environments succeed in placing audio effects around a user, they fail to respond to the viewer’s movements or head rotations, De Ridder says: “If audio and video are not synced together, you lose the perception of immersion. For example: If I speak while facing you in a virtual world, but you turn your head away to the right, you want to hear the sound of my voice coming from your left ear. Hence, pairing audio and video together will make an immersive experience complete.”
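The head-tracking behavior De Ridder describes can be sketched in a few lines. The following is a minimal, hypothetical illustration (not Adobe's implementation): it computes a sound source's direction relative to the listener's head yaw and maps it to a stereo pan value, so that turning your head to the right shifts a voice in front of you toward your left ear.

```python
import math

def head_relative_pan(source_azimuth_deg: float, head_yaw_deg: float) -> float:
    """Map a sound source's world-space azimuth to a stereo pan value.

    Angles are in degrees, measured clockwise from the listener's
    initial forward direction. Returns a pan in [-1.0, 1.0]:
    -1.0 is fully left, 0.0 is centered, +1.0 is fully right.
    """
    # Direction of the source relative to where the head now faces,
    # wrapped into (-180, 180].
    relative = (source_azimuth_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    # Simple sine panning law: sources beside the head pan hardest.
    return math.sin(math.radians(relative))

# A voice directly ahead, with the head facing forward: centered.
centered = head_relative_pan(0.0, 0.0)    # 0.0

# Same voice after the listener turns 90 degrees to the right:
# the source is now to the listener's left, so it pans fully left.
turned = head_relative_pan(0.0, 90.0)     # -1.0
```

A real spatial-audio pipeline would use HRTF filtering rather than a simple pan law, but the core idea is the same: the renderer must re-derive each source's direction from the listener's current head orientation on every frame, or the illusion of immersion breaks.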
VR can be the future of film
VR may well be the future of film, with companies opening VR-only movie theatres and blockbuster productions in the making. Yet a significant challenge in shooting 360 video is that when scene objects come near the viewer's vantage point, viewers cannot move their heads to peer around them — an uncomfortable experience. Even footage from high-end VR cameras with 3DOF playback can feel flat and lack immersion.
This is a challenge that Adobe is tackling with Project Sidewinder. Stephen DiVerdi is a Senior Research Scientist at Adobe. He explains that Project Sidewinder is developing “a technology aimed at bridging the gap between the high-end consumer and the super high-end professional grades of VR filmmaking, by enabling a higher fidelity at a lower price point.”
“VR is experiencing an explosion right now, deservedly so, but there’s still lots of improvement to be made,” he continues. “Super high-fidelity rigs promise full light-fields with 6DOF playback, but at a high cost, high technical complexity, and only for the most exclusive users and producers. We want to open up access to more people because, ultimately, today’s luxury capture and playback technology is tomorrow’s consumer experience.”
Immersive technology will take us beyond the screen
The past several decades of computing have been defined by screens. We’ve used 2D screens on computers, tablets, smartphones, and smartwatches to interact with the world and our friends with gestures like clicking, pinching, swiping, and scrolling. However, the best practices for UI and UX design for immersive experiences are still being defined. How do we interact with computers when the interface becomes the physical 3D space in which we live and move around?
Silka Miesnieks, head of Adobe’s Design Lab, is bringing immersive spatial computing to the forefront for creators. “Virtual and immersive technologies are pushing past the boundaries of physical dimensions,” Miesnieks says. “We are moving from the era of the screen to spatial design where we can embed digital experiences everywhere around us.”
“This will require more intricate interactions, new content and design tools to come to fruition,” Miesnieks continues. “Fully immersive worlds will need more natural interaction with objects controlled by gestures and voice commands, unlike 2D screens that interact with keyboards or remotes. However, if successful, we will design sensory-laden experiences which are much more emotive than anything we’ve seen so far in the VR market.”
Michael Park is the CEO and founder of PostAR, a platform that lets you build, explore, and share augmented realities. This article was created in collaboration with UX Designers Kellie Liang and Katerina Klein.