As a technology executive since the early ’90s, I’ve enjoyed a front-row seat to the digital revolution. I’ve observed the meteoric rise of the Internet, broadband, social media, and mobile, and I’ve watched consumer adoption of each soar from 9 percent to 90 percent. Today, we’re on the cusp of a similar inflection point for wearable technology and augmented reality.
We all love smartphones. But it’s not realistic to assume the phones we now carry around in our pockets represent the final form factor for mobile.
In fact, the phone is about to explode. It will evolve into many different pieces, with wearable devices for many different parts of the body, including your wrist, ears and eyes. A couple of years ago, an episode of Futurama joked that the “eyePhone” would replace the iPhone. As we know now, that transition is already underway, as devices such as Google Glass and Oculus Rift demonstrate.
Yes, the devices are different. But they’re also quite closely related. And if you view Glass and Rift as steps along the same continuum, you start to see a very clear picture of the wearable-computing future.
Glass is a high-profile example of a new class of devices called smart glasses. But it’s not the be-all and end-all of smart glasses. It’s just the beginning. We will see many competitors in the sector that copy Glass, improve on it, and reimagine it. And in the years ahead, they’ll usher in a completely original class of application design and user interface that seamlessly integrates the physical and virtual worlds to create a sort of “sixth sense.”
Now let’s place the Oculus Rift on that same continuum. Today, Rift is all about virtual reality and immersing the user in a different world. The headset will not just be tethered to a PC or console but will also support mobile devices. It seems likely that Rift will be the first device to truly deliver on the mind-blowing power and potential of virtual reality. But it is when you integrate the “real world” experience with the “digital world” of computing power, data, and boundless knowledge that something magical will happen.
It’s not hard to envision a future iteration of Rift that displays your actual reality, augmented with a digital overlay on your physical surroundings. This next-generation Rift could go far beyond gaming and dramatically change the way you communicate, learn and engage with others. In fact, thanks to the Facebook deal, one could argue that Oculus Rift is now in a better position than Google to execute on the potential of smart glasses.
Facebook profoundly changed the face of the digital universe once before, and it seems likely to do it again with Rift. Think back to the early days of the Web. It was a mysterious, anonymous place. Indeed, a New Yorker cartoon famously joked that on the Internet nobody knows you’re a dog. But Facebook popularized and ushered in a new era in which our real identities extended online; billions of users now have an identity that spans overlapping online and real-world contexts. With Rift, Facebook will further integrate the real and the digital worlds, thus ushering in the next phase of human-computer interaction.
Many observers are scratching their heads over Facebook’s acquisition of Oculus VR. To me, it makes perfect sense (price notwithstanding). In a recent post, CEO Mark Zuckerberg made it clear that Facebook wants Oculus to become a platform for experiences other than gaming. He described Oculus as a “new communication platform,” asserting that “immersive, augmented reality will become a part of daily life for billions of people.”
I agree. This acquisition has less to do with virtual gaming and more to do with what’s coming next: the ability to perceive and interact with the world and with people in a completely new way.
It sounds dramatic. Futuristic, even. But at the core of this development is the same sensor technology that already exists in many mobile devices. And these sensors already permeate new wearable devices and will only get better over time. These sensors do two important jobs. First, they enable devices to know something about themselves, such as their orientation and location. More important, sensors enable devices to know profound things about their users.
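To make the first of those two jobs concrete, here is a minimal sketch of how a device can estimate its own orientation from a single raw accelerometer reading: when the device is at rest, the sensor measures only gravity, and simple trigonometry recovers pitch and roll. This is an illustration of the general principle, not any vendor’s API; the `orientation_from_accel` helper and its axis conventions are assumptions.

```python
import math

def orientation_from_accel(ax, ay, az):
    """Estimate pitch and roll (in degrees) from raw accelerometer
    axes, assuming the device is at rest so the only measured
    acceleration is gravity (axis convention assumed: z points out
    of the screen when the device lies flat)."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat on a table: gravity is entirely on the z-axis,
# so pitch and roll both come out near zero.
print(orientation_from_accel(0.0, 0.0, 9.81))

# Device tilted upright 90 degrees about its y-axis: gravity now
# registers on the x-axis, so the estimated pitch is 90 degrees.
print(orientation_from_accel(-9.81, 0.0, 0.0))
```

Location works analogously, fused from GPS, Wi-Fi, and cell signals; the point is that the raw material for a device “knowing something about itself” is already commodity hardware.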
Imagine a future in which anyone or anything in your physical world is clickable, where anyone or anything can be linked with information related to it. By looking through your smart glasses, you can learn much more. You can sit in a comfortable sofa at a friend’s house, and by touching the cushion or looking at it a certain way, you can call up information about the sofa: who made it, where your friend got it, what other fabrics it comes in. You can capture a 3D model of the sofa and place it virtually in your living room to see how you like it.
Now imagine you have this sort of connection to all the things around you. Imagine the physical spaces in which you live and work are filled with hyperlinks that make everything clickable — this bottle, that chair, those people sitting over there in the restaurant. This will be achieved with a pair of smart glasses. It may not happen next year — or five years from now. But it will certainly be here within a decade.
And it will be amazing. The wearable-computing future will give you a sixth sense that enhances your other senses. With it, the physical and digital will blend. Everything around you will be more connected and more telling — more infused with meaning — transforming your ability to learn, to explore, and to share. To see.
This brings me back to Google Glass and Oculus Rift and how they are closely related. Whether they realize it or not, the two are evolving in a similar direction. In fact, they are on a collision course for the ultimate prize: the birth of smart glasses that change our perception of the world and forever alter the way we live in it.
Shawn Hardin is the cofounder and CEO of Mind Pirate, a tech startup delivering Callisto, an application and cloud platform for the development and distribution of wearable-computing apps. The company is focused on making it as easy as possible for OEMs and developers to deploy great apps across, and take full advantage of, a range of wearable devices. You can follow Hardin on Twitter @shawnhardin.