Netflix recently released a notable foray into the world of science fiction with Altered Carbon, a series adapted from a novel by Richard K. Morgan. In the show’s vision of the future, humanity has learned how to store and transmit consciousness itself. The dystopian cyberpunk series follows characters as they jump from body to body (called “sleeves”), unraveling the predictably horrid behavior of our now-interstellar society’s uber-rich aristocracy, who can afford to be, for all intents and purposes, immortal and who push boundaries in search of entertainment. I thoroughly enjoyed the show and am happy that Netflix is continuing to invest in great science fiction.
What is particularly interesting is the show’s depiction of a future where augmented reality has become the ubiquitous interface to the digital world. We have a first-hand perspective on this given Upskill’s focus on delivering AR to the largest companies on the planet, so you can imagine how I jump out of my seat when I see fiction portray the future of such technology — and actually manage to do it well.
Devices of the future
The storyline takes place hundreds of years in the future, so it’s natural that there will be healthy creative license taken. But the AR technology you see on display in Altered Carbon is much closer to today’s reality than many would think. Characters are depicted inserting and wearing a contact lens-like device (called an ONI) when they wake up that presumably projects the user interface right onto their retinas. We’re still wearing smart glasses today, but the early stages of technology aiming to provide that very kind of functionality are in the works as I write this.
User input control is another area where the team behind Altered Carbon seems to have done their homework. The entire interface looks to be controlled via a bracelet containing some kind of touch controller, which the characters can combine with voice commands to perform a variety of functions. I don’t need to remind any wearer of an Apple Watch that this kind of wrist-worn touch controller exists today, and our very own enterprise-focused software platform provides the same kind of “Call Kristin” command functionality we see protagonist Takeshi Kovacs invoke on-screen.
What’s more, if you keep your eyes peeled, you’ll see that there are other types of AR devices in use as well. In one episode, while a character is under the knife in a top-of-the-line surgical suite of the future (no spoilers, I promise), astute viewers will notice the surgeon wearing a pair of Recon Jet Pro smart glasses. That’s not a prop; it’s an actual pair of smart glasses that you can go on Amazon and purchase today. This is cool for many reasons, but it’s particularly interesting because it illustrates a notion we encounter every day delivering wearable technology to the workforce: different scenarios and uses call for different device form factors. Perhaps in this fictional world that doctor can’t use an ONI system for some reason, or maybe it doesn’t have the processing power to handle the data display he needs for the surgical procedure. We found this type of variability early on in our work in the enterprise, and it’s exactly why our software platform supports a multitude of devices with different form factors, each fit for purpose for different types of jobs.
The uses of augmented reality
In addition to how the show’s producers depict this technology working, it’s also important to observe what they have the characters using it to do. In nearly all cases, the most compelling uses for AR revolve around getting something done or increasing convenience. You see examples of the tech used for video calling, object recognition, monitoring, and general data access. These are exactly the types of use cases where we see the technology thriving today in the enterprise.
In one scene, a character uses their ONI to place a “see what I see” call and have a support team help with their mission. That’s precisely how AR tech is used today – with remote experts looking through live point-of-view streaming video from smart glasses to help on-site techs perform maintenance or repair work. This can save companies money, as they no longer need to fly experts all over the world.
In another scene, a user is seen navigating through various dossiers and videos trying to identify a suspect. With the right platform, reference material and rich media of any kind, from any type of database, can be retrieved and reviewed on a variety of smart glasses devices. Do you want to use smart glasses to monitor critical Internet of Things sensor data the same way that future surgeon was monitoring the patient during surgery? That’s already possible; check out what GE is doing.
While we may be a (little) way off from having the technology to create the kind of direct retinal display contact lenses seen in the show, the capabilities are all here with devices you can wear today – and that’s incredibly exciting. It’s what makes our team jump out of bed every morning eager to tackle the next set of challenges.
Art imitating life imitating art
I’ve been inspired by science fiction since I was a little kid. It has shaped my career in more fundamental ways than I can describe. Dreaming about what could be has led me to what I do today, and it continues to guide me as I’m constantly inspired to see the disruptive technology from the world of science fiction made real.
Seeing a vision of the future that acknowledges the impact of the technology my co-founders and I have built a business around is incredibly fun for me. Because science fiction inspires so many technologists, it’s often said that life tends to imitate art. It’s refreshing and validating to see that it is really much more of a virtuous cycle: the reality of the current day forms the input, art pushes the concepts forward, and the next generation is inspired to make the vision a reality and start the cycle all over again.
Jeff Jenkins is co-founder and Chief Technology Officer at Upskill, a leader in enterprise software for augmented reality (AR) devices.