It seems like every major technology vendor is delving into the augmented and virtual reality space, claiming to have the latest and greatest solution for the market. Unfortunately, these technologies may not live up to expectations when it comes to enterprise use — whether it’s platforms like AWS Sumerian, Microsoft Visual Studio, or hardware such as the HoloLens and the upcoming Magic Leap.
Right now, gaming is the area that has the most logical fit and the best AR/VR experiences, but this doesn’t have to be the case. With a few improvements and modifications, AR and VR could become useful to companies across a range of scenarios, including training, maintenance, product design and more.
What are the major issues with the current capabilities, and how can they improve for enterprise use? Vendors should focus on two key problem areas: cospacing abilities and headset deficiencies.
Perhaps the biggest shortcoming of AR and VR is the lack of cospacing functionality. Cospacing is the ability for multiple people to interact in the same experience and have real objects — like a mechanical wrench or a CPR dummy — be included, which is essential to making training experiences effective. Currently in the AR landscape, there is no single vendor product that allows for this, making the technology much harder for enterprises to adopt.
Why is cospacing essential for enterprises to get value from AR and VR? Consider the limitations of a training exercise that does not allow multiple employees to participate at the same time, in the same place. Further, imagine that objects needed to make the training realistic — like a specific tool that will be used in the real-life scenario — cannot be incorporated into the experience. This would severely limit the effectiveness of the training exercise, because it would not allow for real-time collaboration or practice with the actual tools. It would be like a soccer team trying to practice with each player on a separate field and no balls or goals.
Another example of cospacing’s importance comes from product design teams using AR or VR. There are currently no solutions offered by any of the major vendors that allow people to collaborate in real time with a shared view. In other words, you can imagine two product designers working together in the same space on a design, but today this isn’t possible — they each have their own separate view, making it hard to understand what the other designer is doing and making collaboration extremely difficult.
Luckily, technology that solves these issues and allows for collaborative experiences and the use of real-life tools already exists. More platforms need to offer room-tracking systems with sensors placed on all involved people and objects, allowing them to be easily connected and included in the experience. New objects and people can be included and visualized in the experience simply by adding new sensors. This also enables motion capture, so custom scenarios can be pre-recorded and programmed without waiting for a studio to develop and release them — which is especially useful for specialized training scenarios.
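To make the cospacing idea concrete, here is a minimal sketch of how a shared session might be modeled. Everything below is hypothetical — the class names, the sensor IDs, and the pose format are illustrative and do not correspond to any real vendor API. The key point is that every participant renders the same set of tracked entities in one shared coordinate frame, and adding a new person or tool is just adding a new sensor.

```python
from dataclasses import dataclass

@dataclass
class TrackedEntity:
    """A person or physical object tagged with a room-tracking sensor."""
    sensor_id: str
    kind: str          # "person" or "object" (e.g., a wrench or CPR dummy)
    position: tuple    # (x, y, z) in shared room coordinates
    orientation: tuple # quaternion (w, x, y, z)

class CospacedSession:
    """Minimal model of a cospaced session: all participants see the
    same entities in the same room-fixed coordinate frame."""

    def __init__(self):
        self.entities = {}

    def add_sensor(self, entity: TrackedEntity):
        # Adding a sensor is all it takes to bring a new person or
        # real-life tool into the shared experience.
        self.entities[entity.sensor_id] = entity

    def update_pose(self, sensor_id, position, orientation):
        # Called as the room-tracking system streams fresh poses.
        e = self.entities[sensor_id]
        e.position, e.orientation = position, orientation

    def snapshot(self):
        # The same snapshot is streamed to every headset, so everyone
        # shares one consistent view of the room.
        return list(self.entities.values())
```

A recorded stream of `update_pose` calls is also exactly the raw material for the motion-capture playback described above: replaying the stream re-creates the scenario without a studio build.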
Headsets and controllers
Another major issue with AR and VR technology is the hardware on the market. While major leaps have been made since the technology was first conceived, today’s devices still fall woefully short of providing a practical and useful experience for enterprises.
A major issue with today’s AR and VR headsets is the lack of eye-tracking capabilities. This means users shift their view by moving their entire head rather than looking in a different direction as they would in real life. In other words, to look up, one must tilt their entire head upwards rather than just looking up with their eyes. This creates a number of issues, the largest being the neck aches caused by moving the sheer weight of the headset around in order to change views. The result is an unpleasant user experience that makes more complex scenarios difficult to navigate.
The other downside to not tracking users’ eyes is that a treasure trove of potential data is lost. In a training scenario, if eyes are tracked then an employee could get more detailed and useful feedback on how to improve, and a company can find out what potential obstacles their employees do not see.
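As a sketch of the feedback this lost data could enable, consider a gaze log from a training session. The sample format, object names, and thresholds below are all hypothetical assumptions, not part of any real headset SDK; the point is simply that dwell times per object fall out of eye tracking almost for free, and from them a company can list the hazards a trainee never noticed.

```python
def dwell_times(gaze_samples, sample_period_s=0.1):
    """Sum how long the trainee's gaze rested on each object.
    gaze_samples: list of (timestamp_s, object_name) pairs taken at a
    fixed sampling rate (assumed 10 Hz here)."""
    times = {}
    for _, obj in gaze_samples:
        times[obj] = times.get(obj, 0.0) + sample_period_s
    return times

def missed_hazards(gaze_samples, hazards, min_dwell_s=0.5):
    """Return the hazards the trainee never looked at long enough —
    the 'potential obstacles their employees do not see'."""
    times = dwell_times(gaze_samples)
    return [h for h in hazards if times.get(h, 0.0) < min_dwell_s]
```

For example, a trainee who stared at a valve for 0.6 seconds but only glanced at a pressure gauge would have the gauge (and anything never viewed at all) flagged in their post-session feedback.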
Another issue that plagues most headsets is their field of view (FOV), which is still quite limited. Ideally, AR glasses would provide a minimum of 114 degrees horizontally per eye, which covers human stereo vision (the remaining 40 degrees is monocular peripheral vision). To put the inadequacies of current options in perspective, the HoloLens offers only 40 degrees, and the ODG R9’s FOV is around 50 degrees. The Meta 2 is more respectable at around 90 degrees, with the SmokeHMD VR/AR unit at around 100. A limited FOV is a problem because it restricts the user’s immersion in the experience, creating a keyhole or windowed effect.
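The gap becomes clearer when each headset’s horizontal FOV is expressed as a fraction of the 114-degree stereo-vision target cited above. The figures are those from the article; the comparison itself is a simple illustrative calculation.

```python
# Per-eye horizontal span needed to cover human stereo vision,
# per the target cited in the article.
STEREO_FOV_DEG = 114

# Approximate horizontal FOV figures quoted above.
HEADSETS = {
    "HoloLens": 40,
    "ODG R9": 50,
    "Meta 2": 90,
    "SmokeHMD": 100,
}

def stereo_coverage(fov_deg):
    """Fraction of the human stereo field a headset's FOV covers."""
    return min(fov_deg / STEREO_FOV_DEG, 1.0)

for name, fov in HEADSETS.items():
    print(f"{name}: {stereo_coverage(fov):.0%} of stereo vision")
```

By this measure, the HoloLens covers roughly a third of the stereo field — a concrete way to see where the keyhole effect comes from.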
Projection systems with a focal plane too close to the user, chromatic aberration and other visual distortions can cause eye strain and overall stress due to headaches and confusion. Headsets that block the user’s downward peripheral vision stop the user from seeing their feet, making walking and moving around a space much more dangerous.
Hand controllers that pair with headsets create further difficulties, because they require users to hold their hands outstretched for long periods of time, causing discomfort. One solution is building more content that uses voice controls; combined with eye tracking and a visual prompt, this reduces the potential for misunderstandings.
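One way eye tracking and voice can combine is by letting gaze resolve vague spoken references, so the user never has to point a controller at anything. The function below is a deliberately simple hypothetical sketch — real voice interfaces involve speech recognition and richer intent parsing — but it shows the core idea: the word “that” means whatever the user is currently looking at, and the resolved command is echoed back as the visual confirmation prompt.

```python
def resolve_command(voice_text, gaze_target):
    """Combine a spoken command with the current gaze target.
    gaze_target is the name of the object the eye tracker reports the
    user is looking at, or None if no object is in focus."""
    if "that" in voice_text and gaze_target is not None:
        # Replace the deictic word with the gazed-at object, then the
        # app can display the resolved command as a visual prompt
        # ("open toolbox?") before executing it.
        return voice_text.replace("that", gaze_target)
    # No ambiguity to resolve; pass the command through unchanged.
    return voice_text
```

Saying “open that” while looking at a toolbox would resolve to “open toolbox”, with no outstretched arms required.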
Solving the current problems
Technology to improve AR and VR does exist today — there are room tracking systems to solve cospacing issues and hardware that prioritizes a better user experience. Unfortunately, there is no single solution that incorporates all of these technologies, making the enterprise buyer’s task difficult. Every vendor has its own issues and creates lock-in that inhibits enterprises from purchasing other technology that would remove the shortcomings.
Open standards provide a way to connect all of the AR and VR technologies together, so that they’re able to work together to create the best and most useful experience possible. This means that the Amazons and Microsofts of the world would develop products that were able to connect to each other and the wider ecosystem of AR and VR technologies, rather than trying to trap buyers in their singular product line. As AR and VR tech matures, this type of cross-platform collaboration will be crucial to overcome issues and make the technology accessible and useful to enterprises.
David Brebner is a software visionary with expertise in user interaction, software design, 3D machine vision, AR/VR, IoT, AI and solutions architecture, as the founder and CEO of Umajin.