In theory, controlling a flying drone shouldn’t be much different from piloting a helicopter, but without a human in the cockpit, visualizing the drone’s current position relative to environmental obstacles can be tricky: the pilot’s view is commonly two-dimensional, reliant on cameras and joysticks. This week, researchers Chuhao Liu and Shaojie Shen from the Hong Kong University of Science and Technology revealed an intriguing new solution that uses holographic augmented reality hardware to create live 3D terrain maps, enabling drone pilots to simply point at targets visualized above any flat surface.
The holographic interface relies on a combination of interesting technologies. On the display side, a Microsoft HoloLens headset renders the augmented reality content as a colorful voxel map that can be viewed from any angle, built from an autonomous drone’s depth cameras and raycasting for real-time location data. Critically, the system provides a live, highly spatial sense of environmental elevation and depth, so the drone can easily be seen from a third-person perspective and repositioned.
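To make the voxel-map idea concrete, here is a minimal illustrative sketch (not the authors' code) of the underlying data structure: 3D points from a depth sensor are quantized into discrete colored cells, which is what gives the hologram its blocky, viewable-from-any-angle form. The voxel size, point format, and color handling here are all assumptions.

```python
# Quantize (x, y, z, color) depth samples into a sparse voxel map.
# A 0.1 m cell size is an arbitrary choice for illustration.

def points_to_voxels(points, voxel_size=0.1):
    """Map 3D point samples to a sparse dict of voxel cells."""
    voxels = {}
    for x, y, z, color in points:
        key = (int(x // voxel_size),
               int(y // voxel_size),
               int(z // voxel_size))
        voxels[key] = color  # last sample wins; real systems fuse or average
    return voxels

cloud = [(0.05, 0.02, 1.51, "green"),   # two nearby samples...
         (0.07, 0.01, 1.55, "green"),   # ...land in the same cell
         (2.30, 0.00, 0.40, "brown")]
vmap = points_to_voxels(cloud)
print(len(vmap))  # → 2
```

Because the map is sparse (only occupied cells are stored), it stays compact enough to stream and re-render live as the drone explores.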
The HoloLens then feeds commands back to the drone, setting its next target within the holographic map by turning the wearer’s hand gestures and gaze into point-and-click-style controls. The autonomous drone then flies to the new location, updating the 3D map as it travels. A demonstration video the researchers provided looks straight out of a sci-fi movie, at least on the holography side. Due to bandwidth limitations, the drone only supplies 3D map data to the AR interface, not the accompanying first-person video.
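The gaze-to-target step can be sketched as a raycast: march a ray outward from the wearer's head pose along the gaze direction and take the first occupied voxel it hits as the drone's next goal. This is an illustrative sketch under that assumption, not the paper's implementation; the step size and cell size are invented for the example.

```python
# March along the gaze ray and return the first occupied voxel cell, or None.
# `occupied` is a set of integer voxel cells like the map sketched above.

def raycast(origin, direction, occupied, voxel_size=0.1,
            max_dist=20.0, step=0.02):
    """Fixed-step ray march through a sparse voxel grid."""
    ox, oy, oz = origin
    dx, dy, dz = direction  # assumed unit-length
    t = 0.0
    while t < max_dist:
        cell = (int((ox + dx * t) // voxel_size),
                int((oy + dy * t) // voxel_size),
                int((oz + dz * t) // voxel_size))
        if cell in occupied:
            return cell
        t += step
    return None

occupied = {(0, 0, 15)}  # one occupied cell straight ahead
print(raycast((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), occupied))  # → (0, 0, 15)
```

A production system would use a proper grid-traversal algorithm rather than fixed steps, but the principle is the same: the headset resolves a gesture into a 3D map cell, and that cell becomes a waypoint for the drone's planner.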
The Hong Kong University team still has a way to go before the holographic drone control system is ready to be deployed. Initially, the drone’s data was shared over Wi-Fi within an indoor testing space, though low-latency 5G cellular connections will likely work outdoors once 5G networks progress past their current limitations for drone use. The researchers also noted that HoloLens’ “very limited field of [AR] view … caused frequent complaints” in a group of testers, an issue that could be addressed using HoloLens 2 or another AR headset. Additionally, testers required practice to become proficient at 3D targeting despite their prior familiarity with AR hardware, a problem that might trace to gesture recognition or an imperfect 3D UI.
It’s worth noting that 3D map data is far more bandwidth-efficient than live first-person video, requiring only 272MB of data when updating 10 times per second, versus 1.39GB to send first-person video imagery at 30 frames per second. Going forward, the team wants to include both types of streams for the user’s benefit, optimizing the data to meet a network’s minimum bandwidth levels.
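Taking the article's figures at face value (the measurement window is the same for both streams but isn't specified, and decimal units are assumed), the gap works out to roughly a factor of five:

```python
# Compare the two stream sizes reported in the article.
map_mb = 272             # 3D map stream at 10 updates/s, in MB
video_mb = 1.39 * 1000   # first-person video at 30 fps, converted from GB to MB

print(round(video_mb / map_mb, 1))  # → 5.1
```

That fivefold difference explains why the prototype sends only the map: on a constrained link, the map stream fits where video would not.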
Despite the issues, the holographic AR system clearly has plenty of potential. Putting aside the visual novelty of the interface, there’s a huge convenience benefit in controlling a remote vehicle using nothing more than a portable, standalone AR headset, rather than needing a full-fledged computer, monitor, and joystick. The researchers plan to officially present their “first step” in combining AR with autonomous drones at the International Conference on Intelligent Robots and Systems, which is scheduled for October 25-29, 2020.