
MIT researchers have developed a technique to train fast-moving autonomous AI drones using VR-enhanced environments, reducing crashes and thereby the need for repairs or replacements. Known as “Flight Goggles,” the system will be detailed at this week’s IEEE International Conference on Robotics and Automation in Brisbane, Australia.

Flight Goggles enables autonomous vehicles to see and learn from virtual environments while they’re actually moving in physically empty spaces. The system tracks a drone’s motion, renders 90-frame-per-second photorealistic imagery of its current virtual location, and quickly transmits the images to the drone’s image processor. Researcher Sertac Karaman told MIT News (via Road to VR) that “[t]he drone will be flying in an empty room, but will be ‘hallucinating’ a completely different environment, and will learn in that environment.”
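The cycle described above — track the drone's real pose, render the matching virtual view, and transmit the frame within a 90 fps budget — can be sketched in a few lines. This is a minimal illustration only; the class and function names (`MotionCapture`, `render_view`, `fly_one_step`) are stand-ins invented for this sketch, not the actual MIT Flight Goggles API.

```python
# Sketch of a photorealistic-in-the-loop frame cycle, as described for
# Flight Goggles: read the drone's tracked pose, render the virtual
# scene from that viewpoint, and ship the frame to the drone's image
# processor. All names here are illustrative stand-ins.
import time
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float
    z: float
    yaw: float


class MotionCapture:
    """Stand-in for an external tracking system reporting drone poses."""

    def __init__(self, trajectory):
        self._trajectory = iter(trajectory)

    def read_pose(self):
        return next(self._trajectory, None)


def render_view(pose: Pose):
    """Stub renderer: a real system would return a photorealistic image
    of the virtual environment as seen from `pose`."""
    return {"pose": pose, "pixels": b"\x00" * 16}  # placeholder frame


def fly_one_step(mocap, send_frame, frame_budget_s=1.0 / 90):
    """One cycle of the ~90 fps loop: pose -> render -> transmit."""
    start = time.monotonic()
    pose = mocap.read_pose()
    if pose is None:
        return False  # trajectory exhausted
    frame = render_view(pose)
    send_frame(frame)
    # Sleep out whatever remains of the 1/90 s frame budget.
    elapsed = time.monotonic() - start
    if elapsed < frame_budget_s:
        time.sleep(frame_budget_s - elapsed)
    return True


if __name__ == "__main__":
    frames_sent = []
    mocap = MotionCapture([Pose(0.1 * i, 0.0, 1.5, 0.0) for i in range(5)])
    while fly_one_step(mocap, frames_sent.append):
        pass
    print(f"transmitted {len(frames_sent)} frames")
```

The key design point the article highlights is latency: because the drone is flying fast in the real room, each rendered frame must correspond to its current position, so the tracking, rendering, and transmission all have to fit inside the per-frame budget.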

Karaman said that the team was inspired by a desire to build an autonomous drone that could outperform human-controlled drones in competitive drone races, which include mazes with windows, doors, and other obstacles. By building virtual versions of mazes and letting the drone practice navigating the obstacles, it could learn to move faster than a human attempting the same maneuvers.


Testing suggests that Flight Goggles practice pays off. Over 10 flights at about 5 mph, the drone flew through a virtual window 361 times and “crashed” only three times, causing no actual damage. In subsequent real-world testing across eight flights, the drone flew through an actual window 119 times, crashing or requiring human intervention only six times. Traditional testing demands far more precautions, to say nothing of the expense of spare parts and replacement drones.

“The moment you want to do high-throughput computing and go fast,” Karaman said, “even the slightest changes you make to its environment will cause the drone to crash. You can’t learn in that environment. If you want to push boundaries on how fast you can go and compute, you need some sort of virtual-reality environment.”

The Flight Goggles system is initially intended for aerial drones, but it also has obvious potential applications for ground-based autonomous vehicles. Using motion capture and VR technologies, researchers can insert moving people and virtual objects into the learning paths of AI-powered vehicles, training them to avoid real-world obstacles. Not surprisingly, the MIT researchers were backed by institutions interested in next-generation vehicle AI, including Nvidia, the U.S. Office of Naval Research, and MIT Lincoln Laboratory.
