In contrast to traditional navigation apps, where users have to look away from the road to decipher directions, the Phiar app displays a driver’s real-world surroundings, augmented with an easy-to-follow painted path. A green band shows you the way to go (just like in a racing video game).
The money comes from the Venture Reality Fund and Norwest Venture Partners. Other investors include Anorak Ventures, Mayfield Fund, Zeno Ventures, Cross Culture Ventures, GFR Fund, Y Combinator, InnoLinks Ventures, and Half Court Ventures.
The Palo Alto, California-based company launched at Y Combinator earlier this year. It was founded by computer vision researcher Chen-Ping Yu and deep learning expert Ivy Li. Their aim is to solve the directional and safety issues experienced by users of traditional two-dimensional navigation systems. It would be nice if carmakers built this tech into the windshield itself, but that could take a long time to happen and would probably be very expensive.
The team has a background in software optimization, deep learning, 3D reconstruction, and AR design from Microsoft, Apple, Shutterstock, and VMware.
“The idea came after taking too many wrong turns on the streets of Boston,” said Yu, in a statement. “Trying to interpret directions from a two-dimensional map, especially at high speeds, is as difficult as it is dangerous. Navigation should be convenient and straightforward, and what we’re building is going to help people get to their destination faster and safer.”
Yu was a post-doctoral fellow focused on neuro-inspired deep learning at Harvard University before cofounding Phiar. The team has five people.
“We want our users to keep their eyes on the road, looking at the real world rather than a 2D rendering of it,” said Li, who is Phiar’s chief technology officer, in a statement. “Our unique combination of augmented reality and AI delivers clear direction and routing information. But what makes the experience so unique is the super-efficient computer vision and deep learning AI, which is capable of running on smartphones. Our navigation system augments your surroundings rather than distracting you from them.”
The app will launch in mobile app stores in mid-2019. It will run on edge devices, using real-time computational processing on commodity smartphones. Phiar said its users will have fast, live, visual navigation with traffic and other contextual data that is automatically analyzed through the platform’s computer vision technology.
“AI-driven immersive and spatial technologies are becoming mainstream,” said Marco DeMiroz, general partner at the Venture Reality Fund, in a statement. “Phiar’s breakthrough deep learning technology and AR navigation app provides not just value to users, but also sets an example of the kind of incredible potential at the nexus of AI and AR.”
After downloading the Phiar app, personal and ride-share drivers will be able to place their phone anywhere via a dash or windshield mount.
Phiar aims to become the de facto general-purpose AI-driven navigation solution for all AR platforms. The team predicts that AR glasses will soon be as prevalent as smartphones, and the company is preparing to support users whether they’re walking, driving, indoors, or outside.
In addition, the company’s efficient computer vision technology could eventually help autonomous systems such as self-driving cars reduce their reliance on expensive hardware sensors and processors.
“Phiar has vast growth potential because the platform they’re building is robust enough to support self-driving cars but lightweight enough to run on a smartphone,” said Scott Beechuk, partner at Norwest Venture Partners, in a statement. “The experience is going to change the way people perceive, navigate, and enjoy the world.”