In collaboration with nonprofit organization Guiding Eyes for the Blind, Google today piloted an AI system called Project Guideline, designed to help blind and low-vision people run races independently with just a smartphone. Using an app that tracked the virtual race via GPS and a Google-designed harness that delivered audio prompts to indicate the location of a prepainted line, Guiding Eyes for the Blind CEO Thomas Panek attempted to run New York Road Runners’ Virtual Run for Thanks 5K in Central Park.

According to the U.S. Centers for Disease Control and Prevention, in 2015 a total of 1.02 million people in the U.S. were blind and approximately 3.22 million had a vision impairment. Technologies exist to help blind and low-vision people navigate challenging everyday environments, but those who wish to run must rely on either a guide animal or a human guide tethered to them.

Google’s Guideline app works without an internet connection and requires only a guideline painted on a pedestrian path. Users wear an Android phone around the waist using the aforementioned harness; the Guideline app runs a machine learning model that detects the painted line. (The model, which emerged from a Google hackathon, accounts for variable weather and lighting conditions.) The app then approximates the user’s position relative to the line and delivers audio feedback via bone-conducting headphones to help keep them on the guideline. If the user drifts to the left of the line, the audio in their left ear increases in volume and dissonance; if they drift to the right, the same happens in their right ear.
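Google hasn’t published Project Guideline’s feedback algorithm, but the behavior described above can be sketched as a simple mapping from lateral offset to per-ear audio gain. Everything here (the function name, the offset convention, the saturation distance) is an illustrative assumption, not Google’s implementation:

```python
def steering_gains(offset_m: float, max_offset_m: float = 1.0) -> tuple[float, float]:
    """Toy sketch of stereo steering feedback (not Google's actual algorithm).

    offset_m: runner's lateral offset from the painted line in meters;
              negative means left of the line, positive means right.
    max_offset_m: assumed distance at which the alert gain saturates.
    Returns (left_gain, right_gain), each in [0.0, 1.0].
    """
    # Scale the offset magnitude so the gain saturates at the path edge.
    magnitude = min(abs(offset_m), max_offset_m) / max_offset_m
    if offset_m < 0:
        # Drifting left: raise the volume in the left ear.
        return (magnitude, 0.0)
    if offset_m > 0:
        # Drifting right: raise the volume in the right ear.
        return (0.0, magnitude)
    # Centered on the line: no alert in either ear.
    return (0.0, 0.0)

print(steering_gains(-0.5))  # half a meter left of the line -> (0.5, 0.0)
```

In a real system these gains would modulate a continuously playing audio cue (the article also mentions dissonance increasing with drift, which this sketch omits for brevity).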


“Imagine walking down a hallway in the dark with your arms outstretched. As you drift to the left, you will feel the wall with your left hand and move back to center to correct,” a Google spokesperson told VentureBeat via email. “As you drift to the right, you will feel the wall with your right hand and move back to center to correct. The same applies with Project Guideline, only you hear the boundaries to your left and right, rather than feel them.”

Beyond the pilot with Panek, Google plans to partner with organizations to help paint guidelines in different communities and provide additional feedback.

Above: Images used to train the Guideline model.

Image Credit: Google

The launch of Guideline comes after Google debuted more in-depth spoken directions for Maps, which inform users when to turn and tell them when they’re approaching an intersection so they can exercise caution when crossing. The company also continues to develop Lookout, an accessibility-focused app that can identify packaged foods using computer vision, scan documents to make it easier to review letters and mail, and more.
