Dedicated to Improving the Lives of Blind and Visually Impaired People

“Open sourcing Project Guideline: A platform for computer vision accessibility technology”

Project Guideline, a collaboration between Google Research and Guiding Eyes for the Blind that launched in 2021, has “enabled people with visual impairment…to walk, jog, and run independently.” It uses “on-device machine learning (ML)” to navigate users along outdoor paths marked with a painted line. The technology has been tested worldwide and was demonstrated during the opening ceremony of the Tokyo 2020 Paralympic Games. Since then, it has been refined further with new features, such as “obstacle detection and advance path planning,” that help users navigate more complicated scenarios, like sharp turns and nearby pedestrians. The platform can be used in “a variety of spatially aware applications in the accessibility space and beyond.” With the recent open source release of Project Guideline, anyone can use this technology to improve existing accessibility experiences and build new ones. Project Guideline has been developed specifically for Google Pixel phones with the Google Tensor chip. For more information, read the Google Research Blog on Open sourcing Project Guideline: A platform for computer vision accessibility technology.
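
To give a flavor of the core idea, detecting the painted guideline with the phone’s camera, estimating the runner’s position relative to it, and turning that estimate into audio feedback, here is a minimal illustrative sketch. It is not from the Project Guideline codebase: the function name, the use of a lateral offset in meters, and the left/right volume mapping are all hypothetical assumptions chosen only to show how a detected offset from the line might become a steering cue.

```python
# Illustrative sketch only; NOT the Project Guideline implementation.
# Assumes a hypothetical line-detection step that reports the runner's
# lateral offset (in meters) from the painted guideline each frame.

def steering_cue(offset_m: float, max_offset_m: float = 1.0):
    """Map a lateral offset from the guideline to (left, right) cue volumes.

    offset_m < 0  -> runner drifted left, so the cue plays on the right.
    offset_m > 0  -> runner drifted right, so the cue plays on the left.
    The thresholds and linear mapping are assumptions for illustration.
    """
    # Clamp the offset to the range we care about.
    clamped = max(-max_offset_m, min(max_offset_m, offset_m))
    strength = abs(clamped) / max_offset_m  # 0.0 = on the line, 1.0 = far off

    if clamped < 0:           # drifted left: cue toward the right
        return (0.0, strength)
    if clamped > 0:           # drifted right: cue toward the left
        return (strength, 0.0)
    return (0.0, 0.0)         # centered: no corrective cue


if __name__ == "__main__":
    # Simulated offsets such as a line-detection model might report.
    for offset in (-0.8, -0.2, 0.0, 0.3, 1.5):
        left, right = steering_cue(offset)
        print(f"offset {offset:+.1f} m -> left {left:.2f}, right {right:.2f}")
```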

Collaborating to Build Project Guideline

To build Project Guideline, Google Research collaborated with Thomas Panek, a runner who is blind, to help him run independently. Panek is President and CEO of Guiding Eyes for the Blind, an organization supported by Reader’s Digest Partners for Sight Foundation (RDPFS) that makes it possible for people with vision loss to receive “running guide dogs” that help them live more active and independent lives. Panek explained that although he had always loved running, he gave it up after being diagnosed as legally blind as a young adult. Later, he resumed the pastime, running with human guides. Although he was grateful for the support, he “wanted more independence” and subsequently ran the “first half-marathon assisted only by guide dogs.” Knowing that not everyone can have a “brilliant, fast companion like my guide dog,” he wondered whether it would be possible to devise a way to guide a blind runner independently. After posing this question to a group of designers and technologists at a “Google hackathon,” he worked with them to sketch out a prototype built on a simple concept: he would wear a phone on a waistband, along with “bone-conducting headphones.” The phone’s camera would discern a physical guideline on the ground and send audio signals based on his position to help him stay on track. After the first outdoor test of the technology, arriving at the finish line of his run, Panek noted that “For the first time in a lifetime, I didn’t feel like a blind man. I felt free.” Following this debut, the technology was tested further, resulting in the advances described above. In advancing the project, Google Research has sought to “gather feedback from more organizations.” Learn more about Panek’s experience in the blog post Introducing Project Guideline on the Guiding Eyes webpage.