By David Goddard. Photography by Brian Notess.
Chirp. Chirp. Chirp.
What’s that sound? If you are standing at an intersection in the heart of a major US city, it’s most likely an accessible pedestrian signal—a device that provides visually impaired people with audible cues about the status of a walk signal.
The repetitive sounds are certainly better than nothing, but they can’t actually tell someone whether it’s safe to cross the street. Reckless drivers, objects in the crosswalk, and potholes can all present dangerous obstacles.
But what if there were a way to alert blind pedestrians to these potential pitfalls? That’s exactly the question Jindong Tan is trying to answer.
Tan, a professor in the UT Department of Mechanical, Aerospace, and Biomedical Engineering, is working on a unique device that could improve mobility without being overly cumbersome. It’s called Guide Glass.
The wearable tech resembles a pair of sunglasses, with the addition of a small GoPro camera on one side. The camera and a variety of sensors feed an onboard microprocessor that converts the visual information into data. Proprietary software analyzes the data to evaluate the surroundings in real time. The results are translated into words and transmitted to the wearer through an earpiece.
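The article describes that pipeline only in broad strokes: camera frame in, spoken cue out. A minimal sketch of the idea, in plain Python, might look like the following. Every name here (`Obstacle`, `analyze_frame`) is hypothetical, not the actual Guide Glass software.

```python
# Illustrative sketch only -- not Tan's proprietary software. It shows the
# shape of the pipeline the article describes: detected obstacles go in,
# a short spoken cue comes out.

from dataclasses import dataclass

@dataclass
class Obstacle:
    label: str         # e.g. "pothole", "open door"
    angle_deg: float   # bearing relative to the wearer (negative = left)
    distance_m: float  # estimated depth from the sensors

def analyze_frame(obstacles):
    """Turn detected obstacles into a short spoken cue, nearest first."""
    if not obstacles:
        return "Path clear."
    nearest = min(obstacles, key=lambda o: o.distance_m)
    side = "left" if nearest.angle_deg < 0 else "right"
    return f"{nearest.label} {nearest.distance_m:.0f} meters ahead, {side} side."

# Example: a pothole detected about 3 meters away, slightly to the left.
print(analyze_frame([Obstacle("pothole", -10.0, 3.2)]))
# -> pothole 3 meters ahead, left side.
```

In a real device the obstacle list would come from computer-vision models running on the headset, and the returned string would be passed to a text-to-speech engine feeding the earpiece.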
Guide Glass might look a bit like the ill-fated Google Glass product, but “we actually filed paperwork on ours before Google came out with theirs,” Tan said. “It just happened to work out that the designs are similar.”
Tan’s team began by studying how the eye transmits signals to the brain—where the decisions about movement are made.
For example, a person standing on the curb unconsciously calculates whether or not they have time to cross the street based on what they see. “If a person’s brain could make such a calculation, so could a computer,” Tan reasoned.
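The crossing decision Tan describes can be reduced to a back-of-the-envelope comparison: how long it takes to walk across versus how long before an approaching car arrives. The function and numbers below are purely illustrative, not Guide Glass's actual logic.

```python
# Hypothetical version of the crossing calculation described above.
# Compares the pedestrian's time to cross against the time before an
# approaching car arrives, with a safety margin.

def safe_to_cross(street_width_m, walking_speed_mps,
                  car_distance_m, car_speed_mps, margin_s=2.0):
    """Return True if the pedestrian can clear the street with time to spare."""
    time_to_cross = street_width_m / walking_speed_mps
    if car_speed_mps <= 0:
        return True  # no approaching traffic
    time_until_car = car_distance_m / car_speed_mps
    return time_to_cross + margin_s < time_until_car

# A 12 m street at a 1.4 m/s walking pace (~9 s to cross), with a car
# 200 m away doing 13 m/s (~15 s out): safe, with margin to spare.
print(safe_to_cross(12, 1.4, 200, 13))  # -> True
```

A computer making this judgment needs exactly the inputs the glasses gather: depth, angle, and speed estimates for everything in the scene.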
They placed a premium on creating a practical compact design using small but powerful components.
“Mobility is about three things: direction, range, and access,” Tan explained. “Being able to address all of those things at once is the key to truly opening up the world for the visually impaired.”
Unfortunately, the most common methods available to help visually impaired people navigate can’t address all three at the same time. Guide dogs can help with direction and range but can’t warn of access restrictions like low-hanging branches. Walking sticks help users detect obstructions and gauge distance but fall short of providing direction.
Finding a way to deliver all three elements concurrently with a gadget the size of a cracker is the challenge Tan and his team are facing. “The current version of Guide Glass is able to convey angle, depth, distance traveled, whether a door is open, things like that,” he said. “It tells the person the info they need, when they need it.”
Tan hopes improvements in GPS technology will eventually allow further refinements, leading to even more precise calculations and directions.
While the promise of Guide Glass represents a tremendous boon for those with permanent visual impairments, there may also be applications for those in situations where vision is temporarily obscured.
Firefighters in a smoky building, rescue personnel in a blackout, or police entering a darkened crime scene could benefit from the device since it doesn’t rely on light to make measurements.
“The possibilities for making a big impact on society are exciting,” Tan said. “We just need to secure enough funding to bring it to market.” With proper backing, he believes Guide Glass could begin public trials as soon as 2018.
Even with the obvious positive implications for society, Tan explained, money has been hard to come by because investors are looking for big profit margins. His team is constantly on the lookout for federal dollars, and the Smith-Kettlewell Eye Research Institute has recently expressed interest.
Until the right partner is found, the team will remain focused on improving Guide Glass to ultimately achieve their vision of helping those unable to see on their own.