Tools to Navigate Outdoor Spaces for People with Low Vision

More than 90 percent of people with visual impairments have low vision—usable but limited vision that cannot be fully corrected. Smart canes, accessible routing apps, and other mobility and wayfinding systems for people with visual impairments often rely on recorded voice commands, vibration patterns, or other audio and haptic cues. Many people with low vision, however, prefer to use their remaining sight to the fullest extent rather than relying on audio or haptic tools.

With this CAREER award, Shiri Azenkot of Cornell Tech is developing and evaluating software applications for head-mounted video displays to help people with low vision navigate outdoor spaces. The displays will provide targeted visual and audio enhancements. Using an iterative, user-centered process, the researchers will design and compare various types of vision enhancements and visual and audio cues—such as magnification, arrows, and beeps—that highlight salient information in the environment and make it more accessible.

This research will generate guidelines for the use of visual and audio cues for a range of low-vision conditions. The findings will help developers of low-vision technologies as well as teachers, orientation and mobility instructors, and rehabilitation counselors who work with people with low vision. Additional benefits will derive from the research design and methodology: This project engages people with disabilities as members of the research team and involves the creation of a new accessibility course through which students will have extended interactions with mentors who have disabilities.

Funding Received

$550,000 over 5 years