For people with low vision, a walk to the grocery store, the post office, or a friend's house can be an extraordinary challenge. The vast array of visual cues that saturate our physical environment and assist with orientation and wayfinding can be of little help to people who experience limited sight clarity, low peripheral vision, or difficulty perceiving depth.

Emerging vision-enhancement technology offers promising avenues for navigational assistance for people with low vision, but research and development in this area have lagged.

Shiri Azenkot intends to change this, drawing on her expertise in human-computer interaction, accessibility, and computer vision to design a head-mounted display (HMD) system to help people with low vision get around outdoors independently and safely.

An assistant professor at the Jacobs Technion-Cornell Institute at Cornell Tech and in Information Science, Azenkot is the recent recipient of the National Science Foundation’s CAREER Award for her proposal, “Designing Head-Mounted Display Systems to Support People with Low Vision in Outdoor Navigation.” Azenkot’s CAREER Award brings Info Sci’s tally this year to two; colleague Malte Jung also received one for his research into human-robot collaboration and teamwork dynamics.

Head-mounted displays, or HMDs, are headsets worn over the eyes that enhance elements of the physical world. They offer promising advances in tech-assisted navigation for people with low vision, even though the technology itself is fairly nascent and constrained in weight, speed, and resolution. Thus far, though, HMD research has fallen short in two respects. First, as Azenkot notes, people with low vision are rarely considered in computing research, a striking omission since the majority of people with visual impairments have some degree of usable vision. Second, HMD technology often enhances and augments a user's entire field of view, which can prove disorienting and overwhelming for a user trying to focus on a single object in the environment, such as a street sign.

Azenkot's NSF-funded research will guide HMD technology into outdoor navigation through the design of a platform that provides both visual and audio cues for people with low vision.

Azenkot's NSF CAREER project builds on her previous research with CueSee, an augmented-reality HMD system she designed to help low-vision users find specific products in a supermarket. Standing in front of a shelving unit brimming with products, the user tells CueSee what to search for. Guided by computer vision algorithms, CueSee scans the shelves, locates the product, and then enlarges the product's image in the user's field of view. Azenkot intends to extend this work toward navigational assistance, namely by enhancing cues to help users avoid obstacles, navigate elevation changes, read signs, and follow routing guidance.
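To make that find-and-enlarge loop concrete, here is a minimal sketch of a CueSee-style pipeline. The story does not detail CueSee's actual algorithms, so this sketch substitutes off-the-shelf OpenCV template matching for its computer-vision search; the names find_product and enhance, the MAGNIFICATION factor, and the 0.8 confidence threshold are all illustrative assumptions rather than the real system's design.

```python
# Sketch of a CueSee-style "find and enlarge" loop using OpenCV.
# All names and thresholds here are hypothetical stand-ins; CueSee's
# real detection and rendering pipeline is not described in this story.
import cv2

MAGNIFICATION = 3  # hypothetical zoom factor for the enhanced cue

def find_product(frame, template):
    """Locate the target product in the camera frame via template matching."""
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    if score < 0.8:            # confidence threshold (assumed)
        return None
    h, w = template.shape[:2]
    return (*top_left, w, h)   # x, y, width, height of the match

def enhance(frame, box):
    """Enlarge the matched region so it dominates the user's view."""
    x, y, w, h = box
    crop = frame[y:y + h, x:x + w]
    return cv2.resize(crop, (w * MAGNIFICATION, h * MAGNIFICATION),
                      interpolation=cv2.INTER_LINEAR)

camera = cv2.VideoCapture(0)           # HMD scene camera (device 0 assumed)
template = cv2.imread("product.png")   # image of the product to search for

while True:
    ok, frame = camera.read()
    if not ok:
        break
    box = find_product(frame, template)
    # Show the magnified product when found; otherwise pass the scene through.
    cv2.imshow("display", enhance(frame, box) if box else frame)
    if cv2.waitKey(1) == 27:           # Esc exits the loop
        break

camera.release()
cv2.destroyAllWindows()
```

A production system would likely replace the template matcher with a learned detector that is robust to lighting and viewpoint, but the overall structure, detecting the target and then magnifying it in the display, would be the same.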

This story originally appeared on the Cornell CIS website.