
Wireless Navigation for the Blind and Visually Impaired

Breaking Down Barriers: Calit² Researcher John Miller
John Miller
2.5.04 - After receiving his Ph.D. in Electrical and Computer Engineering in June 2003, John Miller now devotes his time to Calit²'s wireless navigation project. He became the first blind student to earn a Ph.D. in electrical engineering from the Jacobs School.

2.05.04 - Imagine you're in a city you've never visited before and have multiple appointments within walking distance. Your needs are basic: an understanding of where you are and a map to navigate to your various destinations.



If you're blind, you have the same needs. But, in addition, you can't tell, without exploring, whether the terrain goes up or down and by what degree, whether sidewalks are laid out in a grid or in a more whimsical pattern such as might be found on a university campus, or what hazards, such as construction and low-hanging tree limbs, lie along your route.



Now imagine you're an emergency responder. In a crisis involving hazardous chemicals, for example, the HAZMAT suit designed to protect you is also likely to severely limit what you can see.



A first step toward helping the visually impaired navigate these kinds of unfamiliar and potentially hazardous environments is being taken by a two-person team in Calit²'s UCSD Division. Paul Blair, a Calit² postgraduate researcher, has already spearheaded work supporting the needs of the disabled on a project that provides a wireless assistive interface for people with speech impairments. Now focusing on navigation, he has teamed with John Miller, a recent Ph.D. recipient in Electrical and Computer Engineering from UCSD who brings a specialty in user interfaces for the blind. Miller has served for 10 years on the Research and Development Committee of the National Federation of the Blind. He is also blind.


John Miller and Paul Blair

"This work is intended to benefit the visually impaired," says Ramesh Rao, Calit² director UCSD division and PI (with Sharad Mehrotra, UCI) on a collaborative NSF research grant focusing on responding to the unexpected. "While the target audience is the permanently disabled, we're also hoping to benefit from their insight to understand how better to help those who are temporarily disabled, like emergency responders. The synergy between the two groups seems to be an untapped resource."



Miller, also a member of Rao's NSF grant -- called RESCUE, for "RESponding to Crises and Unexpected Events" -- explains the parallel between the needs of the blind and those of first responders. Both groups have limited vision or cannot see at all. Both want to navigate unfamiliar terrain, and without reliable sources of information: "People in general can be position-, map-, and neighborhood-unaware," says Miller with a laugh, leaving them unable to respond correctly to requests for navigational help. Likewise, in emergency situations, locals are in shock or preoccupied and, as a result, tend not to be reliable guides. Finally, both groups want to receive information through something other than visual displays.



Focusing on the needs of the permanently visually impaired, Blair says, "There are at least two ways we anticipate our system might benefit a blind person. Such a person might map out a path, walking it alone or accompanied by a sighted person, and then retrace the path using the navigation system as a memory aid. The other way is when a blind person is thrown cold into a new environment and has to cope, such as the first day of school at UCSD when everyone else is so distracted or similarly new to campus that they don't tend to provide much useful assistance."



Miller provides more specifics: Sometimes a taxi driver drops him off at the wrong location, so he is left to figure out where he is and how to get to his destination. Even if the driver describes the location, Miller says it would be helpful to be able to verify his location or have it described in more detail, both of which should be provided by the navigation system he and Blair are developing. He says, "Assuming I'm standing in front of the right building, I want to know if there is an entrance nearby and what direction I'm currently headed in so I can get to that entrance."

John Miller

The system the two are developing is based on GPS and a PDA or cell phone. If using the phone as the navigation aid, the user calls in and issues a command setting the desired destination, and the system responds by providing the direction to head in and the distance in meters. The server uses GPS coordinate data to locate the user, and the user can verify his or her heading using a speaking compass that calls out the direction at the press of a button.
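
The article doesn't describe how the server turns two GPS fixes into the announced heading and distance, but the standard great-circle formulas give a sense of the computation involved. The sketch below is illustrative only; the function name and coordinates are assumptions, not part of the Calit² system.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (meters) and initial bearing (degrees, 0 = north)
    from a GPS fix (lat1, lon1) to a destination (lat2, lon2)."""
    R = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)

    # Haversine formula for the distance
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    distance = 2 * R * math.asin(math.sqrt(a))

    # Initial bearing (forward azimuth), normalized to 0-360 degrees
    y = math.sin(dlam) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlam)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return distance, bearing

# Illustrative coordinates only, not actual campus waypoints
d, b = distance_and_bearing(32.8801, -117.2340, 32.8794, -117.2359)
print(f"Head {b:.0f} degrees for {d:.0f} meters")
```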



If relying on the PDA as the navigation aid, the user still calls in and indicates the desired destination, then presses a button on the PDA to indicate readiness to proceed. The PDA, through a voice synthesizer, announces the direction to head in, guides the user with beeps (which indicate when the user is to the left or right of the correct path; the absence of beeps means the user is on track), and announces the upcoming destination when the user is within a short distance of it. Blair is pre-programming PDA buttons that announce the user's location and current compass heading, beep to indicate the user's position relative to the correct path (left or right), and provide information on points of interest in the immediate area.
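
As a rough illustration of the beep-based guidance just described, the sketch below checks whether the user has drifted left or right of a straight segment between two waypoints and whether the destination is close enough to announce. The thresholds, the flat-earth approximation, and all names are assumptions made for illustration; the actual PDA software is not described at this level of detail.

```python
import math

ARRIVAL_RADIUS_M = 10.0   # announce the destination within this distance (assumed value)
PATH_TOLERANCE_M = 3.0    # stay silent while within this corridor (assumed value)

def to_local_xy(lat, lon, lat0, lon0):
    """Flat-earth approximation: meters east/north of a reference point."""
    meters_per_deg_lat = 111320.0
    x = (lon - lon0) * meters_per_deg_lat * math.cos(math.radians(lat0))
    y = (lat - lat0) * meters_per_deg_lat
    return x, y

def guidance(user, segment_start, segment_end):
    """Return 'beep-left', 'beep-right', 'silent', or 'announce-arrival'.

    user, segment_start, and segment_end are (lat, lon) pairs; the path is
    treated as a straight segment between the two waypoints."""
    ux, uy = to_local_xy(*user, *segment_start)
    ex, ey = to_local_xy(*segment_end, *segment_start)
    seg_len = math.hypot(ex, ey)

    # Close enough to the segment end to announce the upcoming destination
    if math.hypot(ex - ux, ey - uy) < ARRIVAL_RADIUS_M:
        return "announce-arrival"

    # Signed cross-track distance: positive means the user is left of the path
    cross = (ex * uy - ey * ux) / seg_len
    if cross > PATH_TOLERANCE_M:
        return "beep-left"
    if cross < -PATH_TOLERANCE_M:
        return "beep-right"
    return "silent"  # no beeps: the user is on track

# A user a few meters to the right of a path heading north
print(guidance((32.87905, -117.23495), (32.8789, -117.2350), (32.8795, -117.2350)))
```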



To test the system, the researchers have set up a handful of waypoints: the entrance to the Calit² "module" (its temporary home until the new building is ready), Café Roma, and the Price Center stairs by the loading dock. They've also met with geographic information systems (GIS) experts on campus, led by Dan Henderson, to gain access to all available data about the campus and make the system as full-featured as possible. Says Henderson, "Much of the campus data needed to create base maps for visually impaired students exists in a variety of formats, including image data from which essential architectural features, like building entrances, can be extracted."



But, oddly, while the data is highly resolved, it's not sufficiently tailored to the needs of a visually impaired person. Miller explains that such a person would benefit from digitally "marking up" this data. "For example, blind people make heightened use of all their senses," he says. "If I notice the hum of a generator the first time I'm in a particular location, for example, that sound helps me determine where I am when I pass by in subsequent trips." So the system database, while based on campus data, will need to be customized through field surveys and data entry by students.
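
One way to picture the customization Miller describes is as a layer of annotations on top of the base GIS records. The data structure below is purely illustrative (the project's actual database schema isn't described); it simply shows how field-surveyed, non-visual landmarks might ride along with each waypoint.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Waypoint:
    """A campus location drawn from the base GIS data, plus user-added markup.

    sensory_landmarks holds the kind of annotation Miller describes: non-visual
    cues gathered through field surveys rather than from the GIS layers themselves."""
    name: str
    lat: float
    lon: float
    entrance_bearing_deg: Optional[float] = None  # toward the nearest entrance, if surveyed
    sensory_landmarks: List[str] = field(default_factory=list)
    hazards: List[str] = field(default_factory=list)

# Illustrative entry only; the coordinates and annotations are made up
cafe_roma = Waypoint(
    name="Cafe Roma",
    lat=32.8789,
    lon=-117.2350,
    sensory_landmarks=["generator hum on the north side"],
    hazards=["low-hanging tree limb along the west walkway"],
)
```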



This system is a prototypical Calit² example of exploring the interface between physical and virtual space. For example, the user can suspend active navigation mode on the PDA (temporarily detach from the trip at hand) to wander virtually around a voice-based PDA map, then return to navigation. Taking this notion one step further, Miller has created a program -- "What's in the Neighborhood?" -- that enables blind people to virtually learn the lay of the land at UCSD on their PCs before venturing into the physical domain. "I think of this program as a wonderful testbed," says Miller, "on how people relate to their physical environment. And, of course, like the San Diego Traffic Report, it's 'powered by Calit².'"

This project is also prototypical of Calit² research projects in that it builds on previous work and involves a number of students. It integrates research and development from Bill Griswold's ActiveCampus project and infrastructure and expertise developed through the San Diego Traffic Report, led by Ganz Chockalingham. The project involves two undergraduate students and a senior from The Preuss School UCSD. Support is being provided by the San Diego Foundation Blasker-Rose-Miah Fund, the NSF RESCUE grant, and Calit².

John Miller

Of particular interest to Miller is the subject of tactile interfaces. He, like several of his fellow Calit² researchers, is guiding a team of students on a project in an independent study class, known as ECE 191. The goal of his team: developing a tactile interface to remotely guide a blind person. "The standard 'tactile interface,' if you will," says Miller, "is for a blind person to take the arm of a sighted guide. In the prototype our students are developing, the guide will send speed and bearing information using a joystick-type device like that used for a remote-control car. The blind person can then receive these tactile clues over an RF link with the sighted guide."



Miller explains that the tactile guide will consist of three parts: a pack that clips onto the belt like a pager to receive the RF signals; a pointer on a dial (like the hand of a clock) whose angle indicates the suggested bearing to the left or right; and a buzzer (a quick buzz signals the user to move forward, a long buzz to stop or that the receiver has lost the signal from the remote guide).
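
Putting those pieces together, the belt pack's job is to turn each command from the guide's joystick into a dial angle or a buzzer pattern. The following sketch assumes a simple message format and uses stand-in hardware classes; none of it is taken from the students' actual prototype.

```python
# Buzzer semantics from the description above: a quick buzz means "move forward";
# a long buzz means "stop" (also used when the RF link to the guide is lost).
SHORT_BUZZ_S = 0.2
LONG_BUZZ_S = 1.0

class Dial:
    """Stand-in for the clock-hand pointer; a real pack would drive a small servo."""
    def set_angle(self, degrees):
        print(f"dial -> {degrees:+.0f} degrees")

class Buzzer:
    """Stand-in for the buzzer on the belt pack."""
    def buzz(self, seconds):
        print(f"buzz for {seconds:.1f} s")

def handle_command(command, dial, buzzer):
    """Translate one decoded RF command from the sighted guide into outputs.

    `command` is an assumed message format: None when the link is lost, otherwise
    a dict with a 'type' of 'steer', 'forward', or 'stop'; 'steer' carries a
    'bearing_deg' field giving the suggested bearing (negative = left)."""
    if command is None:                            # lost the signal from the remote guide
        buzzer.buzz(LONG_BUZZ_S)
    elif command["type"] == "steer":
        dial.set_angle(command["bearing_deg"])     # point the dial toward the suggested bearing
    elif command["type"] == "forward":
        buzzer.buzz(SHORT_BUZZ_S)
    elif command["type"] == "stop":
        buzzer.buzz(LONG_BUZZ_S)

dial, buzzer = Dial(), Buzzer()
handle_command({"type": "steer", "bearing_deg": -30}, dial, buzzer)  # suggest bearing left
handle_command({"type": "forward"}, dial, buzzer)                    # quick buzz: move forward
handle_command(None, dial, buzzer)                                   # long buzz: link lost
```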



"This kind of system," says Miller, "would be a big step forward because it would free the blind person to have a conversation with a companion or simply enjoy the sounds of nature. Speaking of the latter, some of these ideas occurred to me because I wanted to be able to interact with my sighted child in a park without constraining his activity."