Calit2 Collaborates with Architects and UC San Diego Neuroscientists to Study How 'Way-Finding' Affects the Brain
San Diego, CA, August 15, 2008 — Losing one's way in a large building can be stressful, time-consuming and costly — a 2004 study published in U.K. medical journal The Lancet found that staff at one large hospital spent 4,500 hours each year giving directions to lost patients, at an associated annual cost of $220,000. But when a patient is critically ill and can't locate the emergency care facility, ease of navigation in a complex setting can mean the difference between life and death.
To gain insight into "way-finding," or the process of navigating, getting lost and using cues to reorient oneself, researchers with UC San Diego's Division of Biological Sciences, the UCSD division of the California Institute for Telecommunications and Information Technology (Calit2) and the Swartz Center for Computational Neuroscience (SCCN) have created and tested a prototype virtual-reality system to study the human brain's response to architectural cues. By uniting the expertise of architects, psychologists and neuroscientists — and by making use of the unique visualization technology available at Calit2 — the researchers hope to improve the design of buildings, neighborhoods and urban settings so that people will know how to get where they're going with minimal confusion.
"Helping patients who are lost has real importance," says co-investigator Eve A. Edelstein, Senior Vice President of Research & Design with California-based HMC Architects and a neuroscientist and visiting scholar with UCSD's Division of Biological Sciences. "We wanted to look at how people's brains change when they know precisely where they are versus when they have no idea where they are, in order to gain deeper understanding of how people form memories of spaces and places."
To do so, the researchers developed an interactive, synchronized virtual-reality and electroencephalography (EEG) prototype to study the neural sources associated with way-finding, in hopes of better understanding how humans create "cognitive maps" of architectural spaces, even when those spaces are virtual.
For the experiment, a virtual-reality test environment modeled on Calit2's New Media Arts Wing (on the first floor of UCSD's Atkinson Hall) was developed by project lead Eduardo Macagno, founding Dean of the UCSD Division of Biological Sciences and a Calit2 governing board member, in collaboration with Edelstein, Calit2 Project Scientist Jürgen Schulze and SCCN Project Scientist Klaus Gramann. The test subjects began each way-finding task in one of two locations: the first was a realistic rendering of the wing's front lobby that was rich with visual cues such as architectural landmarks, interior finishes and color; the second was an ambiguous rendering of the wing's south corridor. The latter contained no prominent visual cues to direction or orientation — walls were white, doorways were not well marked, the space was entirely symmetrical, and even shadows were eliminated, providing no easy means for test subjects to get a handle on their location, orientation or the arrangement of adjoining rooms.
The test environment was projected to scale onto Calit2's StarCAVE virtual reality system — a 360-degree, 15-panel, 3-D immersive environment that surrounded the test subjects and enabled them to interact and move within both the ambiguous and unambiguous architectural renderings.
Test subjects were instructed to learn and memorize the location of all the rooms and corridors during "free exploration" in the StarCAVE, and to demonstrate their knowledge via drawn plans before and after testing began. Each subject then completed 96 trials in which they navigated through the virtual building from the front (unambiguous) lobby or back (ambiguous) corridor toward stated goals, which were posted on the StarCAVE screen before the trial began. One stated goal might be "Auditorium," for example, or "Theater."
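As a rough illustration of that trial structure, a few lines of Python can generate a counterbalanced sequence of start-location and goal pairings. This is only a sketch: the goal names beyond "Auditorium" and "Theater", and the exact counterbalancing scheme the researchers used, are assumptions made here for illustration.

```python
import random

# Sketch of a balanced trial list for the way-finding task described above.
# The two start locations come from the article; goal names other than
# "Auditorium" and "Theater" are placeholders.
START_LOCATIONS = ["front lobby (unambiguous)", "south corridor (ambiguous)"]
GOALS = ["Auditorium", "Theater", "Room A", "Room B"]  # hypothetical room list

def build_trials(n_trials=96, seed=0):
    """Return a shuffled list of (start, goal) pairs, balanced across conditions."""
    pairs = [(start, goal) for start in START_LOCATIONS for goal in GOALS]
    repetitions = n_trials // len(pairs)      # 96 trials / 8 conditions = 12 each
    trials = pairs * repetitions
    random.Random(seed).shuffle(trials)       # fixed seed for a reproducible order
    return trials

if __name__ == "__main__":
    for i, (start, goal) in enumerate(build_trials(), 1):
        print(f"Trial {i:02d}: start at {start}, navigate to {goal}")
```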
During the trial, test subjects were fitted with what looked like a red swimming cap covered in strands of spaghetti. The tangle of noodles turned out to be 256 high-density EEG electrodes (EEG is a standard technology used to measure electrical activity in the brain). An amplifier system placed in a backpack worn by the subject connected the electrodes via one fiber optic cable to a recording system outside the StarCAVE. The subject was able to move around in the virtual environment by way of a remote controller, while an electromagnetic motion-capture system recorded his or her head and hand movements.
Those movements were then synchronized with the data from the EEG sensors and the virtual-reality data stream. In this way, the researchers were able to track movement along the routes, note the virtual cues and scenes visible as the subject moved through the rendering, and observe the subject's physiological brain responses upon encountering specific landmarks. All data streams were synchronized online and in real time over a UDP network, which allowed the virtual environment to be rendered actively while the subject navigated through the building.
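A minimal sketch of that kind of stream merging is shown below. It assumes hypothetical port numbers and a simple JSON message format for the EEG, motion-capture and VR-event streams; the study's actual recording software is not described in this article, so this is an illustration of the general approach rather than the team's implementation.

```python
import json
import socket
import time

# Merge timestamped data arriving as UDP datagrams from several streams.
# Port numbers and message format are assumptions for this sketch.
STREAM_PORTS = {"eeg": 9001, "motion": 9002, "vr_events": 9003}

def open_sockets():
    socks = {}
    for name, port in STREAM_PORTS.items():
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.bind(("0.0.0.0", port))
        s.setblocking(False)                  # poll all streams without stalling
        socks[name] = s
    return socks

def record(duration_s=5.0):
    """Collect packets from all streams and tag each with a shared clock."""
    socks = open_sockets()
    log = []
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        for name, s in socks.items():
            try:
                data, _ = s.recvfrom(65535)
            except BlockingIOError:
                continue                      # nothing waiting on this stream
            log.append({"stream": name,
                        "t": time.monotonic(),        # common timestamp
                        "payload": json.loads(data)})
    # Sorting by the shared timestamp yields one synchronized event record.
    return sorted(log, key=lambda event: event["t"])
```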
"With such methods, modifications to the visual features of cues can be explored in terms of their effectiveness in way-finding design," Edelstein says. "Unlike magnetic resonance imaging [MRI] — which records brain responses in prone, immobile subjects — this technology relates navigation events to concurrent brain responses while subjects move freely within the virtual reality CAVE."
The researchers are particularly interested in what happens in the hippocampus and the parietal area of the brain during way-finding tasks. The parietal lobe is likely involved in using visuo-spatial information from a first-person perspective, while animal studies in the 1970s demonstrated that "place cells" exist within the hippocampus and are responsive to both memories of events and memories of a place. Recent discoveries also demonstrate a network of "grid cells" in the nearby entorhinal cortex. Additionally, neuroimaging studies of patients with temporal lobe disorders demonstrate that some are unable to recognize or perceive landmarks.
"Factors that influence memory — including dementia or Alzheimer's disease — can also selectively affect memory involved in navigation tasks," Edelstein says. "With such knowledge, the selection of architectural cues may be better able to address the brain's process during navigation, and inform design that better serves people who use different wayfinding strategies, or who have different sensory needs."
Schulze, who assisted the research team with the virtual-reality aspects of the study, said the team decided to conduct the study in a virtual environment rather than an actual building because the architectural parameters were easier to control in the StarCAVE.
Explains Schulze: "With this study we want to find out how a virtual model of an architectural space compares to a real building. We hope that the results are comparable, but if they are not, the differences may reveal significant information about effective cues.
"In a real building, you'll find a lot more irregularities than in the virtual environment and also in the level of resolution — a visual cue could be something as trivial as specks of dust in the air," adds Schulze. "We could improve many of these aspects if we only knew what to focus on. That's an interesting part of my research: To find out how we can make the virtual environment as realistic as possible so that people can't distinguish between it and a real environment. With the StarCAVE, we have the chance to actually get there, much more so than with any prior virtual-reality installation that exists in the world."
Edelstein says the results of this prototype study indicate a "progressively subtle use of visual cues as subjects navigated the ambiguous space."
"In the case where obvious cues were not presented, subjects looked for any distinguishing features that might indicate location, including shadows around doors or patterned finishes," she continues. "This suggests a continuum of cue effectiveness dependent on the surrounding context and the opportunity to repeatedly search for cues."
Schulze says the team has various ideas for how the study can be expanded in the future, including running the same study in Atkinson Hall proper.
"That's part of why we chose to do the study in the virtual-reality representation of the building — so that we can easily just do the same study in the real building," he says. "We've also considered running another user study with a more complex way-finding problem, since right now we only use one part of the building. In the future we might use either the whole first floor or develop a way-finding task that travels across floors. It's very easy to do — the data model exists — we just have to design a good, controllable study."
Meantime, the team plans to publish a paper based on the results of this prototype study and then use the findings to write a larger grant proposal. The researchers hope to eventually come to a deeper understanding of how brain processes underlying the formation of memories may provide new clues to how buildings should be designed.
Schulze says that once the larger way-finding study is complete, and scientists know more about how humans create cognitive maps of their environment, researchers could further employ virtual reality to test the efficacy of particular visual cues, thereby keeping building costs down.
"In future experiments we want to find out which of these landmark items work better for wayfinding than others," explains the Calit2 researcher. "We want to find out if we can reduce or optimize the costs for these landmarks by running these experiments in the virtual environment. That way we can find out what type of signage, wall color or other architectural cues work best, without having to conduct those trials in the real world."
In addition, future studies might focus on how the aesthetics of a building affect emotion centers in the brain, Schulze says.
"Although we didn't study it in this experiment, we did discuss the emotional effects of a building on a person with respect to how the person feels — whether they feel comfortable or safe, whether or not they feel like they know where they are. We can imagine that the equipment is capable of giving us the answers to these questions."
Related Links
UCSD Division of Biological Sciences
Swartz Center for Computational Neuroscience
UCSD's Atkinson Hall
Media Contacts
Tiffany Fox, (858) 246-0353, tfox@ucsd.edu