UCI’s Carter Butts on the Human Dimension of Crisis Response

March 10, 2004 / By Anna Lynn Spitzer

Carter Butts, co-PI of RESCUE and an assistant professor in UCI's Sociology Department

Firestorms in Southern California last October – described as the “worst natural disaster in southern California history” – have underscored the importance of responding to disasters effectively. Doing so can reduce the number of deaths and injuries, contain or prevent secondary disasters, and reduce the resulting economic losses and social disruption. Crisis responders need to gather situational information (e.g., the state of the civil, transportation, and information infrastructures) and resource information (e.g., available medical facilities, rescue and law enforcement units). Clearly, there is a strong correlation between the reliability and timeliness of the information available and the quality of the decisions that responders make.

It is in this context that the RESCUE project was awarded a $12.5M Information Technology Research grant from the National Science Foundation last fall. The goal of the project is to radically transform the ability of responding organizations to gather, manage, use, and disseminate information among themselves and to the general public. With more robust information systems in place, responders can focus on the activities that have the highest potential to save lives and property. RESCUE is led by PIs Sharad Mehrotra of UCI and Ramesh Rao of UCSD.

Carter Butts, an assistant professor in the UCI Sociology Department and one of the project’s co-PIs, is looking at how people – both emergency responders and the public – respond in a crisis situation. “My main focus within the project is the measurement and prediction of human behavior in crisis situations,” he says.

Butts’ group is also applying technology to self-assessment in crisis response. “We’re developing non-intrusive tools to collect better data so that response organizations can study what they really do – as opposed to what they think they do – to improve their performance over the long term,” he says.

His group is also developing technology solutions that will aid the response process itself. They are eager to pilot ideas that can be integrated with systems in the field, such as survivable communications built on peer-to-peer techniques. “In real-world crisis settings,” he says, “you can have tremendous heterogeneity. You’ve got to be able to cope with background noise, infrastructure that isn’t working, and a crush of responders using different technologies and procedures. We hope to develop communications tools that will enable the right people to communicate at the right times, even under these difficult conditions.”
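
To make the peer-to-peer idea concrete, here is a minimal sketch of infrastructure-free message relay via gossip-style flooding. The node names, topology, and fanout are hypothetical illustrations, not RESCUE’s actual design:

```python
import random

# Minimal gossip/flooding sketch: each peer relays a message to a few
# reachable neighbors, so information spreads without any central server.
# Node names, topology, and fanout are hypothetical illustrations.

class Peer:
    def __init__(self, name):
        self.name = name
        self.neighbors = []   # peers currently within radio range
        self.seen = set()     # message IDs already handled

    def receive(self, msg_id, text, fanout=2):
        if msg_id in self.seen:
            return            # suppress duplicate floods
        self.seen.add(msg_id)
        print(f"{self.name} got: {text}")
        # Relay to a random subset of neighbors (the "gossip" step).
        k = min(fanout, len(self.neighbors))
        for peer in random.sample(self.neighbors, k):
            peer.receive(msg_id, text, fanout)

# An ad hoc mesh: a command post, two field units, and a volunteer.
a, b, c, d = (Peer(n) for n in ["command", "unit-1", "unit-2", "volunteer"])
a.neighbors = [b, c]; b.neighbors = [a, d]
c.neighbors = [a, d]; d.neighbors = [b, c]

a.receive("msg-001", "Evacuate sector 4")   # reaches every peer, no server
```

Because every peer both consumes and forwards messages, a mesh like this degrades gracefully: in the topology above, losing any single node still leaves a path between the remaining peers.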

When asked to describe his group’s approach to these challenging problems, Butts says, “Our main tactic might be described as search, synthesis, and simulation, followed by testing. We comb the literature in a wide variety of disciplines, culling theories which relate to the problem at hand. We then attempt to synthesize these into computational models, which we use to produce predictions for real-world behavior. Ultimately, our aim is to test these predictions against data drawn from experimental and field settings, which will allow us to improve and refine the models.”

One model relates to responder communication. “For example,” says Butts, “we are working on techniques to non-intrusively gather data on how responders communicate with each other in the field.” Proximity detection technology can be very helpful in determining, for example, who was within several feet of whom for an extended period. “Close proximity can be strong evidence of interpersonal interaction,” Butts says. While he notes that this source of information is fallible, combining it with phone and e-mail records (e.g., who’s sending messages to whom) and video records (showing people’s orientations with respect to each other) makes it possible to build a reliable picture of the total communication network. “Right now,” he says, “we have methods for integrating people’s perceptions of the communication network. Our objective is to extend these methods by bringing information technology into the mix.”
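
A minimal sketch of how such fallible sources might be combined: treat each data stream as a noisy witness to each potential tie, with its own true- and false-positive rates, and update the probability of the tie source by source. The rates and prior below are invented for illustration (in practice they would themselves have to be estimated), and this is a stand-in for, not a reproduction of, the group’s actual model:

```python
# Sketch of fusing fallible evidence sources (proximity badges, call logs,
# video coding) into a posterior probability that two responders interacted.
# All rates and the prior are invented for illustration.

def tie_posterior(reports, tpr, fpr, prior=0.1):
    """Posterior P(tie exists) given binary reports from several sources
    with known true-positive (tpr) and false-positive (fpr) rates."""
    odds = prior / (1.0 - prior)
    for saw_it, tp, fp in zip(reports, tpr, fpr):
        if saw_it:                    # source reported an interaction
            odds *= tp / fp
        else:                         # source reported none
            odds *= (1.0 - tp) / (1.0 - fp)
    return odds / (1.0 + odds)

# Three hypothetical sources for one pair of responders:
reports = [1, 1, 0]          # proximity badge: yes, call log: yes, video: no
tpr = [0.80, 0.60, 0.50]     # P(source reports tie | tie exists)
fpr = [0.30, 0.05, 0.10]     # P(source reports tie | no tie)

print(f"P(tie | evidence) = {tie_posterior(reports, tpr, fpr):.3f}")
```

Here the noisy proximity badge alone would be weak evidence, but its agreement with the call log pushes the posterior well above the prior, which is precisely the benefit of bringing multiple fallible streams together.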

In a crisis, both responders and members of the general public have to quickly assess the situation to form an appropriate response. Butts says, “In a crisis, the natural response is to try to figure out the nature and degree of severity of the hazard, as well as the extent of personal exposure.” Once convinced of the reality of a threat, people do not sit idly by. Rather, they tend to react proactively, seeking information from trusted sources (which, besides their families, friends, and coworkers, can include the media and government officials), and trying to make sense of the situation. This process influences the way in which members of the public make decisions about how to respond to the threat, including whether or not to provide resources to response organizations.

"A common perception is that people become malfeasant,” says Butts. Instead, research indicates that people can become extremely cooperative: They want to help, which often leads to massive surges of volunteerism, such as we saw on 9/11. “We call these responses outbreaks of cooperation,” says Butts with a smile. “People employed in response professions who are off-duty or outside the affected area will stream in, for example,” he says. “That self-deployment is an asset, but it’s also a problem: How do you manage this kind of complexity, direct resources to the right places, and keep people safe and organized, especially when you’re dealing with a mix of response organizations – not to mention untrained volunteers – reporting for duty?”

One of the interesting social science components of the RESCUE project is its study of “humans as sensors.” “People have a tremendous ability to process and report information,” says Butts, “and they have great strengths relative to existing technology. A human can interpret complex situations, process speech and other social cues, and do other things which are impractical with current systems.” Another asset of human sensors is their ubiquity. “In any major crisis event,” he says, “you’ve got people on the scene. The potential upside to harnessing their observations for improved situation assessment is substantial.”

But people are also imperfect sensors, constrained by the limits of their perception and memory. People do not record exactly; they record, remember, and relate what is consonant with their experience and knowledge. Furthermore, much of the information available to those on the scene may not be first-hand. The information provided by an informant at a crisis site may result from previous conversations with other people in the area, thus “contaminating” the original report.

Nor are all informants created equal. Some informants may have had more direct experience with the crisis, have better memory, or simply be more experienced observers; such informants tend to have lower average error rates. But those with greater expertise can also be more biased about what happened, due to their prior expectations. So whom do you trust? “The challenge in using informant reports lies in building statistical models that can cope with these kinds of errors,” says Butts.

The problem is essentially two-pronged: You want to figure out what happened and assess the accuracy of the sources of information. “If you know one, you can figure out the other,” says Butts. “Our methods allow you to figure out both at once.”
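
As a rough illustration of that two-pronged inference, consider a deliberately simplified latent-truth model in the spirit of classic estimators such as Dawid and Skene’s (1979): alternate between guessing which events occurred given current accuracy estimates, and re-estimating each informant’s accuracy given those guesses. The report matrix and starting values below are invented, and this sketch stands in for, rather than reproduces, the group’s actual methods:

```python
import numpy as np

# Jointly estimate "what happened" and informant accuracy with a simple
# EM loop. R[e, i] = 1 if informant i reported that event e occurred.
# Data and starting values are invented for illustration.

def em_truth_and_accuracy(R, n_iter=50):
    n_events, n_informants = R.shape
    acc = np.full(n_informants, 0.7)   # initial accuracy guesses
    pi = 0.5                           # prior P(an event really occurred)
    for _ in range(n_iter):
        # E-step: P(event occurred | reports), treating informants as
        # conditionally independent given the truth.
        like1 = np.prod(np.where(R == 1, acc, 1 - acc), axis=1)
        like0 = np.prod(np.where(R == 0, acc, 1 - acc), axis=1)
        post = pi * like1 / (pi * like1 + (1 - pi) * like0)
        # M-step: an informant's accuracy is the expected fraction of
        # their reports that agree with the estimated truth.
        agree = post[:, None] * R + (1 - post)[:, None] * (1 - R)
        acc = agree.mean(axis=0)
        pi = post.mean()
    return post, acc

# Five hypothetical events, four informants of varying reliability.
R = np.array([[1, 1, 1, 0],
              [1, 1, 0, 0],
              [0, 0, 0, 1],
              [1, 1, 1, 1],
              [0, 0, 1, 0]])
post, acc = em_truth_and_accuracy(R)
print("P(event occurred):", post.round(2))
print("informant accuracy:", acc.round(2))
```

Informants who tend to agree with the emerging consensus are credited with higher accuracy, and their reports are in turn weighted more heavily: each estimate bootstraps the other, which is how “knowing one lets you figure out the other” becomes “figuring out both at once.”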

Butts’ approach allows spontaneously arriving reports in various modalities – people calling in on cell phones or sending e-mail, data from wirelessly enabled instruments such as sensors and wearable cameras surveying the crisis site, and so on – to be analyzed and combined to infer both the history of an unfolding event and the accuracy of the sources themselves. Even given their potential unreliability, human reports are a valuable part of this process, because they are available when other sources of information might not be.

The RESCUE project is extending this concept of humans as sensors in crisis situations to “communities as sensors” through the CAMAS testbed. CAMAS stands for Crisis Assessment, Mitigation, and Analysis System; it extends the notion of a conventional problem-reporting system to include integrated event extraction, filtering, and analysis, with an emphasis on managing large volumes of data from a wide range of sources. Testbed applications for CAMAS now in development include an “alternate 911” system for load management during crisis events, an integrated call-in system for storm tracking, and security enhancement for settings such as airports. These testbeds are expected to yield a rich source of experimental data serving as “ground truth” for the research conclusions, while showcasing various aspects of the project’s data collection and analysis technologies.
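
CAMAS’s internal design is not detailed here, but as a purely hypothetical sketch of what the ingestion and event-extraction stage of such a reporting pipeline might look like (the CrisisReport fields, keyword table, and function names are all invented for illustration):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

# A purely hypothetical sketch of a CAMAS-like report-ingestion stage;
# the fields, keyword table, and function below are invented for
# illustration and are not the actual CAMAS design.

@dataclass
class CrisisReport:
    source: str                     # e.g., "phone", "e-mail", "sensor"
    location: str
    text: str
    received: datetime = field(default_factory=datetime.now)

def extract_event_type(report: CrisisReport) -> Optional[str]:
    """Toy keyword-based event extraction; a real system would use far
    more sophisticated language processing and filtering."""
    keywords = {"smoke": "fire", "fire": "fire", "flood": "flood",
                "injured": "medical", "collapse": "structural"}
    for word, event in keywords.items():
        if word in report.text.lower():
            return event
    return None                     # unclassified; route to a human

# Two hypothetical incoming reports:
reports = [
    CrisisReport("phone", "sector 4", "Heavy smoke near the overpass"),
    CrisisReport("e-mail", "sector 2", "Road is clear, no problems"),
]
for r in reports:
    event = extract_event_type(r)
    if event:
        print(f"[{r.source} @ {r.location}] classified as: {event}")
```

A real system would replace the keyword table with proper language processing, and would feed the classified reports into the kind of accuracy-weighted fusion described above.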

"This can be a very humbling field,” says Butts. “It’s not a field whose problems have never been thought about. In fact, practitioners and researchers have been trying to solve these problems for decades! What we can add – and where I hope to have impact – is our unusual combination of expertise in information technology, social science, and statistical analysis.”