August 27, 2014 / By Tiffany Fox
San Diego, Calif., Aug. 27, 2014 — The National Biomedical Computation Resource (NBCR) at the University of California, San Diego has received $9 million in funding from the National Institutes of Health (NIH). The funding will allow NBCR to continue its work connecting biomedical scientists with supercomputing power and emerging information technologies.
Biomedical computation – which applies physical modeling and computer science to the field of biomedical sciences – is often a cheaper alternative to traditional experimental approaches and can speed the rate at which discoveries are made for a host of human diseases and biological processes.
The five-year NIH grant from the National Institute of General Medical Sciences provides funding for everything from staffing and training to developing biomedical research technologies for academic researchers around the world. It involves faculty from UC San Diego’s Division of Physical Sciences, School of Medicine, Jacobs School of Engineering and San Diego Supercomputer Center (SDSC), as well as faculty from The Scripps Research Institute (TSRI), a private, non-profit research organization.
“NBCR has evolved tremendously in the 21 years since it was created,” says Rommie Amaro, Director of NBCR, Associate Professor of Chemistry and Biochemistry at UC San Diego and an affiliate of the UC San Diego Qualcomm Institute (QI). “Our main effort remains focused on making connections across diverse scales of biological organization. As scientists, we are very good at looking at particular components of the human body within a single scale, but we ultimately need to connect across three or four scales in order to model and understand complex biological phenomena, from molecular-level minutiae all the way up to the whole organ.”
NBCR is run under the auspices of the UC San Diego Center for Research in Biological Systems at QI. It provides a collection of computational tools – web services, graphical models, simulation methods and technologies and workflows – that make it possible for, say, a molecular biologist or neuroscientist to extrapolate how the molecular dynamics in brain cells might affect the whole organ.
“A large part of what we do is focused on incorporating dynamics to better understand how biological players interact to create emergent phenomena,” adds Amaro. “Experimentalists can take snapshots of what’s happening in the body, but these views provide insights that are, essentially, frozen in time. We use predictive physical modeling in order to virtually animate how all the different players come together and collectively act to result in behavior that would otherwise be unpredictable if we were limited to looking at a single scale.”
By way of example, Amaro cites the cross-disciplinary work of Mathematics Professor Michael Holst, Neurosciences Professor Mark Ellisman, Chemistry and Biochemistry Professor Andrew McCammon and Bioengineering Professor Andrew McCulloch – all at UC San Diego – as well as colleagues Michel Sanner and Art Olson at TSRI. All work collaboratively to develop new tools and technologies that enable scientists to model and understand the causes of heart failure.
Together, the team develops patient-specific heart models, available through NBCR, to analyze what happens at the organ level when a heartbeat becomes irregular. These models are connected to images of the macroscopic units that regulate calcium (and thus heartbeats); drilling deeper down reveals defects in the molecular components that interact with and respond to calcium ions. Models are visualized at multiple scales using state-of-the-art software developed by the TSRI collaborators.
Amaro explains: “The tools developed by NBCR allow researchers to follow a hypothesis all the way from the whole organ, through to the level of cells, and, deeper still, connecting all the way down to the protein or small molecule level. At the very smallest of scales is where drug development happens – and we develop tools for that, too.
“Underneath our abilities to connect across these scales,” she continues, “we are driven by the emerging current of big data. For example, the amount of data coming off new microscopes is potentially petabytes per day. How do we best handle, store, and analyze this data?” According to Computer Weekly, one petabyte is enough to store the DNA of the entire population of the U.S. – and then clone them, twice.
“A related challenge,” adds Amaro, “is the sheer diversity of biomedical data, and that’s where workflows come in. NBCR’s workflow-enabled tools and their interfaces make it possible to perform, with the click of a button, a complicated experiment comprising more than 100 different steps. This not only makes the experiments more reliable and reproducible, but also makes it easier for research leaders to teach and train the next generation of data-savvy scientists.
“Big data is a big part of this puzzle and is the driving force behind many of the tools that we are developing now and over the next five years.”
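The one-click idea Amaro describes can be made concrete with a toy sketch. The example below is purely illustrative and is not NBCR’s actual software (NBCR’s production tools are full workflow systems with graphical interfaces); it only shows the underlying pattern of wrapping a multi-step experiment behind a single entry point while logging each step for reproducibility. All function names here are hypothetical stand-ins.

```python
# Illustrative sketch only: chaining many analysis steps behind one
# entry point, with a provenance log so each run can be reproduced.

def run_pipeline(data, steps):
    """Run each step in order, recording inputs and outputs for provenance."""
    provenance = []
    for step in steps:
        result = step(data)
        provenance.append((step.__name__, data, result))
        data = result
    return data, provenance

# Three stand-in "experiment" steps; a real pipeline might have 100+.
def normalize(x):  return [v / max(x) for v in x]
def threshold(x):  return [v for v in x if v > 0.5]
def count(x):      return len(x)

result, log = run_pipeline([2.0, 8.0, 5.0, 1.0],
                           [normalize, threshold, count])
print(result)                        # → 2 values above threshold
print([name for name, *_ in log])    # → ['normalize', 'threshold', 'count']
```

A real workflow system layers scheduling, error recovery and distributed execution on top of this basic chaining idea, which is what makes 100-step experiments both push-button and auditable.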
Ilkay Altintas de Callafon, Director of the Workflows for Data Science Center of Excellence at SDSC and an affiliate of QI, develops workflow tools for NBCR that help researchers tackle the tough problems – reproducibility among them – that arise when research is conducted across scales.
“Her work is very important,” notes Amaro, “since conducting this type of research is hard enough at a single scale and even harder when crossing two or three or more scales.”
Also crucial to NBCR’s mission is the work of SDSC Program Director Phil Papadopoulos, who has created a user-driven, software-based framework for research teams to share significant quantities of data – rapidly, securely and privately – across geographic distance and computing systems.
“With this funding from NIH we have plans to build a number of tools to allow us to develop atomic-scale models of much larger biological structures than ever before,” says Amaro. “We used to just study single proteins – now we have tools that allow us to contextualize them in their subcellular environments. We can look, for example, not just at a single protein but instead understand how this protein sits inside an entire influenza viral particle.
“Another example relates to autism and brain research – looking at neurons in trans-synaptic areas. Instead of just looking at one protein, we are now connecting to subcellular views provided by emerging microscopy techniques to fill in atomic-level detail within these larger swaths of biological space. We’re talking about scaling up the system sizes from tens of nanometers to hundreds of nanometers, even microns in dimension.”
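The jump in scale Amaro describes can be put in rough numbers. The back-of-envelope sketch below is not from the article; it assumes a density of roughly 100 atoms per cubic nanometer, a common ballpark for hydrated biological matter, to show why micron-scale atomic models become a data and computing challenge.

```python
# Rough estimate only: atom counts at two system sizes, assuming a
# ballpark density of ~100 atoms per cubic nanometer for hydrated
# biological matter. Numbers are illustrative, not from NBCR.

ATOMS_PER_NM3 = 100  # approximate density assumption

def atom_count(edge_nm):
    """Approximate atoms in a cubic region with the given edge length (nm)."""
    return ATOMS_PER_NM3 * edge_nm ** 3

small = atom_count(10)     # tens-of-nanometers scale: a large protein complex
large = atom_count(1000)   # micron scale: a subcellular region
print(f"{small:.0e} atoms vs {large:.0e} atoms: {large // small:,}x larger")
```

Under these assumptions, a tenfold-to-hundredfold increase in linear dimension translates into a millionfold increase in atoms to simulate, store and visualize, which is why the grant pairs modeling advances with data-handling and visualization work.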
These studies require innovations in modeling techniques, hardware and infrastructure development, data handling, and visualization capabilities. NBCR contributes to all of these aspects.
Amaro points out that NBCR’s reach extends far beyond the confines of one particular field.
“It’s research that is synergistically involved with a lot of different efforts on campus and across the Torrey Pines Mesa,” she adds. NBCR also supports students in the Pacific Rim Experiences for Undergraduates (PRIME) program, which provides undergraduates the opportunity to do real research while living for nine weeks in one of several Pacific Rim countries, working with mentors at both the host institution and at UC San Diego.
“That’s one of the most exciting things about the resource – all the different people involved, from the School of Medicine to SDSC and the Qualcomm Institute to Engineering, the Physical Sciences and TSRI. NBCR is one of UCSD’s most integrated resources. We have faculty and members from nearly every school and division on campus, as well as participants from across the mesa. It’s a powerful example of how much more can be accomplished when we work together to address challenges at the frontier of biomedical research.”
Related Links
National Biomedical Computation Resource
Media Contacts
Tiffany Fox, (858) 246-0353, tfox@ucsd.edu