New Interactions and New Collaborations Within TDLC

New Haven, CT, April 20, 2007  -- One of the Temporal Dynamics of Learning Center's networks, the Perceptual Expertise Network (PEN), met April 19-20 for a workshop on the role of reward and attention in perceptual learning.

Distinguished guest speakers Marvin Chun of Yale University and Michael Frank of the University of Arizona.

Forty-three people attended the two-day event, held at the Yale Child Study Center in New Haven, where many ongoing collaborative projects were discussed. Two distinguished guest speakers were chosen by a committee of students and faculty. Marvin Chun (Yale University) told the group about his brain imaging work on the role of attentional control in perceptual encoding, and Michael Frank (University of Arizona) reported on a neurocomputational model of natural action selection that accounts both for normal decision making and for cognitive changes in disorders such as ADHD and Parkinson's disease.

In addition, Andrea Chiba and Javier Movellan, the leaders of two other TDLC networks, participated in the workshop. The discussion allowed the leaders of the newer research networks to observe interactions within a group that has been collaborating for over six years, and to identify opportunities for cross-network projects.

Chiba, a UCSD cognitive science professor who leads the Interacting Memory Systems Network (IMS), uses animal models to study how affect and attention modulate learning and memory. She believes that TDLC scientists studying learning in animals and in humans can benefit from opportunities to synchronize their research and confront big questions that transcend each individual field. Such interactions are starting to take place within TDLC, both within each of the four research networks and across them.

Javier Movellan, who heads the Social Interaction Network (SIN), told the group about his work using a baby robot to study what visual information is available to infants, and what can be learned automatically, without any external teaching signal.

Through TDLC, Jim Tanaka and Robert Schultz (PEN) and Javier Movellan and Marni Bartlett (SIN) are now starting a collaboration that brings together three unique technologies. Let's Face It! is a computer-based intervention to build face-processing skills in children with autism, developed in recent years by Schultz and Tanaka. The Machine Perception Laboratory at UCSD has recently developed the Computer Expression Recognition Toolbox (CERT), which analyzes facial expressions in real time. The UCSD group also developed RUBI, a social robot that interacts with toddlers to teach them simple concepts such as colors and shapes. By combining Let's Face It!, CERT, and RUBI, the new collaboration aims to develop a computer-based, dynamic, real-time facial expression tutoring system, embodied in a social robot, for children with Autism Spectrum Disorders and social learning deficits. The group believes such a system could be an especially effective intervention tool because controlled intervention and evaluation would take place in real time, while preserving the real-world temporal dynamics of a social learning experience.