Study shows automatic recognition of facial expressions tracks student engagement in real time
San Diego, April 15, 2014 -- Computer scientists have developed a technology that uses facial expression recognition to detect how engaged students are during a class and to predict how well they will do in that class. The team, led by scientists at the University of California, San Diego and Emotient, a San Diego-based provider of facial expression recognition technology, showed that the technology was able to detect students’ level of engagement in real time just as accurately as human observers. The team also included researchers from Virginia Commonwealth University and Virginia State University.
“Automatic recognition of student engagement could revolutionize education by increasing understanding of when and why students get disengaged,” said Dr. Jacob Whitehill, Machine Perception Lab researcher in UC San Diego’s Qualcomm Institute and Emotient co-founder. “Automatic engagement detection provides an opportunity for educators to adjust their curriculum for higher impact, either in real time or in subsequent lessons. Automatic engagement detection could be a valuable asset for developing adaptive educational games, improving intelligent tutoring systems and tailoring massive open online courses, or MOOCs.”
Whitehill (Ph.D., ’12) received his doctorate from the Computer Science and Engineering department of UC San Diego’s Jacobs School of Engineering.
The study consisted of training an automatic detector that measures how engaged a student appears in webcam video recorded while the student underwent cognitive skills training on an iPad®. The study used automatic expression recognition technology to analyze students’ facial expressions on a frame-by-frame basis and estimate their engagement level.
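The paper does not publish its model, but the frame-by-frame pipeline described above can be sketched roughly as follows: per-frame facial-expression features are mapped to a frame-level engagement score, and the scores are then aggregated over the clip. Every feature name, weight and formula below is an illustrative assumption for this sketch, not Emotient's actual technology or the study's trained detector.

```python
import math

# Illustrative sketch only: feature names and weights are invented
# placeholders, not the study's detector or Emotient's product.

def frame_engagement(features, weights, bias=0.0):
    """Map one frame's facial-expression features to a score in (0, 1)."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic squashing to (0, 1)

def clip_engagement(frames, weights):
    """Average per-frame scores to estimate engagement over a video clip."""
    scores = [frame_engagement(f, weights) for f in frames]
    return sum(scores) / len(scores)

# Hypothetical per-frame expression features for two webcam frames.
weights = {"brow_raise": 0.8, "eye_closure": -1.5, "smile": 0.4}
frames = [
    {"brow_raise": 0.2, "eye_closure": 0.1, "smile": 0.0},
    {"brow_raise": 0.6, "eye_closure": 0.0, "smile": 0.3},
]
print(round(clip_engagement(frames, weights), 3))
```

In the actual study, the mapping from expressions to engagement was learned from human annotations rather than hand-weighted as in this toy example; the point is only that scoring happens per frame and is then aggregated.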
Javier Movellan, director of UC San Diego's Machine Perception Lab and an Emotient co-founder, developed this work on using computer vision to track student engagement partly through his and Whitehill's involvement with the UCSD-based Temporal Dynamics of Learning Center (TDLC), which is led by CSE Prof. Gary Cottrell. TDLC helped fund the research and enabled the partnership among Whitehill, Movellan and a third co-author, Zewelanji Serpell, professor of developmental psychology at Virginia Commonwealth University, who is a PI on TDLC's Social Interaction Network (along with Movellan). The study's other co-authors are Yi-Ching Lin and Aysha Foster of the department of psychology at Virginia State University.
Emotient was founded by a team of six Ph.D.s from UC San Diego who are among the foremost experts in applying machine learning, computer vision and cognitive science to facial behavioral analysis. Its proprietary technology sets the industry standard for accuracy and real-time delivery of facial expression data and analysis. Emotient’s facial expression technology is currently available as an API for Fortune 500 companies in consumer packaged goods, retail, healthcare, education and other industries.
Media Contacts
Doug Ramsey/UC San Diego, 858.822.5825, dramsey@ucsd.edu, and Vikki Herrera/Emotient, 858.314.3385, Vikki@emotient.com
Related Links
Machine Perception Laboratory
The Faces of Engagement Article
Calit2’s Qualcomm Institute
Emotient, Inc.
Computer Science and Engineering
IEEE Transactions on Affective Computing
Temporal Dynamics of Learning Center