UC San Diego Students Demonstrate Smart Camera Trap at New Engineering Competition
San Diego, May 7, 2012 -- Forget about building a better mouse trap. University of California, San Diego sophomore Riley Yeakle and his teammates have come up with a better camera trap, and they faced off against finalists from around the country when they unveiled working prototypes of their visions for embedded systems at a new, national engineering student competition. While falling short of the top three, both the UC San Diego and UC Berkeley teams were awarded Honorable Mentions for their innovations.
Embedded systems are computer systems designed and built for specific tasks. The UC San Diego undergraduates demonstrated their Sentinel intelligent camera trap system at the first annual Cornell Cup USA competition, presented by Intel, May 4-5 at Walt Disney World in Florida. The team was one of 22 collegiate teams selected as finalists to demonstrate their engineering projects in embedded system design and development.
The judges were looking at “how well we tackled the project, the robustness of the process, how we met specific technical challenges and limitations of existing camera traps, and how we measure our performance,” says Yeakle, the team captain from the Electrical and Computer Engineering (ECE) department of UCSD’s Jacobs School of Engineering. “The rules make it clear that they [were] looking for the team that makes the best use of its time and resources to meet a specific need.”
Yeakle and fellow ECE undergraduates Perry Naughton (fourth year), Kyle Johnson and Chris Ward (both third-year undergrads) benefited from direct access to National Geographic Society explorers and engineers through the UCSD-National Geographic Engineers for Exploration program. Led by Albert Lin, a research scientist in the UCSD division of the California Institute for Telecommunications and Information Technology (Calit2), the program puts teams of students to work on technologies that could eventually be deployed in the field.
“Our top students from a variety of disciplines develop world-class engineering solutions for challenges in exploration,” says Lin, a three-time UC San Diego alumnus (Ph.D. in materials science ’08, M.S. ’06 and B.S. ’04 in electrical engineering). “The concept for the computer vision-based, tracking camera trap was proposed and developed by the students, and their excitement has been contagious.”
“We talked to our colleagues at National Geographic about the shortcomings of current camera-trap systems and what they’d like to be able to do with traps that cannot be done with what’s available commercially today,” explains Yeakle. “National Geographic helped us focus in on the computer vision and object tracking, since other systems do not yet offer these features.”
“They expressed interest in being able to track animals, and that’s what convinced us to take advantage of camera vision to pinpoint and track animals automatically and remotely – capturing their movements in the wild with no explorer, photographer or scientist present.”
The new, improved camera trap introduces a cascade of low-power ‘geophone’ vibration sensors placed around the camera turret. The geophones convert ground movement into voltage, so if an animal comes close enough to trip a sensor (usually within a radius of three feet from the sensor), the turret automatically swings around to face in the direction of the tripped sensor.
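As a rough illustration of that triggering step, the logic might be sketched in Python along the following lines; the sensor layout, trigger threshold, and hardware helpers (read_voltage, rotate_turret_to) are hypothetical stand-ins, not the team’s actual code.

```python
# Hypothetical sketch of the geophone trigger logic: poll each vibration sensor
# and swing the turret toward the first one whose voltage crosses a threshold.
# Sensor bearings, the threshold, and the hardware helpers are assumed values.

GEOPHONE_BEARINGS = {0: 0, 1: 60, 2: 120, 3: 180, 4: 240, 5: 300}  # degrees around the turret
TRIGGER_VOLTS = 0.5  # assumed voltage separating a nearby footfall from background noise

def read_voltage(sensor_id: int) -> float:
    """Placeholder for an ADC read of one geophone channel."""
    raise NotImplementedError

def rotate_turret_to(bearing_deg: float) -> None:
    """Placeholder for the stepper-motor command that points the camera."""
    raise NotImplementedError

def poll_geophones() -> None:
    # The first sensor whose reading exceeds the threshold sets the turret's new heading.
    for sensor_id, bearing in GEOPHONE_BEARINGS.items():
        if read_voltage(sensor_id) > TRIGGER_VOLTS:
            rotate_turret_to(bearing)
            break
```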
Simultaneously, an Intel Atom processor begins to run a computer-vision algorithm looking for ‘blobs’ of color (groups of similarly colored pixels). A large, tan-colored blob, for example, may indicate a lion. If so, the camera can lock on to the moving blob and begin tracking it.
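A common way to implement this kind of color-blob detection is with OpenCV: threshold each frame in HSV color space for the color of interest and keep the largest connected region. The sketch below illustrates the idea only; the HSV bounds for ‘tan’ and the minimum blob size are assumptions, not the Sentinel team’s parameters.

```python
# Illustrative color-blob detection with OpenCV (not the team's actual algorithm).
import cv2
import numpy as np

TAN_LOW = np.array([10, 40, 80])     # assumed lower HSV bound for tan-colored fur
TAN_HIGH = np.array([30, 255, 255])  # assumed upper HSV bound
MIN_BLOB_AREA = 5000                 # ignore small patches of matching color (pixels)

def find_largest_tan_blob(frame_bgr):
    """Return the (x, y) centroid of the largest tan blob, or None if nothing large enough."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, TAN_LOW, TAN_HIGH)  # pixels whose color falls in the tan range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < MIN_BLOB_AREA:
        return None
    m = cv2.moments(largest)
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])  # blob centroid
```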
“Our object-tracking algorithm sends instructions to the stepper motors of the turret to keep the object in the center of the camera frame,” explains Yeakle. “This allows the camera to capture high-quality video of a moving animal, as well as an extended opportunity to take photographs after a regular camera trap may no longer ‘see’ the animal because it has disappeared out of frame.”
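One simple way to realize that centering behavior is proportional control on the turret’s pan axis: measure how far the blob’s centroid sits from the middle of the frame and command the stepper motor a proportional number of steps. The frame width, gain, and step_turret helper below are assumptions for illustration, not the team’s implementation.

```python
# Hypothetical proportional-control sketch for keeping the tracked blob centered.
FRAME_WIDTH = 640        # assumed camera frame width in pixels
STEPS_PER_PIXEL = 0.05   # assumed gain: motor steps commanded per pixel of error
DEADBAND_PIXELS = 20     # ignore tiny offsets so the motor doesn't jitter

def step_turret(steps: int) -> None:
    """Placeholder for a relative move command to the pan stepper motor."""
    raise NotImplementedError

def center_on_blob(blob_cx: int) -> None:
    # Positive error means the blob is right of center, so pan right by a proportional amount.
    error = blob_cx - FRAME_WIDTH // 2
    if abs(error) > DEADBAND_PIXELS:
        step_turret(int(error * STEPS_PER_PIXEL))
```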
“Getting access to the Intel Atom processor was a major factor in our decision to go the distance and integrate computer-vision techniques,” adds teammate Perry Naughton. “The Atom has good processing power, and computer vision is processing intensive.”
The resulting system has 360-degree coverage, so the digital single-lens reflex (DSLR) camera in the turret could track and record an animal walking in circles around the camera.
“Current camera traps have a limited field of view and they often are triggered mistakenly because the camera records the scene even if there is no animal there,” says Yeakle. “Our design overcomes those problems, and our ultimate goal is to produce a robust, intelligent, autonomous and low-power system that can be deployed in different environments.”
National Geographic has already expressed interest in the technology for future deployment to capture video and photos of wildlife in the field.
“I am incredibly proud of the synergistic energy of this dynamic team, made up of students who define the best in interdisciplinary collaboration, initiative, and leadership,” beams Calit2’s Lin, who is also a National Geographic Emerging Explorer. “The methods that they have created will have real impact in exploration, and they highlight the power of engineering for real-world applications.”
Modeled on the highly successful Intel Cup China, which attracts over 26,000 students, the Cornell Cup USA was designed as an exposition that invites students to dream up and build the next great embedded technology invention. The top three teams received prizes of $10,000, $5,000 and $2,500, respectively. The competition is organized by David Schneider of Cornell’s Systems Engineering program, and Byron Gillespie and Kimberly Sills of Intel.
Three of the 22 finalists for the inaugural Cornell Cup USA are based in California. UC San Diego’s Sentinel and UC Berkeley’s Solar Drone UAV are the only University of California projects to make the grade; the third California finalist is the University of Southern California’s VISIONary, an indoor navigation system for people who are visually impaired.
Intel travel support allowed all four members of the UC San Diego team to attend the competition. Also on hand was team advisor Ryan Kastner, a professor of Computer Science and Engineering (CSE) in the Jacobs School of Engineering. Kastner teaches CSE 145, a course on embedded systems, and co-directs the Engineers for Exploration program. He is currently on sabbatical in Washington, D.C. at National Geographic.
“The students have done a great job in building Sentinel as an intelligently-triggered video trap that senses and sees its environment in order to capture high-definition video of all animals in its surroundings,” says Kastner. “Observing and documenting the behavior of elusive and endangered species will one day be easier, cheaper and more robust thanks to the students’ ingenuity, teamwork, and our one-of-a-kind partnership with National Geographic.”
In the run-up to the competition in Florida, the UC San Diego team first tested the camera trap on a red box carried by hand around the camera. Then it was time to test the system ‘in the wild’, i.e., in a backyard with several dogs. So far, so good.
[Editor's note: This news release updated May 7 based on competition results.]
Related Links
Camera Trap Project
UCSD-NGS Engineers for Exploration Program
Cornell Cup USA, Presented by Intel
Media Contacts
Doug Ramsey, 858-822-5825, dramsey@ucsd.edu