Fellows Conclude Summer Program

By Sharon Henry

Irvine, August 30, 2016 — SURF-IoT Fellows delivered their final project presentations at Calit2 on Thursday, Aug. 25.

The presentations by nine UC Irvine undergraduates marked the culmination of a unique ten-week summer research program that paired them with faculty mentors to conduct hands-on research related to the Internet of Things.

The Summer Undergraduate Research Fellowship in the Internet of Things (SURF-IoT) — first offered as the Summer Undergraduate Research Fellowship in Information Technology (SURF-IT) in 2005, and renamed in 2015 — is sponsored by Calit2 and the Undergraduate Research Opportunities Program (UROP).

Fellows and projects include:

TurtleBot with Qualcomm Snapdragon ARM CPU

Fellow: David Gogokhiya
Mentor: Solmaz Kia
Additional Mentor: Eli Bozorgzadeh

Mobile robots are increasingly being used in applications such as rescue operations, underwater exploration and space exploration. However, for a swarm of robots to successfully accomplish its task, each robot must first be able to determine its location in the environment. This project focuses on developing a robotic testbed for a localization technique called Cooperative Localization, one of a number of promising algorithms for GPS-denied environments. The technique uses the relative measurements that robots take of each other to improve their localization accuracy. The testbed we are developing consists of four mobile robots called TurtleBots that run the open-source Robot Operating System (ROS). We began by upgrading the TurtleBot's onboard computer from a netbook to a Qualcomm Snapdragon ARM CPU to gain finer control over the robots' computational and communication energy expenditure. We then ported ROS to the ARM processors and wrote wrappers and software for ROS to control multiple TurtleBots simultaneously. Experiments on the testbed showed that relying only on the robots' equations of motion, propagated from measured wheel velocities, cannot accurately localize the robots: in a three-minute test run, we observed a 30 cm error in a robot's location estimate. We are currently focused on using the TurtleBot's Kinect camera to take relative measurements of other robots during swarm operations, and on using those measurements to increase localization accuracy via Cooperative Localization techniques.
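The drift from wheel-velocity-only localization can be illustrated with a toy dead-reckoning simulation. The sketch below is not the team's testbed code; the wheel speeds, bias, and loop rate are hypothetical, chosen only to show how a small measurement error compounds over a three-minute run:

```python
import math

def propagate(pose, v_left, v_right, wheel_base, dt):
    """Dead-reckon a differential-drive pose (x, y, heading)
    from measured wheel velocities over one time step."""
    x, y, theta = pose
    v = (v_left + v_right) / 2.0             # forward speed (m/s)
    omega = (v_right - v_left) / wheel_base  # turn rate (rad/s)
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + omega * dt)

# A small, constant bias in the measured wheel speed compounds over time.
true_pose = est_pose = (0.0, 0.0, 0.0)
for _ in range(1800):  # three minutes at 10 Hz
    true_pose = propagate(true_pose, 0.20, 0.20, 0.23, 0.1)
    est_pose = propagate(est_pose, 0.2017, 0.2017, 0.23, 0.1)  # ~0.85% bias

drift = math.hypot(est_pose[0] - true_pose[0],
                   est_pose[1] - true_pose[1])  # ~0.3 m after 3 minutes
```

Cooperative Localization corrects exactly this kind of unbounded drift by fusing relative inter-robot measurements into each robot's estimate.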

Mining Twitter/Flickr for Plant Data

Fellow: Xin Hu
Mentor: Bill Tomlinson
Additional Mentors: Juliet Norton, Ankita Raturi

This work seeks to leverage social media to support California native plant horticulture by capturing and distributing ethnobotanical information along with plant lifecycles, locations, and community diversity. People produce and publish a variety of local plant data on social media platforms, and hashtags are a useful mechanism for categorizing plants based on people's interests and observations, as well as for community participation. Exploring such data allows us to discover people's preferences and opinions about the plants they encounter. This work seeks to structure existing public plant datasets, improve access for communities of interest, and allow people to explore and learn more about native plants. We present TagYourPlant, a web application that mines plant pictures and hashtag data from two social media platforms: Twitter and Flickr. A dataset of about half a million posts was collected and processed using a filtering system and regular-expression-based rules. A tag-relevance scoring system selects the most relevant plant data, and users can flag errors to improve accuracy. In contrast with plant data offered by academic or professional organizations, TagYourPlant provides relatable, concise, and up-to-date data, and makes the search process social and visual. In the future, TagYourPlant can be integrated with the Software for Agricultural Ecosystems Plant Database under development in the Green IT Lab.
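As a rough illustration of the filtering-and-scoring idea: the actual TagYourPlant pipeline is not reproduced here, and the tag set, threshold, and scoring rule below are invented for the sketch.

```python
import re

# Hypothetical whitelist of plant-related hashtags; a real system would
# draw these from a curated native-plant vocabulary.
PLANT_TAGS = {"californiapoppy", "toyon", "manzanita", "ceanothus"}
HASHTAG_RE = re.compile(r"#(\w+)")

def score_post(text):
    """Score a post by the fraction of its hashtags that are plant-related."""
    tags = [t.lower() for t in HASHTAG_RE.findall(text)]
    if not tags:
        return 0.0
    hits = sum(1 for t in tags if t in PLANT_TAGS)
    return hits / len(tags)

posts = [
    "Spring bloom! #CaliforniaPoppy #nativeplants",
    "Lunch downtown #foodie #tacos",
]
# Keep only posts above an (arbitrary) relevance threshold.
relevant = [p for p in posts if score_post(p) > 0.3]
```

A user-facing flagging step, as in TagYourPlant, would then let people correct any false positives that pass the threshold.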

A New Augmented Reality Interface for Game Based Stroke TeleRehabilitation

Fellow: Arzang Kasiri
Mentor: Walter Scacchi

In this study, we developed a new version of an Augmented Reality Game Based Stroke TeleRehabilitation (AR-GBSTR) system. Augmented reality is a means of human-computer interaction in which users perceive and interact with virtual objects juxtaposed with physical objects. In a prior study, AR game-based stroke telerehabilitation produced better outcomes than traditional game-based stroke telerehabilitation in the activity tested. The problem is that the earlier version was too expensive to reproduce and, hence, not scalable. AR-GBSTR version 2 is an attempt to develop a lower-cost, more scalable system through the use of IoT sensors, effectors, and controllers. In this study we developed an initial prototype using the Leap Motion hand-gesture sensor, an additional tablet computer as a secondary AR display, the Blender 3D modeling environment, and the Unity game development environment. The Leap Motion uses infrared sensing to track hand gestures and position in 3D space. Using this collection of sensors, devices, and software, we were able to create an AR game that incorporates upper-extremity stroke rehabilitation exercises and movement gestures. We are preparing for an initial formative assessment by the stroke rehabilitation research team, and the results will be included in the final presentation.

Marching Cubes Made Tangible

Fellow: Aldrin Ryan Lupisan
Mentor: Jesse Jackson

An art application that required a large number of 3D-printed objects provided an opportunity to optimize a printer farm: an array of 24 matching 3D printers. The primary requirement was maximum speed while maintaining an adequate finish. Two main parameters were changed in the settings before printing: infill (%) and speed (mm/s). Both infill and travel speed determine how much filament is needed and how long a block takes to complete. Initially, the blocks were set to 25% infill, which produced hefty blocks that took more than 12 hours to print; changing to 15% infill reduced the print time to 8 hours. Side by side, the finishes of the 25% and 15% blocks were roughly the same, with few distinguishable differences. This led to further trials below 15% infill, settling finally on 6% infill. A travel speed of ~60 mm/s is the recommended "fastest" speed, but our tests found that it was possible to run at 100 mm/s while still maintaining a proper finish. Failures were frequent: some infills crumbled when compressed by hand, and the prints repeatedly showed ridging. We also explored many other interlocking variables beyond the two mentioned above, with mixed results. These findings matter to the 3D-printing community because very few printer farms are rigorously tested at our rate, so a detailed evaluation of successes and failures is useful to similar future endeavors.
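The reported 25% and 15% infill timings allow a back-of-envelope extrapolation. The linear fit below is only a rough guide, since real slicer time estimates are nonlinear, but it shows one way to project the print time at the final 6% infill setting:

```python
# Two reported data points: 25% infill -> 12 h, 15% infill -> 8 h.
# Fit time = a + b * infill; 'a' stands in for the roughly fixed cost
# of shells and travel moves, 'b' for infill-dependent deposition time.
i1, t1 = 0.25, 12.0
i2, t2 = 0.15, 8.0

b = (t1 - t2) / (i1 - i2)  # hours per unit infill fraction
a = t1 - b * i1            # fixed hours (shells, travel, etc.)

predicted_6pct = a + b * 0.06  # projected hours at 6% infill
```

Under this naive model the 6% setting would come in around 4.4 hours per block; the article does not report the measured figure, so treat this as an illustration of the trade-off rather than a result.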

Proabot Flotilla

Fellow: Asis Nath
Mentor: Simon Penny

The goal of the Proabot Flotilla research project is to build a prototype for a fleet of inexpensive, environmentally friendly autonomous sailing craft, powered by sunlight and wind, for oceanographic and environmental research. The prototype is radio controlled, using a custom control interface, two-way radio communication, and custom sensors and actuators; the final version will be linked to the internet via satellite. In the prototype, the controller and boat both carry Arduino microcontrollers. Sail and rudder positions are commanded by the controller and managed by onboard sensor feedback. The first sensor suite includes a custom anemometer, a wind direction indicator, a water speed sensor and a digital compass. Other sensors, for data such as turbidity, water chemistry, currents, temperature and radiation, will be developed.
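One way to picture the sail/rudder control loop is a simple proportional rudder controller driven by the digital compass. This is a hypothetical sketch, written in Python for readability rather than the project's Arduino firmware, with made-up gain and deflection limits:

```python
def rudder_command(target_heading, compass_heading,
                   gain=0.8, max_deflection=35.0):
    """Proportional rudder controller: steer toward target_heading using
    digital-compass feedback. Angles are in degrees; the gain and the
    deflection limit are illustrative, not the project's tuned values."""
    # Wrap the heading error into [-180, 180) so the boat always turns
    # the short way around.
    error = (target_heading - compass_heading + 180.0) % 360.0 - 180.0
    deflection = gain * error
    # Clamp to the rudder's mechanical travel.
    return max(-max_deflection, min(max_deflection, deflection))
```

For example, a boat heading 350° with a target of 10° gets a modest starboard correction rather than a 340° turn to port; large errors saturate at the mechanical limit.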

Design of Implantable Ocular Micro Pressure Sensor for Continuous Monitoring

Fellow: Ruthannah Wang
Mentor: G.P. Li
Additional Mentors: Mark Bachman, Sarkis Babikian

Glaucoma, a medical condition associated with increased pressure within the eye, is the world's leading cause of blindness, with 70 million cases worldwide. Current methods of patient care are inadequate for continuously monitoring the development of the condition. In response to this need, this study explores several methods of designing an ocular implant able to monitor eye pressure wirelessly and in real time. One is a passive radio-frequency L-C resonator circuit, which can be implanted in the eye and consists of a micro inductor and a micro capacitor. The L-C design uses an inductor with a fluidic magnetic core as the pressure transducer. When pressure in the eye increases, the inductance L increases in a known way, which shifts the resonance frequency of the L-C circuit. An external antenna at radio frequency excites the implanted circuit, and changes in the reflected wave are monitored to quantify changes in the pressure inside the eye. A second sensor is an implantable passive optical sensor, consisting of a membrane with micro features. Increased pressure in the eye causes the membrane to deflect and stretch, changing the way light scatters off it. By monitoring the light-scattering patterns from this membrane, the pressure inside the eye can be quantified. Scaled laboratory prototypes were fabricated for both types of sensors and tested inside a pressure chamber designed as a physical model of the eye with similar pressure conditions. An analog-to-digital interface was also designed using a microcontroller to control and monitor the chamber pressure in real time. Preliminary results showed a linear response for both sensor types, suggesting that both strategies yield easily interpreted data and are thus viable candidates for an ocular implant design.
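The resonance shift described for the L-C sensor follows from the standard formula for an ideal L-C circuit, f = 1 / (2π√(LC)): as pressure drives the inductance L up, the resonant frequency moves down. The component values below are illustrative, not the device's actual parameters:

```python
import math

def resonant_frequency_hz(L_henry, C_farad):
    """Resonant frequency of an ideal L-C circuit: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_henry * C_farad))

# Illustrative values only: 1 uH and 1 pF put the resonance in the
# radio-frequency range an external antenna can interrogate.
f_baseline = resonant_frequency_hz(1.0e-6, 1.0e-12)

# A pressure rise increases L (here, a hypothetical 10% change),
# lowering the resonance in a way the external reader can detect.
f_pressurized = resonant_frequency_hz(1.1e-6, 1.0e-12)
```

Because the inductance changes "in a known way" with pressure, the measured frequency shift can be inverted back to an intraocular pressure reading.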

News and Social Media Data Analytics using TextDB, a Text-Centric Data Management System

Fellow: Zuozhi Wang
Mentor: Chen Li

News and social media services such as Facebook and Twitter generate a huge amount of data on a daily basis. Many companies and organizations rely on these data to analyze user behaviors and make critical business decisions. These requirements bring many new challenges, including storage, search, analysis, and visualization. In this project, we want to study how to do text analytics on social media data using state-of-the-art tools and techniques. As an example, we want to find solutions to problems such as “extract information about tweets mentioning Zika and their corresponding locations.” We are developing and using TextDB, a text-centric data management system, to analyze news and social media data efficiently.
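The Zika example can be pictured with a toy extraction over a couple of hand-made records. Real TextDB queries run over indexed data with dedicated operators; the regex sketch below only illustrates the task, and the sample tweets and fields are invented:

```python
import re

# Invented sample records standing in for a tweet stream with
# attached location metadata.
tweets = [
    {"text": "Zika case confirmed near the clinic", "geo": "Miami, FL"},
    {"text": "Great weather for a run today", "geo": "Irvine, CA"},
]

# Match the keyword as a whole word, case-insensitively.
ZIKA_RE = re.compile(r"\bzika\b", re.IGNORECASE)

# Extract (text, location) pairs for tweets mentioning Zika.
matches = [(t["text"], t["geo"]) for t in tweets if ZIKA_RE.search(t["text"])]
```

At the scale of news and social media streams, the point of a system like TextDB is to answer such queries efficiently over stored, indexed text rather than by scanning raw strings as this sketch does.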

MediCom: Comprehensive, Intuitive and Interactive View of Health Aspects to Improve Medication Compliance

Fellow: Chifeng Wen
Mentor: John Billimek

Additional Mentors: Sergio Gago, Anmol Rajpurohit

The United States has the highest health-care cost inflation among leading developed nations; from 2006 to 2010, U.S. healthcare costs increased by a staggering 19%. More importantly, overspending on U.S. healthcare due to overuse is estimated at $750 billion. One of the most promising solutions is patient-centered healthcare. More than half of patients nationwide do not take their medications as prescribed, yet only a small minority of those patients discuss concerns about their medication with their healthcare provider. Our approach to increasing patient accountability is MediCom, a multiplatform experience that users can access on computers, tablets and smartphones. Our objective was to identify a set of technologies that could collect health information. We tested health-collection devices that could track information and export data as a .csv file: step trackers, pill-bottle cap counters, blood pressure monitors, blood sugar self-testing devices, and phone applications. We chose devices based on cost, convenience, size, ease of exporting, and sharing capabilities, then incorporated the data the devices tracked into our phone application and website, where it is presented as a chart for both patients and doctors to use.
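Since each device exports a .csv file, assembling a chartable series is a straightforward parse. The field names and values below are made up for illustration; actual device exports vary by vendor:

```python
import csv
import io

# Hypothetical export from a step tracker; column names are invented.
export = """date,steps
2016-08-01,5400
2016-08-02,7200
"""

# Parse the export into (date, count) pairs ready for charting.
reader = csv.DictReader(io.StringIO(export))
series = [(row["date"], int(row["steps"])) for row in reader]
```

The same pattern applies to the other device exports (blood pressure, blood sugar, pill-cap counts), with each parsed series feeding one chart in the patient- and doctor-facing views.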