For several years, the Human Dynamics research group at the MIT Media Lab has been using smartphone sensors to collect data about how people interact with each other. The data are used to study how certain aspects of behavior, such as food choices and political opinions, spread through a community. Until recently the system remained an in-house laboratory tool, but the group has now released it as open-source software, in hopes of attracting interest from researchers beyond their own group.
The system, known as Funf, is a phone-based data collection platform that uses the sensors built into smartphones, such as gyroscopes and accelerometers. It is composed of two parts: the Funf Journal, an app that runs on phones using Google's Android operating system and manages the collection and export of sensor data, and a set of PC-based tools for managing and visualizing the data coming off the phone.
Through checkbox menus, users of the Funf Journal can specify how often the system captures and analyzes data. Previous sensor settings and configurations can be saved and reloaded at any time, and, if the user permits it, updates can be broadcast to participants in a study.
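A saved configuration of this kind might look something like the JSON sketch below. This is an illustrative mock-up only; the field names, probe names, and structure here are invented for clarity and do not correspond to Funf's actual configuration format:

```json
{
  "study_name": "example_study",
  "probes": {
    "AccelerometerProbe": { "enabled": true, "period_seconds": 300 },
    "LocationProbe":      { "enabled": true, "period_seconds": 900 }
  },
  "export": { "format": "sqlite", "upload_period_hours": 6 }
}
```

Storing settings as a plain data file like this is what makes it possible to save, reload, and share configurations between studies.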
The developers also anticipate improvements that other researchers might contribute. “There are a lot of other research groups that are reinventing the wheel,” says Nadav Aharony, a PhD student in the group who led the software’s development. “We felt that we were so advanced in this field that we wanted to share this.”
Aharony’s collaborators include graduate student Wei Pan; Sandy Pentland, the Toshiba Professor of Media Arts and Sciences and leader of the Human Dynamics group; MIT affiliate Cory Ip SM ’11; Masdar Institute scholar Inas Khayal; master’s student Cody Sumter; and Alan Gardner ’05, a software developer whose participation in the project was funded by a grant from Google. Together, these researchers turned the system into a user-friendly software package, officially released last October 5.
How It Works
In the Funf framework, the controller for each sensor is called a “probe.” Interpreting the raw data coming off a phone’s sensors can be daunting even for experts, so the Funf Journal also includes higher-level probes designed to look for patterns in the data. One probe, for example, can distinguish the accelerometer data of a phone jostled on a subway train from that of a phone carried by someone walking briskly or climbing stairs, allowing the system to assign a score to the person’s physical activity over a specified span of time. Each of these additional probes is configurable through the same menus that manage the sensors themselves.
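The kind of pattern detection such a high-level probe performs can be sketched in a few lines of Python. This is not Funf's actual implementation; the function names, window size, and threshold below are invented for illustration. The idea is simply that a phone being carried by an active person produces more variable acceleration magnitudes than a phone sitting still or being passively jostled in a steady pattern:

```python
import math
import statistics

def activity_score(samples, window=20):
    """Crude physical-activity score for a list of (x, y, z)
    accelerometer samples: the mean variance of the acceleration
    magnitude over fixed-size windows of samples."""
    magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    scores = []
    for i in range(0, len(magnitudes) - window + 1, window):
        scores.append(statistics.pvariance(magnitudes[i:i + window]))
    return sum(scores) / len(scores) if scores else 0.0

def classify(score, threshold=1.0):
    """Map the score onto a coarse label; the threshold is made up."""
    return "active" if score > threshold else "idle"
```

A real probe would run continuously on streamed sensor events and use a far more careful classifier, but the structure is the same: reduce raw samples to a summary statistic, then map that statistic to a human-meaningful label.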
The Funf Journal ships with roughly 30 probes built in, but the Media Lab team is eager for developers outside MIT to build more. They suggest creating additional high-level probes, probes that use the data generated by those probes, and so on. Less tech-savvy users could still publish configuration settings they have found useful for particular tasks. “You can imagine a free marketplace of these configurations and also of these probes,” Aharony says. The team also provides APIs that let interested developers add probes or other Funf features to their own programs without using the Funf Journal itself.
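The idea of probes layered on probes can be illustrated with a minimal publish/subscribe pipeline. The class names and interfaces below are invented for illustration and do not correspond to Funf's actual APIs: a low-level probe forwards raw samples, and a high-level probe subscribes to it and emits derived data of its own:

```python
class Probe:
    """A minimal probe: produces data that other probes can subscribe to."""
    def __init__(self):
        self.listeners = []

    def emit(self, data):
        for listener in self.listeners:
            listener.on_data(data)

class AccelerometerProbe(Probe):
    """Low-level probe: forwards raw (x, y, z) samples to its listeners."""
    def sample(self, x, y, z):
        self.emit((x, y, z))

class StepCounterProbe(Probe):
    """High-level probe: consumes accelerometer samples and emits a
    running step count when the magnitude crosses a made-up threshold."""
    def __init__(self, source, threshold=11.0):
        super().__init__()
        self.steps = 0
        self.threshold = threshold
        source.listeners.append(self)

    def on_data(self, sample):
        x, y, z = sample
        if (x * x + y * y + z * z) ** 0.5 > self.threshold:
            self.steps += 1
            self.emit(self.steps)
```

Because a `StepCounterProbe` is itself a `Probe`, a still-higher-level probe (say, a daily activity summarizer) could subscribe to it in exactly the same way, which is what makes the layered, marketplace-of-probes vision composable.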