Mobile Sensing for Behavior Modeling

Seminar Date: 2012-03-01
Seminar Time: 10am-11am
Seminar Location: M-184 VALE, 200 Meyran Avenue
Presenter: Joy Ying Zhang, Ph.D.
Presenter's Institution: Carnegie Mellon University, Silicon Valley

Today’s smartphones come equipped with a rich range of sensors, including GPS, accelerometers, Wi-Fi, Bluetooth, NFC, and microphones. Combined, this contextual information can tell us a great deal about a user’s current activity: where the user is, what the user is doing, and for how long. Logged over time, these data reveal the user’s behavior patterns, from which caregivers can design effective, personalized plans to improve the user’s health and wellbeing. Aggregated across hundreds of participants in a city, this information also tells us a great deal about the city itself: wait times for buses, how public and private places are used, what residents typically do, and so on. Such large-scale data collection and analysis offers a way to understand human behavior at scale, with positive impact in a number of domains, including health care, traffic planning, urban design, and social network analysis.

While collecting sensor information is trivial, making sense of the resulting heterogeneous sensor data is challenging. In our research, we separate the data representation from the processing algorithms to develop a generic framework for mobile sensing. Through quantization and sensor fusion, input from heterogeneous sensors is converted into a symbolic representation called behavior text. On top of this text-like representation, well-established statistical natural language processing algorithms developed for language modeling, information retrieval, text summarization, text classification, and even machine translation can be applied to tackle mobile sensing problems.
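
To make the behavior-text idea concrete, here is a minimal, hypothetical sketch in Python: accelerometer and GPS readings are quantized into a small symbol vocabulary, the per-timestep symbols are fused into a token sequence, and a smoothed bigram language model over that sequence scores how typical a stretch of behavior is. The binning thresholds, place labels, and function names are illustrative assumptions, not the actual pipeline described in the talk.

    import math
    from collections import Counter, defaultdict

    def quantize_accel(magnitude_g):
        """Bin accelerometer magnitude (in g) into coarse activity symbols (illustrative thresholds)."""
        if magnitude_g < 1.1:
            return "still"
        if magnitude_g < 1.6:
            return "walk"
        return "run"

    def quantize_place(lat, lon, places):
        """Snap a GPS fix to the nearest labeled place; `places` maps label -> (lat, lon)."""
        return min(places, key=lambda p: math.hypot(lat - places[p][0], lon - places[p][1]))

    def to_behavior_text(readings, places):
        """Fuse per-timestep readings (dicts with lat, lon, accel) into behavior tokens."""
        return [f"{quantize_place(r['lat'], r['lon'], places)}_{quantize_accel(r['accel'])}"
                for r in readings]

    def train_bigram(corpus):
        """Count symbol-to-symbol transitions over many sequences of behavior text."""
        counts = defaultdict(Counter)
        for seq in corpus:
            for prev, cur in zip(seq, seq[1:]):
                counts[prev][cur] += 1
        return counts

    def log_prob(seq, counts, vocab_size, alpha=1.0):
        """Add-alpha smoothed bigram log-probability; low scores flag atypical behavior."""
        lp = 0.0
        for prev, cur in zip(seq, seq[1:]):
            total = sum(counts[prev].values())
            lp += math.log((counts[prev][cur] + alpha) / (total + alpha * vocab_size))
        return lp

    # Example: two labeled places and a short trace of sensor readings.
    places = {"home": (40.4443, -79.9532), "office": (40.4445, -79.9450)}
    trace = [{"lat": 40.4443, "lon": -79.9532, "accel": 1.0},
             {"lat": 40.4444, "lon": -79.9490, "accel": 1.4},
             {"lat": 40.4445, "lon": -79.9450, "accel": 1.0}]
    day = to_behavior_text(trace, places)   # e.g. ['home_still', 'office_walk', 'office_still']
    model = train_bigram([day])
    print(day, log_prob(day, model, vocab_size=6))

In the same spirit, other language-processing tools carry over directly: retrieving users with similar behavior text resembles document retrieval, and condensing a week of tokens into a daily routine resembles text summarization.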
