
Multimodal Diaries

Fernando De la Torre Frade and Carlos Agell
Conference Paper, Proceedings of the IEEE International Conference on Multimedia and Expo (ICME '07), pp. 839–842, July 2007

Abstract

Time management is an important aspect of a successful professional life. To better understand where our time goes, we propose a system that summarizes the user's daily activity (e.g., sleeping, walking, working on the PC, talking, ...) from all-day multimodal data recordings. Two main novelties are proposed: 1) A system that combines both physical and contextual awareness hardware and software, recording synchronized audio, video, body-sensor, GPS and computer-monitoring data. 2) A semi-supervised temporal clustering (SSTC) algorithm that accurately and efficiently groups large amounts of multimodal data into different activities. The effectiveness and accuracy of our SSTC is demonstrated on synthetic and real examples of activity segmentation from multimodal data gathered over long periods of time.
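
The abstract does not spell out the SSTC algorithm itself, so the following is only a minimal, hypothetical sketch of what semi-supervised temporal clustering of synchronized multimodal frames can look like: seeded k-means whose centroids are initialized from a few hand-labelled frames, followed by a sliding-window majority vote to enforce temporal coherence, run on synthetic data. All data, activity names, and parameters below are illustrative assumptions, not the method or results of the paper.

# Hypothetical sketch (NOT the paper's SSTC): seeded k-means on synchronized
# multimodal frames, followed by temporal majority smoothing.
import numpy as np

rng = np.random.default_rng(0)
k = 3  # number of activities in the synthetic example

# Synthetic, already-synchronized frames: each row is one time step, columns are
# concatenated features from different modalities (audio energy, accelerometer
# magnitude, PC activity level, ...).  Values and activity names are made up.
segments = [(0, 300), (1, 250), (2, 350), (1, 200)]           # (activity id, length)
means = np.array([[0.0, 0.0, 0.0],                            # "sleeping"
                  [2.0, 1.5, 0.0],                            # "walking"
                  [0.5, 0.2, 3.0]])                           # "working on the PC"
X = np.vstack([means[a] + 0.4 * rng.standard_normal((n, 3)) for a, n in segments])
true_labels = np.concatenate([np.full(n, a) for a, n in segments])

# Semi-supervision: a handful of hand-labelled frames per activity.
seed_idx = np.concatenate([rng.choice(np.flatnonzero(true_labels == c), 5, replace=False)
                           for c in range(k)])
seed_lab = true_labels[seed_idx]

# Seeded k-means: centroids initialized from the labelled frames, which stay fixed.
centroids = np.vstack([X[seed_idx[seed_lab == c]].mean(axis=0) for c in range(k)])
for _ in range(20):
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    labels[seed_idx] = seed_lab                                # keep labelled frames fixed
    centroids = np.vstack([X[labels == c].mean(axis=0) for c in range(k)])

# Temporal coherence: sliding-window majority vote over the frame labels.
w = 25
smoothed = np.array([np.bincount(labels[max(0, t - w):t + w + 1], minlength=k).argmax()
                     for t in range(len(labels))])

print("frame accuracy: %.3f" % (smoothed == true_labels).mean())

In a real deployment the feature matrix would come from the synchronized audio, video, body-sensor, GPS and computer-monitoring streams described above, and the temporal smoothing step is just one simple way to keep segment boundaries coherent over long recordings.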

Notes

Associated project: Multimodal Diaries

BibTeX

@conference{Frade-2007-9767,
author = {Fernando De la Torre Frade and Carlos Agell},
title = {Multimodal Diaries},
booktitle = {Proceedings of IEEE International Conference on Multimedia and Expo (ICME '07)},
year = {2007},
month = {July},
pages = {839--842},
keywords = {clustering, activity recognition, multimodal summarization},
}