Multimodal Diaries - Robotics Institute Carnegie Mellon University


This project is no longer active.

In this project, we propose a system that summarizes a user's daily activity (e.g., sleeping, eating, working on a PC) from multimodal data (audio, video, body sensors, …). The system combines physically and contextually aware hardware and software to record synchronized audio-visual, body-sensing, global-position, and computer-monitoring data. A key component of our system is a semi-supervised temporal clustering algorithm that efficiently and accurately groups large amounts of multimodal data by activity. The system has a wide variety of applications in time management, self-knowledge, and human-computer interaction in general.
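The project's own algorithm is not given here, but the core idea of semi-supervised temporal clustering can be illustrated with a minimal sketch: a few labeled frames seed one centroid per activity, every frame is assigned to its nearest centroid, and a sliding majority vote enforces temporal coherence. All data, feature names, and parameters below are invented for illustration and are not from the project.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic multimodal feature stream: 300 time steps, 4 features
# (stand-ins for audio energy, body motion, GPS speed, PC activity).
# Three ground-truth activities occupy contiguous temporal blocks.
segments = [(0, 100, [0.1, 0.0, 0.0, 0.0]),    # "sleeping"
            (100, 200, [0.5, 0.2, 0.0, 0.9]),  # "working on a PC"
            (200, 300, [0.8, 0.6, 0.1, 0.1])]  # "eating"
X = np.vstack([rng.normal(mean, 0.05, size=(end - start, 4))
               for start, end, mean in segments])

# Semi-supervised seeding: a handful of labeled frames per activity
# define one centroid each (the "supervised" part of the clustering).
seed_idx = {0: [10, 50], 1: [110, 150], 2: [210, 250]}
centroids = np.array([X[idx].mean(axis=0) for idx in seed_idx.values()])

# Assign every frame to its nearest seeded centroid.
dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
labels = dists.argmin(axis=1)

# Temporal smoothing: majority vote over a sliding window, reflecting
# the prior that activities persist over many consecutive frames.
win_half = 5
smoothed = labels.copy()
for t in range(len(labels)):
    window = labels[max(0, t - win_half): t + win_half + 1]
    smoothed[t] = np.bincount(window).argmax()
```

On this toy stream the smoothed labels recover the three activity blocks almost perfectly; a real system would replace the nearest-centroid step with a richer clustering model and the raw features with learned multimodal descriptors.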


Past Staff

  • Carlos Agell