Coupling Eye-Motion and Ego-Motion Features for First-Person Activity Recognition
Workshop Paper, CVPR '12 Workshop on Egocentric Vision, June 2012
Abstract
We focus on the use of first-person eye movement and ego-motion as a means of understanding and recognizing indoor activities from an “inside-out” camera system. We show that when eye movement captured by an inside-looking camera is used in tandem with ego-motion features extracted from an outside-looking camera, the classification accuracy of first-person actions improves. We also present a dataset of over two hours of realistic indoor desktop actions, including both eye-tracking information and high-quality video from the outside-looking camera. Our experiments show that the joint feature is effective and robust across multiple users.
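The coupling of the two feature streams can be illustrated with a minimal sketch. This is a hypothetical illustration, not the paper's actual feature design: the function names, the use of direction histograms, and the bin counts are all assumptions. The core idea it shows is simply that per-window descriptors computed from eye motion and from ego-motion are concatenated into one joint feature vector before classification.

```python
import math

# Hypothetical sketch of a joint eye-motion / ego-motion feature.
# Direction histograms and the 8-bin size are illustrative assumptions;
# only the concatenation step reflects the coupling idea in the abstract.

def motion_histogram(displacements, n_bins=8):
    """Bin 2-D motion vectors (dx, dy) by direction into a normalized histogram."""
    hist = [0.0] * n_bins
    for dx, dy in displacements:
        angle = math.atan2(dy, dx) % (2 * math.pi)   # direction in [0, 2*pi)
        hist[int(angle / (2 * math.pi) * n_bins) % n_bins] += 1.0
    total = sum(hist)
    return [h / total for h in hist] if total else hist

def joint_feature(eye_motions, ego_motions):
    """Concatenate eye-motion and ego-motion histograms into one joint vector."""
    return motion_histogram(eye_motions) + motion_histogram(ego_motions)
```

The resulting 16-dimensional vector could then be fed to any standard classifier; the concatenation is what lets the classifier exploit correlations between where the eyes move and how the head moves.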
BibTeX
@workshop{Ogaki-2012-109821,
author = {K. Ogaki and Kris M. Kitani and Y. Sugano and Y. Sato},
title = {Coupling Eye-Motion and Ego-Motion Features for First-Person Activity Recognition},
booktitle = {Proceedings of CVPR '12 Workshop on Egocentric Vision},
year = {2012},
month = {June},
}
Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.