We focus on the use of first-person eye movement and ego-motion as a means of understanding and recognizing indoor activities from an "inside-out" camera system. We show that when eye movement captured by an inside-looking camera is used in tandem with ego-motion features extracted from an outside-looking camera, the classification accuracy of first-person actions can be improved. We also present a dataset of over two hours of realistic indoor desktop actions, including both eye-tracking information and high-quality outside-camera video. Experiments show that our joint feature is effective and robust across multiple users.
- Keisuke Ogaki, Kris Kitani, Yusuke Sugano and Yoichi Sato, “Coupling Eye-Motion and Ego-Motion Features for First-Person Activity Recognition”, in Proc. IEEE Workshop on Egocentric Vision (in conjunction with CVPR2012), June 2012.
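The coupling of the two modalities described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual feature pipeline; the function names, the summary statistics (mean/std of speed plus an 8-bin direction histogram), and the simple concatenation of the two feature vectors are all assumptions made for illustration.

```python
import numpy as np

def eye_motion_features(gaze_xy):
    """Hypothetical eye-motion descriptor from a (T, 2) array of gaze points:
    speed statistics plus an 8-bin histogram of movement directions."""
    v = np.diff(gaze_xy, axis=0)                 # frame-to-frame gaze displacement
    speed = np.linalg.norm(v, axis=1)
    angles = np.arctan2(v[:, 1], v[:, 0])
    hist, _ = np.histogram(angles, bins=8, range=(-np.pi, np.pi))
    return np.concatenate([[speed.mean(), speed.std()], hist / max(len(angles), 1)])

def ego_motion_features(flow_xy):
    """Hypothetical ego-motion descriptor from a (T, 2) array of per-frame
    mean optical flow from the outside-looking camera; same summary form."""
    speed = np.linalg.norm(flow_xy, axis=1)
    angles = np.arctan2(flow_xy[:, 1], flow_xy[:, 0])
    hist, _ = np.histogram(angles, bins=8, range=(-np.pi, np.pi))
    return np.concatenate([[speed.mean(), speed.std()], hist / max(len(flow_xy), 1)])

def joint_feature(gaze_xy, flow_xy):
    # Couple eye motion and ego-motion by concatenating the two descriptors;
    # the combined vector would then feed a standard classifier.
    return np.concatenate([eye_motion_features(gaze_xy), ego_motion_features(flow_xy)])
```

The resulting 20-dimensional joint vector could be passed to any off-the-shelf classifier (e.g. an SVM) to label first-person actions, mirroring the idea that the two streams are more discriminative together than alone.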