ABSTRACT
Recognition of user activities is a key issue for context-aware computing. We present a method for recognizing user daily activities using gaze motion features and image-based visual features. Gaze motion features dominate when inferring the user's egocentric context, whereas image-based visual features dominate when recognizing the environments and the target objects. The experimental results show that fusing these different types of features improves the performance of user daily activity recognition.
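As one way to read the abstract's fusion idea, the sketch below shows simple feature-level (early) fusion: a gaze-motion feature vector and a visual bag-of-words histogram are concatenated per segment and fed to a multi-class SVM. All names, dimensions, and the placeholder data are illustrative assumptions, not the paper's actual pipeline.

```python
# Minimal sketch of feature-level fusion for daily activity recognition.
# Assumes gaze-motion features and visual bag-of-words histograms have
# already been extracted per video segment; all names and dimensions below
# are hypothetical, chosen only to make the example run.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

n_segments = 200
gaze_dim = 32      # e.g., saccade/fixation statistics per segment (hypothetical)
visual_dim = 500   # e.g., bag-of-visual-words vocabulary size (hypothetical)
n_classes = 5      # e.g., reading, eating, walking, ... (hypothetical)

# Placeholder feature matrices standing in for the two modalities.
gaze_features = rng.random((n_segments, gaze_dim))
visual_features = rng.random((n_segments, visual_dim))
labels = rng.integers(0, n_classes, size=n_segments)

# Early fusion: concatenate the two feature vectors for each segment.
fused = np.hstack([gaze_features, visual_features])

# One plausible classifier choice: a multi-class RBF SVM with probability outputs.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
clf.fit(fused, labels)

print(clf.predict(fused[:5]))
```

Late fusion (training one classifier per modality and combining their class probabilities) is an equally plausible reading of "fusion"; the sketch above only illustrates the simpler concatenation variant.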