ABSTRACT
Eye-trackers are expected to be used in portable daily-use devices. However, object information must be registered and a unified coordinate system defined in advance for human--computer interaction and quantitative analysis. We therefore propose semantic 3D gaze mapping, which collects gaze information from multiple people on a unified map and detects focused objects automatically. The semantic 3D map is reconstructed using keyframe-based semantic segmentation and structure-from-motion, and the 3D point-of-gaze is computed on this map. Experiments confirmed that the fixation time on a focused object can be calculated without prior information.
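The pipeline described above could be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes camera poses and intrinsics from structure-from-motion and a semantically labeled point cloud are already available, and all function names, parameters (e.g. `max_dist`), and data layouts are illustrative assumptions.

```python
import numpy as np

def gaze_ray(pose_R, pose_t, K, gaze_px):
    """Back-project a 2D gaze point into a 3D ray in map coordinates.
    pose_R, pose_t: world-to-camera rotation/translation from SfM.
    K: 3x3 camera intrinsics; gaze_px: (u, v) gaze point in pixels."""
    d_cam = np.linalg.inv(K) @ np.array([gaze_px[0], gaze_px[1], 1.0])
    origin = -pose_R.T @ pose_t              # camera center in the map frame
    direction = pose_R.T @ d_cam
    return origin, direction / np.linalg.norm(direction)

def point_of_gaze(origin, direction, points, labels, max_dist=0.05):
    """Pick the semantically labeled map point closest to the gaze ray
    (within max_dist meters) as the 3D point-of-gaze."""
    v = points - origin
    t = v @ direction                        # signed distance along the ray
    perp = np.linalg.norm(v - np.outer(t, direction), axis=1)
    mask = (t > 0) & (perp < max_dist)       # in front of camera, near the ray
    if not mask.any():
        return None, None
    i = np.where(mask)[0][np.argmin(t[mask])]  # nearest hit along the ray
    return points[i], labels[i]

def fixation_times(hit_labels, dt):
    """Accumulate per-object fixation time from a sequence of gaze hits,
    where dt is the eye-tracker sampling interval in seconds."""
    times = {}
    for label in hit_labels:
        if label is not None:
            times[label] = times.get(label, 0.0) + dt
    return times
```

Under these assumptions, running `point_of_gaze` per frame and feeding the resulting labels into `fixation_times` yields the per-object fixation durations the abstract refers to, without any prior registration of object models.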
Index Terms
- Semantic 3D gaze mapping for estimating focused objects