Abstract
Analyzing gaze behavior with dynamic stimulus material is of growing importance in experimental psychology; however, efficient analysis tools that can handle dynamically changing areas of interest are still lacking. In this article, we present DynAOI, an open-source tool that allows for the definition of dynamic areas of interest. It works automatically with animations that are based on virtual three-dimensional models. When working with videos of real-world scenes, a three-dimensional model of the relevant content needs to be created first. The recorded eye-movement data are matched against the static and dynamic objects in the model underlying the video content, thus creating static and dynamic areas of interest. A validation study asking participants to track particular objects demonstrated that DynAOI is an efficient tool for handling dynamic areas of interest.
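The matching step described above can be illustrated with a minimal sketch (this is not DynAOI's actual implementation): for each gaze sample, the tracked 3D objects' positions for that video frame are projected into screen coordinates, and the sample is assigned to the nearest object within a pixel tolerance. All names here (`Object3D`, `project`, `match_gaze`) and the pinhole-camera parameters are illustrative assumptions.

```python
# Sketch of dynamic-AOI matching: project per-frame 3D object positions
# to screen space and assign each gaze sample to the closest object.
# Hypothetical names and parameters; not DynAOI's real API.

from dataclasses import dataclass


@dataclass
class Object3D:
    name: str
    positions: list  # one (x, y, z) camera-space position per frame


def project(point, focal=800.0, cx=640.0, cy=360.0):
    """Pinhole projection of a camera-space point to pixel coordinates."""
    x, y, z = point
    return (cx + focal * x / z, cy + focal * y / z)


def match_gaze(gaze_xy, frame, objects, radius=50.0):
    """Name of the object whose projection is closest to the gaze sample
    in this frame, if within `radius` pixels; otherwise None."""
    best, best_d = None, radius
    for obj in objects:
        px, py = project(obj.positions[frame])
        d = ((gaze_xy[0] - px) ** 2 + (gaze_xy[1] - py) ** 2) ** 0.5
        if d < best_d:
            best, best_d = obj, d
    return best.name if best else None


ball = Object3D("ball", positions=[(0.0, 0.0, 10.0), (0.5, 0.0, 10.0)])
cube = Object3D("cube", positions=[(2.0, 0.0, 10.0), (2.0, 0.0, 10.0)])

# Frame 1: the ball has moved, and a gaze sample near its new screen
# position is still attributed to the ball's (dynamic) area of interest.
print(match_gaze((680.0, 360.0), 1, [ball, cube]))  # → ball
```

Because the AOI is defined by the object in the underlying model rather than by a fixed screen region, the same gaze-to-object assignment works whether the object is stationary or moving.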
Additional information
This work was supported by a Karl-Steinbuch-Stipendium (scholarship) from the MFG Stiftung Baden-Württemberg to F.P.
Cite this article
Papenmeier, F., Huff, M. DynAOI: A tool for matching eye-movement data with dynamic areas of interest in animations and movies. Behavior Research Methods 42, 179–187 (2010). https://doi.org/10.3758/BRM.42.1.179