
Eye Tracking: A Brief Introduction

Chapter in Ways of Knowing in HCI

Abstract

Eye tracking is the process of measuring where eye gaze is focused in order to infer what someone is paying attention to or ignoring. The object of focus could be a digital display (e.g., on a phone, tablet, or computer) or another person in a conversation, for example in face-to-face settings or in video conferences. Researchers measure what is looked at (point of gaze), for how long (gaze duration), and the order in which gaze shifts.
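
To make these measures concrete, here is a minimal sketch (in Python, our choice rather than anything prescribed by the chapter) of how point of gaze, gaze duration, and gaze order might be summarized once an eye tracker has produced a fixation log. The fixation coordinates and the area-of-interest names are hypothetical.

    from collections import defaultdict

    # Hypothetical fixation log: point of gaze (x, y) in pixels and
    # fixation duration in milliseconds, in the order they occurred.
    fixations = [
        (512, 300, 240),  # (x, y, duration_ms)
        (518, 305, 180),
        (120, 640, 310),
    ]

    # Illustrative areas of interest (AOIs) on a display, as rectangles:
    # name -> (x_min, y_min, x_max, y_max).
    aois = {
        "headline": (400, 250, 700, 350),
        "sidebar": (0, 500, 300, 768),
    }

    def aoi_of(x, y):
        """Return the name of the AOI containing (x, y), or None."""
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None

    # Gaze duration: total dwell time per AOI.
    # Gaze order: the sequence of AOIs visited (a coarse scanpath).
    dwell_ms = defaultdict(int)
    scanpath = []
    for x, y, dur in fixations:
        region = aoi_of(x, y)
        dwell_ms[region] += dur
        scanpath.append(region)

    print(dict(dwell_ms))  # {'headline': 420, 'sidebar': 310}
    print(scanpath)        # ['headline', 'headline', 'sidebar']

Real analyses add fixation detection (see the notes below) and timestamps, but the three quantities named in the abstract reduce to bookkeeping of this kind.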


Notes

  1. Although saccades and fixations are most commonly analyzed for information processing tasks, other types of eye movements exist, such as pursuit, vergence, and vestibular eye movements. Pursuit eye movements have lower velocity than saccades and occur when the eyes follow a moving object. Vergence eye movements occur when the eyes move toward each other to fixate on a nearby object. Vestibular eye movements occur when the eyes rotate to compensate for head and body movements in order to maintain the same direction of vision. Other, smaller movements of the eyes include drifts and microsaccades. (A sketch of how fixations and saccades are separated in practice follows these notes.)

  2. Attention can be of two types: overt (the focus of attention matches where the eyes look) and covert (the focus of attention differs from where the eyes look). For example, when one looks up to concentrate, where the eyes look is not correlated with what one is thinking about; this is a case of covert attention. It has been argued that under most natural viewing conditions, the focus of attention correlates with where the eyes look. In the rest of this article, we refer to overt attention simply as attention.

  3. There are many companies offering hardware and software for eye tracking studies, both in laboratory or controlled desktop settings and in mobile contexts. Well-known companies include SMI (SensoMotoric Instruments), a spin-off founded by Dr. Winfried Teiwes and his academic mentors in 1991 (http://www.smivision.com/); Tobii Technology, established in 2001 by John Elvesjö, Henrik Eskilsson, and Mårten Skogö (http://www.tobii.com/); and Arrington Research, founded in 1995 by Dr. Karl Frederick Arrington as part of a technology transfer initiative at the Massachusetts Institute of Technology (http://www.arringtonresearch.com/). Other companies include Applied Science Laboratories (ASL), EyeTech, Mirametrix, Seeing Machines, and SR Research. Webcam-based eye tracking solutions include GazeHawk and EyeTrackShop.

  4. For more details on experimental design, please see the chapter on Experimental Research in HCI in this volume.
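
As anticipated in the first note above, fixations and saccades are separated chiefly by velocity. A standard method is velocity-threshold identification (I-VT), described in Salvucci and Goldberg (2000), listed below under Other references. The following is a minimal sketch, assuming gaze positions in degrees of visual angle sampled at a fixed rate; the 30 deg/s threshold is a conventional default rather than a value taken from this chapter.

    def classify_ivt(samples, hz, threshold_deg_per_s=30.0):
        """Label each gaze sample 'fixation' or 'saccade' by point-to-point
        velocity (I-VT). `samples` is a list of (x, y) positions in degrees
        of visual angle; `hz` is the sampling rate in samples per second.
        """
        labels = []
        for i, (x, y) in enumerate(samples):
            if i == 0:
                labels.append("fixation")  # no velocity estimate for the first sample
                continue
            px, py = samples[i - 1]
            # Distance between consecutive samples times the sampling rate
            # gives an instantaneous velocity in degrees per second.
            velocity = ((x - px) ** 2 + (y - py) ** 2) ** 0.5 * hz
            labels.append("saccade" if velocity > threshold_deg_per_s else "fixation")
        return labels

    # Example: 5 samples at 60 Hz; the ~5-degree jump between samples 2 and 3
    # (~300 deg/s) is labeled a saccade, the small drifts are fixations.
    print(classify_ivt([(0.0, 0.0), (0.1, 0.0), (5.0, 1.0), (5.1, 1.0), (5.1, 1.1)], hz=60))

Samples labeled "fixation" are then typically merged into fixation events whose durations feed measures such as gaze duration above.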


References and Further Reading

There are many texts that can be drawn on to learn more about eye tracking. We list some below.

References for getting more expert in this method

  • Duchowski, A. T. (2007). Eye tracking methodology: Theory and practice. Secaucus, NJ: Springer.

  • Duchowski, A. T. (2002). A breadth-first survey of eye tracking applications. Behavior Research Methods, Instruments, and Computers, 34, 455–470.

  • Goldberg, J. H., & Wichansky, A. M. (2003). Eye tracking in usability evaluation: A practitioner’s guide. In J. Hyönä, R. Radach, & H. Deubel (Eds.), The mind’s eye: Cognitive and applied aspects of eye movements (pp. 493–516). Oxford, UK: Elsevier Science.

  • Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., & Van De Weijer, J. (2011). Eye tracking: A comprehensive guide to methods and measures. Oxford: Oxford University Press.

  • Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124(3), 372–422.

  • Rayner, K. (2009). Eye movements and attention in reading, scene perception, and visual search. The Quarterly Journal of Experimental Psychology, 62(8), 1457–1506.

  • Henderson, J. (2003). Human gaze control during real-world scene perception. Trends in Cognitive Sciences, 7(11), 498–504.

  • Hayhoe, M., & Ballard, D. (2005). Eye movements in natural behavior. Trends in Cognitive Sciences, 9(4), 188–194.

  • Jacob, R. J. K., & Karn, K. S. (2003a). Eye tracking in human-computer interaction and usability research: Ready to deliver the promises. In J. Hyönä, R. Radach, & H. Deubel (Eds.), The mind’s eye: Cognitive and applied aspects of eye movement research (pp. 573–603). Oxford, England: Elsevier.

  • Poole, A., & Ball, L. J. (2005). Eye tracking in human-computer interaction and usability research: Current status and future prospects. Psychology, 10(5), 211–219.

References to example papers that have used the method well, with a short commentary on each reference and the questions it answers

  • Goldberg, J. H., & Kotval, X. P. (1999). Computer interface evaluation using eye movements: Methods and constructs. International Journal of Industrial Ergonomics, 24, 631–645.

    A comparative assessment of measures of eye movement locations and scanpaths used for evaluating interface quality. It revealed that well-organized functional groupings of icons result in shorter scanpaths covering smaller areas, and that less well organized interfaces produce less efficient search behavior. However, poorly organized icon groupings do not affect users’ ability to interpret or understand icons.

  • Torralba, A., Oliva, A., Castelhano, M. S., & Henderson, J. M. (2006). Contextual guidance of eye movements and attention in real-world scenes: The role of global features in object search. Psychological Review, 113, 766–786.

    While searching for objects in scenes, eye gaze is affected by the scene context: for example, one searches for pedestrians in the lower half of a scene, where the street is most likely to be, whereas one searches for birds in the upper half, where the sky is most likely to be.

  • Reinagel, P., & Zador, A. M. (1999). Natural scene statistics at the centre of gaze. Network: Computation in Neural Systems, 10, 341–350.

    While freely viewing images, eye gaze is biased towards image locations that have high spatial contrast.

  • Milosavljevic, M., Navalpakkam, V., Koch, C., & Rangel, A. (2012). Relative visual saliency differences induce sizeable bias in consumer choice. Journal of Consumer Psychology, 22(1), 67–74.

    Fast eye movements and choices between food items are driven by perceptual factors such as the visual catchiness or saliency of items, while slower choices are driven by high-level factors such as the value of items.

  • Navalpakkam, V., Rao, J. M., & Slaney, M. (2011). Using gaze patterns to study and predict reading struggles due to distraction. CHI 2011, 7–12 May, 2011, Vancouver, BC, Canada.

    Eye movements differ when users are distracted compared to when they are not. This study identifies eye gaze markers that are predictive of users’ reading struggle, or frustration. This study and its companion paper (Navalpakkam & Churchill, CHI 2012, focusing mainly on the relationship between eye and mouse tracking) have been used as examples of designing, conducting, and analyzing an eye tracking study.

  • Moore, R. J., & Churchill, E. F. (2011). Computer interaction analysis: Toward an empirical approach to understanding user practice and eye gaze in GUI-based interaction. Computer Supported Cooperative Work, 20(6), 497–528.

    The authors outline a novel methodological approach to understanding how humans interact with interactive applications and services, which they call “computer interaction analysis.” It extends traditional eye tracking approaches by interweaving and drawing out the relationships among people’s input actions, system display events, and people’s eye movements.

Other references

  • Argyle, M., & Cook, M. (1976). Gaze and mutual gaze. Cambridge: Cambridge University Press.

  • Ballard, D. H., Hayhoe, M. M., Pook, P. K., & Rao, R. P. (1997). Deictic codes for the embodiment of cognition. Behavioral and Brain Sciences, 20(04), 723–742.

  • Brennan, S. E., Chen, X., Dickinson, C., Neider, M., & Zelinsky, G. (2008). Coordinating cognition: The costs and benefits of shared gaze during collaborative search. Cognition, 106, 1465–1477.

  • Boraston, Z., & Blakemore, S. (2007). The application of eye tracking technology in the study of autism. The Journal of Physiology, 581, 893–898.

  • Burke, M., Hornof, A., Nilsen, E., & Gorman, N. (2005). High-cost banner blindness: Ads increase perceived workload, hinder visual search, and are forgotten. ACM Transactions on Computer-Human Interaction, 12(4), 423–445.

  • Buswell, G. T. (1935). How people look at pictures. Chicago, IL: University of Chicago Press.

  • Bruce, N. D., & Tsotsos, J. K. (2009). Saliency, attention, and visual search: An information theoretic approach. Journal of Vision, 9(3), 5. 1–24.

  • Cherubini, M., Nüssli, A. M., & Dillenbourg, P. (2008). Deixis and gaze in collaborative work at a distance: A computational model to detect misunderstandings. In ETRA ’08: Proceedings of the 2008 symposium on eye tracking research and applications (pp. 173–180). New York, NY: ACM.

  • Chi, J.-n., Zhang, P.-y., Zheng, S.-y., Zhang, C., & Huang, Y. (2009). Key techniques of eye gaze tracking based on pupil corneal reflection. In WRI global congress on intelligent systems, 2009: GCIS ’09, 19–21 May, 2009, Xiamen, CN (pp. 133–138). Washington, DC: IEEE.

  • Clarke, A. H., Ditterich, J., Druen, K., Schonfeld, U., & Steineke, C. (2002). Using high frame rate CMOS sensors for three-dimensional eye tracking. Behavior Research Methods, Instruments, and Computers, 34, 549–560.

  • Cooke, L. (2006). Is eye tracking the next step in usability testing? In IEEE international professional communication conference: IPCC ’06, 23–25 October, 2006, Saratoga Springs, NY (pp. 236–242). Washington, DC: IEEE. doi:10.1109/IPCC.2006.320355.

  • Cutrell, E., & Guan, Z. (2007). What are you looking for?: An eye tracking study of information usage in web search. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 407–416). New York, NY: ACM.

  • Dabbish, L., & Kraut, R. (2004). Controlling interruptions: Awareness displays and social motivation for coordination. In Proceedings of the 2004 ACM conference on computer supported cooperative work (CSCW ’04), Chicago, IL, 6–10 November, 2004 (pp. 182–191). New York, NY: ACM.

  • Dalton, K. M., Nacewicz, B. M., Johnstone, T., Schaefer, H. S., Gernsbacher, M. A., Goldsmith, H. H., et al. (2005). Gaze fixation and the neural circuitry of face processing in autism. Nature Neuroscience, 8, 519–526.

  • Ellis, S., Candrea, R., Misner, J., Craig, C. S., & Lankford, C. P. (1998). Windows to the soul? What eye movements tell us about software usability. 7th annual conference of the Usability Professionals’ Association conference: UPA ’98, June 22–26, 1998, Washington, DC

  • Exline, R. V. (1974). Visual interaction: The glances of power and preference. In S. Weitz (Ed.), Nonverbal communication (pp. 65–92). New York, NY: The Oxford University Press.

  • Exline, R. V., & Winters, L. G. (1965). Affective relations and mutual glances in dyads. In S. Tomkins & C. Izzard (Eds.), Affect, cognition and personality. New York, NY: Springer.

  • Fussell, S. R., Setlock, L. D., Parker, E. M., & Yang, J. (2003). Assessing the value of a cursor pointing device for remote collaboration on physical tasks. In CHI ’03 extended abstracts on human factors in computing systems (pp. 788–789). New York, NY: ACM.

  • Garrett, J. J. (2003). The elements of user experience: User-centered design for the web. New York, NY: New Riders Press.

  • Gergle, D., & Clark, A. T. (2011). See what I’m saying? Using dyadic mobile eye tracking to study collaborative reference. In Proceedings of the ACM 2011 conference on computer supported cooperative work (CSCW ’11) (pp. 435–444). New York, NY: ACM.

  • Gibson, J. J., & Pick, A. D. (1963). Perception of another person’s looking behavior. American Journal of Psychology, 76, 386–394.

  • Gramatikov, B. I., Zalloutm, O. H., Wu, Y. K., Hunter, D. G., & Guyton, D. L. (2007). Directional eye fixation sensor using birefringence-based foveal detection. Applied Optics, 46, 1809–1818.

  • Goffman, E. (1964). Behavior in public places. Glencoe: The Free Press.

  • Goodwin, C. (1984). Notes on story structure and the organization of participation. In M. Atkinson & J. Heritage (Eds.), Structures of social action (pp. 225–246). Cambridge: Cambridge University Press.

  • Granka, L. & Rodden, K. (2006). Incorporating eye tracking into user studies at Google. In Workshop Position paper presented at CHI.

  • Hanna, J., & Brennan, S. (2007). Speakers’ eye gaze disambiguates referring expressions early during face-to-face conversation. Journal of Memory and Language, 57(4), 596–615.

  • Henderson, J. M., Brockmole, J. R., Castelhano, M. S., & Mack, M. (2007). Visual saliency does not account for eye movements during search in real-world scenes. In R. van Gompel, M. Fischer, W. Murray, & R. Hill (Eds.), Eye movements: A window on mind and brain (pp. 537–562). Oxford: Elsevier.

  • Henderson, J. M., & Ferreira, F. (Eds.). (2004). The interface of language, vision, and action: Eye movements and the visual world. New York, NY: Psychology Press.

  • Huang, J., White, R., & Buscher, G. (2012). User see, user point: Gaze and cursor alignment in web search. In Proceedings of the 2012 ACM annual conference on human factors in computing systems (pp. 1341–1350). New York, NY: ACM.

  • Humphrey, K., & Underwood, G. (2009). Domain knowledge moderates the influence of visual saliency in scene recognition. British Journal of Psychology, 100, 377–398.

  • Itti, L., & Baldi, P. (2009). Bayesian surprise attracts human attention. Vision Research, 49(10), 1295–1306.

  • Itti, L., & Koch, C. (2000). A saliency-based search mechanism for overt and covert shifts of visual attention. Vision Research, 40(10–12), 1489–1506.

  • Itti, L., & Koch, C. (2001). Computational modeling of visual attention. Nature Reviews Neuroscience, 2, 194–203.

  • Jacob, R. J., & Karn, K. S. (2003b). Eye tracking in human-computer interaction and usability research: Ready to deliver the promises. Mind, 2(3), 4.

  • Jacobson, J., & Dodwell, P. C. (1979). Saccadic eye movements during reading. Brain and Language, 8(3), 303–314.

  • Javal, E. (1990). Essay on the physiology of reading. Ophthalmic and Physiological Optics, 10, 381–384.

  • Jermann, P., & Nüssli, M. A. (2012). Effects of sharing text selections on gaze cross-recurrence and interaction quality in a pair programming task. In Proceedings of the ACM 2012 conference on computer supported cooperative work (CSCW ’12) (pp. 1125–1134). New York, NY: ACM.

  • Johansen, S. A., Agustin, J. S., Skovsgaard, H., Hansen, J. P., & Tall, M. (2011). Low cost vs. high-end eye tracking for usability testing. In CHI ’11 extended abstracts on human factors in computing systems (pp. 1177–1182). New York, NY: ACM.

  • Just, M. A., & Carpenter, P. A. (1976). Eye fixations and cognitive processes. Cognitive Psychology, 8, 441–480.

  • Kanner, L. (1943). Autistic disturbances of affective contact. Nervous Child, 2(3), 217–250.

  • Kendon, A. (1967). Some functions of gaze-direction in social interaction. Acta Psychologica, 26, 22–63.

  • Klin, A., Jones, W., Schultz, R., Volkmar, F., & Cohen, D. (2002). Visual fixation patterns during viewing of naturalistic social situations as predictors of social competence in individuals with autism. Archives of General Psychiatry, 59, 809–816.

  • Land, M. F., & Hayhoe, M. (2001). In what ways do eye movements contribute to everyday activities? Vision Research, 41, 3559–3565.

  • Latimer, C. R. (1988). Eye-movement data: Cumulative fixation time and cluster analysis. Behavior Research Methods, Instruments, and Computers, 20(5), 437–470.

  • Loftus, G. P., & Mackworth, N. H. (1978). Cognitive determinants of fixation location during picture viewing. Journal of Experimental Psychology: Human Perception and Performance, 4(4), 565–572.

  • Lohse, G. L. (1997). Consumer eye movement patterns on yellow pages advertising. Journal of Advertising, 26(1), 61–73.

  • Mele, M. L., & Federici, S. (2012). A psychotechnical review on eye tracking systems: Toward user experience. Disability and Rehabilitation: Assistive Technology, 7(4), 261–281.

  • Moore, R. J., Churchill, E. F., & Kantamneni, R. G. P. (2011). Three sequential positions of query repair in interactions with internet search engines. In Proceedings of CSCW 2011 (pp. 415–424). New York: ACM

  • Najemnik, J., & Geisler, W. S. (2005). Optimal eye movement strategies in visual search. Nature, 434(7031), 387–391.

  • Navalpakkam, V., & Churchill, E. (2012). Mouse tracking: Measuring and predicting users’ experience of web-based content. In Proceedings of the 2012 ACM annual conference on human factors in computing systems (pp. 2963–2972). New York, NY: ACM.

  • Navalpakkam, V., & Itti, L. (2002). A goal oriented attention guidance model. In Biologically motivated computer vision (pp. 81–118). Berlin: Springer.

  • Navalpakkam, V., & Itti, L. (2005). Modeling the influence of task on attention. Vision Research, 45(2), 205–231.

  • Navalpakkam, V., & Itti, L. (2007). Search goal tunes visual features optimally. Neuron, 53(4), 605–617.

  • Nielsen, J., & Pernice, K. (2010). Eye tracking web usability. Berkeley, CA: New Riders.

  • Ou, J., Oh, L. M., Yang, J., & Fussell, S. R. (2005). Effects of task properties, partner actions, and message content on eye gaze patterns in a collaborative task. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 231–240). New York, NY: ACM.

  • Ou, J., Shi, Y., Wong, J., Fussell, S. R., & Yang, J. (2006). Combining audio and video to predict helpers’ focus of attention in multiparty remote collaboration on physical tasks. In Proceedings of the 8th international conference on multimodal interfaces (pp. 217–224). New York, NY: ACM.

  • Ou, J., Oh, L. M., Fussell, S. R., Blum, T., & Yang, J. (2008). Predicting visual focus of attention from intention in remote collaborative tasks. IEEE Transactions on Multimedia, 10(6), 1034–1045.

  • Oyekoya, O. (2007). Eye tracking: A perceptual interface for content based image retrieval. Unpublished Ph.D. thesis, Department of Electronic & Electrical Engineering, Adastral Park Campus, University College London.

  • Parkhurst, D., Law, K., & Niebur, E. (2002). Modeling the role of salience in the allocation of overt visual attention. Vision Research, 42(1), 107–124.

  • Poole, A., & Ball, L. (2006). Eye tracking in human-computer interaction and usability research: Current status and future prospects. In C. Ghaoui (Ed.), Encyclopedia of human computer interaction (pp. 211–219). London, UK: Idea Group Reference.

  • Pelphrey, K. A., Sasson, N. J., Reznick, J. S., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32, 249–261.

  • Rayner, K., & McConkie, G. W. (1976). What guides a reader’s eye movements? Vision Research, 16(8), 829–837.

  • Rayner, K., & Pollatsek, A. (1994). The psychology of reading. Hillsdale, NJ: Lawrence Erlbaum.

  • Reimer, M. D. (1955). Abnormalities of the gaze – A classification. Psychiatric Quarterly, 29, 659–672.

  • Renninger, L. W., Verghese, P., & Coughlan, J. (2007). Where to look next? Eye movements reduce local uncertainty. Journal of Vision, 7(3), 6. 1–17.

  • Rensink, R. A., O’Regan, J. K., & Clark, J. J. (1997). To see or not to see: The need for attention to perceive changes in scenes. Psychological Science, 8, 368–373.

  • Richardson, D. C., Dale, R., & Kirkham, N. Z. (2007). The art of conversation is coordination: Common ground and the coupling of eye movements during dialogue. Psychological Science, 18(5), 407–413.

  • Roberts, D., Wolff, R., Rae, J., Steed, A., Aspin, R., McIntyre, M., et al. (2009). Communicating eye-gaze across a distance: Comparing an eye-gaze enabled immersive collaborative virtual environment, aligned video conferencing, and being together. In Virtual reality conference, 2009. VR 2009. IEEE (pp. 135–142). Washington, DC: IEEE.

  • Robinson, G. H. (1979). Dynamics of the eye and head during movement between displays: A qualitative and quantitative guide for designers. Human Factors, 21(3), 343–352.

  • Rodden, K. & Fu, X. (2007). Exploring how mouse movements relate to eye movements on web search results pages. In SIGIR 2007 workshop on web information seeking and interaction (WISI), July 27, 2007, Amsterdam, The Netherlands, pp. 29–32.

  • Rodden, K., Fu, X., Aula, A., & Spiro, I. (2008). Eye-mouse coordination patterns on web search results pages. In CHI ’08 extended abstracts on human factors in computing systems (pp. 2997–3002). New York, NY: ACM.

  • Salvucci, D. D., & Goldberg, J. H. (2000). Identifying fixations and saccades in eye-tracking protocols. In Proceedings of the Eye Tracking Research and Applications Symposium (pp. 71–78). New York: ACM Press.

  • Steptoe, W., Oyekoya, O., Murgia, A., Wolff, R., Rae, J., Guimaraes, E., et al. (2009). Eye tracking for avatar eye gaze control during object-focused multiparty interaction in immersive collaborative virtual environments. In Virtual reality conference, 2009. VR 2009. IEEE (pp. 83–90). Washington, DC: IEEE.

  • Stritzke, M., Trommershäuser, J., & Gegenfurtner, K. R. (2009). Effects of salience and reward information during saccadic decisions under risk. Journal of the Optical Society of America A, 26(11), B1–B13.

  • Tanenhaus, M. K., Spivey-Knowlton, M. J., Eberhard, K. M., & Sedivy, J. C. (1995). Integration of visual and linguistic information in spoken language comprehension. Science, 268, 632–634.

  • Torralba, A., Oliva, A., Castelhano, M. S., & Henderson, J. M. (2006). Contextual guidance of eye movements and attention in real-world scenes: The role of global features in object search. Psychological Review, 113, 766–786.

  • Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies in face perception. Perception, 6, 313–326.

  • Yarbus, A. L. (1967). Eye movements and vision. New York, NY: Plenum Press.

  • Zhang, L., Tong, M. H., Marks, T. K., Shan, H., & Cottrell, G. W. (2008). SUN: A Bayesian framework for saliency using natural statistics. Journal of Vision, 8(7), 32.

Author information

Corresponding author

Correspondence to Elizabeth F. Churchill.

Copyright information

© 2014 Springer Science+Business Media New York

About this chapter

Cite this chapter

Navalpakkam, V., Churchill, E.F. (2014). Eye Tracking: A Brief Introduction. In: Olson, J., Kellogg, W. (eds.) Ways of Knowing in HCI. Springer, New York, NY. https://doi.org/10.1007/978-1-4939-0378-8_13

  • DOI: https://doi.org/10.1007/978-1-4939-0378-8_13

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4939-0377-1

  • Online ISBN: 978-1-4939-0378-8