Eye Movements and Human-Computer Interaction

Chapter in Eye Movement Research

Abstract

Gaze provides an attractive input channel for human-computer interaction because of its capability to convey the focus of interest. Gaze input allows people with severe disabilities to communicate with their eyes alone. Advances in eye tracking technology and its reduced cost make gaze an increasingly interesting addition to the conventional modalities in everyday applications. For example, gaze-aware games can enhance the gaming experience by providing timely effects at the right location, knowing exactly where the player is focusing at each moment. However, using the eyes both for viewing and for control poses some challenges. In this chapter, we will give an introduction to using gaze as an input method. We will show how to use gaze as an explicit control method and how to exploit it subtly in the background as an additional information channel. We will summarize research on the application of different types of eye movements in interaction and present research-based design guidelines for coping with typical challenges. We will also discuss the role of gaze in multimodal, pervasive and mobile interfaces and conclude with ideas for future developments.
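To make the idea of explicit gaze control concrete, the sketch below shows a minimal, hypothetical dwell-time selection loop of the kind the chapter discusses: a command fires only after gaze has rested on a target for a set dwell threshold, the classic way to keep ordinary looking from triggering actions (the "Midas touch" problem, see Note 1). The example is not taken from the chapter or from any particular eye tracking API; the Target class, the dwell_select function, and the 600 ms threshold are illustrative assumptions, and a real system would feed it samples from an eye tracker rather than a simulated list.

```python
from dataclasses import dataclass

# Hypothetical on-screen targets: a name plus a bounding box in pixels.
@dataclass
class Target:
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h


def dwell_select(samples, targets, dwell_ms=600):
    """Yield (timestamp, target_name) whenever gaze has rested on one
    target for at least `dwell_ms` milliseconds.

    `samples` is an iterable of (timestamp_ms, x, y) gaze points.
    Looking away resets the timer, which is what keeps every passing
    glance from acting as a click (the "Midas touch" problem).
    """
    current = None          # target currently under gaze
    dwell_start = None      # when gaze first landed on it
    fired = False           # avoid repeated selections during one dwell

    for t, gx, gy in samples:
        hit = next((tg for tg in targets if tg.contains(gx, gy)), None)
        if hit is not current:            # gaze moved to a new target (or away)
            current, dwell_start, fired = hit, t, False
        elif hit is not None and not fired and t - dwell_start >= dwell_ms:
            fired = True
            yield t, hit.name


if __name__ == "__main__":
    targets = [Target("YES", 100, 100, 200, 100), Target("NO", 400, 100, 200, 100)]
    # Simulated 50 Hz gaze stream: a brief glance at NO, then a long dwell on YES.
    stream = [(i * 20, 450, 150) for i in range(10)] + \
             [(200 + i * 20, 150, 140) for i in range(40)]
    for ts, name in dwell_select(stream, targets):
        print(f"{ts} ms: selected {name}")
```

In this sketch the dwell threshold is the key design parameter: shorter dwells speed up interaction but increase accidental selections, longer dwells do the opposite.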

Notes

  1. The phrase "Midas touch" originates from Greek mythology, where King Midas was said to be able to turn everything he touched into gold.

  2. Available from http://www.ogama.net/ (July 2017).

  3. Available from https://www.sis.uta.fi/~csolsp/downloads.php.

  4. Available from http://www.pygaze.org/ (July 2017).

  5. Available from http://ux.fiit.stuba.sk/GazeHook (July 2017).

  6. Available from https://github.com/lexasss/etudriver-web (July 2017).

Author information

Correspondence to Päivi Majaranta.

Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Majaranta, P., Räihä, K.-J., Hyrskykari, A., & Špakov, O. (2019). Eye movements and human-computer interaction. In C. Klein & U. Ettinger (Eds.), Eye movement research. Studies in Neuroscience, Psychology and Behavioral Economics. Cham: Springer. https://doi.org/10.1007/978-3-030-20085-5_23
