ETRA Conference Proceedings · Short paper
DOI: 10.1145/3450341.3458764

GazeHelp: Exploring Practical Gaze-assisted Interactions for Graphic Design Tools

Published: 25 May 2021

ABSTRACT

This system development project introduces GazeHelp, an Adobe Photoshop plugin that explores the practical application of multimodal gaze-assisted interaction in supporting current graphic design activities. It implements three core features: QuickTool, a gaze-triggered popup that lets the user select their next tool with gaze; X-Ray, which creates a small non-destructive window at the gaze point, cutting through an artboard’s layers to expose an element on a selected underlying layer; and Privacy Shield, which dims and blocks the current artboard from view when the user looks away from the display. These features respectively harness gaze’s speed, its gaze-contingent observational nature, and its ability to imply presence, and each is customisable to the user’s preferences. The accompanying GazeHelpServer, complete with an intuitive GUI, can also be used flexibly by other programs and plugins for further development.
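To make the described architecture concrete, the sketch below shows how a client might consume gaze samples from a server process and detect the kind of dwell that could trigger a QuickTool-style popup. This is a minimal illustration only: the paper does not document GazeHelpServer’s wire protocol, so the WebSocket endpoint (ws://localhost:8080/gaze), the GazeSample message shape, and the dwell thresholds are all hypothetical assumptions, not the plugin’s actual API.

```typescript
// Hypothetical sketch of a GazeHelpServer client. The endpoint, message
// schema, and thresholds below are assumptions for illustration; the real
// server's protocol is not documented in this abstract.
import WebSocket from "ws"; // npm install ws

interface GazeSample {
  x: number;         // assumed: gaze x in screen pixels
  y: number;         // assumed: gaze y in screen pixels
  timestamp: number; // assumed: milliseconds since epoch
}

const DWELL_RADIUS_PX = 40; // how far gaze may drift and still count as one dwell
const DWELL_TIME_MS = 500;  // how long gaze must stay put before triggering

let anchor: GazeSample | null = null;

function onSample(s: GazeSample): void {
  if (anchor === null) {
    anchor = s;
    return;
  }
  const drift = Math.hypot(s.x - anchor.x, s.y - anchor.y);
  if (drift > DWELL_RADIUS_PX) {
    // Gaze moved away: restart the dwell timer at the new point.
    anchor = s;
  } else if (s.timestamp - anchor.timestamp >= DWELL_TIME_MS) {
    // Dwell detected: a QuickTool-like feature would open its popup here.
    console.log(`dwell at (${anchor.x}, ${anchor.y})`);
    anchor = null;
  }
}

// Assumed endpoint; a real integration would use whatever address and
// message format GazeHelpServer actually exposes.
const socket = new WebSocket("ws://localhost:8080/gaze");
socket.on("message", (data) => onSample(JSON.parse(data.toString()) as GazeSample));
```

The dwell radius and time are illustrative values only; a real client would follow the server’s actual schema and expose these thresholds as the user-customisable preferences the abstract describes.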


Published in

ETRA '21 Adjunct: ACM Symposium on Eye Tracking Research and Applications
May 2021, 78 pages
ISBN: 9781450383578
DOI: 10.1145/3450341
Copyright © 2021 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 69 of 137 submissions, 50%

