ABSTRACT
This system development project introduces GazeHelp, an Adobe Photoshop plugin that explores the practical application of multimodal gaze-assisted interaction in support of everyday graphic design work. It implements three core features: QuickTool, a gaze-triggered popup that lets the user select their next tool with gaze; X-Ray, a small non-destructive window at the gaze point that cuts through an artboard's layers to expose an element on a selected underlying layer; and Privacy Shield, which dims and blocks the current artboard from view when the user looks away from the display. These features harness, respectively, the speed, gaze-contingent observational nature, and presence-implying strengths of gaze, and each is customisable to the user's preferences. The accompanying GazeHelpServer, complete with an intuitive GUI, can also be used flexibly by other programs and plugins for further development.
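The Privacy Shield behaviour can be approximated as simple hysteresis over gaze presence: the shield engages only after gaze has been off-display longer than a grace period, and lifts as soon as gaze returns. The sketch below is illustrative only; `GazeSample`, `PrivacyShield`, and the grace-period parameter are hypothetical names rather than the plugin's actual API, and a real implementation would draw presence data from an eye-tracker SDK such as the Tobii Interaction Library.

```python
from dataclasses import dataclass


@dataclass
class GazeSample:
    """One gaze reading (hypothetical shape, for illustration only)."""
    timestamp: float   # seconds since session start
    on_display: bool   # whether the gaze point falls within the display bounds


class PrivacyShield:
    """Dim the artboard once gaze has been off-display past a grace period."""

    def __init__(self, grace_period: float = 0.5):
        self.grace_period = grace_period   # seconds of look-away tolerated
        self.last_on_display = None        # timestamp of last on-display sample
        self.shield_active = False

    def update(self, sample: GazeSample) -> bool:
        """Feed one sample; return whether the shield should be shown."""
        if sample.on_display:
            # Gaze is back on the display: record the time and lift the shield.
            self.last_on_display = sample.timestamp
            self.shield_active = False
        elif (self.last_on_display is not None
              and sample.timestamp - self.last_on_display > self.grace_period):
            # Gaze has been away longer than the grace period: engage the shield.
            self.shield_active = True
        return self.shield_active
```

The grace period prevents the shield from flickering during brief saccades off the screen edge, while still reacting quickly when the user genuinely turns away.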