ABSTRACT
Despite the wide availability of body-sensing technologies, designing control gestures that feel natural and that users can intuitively "guess" remains an embodied interaction challenge. This is especially true for systems that require a set of complementary control gestures. Part of the problem lies in the separation between the locus of the interaction (the body) and the focus of the interaction (the screen). We extend Johnson's theory of Embodied Schemata with Embodied Allegories to create a unifying context that spans the locus and focus of interaction. We present results demonstrating that this approach increases the chance that users select the same gesture or movement to produce a given effect within the virtual context, and that users deem the resultant gesture set more intuitive. We also present the accompanying methodology, "Framed Guessability," which can increase users' agreement when conducting guessability studies.
- Antle, A. N., Corness, G., Bakker, S., Droumeva, M., Van Den Hoven, E., and Bevans, A. Designing to support reasoned imagination through embodied metaphor. Proceedings of the 7th ACM Conference on Creativity and Cognition (C&C '09), ACM (2009), 275.
- Antle, A. N., Droumeva, M., and Corness, G. Playing with The Sound Maker: Do Embodied Metaphors Help Children Learn? Proceedings of the 7th International Conference on Interaction Design and Children (IDC '08), ACM (2008), 178--185.
- Antle, A. N. Springboard: Exploring Embodiment, Balance and Social Justice. CHI '09 Extended Abstracts on Human Factors in Computing Systems, ACM (2009), 3961--3966.
- Bakker, S., Antle, A. N., and Van Den Hoven, E. Embodied metaphors in tangible interaction design. Personal Ubiquitous Comput. 16, 4 (2012), 433--449.
- Cafaro, F., Panella, A., Lyons, L., Roberts, J., and Radinsky, J. I see you there!: Developing identity-preserving embodied interaction for museum exhibits. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13), ACM (2013), 1911--1920.
- Cafaro, F. Using embodied allegories to design gesture suites for human-data interaction. Proceedings of the 2012 ACM Conference on Ubiquitous Computing (UbiComp '12), ACM (2012), 560--563.
- Dourish, P. Where the Action Is: The Foundations of Embodied Interaction. MIT Press, Cambridge, MA, USA, 2001.
- England, D. Whole Body Interaction: An Introduction. In D. England, ed., Whole Body Interaction. Springer London, 2011, 1--5.
- Gentner, D. Structure-mapping: A theoretical framework for analogy. Cognitive Science 7, 2 (1983), 155--170.
- Hurtienne, J. and Israel, J. H. Image schemas and their metaphorical extensions. Proceedings of the 1st International Conference on Tangible and Embedded Interaction (TEI '07), ACM (2007), 127.
- Johnson, M. The Body in the Mind. The University of Chicago Press, 1987.
- Krueger, M. W. Responsive environments. Proceedings of the June 13--16, 1977 National Computer Conference, ACM (1977), 423--433.
- Krueger, M. W. Environmental technology: Making the real world virtual. Commun. ACM 36, 7 (1993), 36--37.
- Lao, S., Heng, X., Zhang, G., Ling, Y., and Wang, P. A gestural interaction design model for multi-touch displays. Proceedings of the 23rd British HCI Group Annual Conference (HCI 2009), British Computer Society (2009), 440--446.
- McNeill, D. Hand and Mind: What Gestures Reveal about Thought. University of Chicago Press, 1992.
- Piumsomboon, T., Clark, A., Billinghurst, M., and Cockburn, A. User-defined gestures for augmented reality. CHI '13 Extended Abstracts on Human Factors in Computing Systems, ACM (2013), 955--960.
- Ruiz, J., Li, Y., and Lank, E. User-defined motion gestures for mobile interaction. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '11), ACM (2011), 197--206.
- Rutkowski, C. An introduction to the Human Applications Standard Computer Interface, Part 1: Theory and principles. Byte 7, 11 (1982), 291--310.
- Shneiderman, B. Direct manipulation: A step beyond programming languages. Computer 16, 8 (1983).
- Williams, A., Kabisch, E., and Dourish, P. From interaction to participation: Configuring space through embodied interaction. Proceedings of UbiComp 2005, LNCS 3660, Springer (2005), 287--304.
- Wimmer, R. Grasp sensing for human-computer interaction. Proceedings of the 5th International Conference on Tangible, Embedded, and Embodied Interaction (TEI '11), ACM (2011), 221--228.
- Wobbrock, J. O., Aung, H. H., Rothrock, B., and Myers, B. A. Maximizing the guessability of symbolic input. CHI '05 Extended Abstracts on Human Factors in Computing Systems, ACM (2005), 1869--1872.
- Wobbrock, J. O., Morris, M. R., and Wilson, A. D. User-defined gestures for surface computing. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '09), ACM (2009), 1083--1092.
Index Terms
- Framed guessability: using embodied allegories to increase user agreement on gesture sets