Article
DOI: 10.1145/1180995.1181012

GSI demo: multiuser gesture/speech interaction over digital tables by wrapping single user applications

Published: 2 November 2006

ABSTRACT

Most commercial software applications are designed for a single user using a keyboard/mouse over an upright monitor. Our interest is exploiting these systems so they work over a digital table. Mirroring what people do when working over traditional tables, we want to allow multiple people to interact naturally with the tabletop application and with each other via rich speech and hand gestures. In previous papers, we illustrated multi-user gesture and speech interaction on a digital table for geospatial applications -- Google Earth, Warcraft III and The Sims. In this paper, we describe our underlying architecture: GSI Demo. First, GSI Demo creates a run-time wrapper around existing single user applications: it accepts and translates speech and gestures from multiple people into a single stream of keyboard and mouse inputs recognized by the application. Second, it lets people use multimodal demonstration -- instead of programming -- to quickly map their own speech and gestures to these keyboard/mouse inputs. For example, continuous gestures are trained by saying "Computer, when I do [one finger gesture], you do [mouse drag]". Similarly, discrete speech commands can be trained by saying "Computer, when I say [layer bars], you do [keyboard and mouse macro]". The end result is that end users can rapidly transform single user commercial applications into a multi-user, multimodal digital tabletop system.
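The two-part architecture described above can be sketched in code. The following is a minimal illustration (an assumption for exposition, not the authors' actual implementation; the class and method names are hypothetical): a run-time wrapper records low-level keyboard/mouse events while a user demonstrates a mapping, stores them under a speech or gesture command, and later replays them as the single input stream the wrapped application understands.

```python
class GSIWrapper:
    """Sketch of a demonstration-trained speech/gesture-to-input mapper."""

    def __init__(self):
        # Maps a (modality, command) pair, e.g. ("speech", "layer bars"),
        # to the list of keyboard/mouse events demonstrated for it.
        self.bindings = {}
        self.recording = None  # (modality, command, events) while demonstrating

    def begin_demonstration(self, modality, command):
        """'Computer, when I say/do <command> ...' starts a recording."""
        self.recording = (modality, command, [])

    def observe_input(self, event):
        """Buffer each keyboard/mouse event seen during the demonstration."""
        if self.recording is not None:
            self.recording[2].append(event)

    def end_demonstration(self):
        """'... you do <macro>' stores the recorded events as the binding."""
        modality, command, events = self.recording
        self.bindings[(modality, command)] = events
        self.recording = None

    def translate(self, modality, command):
        """At run time, turn any user's speech/gesture command into the
        keyboard/mouse events to inject into the single user application."""
        return self.bindings.get((modality, command), [])
```

For example, training the "layer bars" speech macro and later invoking it:

```python
wrapper = GSIWrapper()
wrapper.begin_demonstration("speech", "layer bars")
wrapper.observe_input(("key", "L"))        # demonstrated keystroke
wrapper.observe_input(("click", 120, 45))  # demonstrated mouse click
wrapper.end_demonstration()
wrapper.translate("speech", "layer bars")
```

Because every modality funnels into one event stream, inputs from multiple people are serialized into exactly what a single user application already expects.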

References

  1. Boyle, M. and Greenberg, S. Rapidly Prototyping Multimedia Groupware. Proc. Distributed Multimedia Systems (DMS'05), Knowledge Systems Institute, 2005.
  2. Cao, X. and Balakrishnan, R. Evaluation of an online adaptive gesture interface with command prediction. Proc. Graphics Interface, 2005, 187--194.
  3. Cohen, P. R., Coulston, R. and Krout, K. Multimodal interaction during multiparty dialogues: Initial results. Proc. IEEE Int'l Conf. Multimodal Interfaces, 2002, 448--452.
  4. Cohen, P. R., Johnston, M., McGee, D., Oviatt, S., Pittman, J., Smith, I., Chen, L. and Clow, J. QuickSet: Multimodal interaction for distributed applications. Proc. ACM Multimedia, 1997, 31--40.
  5. Cypher, A. Watch What I Do: Programming by Demonstration. MIT Press, 1993.
  6. Dietz, P. H. and Leigh, D. L. DiamondTouch: A multi-user touch technology. Proc. ACM UIST, 2001, 219--226.
  7. Greenberg, S. and Boyle, M. Customizable physical interfaces for interacting with conventional applications. Proc. ACM UIST, 2002, 31--40.
  8. Gutwin, C. and Greenberg, S. Design for individuals, design for groups: Tradeoffs between power and workspace awareness. Proc. ACM CSCW, 1998, 207--216.
  9. Lunsford, R., Oviatt, S. and Coulston, R. Audio-visual cues distinguishing self- from system-directed speech in younger and older adults. Proc. ICMI, 2005, 167--174.
  10. McGee, D. R. and Cohen, P. R. Creating tangible interfaces by augmenting physical objects with multimodal language. Proc. ACM Conf. Intelligent User Interfaces, 2001, 113--119.
  11. Oviatt, S. L. Ten myths of multimodal interaction. Comm. ACM 42(11), 1999, 74--81.
  12. Tse, E., Shen, C., Greenberg, S. and Forlines, C. Enabling interaction with single user applications through speech and gestures on a multi-user tabletop. Proc. AVI, 2006.
  13. Tse, E., Greenberg, S., Shen, C. and Forlines, C. Multimodal multiplayer tabletop gaming. Proc. Workshop on Pervasive Games, 2006.
  14. Wu, M., Shen, C., Ryall, K., Forlines, C. and Balakrishnan, R. Gesture registration, relaxation, and reuse for multi-point direct-touch surfaces. Proc. TableTop, 2006, 183--190.

Published in

ICMI '06: Proceedings of the 8th International Conference on Multimodal Interfaces
November 2006, 404 pages
ISBN: 159593541X
DOI: 10.1145/1180995

Copyright © 2006 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall acceptance rate: 453 of 1,080 submissions, 42%
