
Multimodal interaction: a survey from model driven engineering and mobile perspectives

  • Survey
  • Journal on Multimodal User Interfaces

Abstract

Multimodal interaction has become richer in recent years thanks to the rapid evolution of mobile devices (smartphones and tablets) and their embedded sensors, including the accelerometer, gyroscope, global positioning system (GPS), near field communication (NFC) and proximity sensors. Using such sensors, either sequentially or simultaneously, to interact with applications makes the interaction more intuitive and improves user acceptance. Today, however, developing multimodal mobile systems that incorporate input and output modalities through sensors remains a long and difficult task. Although numerous model-based approaches have emerged that are supposed to simplify the engineering of multimodal mobile applications, these applications are still generally designed and implemented in an ad hoc way. To explain this situation, the present paper reviews, discusses and analyses the different model-based approaches proposed for developing multimodal mobile applications. The analysis considers not only the modelling and generation of mobile multimodality features, but also the inclusion of model-driven engineering features, such as guidance and model reuse, that allow models to be used appropriately and their benefits to be realised. Our aim is to identify the current gaps that hinder the easy and rapid development of multimodal mobile applications using model-based approaches.
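The distinction the abstract draws between sequential and simultaneous use of modalities can be illustrated with a minimal time-window fusion sketch. This is an illustrative example only, not code from any of the surveyed approaches; the event names (`touch`, `accelerometer`, `gps`) and the `fuse` function are hypothetical, and the 0.5 s window is an arbitrary assumption.

```python
from dataclasses import dataclass

@dataclass
class ModalityEvent:
    modality: str   # input channel, e.g. "accelerometer" or "touch"
    command: str    # interpreted user intent
    t: float        # timestamp in seconds

def fuse(events, window=0.5):
    """Group events whose timestamps fall within `window` seconds of the
    previous event: overlapping events are treated as one simultaneous
    multimodal command; isolated events remain sequential/unimodal."""
    events = sorted(events, key=lambda e: e.t)
    groups, current = [], []
    for e in events:
        if current and e.t - current[-1].t > window:
            groups.append(current)
            current = []
        current.append(e)
    if current:
        groups.append(current)
    return [
        {"modalities": [e.modality for e in g],
         "commands": [e.command for e in g],
         "simultaneous": len(g) > 1}
        for g in groups
    ]

# Hypothetical input stream for a territory-discovery scenario.
stream = [
    ModalityEvent("touch", "select_poi", 0.00),
    ModalityEvent("accelerometer", "shake_zoom", 0.30),  # fused with the touch
    ModalityEvent("gps", "update_position", 5.00),       # isolated, sequential
]
for cmd in fuse(stream):
    print(cmd)
```

Real fusion engines (e.g. those discussed in the surveyed toolkits) are far more elaborate, handling CARE-style relations between modalities rather than a single time window; the sketch only shows why the sequential/simultaneous distinction matters at the event level.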

Notes

  1. http://ansonalex.com/infographics/smartphone-usage-statistics-2012-infographic/.

  2. http://ansonalex.com/infographics/smartphone-usage-statistics-2012-infographic/.

  3. Translated from French “Modèles et Outils pour Applications NOmades de découverte de territoire”. MOANO is funded by the French National Research Agency (ANR).

References

  1. Kvale K, Warakagoda ND (2010) Multimodal interfaces to mobile terminals—a design-for-all approach. In: User interfaces

  2. de Souza M, Carvalho DDB, Barth P, Ramos JV, Comunello E, von Wangenheim A (2010) Using acceleration data from smartphones to interact with 3D medical data. In: Proceedings of the SIBGRAPI conference on graphics, patterns and images, pp 339–345

  3. Aebi F (2012) Multimodal fusion on mobiles. In: Seminar on multimodal interaction on mobile devices

  4. van den Brand MGJ, Groote JF (2012) Advances in model driven software engineering. ERCIM News 91:23–24

  5. Obrenovic Z, Starcevic D (2004) Modelling multimodal human computer interaction. IEEE Comput 37:65–72

  6. Bellik Y, Teil D (1992) Définitions Terminologiques pour la Communication Multimodale. In: Proceedings of interface Homme-machine (IHM)

  7. Nigay L, Coutaz J (1997) Multifeature systems: the CARE properties and their impact on software design. In: Proceedings of first workshop on intelligence and multimodality in multimedia interfaces: research and applications, association for the advancement of artificial intelligence (AAAI)

  8. Bordegoni M, Faconti G, Feiner S, Maybury MT, Rist T, Ruggieri S, Trahanias P, Wilson M (1997) A standard reference model for intelligent multimedia presentation systems. In: Computer standards and interfaces, pp 477–496

  9. Nigay L, Coutaz J (1996) Espaces conceptuels pour l'interaction multimédia et multimodale. In: TSI, spécial Multimédia et Collecticiel, AFCET & Hermès, pp 1195–1225

  10. Nigay L (1994) Conception et modélisation logicielles des systèmes interactifs : application aux interfaces multimodales. PhD thesis, Joseph Fourier University, Grenoble, France

  11. Coutaz J, Nigay L, Salber D (1993) The MSM framework: a design space for multi-sensory-motor systems. In: Proceedings of east-west human–computer interaction conference (EWHCI), pp 231–241

  12. Landragin F (2007) Physical, semantic and pragmatic levels for multimodal fusion and fission. In: Proceedings of seventh international workshop on computational semantics (IWCS-7), pp 346–350

  13. Coutaz J, Nigay L, Salber D, Blandford A, May J, Young RM (1995) Four easy pieces for assessing the usability of multimodal interaction: the CARE properties. In: Proceedings of INTERACT, pp 115–120

  14. Bellik Y (1995) Interfaces multimodales : concepts, modèles et architectures. PhD thesis, University of Paris XI, France

  15. Bouchet J, Nigay L (2004) ICARE: a component-based approach for the design and development of multimodal interfaces. In: Proceedings of conference on humans factors in computing systems (CHI), extended abstracts, pp 1325–1328

  16. Martin JC (1999) TYCOON: six primitive types of cooperation for observing, evaluating and specifying cooperations. In: Proceedings of association for the advancement of artificial intelligence (AAAI) fall symposium on psychological models of communication in collaborative systems

  17. Martin JC (1997) Towards intelligent cooperation between modalities. The example of a system enabling multimodal interaction with a map. In: Proceedings of international joint conference on artificial intelligence (IJCAI), workshop on intelligent multimodal systems

  18. MOANO: Models and Tools for Pervasive Applications focusing on Territory Discovery. ANR Project 2011–2014

  19. Kent S (2002) Model driven engineering. In: Proceedings of integrated formal methods (IFM), pp 286–298

  20. Frank U (2011) Some guidelines for the conception of domain-specific modelling languages. In: Proceedings of enterprise modelling and information systems architectures (EMISA), pp 93–106

  21. Lange CFJ, Chaudron MRV (2005) Managing model quality in UML-based software development. In: Proceedings of the 13th international workshop on software technology and engineering practice, pp 7–16

  22. France R, Rumpe B (2007) Model-driven development of complex software: a research roadmap. In: Proceedings of the international conference on software engineering (ICSE), future of software engineering, pp 37–54

  23. Campos JC, Harrison MD (2001) Model checking interactor specifications. Autom Softw Eng 8:275–310

  24. Lenat DB, Guha RV, Pittman K, Pratt D, Shepherd M (1990) CYC: towards programs with common sense. In: Communications of the ACM, pp 30–49

  25. Ghezzi C, Jazayeri M, Mandrioli D (2003) Fundamentals of software engineering. Prentice Hall, Englewood Cliffs, pp I–XX. ISBN 978-0-13-305699-0

  26. Romero JR, Rivera JE, Duran F, Vallecillo A (2007) Formal and tool support for model driven engineering with Maude. J Object Technol 6:187–207

  27. Lange CFJ, Wijns MAM, Chaudron MRV (2007) Supporting task-oriented modeling using interactive UML views. J Vis Lang Comput 18:399–419

  28. Dumas B (2010) Frameworks, description languages and fusion engines for multimodal interactive systems. PhD thesis, University of Fribourg, Switzerland

  29. Mechkour S (2011) SMUIML editor: graphical tool for modeling multimodal interaction. Master thesis, University of Fribourg, Switzerland

  30. Dumas B, Signer B, Lalanne D (2011) A graphical UIDL editor for multimodal interaction design based on SMUIML. In: Proceedings of the workshop on software support for user interface description language

  31. Dumas B, Lalanne D, Ingold R (2009) HephaisTK: a toolkit for rapid prototyping of multimodal interfaces. In: Proceedings of international conference on multimodal interfaces and workshop on machine learning for multimodal interaction (ICMI-MLMI), pp 231–232

  32. Cutugno F, Leano VA, Rinaldi R, Mignini G (2012) Multimodal framework for mobile interaction. In: Proceedings of advanced visual interfaces (AVI), pp 197–203

  33. Bourguet ML (2002) A toolkit for creating and testing multimodal interface designs. In: Proceedings of user interface software and technology (UIST), pp 29–30

  34. Le Bodic L, De Loor P, Kahn J (2005) Umar: a modeling of multimodal artifact. In: Proceedings of human–computer interaction (HCI)

  35. Chao C, Thomaz AL (2012) Timing in multimodal turn–taking interactions: control and analysis using timed Petri nets. J Hum Robot Interact 1:46–67

  36. Palanque P, Schyn A (2003) A model-based approach for engineering multimodal interactive systems. In: Proceedings of INTERACT, pp 543–550

  37. Paternò F, Mancini C, Meniconi S (1997) ConcurTaskTrees: a diagrammatic notation for specifying task models. In: Proceedings of INTERACT, pp 362–369

  38. Clerckx T, Vandervelpen C, Coninx K (2007) Task-based design and runtime support for multimodal user interface distribution. In: Proceedings of engineering interactive systems

  39. Clerckx T, Winters F, Coninx K (2005) Tool support for designing context-sensitive user interfaces using a model-based approach. In: Task models and diagrams for user interface design (TAMODIA), pp 11–18

  40. Limbourg Q, Vanderdonckt J, Michotte B, Bouillon L, Florins M (2004) UsiXML: a user interface description language supporting multiple levels of independence. In: Proceedings of workshop on device independent web engineering, pp 325–338

  41. The UIMS Tool Developers Workshop (1992) A metamodel for the runtime architecture of an interactive system. ACM SIGCHI Bulletin, pp 32–37

  42. Green M (1985) Report on dialog specification tools. In: User interface management systems, pp 9–20

  43. Coutaz J (1987) PAC, an object-oriented model for dialog design. In: Proceedings of INTERACT

  44. Nigay L (1993) PAC-Amodeus, a software architecture model for multimodal systems. In: Proceedings of conference on human factors in computing systems, INTERCHI (INTERACT+CHI)

  45. Rousseau C, Bellik Y, Vernier F, Bazalgette D (2006) A framework for the intelligent multimodal presentation of information. In: Proceedings of the European Association for Signal Processing (EURASIP), pp 12–18

  46. Serrano M, Nigay L, Demumieux R, Descos J, Losquin P (2006) Multimodal interaction on mobile phones: development and evaluation using ACICARE. In: Proceedings of mobile human computer interaction (MobileHCI), pp 129–136

  47. Serrano M, Nigay L, Lawson JYL, Ramsay A, Murray-Smith R, Denef S (2008) The openinterface framework: a tool for multimodal interaction. In: CHI extended abstracts on human factors in computing systems, pp 3501–3506

  48. Avouac PA, Lalanda P, Nigay L (2011) Service-oriented autonomic multimodal interaction in a pervasive environment. In: Proceedings of international conference on multimodal interaction (ICMI), pp 369–376

  49. Jourde F, Laurillau Y, Nigay L (2010) COMM notation for specifying collaborative and multimodal interactive systems. In: Proceedings of the second ACM SIGCHI symposium on engineering interactive computing systems, pp 125–134

  50. Mansoux B, Nigay L, Troccaz J (2006) Output multimodal interaction: the case of augmented surgery. In: Proceedings of human–computer interaction (HCI), pp 117–192

  51. Mili H, Mili F, Mili A (1995) Reusing software: issues and research directions. IEEE Trans Softw Eng 21:528–562

Acknowledgments

The authors are grateful to the French National Research Agency (ANR) MOANO project for supporting this research.

Author information

Corresponding author

Correspondence to José Rouillard.

About this article

Cite this article

Elouali, N., Rouillard, J., Le Pallec, X. et al. Multimodal interaction: a survey from model driven engineering and mobile perspectives. J Multimodal User Interfaces 7, 351–370 (2013). https://doi.org/10.1007/s12193-013-0126-z
