Abstract
As the use of autonomous systems proliferates, their user base has shifted from pilots and engineers familiar with the underlying systems and algorithms to non-expert UAV users such as scientists. This shift highlights the need for more intuitive, easy-to-use interfaces that let users exploit the strengths of an autonomous system without prior knowledge of the complexities of operating it. Gesture-based natural language interfaces have emerged as a promising alternative input modality. While gestures alone can convey general descriptions of desired inputs (e.g., flight path shapes), it is difficult to specify more precise information (e.g., lengths, radii, heights) without sacrificing the intuitiveness of the interface. To mitigate this problem, multimodal interfaces that integrate both gesture and speech can be used. Such interfaces model typical human-human communication patterns, in which speech supplements gestures. However, integrating gestures into a multimodal HMI architecture raises challenges, including the gap between users' perceived and actual gesturing ability, system feedback, synchronization between input modalities, and the bounds on gesture execution requirements. We discuss these challenges and their possible causes, and provide suggestions for mitigating them in the design of future multimodal interfaces. Although this paper discusses these challenges in the context of unmanned aerial vehicle mission planning, similar issues and solutions extend to unmanned ground and underwater missions.
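The synchronization challenge can be made concrete with a minimal sketch of gesture-speech fusion. This is illustrative only and not the paper's architecture: the event types, the fixed fusion window, and the policy of attaching speech-specified parameters to the nearest gesture in time are all assumptions.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical event model -- the paper does not specify a data format.
@dataclass
class GestureEvent:
    shape: str        # e.g., "circle" or "line": a flight path segment shape
    timestamp: float  # seconds since the mission-planning session started

@dataclass
class SpeechEvent:
    parameter: str    # e.g., "radius", "length", "height"
    value: float      # numeric value, assumed here to be in meters
    timestamp: float

@dataclass
class PathSegment:
    shape: str
    parameters: Dict[str, float]

def fuse(gestures: List[GestureEvent],
         speech: List[SpeechEvent],
         window: float = 2.0) -> List[PathSegment]:
    """Attach each speech-specified parameter to gestures uttered within
    `window` seconds of it. Speech that matches no gesture is silently
    dropped here; a real system would instead prompt for clarification."""
    segments = []
    for g in gestures:
        params = {s.parameter: s.value
                  for s in speech
                  if abs(s.timestamp - g.timestamp) <= window}
        segments.append(PathSegment(g.shape, params))
    return segments

# Example: the user traces a circle while saying "radius five meters".
segments = fuse(
    [GestureEvent("circle", timestamp=10.2)],
    [SpeechEvent("radius", 5.0, timestamp=10.9)],
)
print(segments)  # [PathSegment(shape='circle', parameters={'radius': 5.0})]
```

Even this toy version exposes the design tension the abstract points to: too narrow a window misses naturally lagging speech, while too wide a window binds parameters to the wrong gesture.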
About this paper
Cite this paper
Chandarana, M., Meszaros, E.L., Trujillo, A., Allen, B.D.: Challenges of using gestures in multimodal HMI for unmanned mission planning. In: Chen, J. (ed.) Advances in Human Factors in Robots and Unmanned Systems. AHFE 2017. Advances in Intelligent Systems and Computing, vol. 595. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-60384-1_17
Print ISBN: 978-3-319-60383-4
Online ISBN: 978-3-319-60384-1