ABSTRACT
The U.S. Army is exploring the use of advanced technologies such as tactile and spatial (3-D) audio displays to enhance Soldier performance in human-robot interaction (HRI) tasks. A field study was conducted at the U.S. Army Research Laboratory (ARL) in 2006 to determine the extent to which the integration of spatial auditory and tactile displays affects Soldier situation awareness in a simulated unmanned vehicle (UV) HRI target search task performed in a moving HMMWV. Participants were 12 civilian males ranging in age from 18 to 46 years, with a mean age of 32 years. Participants performed a target search task in which they searched for one target symbol among 50 non-target symbols displayed on an 18-inch diagonal computer monitor (a 30° field of view (FOV) visual display). Participants received audio and tactile cues indicating in which third of the screen the target symbol was located. The independent variables were display modality, signal azimuth, participant age, and HMMWV movement condition. The display conditions consisted of a visual display supplemented with cues in one of three modalities: spatial audio, tactile, or combined spatial audio + tactile. The dependent variables were participant response time and accuracy, as well as participants' subjective workload ratings of display modality effectiveness. Accuracy data indicated that participants located over 99% of the targets correctly. Display modality was significant in terms of participant workload ratings, but not in terms of response time. Response time data indicated that no single display modality provided the shortest response time across all age groups and terrains. Workload with the audio + tactile display was rated lowest of the three display modalities, which may have been because the combined display incorporated cues from both the audio and tactile modalities, an advantage in an environment with strong auditory and tactile distractors.
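The cueing scheme described above can be sketched as a simple mapping from the target's horizontal screen position to a directional cue. This is a minimal illustration only: the function name, screen resolution, and azimuth angles are assumptions for the sketch, not parameters reported for the study apparatus.

```python
def cue_for_target(x_px: int, screen_width_px: int = 1280) -> dict:
    """Map a target's horizontal position to a directional cue.

    The study cued the third of the screen (left, center, right)
    containing the target; the azimuth angles here are illustrative,
    not the values used in the experiment.
    """
    third = screen_width_px / 3
    if x_px < third:
        region, azimuth_deg = "left", -30
    elif x_px < 2 * third:
        region, azimuth_deg = "center", 0
    else:
        region, azimuth_deg = "right", 30
    # A combined audio + tactile display would drive both a
    # spatialized audio source and a tactor at the same azimuth.
    return {"region": region, "azimuth_deg": azimuth_deg}
```

For example, a target at x = 200 px on a 1280-px-wide display falls in the left third and would be cued at the leftward azimuth.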
The discrepancy between the workload and performance data indicates that a greater understanding is needed of the role each modality plays in on-the-move operations. Future research will address multimodal directional cues that can inform Soldiers of important HRI events occurring 360° around them, including outside their field of view.
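One common way to present the kind of 360° directional cue mentioned above is a belt of equally spaced tactors around the torso. The sketch below maps an event azimuth to the nearest tactor; the eight-tactor layout and indexing convention are illustrative assumptions, not a design from this study.

```python
def tactor_for_azimuth(azimuth_deg: float, n_tactors: int = 8) -> int:
    """Return the index of the tactor nearest an event azimuth.

    Tactor 0 sits at 0 degrees (straight ahead); indices increase
    clockwise, spaced 360 / n_tactors degrees apart. Layout is an
    illustrative assumption.
    """
    spacing = 360.0 / n_tactors
    return round((azimuth_deg % 360.0) / spacing) % n_tactors
```

With eight tactors, an event at 90° (directly right) would activate tactor 2, and an event at 350° would wrap around to tactor 0.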
Multimodal displays to enhance human robot interaction on-the-move