Abstract
We present an empirical study evaluating how three levels of system transparency in an automated target classification aid affect fighter pilots’ performance and initial trust in the system. The transparency levels were: (1) text-based information about the specific object only (no automated support), (2) the text-based information accompanied by an automatically generated object class suggestion, and (3) the suggestion further augmented with the incorporated sensor values and their associated (uncertain) historic values in graphical form. The results show that the pilots needed more time to make a classification decision in display conditions 2 and 3 than in condition 1. However, the number of correct classifications and the operators’ trust ratings were highest in display condition 3. No difference in the pilots’ decision confidence was found, although slightly higher workload was reported for display condition 3. The questionnaire results indicate the pilots’ general opinion that, after longer training with the system, an automatic classification aid would help them make better and more confident decisions faster.
© 2014 Springer International Publishing Switzerland
Cite this paper
Helldin, T., Ohlander, U., Falkman, G., Riveiro, M. (2014). Transparency of Automated Combat Classification. In: Harris, D. (ed.) Engineering Psychology and Cognitive Ergonomics. EPCE 2014. Lecture Notes in Computer Science, vol. 8532. Springer, Cham. https://doi.org/10.1007/978-3-319-07515-0_3
Print ISBN: 978-3-319-07514-3
Online ISBN: 978-3-319-07515-0