How to make the most of your human: design considerations for human–machine interactions

  • Original Article
  • Cognition, Technology & Work

Abstract

Reconsidering the function allocation between automation and the pilot in the flight deck is the next step in improving aviation safety. The current allocation, based on who does what best, makes poor use of the pilot's resources and abilities. In some cases, it may actually handicap pilots in performing their role. Improving pilot performance begins with defining the role of the pilot: why a human is needed in the first place. The next step is to allocate functions based on the needs of that role (rather than on fitness), and then to use automation to target specific human weaknesses in performing that role. Examples are provided (some of which could be implemented in conventional cockpits now). Along the way, the paper challenges the definition of human error and the idea that eliminating or automating the pilot will reduce instances of human error.
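
To make the allocation contrast concrete, here is a minimal, purely illustrative Python sketch that is not taken from the paper: the task names, "weakness" labels, and fitness scores are hypothetical. It compares a fitness-based ("who does what best") assignment with a role-based one that keeps role-supporting functions with the pilot and aims automation at a specific human weakness rather than removing the task.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Function:
    name: str
    human_fitness: float       # 0..1: how well the human performs this task in isolation (hypothetical)
    machine_fitness: float     # 0..1: how well automation performs this task in isolation (hypothetical)
    supports_pilot_role: bool  # does performing it keep the pilot engaged and informed?
    human_weakness: Optional[str] = None  # e.g. "sustained vigilance" (hypothetical label)

FUNCTIONS = [
    Function("maintain flight path", 0.7, 0.9, True, "precise repetition"),
    Function("monitor systems for anomalies", 0.5, 0.9, True, "sustained vigilance"),
    Function("decide response to a novel failure", 0.8, 0.3, True),
    Function("compute fuel-optimal descent profile", 0.4, 0.95, False),
]

def fitness_based(fn):
    # Classic "who does what best" allocation -- the approach the abstract critiques.
    return "machine" if fn.machine_fitness > fn.human_fitness else "human"

def role_based(fn):
    # Allocation by role needs: keep role-supporting functions with the pilot and
    # point automation at the specific human weakness instead of removing the task.
    if not fn.supports_pilot_role:
        return "machine"
    if fn.human_weakness:
        return "human, with automation targeting " + fn.human_weakness
    return "human"

if __name__ == "__main__":
    for fn in FUNCTIONS:
        print(f"{fn.name:40s} fitness-based: {fitness_based(fn):8s} role-based: {role_based(fn)}")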



Author information

Corresponding author

Correspondence to Paul C. Schutte.


About this article

Cite this article

Schutte, P.C. How to make the most of your human: design considerations for human–machine interactions. Cogn Tech Work 19, 233–249 (2017). https://doi.org/10.1007/s10111-017-0418-2
