Applicability of human reliability assessment methods to human–computer interfaces

  • Original Article
  • Published in: Cognition, Technology & Work

Abstract

The UK Office for Nuclear Regulation (ONR) has undertaken a Generic Design Assessment of two nuclear power station designs proposed for construction in the UK. This assessment included a review of the Human Reliability Assessments (HRAs) submitted as part of the probabilistic safety assessments (PSAs). Both reactor designs have human–system interfaces driven by digital technology, yet the data and methods for assessing human error probability (HEP) pre-date such technology. The ONR therefore sought to establish whether existing HRA methods remain applicable to modern human–computer interface interactions and hence continue to provide credible insight into the risk contribution from human error. An extensive literature review was undertaken to identify or derive relevant HEPs, drawing on data associated with individual interface objects, plant start-ups and post-fault diagnoses. The data explored in this paper contain some apparent paradoxes. Based on the data reviewed, it is concluded that existing HRA methods are likely to be optimistic in their HEP estimates where diagnosis is involved or where process control depends on human–computer interaction. The work also identified shortfalls in both the availability of relevant published data and the scope of existing HRA methods.
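
For context, the HRA quantification methods at issue here (for example SPAR-H and HEART) generally estimate a task HEP by multiplying a nominal error probability by factors reflecting the conditions under which the task is performed. The sketch below illustrates that style of calculation only; it is a simplified, illustrative rendering rather than any of the assessed methods themselves. The nominal values are the commonly quoted SPAR-H figures, the performance shaping factor (PSF) multipliers in the example are hypothetical, and the bounding adjustment is applied here whenever the composite multiplier exceeds one, which simplifies the method's actual rule.

```python
# Illustrative sketch of a SPAR-H-style HEP calculation (not the method itself).
# Nominal HEPs are the commonly quoted SPAR-H figures; the PSF multipliers used
# in the example are hypothetical.
from math import prod

NOMINAL_HEP = {"diagnosis": 1e-2, "action": 1e-3}

def adjusted_hep(task_type: str, psf_multipliers: list[float]) -> float:
    """Scale the nominal HEP by the composite performance shaping factor (PSF)
    multiplier, bounding the result so it cannot exceed 1.0."""
    nhep = NOMINAL_HEP[task_type]
    composite = prod(psf_multipliers)
    if composite <= 1.0:
        return nhep * composite  # neutral or favourable conditions
    # Bounding adjustment (simplified trigger): keeps the estimate below 1.0
    # while preserving the ordering of estimates as the multipliers grow.
    return (nhep * composite) / (nhep * (composite - 1.0) + 1.0)

if __name__ == "__main__":
    # Hypothetical post-fault diagnosis under time pressure on a digital HSI.
    print(adjusted_hep("diagnosis", [10.0, 2.0]))  # ~0.17
    print(adjusted_hep("action", [1.0]))           # 0.001 (nominal value)
```

The concern raised in this paper is not with this arithmetic as such, but with whether nominal probabilities and multipliers derived largely from pre-digital control rooms remain valid for diagnosis and process control carried out through modern computer-driven interfaces.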

Author information

Correspondence to E. M. Hickling.

Cite this article

Hickling, E.M., Bowie, J.E. Applicability of human reliability assessment methods to human–computer interfaces. Cogn Tech Work 15, 19–27 (2013). https://doi.org/10.1007/s10111-012-0215-x
