ABSTRACT
Many sources of empirical data can be used to evaluate an interface (e.g., time to learn, time to perform benchmark tasks, number of errors on benchmark tasks, answers on questionnaires, comments made in verbal protocols). This paper examines the relative contributions of both quantitative and qualitative data gathered during a usability study. For each usability problem uncovered by this study, we trace each contributing piece of evidence back to its empirical source. For this usability study, the verbal protocol provided the sole source of evidence for more than one third of the most severe problems and more than two thirds of the less severe problems. Thus, although the verbal protocol provided the bulk of the evidence, other sources of data contributed disproportionately to the more critical problems. This work suggests that further research is required to determine the relative value of different forms of empirical evidence.