DOI: 10.1145/347642.347766

On the contributions of different empirical data in usability testing

Published: 01 August 2000

ABSTRACT

Many sources of empirical data can be used to evaluate an interface (e.g., time to learn, time to perform benchmark tasks, number of errors on benchmark tasks, answers on questionnaires, comments made in verbal protocols). This paper examines the relative contributions of both quantitative and qualitative data gathered during a usability study. For each usability problem uncovered by this study, we trace each contributing piece of evidence back to its empirical source. For this usability study, the verbal protocol provided the sole source of evidence for more than one third of the most severe problems and more than two thirds of the less severe problems. Thus, although the verbal protocol provided the bulk of the evidence, other sources of data contributed disproportionately to the more critical problems. This work suggests that further research is required to determine the relative value of different forms of empirical evidence.


Published in
      DIS '00: Proceedings of the 3rd conference on Designing interactive systems: processes, practices, methods, and techniques
      August 2000
      456 pages
      ISBN: 1581132190
      DOI: 10.1145/347642

      Copyright © 2000 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Acceptance Rates

      DIS '00 paper acceptance rate: 48 of 127 submissions, 38%. Overall acceptance rate: 1,158 of 4,684 submissions, 25%.
