Self-reported Methods for User Satisfaction Evaluation: A Bibliometric Analysis

  • Conference paper
  • Human-Computer Interaction (HCI-COLLAB 2019)

Abstract

This research analyzes self-reported methods for user satisfaction evaluation through science mapping. The focal points of the domain of user satisfaction evaluation must be established with respect to both the current reality (challenges, issues and gaps) and future scientific perspectives (patterns and trends). This motivates the authors to use tools such as SciMAT to analyze the bibliographic production on user satisfaction and to identify the thematic patterns related to user experience that are of particular interest in this study, namely self-reported methods such as SUS, SUMI and QUIS. Self-reported methods are the most frequently used evaluation tools because of their simplicity and low cost. They capture users' subjective reactions and can become one of the most important inputs for collecting and understanding users' behavior, preferences and perceptions. Mapping these methods shows how they evolved over the period 2001–2019 (based on a corpus of bibliographic references from 359 documents). From the analyzed information, several research opportunities concerning evaluation instruments were identified, such as the neglected connection between emotional aspects and software use and the variety of contexts in which evaluation takes place, as well as the promising future of the field of user satisfaction evaluation if methodologies and tools are generated or adapted for this purpose.
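The abstract names SUS among the self-reported instruments under study. As an illustrative sketch (not part of the paper's analysis), the standard SUS scoring rule takes ten 1–5 Likert responses, scores odd-numbered items as (response - 1) and even-numbered items as (5 - response), then multiplies the sum by 2.5 to yield a 0–100 score:

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response). The sum is scaled by 2.5 to give 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# A neutral respondent (all 3s) scores 50.
print(sus_score([3] * 10))  # 50.0
```

The 0–100 scale is one reason SUS is so widely adopted: it is cheap to administer yet yields a single comparable number per respondent.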


Notes

  1. https://www.webofknowledge.com.

  2. The corpora analyzed in science mapping studies generally comprise thousands of records; by that standard, the corpus used in this study is relatively small.

  3. Co-occurrence: when two analysis units appear together in a set of documents.
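The note above defines co-occurrence as two analysis units appearing together in a set of documents. A minimal sketch of counting keyword co-occurrences across a corpus (illustrative only; the keyword sets are made up and tools such as SciMAT implement far richer co-word analysis) might look like:

```python
from collections import Counter
from itertools import combinations

def cooccurrence(documents):
    """Count how often each keyword pair appears together in a document set.

    Each document is a collection of keywords; pairs are stored in sorted
    order so (a, b) and (b, a) are counted as the same pair.
    """
    counts = Counter()
    for keywords in documents:
        for a, b in combinations(sorted(set(keywords)), 2):
            counts[(a, b)] += 1
    return counts

# Hypothetical corpus of three documents' keyword sets.
docs = [
    {"usability", "satisfaction", "SUS"},
    {"satisfaction", "SUS"},
    {"usability", "QUIS"},
]
print(cooccurrence(docs)[("SUS", "satisfaction")])  # 2
```

Co-word analysis builds on exactly such pair counts, normalizing them into association strengths before clustering themes.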


Acknowledgements

This work was (partially) financed by the Dirección de Investigación, Universidad de La Frontera.

Author information

Correspondence to Andrés F. Aguirre-Aguirre, Ángela Villareal-Freire, Jaime Díaz, Carlos González-Amarillo, Rosa Gil or César A. Collazos.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Aguirre-Aguirre, A.F., Villareal-Freire, Á., Díaz, J., González-Amarillo, C., Gil, R., Collazos, C.A. (2019). Self-reported Methods for User Satisfaction Evaluation: A Bibliometric Analysis. In: Ruiz, P., Agredo-Delgado, V. (eds) Human-Computer Interaction. HCI-COLLAB 2019. Communications in Computer and Information Science, vol 1114. Springer, Cham. https://doi.org/10.1007/978-3-030-37386-3_23

  • DOI: https://doi.org/10.1007/978-3-030-37386-3_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-37385-6

  • Online ISBN: 978-3-030-37386-3
