ABSTRACT
The recent shift of emphasis to user experience (UX) has made it a central focus of product design and evaluation. A multitude of methods for UX design and evaluation exists, but a clear overview of the current state of available UX evaluation methods is missing. This is partly due to a lack of agreement on the essential characteristics of UX. In this paper, we present the results of our multi-year effort to collect UX evaluation methods from academia and industry through approaches such as a literature review, workshops, Special Interest Group sessions, and an online survey. We collected 96 methods and analyzed them based on, among other criteria, the product development phase and the studied period of experience. Our analysis reveals development needs for UX evaluation methods, including early-stage methods, methods for evaluating social and collaborative UX, establishing practicability and scientific quality, and a deeper understanding of UX.