A survey of technologies on the rise for emotion-enhanced interaction

https://doi.org/10.1016/j.jvlc.2015.10.001

Highlights

  • We highlight techniques for emotion measurement in the context of affective interaction.

  • We present and categorize affective interactional systems and applications.

  • We discuss the main research challenges of emotion-measuring technologies.

Abstract

Emotions are a major part of human existence and social interaction. Some might say that emotions are one of the aspects that make us truly human. However, while we express emotions in various life settings, the world of computing still struggles to support and incorporate the emotional dimension. In recent decades, the concept of affect has experienced a new upswing in research, moving beyond topics like market research and product development, and further exploring the area of emotion-enhanced interaction. In this paper, we highlight techniques that have been employed more intensely for emotion measurement in the context of affective interaction. Besides capturing the functional principles behind these approaches and the inherent volatility of human emotions, we present relevant applications and establish a categorization of the roles of emotion detection in interaction. Based on these findings, we also capture the main challenges that emotion-measuring technologies will have to overcome in order to enable truly seamless emotion-driven interaction.

Introduction

All of us experience emotions throughout our entire lives and in the most varied circumstances. They influence our decisions and experiences, but most of all they affect our social interactions and interpersonal relations. And while emotions manage to spill into almost every aspect of our days, there are still some areas that remain almost entirely devoid of human emotion.

One such context is given by our interactions with technology. Nowadays, we may work with specialized tools, communicate online, organize our lives with apps, have fun with games, and experience an entire range of emotions while doing so, anything from joy and excitement to frustration. Yet, while we do bring our emotional selves into the world of computing, this field still offers very little in the way of supporting our emotion-escorted interaction. We communicate our subjective states online through emoticons, emojis, and “like” buttons; we smile in the face of an aesthetically pleasing interface without any reaction from the system; we work together with our colleagues without knowing how they feel about our work. As stated by Picard [1] in her work in the field of Affective Computing: “Basic affect recognition and expression are expected by humans in communication. However, computers today cannot even tell if you are pleased or displeased. They will scroll screenfuls of information past you regardless of whether you are sitting forward eagerly in your seat, or have begun to emit loud snoring sounds.” All this makes our current technology seem dry, unable to capture the entire human experience and interaction, and unable to center on the most important element of the equation: the user.

To obtain affective technology that truly shapes itself on the human paradigm, we need to be able to recognize or estimate user emotional states and employ these findings in the design, functionality and interaction support of our digital systems. But before we focus on how we can use technology to detect emotions, we need to clarify what emotions are and how they are classified. Certainly, subjective human experiences are inherently subtle and hard to define, resulting in publications that intermix concepts like affect, emotion, mood, and sentiment. Furthermore, cognitive psychology offers a relatively wide array of definitions for all these concepts, leaving many degrees of freedom in selecting a view on the topic.

In the following, we provide a brief overview of the most widely accepted definitions and classifications of affective states. The notion of ‘affective state’ covers a set of concepts, including core affect, emotions, moods, and personality. These concepts differ in multiple ways, one of which is their temporal persistence. Core affect is defined as “a simple primitive non-reflective feeling most evident in mood and emotion but always available to consciousness” [2]. Core affect is thus constantly present in an individual, and it can be experienced either as part of an emotion or mood, or completely independently [3]. Furthermore, core affect can be linked to Russell's circumplex model of affect, detailed later in this section. Elementary building blocks of emotion that fall into the core affect category include pleasure–displeasure and high–low energy levels.

Next, emotion is defined as a medium-term affective state characterized by its appearance as a response to an external or internal stimulus [4], [5], represented by a person, an event, a memory, an image, a scent, etc. Thus, emotions are not bound by time or reality, as they can also be generated by imaginary experiences or by events from the past. Examples of emotional states include love, fear, anger, and sadness.

Compared to emotions, moods are generally not elicited by a concrete stimulus and thus have a more diffuse nature. While the origin of a mood is rarely known, a mood also remains active for longer periods of time than an emotion [6]. Finally, personality is a more persistent subjective aspect, encoding attitudes towards a concept or object. In this paper, we focus on technologies that have been employed to estimate user emotions and moods, with an emphasis on the former. Emotions are of particular importance in affective computing and interaction, as they can be linked to events or objects, thus offering feedback about the interaction cycle of a user with his/her collaborators or a system.
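
As an illustration of how these distinctions can be operationalized in software, the following minimal Python sketch encodes the four concepts as a simple data model. The type name, fields, and duration values are hypothetical placeholders reflecting the rough orders of magnitude discussed above, not established constants.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AffectiveStateKind:
    """A category of affective state, distinguished by its temporal
    persistence and by whether it is bound to a concrete stimulus."""
    name: str
    stimulus_bound: bool       # elicited by a concrete (real or imagined) stimulus?
    typical_duration_s: float  # assumed order of magnitude, in seconds

# Illustrative taxonomy following the distinctions above (durations are assumptions).
CORE_AFFECT = AffectiveStateKind("core affect", False, float("inf"))  # always present
EMOTION     = AffectiveStateKind("emotion",     True,  60.0)          # seconds to minutes
MOOD        = AffectiveStateKind("mood",        False, 86_400.0)      # hours to days
PERSONALITY = AffectiveStateKind("personality", False, float("inf"))  # persistent
```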

In terms of classification, emotion theory has centered on two ways of grouping affective experiences. On the one hand, some theories focus on defining and analyzing emotions as a set of distinct states. Some of the more widely used theories include Ekman's theory of six basic emotions [7], [8] (disgust, anger, fear, joy, sadness, and surprise) and Plutchik's theory of eight basic emotions [9] (disgust, anger, fear, sorrow, joy, acceptance, anticipation, and surprise). Emotions that do not fall into the category of basic ones are usually defined as combinations or variations of basic emotions [10].

One characteristic of the basic emotions is that they are, by their nature, easier to recognize with various technologies, as they are assumed to generate distinct functional patterns in the human body (e.g., brain activation patterns or physiological patterns). However, studies have shown that non-basic emotions like frustration, boredom, and confusion can be more frequent and thus more useful in human–computer interaction scenarios [11].

On the other hand, there are theories of affect that focus on distributing and describing all emotional states through a set of dimensions. While the number of dimensions considered varies, the most widely accepted approaches focus on a 2D or 3D model: Russell's circumplex model of affect (see Fig. 1) encodes emotional states in a two-dimensional space defined by valence (positive–negative) and arousal (excited–calm) [12], while the three-dimensional model of Mehrabian incorporates the three axes of pleasure (valence), arousal, and dominance (abbreviated: PAD) [13]. Note that Ekman's six basic emotions have counterparts in Russell's two-dimensional model, thus offering a correspondence between the two categories of models.
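
This correspondence can be sketched in code: the snippet below places Ekman's six basic emotions at approximate coordinates in Russell's valence–arousal plane and maps a continuous reading to the nearest basic emotion. The exact coordinates are illustrative assumptions chosen for this sketch, not canonical values.

```python
import math

# Approximate placements of Ekman's six basic emotions in Russell's
# valence-arousal plane; both axes normalized to [-1, 1].
# The exact coordinates are assumptions made for this sketch.
BASIC_EMOTIONS = {
    "joy":      ( 0.8,  0.5),
    "surprise": ( 0.2,  0.8),
    "anger":    (-0.6,  0.7),
    "fear":     (-0.7,  0.6),
    "disgust":  (-0.7,  0.2),
    "sadness":  (-0.7, -0.5),
}

def nearest_basic_emotion(valence: float, arousal: float) -> str:
    """Map a continuous (valence, arousal) estimate to the closest basic emotion."""
    return min(BASIC_EMOTIONS,
               key=lambda e: math.dist((valence, arousal), BASIC_EMOTIONS[e]))

# Example: a mildly positive, highly aroused reading maps to "surprise".
print(nearest_basic_emotion(0.3, 0.9))
```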

In contrast to affective computing, affective interaction focuses on the importance of the emotional experiences of the users from the perspective of awareness and reflection, and the degree to which they are grounded in the interaction itself [14], [15]. Furthermore, affective interaction focuses less on accurate emotional readings and more on raising the user's emotional awareness, enabling him/her to analyze and evaluate the experience, communication, and interaction.

Section 2 details the functionality and capabilities of a selected group of technologies currently used in affective computing and interaction. We then present applications that use emotion estimation technology to enhance or augment user interaction (Section 3). Finally, we highlight in Section 4 a set of research challenges and limitations of current emotion recognition technologies and present our conclusions in Section 5.

Section snippets

Estimating user emotion

One particular way in which user emotions can be employed is to augment user interaction, offering the ability to develop more user-centered systems. But before inspecting examples of emotion-enhanced interactive systems, we would like to focus on highlighting some of the most common emotion measurement technologies that have been used increasingly often in the context of affective interaction.

While the detection techniques are varied, there is a set of commonalities when addressing emotion …
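
Although the snippet above is truncated, the commonality across detection techniques can be summarized as a shared processing pipeline: acquire a raw signal from some sensor, extract features from it, and classify the features into an emotion estimate. The following Python sketch makes this structure explicit; all interfaces and names are hypothetical.

```python
from typing import Callable, Sequence

def estimate_emotion(
    acquire: Callable[[], Sequence[float]],                 # raw signal window (EEG, audio, video features, ...)
    extract: Callable[[Sequence[float]], Sequence[float]],  # signal -> feature vector
    classify: Callable[[Sequence[float]], str],             # feature vector -> emotion label
) -> str:
    """One pass of the acquire -> extract -> classify pipeline shared by
    most emotion detection techniques; only the stages' internals differ."""
    return classify(extract(acquire()))
```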

Affective interactional systems

Emotion-enhanced interaction encompasses systems and applications that consider emotions as a novel dimension of communication between user and machine. These applications focus on emotion in the context of interaction and encourage a user-centered experience through reflection on these emotional states [80].

Inspired by the article of Boehner et al. [81], we propose a novel categorization of affective solutions based on the concrete role that affect plays in the interactional …

Research challenges

While the discussed technologies for detecting user emotions are improving in multiple aspects, there are still a number of challenges that these solutions need to address in the near future. These challenges are focused on affect detection techniques and are closely related to the nature and definitions of emotional states, as well as influenced by user-centered requirements.

One of the main challenges of emotion detection revolves around uncertainty. Affective states have an inherent …
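
One way in which implementations commonly make this uncertainty explicit is to report a probability distribution over emotion labels rather than a single hard decision, and to withhold a label when confidence is low. A minimal sketch follows; the function name and threshold are assumptions chosen for illustration.

```python
from typing import Dict, Optional

def decide(distribution: Dict[str, float], threshold: float = 0.6) -> Optional[str]:
    """Return the most probable emotion label only if its probability clears
    the threshold; otherwise return None, keeping the uncertainty visible."""
    label, p = max(distribution.items(), key=lambda kv: kv[1])
    return label if p >= threshold else None

# Example: an ambiguous reading yields no hard label.
print(decide({"joy": 0.45, "surprise": 0.40, "anger": 0.15}))  # -> None
```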

Conclusions

Real-time detection and classification of user emotions is not only at the base of affective computing, but also has the potential to enable a more user-centered interaction. In this paper, we focused on highlighting a selection of technologies and approaches that are increasingly often used in this context of emotion-enhanced interaction. We presented a set of emotion detection technologies, including brain–computer interfaces and eye tracking, which have as advantages portability and …

References (114)

  • J. Russell, Core affect and the psychological construction of emotion, Psychol. Rev. (2003)
  • K. Scherer, Toward a dynamic theory of emotion: the component process model of affective states, Geneva Stud. Emot....
  • K. Scherer, Psychological models of emotion
  • N. Frijda, Mood, in: D. Sander, K. Scherer (Eds.), The Oxford Companion to Emotion and the Affective Sciences, Oxford...
  • P. Ekman, An argument for basic emotions, Cogn. Emot. 6 (3–4) (1992)...
  • J. Panksepp, A critical role for affective neuroscience in resolving what is basic about basic emotions, Psychol. Rev. (1992)
  • R. Plutchik, The nature of emotions, Am. Sci. (2001)
  • G. Bower, How might emotions affect learning?
  • S.K. D'Mello, R.A. Calvo, Beyond the basic emotions: what should affective computing compute?, in: CHI Extended...
  • J. Russell, A circumplex model of affect, J. Personal. Soc. Psychol. (1980)
  • A. Mehrabian, Framework for a comprehensive description and measurement of emotional states, Genet. Soc. Gen. Psychol. Monogr. (1995)
  • K. Boehner, R. DePaula, P. Dourish, P. Sengers, Affect: from information to interaction, in: Proceedings of the Fourth...
  • K. Höök, A. Ståhl, P. Sundström, J. Laaksolaahti, Interactional empowerment, in: Proceedings of the SIGCHI Conference...
  • C. Xu, S. Li, G. Liu, Y. Zhang, E. Miluzzo, Y.-F. Chen, J. Li, B. Firner, Crowd++: unsupervised speaker count with...
  • R. Vipperla, J. Geiger, S. Bozonnet, D. Wang, N. Evans, B. Schuller, G. Rigoll, Speech overlap detection and...
  • D. Charlet, C. Barras, J.-S. Lienard, Impact of overlapping speech detection on speaker diarization for broadcast news...
  • R.A. Calvo, S. D'Mello, J. Gratch, A. Kappas (Eds.), The Oxford Handbook of Affective Computing, 1st edition, Oxford...
  • A. Tawari, M. Trivedi, Speech emotion analysis in noisy real-world environment, in: The 20th International Conference...
  • B. Schuller, D. Seppi, A. Batliner, A. Maier, S. Steidl, Towards more reality in the recognition of emotional speech,...
  • C.-C. Lee et al., Speech in affective computing
  • E.L. van den Broek, M.H. Schut, J.H.D.M. Westerink, J. van Herk, K. Tuinenbreijer, Computing emotion awareness through...
  • D. Heger, F. Putze, T. Schultz, Online recognition of facial actions for natural EEG-based BCI applications, in: ACII...
  • P. Ekman, W. Friesen, The Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting...
  • C. Izard, The Maximally Discriminative Facial Movement Coding System (MAX) (1977)
  • W.V. Friesen, P. Ekman, EMFACS-7: emotional facial action coding system, Unpublished manuscript, University of...
  • G. Littlewort, J. Whitehill, T. Wu, I. Fasel, M. Frank, J. Movellan, M. Bartlett, The computer expression recognition...
  • S.B. Gokturk, Model-based face tracking for view-independent facial expression recognition, in: Proceedings of the...
  • R.E. Jack et al., Facial expressions of emotion are not culturally universal, Proc. Natl. Acad. Sci. USA (2012)
  • R. Reisenzein et al., Coherence between emotion and facial expression: evidence from laboratory experiments, Emot. Rev. (2013)
  • C.S.S. Tan, J. Schöning, K. Luyten, K. Coninx, Informing intelligent user interfaces by inferring affective states from...
  • V.I. Pavlovic et al., Visual interpretation of hand gestures for human–computer interaction: a review, IEEE Trans. Pattern Anal. Mach. Intell. (1997)
  • J. Aggarwal, Q. Cai, Human motion analysis: a review, Comput. Vis. Image Underst. 73 (3)...
  • A. Poole, L.J. Ball, Eye tracking in human–computer interaction and usability research: current status and future, in:...
  • A. Duchowski, Eye Tracking Methodology: Theory and Practice, 2nd edition (2007)
  • K. Holmqvist et al., Eye Tracking: A Comprehensive Guide to Methods and Measures (2011)
  • K. Rayner, A. Pollatsek, The Psychology of Reading, 1994,...
  • D. Bruneau, M. Sasse, J. McCarthy, The eyes never lie: the use of eye tracking data in HCI research, in: Proceedings of...
  • M. Pomplun, S. Sunkara, Pupil dilation as an indicator of cognitive workload in human–computer interaction, in:...
  • C. Niemic et al., Studies of emotion: a theoretical and empirical review of psychophysiological studies of emotion, J. Undergrad. Res. (2002)
  • M.M. Bradley et al., Startle and emotion: lateral acoustic probes and the bilateral blink, Psychophysiology (1991)