A survey of technologies on the rise for emotion-enhanced interaction☆
Introduction
All of us experience emotions throughout our entire lives and in the most varied circumstances. They influence our decisions and experiences, but most of all they affect our social interactions and interpersonal relations. And while emotions spill into almost every aspect of our days, there are still areas from which human emotion is almost entirely absent.
One such context is given by our interactions with technology. Nowadays, we may work with specialized tools, communicate online, organize our lives with apps, have fun with games, and experience an entire range of emotions while doing so, anything from joy to excitement to frustration. Yet, while we do bring our emotional selves into the world of computing, the field still offers very little support for our emotion-enhanced interaction. We communicate our subjective states online through emoticons, emojis and “like”-buttons, we smile in the face of an aesthetically pleasing interface without any reaction from the system, and we work together with our colleagues without knowing how they feel about our work. As stated by Picard [1] in her work in the field of Affective Computing: “Basic affect recognition and expression are expected by humans in communication. However, computers today cannot even tell if you are pleased or displeased. They will scroll screenfuls of information past you regardless of whether you are sitting forward eagerly in your seat, or have begun to emit loud snoring sounds.” All this makes our current technology seem dry, unable to capture the entire human experience and interaction, or to center on the most important element of the equation: the user.
To obtain affective technology that truly shapes itself on the human paradigm, we need to be able to recognize or estimate user emotional states and employ these findings in the design, functionality and interaction support of our digital systems. But before we focus on how we can use technology to detect emotions, we need to clarify what emotions are and how they are classified. Certainly, subjective human experiences are inherently subtle and hard to define, resulting in publications that intermix concepts like affect, emotion, mood, and sentiment. Furthermore, cognitive psychology offers a relatively wide array of definitions for all these concepts, leaving many degrees of freedom in selecting a view on the topic.
In the following, we provide a brief overview of the most widely accepted definitions and classifications of affective states. The notion of ‘affective state’ covers a set of concepts, including core affect, emotions, moods, and personality. These concepts differ in multiple ways, one of which is their temporal persistency. Core affect is defined as “a simple primitive non-reflective feeling most evident in mood and emotion but always available to consciousness” [2]. Core affect is thus constantly present in an individual, and it can be experienced either as part of an emotion or mood, or completely independently [3]. Furthermore, core affect can be linked to Russell’s circumplex model of affect, detailed later in this section. Elementary building blocks of emotion in the core affect category include pleasure–displeasure and high–low energy levels.
Next, emotion is defined as a medium-term affective state characterized by its appearance as a response to an external or internal stimulus [4], [5], represented by a person, an event, a memory, an image, a scent, etc. Thus, emotions are not bounded by time or reality, since they can also be generated by imaginary experiences or by events from the past. Examples of emotional states include love, fear, anger, and sadness.
Compared to emotions, moods are generally not elicited by a concrete stimulus, thus having a more diffuse nature. While the origin of a mood is rarely known, it is an affective state that remains active for longer periods of time than emotions [6]. Finally, personality is the most persistent of these subjective aspects, encoding stable attitudes and dispositions towards concepts or objects. In this paper, we focus on technologies that have been employed to estimate user emotions and moods, with an emphasis on the former. Emotions are of particular importance in affective computing and interaction, as they can be linked to events or objects, thus offering feedback about the interaction cycle of a user with his/her collaborators or a system.
In terms of classification, emotion theory has centered around two ways of grouping affective experiences. On the one hand, some theories focus on defining and analyzing emotions as a set of distinct states. Among the more widely used are Ekman’s theory of six basic emotions [7], [8] (disgust, anger, fear, joy, sadness, and surprise) and Plutchik’s theory of eight basic emotions [9] (disgust, anger, fear, sorrow, joy, acceptance, anticipation, and surprise). Emotions that do not fall into the category of basic ones are usually defined as combinations or variations of basic emotions [10].
One characteristic of the basic emotions is that they are, by their nature, easier to recognize with various technologies, as they would generate distinct functional patterns in the human body (e.g., brain activation patterns or physiological patterns). However, studies have shown that non-basic emotions like frustration, boredom and confusion can be more frequent and thus more useful in human–computer interaction scenarios [11].
On the other hand, there are theories of affect that focus on distributing and describing all emotional states along a set of dimensions. While the number of considered dimensions varies, the most widely accepted approaches focus on a 2D or 3D model: Russell’s circumplex model of affect (see Fig. 1) encodes emotional states in a two-dimensional space defined by valence (positive–negative) and arousal (excited–calm) [12]; and the three-dimensional model of Mehrabian incorporates the three axes of pleasure (valence), arousal and dominance (abbreviated: PAD) [13]. Note that Ekman’s six basic emotions have counterparts in Russell’s two-dimensional model, offering a correspondence between the two categories of models.
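To make this correspondence concrete, the sketch below places Ekman’s six basic emotions at approximate coordinates in Russell’s valence–arousal space and maps an arbitrary (valence, arousal) estimate to its nearest basic emotion. The coordinates are illustrative assumptions for the purpose of the example, not values taken from the surveyed works.

```python
import math

# Approximate, illustrative positions of Ekman's six basic emotions in
# Russell's circumplex model. Both axes range from -1 to 1
# (valence: negative -> positive; arousal: calm -> excited).
BASIC_EMOTIONS = {
    "joy":      ( 0.8,  0.5),
    "surprise": ( 0.4,  0.9),
    "anger":    (-0.6,  0.8),
    "fear":     (-0.7,  0.7),
    "disgust":  (-0.7,  0.2),
    "sadness":  (-0.7, -0.4),
}

def nearest_basic_emotion(valence: float, arousal: float) -> str:
    """Return the basic emotion closest (Euclidean) to a (valence, arousal) reading."""
    return min(
        BASIC_EMOTIONS,
        key=lambda e: math.dist((valence, arousal), BASIC_EMOTIONS[e]),
    )
```

For instance, a reading of high positive valence and moderate arousal, `nearest_basic_emotion(0.9, 0.4)`, falls closest to joy. In practice, an emotion-detection pipeline would produce the valence and arousal estimates from sensor data; this sketch only shows how the dimensional and discrete views can be bridged.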
Contrary to affective computing, affective interaction focuses on the importance of the emotional experiences of the users from the perspective of awareness and reflection, and the degree to which they are grounded in the interaction itself [14], [15]. Furthermore, affective interaction focuses less on accurate emotional readings and more on raising users’ emotional awareness, enabling them to analyze and evaluate the experience, communication and interaction.
Section 2 illuminates the functionality and abilities of a selected group of technologies, currently used in affective computing and interaction. We then present applications that use emotion estimation technology to enhance or augment the user interaction (Section 3). Finally, we highlight in Section 4 a set of research challenges and limitations of the current emotion recognition technologies and present our conclusions in Section 5.
Section snippets
Estimating user emotion
One particular way user emotions can be employed is to augment user interaction by enabling the development of more user-centered systems. But before inspecting examples of emotion-enhanced interactive systems, we would like to highlight some of the most common emotion measurement technologies that have been used increasingly often in the context of affective interaction.
While the detection techniques are varied, there are a set of commonalities when addressing emotion
Affective interactional systems
Emotion-enhanced interaction encompasses systems and applications that consider emotions as a novel dimension of communication between user and machine. These applications focus on emotion in the context of interaction, and encourage a user-centered experience through reflection on these emotional states [80].
Inspired by the article of Boehner et al. [81], we propose a novel categorization of affective solutions based on the concrete role that affect plays in the interactional
Research challenges
While the discussed technologies for detecting user emotions are improving in multiple aspects, several challenges remain that these solutions need to address in the near future. These challenges are focused on affect detection techniques and are closely related to the nature and definitions of emotional states, as well as influenced by user-centered requirements.
One of the main challenges of emotion detection revolves around uncertainty. Affective states have an inherent
Conclusions
Real-time detection and classification of user emotions is at the base of not only affective computing, but also has the potential of enabling a more user-centered interaction. In this paper, we focused on highlighting a selection of technologies and approaches that are increasingly often used in this context of emotion-enhanced interaction. We presented a choice of emotion detection technologies, including brain–computer interfaces and eye tracking, which have as advantages portability and
References (114)
- et al., Detection, tracking, and classification of action units in facial expression, Robot. Auton. Syst. (2000)
- Microexpression and macroexpression
- et al., Psychophysiological responses to changes in workload during simulated air traffic control, Biol. Psychol. (1996)
- Eye blinks: new indices for the detection of deception, Psychophysiology (2001)
- et al., Pupillometric measures of cognitive and emotional processes, Int. J. Psychophysiol. (2004)
- et al., Pupil size variation as an indication of affective processing, Int. J. Hum.-Comput. Stud. (2003)
- et al., Definition and computation of oculomotor measures in the study of cognitive processes
- et al., How emotion is made and measured, Int. J. Hum.-Comput. Stud. (2007)
- Affective Computing (1997)
- J. Russell, L. Feldman Barrett, Core affect, in: D. Sander, K. Scherer (Eds.), The Oxford Companion to Emotion and the...
- Core affect and the psychological construction of emotion, Psychol. Rev.
- Psychological models of emotion
- A critical role for affective neuroscience in resolving what is basic about basic emotions, Psychol. Rev.
- The nature of emotions, Am. Sci.
- How might emotions affect learning?
- A circumplex model of affect, J. Personal. Soc. Psychol.
- Framework for a comprehensive description and measurement of emotional states, Genet. Soc. Gen. Psychol. Monogr.
- Speech in affective computing
- The Maximally Discriminative Facial Movement Coding System (MAX)
- Facial expressions of emotion are not culturally universal, Proc. Natl. Acad. Sci. USA
- Coherence between emotion and facial expression: evidence from laboratory experiments, Emot. Rev.
- Visual interpretation of hand gestures for human–computer interaction: a review, IEEE Trans. Pattern Anal. Mach. Intell.
- Eye Tracking Methodology: Theory and Practice, 2nd edition
- Eye Tracking: A Comprehensive Guide to Methods and Measures
- Studies of emotion: a theoretical and empirical review of psychophysiological studies of emotion, J. Undergrad. Res.
- Startle and emotion: lateral acoustic probes and the bilateral blink, Psychophysiology
☆ This paper has been recommended for acceptance by Henry Duh.