Review

Human Emotion Recognition: Review of Sensors and Methods

by Andrius Dzedzickis 1, Artūras Kaklauskas 2 and Vytautas Bucinskas 1,*

1 Faculty of Mechanics, Vilnius Gediminas Technical University, J. Basanaviciaus g. 28, LT-03224 Vilnius, Lithuania
2 Faculty of Civil Engineering, Vilnius Gediminas Technical University, Sauletekio ave. 11, LT-10223 Vilnius, Lithuania
* Author to whom correspondence should be addressed.
Sensors 2020, 20(3), 592; https://doi.org/10.3390/s20030592
Submission received: 11 December 2019 / Revised: 10 January 2020 / Accepted: 12 January 2020 / Published: 21 January 2020
(This article belongs to the Section Biomedical Sensors)

Abstract: Automated emotion evaluation (AEE) is an important issue in various fields of activity which use human emotional reactions as a signal for marketing, technical equipment, or human–robot interaction. This paper analyzes scientific research and technical papers on the sensors used for AEE among the various methods implemented or researched. It covers several classes of sensors, using contactless methods as well as contact and skin-penetrating electrodes, for detecting human emotions and measuring their intensity. The results of the analysis identify applicable methods for each type of emotion and its intensity and propose their classification. A classification of emotion sensors is presented to reveal the area of application and expected outcomes of each method, as well as its limitations. This paper should be relevant for researchers working on human emotion evaluation and analysis who need to choose a proper method for their purposes or to find alternatives. Based on the analyzed human emotion recognition sensors and methods, we developed practical applications for humanizing the Internet of Things (IoT) and affective computing systems.

1. Introduction

With the rapid increase in the use of smart technologies in society and the development of industry, the need for technologies capable of assessing the needs of a potential customer and choosing the most appropriate solution for them is increasing dramatically. Automated emotion evaluation (AEE) is particularly important in areas such as robotics [1], marketing [2], education [3], and the entertainment industry [4]. AEE is applied to achieve various goals:
(i) in robotics: to design smart collaborative or service robots which can interact with humans [5,6,7];
(ii) in marketing: to create specialized adverts based on the emotional state of the potential customer [8,9,10];
(iii) in education: to improve learning processes, knowledge transfer, and perception methodologies [11,12,13];
(iv) in the entertainment industry: to propose the most appropriate entertainment for the target audience [14,15,16,17].
Numerous attempts to classify emotions and to set boundaries between emotion, affect, and mood are presented in the scientific literature [18,19,20,21]. From the perspective of automated emotion recognition and evaluation, the most convenient classification is presented in [3,22]. According to the latter classification, the main terms are defined as follows:
(i) “emotion” is a response of the organism to a particular stimulus (person, situation, or event); usually it is an intense, short-duration experience, and the person is typically well aware of it;
(ii) “affect” is a result of the effect caused by emotion and includes their dynamic interaction;
(iii) “feeling” is always experienced in relation to a particular object of which the person is aware; its duration depends on the length of time that the representation of the object remains active in the person’s mind;
(iv) “mood” tends to be subtler, longer lasting, less intense, and more in the background, but it can shift a person’s affective state in a positive or negative direction.
According to the research performed by Feidakis, Daradoumis, and Cabella [21], who present a classification of emotions based on fundamental models, there exist 66 emotions, which can be divided into two groups: ten basic emotions (anger, anticipation, distrust, fear, happiness, joy, love, sadness, surprise, trust) and 56 secondary emotions. Evaluating such a large number of emotions is extremely difficult, especially if automated recognition and evaluation is required. Moreover, similar emotions can have overlapping measured parameters. To handle this issue, the majority of emotion evaluation studies focus on other classifications [3,21], which include dimensions of emotions, in most cases valence (negative/positive) and arousal (high/low) [23,24], and analyze only basic emotions, which can be defined more easily. A majority of studies use variations of Russell’s circumplex model of emotions (Figure 1), which distributes the basic emotions in a two-dimensional space with respect to valence and arousal. Such an approach allows a desired emotion to be defined and its intensity evaluated by analyzing only two dimensions.
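To make the two-dimensional idea concrete, the following minimal Python sketch labels a measured (valence, arousal) point with the nearest basic emotion on the circumplex. The emotion set and their angles are illustrative assumptions for this sketch, not values taken from Russell’s model or the cited papers.

```python
import math

# Each basic emotion is placed at an angle in the valence-arousal plane;
# a measured (valence, arousal) point is labeled with the nearest emotion.
# Angles are illustrative assumptions.
EMOTION_ANGLES = {
    "happy": 20, "excited": 60, "angry": 120, "afraid": 150,
    "sad": 210, "bored": 250, "relaxed": 300, "content": 340,
}

def classify(valence: float, arousal: float) -> str:
    """valence, arousal in [-1, 1]; returns the nearest basic emotion."""
    angle = math.degrees(math.atan2(arousal, valence)) % 360
    return min(EMOTION_ANGLES,
               key=lambda e: min(abs(angle - EMOTION_ANGLES[e]),
                                 360 - abs(angle - EMOTION_ANGLES[e])))

print(classify(0.8, 0.4))   # -> "happy" (positive valence, mild arousal)
print(classify(-0.6, 0.7))  # -> "angry" (negative valence, high arousal)
```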
Using the above-described model, the classification and evaluation of emotions becomes clearer, but many issues related to the assessment of emotions remain, especially the selection of measurement and result evaluation methods and the selection of measurement hardware and software. Moreover, emotion recognition and evaluation remains complicated by its interdisciplinary nature: emotion recognition and strength evaluation are objects of psychology, the measurement and evaluation of human body parameters belong to medical science and measurement engineering, and sensor data analysis and decision making are objects of mechatronics.
This review focuses on the hardware and methods used for automated emotion recognition that are applicable to machine learning procedures, using analysis of the obtained experimental data and automated solutions based on the results of these analyses. This study also analyzes the idea of humanizing the Internet of Things and affective computing systems, which has been validated by systems developed by the authors of this research [25,26,27,28].
Intelligent machines with empathy for humans could make the world a better place. The IoT field is progressing toward human emotion understanding thanks to achievements in human emotion recognition (sensors and methods), computer vision, speech recognition, deep learning, and related technologies [29].

2. Emotion Evaluation Methods

Emotion evaluation methods presented in the literature can be classified into two main groups according to the basic techniques used for emotion recognition: self-report techniques, based on emotion self-assessment by filling in various questionnaires [30,31,32], and machine assessment techniques, based on measurements of various parameters of the human body [33,34,35]. In addition, several methods are frequently used simultaneously in order to increase the reliability of the obtained results. According to [36,37], each emotion can be evaluated by analyzing five main components (behavioral tendencies, physiological reactions, motor expressions, cognitive appraisals, and subjective feelings), but only the first four can be evaluated automatically and can give indications about the emotional state of a user during an interaction without interrupting it. Subjective feelings are usually evaluated only using self-assessment techniques.
Automated emotion recognition is typically performed by measuring various human body parameters or electric impulses in the nervous system and analyzing their changes. The most popular techniques are electroencephalography, skin resistance measurements, blood pressure, heart rate, eye activity, and motion analysis.

2.1. Electroencephalography (EEG)

EEG is a noninvasive electrophysiological technique for recording the electrical activity arising from the human brain [38]. Hans Berger, a German psychiatrist, pioneered the EEG in humans and presented the first report on this technique in 1924 [38]. EEG signals are usually collected using a special device called an electroencephalograph. The main parts of this device are metal plate electrodes placed on the human scalp; in special cases, alternative needle electrodes can be inserted directly into the scalp [39]. In most cases, 8, 16, or 32 pairs of electrodes are located relative to four standard positions on the head: the nasion, inion, and right and left preauricular points (Figure 2a) [39].
Electrodes can be attached to the human scalp using adhesive conducting gel or special headsets with installed electrodes [41] (Figure 2b). The EEG signal is the fluctuation of voltage between two paired electrodes with respect to time [42] (Figure 3a), and the signal amplitude is usually evaluated using the peak-to-peak technique (Figure 3b).
For the evaluation of human emotions, the brain’s response to various stimuli is usually measured and analyzed in five frequency bands of the EEG signal: delta, theta, alpha, beta, and gamma. These band waves are omnipresent in different parts of the brain [45,46] and are related to various emotional states (Table 1).
Depending on the object of interest, a variety of different methods can be implemented for the processing and analysis of EEG signals. If the purpose is to evaluate an average level of valence and arousal or to detect the efficiency of applied stimulus, a fast Fourier transformation [48] or latency test can be used [49]. If the purpose is to identify a specific emotion and its strength, statistical methods [50] or machine learning techniques [51] can be implemented. A review of related works based on only EEG signals is provided in Table 2.
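As a concrete illustration of the frequency-domain route just mentioned, the following minimal Python sketch extracts the power in each classical band by integrating a Welch power spectral density estimate. The band limits are common conventions, and the synthetic test signal and window length are assumptions for illustration, not parameters from the reviewed studies.

```python
import numpy as np
from scipy.signal import welch

# Classical EEG bands (Hz); limits are conventional, not study-specific.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg: np.ndarray, fs: float) -> dict:
    """Integrate the Welch PSD inside each band to get band powers."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[mask], freqs[mask])
    return powers

fs = 512  # sampling rate used, e.g., in the DEAP recordings
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)  # 10 Hz "alpha"
print(band_powers(signal, fs))  # alpha power should dominate
```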
From the information provided in Table 2, it is seen that the main part of the research is focused on the development of more advanced methods for emotion recognition from EEG signals. For this purpose, it is generally not required to run a real experiment in order to validate the proposed method, because free databases with EEG signals recorded under known conditions are available. One of the most popular databases for EEG signal analysis is DEAP (database for emotion analysis using physiological signals) [62], which contains EEG and other physiological signals from 32 participants stimulated by 40 different one-minute music videos. The EEG and peripheral physiological signals were recorded using a Biosemi ActiveTwo system; all 32 channels were recorded with a sampling frequency of 512 Hz. The obtained data were related to results from self-assessment and from other emotion recognition techniques in order to form a reliable dataset appropriate for use in future research. A comparison between the DEAP dataset, MAHNOB-HCI (multimodal database for affect recognition and implicit tagging) [63], EMDB (emotional movie database) [64], and DECAF (multimodal dataset for decoding affective physiological responses) [65] is provided in [66].
Information on the scientific research of emotion recognition using EEG is provided in Table 2. The main focus of activity is the development of new methods for information extraction from EEG rather than the measurement procedure itself, which opens broad potential for machine learning techniques with IoT capabilities. Moreover, in practice, EEG quite often serves as the basic technique in sensor data fusion, complemented by other sensors and methods. Thus, big data technology with IoT implementation opens new horizons in automated emotion recognition.

2.2. Electrocardiography (ECG)

The heart is one of the most critical organs in the human body, and electrocardiography (ECG) is considered one of the most powerful diagnostic tools in medicine, routinely used to assess the functionality of the heart. ECG, being a physiological signal, is the conventional method for noninvasive interpretation of the electrical activity of the heart in real time [67]. Since heart activity is related to the human central nervous system, ECG is useful not only for analyzing the heart’s activity but also for emotion recognition [68].
The ECG recording procedure is described in detail in [69]. The most commonly used technique is known as the 12-lead ECG technique. This technique uses nine sensors placed on the human body (Figure 4a). The three main sensors are placed on the left arm (LA), right arm (RA), and left leg (LL). The right leg (RL) is connected only by a wire, which is used as ground for the interconnected sensors. With only these three sensors, physicians can use a method called 3-lead ECG, which suffers from a lack of information about some parts of the heart but is useful for emergency cases requiring quick analysis. To obtain a higher resolution, six sensors (V1-V6) are added on the chest (Figure 4a). These sensors also measure against the ground (G) on the right leg (RL). Using all nine sensors and interconnecting them for the 12-lead ECG gives twelve signals, known in biomedical terms as Lead I, Lead II, Lead III, aVR, aVL, aVF, V1, V2, V3, V4, V5, and V6 (Figure 4b).
The most important points on the ECG signal are the peaks P, Q, R, S, T, and U [69] (Figure 5b). Each of these peaks is related to heart activity [69] and has its own characteristics (Table 3). Emotion recognition using this physiological signal is a more complex process compared to EEG because of its sensitivity to movement artifacts and the inability to visually perceive emotion from the data [70]. In order to eliminate noise caused by outside factors, such as the movement of the subject during the measurement procedure [71], ECG is usually performed in spaces protected from environmental effects while the human is in a calm state (Figure 5a).
There are five main parameters, shown in Table 3, which are often used to evaluate ECG signals. Usually, all five parameters are analyzed only for medical purposes, to define abnormal heart activity and obtain its deviation parameters. For the recognition of emotions, in most cases, the QRS complex is used; it reflects activation of the heart related to the human emotional state and is a suitable indicator for recognizing the main emotions, but difficulties remain because this indicator has varying sensitivity to specific emotions. Results of the research by Cai, Liu, and Hao [73] show that sadness can be recognized more easily and precisely than joy. The majority of studies on ECG-based emotion recognition focus on defining and evaluating the QRS amplitudes and the durations between those waves. Further, a set of studies focused on the analysis of QT/QTc dispersion [74] provides proof that this interval is related to anxiety level and can be used as a marker to recognize intense anger.
The main drawback of the 12-lead ECG is that it produces huge amounts of data, especially when used over many hours. Physicians use the 12-lead ECG method because it allows them to view the heart in its three-dimensional form, enabling the detection of any abnormality that may not be apparent in the 3-lead or 6-lead ECG techniques [69]. ECG application in automated emotion recognition requires sophisticated signal processing techniques, which enable the detection and extraction of the required parameters from the raw signal. A majority of QRS complex extraction techniques are based on the assumption that, at the beginning, it is enough to define the P or R peaks; the other parameters (Figure 5b) are then estimated relative to these peaks, since the signal shape is stable. A huge number of studies are focused on different feature extraction methods, including heart rate variability (HRV), empirical mode decomposition (EMD) with within-beat analysis (WIB), FFT analysis, and various wavelet transformations [51]. A detailed overview of methods used for emotion recognition from ECG is presented in [76]. Analysis of related research shows the suitability of the ECG technique for precise emotion recognition in laboratories and predefined stable environments, but fundamental limitations prevent the application of this method to contactless instantaneous emotion recognition. Such methods of emotion evaluation will inevitably be required in future applications in the fields of neuromarketing, tutoring, or human–machine interaction.
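As a concrete illustration of the peak-based extraction just described, the following minimal Python sketch locates R peaks in a raw ECG trace; the remaining fiducial points (P, Q, S, T) can then be searched for relative to them. The height threshold and refractory period are illustrative assumptions that would need tuning per recording.

```python
import numpy as np
from scipy.signal import find_peaks

def detect_r_peaks(ecg: np.ndarray, fs: float) -> np.ndarray:
    """Return sample indices of R peaks in a raw ECG trace."""
    # R peaks are the tallest deflections; enforce a refractory period of
    # 0.4 s (max ~150 bpm) and a height threshold relative to the signal.
    height = np.mean(ecg) + 1.5 * np.std(ecg)
    peaks, _ = find_peaks(ecg, height=height, distance=int(0.4 * fs))
    return peaks

# RR intervals (in seconds) follow directly and feed the HRV analysis
# described in Section 2.4:
# rr = np.diff(detect_r_peaks(ecg, fs)) / fs
```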
Due to the complexity of ECG signal analysis in practical applications, ECG is quite often used in conjunction with other emotion recognition techniques. A short overview of scientific research based on ECG is presented in Table 4.
Despite the above-described drawbacks, ECG remains a powerful and promising technique for emotion recognition, since it measures signals in the human body that are directly related to emotional states. The fact that much research focuses on creating new methods of extracting useful information makes ECG-based emotion recognition a great medium for implementing various machine learning techniques. Machine learning allows huge amounts of data to be analyzed automatically and relations to be defined between measurements performed under various circumstances: states when a human is relaxed or affected by some stimulus. Moreover, due to its high precision, ECG complemented by machine-learning-based signal analysis and processing techniques can be used for research on emotion perception mechanisms and for creating predictive models based on long-term monitoring of human behavior and emotional response.

2.3. Galvanic Skin Response (GSR)

The galvanic skin response (GSR), also known as electrodermal activity (EDA) or skin conductance (SC), is a continuous measurement of the electrical parameters of human skin. Most often, skin conductance is used as the main parameter in this technique. The electrical parameters of the skin are not under conscious human control [78] since, according to the traditional theory, they depend on the variation of the sweat reaction, which reflects changes in the sympathetic nervous system [79]. There is proof that some output signals from sympathetic nervous bursts are followed by changes in skin conductance [80]. Emotional changes induce sweat reactions, which are mostly noticeable on the surfaces of the fingers and the soles. The sweat reaction causes a variation in the amount of salt in the human skin, and this changes the electrical resistance of the skin [81]. When sweat glands become more active, they secrete moisture towards the skin surface; this changes the balance of positive and negative ions and affects the flow of electrical current across the skin [82].
Skin conductance is mainly related to the level of arousal: if the arousal level increases, the conductance of the skin also increases. GSR signal amplitude is associated with stress, excitement, engagement, frustration, and anger, and the obtained measurement results correlate with self-reported evaluations of arousal [83]. Attention-grabbing stimuli and attention-demanding tasks lead to a simultaneous increase in the frequency and magnitude of GSR. Thus, GSR allows not only emotions to be recognized but also decision-making processes to be detected automatically [84].
In the GSR method, the electrical conductance of the skin is measured using one or two sensors [81], which consist of special Ag/AgCl (silver chloride) electrodes in contact with the skin. There is a variety of possibilities for placing the electrodes (Figure 6) [85,86], but usually the sensors are attached to the fingers, wrist, shoulder, or foot (positions 1, 4, 10, and 6 in Figure 6).
A raw GSR signal contains information about two types of activity: tonic and phasic (Figure 7). The conductivity level of tonic activity changes slowly and individually for each human, depending mainly on skin hydration, dryness, and autonomic regulation in response to environmental factors such as temperature. Phasic responses are short-term peaks in GSR, mostly independent of the tonic level, reflecting reactions of the sympathetic nervous system to emotionally arousing events [87].
Since a GSR signal contains useful information in both its amplitude and its frequency, it is usually analyzed in the time and frequency domains by applying various techniques and extracting statistical parameters such as the median, mean, standard deviation, minimum, maximum, and the ratio of minimum to maximum [78]. The application of traditional signal analysis methods to GSR measurements is complicated by the fact that the signal contains low- and high-frequency components, and the reaction to the same stimulus is not always identical. By implementing machine learning algorithms, it is possible to increase the precision of emotion recognition and to recognize specific emotions related to the level of arousal, e.g., excitement or stress [81].
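The following minimal Python sketch illustrates this preprocessing: the slowly varying tonic level is approximated with a long moving average, the phasic component is the residual, and the statistical features listed above are extracted. The 10 s window is an assumption for illustration, not a value from the cited studies.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def gsr_features(gsr: np.ndarray, fs: float) -> dict:
    """Split a raw GSR trace into tonic/phasic parts and extract features."""
    tonic = uniform_filter1d(gsr, size=int(10 * fs))  # ~10 s moving average
    phasic = gsr - tonic                              # short-term responses
    # Count rising crossings of a simple phasic threshold as response onsets.
    onsets = np.diff((phasic > np.std(phasic)).astype(int)) == 1
    return {
        "mean": np.mean(gsr), "median": np.median(gsr),
        "std": np.std(gsr), "min": np.min(gsr), "max": np.max(gsr),
        "min_max_ratio": np.min(gsr) / np.max(gsr),
        "phasic_responses_per_s": np.sum(onsets) / (len(gsr) / fs),
    }
```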
Compared to ECG and EEG, GSR gives less information about the emotional state, but it has a few important advantages:
(i) it requires fewer measuring electrodes, which allows easier use of wearable devices and the definition of emotional states while a person engages in normal activities;
(ii) GSR produces less raw data, especially in long-term monitoring, which allows the obtained data to be analyzed more quickly and does not require much computational power;
(iii) the equipment required for GSR measurements is much simpler and cheaper; if special electrodes are available, a measuring device can be assembled from popular and freely available components (ADC converters, microcontrollers, etc.).
The main drawback of the GSR method is its lack of information related to the valence level. This issue is usually solved by additionally implementing other emotion recognition methods, and the complementarily obtained results allow a detailed analysis. A short review of studies where GSR is used for emotion recognition is provided in Table 5.
From the review of papers provided in Table 5, several directions of research are noticeable. The first is the development and validation of emotion recognition methods combining GSR with other techniques. The second is the development of various wearable sensors. The third is the implementation of modern signal processing and analysis techniques in order to create systems able to define certain emotions with extremely high reliability. An example of such an application is provided in [96], which proposes a stress detection system requiring only two physiological signals, namely GSR and heart rate. The study concludes that the best approach combining accuracy and real-time applicability uses fuzzy logic to model the behavior of individuals under different stressing and non-stressing situations; the proposed system detected stress with an accuracy of 99.5%.
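To give a flavor of the fuzzy-logic idea from [96] (this is a toy illustration, not the authors’ actual rule base), the following Python sketch maps GSR and heart rate readings to fuzzy “high” memberships with simple ramp functions and combines them with a fuzzy AND into a stress degree. All breakpoints are invented for illustration only.

```python
import numpy as np

def ramp(x: float, lo: float, hi: float) -> float:
    """Membership rising linearly from 0 at `lo` to 1 at `hi`."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

def stress_degree(gsr_uS: float, hr_bpm: float) -> float:
    gsr_high = ramp(gsr_uS, 2.0, 10.0)   # microsiemens, hypothetical range
    hr_high = ramp(hr_bpm, 70.0, 110.0)  # beats per minute, hypothetical
    # Rule: stress IF gsr_high AND hr_high (min is the usual fuzzy AND).
    return min(gsr_high, hr_high)

print(stress_degree(8.0, 100.0))  # -> 0.75: moderately strong stress
```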

2.4. Heart Rate Variability (HRV)

HRV is an emotional state evaluation technique based on the measurement of heart rate variability, meaning the beat-to-beat variation in time within a certain period of sinus rhythm (the RR interval in Figure 5b). Unlike the mean heart rate, which is expressed over a period of 60 s, HRV analysis examines the fine time variance in each heartbeat cycle and its regularity [97]. The variability in heart rate is regulated by the synergistic action of the two branches of the autonomous nervous system, the sympathetic and parasympathetic nervous systems. The heart rate represents the net effect of the parasympathetic nerves, which slow the heart rate, and the sympathetic nerves, which accelerate it. These changes are influenced by emotions, stress, and physical exercise [98,99]. Moreover, HRV depends on age and gender; additional factors include physical and mental stress, smoking, alcohol, coffee, overweight, and blood pressure, as well as glucose level, infectious agents, and depression. Inherited genes also significantly affect heart rate variability. A low HRV indicates a state of relaxation, whereas an increased HRV indicates a potential state of mental stress or frustration [100].
The classical technique for HRV measurement is ECG [97], which measures the primary electrobiological signal related to heart activity and provides the ability to define the time between heart pulses by extracting information about the RR interval (Figure 5b) variation with respect to time. The variation of the RR interval can be extracted from the ECG signal using common peak detection techniques, which define the duration between each R peak and form the HRV signal, expressing the variation of the interval between R peaks with respect to time.
Common HRV analysis usually includes methods in the time and frequency domains [97]. Various studies based on analyses in one or both domains are briefly reviewed in [101]. The application of HRV for emotion recognition is complicated by the fact that HRV is affected by other factors; to address this issue, various signal filtration and feature extraction techniques are implemented. Approximately 14 different parameters can be extracted by analyzing HRV; a detailed description of these parameters and their relation to the main emotions is presented in [102]. The most common technique used for HRV analysis is the calculation of the power spectral density (PSD) of the signal [101]. The PSD represents the spectral power density of a time series as a function of frequency. Typical HRV measurements taken from frequency-domain analysis are powers within frequency bands and ratios of powers. The amount of power contained within a frequency band can be obtained by integrating the PSD within the band frequency limits [103], as in the sketch below.
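The following minimal Python sketch shows this frequency-domain step: irregular RR intervals are resampled to a uniform grid, the PSD is estimated with Welch’s method, and power is integrated inside the conventional LF (0.04–0.15 Hz) and HF (0.15–0.4 Hz) bands. The resampling rate and interpolation kind are assumptions for illustration.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def lf_hf(rr_s: np.ndarray, fs_resample: float = 4.0) -> tuple:
    """Return (LF power, HF power, LF/HF ratio) from RR intervals in seconds."""
    t = np.cumsum(rr_s)                          # beat times in seconds
    grid = np.arange(t[0], t[-1], 1 / fs_resample)
    rr_uniform = interp1d(t, rr_s, kind="cubic")(grid)  # uniform resampling
    freqs, psd = welch(rr_uniform, fs=fs_resample, nperseg=min(256, grid.size))
    lf_mask = (freqs >= 0.04) & (freqs < 0.15)
    hf_mask = (freqs >= 0.15) & (freqs < 0.40)
    lf = np.trapz(psd[lf_mask], freqs[lf_mask])  # integrate PSD in each band
    hf = np.trapz(psd[hf_mask], freqs[hf_mask])
    return lf, hf, lf / hf
```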
The main drawback of ECG-based HRV is related to the features of ECG, mainly the complexity of the sensors and the high requirements for the measurement procedure needed to minimize effects from the environment. An alternative to ECG-based HRV is photoplethysmography (PPG), a technique for detecting changes in microvascular blood volume in tissues. The principle of this technology is very simple: it requires only a light source and a photodetector. The light source illuminates the tissue, and the photodetector measures the small variations in transmitted or reflected light (Figure 8a,b) associated with changes in perfusion in the tissue [99].
The PPG signal (Figure 8b) consists of two main components:
(i) the static part of the signal depends on the structure of the tissue and the average arterial and venous blood volume, and it varies very slowly with respiration;
(ii) the dynamic part represents the changes in blood volume that occur between the systolic and diastolic phases of the cardiac cycle [104].
PPG signals, which are analog voltage values in the time domain, are analyzed using methods similar to those used for ECG-based HRV. The main difference is that the PPG signal is filtered with a high-pass filter before the peaks are defined and the HRV signal is formed, as in the sketch below.
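A minimal Python sketch of this PPG-to-HRV step: the slowly varying (static) component is removed with a high-pass filter before pulse peaks are detected. The 0.5 Hz cutoff and the 0.4 s minimum peak spacing are common choices, assumed here rather than taken from the cited works.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def ppg_pulse_peaks(ppg: np.ndarray, fs: float) -> np.ndarray:
    """High-pass filter a raw PPG trace and return pulse peak indices."""
    b, a = butter(2, 0.5 / (fs / 2), btype="highpass")
    pulsatile = filtfilt(b, a, ppg)          # dynamic component only
    peaks, _ = find_peaks(pulsatile, distance=int(0.4 * fs))
    return peaks  # np.diff(peaks) / fs gives the inter-beat intervals
```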
PPG can be performed using only one sensor attached to the finger, or using multiple sensors attached to the right and left ear lobes, index finger pads, and great toe pads [105].
There is a variety of studies proving the successful implementation of this technique and demonstrating its advantages compared to ECG [106,107]. In [105], a comparison between ECG and PPG signals (Figure 9) is presented which proves the strict relation between both signals. The PPTp and PPTf delays in a PPG signal represent the transition time until a pulse from the heart reaches the measuring point.
Recently, there has been increased interest in remote photoplethysmography (rPPG), whereby it is possible to recover the cardiovascular pulse wave by remotely measuring variations of back-scattered light, using only ambient light and low-cost vision systems [99]. Remote measurements significantly increase human comfort during the measurement procedure, but they decrease the signal-to-noise ratio and increase the need for more advanced signal processing and analysis algorithms. In [108], machine learning algorithms were implemented in order to increase the precision of HRV measurements performed by smart watches. The results of this research prove that machine learning is a useful tool for analyzing PPG measurement data and extracting the desired features.
Compared to EEG or ECG, HRV (especially based on the PPG technique) is a more comfortable, cheap, and quite universal method. Among the variety of possible measurement methods, [110] presented an HRV evaluation approach based on heart sound measurements, proving that heart sound correlates well with the RR interval from ECG. Another HRV measurement approach, using Doppler radar, was presented by Boris-Lubecke et al. [111]. In this case, a transceiver transmits a radio wave signal and receives a motion-modulated signal reflected from the human chest, which acts as the target. Considering that chest movement amplitude in the calm state is about 10 mm due to respiration and about 0.1 mm due to heart activity, it is possible to extract HRV features from the recorded response signal. A short review of studies focused on emotion recognition using HRV is presented in Table 6.
From Table 6, it is evident that HRV is a quite popular and powerful technique for emotion recognition. The results of the performed review show that, in contrast to the situation with EEG or ECG, the attention of researchers in this field is directed at the full development of the PPG and rPPG techniques, including novel configurations of wearable PPG sensors, improvement of signal analysis and measurement methods, and the exploration of new application fields.
The main advantage of PPG-based HRV is the absence of any requirement for special human preparation; usually, it is enough to touch the active surface of the sensor for a few seconds. The rPPG method additionally provides the possibility of non-contact measurements. Moreover, PPG equipment is so cheap and accessible that even the touchscreen of a common smartphone can be used as a PPG sensor. These features reveal the potential of this methodology for a wide area of applications, especially in human–machine interaction and the IoT, since sensors of this type can easily be installed into joysticks and other machine control devices, and can even be hidden from users.
In special cases, when many emotions must be distinguished or high detection accuracy is required, the HRV technique needs to be complemented by other techniques, such as ECG, GSR, and data fusion. Such situations develop high potential for the application of big data analysis techniques.

2.5. Respiration Rate Analysis (RR)

Respiratory monitoring data contains useful information about emotional states. Respiration velocity and depth usually vary with human emotion: deep and fast breathing indicates excitement accompanying happy, angry, or afraid emotions; shallow and fast breathing indicates tension; relaxed people often breathe deeply and slowly; shallow and slow breathing indicates a calm or negative state. A normal breathing rate in a calm state is about 20 breaths per minute, while in excitement it can reach 40–50 breaths per minute [118]. The respiration process is quite complex and affects a major part of the body, and for this reason many techniques for respiration evaluation exist. The main measurement methods fall into several groups according to their measurement principles:
  • manual or semi-automatic breath rate evaluation using simple timers or specialized software applications;
  • methods based on measurements of air humidity fluctuation in exhaled air;
  • methods based on measurements of temperature fluctuation in exhaled air;
  • measurements based on definition of air pressure variation due to respiration;
  • methods based on measurements of variation of carbon dioxide concentration;
  • measurements of variation of oxygen concentration;
  • methods based on measurements of body movements;
  • methods based on measurements of respiratory sounds.
Moreover, it is possible to extract the respiratory rate from ECG, PPG, or even blood pressure measurements. All the above-mentioned methods are explained in detail in [119]. Another very detailed review of respiration measurement methods, sensors, and signal processing techniques is provided in [120].
Despite the numerous methods for respiration rate measurement, this technique is implemented less often in the field of emotion recognition than the ECG, GSR, or HRV methods. The main obstacles limiting the application of respiration monitoring are caused by the nature of the signal. Although the breath rate depends on the emotional state, it can be affected by a variety of external factors, such as body movement or the level of fatigue, while environmental conditions, such as air temperature and humidity, can also influence the measurement results. Such a complex signal requires advanced signal processing techniques, like machine learning algorithms [118], or complementary measuring methods in order to extract the required information. The latter option is limited by the fact that the majority of measurement methods require contact sensors, creating discomfort and limitations for normal human activity.
The main advantage of emotion evaluation based on respiration rate analysis is the possibility of implementing non-contact measurement methods, unlike EEG or ECG, for example, measurements of body movement using video or thermal cameras. When a video camera is used, the signal showing respiration rate variation with respect to time is obtained by tracking the displacement of a reference point across sequentially recorded frames using video analysis algorithms. Thermal cameras define the respiration rate by analyzing temperature fluctuations near the mouth and nose caused by exhaled air.
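Once such a 1-D signal has been extracted, estimating the breathing rate is straightforward; a minimal Python sketch follows. The input could be the per-frame displacement of a chest reference point (video) or the mean temperature near the nostrils (thermal imaging); the 0.1–0.8 Hz search band (6–48 breaths/min) is our assumption.

```python
import numpy as np

def respiration_rate_bpm(signal: np.ndarray, fs: float) -> float:
    """Breathing rate = dominant frequency of the motion/temperature signal."""
    sig = signal - np.mean(signal)           # remove DC offset
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(sig.size, d=1 / fs)
    band = (freqs >= 0.1) & (freqs <= 0.8)   # plausible breathing band
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

fs = 30.0                                    # typical camera frame rate
t = np.arange(0, 60, 1 / fs)
chest = np.sin(2 * np.pi * 0.33 * t)         # ~20 breaths per minute
print(respiration_rate_bpm(chest, fs))       # -> ~20.0
```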
A short review of studies using respiration rate analysis for emotion recognition is provided in Table 7.
The analysis of the papers provided in Table 7 leads to the conclusion that emotion recognition based on respiration rate evaluation is used as a complementary method reinforcing other emotion recognition and evaluation methods. Although this method is not frequently used and carries some functional limitations, it can be successfully applied in cases where the subject takes a fixed position and does not change it significantly during the monitoring period, for example, when operating technological machines or driving. In 2005, Healey and Picard [126] presented methods for collecting and analyzing physiological data during real-world driving tasks to determine a driver’s relative stress level. ECG, electromyogram, GSR, and respiration were recorded continuously while drivers followed a set route on open roads. Task design analysis recognized driver stress level with an accuracy of over 97% across multiple drivers and driving days. Given this successful example, respiration rate analysis complemented by other methods may in the future become a big player in emotion analysis within the field of human–machine interaction.

2.6. Skin Temperature Measurements (SKT)

The best biosignals for automatic emotion recognition are those that represent reactions of the autonomic nervous system, which is beyond human control. Skin temperature is one such parameter, related to human heart activity and the sweat reaction. The thermal radiation of a cutaneous surface depends on the perfusion controlled by the autonomic nervous system, which controls the vessels that irrigate the skin. Although the parasympathetic system has an influence through the endothelial cells (in body places like the palmar and plantar surfaces, the tip of the nose, and sensitive points on the face), vasomotion is principally regulated by sympathetic noradrenergic fibers, whose activation leads to vasoconstriction and a decrease in local temperature [127]. The results in [128] prove a good correlation between skin-surface temperature and fingertip blood flow. In [129], it was found that finger temperature varies with emotional state and applied stimulus. Emotions like stress with predominant anxiety, anger, embarrassment, humiliation, joy with anxiety, depression with hostility, guilt, fear of abandonment, or fear of conflict over the use of the hands for aggressive and sexual purposes cause a decrease in finger temperature. In cases when a patient was not involved in action but only affected by the speech of another human, emotional reactions such as anger and anxiety also produced a fall in finger temperature. In addition, a fall in temperature was detected in situations which disturb human safety devices. Similar results were reported in [130], where the skin temperature of patients was found to be higher for the expression of low-intensity negative emotions than for the expression of low-intensity positive emotions.
The temperature measurement methods most often used in the literature are contact methods, based on various semiconductor sensors [117], and non-contact methods, based on face or full-body thermal imaging using infrared cameras [127]. A typical example of skin temperature changes due to an applied stimulus is provided in Figure 10.
An advantage of SKT is the possibility of non-contact measurement, which provides high comfort for patients and eliminates the Hawthorne effect (people behave differently while being observed). Moreover, SKT can be used to evaluate emotions not only in humans but also in animals. In [131,132], the dependence of body temperature on stress in animals is discussed; these studies assert that, under stress, animals experience a rise in temperature.
The main drawback of the SKT technique is its quite large latency compared to the previously described methods. This creates some limitations: a stimulus is required that lasts some amount of time and causes intense emotions. Because of this, SKT is well suited to evaluating longer actions like songs or advertising videos, but it is not the best choice for evaluating pictures or situations which disappear in a short period of time. The inability to recognize an exact emotion is a further drawback of this method; it can be compensated for by combining SKT with other techniques, but the result will usually be less reliable than other methods. In [133], research is presented where the input signals were the electrocardiogram, skin temperature variation, and GSR, all of which can be acquired from the body surface without much discomfort and reflect the influence of emotion on the autonomic nervous system. A support vector machine was adopted as the pattern classifier. The correct classification ratios for 50 subjects were 78.4% and 61.8% for the recognition of three (sadness, anger, stress) and four (sadness, anger, stress, surprise) emotion categories, respectively.
A short summary of researches in which SKT was implemented is provided in Table 8.
From the information provided in Table 8, it is seen that SKT is a popular emotion evaluation method, which combines well with other methods and, in the case of contact measurements, does not require complicated measurement equipment. It is well suited to cases where high recognition precision is not required. The results of the performed review show that research in this field focuses on the development and evaluation of new emotion recognition methodologies based on SKT measurements and on the improvement of non-contact emotion evaluation techniques able to define one or a few intense emotions. Such methods have great potential in future smart applications and can be useful in medicine, tutoring, human–machine interaction, etc.

2.7. Electromyogram (EMG)

Electromyography is a technique for evaluating and recording the electrical potential generated by muscle cells [142]. In medicine, this test is used to detect neuromuscular abnormalities; in the emotion recognition field, it is used to find the correlation between cognitive emotion and physiological reactions [142]. A majority of EMG-based studies focus on the analysis of facial expressions, following the hypothesis that facial mimicry contributes to the emotional response to various stimuli. This hypothesis was first announced by Ekman and Friesen in 1978 [143], who described the dependencies between simple emotions, facial muscles, and the actions they cause (Table 9). Depending on the purpose of the analysis, the activity of selected facial muscles (most often the occipitofrontalis, corrugator supercilii, levator labii superioris, zygomaticus major, and orbicularis oculi) can be recorded [144].
The EMG procedure is performed by measuring voltages between special electrodes and is usually done in two steps. In the first step, a baseline is defined (the voltage level when the human is in a calm state) [147]; this level is unique for each person and depends on multiple factors. In the second step, the response to a stimulus is measured, and the caused effect is evaluated as a ratio between the baseline and the measured value, as in the sketch below.
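A minimal Python sketch of this two-step procedure follows. RMS amplitude is a common EMG measure, used here as an assumption; the exact metric in the cited works may differ.

```python
import numpy as np

def rms(x: np.ndarray) -> float:
    """Root-mean-square amplitude of an EMG segment."""
    return float(np.sqrt(np.mean(np.square(x))))

def emg_response_ratio(baseline: np.ndarray, response: np.ndarray) -> float:
    """Step 2: stimulus response expressed against the personal baseline."""
    return rms(response) / rms(baseline)  # >1 indicates muscle activation
```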
Typical electrode locations for facial EMG are shown in Figure 11. The huge variety of electrodes can be classified into a few groups according to their properties [148]. There are two main types of EMG electrode: surface (or skin) electrodes and inserted electrodes. Inserted electrodes are further classified into two types: needle and fine wire electrodes (Figure 12a,b). Surface electrodes can also be classified into two types: dry electrodes and gelled electrodes (Figure 12c,d).
Needle electrodes are most often used in medical applications. They consist of a wire insulated by a special thin tube, with only the end point of the electrode acting as the active contact surface. The advantages of these electrodes include a good signal-to-noise ratio and the possibility of taking a precise readout from a relatively small area. Wire electrodes can be made from any small-diameter, highly non-oxidizing, stiff wire with insulation. Wire electrodes are extremely fine, can be implanted more easily, and are less painful compared to needle electrodes.
Gelled surface electrodes contain a gelled electrolytic substance, which allows an electric current from the muscle to pass across the junction between skin, electrolyte, and electrode. Silver chloride (Ag-AgCl) gelled electrodes are used most often. Dry EMG electrodes do not require a gel interface between the skin and the detecting surface. Dry electrodes are usually heavier (>20 g) than gelled electrodes (<1 g), and because of this, special material is required to fix the electrode on the skin. The main advantages of surface electrodes over needle electrodes are that they are reusable and allow non-invasive measurements. Moreover, universal electrodes [150] can be used for EMG, EEG, and ECG procedures just by changing the electrode location and data acquisition device. The drawbacks of surface electrodes are the strict requirements for skin preparation (shaved hair, degreased skin), a larger measurement area, and a poorer signal-to-noise ratio.
The measurement procedures of EMG, EEG, and ECG are similar, but EMG is seldom used in scientific research. The main limitation of EMG is its sensitivity to emotion intensity: it is a very good technique for detecting strong emotions, but small changes in valence and arousal cannot be detected, since facial expressions change only due to strong emotions [151]. The second limitation is the same as for EEG and ECG: the procedure requires contact measurement methods, which affects the comfort level of the person and limits their casual activity. In addition, EMG (especially with surface electrodes), similarly to ECG, raises requirements for the room in which the procedure is performed: it must be protected from direct sunlight and from electromagnetic noise. Direct sunlight can cause uncontrolled movement of the facial muscles, while electromagnetic noise can increase the noise level and destroy the measurement signals. An advantage of EMG compared to EEG and ECG is the relatively simple analysis of the signal: since different muscles or muscle groups are affected by different emotions, separate emotions can be defined more easily by analyzing the recorded signals.
In terms of emotion recognition precision, EMG gives better results than SKT. In [122], a methodology and a wearable system for evaluating the emotional states of car-racing drivers are presented. The proposed approach assesses emotional states using facial electromyograms, the electrocardiogram, respiration, and GSR. The emotional classes identified are high stress, low stress, disappointment, and euphoria. Support vector machines (SVMs) and an adaptive neuro-fuzzy inference system (ANFIS) were used for classification. The overall classification rates achieved using tenfold cross-validation are 79.3% and 76.7% for the SVM and the ANFIS, respectively.
A short summary of studies implementing the EMG methodology is provided in Table 10. In the majority of cases, EMG is used in combination with other methods, and researchers have focused on the development of emotion recognition methods and data classification and analysis techniques.

2.8. Electrooculography (EOG)

Electrooculography is a technique for measuring the corneo-retinal standing potential that exists between the front and the back of the human eye. Its primary applications are in ophthalmological diagnosis and in recording eye movements [156]. To measure eye movement, pairs of electrodes are typically placed either above and below the eye or to the left and right of the eye (Figure 13a). If the eye moves from the center position toward one of the two electrodes, an electrical potential appears between the electrodes which corresponds to the eye’s position (Figure 13b) [157]. The idea of implementing EOG for emotion recognition is based on the same hypothesis as EMG, and EOG is often used as a complementary technique. EOG in most cases relies on the detection of eye blinking and is useful for detecting emotions such as stress or surprise [158]. EOG is also useful for assessing fatigue, concentration, and drowsiness [159]. A comparison between response signals from EMG and EOG is provided in Figure 14.
EOG can be applied using contact and non-contact measurement techniques. Contact measurements can be performed using EMG electrodes and the same equipment. Non-contact measurements can be performed using video-camera-based videooculography (VOG) systems or infrared-camera-based infrared oculography (IROG) [157].
From Figure 14, it is evident that the EOG signal correlates with EMG on the time scale, but the signal amplitude is much smaller, and some latency between the vertical and horizontal EOG is noticeable. The complexity of EOG signal processing depends on the measurement method and on the information to be extracted from the signal. The simplest case is blink detection, since blinks appear as peaks in the time-domain signal recorded from the electrodes (see the sketch below). Detecting the exact eye position requires recording a baseline and conscious eye movements in order to relate the voltage variation to the position of the eye (Figure 14). The extraction of time-dependent features requires analysis in the frequency domain, for example, FFT or wavelet transformation [163]. In the case of non-contact measurement, the method relies on vision-based object detection and tracking algorithms.
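A minimal Python sketch of the simplest case, blink counting on the vertical EOG channel. The threshold and the 0.2 s minimum spacing between blinks are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def count_blinks(veog: np.ndarray, fs: float) -> int:
    """Count blinks as large, well-separated peaks in the vertical EOG."""
    height = np.mean(veog) + 3 * np.std(veog)   # blinks stand out strongly
    peaks, _ = find_peaks(veog, height=height, distance=int(0.2 * fs))
    return len(peaks)
```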
Compared to EMG, EOG is a less powerful technique (it can recognize fewer simple emotions), but it provides the possibility of non-contact measurements. The disadvantages of EOG and EMG are quite similar: if the procedure is performed in a natural office environment, eye movement can be caused by unrelated external effects, such as bright sunlight, noise, or the influence of other persons.
A short summary of studies implementing EOG as a methodology is provided in Table 11.
The scientific research presented in Table 11 applies EOG technology to emotion recognition and intensity evaluation. A majority of the techniques separate positive and negative emotion levels (see [164,165,166]). The evaluation of the emotion intensity level remains uncertain in many cases and is not comprehensively described. Video-based systems demonstrate great potential for implementation due to widespread hardware availability, as well as the possibility of off-line analysis of existing video material.

2.9. Facial Expressions (FE), Body Posture (BP), and Gesture Analysis (GA)

In the past decade, there has been a noticeable increase of interest in emotion recognition methods based on the analysis of facial expressions, body posture, and gestures, possibly explained by recent advances in computer vision systems. These methods are based on the same hypothesis [143] as EMG, claiming that body postures and gestures are also involved in the expression of emotions [169,170] and are suitable for recognizing the same elementary emotions. A common assumption is that body language is just a different way to express the same basic emotions as, e.g., facial motion. Moreover, the same muscles are used to express emotions in widely different cultures [171]. A summary of the main relations between body postures and emotions is provided in Table 12.
The advantages and difficulties of these methods stem from the fact that the human body contains plenty of reference points which should be monitored. One of the most advanced commercially available systems, the Facial Action Coding System (FACS) from Imotion [172], uses various combinations of 64 parameters for emotion recognition. Measurements of facial expressions, body posture, and gestures are usually performed using computer vision systems and analysis algorithms which can track the movements of selected reference points. Such a measurement technique is attractive in the emotion recognition field, since it allows non-contact measurements and produces quite reliable results. References [173,174] present research cases where emotion recognition accuracy in a random scenario was 60–86%.
Since the above-presented methods are based on the same hypothesis as EMG, similar limitations exist:
(i) they recognize only strong emotions which last some amount of time; responses to weak emotions or to very short, non-intense stimuli do not create noticeable facial movements or changes in body posture;
(ii) the possibility exists that changes in human motion or facial expressions are due to environmental effects.
There are also a few drawbacks related to the measurement methods:
(i) a huge amount of data is created when tracking many reference points;
(ii) when tracking body posture, it is difficult to define the exact position of a reference point covered by clothes; in this case, special markers for the vision systems should be used.
Despite the mentioned drawbacks, facial expression, body posture, and gesture tracking remain promising techniques in the emotion recognition field, especially taking into account recent advances in computer vision systems, big data analysis, and machine learning techniques. A short summary of research analyzing facial expressions, body posture, and gestures is provided in Table 13.
From Table 13, it is evident that, in the majority of studies, facial expression, body posture, and gesture analysis methods were used together, and were even complemented by other techniques in order to improve recognition accuracy. The literature also presents cases where these methods were combined with less common techniques, for example, speech analysis [175,177,178].
Comparing facial expression, body posture, and gesture analysis with the previously described methods, it can be stated that these methods are the most promising for future applications, especially in practical cases which do not require extremely high accuracy and sensitivity. Their wide applicability, the large number of measurable parameters, and advances in video analysis and large-scale data processing allow a multimodal approach to be implemented.
One of the most promising implementations of facial expression analysis is the Internet of Things. IoT objects, which respond to users’ emotional states, can be used to create more personalized user experiences. The IoT covers fields as diverse as medicine, advertising, robotics, virtual reality, diagnostic software, driverless cars, pervasive computing, affective toys, gaming, education, working conditions and safety, automotive industry, home appliances, etc., which will significantly benefit from emotion-sensing technology.
The analysis of facial expressions, body movements, and gestures represents a contactless method applicable to mass and individual emotion recognition. Data collection is simple for all these techniques, but they require sophisticated analysis of video frames in dynamics and of sculpted surfaces for static frame content. Methods for facial recognition and gesture recognition are typically separate, but the source material for them is the same. The methods are available in real-time systems and in off-line mode; therefore, in-depth analysis of the development of reactions over time and of a subject’s progression is easily possible. As video-based systems, these techniques demonstrate the potential to grow in the future.
All the positive features of the mentioned methods are limited by the great amount of data, which raises requirements for data storage, processing, and cross-analysis time. Cloud computing and the IoT offer some solutions, but mobile data transmission can currently be a bottleneck. Another drawback is low accuracy in defining the intensity level, as temperament is a key parameter in gesture and facial expression.

3. Signal Analysis and Features Extraction Methods

Reliability, precision, and speed of emotion evaluation strongly depend not only on the measurement method and sensor used, but also on the applied signal processing and analysis technique. In this section, we provide a review of the most commonly used signal analysis and feature extraction methods (Table 14).
The studies in [184,185,186,187,188,189,190,191,192] analyzed various emotions using different measurement methods and feature extraction techniques. For example, the authors of [184] introduced the overall paradigm for their multimodal system, which aims at recognizing its users’ emotions and responding to them accordingly depending upon the current context or application. They described the design of their emotion elicitation experiment, collecting physiological signals of the autonomic nervous system (GSR, HRV, SKT) via wearable computers and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). They showed the results of three different supervised learning algorithms that categorize the collected signals in terms of emotions and generalize their learning to recognize emotions from new collections of signals. Overall, the three algorithms, namely k-nearest neighbors (KNN), discriminant function analysis (DFA), and the Marquardt backpropagation algorithm (MBP), could categorize emotions with 72.3%, 75.0%, and 84.1% accuracy, respectively.
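To make the classification step concrete, the following minimal Python sketch trains a k-nearest-neighbors classifier on feature vectors of the kind described above (this is not the authors’ implementation; the data here is random and stands in for real features extracted from GSR, HRV, and SKT recordings).

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Stand-in data: 120 recordings, 6 physiological features each, 3 emotion
# labels (e.g., sadness/anger/fear). Real pipelines would substitute
# features computed from the measured signals.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))
y = rng.integers(0, 3, size=120)

knn = KNeighborsClassifier(n_neighbors=5)
scores = cross_val_score(knn, X, y, cv=5)       # 5-fold cross-validation
print(f"mean accuracy: {scores.mean():.3f}")    # ~chance on random data
```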
Li and Chen [185] proposed to recognize emotions using physiological signals obtained from the body surfaces of multiple subjects. Four physiological signals, namely ECG, SKT, GSR, and respiration rate, were selected for feature extraction. Canonical correlation analysis was adopted as the pattern classifier, and the correct classification ratio was 85.3%. The classification rates for fear, neutrality, and joy were 76%, 94%, and 84%, respectively.
A new approach to enhancing driving safety via multimedia technologies, by recognizing and adapting to drivers’ emotions with multi-modal intelligent car interfaces, is presented in [192]. A controlled experiment was designed and conducted in a virtual reality environment in order to collect physiological data signals (GSR, HRV, and SKT) from participants who experienced driving-related emotions and states (neutrality, panic/fear, frustration/anger, and boredom/sleepiness). KNN, MBP, and resilient backpropagation (RBP) algorithms were implemented to analyze the collected signals and to find unique physiological patterns for each emotion. RBP was the best of the three classifiers, with 82.6% accuracy, followed by MBP with 73.26% and KNN with 65.33%.
The authors of [186] presented an artificial-intelligence-based system that can detect the early onset of fatigue in drivers using HRV as the human physiological measure. The detection performance of the neural network was tested using a set of ECG data recorded under laboratory conditions. The neural network achieved an accuracy of 90%.
The authors of [187] presented the classification of three emotions (boredom, pain, and surprise) using four machine learning algorithms: linear discriminant analysis (LDA), classification and regression tree (CART), self-organizing map (SOM), and support vector machine (SVM). GSR, ECG, HRV, and SKT physiological signals were acquired for one minute before the emotional state, as a baseline, and for 1–1.5 min during the emotional states. For emotion classification, the difference values obtained by subtracting the baseline from each feature of the emotional state were fed to the machine learning algorithms. SVM yielded the highest classification accuracy. LDA classified all emotions with 78.6% accuracy (boredom 77.3%, pain 80.0%, and surprise 78.6%); CART achieved 93.3% overall (boredom 94.3%, pain 95.9%, and surprise 90.1%); SOM achieved 70.4% overall (boredom 80.1%, pain 65.1%, and surprise 66.2%); finally, SVM reached an accuracy of 100.0%.
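The baseline-subtraction step, which removes much of the between-subject offset before classification, can be sketched as follows. The arrays are synthetic and the RBF-kernel SVM is only one reasonable choice; nothing below is code from [187].

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic per-trial feature matrices: one row per trial,
# columns are features derived from GSR, ECG, HRV, and SKT.
rng = np.random.default_rng(1)
baseline = rng.normal(size=(60, 8))                         # 1 min rest
emotional = baseline + rng.normal(0.3, 0.5, size=(60, 8))   # during stimulus
labels = rng.integers(0, 3, size=60)  # 0=boredom, 1=pain, 2=surprise

# Key step from the study: classify the *difference* from baseline,
# which suppresses stable between-subject offsets.
X = emotional - baseline

clf = SVC(kernel="rbf").fit(X, labels)
print(clf.score(X, labels))  # training accuracy of the sketch (optimistic;
                             # held-out data is needed for a real estimate)
```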
A user-independent emotion recognition method, with the goal of recovering affective tags for videos using electroencephalogram, pupillary response, and gaze distance, is presented in [188]. Initially, 20 video clips with extrinsic emotional content were selected from movies and online resources. The ground truth was defined based on the median arousal and valence scores given to the clips in a preliminary study using an online questionnaire. Based on the participants’ responses, three classes were defined for each dimension: the arousal classes were calm, medium aroused, and activated, while the valence classes were unpleasant, neutral, and pleasant. The best classification accuracies of 68.5% for the three valence labels and 76.4% for the three arousal labels were obtained using a modality fusion strategy and an SVM.
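A simple feature-level fusion strategy of this kind can be sketched by concatenating the per-modality feature vectors before training the SVM. The modalities, feature counts, and data below are illustrative assumptions, not taken from [188].

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)
eeg_feats = rng.normal(size=(90, 32))    # e.g., band powers per channel
pupil_feats = rng.normal(size=(90, 4))   # e.g., pupil diameter statistics
gaze_feats = rng.normal(size=(90, 2))    # e.g., gaze-distance statistics
y = rng.integers(0, 3, size=90)          # calm / medium aroused / activated

# Feature-level fusion: one concatenated vector per video clip.
X = np.hstack([eeg_feats, pupil_feats, gaze_feats])

model = make_pipeline(StandardScaler(), SVC(kernel="linear"))
# Near chance level here because the labels are random; real features
# extracted from the recordings would be needed for meaningful accuracy.
print(cross_val_score(model, X, y, cv=5).mean())
```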
In [189], a specific emotion induction experiment was presented to collect five physiological signals from subjects, including ECG, GSR, blood volume pulse, and pulse rate. The support vector regression (SVR) method was used to train the trend curves of three emotions (sadness, fear, and pleasure). Experimental results show that the proposed method achieves a recognition rate of up to 89.2%.
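A minimal SVR sketch in the same spirit, regressing a continuous emotion-intensity value on physiological features, is shown below; the synthetic data and hyperparameters are assumptions, not the protocol of [189].

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 5))  # features from ECG, GSR, BVP, pulse (synthetic)
# Synthetic ground-truth intensity: a linear trend plus noise.
intensity = X @ np.array([0.5, -0.2, 0.3, 0.1, 0.0]) + rng.normal(0, 0.1, 100)

# SVR fits a continuous intensity curve rather than discrete classes,
# which suits tracking how an emotion (e.g., fear) builds over time.
svr = SVR(kernel="rbf", C=1.0, epsilon=0.05).fit(X, intensity)
print(svr.predict(X[:3]))
```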
The authors of [190] proposed and investigated a methodology to determine the emotional aspects attributed to a set of computer-aided design (CAD) tasks by analyzing CAD operators’ psycho-physiological signals. EEG, GSR, and ECG signals were recorded along with a log of CAD system user interactions. A fuzzy logic model was established to map the psycho-physiological signals to a set of key emotions, namely frustration, satisfaction, engagement, and challenge, and the results were analyzed. The correlations between the fuzzy model outputs and the reported emotions are 84.18% (frustration), 76.83% (satisfaction), 97% (engagement), and 97.99% (challenge), respectively.
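The core of such a fuzzy mapping can be sketched with hand-written triangular membership functions and a single rule. The input ranges, the rule, and the frustration output below are invented for illustration and are not the model of [190].

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function rising on [a, b] and falling on [b, c]."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def frustration_degree(gsr_level, eeg_beta):
    """Degree of membership in 'frustrated' from two normalized inputs."""
    gsr_high = trimf(gsr_level, 0.4, 1.0, 1.6)   # invented range
    beta_high = trimf(eeg_beta, 0.4, 1.0, 1.6)   # invented range
    # Single illustrative rule: IF GSR is high AND EEG beta power is high
    # THEN frustration is high; min() implements the fuzzy AND.
    return min(gsr_high, beta_high)

print(frustration_degree(0.8, 0.9))  # ~0.67, i.e., moderately frustrated
```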
A novel approach to the multimodal fusion of information from a large number of channels, used to classify and predict emotions, is presented in [191]. The multimodal physiological signals are 32-channel EEG, eight-channel GSR, blood volume pressure, respiration pattern, SKT, EMG, and EOG. Experiments were performed to classify different emotions (terrible, love, hate, sentimental, lovely, happy, fun, shock, cheerful, depressing, exciting, melancholy, and mellow) with four classifiers. The average accuracies are 81.45%, 74.37%, 57.74%, and 75.94% for the SVM, MLP, KNN, and MMC classifiers, respectively. The best accuracy, 85.46%, is obtained for ‘depressing’ using SVM.
An evaluation of the accuracy of the reviewed methods, presented in Table 14, highlights three technologies with the best emotion recognition accuracy, namely ECG, EEG, and GSR. The noticeable positive influence of implementing fuzzy logic for recognizing emotion intensity levels is supported by respectable research [190]. Increasing the number of recognized emotions sharply decreases the quality and reliability of recognition, which suggests a new roadmap for planning the emotion recognition process. The recognition accuracy for three emotions can reach even 100%, but their intensity levels are still not uniquely defined. Research confirmed that, in the majority of methods, negative emotions are recognized better than positive ones.

4. Discussion

The selection of measurement methods and sensors is a complex process that raises a large set of questions. There are multiple choices of physiological parameters to measure, as well as of physical principles for obtaining the signals. The measurement technology associated with particular sensors creates a huge space of options to select from. Multiple attempts have been made to classify emotions and sensors and to define universal selection algorithms; some are presented in [193], and with the current proposal we try to fill the remaining gap. In contrast to physical measurements, deriving emotions from measurements of human body parameters is a tough problem. Basic sensor selection methods therefore remain unclear due to the lack of classification methods and of functional relations between sensors and the desired emotions.
We conclude by providing a classification of emotion recognition measurement methods (Figure 15), which allows the selection procedure proposed in [193] to be carried out in two steps. The first step consists of selecting the measurement parameters and methods, while the second step covers the selection of sensors.
We assume that, at the beginning of method selection, it is necessary to define whether we are interested in a conscious response, an unconscious response, or both simultaneously. Research based on conscious responses is relatively simple and does not require any special hardware, but it demands considerable attention to questionnaire preparation. Moreover, results from self-evaluation are not very reliable: a person may not recognize their own emotions correctly or may provide imprecise answers to uncomfortable questions. Methods based on unconscious responses usually provide more reliable results, but they require repeated measurement procedures and place high requirements on hardware.
Methods based on unconscious responses offer many choices, and we propose first selecting between electrical and non-electrical parameter measurement. Since all reactions in the human body are controlled by electric signals generated in the central nervous system, electric parameters can be regarded as primary entities that give the most precise results, whereas measurements of non-electrical signals capture reactions of the human body driven by those electric signals. On the other hand, electrical signals can be measured only with contact measurement methods; although the signal can be sent to an acquisition unit using wireless techniques, some limitations on human activity during the measurement procedures remain.
Measurements of electrical parameters can follow two approaches: methods based on direct (self-generating) sensors, which measure a signal created by the central nervous system (EEG, ECG, HRV, EMG, EOG), or methods based on modulating sensors, in which changes in the human body modulate the properties of the sensor (GSR). From theory, it is known that direct sensors are more precise [193], but they can slightly attenuate the signal (especially a signal with small amplitude) since they draw part of its power. Modulating sensors, on the other hand, exhibit some latency, which depends on the properties of the individual sensor.
Measurement methods based on non-electrical parameters usually suffer from limited accuracy and from latency, but their main advantage is the possibility of non-contact measurement that does not limit human activity; they are therefore better suited to field applications and to approximate evaluation of the emotional state.
Recent research in the field of emotion recognition shows that no single method is ideal for every case; the best solution is multimodal analysis, as presented in [194] or [195], in which several methods complement each other and achieve higher reliability of the obtained results.
A noticeable methodological problem across all emotion recognition techniques is the lack of a unified conception of the dataset. Researchers choose control group sizes and compositions, experiment durations, and periods arbitrarily or according to availability. The features differ in each emotion recognition methodology, but reliable standards covering dataset issues still beg for definition. Such standards would free the unnecessary resources currently spent on obtaining research results of reliable quality.
Signal processing and analysis techniques also play important roles in the selection of methods and sensors. In the majority of cases, the effectiveness of emotion recognition depends on the applied procedures of signal processing and analysis. For example, from ECG data it is possible to extract information about HRV and respiration rate. Recent research shows that the most powerful techniques applied for emotion recognition are multi-criteria analyses based on statistical methods (ANOVA) or on machine learning algorithms.
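As an illustration of such extraction, the following sketch detects R-peaks in an ECG trace and computes two standard time-domain HRV measures (SDNN and RMSSD). The signal is synthetic and perfectly periodic, so both measures come out near zero; a real recording would show beat-to-beat variability and would additionally need band-pass filtering, so the thresholds below are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 250                                    # sampling rate, Hz (assumed)
t = np.arange(0, 30, 1 / fs)
# Synthetic ECG-like signal: sharp "R-peaks" about once per second.
ecg = np.sin(2 * np.pi * 1.0 * t) ** 63

# R-peak detection; height/distance thresholds depend on the recording.
peaks, _ = find_peaks(ecg, height=0.5, distance=0.4 * fs)

rr = np.diff(peaks) / fs * 1000.0           # RR intervals in ms
sdnn = rr.std(ddof=1)                       # overall HRV
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # short-term (beat-to-beat) HRV
print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```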
Summing up, we can state that interest in emotion recognition and in the practical implementation of this technique is steadily increasing and keeps finding new areas of application. The detailed research available in public sources is mostly focused on the physiological side of the subject. We found a lack of research and of unified classification focused on the engineering side of the question: for example, there is missing rationale related to measurement methods and measurement uncertainties, and no clear specification of which method, sensor, processing, and analysis techniques are best suited for recognizing a particular emotion.
Such analysis can provide a background for future emotion recognition systems. The sensors and methods for human emotion recognition, along with computer vision, speech recognition, deep learning, and related technologies, have demonstrated tremendous progress in the IoT field. Due to this, the understanding of human emotions has also progressed markedly [29].
Affective computing constitutes the study and development of systems and devices for recognizing, interpreting, processing, and simulating human affects [196]. Machinery for recognizing, expressing, modelling, communicating, and responding to information about emotions, as well as certain affective computing instances, has been built globally by numerous researchers [197]. Advances in affective computing technology have made possible an innovative understanding of the self and better, more advanced human communication. This promises new technologies that reduce stress rather than increase it. As is popularly said, management requires measurement. The real-time skills afforded to computers are complex and challenging; they allow a greater understanding of, and intelligent responses to, human emotions, which are complicated but occur, and are expressed, naturally. Their range of use covers a number of fields in the human sciences, such as neuroscience, physiology, and psychology [198]. The state of the art in multi-modal affect analysis frameworks, however, lacks a comprehensive discussion in surveys of the available scholarly literature [199].
The realization of mood-sensor technology is expected in the coming years. Efforts in research and development will increasingly aim at contactless technology for measuring emotions, despite the on-body devices and/or voice/facial recognition software currently required by most existing human emotion recognition sensors, methods, and technologies. Regardless of these efforts, however, movement towards the humanization of the IoT with human emotion recognition methods and/or sensors is at present very slow [29]. Consequently, this research represents an attempt to introduce the idea of humanizing the IoT and affective computing systems, by applying human emotion recognition sensors and methods, to the academic and business communities. This is confirmed by the IoT and affective computing systems developed by the authors of this research [25,26,27,28].

5. Conclusions and Future Trends

Emotion recognition is a powerful and very useful technique for evaluating human emotional states and predicting behavior, for example, in order to provide the most suitable advertising material in the fields of marketing or education. In addition, emotion recognition and evaluation are very useful in the development of various human–machine interaction systems.
Relations between particular emotions and human body reactions have long been known, but many uncertainties remain in selecting measurement and data analysis methods. Eight methods dominate the field, based on measurements of various parameters, together with a great variety of data analysis methods and attempts at practical application.
In this review, we examined more than 160 scientific articles and provided a classification of AEE methods, with a summarized description of common emotion recognition methods and of various attempts to improve the reliability of their results. This paper also provides an engineering view of AEE methods and their reliability, sensitivity, and stability.
In the near future, combining these methods with machine learning for data analysis promises to be extremely powerful and should create breakthroughs in practical applications in all fields, from advertising and marketing to industrial engineering.

Author Contributions

Conceptualization, A.K. and V.B.; methodology, A.D. and V.B.; investigation, A.D.; resources, A.K.; writing, A.D.; writing—review and editing, V.B.; supervision, V.B.; project administration, A.K.; funding acquisition, A.K. All authors have read and agreed to the published version of the manuscript.

Funding

This project has received funding from the European Regional Development Fund (project No. 01.2.2-LMT-K-718-01-0073) under a grant agreement with the Research Council of Lithuania (LMTLT).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rattanyu, K.; Ohkura, M.; Mizukawa, M. Emotion Monitoring from Physiological Signals for Service Robots in the Living Space. In Proceedings of the ICCAS 2010, Gyeonggi-do, Korea, 27–30 October 2010; pp. 580–583. [Google Scholar]
  2. Byron, K.; Terranova, S.; Nowicki, S. Nonverbal Emotion Recognition and Salespersons: Linking Ability to Perceived and Actual Success. J. Appl. Soc. Psychol. 2007, 37, 2600–2619. [Google Scholar] [CrossRef]
  3. Feidakis, M.; Daradoumis, T.; Caballe, S. Emotion Measurement in Intelligent Tutoring Systems: What, When and How to Measure. In Proceedings of the 2011 Third International Conference on Intelligent Networking and Collaborative Systems, IEEE, Fukuoka, Japan, 30 November–2 December 2011; pp. 807–812. [Google Scholar]
  4. Mandryk, R.L.; Atkins, M.S.; Inkpen, K.M. A continuous and objective evaluation of emotional experience with interactive play environments. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI’06), Montréal, QC, Canada, 22–27 April 2006; ACM Press: New York, NY, USA, 2006; p. 1027. [Google Scholar]
  5. Sosnowski, S.; Bittermann, A.; Kuhnlenz, K.; Buss, M. Design and Evaluation of Emotion-Display EDDIE. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 3113–3118. [Google Scholar]
  6. Ogata, T.; Sugano, S. Emotional communication between humans and the autonomous robot which has the emotion model. In Proceedings of the 1999 IEEE International Conference on Robotics and Automation (Cat. No.99CH36288C), Detroit, MI, USA, 10–15 May 1999; Volume 4, pp. 3177–3182. [Google Scholar]
  7. Malfaz, M.; Salichs, M.A. A new architecture for autonomous robots based on emotions. IFAC 2004, 37, 805–809. [Google Scholar] [CrossRef]
  8. Delkhoon, M.A.; Lotfizadeh, F. An Investigation on the Effect of Gender on Emotional Responses and Purchasing Intention Due to Advertisements. UCT J. Soc. Sci. Humanit. Res. 2014, 2, 6–11. [Google Scholar]
  9. Singh, J.; Goyal, G.; Gill, R. Use of neurometrics to choose optimal advertisement method for omnichannel business. Enterp. Inf. Syst. 2019, 1–23. [Google Scholar] [CrossRef]
  10. Chung, W.J.; Patwa, P.; Markov, M.M. Targeting Advertisements Based on Emotion. U.S. Patent Application No 12/958,775, 7 June 2012. [Google Scholar]
  11. D’Mello, S.K.; Craig, S.D.; Gholson, B.; Franklin, S.; Picard, R.W.; Graesser, A.C. Integrating Affect Sensors in an Intelligent Tutoring System. In Proceedings of the 2005 International Conference on Intelligent User Interfaces, San Diego, CA, USA, 10–13 January 2005. [Google Scholar]
  12. Woolf, B.P.; Arroyo, I.; Cooper, D.; Burleson, W.; Muldner, K. Affective Tutors: Automatic Detection of and Response to Student Emotion; Springer: Berlin/Heidelberg, Germany, 2010; pp. 207–227. [Google Scholar]
  13. Scotti, S.; Mauri, M.; Barbieri, R.; Jawad, B.; Cerutti, S.; Mainardi, L.; Brown, E.N.; Villamira, M.A. Automatic Quantitative Evaluation of Emotions in E-learning Applications. In Proceedings of the 2006 International Conference of the IEEE Engineering in Medicine and Biology Society, New York, NY, USA, 30 August–3 September 2006; pp. 1359–1362. [Google Scholar]
  14. Kolakowska, A.; Landowska, A.; Szwoch, M.; Szwoch, W.; Wrobel, M.R. Emotion recognition and its application in software engineering. In Proceedings of the 2013 6th International Conference on Human System Interactions (HSI), Gdansk, Sopot, Poland, 6–8 June 2013; pp. 532–539. [Google Scholar]
  15. Guo, F.; Liu, W.L.; Cao, Y.; Liu, F.T.; Li, M.L. Optimization Design of a Webpage Based on Kansei Engineering. Hum. Factors Ergon. Manuf. Serv. Ind. 2016, 26, 110–126. [Google Scholar] [CrossRef]
  16. Yannakakis, G.N.; Hallam, J. Real-Time Game Adaptation for Optimizing Player Satisfaction. IEEE Trans. Comput. Intell. AI Games 2009, 1, 121–133. [Google Scholar] [CrossRef] [Green Version]
  17. Fleureau, J.; Guillotel, P.; Huynh-Thu, Q. Physiological-Based Affect Event Detector for Entertainment Video Applications. IEEE Trans. Affect. Comput. 2012, 3, 379–385. [Google Scholar] [CrossRef]
  18. Oatley, K.; Johnson-laird, P.N. Towards a Cognitive Theory of Emotions. Cognit. Emot. 1987, 1, 29–50. [Google Scholar] [CrossRef]
  19. Von Scheve, C.; Ismer, S. Towards a Theory of Collective Emotions. Emot. Rev. 2013, 5, 406–413. [Google Scholar] [CrossRef]
  20. Gray, J.A. On the classification of the emotions. Behav. Brain Sci. 1982, 5, 431–432. [Google Scholar] [CrossRef]
  21. Feidakis, M.; Daradoumis, T.; Caballe, S. Endowing e-Learning Systems with Emotion Awareness. In Proceedings of the 2011 Third International Conference on Intelligent Networking and Collaborative Systems, Fukuoka, Japan, 30 November–2 December 2011; pp. 68–75. [Google Scholar]
  22. Université de Montréal; Presses de l’Université de Montréal. Interaction of Emotion and Cognition in the Processing of Textual Material; Presses de l’Université de Montréal: Québec, QC, Canada, 1966; Volume 52. [Google Scholar]
  23. Russell, J.A. A circumplex model of affect. J. Pers. Soc. Psychol. 1980, 39, 1161–1178. [Google Scholar] [CrossRef]
  24. Csikszentmihalyi, M. Flow and the Foundations of Positive Psychology: The collected works of Mihaly Csikszentmihalyi; Springer: Dordrecht, The Netherlands, 2014; ISBN 9401790884. [Google Scholar]
  25. Kaklauskas, A. Biometric and Intelligent Decision Making Support; Springer: Cham, Switzerland, 2015; Volume 81, ISBN 978-3-319-13658-5. [Google Scholar]
  26. Kaklauskas, A.; Kuzminske, A.; Zavadskas, E.K.; Daniunas, A.; Kaklauskas, G.; Seniut, M.; Raistenskis, J.; Safonov, A.; Kliukas, R.; Juozapaitis, A.; et al. Affective tutoring system for built environment management. Comput. Educ. 2015, 82, 202–216. [Google Scholar] [CrossRef]
  27. Kaklauskas, A.; Jokubauskas, D.; Cerkauskas, J.; Dzemyda, G.; Ubarte, I.; Skirmantas, D.; Podviezko, A.; Simkute, I. Affective analytics of demonstration sites. Eng. Appl. Artif. Intell. 2019, 81, 346–372. [Google Scholar] [CrossRef]
  28. Kaklauskas, A.; Zavadskas, E.K.; Bardauskiene, D.; Cerkauskas, J.; Ubarte, I.; Seniut, M.; Dzemyda, G.; Kaklauskaite, M.; Vinogradova, I.; Velykorusova, A. An Affect-Based Built Environment Video Analytics. Autom. Constr. 2019, 106, 102888. [Google Scholar] [CrossRef]
  29. Emotion-Sensing Technology in the Internet of Things. Available online: https://onix-systems.com/blog/emotion-sensing-technology-in-the-internet-of-things (accessed on 30 December 2019).
  30. Wallbott, H.G.; Scherer, K.R. Assesing emotion by questionnaire. In The Measurement of Emotions; Academic Press: Cambridge, MA, USA, 1989; pp. 55–82. ISBN 9780125587044. [Google Scholar]
  31. Becker, A.; Hagenberg, N.; Roessner, V.; Woerner, W.; Rothenberger, A. Evaluation of the self-reported SDQ in a clinical setting: Do self-reports tell us more than ratings by adult informants? Eur. Child. Adolesc. Psychiatry 2004, 13, 17–24. [Google Scholar] [CrossRef]
  32. Isomursu, M.; Tähti, M.; Väinämö, S.; Kuutti, K. Experimental evaluation of five methods for collecting emotions in field settings with mobile applications. Int. J. Hum. Comput. Stud. 2007, 65, 404–418. [Google Scholar] [CrossRef]
  33. Mahlke, S.; Minge, M.; Thüring, M. Measuring multiple components of emotions in interactive contexts. In CHI ‘06 Extended Abstracts on Human Factors in Computing Systems-CHI EA ‘06; ACM Press: New York, NY, USA, 2006; p. 1061. [Google Scholar]
  34. Liapis, A.; Katsanos, C.; Sotiropoulos, D.; Xenos, M.; Karousos, N. Recognizing Emotions in Human Computer Interaction: Studying Stress Using Skin Conductance; Springer: Cham, Switzerland, 2015; pp. 255–262. [Google Scholar]
  35. Camurri, A.; Lagerlöf, I.; Volpe, G. Recognizing emotion from dance movement: Comparison of spectator recognition and automated techniques. Int. J. Hum. Comput. Stud. 2003, 59, 213–225. [Google Scholar] [CrossRef]
  36. Scherer, K.R. What are emotions? And how can they be measured? Soc. Sci. Inf. 2005, 44, 695–729. [Google Scholar] [CrossRef]
  37. Gonçalves, V.P.; Giancristofaro, G.T.; Filho, G.P.R.; Johnson, T.; Carvalho, V.; Pessin, G.; de Almeida Neris, V.P.; Ueyama, J. Assessing users’ emotion at interaction time: a multimodal approach with multiple sensors. Soft Comput. 2017, 21, 5309–5323. [Google Scholar] [CrossRef]
  38. St. Louis, E.K.; Frey, L.C.; Britton, J.W.; Frey, L.C.; Hopp, J.L.; Korb, P.; Koubeissi, M.Z.; Lievens, W.E.; Pestana-Knight, E.M.; St. Louis, E.K. Electroencephalography (EEG): An Introductory Text and Atlas of Normal and Abnormal Findings in Adults, Children, and Infants; American Epilepsy Society: Chicago, IL, USA, 2016; ISBN 9780997975604. [Google Scholar]
  39. Aminoff, M.J. Electroencephalography: General principles and clinical applications. In Aminoff’s Electrodiagnosis in Clinical Neurology; Saunders, W.B., Ed.; Elsevier B.V.: Amsterdam, The Netherlands, 2012; pp. 37–84. ISBN 9781455703081. [Google Scholar]
  40. Hope, C. “Volunteer Duty” Psychology Testing|Photo by Chris Hope AS.| Flickr. Available online: https://www.flickr.com/photos/tim_uk/8135755109/in/photostream/ (accessed on 27 December 2019).
  41. EEG: Electroencephalography—iMotions Software and EEG Headsets. Available online: https://imotions.com/biosensor/electroencephalography-eeg/ (accessed on 29 October 2019).
  42. Electroencephalography | Definition, Procedure, & Uses | Britannica.com. Available online: https://www.britannica.com/science/electroencephalography (accessed on 29 October 2019).
  43. Bajaj, V.; Pachori, R.B. EEG Signal Classification Using Empirical Mode Decomposition and Support Vector Machine. In Proceedings of the International Conference on Soft Computing for Problem Solving (SocProS 2011), 20–22 December 2011; Springer: New Delhi, India, 2012; pp. 623–635. [Google Scholar]
  44. Oikonomou, V.P.; Tzallas, A.T.; Fotiadis, D.I. A Kalman filter based methodology for EEG spike enhancement. Comput. Methods Programs Biomed. 2007, 85, 101–108. [Google Scholar] [CrossRef] [PubMed]
  45. Kaur, B.; Singh, D.; Roy, P.P. EEG Based Emotion Classification Mechanism in BCI. In Proceedings of the Procedia Computer Science, Sanur, Bali, Indonesia, 17–19 April 2018. [Google Scholar]
  46. Pagani, C. Violence and Complexity. Open Psychol. J. 2015, 8, 11–16. [Google Scholar] [CrossRef] [Green Version]
  47. Wan Ismail, W.O.A.S.; Hanif, M.; Mohamed, S.B.; Hamzah, N.; Rizman, Z.I. Human Emotion Detection via Brain Waves Study by Using Electroencephalogram (EEG). Int. J. Adv. Sci. Eng. Inf. Technol. 2016, 6, 1005. [Google Scholar] [CrossRef] [Green Version]
  48. Shakshi, R.J. Brain Wave Classification and Feature Extraction of EEG Signal by Using FFT on Lab View. Int. Res. J. Eng. Technol. 2016, 3, 1208–1212. [Google Scholar]
  49. EEG-Event Related Potentials. Available online: http://www.medicine.mcgill.ca/physio/vlab/biomed_signals/eeg_erp.htm (accessed on 3 November 2019).
  50. Vijayan, A.E.; Sen, D.; Sudheer, A.P. EEG-Based Emotion Recognition Using Statistical Measures and Auto-Regressive Modeling. In Proceedings of the 2015 IEEE International Conference on Computational Intelligence & Communication Technology, Riga, Latvia, 3–5 June 2015; pp. 587–591. [Google Scholar]
  51. Dissanayake, T.; Rajapaksha, Y.; Ragel, R.; Nawinne, I. An Ensemble Learning Approach for Electrocardiogram Sensor Based Human Emotion Recognition. Sensors 2019, 19, 4495. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  52. Nakisa, B.; Rastgoo, M.N.; Tjondronegoro, D.; Chandran, V. Evolutionary computation algorithms for feature selection of EEG-based emotion recognition using mobile sensors. Expert Syst. Appl. 2018, 93, 143–155. [Google Scholar] [CrossRef] [Green Version]
  53. Liu, Y.-H.; Wu, C.-T.; Cheng, W.-T.; Hsiao, Y.-T.; Chen, P.-M.; Teng, J.-T. Emotion Recognition from Single-Trial EEG Based on Kernel Fisher’s Emotion Pattern and Imbalanced Quasiconformal Kernel Support Vector Machine. Sensors 2014, 14, 13361–13388. [Google Scholar] [CrossRef] [Green Version]
  54. Zhang, J.; Chen, M.; Zhao, S.; Hu, S.; Shi, Z.; Cao, Y. ReliefF-Based EEG Sensor Selection Methods for Emotion Recognition. Sensors 2016, 16, 1558. [Google Scholar] [CrossRef]
  55. Mehmood, R.; Lee, H. Towards Building a Computer Aided Education System for Special Students Using Wearable Sensor Technologies. Sensors 2017, 17, 317. [Google Scholar] [CrossRef]
  56. Purnamasari, P.; Ratna, A.; Kusumoputro, B. Development of Filtered Bispectrum for EEG Signal Feature Extraction in Automatic Emotion Recognition Using Artificial Neural Networks. Algorithms 2017, 10, 63. [Google Scholar] [CrossRef]
  57. Li, Y.; Huang, J.; Zhou, H.; Zhong, N. Human Emotion Recognition with Electroencephalographic Multidimensional Features by Hybrid Deep Neural Networks. Appl. Sci. 2017, 7, 1060. [Google Scholar] [CrossRef] [Green Version]
  58. Alazrai, R.; Homoud, R.; Alwanni, H.; Daoud, M. EEG-Based Emotion Recognition Using Quadratic Time-Frequency Distribution. Sensors 2018, 18, 2739. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  59. Chao, H.; Dong, L.; Liu, Y.; Lu, B. Emotion Recognition from Multiband EEG Signals Using CapsNet. Sensors 2019, 19, 2212. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  60. Cai, J.; Chen, W.; Yin, Z. Multiple Transferable Recursive Feature Elimination Technique for Emotion Recognition Based on EEG Signals. Symmetry 2019, 11, 683. [Google Scholar] [CrossRef] [Green Version]
  61. Gao, Z.; Cui, X.; Wan, W.; Gu, Z. Recognition of Emotional States using Multiscale Information Analysis of High Frequency EEG Oscillations. Entropy 2019, 21, 609. [Google Scholar] [CrossRef] [Green Version]
  62. Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.-S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A Database for Emotion Analysis Using Physiological Signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31. [Google Scholar] [CrossRef] [Green Version]
  63. Soleymani, M.; Lichtenauer, J.; Pun, T.; Pantic, M. A Multimodal Database for Affect Recognition and Implicit Tagging. IEEE Trans. Affect. Comput. 2012, 3, 42–55. [Google Scholar] [CrossRef] [Green Version]
  64. Carvalho, S.; Leite, J.; Galdo-Álvarez, S.; Gonçalves, Ó.F. The Emotional Movie Database (EMDB): A Self-Report and Psychophysiological Study. Appl. Psychophysiol. Biofeedback 2012, 37, 279–294. [Google Scholar] [CrossRef] [Green Version]
  65. Abadi, M.K.; Subramanian, R.; Kia, S.M.; Avesani, P.; Patras, I.; Sebe, N. DECAF: MEG-Based Multimodal Database for Decoding Affective Physiological Responses. IEEE Trans. Affect. Comput. 2015, 6, 209–222. [Google Scholar] [CrossRef]
  66. Schaekermann, M. Biosignal Datasets for Emotion Recognition. Available online: http://hcigames.com/hci/biosignal-datasets-emotion-recognition/ (accessed on 5 November 2019).
  67. International Neural Network Society; Verband der Elektrotechnik; Institute of Electrical and Electronics Engineers. ANNA ’18: Advances in Neural Networks and Applications 2018, 15–17 September 2018, St. Konstantin and Elena Resort, Bulgaria; Vde Verlag GmbH: Berlin, Germany, 2018; ISBN 9783800747566. [Google Scholar]
  68. Goshvarpour, A.; Abbasi, A.; Goshvarpour, A. An Emotion Recognition Approach Based on Wavelet Transform and Second-Order Difference Plot of ECG. J. AI Data Min. 2017, 5, 211–221. [Google Scholar]
  69. Al Khatib, I.; Bertozzi, D.; Poletti, F.; Benini, L.; Jantsch, A.; Bechara, M.; Khalifeh, H.; Hajjar, M.; Nabiev, R.; Jonsson, S. Hardware/software architecture for real-time ECG monitoring and analysis leveraging MPSoC technology. In Transactions on High-Performance Embedded Architectures and Compilers I; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2007; pp. 239–258. [Google Scholar]
  70. Paithane, A.N.; Bormane, D.S.; Dinde, S. Human Emotion Recognition using Electrocardiogram Signals. Int. J. Recent Innov. Trends Comput. Commun. 2014, 2, 194–197. [Google Scholar]
  71. Amri, M.F.; Rizqyawan, M.I.; Turnip, A. ECG signal processing using offline-wavelet transform method based on ECG-IoT device. In Proceedings of the 2016 3rd International Conference on Information Technology, Computer and Electrical Engineering, Semarang, Indonesia, 18–20 October 2016; pp. 25–30. [Google Scholar]
  72. ECG Setup—Wikimedia Commons. Available online: https://commons.wikimedia.org/wiki/File:Ekg_NIH.jpg (accessed on 28 December 2019).
  73. Cai, J.; Liu, G.; Hao, M. The Research on Emotion Recognition from ECG Signal. In Proceedings of the 2009 International Conference on Information Technology and Computer Science, Kiev, Ukraine, 25–26 July 2009; pp. 497–500. [Google Scholar]
  74. Uyarel, H.; Okmen, E.; Cobanoǧlu, N.; Karabulut, A.; Cam, N. Effects of anxiety on QT dispersion in healthy young men. Acta Cardiol. 2006, 61, 83–87. [Google Scholar] [CrossRef] [PubMed]
  75. Abdul Jamil, M.M.; Soon, C.F.; Achilleos, A.; Youseffi, M.; Javid, F. Electrocardiograph (ECG) circuit design and software-based processing using LabVIEW. J. Telecommun. Electron. Comput. Eng. 2017, 9, 57–66. [Google Scholar]
  76. Nikolova, D.; Petkova, P.; Manolova, A.; Georgieva, P. ECG-based Emotion Recognition: Overview of Methods and Applications. In Proceedings of the ANNA ’18 Advances in Neural Networks and Applications 2018, St. Konstantin and Elena Resort, Bulgaria, 15–17 September 2018; pp. 118–122. [Google Scholar]
  77. Marín-Morales, J.; Higuera-Trujillo, J.L.; Greco, A.; Guixeres, J.; Llinares, C.; Scilingo, E.P.; Alcañiz, M.; Valenza, G. Affective computing in virtual reality: emotion recognition from brain and heartbeat dynamics using wearable sensors. Sci. Rep. 2018, 8, 13657. [Google Scholar] [CrossRef] [PubMed]
  78. Udovičić, G.; Derek, J.; Russo, M.; Sikora, M. Wearable Emotion Recognition system based on GSR and PPG signals. In Proceedings of the 2nd International Workshop on Multimedia for Personal Health and Health Care, Mountain View, CA, USA, 23 October 2017; pp. 53–59. [Google Scholar]
  79. Wu, G.; Liu, G.; Hao, M. The Analysis of Emotion Recognition from GSR Based on PSO. In Proceedings of the 2010 International Symposium on Intelligence Information Processing and Trusted Computing, Wuhan, China, 28–29 October 2010; pp. 360–363. [Google Scholar]
  80. Lidberg, L.; Wallin, B.G. Sympathetic Skin Nerve Discharges in Relation to Amplitude of Skin Resistance Responses. Psychophysiology 1981, 18, 268–270. [Google Scholar] [CrossRef]
  81. Ayata, D.; Yaslan, Y.; Kamasak, M. Emotion recognition via galvanic skin response: Comparison of machine learning algorithms and feature extraction methods. Istanbul Univ. J. Electr. Electron. Eng. 2017, 17, 3129–3136. [Google Scholar]
  82. Critchley, H.D. Review: Electrodermal Responses: What Happens in the Brain. Neurosci 2002, 8, 132–142. [Google Scholar] [CrossRef]
  83. Lang, P.J.; Greenwald, M.K.; Bradley, M.M.; Hamm, A.O. Looking at pictures: Affective, facial, visceral, and behavioral reactions. Psychophysiology 1993, 30, 261–273. [Google Scholar] [CrossRef]
  84. Duda, S.; Hawkins, D.; McGill, M. Physiological Response Measurements. Eye Track. User Exp. Des. 2014, 81–108. [Google Scholar] [CrossRef]
  85. Boucsein, W.; Fowles, D.C.; Grimnes, S.; Ben-Shakhar, G.; Roth, W.T.; Dawson, M.E.; Filion, D.L.; Society for Psychophysiological Research Ad Hoc Committee on Electrodermal Measures. Publication recommendations for electrodermal measurements. Psychophysiology 2012, 49, 1017–1034. [Google Scholar]
  86. Van Dooren, M.; de Vries, J.J.; Janssen, J.H. Emotional sweating across the body: Comparing 16 different skin conductance measurement locations. Physiol. Behav. 2012, 106, 298–304. [Google Scholar] [CrossRef] [PubMed]
  87. Neuro-Tools: GSR|Acuity Eyetracking Blog. Available online: https://acuityets.wordpress.com/2016/10/24/series-neuro-tools-gsr/ (accessed on 8 November 2019).
  88. Gatti, E.; Calzolari, E.; Maggioni, E.; Obrist, M. Emotional ratings and skin conductance response to visual, auditory and haptic stimuli. Sci. Data 2018, 5, 180120. [Google Scholar] [CrossRef] [PubMed]
  89. Greco, A.; Lanata, A.; Citi, L.; Vanello, N.; Valenza, G.; Scilingo, E. Skin Admittance Measurement for Emotion Recognition: A Study over Frequency Sweep. Electronics 2016, 5, 46. [Google Scholar] [CrossRef] [Green Version]
  90. Villon, O.; Lisetti, C. Toward Recognizing Individual’s Subjective Emotion from Physiological Signals in Practical Application. In Proceedings of the Twentieth IEEE International Symposium on Computer-Based Medical Systems (CBMS’07), Maribor, Slovenia, 20–22 June 2007; pp. 357–362. [Google Scholar]
  91. Chanel, G.; Kierkels, J.J.M.; Soleymani, M.; Pun, T. Short-term emotion assessment in a recall paradigm. Int. J. Hum. Comput. Stud. 2009, 67, 607–627. [Google Scholar] [CrossRef]
  92. Chanel, G.; Kronegg, J.; Grandjean, D.; Pun, T. Emotion Assessment: Arousal Evaluation Using EEG’s and Peripheral Physiological Signals. In Proceedings of the International Workshop on Multimedia Content Representation, Classification and Security, Istanbul, Turkey, 11–13 September 2006; Springer: Berlin/Heidelberg, Germany, 2006; Volume 4105, pp. 530–537. [Google Scholar]
  93. Peter, C.; Ebert, E.; Beikirch, H. A Wearable Multi-sensor System for Mobile Acquisition of Emotion-Related Physiological Data; Springer: Berlin/Heidelberg, Germany, 2005; pp. 691–698. [Google Scholar]
  94. Villon, O.; Lisetti, C. A User-Modeling Approach to Build User’s Psycho-Physiological Maps of Emotions using Bio-Sensors. In Proceedings of the ROMAN 2006–The 15th IEEE International Symposium on Robot and Human Interactive Communication, Herthfordshire, UK, 6–8 September 2006; pp. 269–276. [Google Scholar]
  95. Sungwon, L.; Choong-Seon, H.; Yong Kwi, L.; Hyun-soon, S. Experimental emotion recognition system and services for mobile network environments. In Proceedings of the 2010 IEEE Sensors, Limerick, Ireland, 1–4 November 2010; pp. 136–140. [Google Scholar]
  96. Sierra, A.D.S.; Ávila, C.S.; Casanova, J.G.; Bailador, G. Real-Time Stress Detection by Means of Physiological Signals. In Advanced Biometric Technologies; IntechOpen: London, UK, 2011; pp. 23–44. [Google Scholar]
  97. Hsieh, P.-Y.; Chin, C.-L. The emotion recognition system with Heart Rate Variability and facial image features. In Proceedings of the 2011 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2011), San Diego, CA, USA, 8–12 March 2011; pp. 1933–1940. [Google Scholar]
  98. Huang, C.; Liew, S.S.; Lin, G.R.; Poulsen, A.; Ang, M.J.Y.; Chia, B.C.S.; Chew, S.Y.; Kwek, Z.P.; Wee, J.L.K.; Ong, E.H.; et al. Discovery of Irreversible Inhibitors Targeting Histone Methyltransferase, SMYD3. ACS Med. Chem. Lett. 2019, 10, 978–984. [Google Scholar] [CrossRef]
  99. Benezeth, Y.; Li, P.; Macwan, R.; Nakamura, K.; Yang, F.; Benezeth, Y.; Li, P.; Macwan, R.; Nakamura, K.; Gomez, R.; et al. Remote Heart Rate Variability for Emotional State Monitoring. In Proceedings of the 2018 IEEE EMBS International Conference on Biomedical & Health Informatics (BHI), Las Vegas, NV, USA, 4–7 March 2018; pp. 153–156. [Google Scholar]
  100. Andreas, H.; Silke, G.; Peter, S.J.W. Emotion Recognition Using Bio-Sensors: First Steps Towards an Automatic System. In Affective Dialogue Systems; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2004. [Google Scholar]
  101. Mikuckas, A.; Mikuckiene, I.; Venckauskas, A.; Kazanavicius, E.; Lukas, R.; Plauska, I. Emotion recognition in human computer interaction systems. Elektron. Elektrotech. 2014, 20, 51–56. [Google Scholar] [CrossRef]
  102. Zhu, J.; Ji, L.; Liu, C. Heart rate variability monitoring for emotion and disorders of emotion. Physiol. Meas. 2019, 40, 064004. [Google Scholar] [CrossRef]
  103. Markovics, Z.; Lauznis, J.; Erins, M.; Minejeva, O.; Kivlenieks, R. Testing and Analysis of the HRV Signals from Wearable Smart HRV Sensors. Int. J. Eng. Technol. 2018, 7, 1211. [Google Scholar] [CrossRef]
  104. Tamura, T.; Maeda, Y.; Sekine, M.; Yoshida, M. Wearable Photoplethysmographic Sensors—Past and Present. Electronics 2014, 3, 282–302. [Google Scholar] [CrossRef]
  105. Allen, J. Photoplethysmography and its application in clinical physiological measurement. Physiol. Meas. 2007, 28, R1–R39. [Google Scholar] [CrossRef] [Green Version]
  106. Jeyhani, V.; Mahdiani, S.; Peltokangas, M.; Vehkaoja, A. Comparison of HRV parameters derived from photoplethysmography and electrocardiography signals. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Milano, Italy, 25–29 August 2015; pp. 5952–5955. [Google Scholar]
  107. Choi, K.-H.; Kim, J.; Kwon, O.S.; Kim, M.J.; Ryu, Y.H.; Park, J.-E. Is heart rate variability (HRV) an adequate tool for evaluating human emotions?–A focus on the use of the International Affective Picture System (IAPS). Psychiatry Res. 2017, 251, 192–196. [Google Scholar] [CrossRef] [PubMed]
  108. Maritsch, M.; Bérubé, C.; Kraus, M.; Lehmann, V.; Züger, T.; Feuerriegel, S.; Kowatsch, T.; Wortmann, F. Improving Heart Rate Variability Measurements from consumer Smartwatches with Machine Learning. In Proceedings of the UbiComp ’19: The 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing, London, UK, 9–13 September 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 934–938. [Google Scholar]
  109. Elgendi, M.; Fletcher, R.; Liang, Y.; Howard, N.; Lovell, N.H.; Abbott, D.; Lim, K.; Ward, R. The use of photoplethysmography for assessing hypertension. NPJ Digit. Med. 2019, 2, 60. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  110. Xiefeng, C.; Wang, Y.; Dai, S.; Zhao, P.; Liu, Q. Heart sound signals can be used for emotion recognition. Sci. Rep. 2019, 9, 6486. [Google Scholar] [CrossRef] [PubMed]
  111. Boric-Lubecke, O.; Massagram, W.; Lubecke, V.M.; Host-Madsen, A.; Jokanovic, B. Heart Rate Variability Assessment Using Doppler Radar with Linear Demodulation. In Proceedings of the 2008 38th European Microwave Conference, Amsterdam, The Netherlands, 28–30 October 2008; pp. 420–423. [Google Scholar]
  112. Chanel, G.; Ansari-Asl, K.; Pun, T. Valence-arousal evaluation using physiological signals in an emotion recall paradigm. In Proceedings of the 2007 IEEE International Conference on Systems, Man and Cybernetics, Quebec, QC, Canada, 7–10 October 2007; pp. 2662–2667. [Google Scholar]
  113. Park, M.W.; Kim, C.J.; Hwang, M.; Lee, E.C. Individual Emotion Classification between Happiness and Sadness by Analyzing Photoplethysmography and Skin Temperature. In Proceedings of the 2013 Fourth World Congress on Software Engineering, Hong Kong, China, 3–4 December 2013; pp. 190–194. [Google Scholar]
  114. Quazi, M.T.; Mukhopadhyay, S.C.; Suryadevara, N.K.; Huang, Y.M. Towards the smart sensors based human emotion recognition. In Proceedings of the 2012 IEEE International Instrumentation and Measurement Technology Conference Proceedings, Graz, Austria, 13–16 May 2012; pp. 2365–2370. [Google Scholar]
  115. Choi, J.; Ahmed, B.; Gutierrez-Osuna, R. Development and Evaluation of an Ambulatory Stress Monitor Based on Wearable Sensors. IEEE Trans. Inf. Technol. Biomed. 2012, 16, 279–286. [Google Scholar] [CrossRef] [Green Version]
  116. Lee, M.S.; Lee, Y.K.; Pae, D.S.; Lim, M.T.; Kim, D.W.; Kang, T.K. Fast Emotion Recognition Based on Single Pulse PPG Signal with Convolutional Neural Network. Appl. Sci. 2019, 9, 3355. [Google Scholar] [CrossRef] [Green Version]
  117. Hui, T.K.L.; Sherratt, R.S. Coverage of Emotion Recognition for Common Wearable Biosensors. Biosensors 2018, 8, 30. [Google Scholar]
  118. Zhang, Q.; Chen, X.; Zhan, Q.; Yang, T.; Xia, S. Respiration-based emotion recognition with deep learning. Comput. Ind. 2017, 92–93, 84–90. [Google Scholar] [CrossRef]
  119. Ginsburg, A.S.; Lenahan, J.L.; Izadnegahdar, R.; Ansermino, J.M. A Systematic Review of Tools to Measure Respiratory Rate in Order to Identify Childhood Pneumonia. Am. J. Respir. Crit. Care Med. 2018, 197, 1116–1127. [Google Scholar] [CrossRef]
  120. Liu, H.; Allen, J.; Zheng, D.; Chen, F. Recent development of respiratory rate measurement technologies. Physiol. Meas. 2019, 40, 07TR01. [Google Scholar] [CrossRef] [Green Version]
  121. Takahashi, K.; Namikawa, S.; Hashimoto, M. Computational emotion recognition using multimodal physiological signals: Elicited using Japanese kanji words. In Proceedings of the 2012 35th International Conference on Telecommunications and Signal Processing (TSP), Prague, Czech Republic, 3–4 July 2012; pp. 615–620. [Google Scholar]
  122. Katsis, C.D.; Katertsidis, N.; Ganiatsas, G.; Fotiadis, D.I. Toward Emotion Recognition in Car-Racing Drivers: A Biosignal Processing Approach. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2008, 38, 502–512. [Google Scholar] [CrossRef]
  123. Nhan, B.R.; Chau, T. Classifying Affective States Using Thermal Infrared Imaging of the Human Face. IEEE Trans. Biomed. Eng. 2010, 57, 979–987. [Google Scholar] [CrossRef] [PubMed]
  124. Landowska, A. Emotion Monitoring—Verification of Physiological Characteristics Measurement Procedures. Metrol. Meas. Syst. 2014, 21, 719–732. [Google Scholar] [CrossRef] [Green Version]
  125. Valderas, M.T.; Bolea, J.; Laguna, P.; Vallverdu, M.; Bailon, R. Human emotion recognition using heart rate variability analysis with spectral bands based on respiration. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milano, Italy, 25–29 August 2015; pp. 6134–6137. [Google Scholar]
  126. Healey, J.A.; Picard, R.W. Detecting stress during real-world driving tasks using physiological sensors. IEEE Trans. Intell. Transp. Syst. 2005, 6, 156–166. [Google Scholar] [CrossRef] [Green Version]
  127. Kosonogov, V.; De Zorzi, L.; Honoré, J.; Martínez-Velázquez, E.S.; Nandrino, J.-L.; Martinez-Selva, J.M.; Sequeira, H. Facial thermal variations: A new marker of emotional arousal. PLoS ONE 2017, 12, e0183592. [Google Scholar] [CrossRef] [PubMed]
  128. Krumova, E.K.; Frettlöh, J.; Klauenberg, S.; Richter, H.; Wasner, G.; Maier, C. Long-term skin temperature measurements – A practical diagnostic tool in complex regional pain syndrome. Pain 2008, 140, 8–22. [Google Scholar] [CrossRef]
  129. American Psychosomatic Society; National Research Council (U.S.); Committee on Problems of Neurotic Behavior; American Society for Research in Psychosomatic Problems. Psychosomatic Medicine; Elsevier: Amsterdam, The Netherlands, 1943; Volume 5. [Google Scholar]
  130. Vos, P.; De Cock, P.; Munde, V.; Petry, K.; Van Den Noortgate, W.; Maes, B. The tell-tale: What do heart rate; skin temperature and skin conductance reveal about emotions of people with severe and profound intellectual disabilities? Res. Dev. Disabil. 2012, 33, 1117–1127. [Google Scholar] [CrossRef] [PubMed]
  131. Okada, S.; Hori, N.; Kimoto, K.; Onozuka, M.; Sato, S.; Sasaguri, K. Effects of biting on elevation of blood pressure and other physiological responses to stress in rats: Biting may reduce allostatic load. Brain Res. 2007, 1185, 189–194. [Google Scholar] [CrossRef] [PubMed]
  132. Briese, E. Cold increases and warmth diminishes stress-induced rise of colonic temperature in rats. Physiol. Behav. 1992, 51, 881–883. [Google Scholar] [CrossRef]
  133. Kim, K.H.; Bang, S.W.; Kim, S.R. Emotion recognition system using short-term monitoring of physiological signals. Med. Biol. Eng. Comput. 2004, 42, 419–427. [Google Scholar] [CrossRef]
  134. Leijdekkers, P.; Gay, V.; Wong, F. CaptureMyEmotion: A mobile app to improve emotion learning for autistic children using sensors. In Proceedings of the 26th IEEE International Symposium on Computer-Based Medical Systems, Porto, Portugal, 20–22 June 2013; pp. 381–384. [Google Scholar]
  135. Choi, J.-S.; Bang, J.; Heo, H.; Park, K. Evaluation of Fear Using Nonintrusive Measurement of Multimodal Sensors. Sensors 2015, 15, 17507–17533. [Google Scholar] [CrossRef] [Green Version]
  136. Nakanishi, R.; Imai-Matsumura, K. Facial skin temperature decreases in infants with joyful expression. Infant Behav. Dev. 2008, 31, 137–144. [Google Scholar] [CrossRef]
  137. Bruno, P.; Melnyk, V.; Völckner, F. Temperature and emotions: Effects of physical temperature on responses to emotional advertising. Int. J. Res. Mark. 2017, 34, 302–320. [Google Scholar] [CrossRef]
  138. Sonkusare, S.; Ahmedt-Aristizabal, D.; Aburn, M.J.; Nguyen, V.T.; Pang, T.; Frydman, S.; Denman, S.; Fookes, C.; Breakspear, M.; Guo, C.C. Detecting changes in facial temperature induced by a sudden auditory stimulus based on deep learning-assisted face tracking. Sci. Rep. 2019, 9, 4729. [Google Scholar] [CrossRef] [PubMed]
  139. Van Marken Lichtenbelt, W.D.; Daanen, H.A.M.; Wouters, L.; Fronczek, R.; Raymann, R.J.E.M.; Severens, N.M.W.; Van Someren, E.J.W. Evaluation of wireless determination of skin temperature using iButtons. Physiol. Behav. 2006, 88, 489–497. [Google Scholar] [CrossRef] [PubMed]
  140. Nasoz, F.; Alvarez, K.; Lisetti, C.L.; Finkelstein, N. Emotion recognition from physiological signals using wireless sensors for presence technologies. Cognit. Technol. Work 2004, 6, 4–14. [Google Scholar] [CrossRef]
  141. Puri, C.; Olson, L.; Pavlidis, I.; Levine, J.; Starren, J. Stresscam: Non-contact measurement of users’ emotional states through thermal imaging. In Proceedings of the Conference on Human Factors in Computing Systems (CHI EA 2005), Portland, OR, USA, 2–7 April 2005; pp. 1725–1728. [Google Scholar]
  142. Zong, C.; Chetouani, M. Hilbert-Huang transform based physiological signals analysis for emotion recognition. In Proceedings of the 2009 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), Ajman, UAE, 14–17 December 2009; pp. 334–339. [Google Scholar]
  143. Ekman, P.; Friesen, W.V. A technique for the measurement of facial action. In Facial Action Coding System (FACS); Paul Ekman Group: Manchester, UK, 1978. [Google Scholar]
  144. Matzke, B.; Herpertz, S.C.; Berger, C.; Fleischer, M.; Domes, G. Facial Reactions during Emotion Recognition in Borderline Personality Disorder: A Facial Electromyography Study. Psychopathology 2014, 47, 101–110. [Google Scholar] [CrossRef] [PubMed]
  145. Turabzadeh, S.; Meng, H.; Swash, R.; Pleva, M.; Juhar, J. Facial Expression Emotion Detection for Real-Time Embedded Systems. Technologies 2018, 6, 17. [Google Scholar] [CrossRef] [Green Version]
  146. Huang, Y.; Chen, F.; Lv, S.; Wang, X. Facial Expression Recognition: A Survey. Symmetry 2019, 11, 1189. [Google Scholar] [CrossRef] [Green Version]
  147. Weyers, P.; Muhlberger, A.; Hefele, C.; Pauli, P. Electromyographic responses to static and dynamic avatar emotional facial expressions. Psychophysiology 2006, 43, 450–453. [Google Scholar] [CrossRef]
  148. Zahak, M. Signal Acquisition Using Surface EMG and Circuit Design Considerations for Robotic Prosthesis. In Computational Intelligence in Electromyography Analysis–A Perspective on Current Applications and Future Challenges; InTech: London, UK, 2012. [Google Scholar]
  149. Van Boxtel, A. Facial EMG as a tool for inferring affective states. Proc. Meas. Behav. 2010, 2010, 104–108. [Google Scholar]
  150. EMG Electrodes—Supplies. Available online: https://bio-medical.com/supplies/emg-electrodes.html?p=2 (accessed on 7 November 2019).
  151. Wioleta, S. Using physiological signals for emotion recognition. In Proceedings of the 2013 6th International Conference on Human System Interactions (HSI), Gdansk, Poland, 6–8 June 2013; pp. 556–561. [Google Scholar]
  152. Girardi, D.; Lanubile, F.; Novielli, N. Emotion detection using noninvasive low cost sensors. In Proceedings of the 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), San Antonio, TX, USA, 23–26 October 2017; pp. 125–130. [Google Scholar]
  153. Martínez-Rodrigo, A.; Zangróniz, R.; Pastor, J.M.; Latorre, J.M.; Fernández-Caballero, A. Emotion Detection in Ageing Adults from Physiological Sensors. In Proceedings of the Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2015; Volume 376, pp. 253–261. [Google Scholar]
  154. Nakasone, A.; Prendinger, H.; Ishizuka, M. ProComp Infiniti Bio-signal Encoder. In Proceedings of the 5th International Workshop on Biosignal Interpretation, Tokyo, Japan, 6–8 September 2005; pp. 219–222. [Google Scholar]
  155. Wagner, J.; Kim, J.; Andre, E. From Physiological Signals to Emotions: Implementing and Comparing Selected Methods for Feature Extraction and Classification. In Proceedings of the 2005 IEEE International Conference on Multimedia and Expo, Amsterdam, The Netherlands, 6–8 July 2005; pp. 940–943. [Google Scholar]
  156. Furman, J.M.; Wuyts, F.L. Vestibular Laboratory Testing. In Aminoff’s Electrodiagnosis in Clinical Neurology; Saunders: Philadelphia, PA, USA, 2012; pp. 699–723. [Google Scholar]
  157. Lord, M.P.; Wright, W.D. The investigation of eye movements. Rep. Prog. Phys. 1950, 13, 1–23. [Google Scholar]
  158. Aguiñaga, A.R.; Lopez Ramirez, M.; Alanis Garza, A.; Baltazar, R.; Zamudio, V.M. Emotion analysis through physiological measurements. In Workshop Proceedings of the 9th International Conference on Intelligent Environments; IOS Press: Amsterdam, The Netherlands, 2013; pp. 97–106. [Google Scholar]
  159. Picot, A.; Charbonnier, S.; Caplier, A. EOG-based drowsiness detection: Comparison between a fuzzy system and two supervised learning classifiers. IFAC Proc. Vol. 2011, 44, 14283–14288. [Google Scholar] [CrossRef] [Green Version]
  160. Ramkumar, S.; Sathesh Kumar, K.; Dhiliphan Rajkumar, T.; Ilayaraja, M.; Shankar, K. A review-classification of electrooculogram based human computer interfaces. Biomed. Res. 2018, 29, 1078–1084. [Google Scholar]
  161. Siddiqui, U.; Shaikh, A.N. An Overview of “Electrooculography”. Int. J. Adv. Res. Comput. Commun. Eng. 2013, 2, 4238–4330. [Google Scholar]
  162. Perdiz, J.; Pires, G.; Nunes, U.J. Emotional state detection based on EMG and EOG biosignals: A short survey. In Proceedings of the 2017 IEEE 5th Portuguese Meeting on Bioengineering (ENBENG), Coimbra, Portugal, 16–18 February 2017; pp. 1–4. [Google Scholar]
  163. Cruz, A.; Garcia, D.; Pires, G.; Nunes, U. Facial Expression Recognition based on EOG toward Emotion Detection for Human-Robot Interaction. In Proceedings of the International Conference on Bio-inspired Systems and Signal Processing, SCITEPRESS—Science and Technology Publications, Lisbon, Portugal, 12–15 January 2015; pp. 31–37. [Google Scholar]
  164. Chai, X.; Wang, Q.; Zhao, Y.; Li, Y.; Liu, D.; Liu, X.; Bai, O. A Fast, Efficient Domain Adaptation Technique for Cross-Domain Electroencephalography(EEG)-Based Emotion Recognition. Sensors 2017, 17, 1014. [Google Scholar] [CrossRef] [Green Version]
  165. Wang, Y.; Lv, Z.; Zheng, Y. Automatic Emotion Perception Using Eye Movement Information for E-Healthcare Systems. Sensors 2018, 18, 2826. [Google Scholar] [CrossRef] [Green Version]
  166. Paul, S.; Banerjee, A.; Tibarewala, D.N. Emotional eye movement analysis using electrooculography signal. Int. J. Biomed. Eng. Technol. 2017, 23, 59. [Google Scholar] [CrossRef]
  167. Soundariya, R.S.; Renuga, R. Eye movement based emotion recognition using electrooculography. In Proceedings of the 2017 Innovations in Power and Advanced Computing Technologies (i-PACT), Vellore, India, 21–22 April 2017; pp. 1–5. [Google Scholar]
  168. Bulling, A.; Ward, J.A.; Gellersen, H.; Tröster, G. Eye movement analysis for activity recognition using electrooculography. IEEE Trans. Pattern Anal. Mach. Intell. 2011, 33, 741–753. [Google Scholar] [CrossRef]
  169. Saneiro, M.; Santos, O.C.; Salmeron-Majadas, S.; Boticario, J.G. Towards Emotion Detection in Educational Scenarios from Facial Expressions and Body Movements through Multimodal Approaches. Sci. World J. 2014, 2014, 1–14. [Google Scholar] [CrossRef]
  170. Li, Y. Hand gesture recognition using Kinect. In Proceedings of the 2012 IEEE International Conference on Computer Science and Automation Engineering, Zhangjiajie, China, 25–27 May 2012; pp. 196–199. [Google Scholar]
  171. Schindler, K.; Van Gool, L.; de Gelder, B. Recognizing emotions expressed by body pose: A biologically inspired neural model. Neural Netw. 2008, 21, 1238–1246. [Google Scholar] [CrossRef]
  172. Farnsworth, B. Facial Action Coding System (FACS)—A Visual Guidebook. Available online: https://imotions.com/blog/facial-action-coding-system/ (accessed on 9 November 2019).
  173. Shan, C.; Gong, S.; McOwan, P.W. Beyond facial expressions: Learning human emotion from body gestures. 2007. Available online: https://www.dcs.warwick.ac.uk/bmvc2007/proceedings/CD-ROM/papers/276/bmvc07_v2.pdf (accessed on 9 November 2019).
  174. Gavrilescu, M. Recognizing emotions from videos by studying facial expressions, body postures and hand gestures. In Proceedings of the 2015 23rd Telecommunications Forum Telfor (TELFOR), Belgrade, Serbia, 24–26 November 2015; pp. 720–723. [Google Scholar]
  175. Metri, P.; Ghorpade, J.; Butalia, A. Facial Emotion Recognition Using Context Based Multimodal Approach. Int. J. Interact. Multimed. Artif. Intell. 2011, 1, 12. [Google Scholar] [CrossRef]
  176. Lee, S.; Bae, M.; Lee, W.; Kim, H. CEPP: Perceiving the Emotional State of the User Based on Body Posture. Appl. Sci. 2017, 7, 978. [Google Scholar] [CrossRef] [Green Version]
  177. Van den Stock, J.; Righart, R.; de Gelder, B. Body Expressions Influence Recognition of Emotions in the Face and Voice. Emotion 2007, 7, 487–494. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  178. Castellano, G.; Kessous, L.; Caridakis, G. Emotion Recognition through Multiple Modalities: Face, Body Gesture, Speech. In Affect and Emotion in Human-Computer Interaction; Springer: Berlin/Heidelberg, Germany, 2008; pp. 92–103. [Google Scholar]
  179. Subramanian, R.; Wache, J.; Abadi, M.K.; Vieriu, R.L.; Winkler, S.; Sebe, N. ASCERTAIN: Emotion and Personality Recognition Using Commercial Sensors. IEEE Trans. Affect. Comput. 2018, 9, 147–160. [Google Scholar] [CrossRef]
  180. Gay, V.; Leijdekkers, P.; Wong, F. Using sensors and facial expression recognition to personalize emotion learning for autistic children. Stud. Health Technol. Inform. 2013, 189, 71–76. [Google Scholar]
181. Ganzha, M.; Maciaszek, L.; Paprzycki, M. (Eds.) Proceedings of the 2018 Federated Conference on Computer Science and Information Systems, Poznań, Poland, 9–12 September 2018; Polskie Towarzystwo Informatyczne: Warszawa, Poland; Institute of Electrical and Electronics Engineers: New York, NY, USA, 2018; ISBN 9788360810903. [Google Scholar]
  182. Lee, K.; Hong, H.; Park, K. Fuzzy System-Based Fear Estimation Based on the Symmetrical Characteristics of Face and Facial Feature Points. Symmetry 2017, 9, 102. [Google Scholar] [CrossRef] [Green Version]
  183. Sapiński, T.; Kamińska, D.; Pelikant, A.; Anbarjafari, G. Emotion Recognition from Skeletal Movements. Entropy 2019, 21, 646. [Google Scholar] [CrossRef] [Green Version]
  184. Lisetti, C.L.; Nasoz, F. Using noninvasive wearable computers to recognize human emotions from physiological signals. EURASIP J. Appl. Signal. Process. 2004, 2004, 1672–1687. [Google Scholar] [CrossRef] [Green Version]
185. Li, L.; Chen, J.H. Emotion recognition using physiological signals. In Advances in Artificial Reality and Tele-Existence; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2006; Volume 4282, pp. 437–446. [Google Scholar]
  186. Patel, M.; Lal, S.K.L.; Kavanagh, D.; Rossiter, P. Applying neural network analysis on heart rate variability data to assess driver fatigue. Expert Syst. Appl. 2011, 38, 7235–7242. [Google Scholar] [CrossRef]
  187. Jang, E.H.; Park, B.J.; Kim, S.H.; Chung, M.A.; Sohn, J.H. Classification of three emotions by machine learning algorithms using psychophysiological signals. Int. J. Psychophysiol. 2012, 85, 402–403. [Google Scholar] [CrossRef]
  188. Soleymani, M.; Pantic, M.; Pun, T. Multimodal emotion recognition in response to videos. IEEE Trans. Affect. Comput. 2012, 3, 211–223. [Google Scholar] [CrossRef] [Green Version]
  189. Chang, C.Y.; Chang, C.W.; Zheng, J.Y.; Chung, P.C. Physiological emotion analysis using support vector regression. Neurocomputing 2013, 122, 79–87. [Google Scholar] [CrossRef]
  190. Liu, Y.; Ritchie, J.M.; Lim, T.; Kosmadoudi, Z.; Sivanathan, A.; Sung, R.C.W. A fuzzy psycho-physiological approach to enable the understanding of an engineer’s affect status during CAD activities. CAD Comput.-Aided. Des. 2014, 54, 19–38. [Google Scholar] [CrossRef] [Green Version]
  191. Verma, G.K.; Tiwary, U.S. Multimodal fusion framework: A multiresolution approach for emotion classification and recognition from physiological signals. Neuroimage 2014, 102, 162–172. [Google Scholar] [CrossRef]
192. Nasoz, F.; Lisetti, C.L.; Vasilakos, A.V. Affectively intelligent and adaptive car interfaces. Inf. Sci. 2010, 180, 3817–3836. [Google Scholar] [CrossRef]
193. Regtien, P.P.L. Sensors for Mechatronics, 2nd ed.; Elsevier: Amsterdam, The Netherlands, 2012. [Google Scholar]
  194. Takahashi, K. Remarks on Emotion Recognition from Bio-Potential Signals. In Proceedings of the IEEE International Conference on Industrial Technology, Hammamet, Tunisia, 8–10 December 2004; Volume 3, pp. 1138–1143. [Google Scholar]
  195. Lin, C.J.; Lin, C.-H.; Wang, S.-H.; Wu, C.-H. Multiple Convolutional Neural Networks Fusion Using Improved Fuzzy Integral for Facial Emotion Recognition. Appl. Sci. 2019, 9, 2593. [Google Scholar] [CrossRef] [Green Version]
196. Zucco, C.; Calabrese, B.; Cannataro, M. Sentiment analysis and affective computing for depression monitoring. In Proceedings of the 2017 IEEE International Conference on Bioinformatics and Biomedicine (BIBM 2017), Kansas City, MO, USA, 13–16 November 2017; pp. 1988–1995. [Google Scholar]
  197. Picard, R.W. Affective computing: Challenges. Int. J. Hum. Comput. Stud. 2003, 59, 55–64. [Google Scholar] [CrossRef]
  198. Picard, R.W. Affective Computing: From laughter to IEEE. IEEE Trans. Affect. Comput. 2010, 1, 11–17. [Google Scholar] [CrossRef] [Green Version]
  199. Poria, S.; Cambria, E.; Bajpai, R.; Hussain, A. A review of affective computing: From unimodal analysis to multimodal fusion. Inf. Fusion 2017, 37, 98–125. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Russell's circumplex model of emotions.
Figure 2. Electroencephalography (EEG) measurements: (a) distribution of EEG electrodes on human scalp [39]; (b) special headset with installed electrodes [40].
Figure 3. EEG signal: (a) example of raw data [43]; (b) peak-to-peak signal amplitude evaluation technique [44].
Figure 4. Schematic representation of electrocardiography (ECG) [69]: (a) 12-lead ECG electrode placement: RA (right arm), LA (left arm), LL (left leg), RL (right leg); (b) example of ECG signals.
Figure 5. ECG procedure [72]: (a) typical setup; (b) main parameters of an ECG heartbeat signal.
Figure 6. Possible places for attaching GSR electrodes [86].
Figure 7. Example of a raw GSR signal. The blue area indicates the phasic component of the signal; the grey area represents the tonic component. The red line indicates the trigger (moment of delivery of the stimulus) [88].
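To make the tonic/phasic split of Figure 7 concrete, the following minimal Python sketch separates a raw GSR trace into a slowly varying tonic level and a phasic residual using a zero-phase low-pass filter. The 4 Hz sampling rate and 0.05 Hz cut-off are illustrative assumptions of ours, not parameters taken from the cited study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def decompose_gsr(gsr, fs=4.0, cutoff_hz=0.05):
    """Split a raw GSR signal into tonic and phasic components.

    The tonic component is estimated with a zero-phase low-pass
    Butterworth filter; the phasic component is the residual.
    fs and cutoff_hz are illustrative assumptions.
    """
    b, a = butter(2, cutoff_hz / (fs / 2.0), btype="low")
    tonic = filtfilt(b, a, gsr)   # slow skin-conductance level
    phasic = gsr - tonic          # fast stimulus-driven responses
    return tonic, phasic

# Minimal usage example with synthetic data: a slow drift plus one
# exponential response triggered at t = 60 s
if __name__ == "__main__":
    fs = 4.0
    t = np.arange(0, 120, 1 / fs)
    gsr = 2.0 + 0.01 * t + 0.3 * np.exp(-((t - 60) % 120) / 5.0)
    tonic, phasic = decompose_gsr(gsr, fs)
    print(f"max phasic response: {phasic.max():.3f} uS")
```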
Figure 8. Principle of photoplethysmography (PPG) [104]: (a) reflective mode; (b) transmitting mode; (c) example of PPG signal.
Figure 9. Comparison between ECG and PPG signals [109].
Figure 10. Example of skin temperature change due to an applied stimulus [127].
Figure 11. Facial electromyography [149]: location of electrodes.
Figure 12. Examples of EMG electrodes [148]: (a) needle electrode; (b) fine wire electrode; (c) gelled electrodes; (d) dry electrodes.
Figure 13. Principle of electrooculography (EOG): (a) electrode placement scheme [160]; (b) measurement principle [161].
Figure 14. Comparison between EOG and EMG signals during three different sequential actions [162]: 1—corrugator supercilii EMG; 2—vertical EOG; 3—horizontal EOG.
Figure 15. Classification of measurement methods for emotion recognition.
Table 1. Classification of brain waves [47,48].

Type of Waves | Related Emotional State | Short Description
Delta (δ) (0.5–4 Hz) | Strong sense of empathy and intuition | The slowest brain waves, most often associated with sleep. Multiple frequencies in this range are accompanied by the release of human growth hormone, which is useful in healing. Delta waves produced in the waking state offer an opportunity to access subconscious activity.
Theta (θ) (4–8 Hz) | Deep relaxation, meditation | Theta waves are produced mainly when a person is in light sleep or dreaming. They normally appear when the eyes close and disappear when the eyes open. This frequency range is mainly associated with stress relief and memory recollection. Twilight conditions can be used to reach deeper meditation, resulting in improved health as well as increased creativity and learning capabilities.
Alpha (α) (8–16 Hz) | Creativity, relaxation | These waves are mostly present during a state of awake relaxation with the eyes closed. Alpha is the resting state of the brain. Alpha-wave activity decreases in response to all types of motor activity. Alpha waves aid overall mental coordination, calmness, alertness, mind/body integration, and learning.
Beta (β) (16–32 Hz) | Alertness, concentration | Beta waves are produced when a person is in an alert or anxious state, and beta is the dominant rhythm in this state. They are usually generated in the frontal and central parts of the brain. In this state, the brain can easily perform analysis, prepare information, and generate solutions and new ideas.
Gamma (γ) (32 Hz and above) | Regional learning, memory and language processing, ideation | These waves are emitted in abnormal conditions or in the presence of certain mental disorders. Gamma brainwaves are the fastest brain waves and relate to the simultaneous processing of information from different brain areas. Numerous theories propose that gamma contributes directly to brain function, while others argue that gamma is better viewed as a simple byproduct of network activity.
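Because the classes in Table 1 are defined purely by frequency range, a typical first step in EEG-based emotion recognition is to estimate how much signal power falls into each band. The sketch below does this with Welch's method from SciPy; the band edges follow Table 1, while the sampling rate, window length, and synthetic test signal are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

# Band edges in Hz, taken from Table 1
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 16),
         "beta": (16, 32), "gamma": (32, 64)}

def band_powers(eeg, fs=256.0):
    """Return absolute power per brain-wave band for one EEG channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))
    df = freqs[1] - freqs[0]
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = psd[mask].sum() * df  # integrate the PSD over the band
    return powers

# Usage: dominant band of 10 s of synthetic 10 Hz (alpha-range) activity
fs = 256.0
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)
powers = band_powers(eeg, fs)
print(max(powers, key=powers.get))  # expected: "alpha"
```

For the synthetic 10 Hz test signal, the dominant band is, as expected, alpha; real recordings would first require artifact removal and channel selection.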
Table 2. Review of scientific research focused on emotion recognition and evaluation using only electroencephalography (EEG) signals.

Aim | Emotions | Hardware and Software | Ref.
Creation of an emotion classification system using EEG signals | High/low arousal and valence | 5-channel wireless headset Emotiv Insight | [52]
Creation of a new emotion evaluation technique based on a three-layer EEG-ER scheme | High/low arousal and valence | Electro-cap (Quick-Cap 64) from the NeuroScan system (Compumedics Inc., Charlotte, NC, USA) | [53]
Research on Relief-based channel selection methods for EEG-based emotion recognition | Joy, fear, sadness, relaxation | — | [54]
Creation of an intelligent emotion recognition system for the improvement of special students' learning process | Happiness, calmness, sadness, fear | Emotiv EPOC system; 14 electrodes with two reference channels were used | [55]
Automated human emotion recognition from EEG signals using higher-order statistics methods | High/low arousal and valence | EEG input signals were provided by the DEAP database | [56]
Creation of a new method for the recognition of human emotions | High/low arousal and valence | A multi-channel EEG device was used | [57]
A new EEG-based emotion recognition approach with a novel time-frequency feature extraction technique | High/low arousal and valence | EEG signals provided by the DEAP dataset | [58]
A new deep learning framework based on a multiband feature matrix (MFM) and a capsule network (CapsNet) | High/low arousal, valence, and dominance | The DEAP dataset was used | [59]
A new cross-subject emotion recognition model based on a newly designed multiple transferable recursive feature elimination | High/low arousal, valence, and dominance | 32-channel data from the DEAP dataset were used to validate the proposed method | [60]
A novel approach based on multiscale information analysis (MIA) of EEG signals for distinguishing emotions | High/low arousal and valence | EEG input signals were provided by the DEAP database | [61]
Table 3. Description of the main parameters of an electrocardiography (ECG) signal [75].

Parameter | Duration, s | Amplitude, mV | Short Description
P | ~0.04 | ~0.1–0.25 | This wave results from atrial contraction (depolarization). A P wave that exceeds typical values might indicate atrial hypertrophy.
PR | 0.12–0.20 | — | The PR interval is measured from the start of the P wave to the start of the Q wave. It represents the duration of atrial depolarization (contraction).
QRS complex | 0.08–0.12 | — | The QRS complex is measured from the start of the Q wave to the end of the S wave. It represents the duration of ventricular depolarization (contraction). A longer duration might indicate the presence of bundle branch blocks.
QT/QTc | ~0.41 | — | Measured from the start of the Q wave to the end of the T wave, the QT interval represents the duration of contraction and relaxation of the ventricles. The duration of QT/QTc varies inversely with the heart rate.
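As a worked complement to Table 3, the sketch below locates R peaks in an ECG trace and derives the RR intervals from which heart rate (and, downstream, HRV) is computed. The amplitude and spacing thresholds are illustrative assumptions that would need tuning for real recordings.

```python
import numpy as np
from scipy.signal import find_peaks

def rr_intervals(ecg, fs=250.0):
    """Detect R peaks and return RR intervals in seconds.

    Thresholds are illustrative: R peaks are assumed to stand at least
    0.5 (normalized units) above baseline and at least 0.4 s apart.
    """
    peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
    return np.diff(peaks) / fs

# Usage with a crude synthetic ECG: one sharp spike per beat at 72 bpm
fs = 250.0
t = np.arange(0, 10, 1 / fs)
ecg = np.zeros_like(t)
ecg[(np.arange(0, 10, 60 / 72) * fs).astype(int)] = 1.0
rr = rr_intervals(ecg, fs)
print(f"mean HR: {60 / rr.mean():.1f} bpm")
```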
Table 4. Review of scientific research focused on emotion recognition and evaluation using ECG.

Aim | Emotions | Methods | Hardware and Software | Ref.
Emotion recognition for service robots in the living space | High/neutral/low valence; negative arousal categorized into sadness, anger, disgust, and fear | ECG | Wireless biosensor RF-ECG | [1]
An ensemble learning approach for developing a machine learning model that can recognize four major human emotions | Anger, sadness, joy, and pleasure | ECG | Spiker-Shield Heart and Brain sensor | [51]
Creation of a new methodology for the evaluation of interactive entertainment technologies | Level of arousal | ECG, galvanic skin response (GSR), electromyography of the face, heart rate | Digital camera, ProComp Infiniti system and sensors, BioGraph software from Thought Technologies | [4]
Presentation of a new AfC methodology capable of recognizing the emotional state of a subject | High/low valence and arousal | ECG, EEG | B-Alert X10 sensor (Advanced Brain Monitoring, Inc., USA) | [77]
A new method for the automatic location of the P-QRS-T wave and automatic feature extraction | Joy and sadness | ECG | BIOPAC System MP150 | [73]
Table 5. Review of scientific research focused on emotion recognition and evaluation using GSR.

Aim | Emotions | Methods | Hardware and Software | Ref.
Stress level evaluation in human–computer interaction | Stress | GSR, eye activity | Mindfield eSense sensor, Tobii eye-tracker environment (Tobii Studio) | [34]
Creation of a textile wearable system able to perform exosomatic EDA measurements using AC and DC methods | Level of arousal | GSR | Textile electrodes from Smartex s.r.l. (Pisa, Italy) installed in a special glove | [89]
Research on proposed methodologies for emotion recognition from physiological signals | Valence and arousal levels | GSR, heart rate | Polar-based system, Armband from Bodymedia | [90]
Assessment of human emotions using peripheral as well as EEG physiological signals over short time periods | High/neutral/low valence and arousal | GSR, EEG, blood pressure | Biosemi Active II system (http://www.biosemi.com), plethysmograph to measure blood pressure | [91]
Assessment of human emotion from physiological signals by means of pattern recognition and classification techniques | High/low valence and arousal | GSR, EEG, blood pressure, respiration, temperature | Biosemi Active II device (http://www.biosemi.com), GSR sensor, plethysmograph, respiration belt, and a temperature sensor | [92]
Creation of a wearable system for measuring emotion-related physiological parameters | — | GSR, heart rate, skin temperature | Originally designed glove with installed sensors | [93]
Validation of a new method for emotional experience evaluation, extracting semantic information from the autonomic nervous system | High/low valence and arousal | GSR, ECG, heart rate | Bodymedia Armband, InnerView Research Software 4.1 from Bodymedia | [94]
Development of a two-state emotion recognition engine for mobile phones | Pleasant, unpleasant | GSR, photoplethysmogram (PPG), skin temperature | — | [95]
Table 6. Review of scientific research focused on emotion recognition and evaluation using HRV.

Aim | Emotions | Methods | Hardware and Software | Ref.
Recognition of emotions using EEG and peripheral signals | High/low valence and arousal | HRV, EEG, GSR, blood pressure, respiration | Biosemi Active II system (http://www.biosemi.com), GSR sensor, plethysmograph, respiration belt | [112]
Creation of a new method for the identification of happiness and sadness | Happiness and sadness | HRV, skin temperature (SKT) | SKT sensor, PPG sensor | [113]
Design of a noninvasive system capable of recognizing human emotions using smart sensors | Happiness (excitement), sadness, relaxed (neutral), and anger | HRV, SKT, GSR | Custom-made PPG sensor, DS600 temperature sensor by Maxim–Dallas Semiconductor, custom-made GSR sensor | [114]
Development of a wearable sensor platform to monitor mental stress | Mental stress | HRV, GSR, respiration | Heart rate monitor (Polar WearLink+; Polar Electro Inc.), respiration sensor (SA9311M; Thought Technology Ltd.), GSR sensor (E243; In Vivo Metric Systems Corp.), EMG module (TDE205; Bio-Medical Instruments, Inc.) | [115]
Investigation of the ability of PPG to recognize emotion | High/low valence and arousal | HRV | PPG sensor | [116]
A novel emotion recognition framework for the computer prediction of human emotions using wearable biosensors | Happiness/joy, anger, fear, disgust, sadness | HRV, GSR, SKT, activity recognition | PPG sensor, GSR sensor, SKT fingertip temperature sensor, EMG, gyroscopes and accelerometer for activity recognition, Android smartphone for data collection | [117]
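Most of the HRV-based studies in Table 6 reduce the RR-interval series to a small set of time-domain statistics before classification. The sketch below computes two of the most common, SDNN and RMSSD; the function itself is our illustration, but the two definitions are the standard ones.

```python
import numpy as np

def hrv_time_domain(rr_s):
    """Time-domain HRV features from RR intervals given in seconds.

    SDNN:  standard deviation of all RR intervals (overall variability).
    RMSSD: root mean square of successive RR differences
           (short-term, vagally mediated variability).
    """
    rr_ms = np.asarray(rr_s) * 1000.0
    sdnn = np.std(rr_ms, ddof=1)
    rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))
    return {"SDNN_ms": sdnn, "RMSSD_ms": rmssd}

# Usage: a mildly irregular rhythm around 75 bpm
rr = 0.8 + 0.05 * np.sin(np.linspace(0, 6 * np.pi, 60))
print(hrv_time_domain(rr))
```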
Table 7. Review of scientific research focused on emotion recognition and evaluation using respiration rate measurements.

Aim | Emotions | Methods | Hardware and Software | Ref.
Investigation of computational emotion recognition using multimodal physiological signals | Positive, negative, and neutral arousal | PPG, GSR, respiration rate, skin temperature | Pulse oximeter (PP-CO12, TEAC Co.), GSR (PPS-EDA, TEAC Co.; AP-U030, TEAC Co.), respiration rate sensor (AP-C021, TEAC Co.), temperature sensor clip (AP-C050, TEAC Co.) | [121]
An automated approach to emotion recognition based on several biosignals | Stress, disappointment, euphoria | Electromyograms (EMG), ECG, respiration rate, and GSR | EMG textile fireproof sensors; ECG and respiration sensors on the thorax; GSR textile fireproof sensor placed in a special glove | [122]
Comparison of time, frequency, and time-frequency features derived from thermal infrared data to discriminate between self-reported affective states of an individual in response to visual stimuli drawn from the International Affective Picture System | High/neutral/low valence and arousal | Facial thermal infrared data, blood volume pulse (BVP), and respiratory effort | FLIR Systems ThermaCAM (SC640) long-wavelength infrared (LWIR) camera, piezo-crystal respiratory effort sensor belt 1370G by Grass Technologies, BVP sensor (PPS) by Grass Technologies, atmospheric temperature sensor HS-2000D | [123]
Design of an experimental stand used for monitoring human–system interaction | High/low arousal | GSR, electromyography (EMG), respiration rate, EEG, blood volume pulse, temperature | SC-Flex/Pro sensor, MyoScan Pro EMG, respiration rate sensor, EEG-Z sensor, HRV/BVP Flex/Pro sensor, temperature sensor | [124]
Assessment of human emotion recognition by means of HRV analysis with spectral bands varying with respiratory frequency | High/neutral/low arousal | ECG, respiration rate, blood pressure (BP), skin temperature (ST), GSR | ECG, blood pressure, skin temperature, and GSR sensors | [125]
Table 8. Review of scientific research focused on emotion recognition and evaluation using SKT.

Aim | Emotions | Methods | Hardware and Software | Ref.
Presentation of the CaptureMyEmotion smartphone app, which can improve the learning process of autistic children | High/low arousal | SKT, GSR, motion analysis | Q sensor from Affectiva (www.affectiva.com) | [134]
A new method for evaluating fear based on nonintrusive measurements obtained using multiple sensors | Fear | EEG, SKT, eye-blinking rate | EEG device (Emotiv EPOC), commercial thermal camera (ICI 7320 Pro), commercial web camera (C600), and a high-speed camera | [135]
Study of infant emotion relying on the assessment of expressive behavior and physiological response | Joyful emotion | SKT | Thermal imaging system (TH3104MR, NEC Sanei) | [136]
Demonstration that the effects of particular emotional stimuli depend not only on physical temperatures but also on homeostasis/thermoregulation | Emotionally warm or emotionally cold state | — | — | [137]
Presentation of a new methodology offering a sensitive and robust tool to automatically capture facial physiological changes | High/low valence and arousal | SKT, ECG, GSR | ECG and GSR National Instruments (NI) devices, infrared camera FLIR A615 | [138]
Evaluation of the possibility of wireless determination of skin temperature using iButtons | — | SKT | iButton (type DS1921H; Maxim/Dallas Semiconductor Corp., USA) | [139]
A new approach to analyzing the physiological signals associated with emotions | Sadness, amusement, fear, anger, surprise | SKT, GSR | BodyMedia SenseWear armband | [140]
Presentation of the new StressCam methodology for non-contact evaluation of stress level | Stress | SKT | Infrared camera | [141]
Table 9. Relations between emotions and facial expressions [145,146].

Emotion | Involved Muscles | Actions
Happiness | Orbicularis oculi, zygomaticus major | Closing eyelids, pulling mouth corners upward and laterally
Surprise | Frontalis, levator palpebrae superioris | Raising eyebrows, raising upper eyelid
Fear | Frontalis, corrugator supercilii, levator palpebrae superioris | Raising eyebrows, lowering eyebrows, raising upper eyelid
Anger | Corrugator supercilii, levator palpebrae superioris, orbicularis oculi | Lowering eyebrows, raising upper eyelid, closing eyelids
Sadness | Frontalis, corrugator supercilii, depressor anguli oris | Raising eyebrows, lowering eyebrows, depressing lip corners
Disgust | Levator labii superioris, levator labii superioris alaeque nasi | Raising upper lip, raising upper lip and wrinkling nasal skin
Table 10. Review of scientific research focused on emotion recognition and evaluation using EMG.

Aim | Emotions | Methods | Hardware and Software | Ref.
Research on the possibility of reliably recognizing emotional state relying on noninvasive low-cost EEG, EMG, and GSR sensors | High/low valence and arousal | EEG, GSR, EMG, HRV | BrainLink headset, Neuroview acquisition software, Shimmer GSR+ unit, Shimmer EMG device, a plethysmograph | [152]
A new approach for monitoring and detecting the emotional state of the elderly | High/low arousal | EDA, HRV, EMG, SKT, activity tracker | Custom-made EDA sensor, a plethysmograph, SKT resistance temperature detector, 3-axis accelerometer | [153]
A model that allows determining emotion in real time | High/low valence and arousal | EMG, GSR | ProComp Infiniti bio-signal encoder, GSR sensor | [154]
A methodology and a wearable system for the evaluation of the emotional states of car-racing drivers | Anger, fear, disgust, sadness, enjoyment, and surprise | EMG, GSR, ECG, respiration rate | EMG textile fireproof sensors; ECG and respiration sensors on the thorax of the driver; GSR sensor in the glove | [122]
A fully implemented emotion recognition system including data analysis and classification | Joy, anger, pleasure, sadness | EMG, ECG, GSR, respiration rate | Four-channel EMG, ECG, GSR, and respiration rate biosensor | [155]
Table 11. Review of scientific research focused on emotion recognition and evaluation using EOG.

Aim | Emotions | Methods | Hardware and Software | Ref.
A novel strategy (ASFM) for emotion recognition | Positive, neutral, negative emotions | EMG, EOG | Off-line experiment performed using the SEED dataset | [164]
A novel approach for a sensor-based e-healthcare system | Positive, neutral, negative emotions | EOG, IROG | Neuroscan system (Compumedics Neuroscan, Charlotte, NC, USA), infrared camera with a resolution of 1280 × 720 | [165]
A new approach to the recognition of emotions using stimulated EOG signals | Positive, neutral, negative emotions | EOG | Customized EOG data acquisition device, Ag/AgCl electrodes | [166]
An emotion recognition system based on human eye movement | Happy, sad, angry, afraid, pleasant | EOG | Video-based eye trackers | [167]
A novel strategy of eye movement analysis as a new modality for recognizing human activity | Arousal level | EOG | Commercial Mobi system from Twente Medical Systems International (TMSI) | [168]
Table 12. Relations between emotions and body posture [175,176].

Emotions | Gestures and Postures
Happiness | Body extended, shoulders up, arms lifted up or away from the body
Interest | Lateral hand and arm movement, arm stretched out frontally
Surprise | Right/left hand going to the head; two hands covering the cheeks (self-touch); two hands covering the mouth; head shaking; body shifting backward
Boredom | Raising the chin (moving the head backward), collapsed body posture, head bent sideways, covering the face with two hands
Disgust | Shoulders forward, head downward and upper body collapsed, arms crossed in front of the chest, hands close to the body
Hot anger | Lifting the shoulders, opening and closing the hands, arms stretched out frontally, pointing, shoulders squared
Table 13. Review of scientific research focused on emotion recognition and evaluation using analysis of facial expressions, body posture, and gestures.

Aim | Emotions | Methods | Hardware and Software | Ref.
Presentation of ASCERTAIN, a multimodal database for implicit personality and affect recognition using commercial physiological sensors | High/low valence and arousal | GSR, EEG, ECG, HRV, facial expressions | GSR sensor, ECG sensor, EEG sensor, webcam to record facial activity, Lucid Scribe software | [179]
Creation of a personalized tool for a child to learn about and discuss her feelings | Real-time arousal and stress level | Facial expression recognition | Smartphone camera, CaptureMyEmotion application | [180]
Exploration of the limitations of automatic affect recognition applied in the usability context and proposal of a set of criteria to select input channels for affect recognition | Valence and arousal, interest, slight confusion, joy, sense of control | GSR, facial expressions | Infiniti Physiology Suite software; standard internet camera and video capture software from Logitech, Noldus FaceReader, Morae GSR recorder | [181]
A new method that involves analysis of multiple data considering the symmetrical characteristics of the face and facial feature points | Fear | Movement of facial feature points such as eyes, nose, and mouth | FLIR Tau2 640 thermal camera, NIR filter, Logitech C600 web camera | [182]
A novel method for computerized emotion perception based on posture to determine the emotional state of the user | Happiness, interest, boredom, disgust, hot anger | Body postures | C++ in Ubuntu 14.04, Kinect for Microsoft Xbox 360 and OpenNI SDK | [176]
A novel method to recognize seven basic emotional states utilizing body movement | Happiness, sadness, surprise, fear, anger, disgust, and neutral state | Gestures and body movements | Kinect v2 sensor | [183]
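The posture cues in Table 12 and the Kinect-based studies in Table 13 ultimately rest on geometric features computed from 3D joint coordinates. The sketch below derives two such features, head-shoulder elevation and arm extension, from a joint dictionary; the joint names, the y-up coordinate convention, and the features themselves are illustrative placeholders rather than any cited system's actual feature set.

```python
import numpy as np

def posture_features(joints):
    """Compute simple geometric posture features from 3D joints.

    `joints` maps joint names to (x, y, z) coordinates; the names used
    here are illustrative placeholders, not a specific SDK's labels.
    """
    shoulder = np.asarray(joints["shoulder_right"])
    elbow = np.asarray(joints["elbow_right"])
    hand = np.asarray(joints["hand_right"])
    head = np.asarray(joints["head"])

    # Head-shoulder elevation: vertical (y) distance from shoulder to head;
    # it shrinks when the shoulders are lifted, as in "hot anger" (Table 12)
    elevation = head[1] - shoulder[1]
    # Arm extension: straight-line reach divided by total arm length
    # (1.0 = arm stretched out frontally, as in "interest")
    upper = np.linalg.norm(elbow - shoulder)
    lower = np.linalg.norm(hand - elbow)
    reach = np.linalg.norm(hand - shoulder)
    extension = reach / (upper + lower)
    return {"shoulder_elevation": elevation, "arm_extension": extension}

# Usage with a hypothetical pose: arm fully extended to the side
joints = {"head": (0.0, 1.7, 0.0), "shoulder_right": (0.2, 1.5, 0.0),
          "elbow_right": (0.45, 1.5, 0.0), "hand_right": (0.7, 1.5, 0.0)}
print(posture_features(joints))
```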
Table 14. Analysis of previous studies on emotion recognition.

Emotions | Measurement Methods | Data Analysis Methods | Accuracy | Ref.
Sadness, anger, stress, surprise | ECG, SKT, GSR | SVM | Correct-classification ratios were 78.4% and 61.8% for the recognition of three and four categories, respectively | [133]
Sadness, anger, fear, surprise, frustration, and amusement | GSR, HRV, SKT | KNN, DFA, MBP | KNN, DFA, and MBP could categorize emotions with 72.3%, 75.0%, and 84.1% accuracy, respectively | [184]
Three levels of driver stress | ECG, EOG, GSR, and respiration | Fisher projection matrix and a linear discriminant | Three levels of driver stress recognized with an accuracy of over 97% | [126]
Fear, neutral, joy | ECG, SKT, GSR, respiration | Canonical correlation analysis | Correct-classification ratio of 85.3%; classification rates for fear, neutral, and joy were 76%, 94%, and 84%, respectively | [185]
High stress, low stress, disappointment, and euphoria | Facial EOG, ECG, GSR, respiration | SVM and adaptive neuro-fuzzy inference system (ANFIS) | Overall classification rates achieved using tenfold cross-validation were 79.3% and 76.7% for the SVM and the ANFIS, respectively | [122]
Fatigue caused by driving for extended hours | HRV | Neural network | The neural network gave an accuracy of 90% | [186]
Boredom, pain, surprise | GSR, ECG, HRV, SKT | Machine learning algorithms: linear discriminant analysis (LDA), classification and regression tree (CART), self-organizing map (SOM), and SVM | Accuracy rate of 78.6% for LDA, 93.3% for CART, and 70.4% for SOM; emotion classification using SVM showed an accuracy rate of 100.0% | [187]
Arousal classes: calm, medium aroused, and activated; valence classes: unpleasant, neutral, and pleasant | ECG, pupillary response, gaze distance | Support vector machine | Best classification accuracies of 68.5% for three labels of valence and 76.4% for three labels of arousal | [188]
Sadness, fear, pleasure | ECG, GSR, blood volume pulse, pulse | Support vector regression | Recognition rate up to 89.2% | [189]
Frustration, satisfaction, engagement, challenge | EEG, GSR, ECG | Fuzzy logic | 84.18% for frustration, 76.83% for satisfaction, 97% for engagement, 97.99% for challenge | [190]
Terrible, love, hate, sentimental, lovely, happy, fun, shock, cheerful, depressing, exciting, melancholy, mellow | EEG, GSR, blood volume pressure, respiration pattern, SKT, EMG, EOG | Support vector machine (SVM), multilayer perceptron (MLP), K-nearest neighbor (KNN), and meta-multiclass (MMC) | Average accuracies of 81.45%, 74.37%, 57.74%, and 75.94% for the SVM, MLP, KNN, and MMC classifiers, respectively; the best single-emotion accuracy was 85.46% for "depressing" using SVM; accuracy of 85% across 13 emotions | [191]
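Several of the accuracy figures in Table 14 come from support vector machines trained on handcrafted physiological features. The sketch below reproduces the shape of that pipeline on synthetic data with scikit-learn; the three stand-in features and the two arousal classes are illustrative assumptions, not a reproduction of any cited study.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for per-trial physiological features
# (e.g., mean GSR, SDNN, mean skin temperature); illustrative only.
rng = np.random.default_rng(0)
X_low = rng.normal([2.0, 50.0, 33.0], [0.5, 10.0, 0.4], size=(100, 3))
X_high = rng.normal([3.0, 35.0, 32.5], [0.5, 10.0, 0.4], size=(100, 3))
X = np.vstack([X_low, X_high])
y = np.array([0] * 100 + [1] * 100)  # 0 = low arousal, 1 = high arousal

# Standardize features, then fit an RBF-kernel SVM, mirroring the
# common pipeline above; report 10-fold cross-validated accuracy,
# the evaluation scheme reported in several of the cited studies.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=10)
print(f"mean CV accuracy: {scores.mean():.2f}")
```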
