
Measuring facial mimicry: Affdex vs. EMG

  • Jan-Frederik Westermann ,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Software, Visualization, Writing – original draft

    westermannjan@yahoo.de

    Affiliation Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany

  • Ralf Schäfer,

    Roles Data curation, Methodology, Supervision, Validation

    Affiliation Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany

  • Marc Nordmann,

    Roles Conceptualization, Data curation

    Affiliation Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany

  • Peter Richter,

    Roles Data curation, Formal analysis

    Affiliation Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany

  • Tobias Müller,

    Roles Methodology, Writing – review & editing

    Affiliation Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany

  • Matthias Franz

    Roles Conceptualization, Methodology, Project administration, Resources, Supervision, Validation, Writing – review & editing

    Affiliation Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany

Abstract

Facial mimicry is the automatic imitation of the facial affect expressions of others. It serves as an important component of interpersonal communication and affective co-experience. Facial mimicry has so far been measured by electromyography (EMG), which requires a complex measuring apparatus. Recently, software for measuring facial expressions has become available, but it is still unclear how well it is suited for measuring facial mimicry. This study investigates the comparability of the automated facial coding software Affdex with EMG for measuring facial mimicry. For this purpose, facial mimicry was induced in 33 subjects by presenting naturalistic affect-expressive video sequences (anger, joy). The response of the subjects was measured simultaneously by facial EMG (corrugator supercilii muscle, zygomaticus major muscle) and by Affdex (action units lip corner puller and brow lowerer and the affects joy and anger). Subsequently, the correlations between the measurement results of EMG and Affdex were calculated. After the presentation of the joy stimulus, there was an increase in zygomaticus muscle activity (EMG) about 400 ms after stimulus onset and an increase in joy and lip corner puller activity (Affdex) about 1200 ms after stimulus onset. The joy and the lip corner puller activity detected by Affdex correlate significantly with the EMG activity. After presentation of the anger stimulus, corrugator muscle activity (EMG) also increased approximately 400 ms after stimulus onset, whereas anger and brow lowerer activity (Affdex) showed no response. During the entire measurement interval, anger activity and brow lowerer activity (Affdex) did not correlate with corrugator muscle activity (EMG). Using Affdex, the facial mimicry response to a joy stimulus can be measured, but it is detected approximately 800 ms later compared to the EMG. Thus, electromyography remains the tool of choice for studying subtle mimic processes like facial mimicry.

1 Introduction

Mimicry describes the imitation of facial expressions, intonation, and body posture between two interaction partners [1]. Facial mimicry is of particular importance because it has a specific emotional meaning as it represents a congruent mimic response to an emotional facial expression [2]. It is detectable in the electromyogram (EMG) after only 200–400 ms [3, 4] and occurs unconsciously and automatically [5]. Facial mimicry has been shown to be triggered even when affective cues are perceived only unconsciously [4, 6].

The role of facial mimicry in the recognition of others’ emotions is controversial. In a widely accepted concept, facial mimicry leads to emotional contagion through a feedback mechanism [7]. This is thought to improve affect perception and thus the ability to empathize. This concept has been further developed as part of an embodiment approach to emotion recognition [8]. According to this, facial mimicry facilitates the decoding of observed emotions [9]. This hypothesis is supported by findings that emotion recognition can be impaired when the observer’s facial mimicry is impaired [10, 11]. Similarly, an increase in mimic muscle activity (e.g., due to a task such as biting on a pen or holding a pen with the lips) may result in lower accuracy in facial expression recognition. In contrast, there is also evidence that facial mimicry does not improve emotion recognition [12, 13], and no consistent evidence for the feedback hypothesis could be found in reviews [14]. A broader consensus exists for the assumption that mimicry has a positive influence on social relationships [15]. It has been demonstrated that facial mimicry occurs to varying degrees depending on the situation in which the subject finds themselves. Thus, the probability for facial mimicry to occur is higher when there is a desire to cooperate and lower in a competitive situation [16, 17]. According to this assumption, facial mimicry can be understood as an affiliative behavior that supports the establishment and maintenance of interpersonal relationships [18–20]. Interaction partners are perceived as more likable when they subtly imitate the other’s behavior [21]. This makes it easier for an individual to receive acceptance in a group. The individual thus satisfies his or her need to belong to the group and, at the same time, the collective achievement of relevant goals is facilitated [22]. Facial mimicry occurs with varying likelihood depending on the valence of the affect. Consistent with an attachment-reinforcing function, a smile, as an expression of happiness, is imitated more frequently [20]. In encounters between strangers, a frown (7%) is imitated significantly less often than a smile (53%) [23].

The gold standard for measuring facial mimicry is EMG measurement of affect-relevant mimic target muscles. Measurement of the EMG activity of the zygomaticus and corrugator muscles is commonly used to distinguish between hedonic and anhedonic affects [24–26]. Activation of the corrugator muscle results in a frown and contraction of the zygomaticus muscle results in a smile. Dimberg [24] showed that the presentation of happy faces led to an increase in EMG activity in the zygomaticus muscle and the presentation of angry faces led to an increase in EMG activity in the corrugator muscle. While the zygomaticus and corrugator muscles are good indicators of the valence of mimicry, it has been shown that many emotions are associated with a specific pattern of muscle activity and that this pattern is also reflected in the mimicry response [27].

It has also been shown that the mimic response to angry and happy faces results in visible congruent changes in facial expression [28]. This makes it possible to measure facial mimicry using the Facial Action Coding System (FACS) [29]. The FACS is currently the most comprehensive method for coding facial expressions. Using videotaped faces, specially trained human coders can encode so-called action units. According to FACS, there are a total of 44 action units, with each action unit describing a specific facial mimic activity. Ekman assumes that certain combinations of action units can be used to infer the basic affects of fear, disgust, joy, sadness, surprise, and anger. From this, Ekman derived his own coding system, the Emotional Facial Action Coding System (EMFACS) [30]. Although EMFACS has never been published through a peer-review process, it is widely used.

Besides the aspects mentioned above, the measurement of facial mimicry is also relevant from a clinical perspective. Some mental disorders that lead to impaired interpersonal communication, and thus to distress and further mental comorbidities, are associated with altered facial mimicry. For example, slowed [31] or decreased [32] facial mimicry has been demonstrated by EMG in patients with autism. Individuals with alexithymia [33] and Parkinson's disease [34] also exhibit a reduced facial mimicry response. Other mental disorders with impaired facial mimicry include schizophrenia [35, 36], borderline personality disorder [37], and depression [38]. The extent to which impaired facial mimicry moderates the severity and distress of these disorders is debated.

As already described, the measurement of facial mimicry is technically demanding. Measurement by EMG requires a complex measuring apparatus and experience in the application and interpretation of EMG signals. The analysis of video footage by FACS requires specially trained FACS raters and is very time-consuming. Recent methods for machine-learning-assisted videographic measurement of mimic activity promise time-efficient and easy-to-interpret analysis. This could open up a large area of application in affect research.

The software Affdex (developed by Affectiva), investigated in this study, runs on the iMotions platform and is one of the most widely used automated facial coding programs. It promises ease of use and allows various psychophysiological signals to be measured and evaluated synchronously. If it can be validated for the measurement of facial mimicry, complex experimental paradigms could thus be carried out in a relatively user-friendly manner. Affdex is based on a machine learning approach: a database of approximately 27,000 FACS-coded videos of affect-expressive human faces is available, and Affdex estimates the likelihood of action unit activity in new videos by comparing them with this database [39]. In a further step, the combined activity of specific action units is used to derive the probability of the presence of a basic affect based on EMFACS [39]. However, the exact operation of the underlying algorithm is not disclosed, making it difficult for researchers to examine the software in detail.

Some studies are already available that have investigated different automated facial coding (AFC) software with respect to certain features. Stöckli et al. [40] compared Affdex with the AFC software Facet and concluded that AFC has difficulties in detecting subtle affects. In another study, subjects were asked to mime happy and angry faces while both Affdex and the EMG activity of the zygomaticus and corrugator muscles were measured. A high positive correlation was found between the probability of joy and zygomaticus muscle activity and between anger and corrugator muscle activity [41]. While strong prototypical affect expressions were measured there, Höfling et al. [42] compared the ability of the AFC software FaceReader (Noldus) with EMG to measure subtle affect expressions. In that study, subjects were not asked to imitate the affect stimuli but to behave passively. There was congruent EMG activity for anger and joy, indicating a facial mimicry response, whereas the FaceReader software had difficulty measuring the negative valence of the anger stimuli.

The ability of Affdex to measure facial mimicry has not yet been investigated. Furthermore, there have been no studies to date on the extent to which the AFC measurements for the lip corner puller and brow lowerer action units correlate with the EMG activity of the zygomaticus and corrugator muscles. This question is of interest because these muscles represent the underlying anatomical structures for the action units. The present study attempts to answer these open questions.

The aim of this study is to compare electromyography with the Affdex AFC software for measuring facial mimicry response to angry and happy faces.

For this purpose, a healthy cohort was shown videos of faces dynamically building up affect over time. The stimulus material consisted of video sequences of adult faces showing the basic affects of anger and joy. EMG measurements of the zygomaticus and corrugator muscles and Affdex measurements were performed simultaneously. Subsequently, the EMG activity of the zygomaticus and corrugator muscles was directly compared with the FACS-oriented Affdex action units lip corner puller and brow lowerer [43]. This allows for a direct comparison of measurement sensitivity, as these action units represent the visible correlates of the underlying muscles [29]. Affdex uses additional information from the face for its measurements besides the lip corner puller and brow lowerer action units. Therefore, EMG activity was additionally compared with the affect probabilities for joy and anger measured by Affdex.

We expected a positive correlation between the EMG activity of the zygomaticus muscle and the action unit lip corner puller for the affect joy. We also expected a positive correlation between the EMG activity of the corrugator muscle and the action unit brow lowerer for the affect anger [41]. However, there is also evidence that measuring subtle affect expressions may be more difficult for the Affdex software [40, 42].

2 Materials and methods

This study is part of a larger project on differences in the facial responsiveness of alexithymic and non-alexithymic subjects using EMG and Affdex [33]. In that project, the mimic responses to video sequences with dynamically animated affective facial expressions of adults and children (anger, joy, disgust, surprise, sadness) were investigated. The present study examines only the response of the healthy control group to the adult stimuli of joy and anger, specifically the extent to which the EMG activity of the zygomaticus and corrugator muscles correlates with the Affdex measurements of lip corner puller, joy, brow lowerer, and anger.

2.1 Psychometric instruments

To ensure a psychologically healthy subject sample, exclusion criteria were screened by two structured interviews (Structured Clinical Interview for DSM-IV (SCID), Toronto Structured Interview for Alexithymia (TSIA)), questionnaires (Short version of the Autism Spectrum Quotient (AQ-short), Beck Depression Inventory II (BDI-II), Patient Health Questionnaire (PHQ-9), 20-item Prosopagnosia Index (PI-20), Toronto Alexithymia Scale (TAS-20)), and functional tests in the laboratory immediately before the start of the experimental part of the study.

The Structured Clinical Interview for DSM-IV (SCID) [44] was used to identify psychiatric diagnoses according to the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV). The SCID is divided into two parts. The first part captures DSM-IV Axis I disorders (SCID-I) and the second part captures DSM-IV Axis II personality disorders (SCID-II). In this study, only schizoid personality traits were recorded with the SCID-II, thus excluding schizoid traits in the subjects.

The Toronto Structured Interview for Alexithymia (TSIA) [45] is an instrument used for clinical and scientific purposes to identify alexithymic impairments in the regulation and processing of affects. For each item, the respondent is asked to name a corresponding situation from his or her life. A detailed coding catalog allows a three-level assessment of alexithymia per item. Only non-representative norm data are available to serve as a guide. Reported reliability estimates are an intraclass correlation of 0.90 (p < 0.01) and a reliability coefficient of 0.88 (p < 0.01). A combined use of the TAS and the TSIA is suggested for effective assessment of alexithymia [46].

The short version of the Autism Spectrum Quotient (AQ-short) [47] consists of three factors: interaction and spontaneity, imagination and creativity, and communication and reciprocity. The internal consistency of the factors ranged from 0.65 to 0.87, and sensitivity analysis yielded a cut-off value of 18.

The Beck Depression Inventory II (BDI-II) [48] is a self-report questionnaire with 21 multiple-choice questions. Cut-offs for the BDI-II are as follows: 0–13 points no or minimal depressive symptoms, 14–19 points mild, 20–28 points moderate, and 29–63 points severe depressive symptoms. Retest reliability over one week is r = 0.93, with internal consistency in clinical and non-clinical samples of 0.84 ≤ α ≤ 0.94.

The Patient Health Questionnaire (PHQ-9) [49] is a nine-item component of the PHQ. Each item can be scored as 0 (not at all), 1 (several days), 2 (more than half the days), or 3 (nearly every day). Overall, the PHQ-9 score ranges from 0 to 27. Major depression can be diagnosed if one of the items indicates depressed mood and 5 or more items have a score of 2 or higher. Internal reliability was Cronbach’s α = 0.89 in a representative primary care study.

The 20-item Prosopagnosia Index (PI-20) [50] is used to identify prosopagnosia traits. The index is a self-report instrument used to assess experience with face recognition. It is scored using a five-point scale (strongly agree to strongly disagree). The Cronbach’s α of 0.96 shows a high internal consistency of the 20 items. Cut-off scores are 65–74 for mild, 75–84 for moderate and 85–100 for severe developmental prosopagnosia.

The Toronto Alexithymia Scale (TAS-20) [51] is a questionnaire that refers to people who tend to minimize emotional experience, focus attention externally, and have trouble describing and identifying emotions. The TAS-20 uses cut-off scores of ≤ 51 for nonalexithymia and ≥ 61 for alexithymia; scores of 52 to 60 indicate possible alexithymia [52]. For experimental studies, it is recommended to use the 33rd percentile, corresponding to ≤ 45 (threshold for being surely nonalexithymic), and the 66th percentile, corresponding to ≥ 52 (threshold for being alexithymic), to ensure correct group classification [53]. We determined a reliability coefficient of Cronbach’s α = .86 for the TAS-20 from the screening sample (N = 2924).

2.2 Participants

Subjects were recruited via posters and advertisements on social networks. The study procedure and data protection regulations were described in detail to the interested parties. Each subject received financial compensation of 25 Euro for expenses and signed an informed consent form. Subsequently, subjects accessed an online questionnaire [54] in which sociodemographic variables (age, gender, siblings, education), the PHQ-9, and the TAS-20 were collected and severe neurological or psychiatric disorders were queried. Exclusion criteria were insufficient knowledge of the German language, left-handedness, age under 18 or over 50 years, serious medical conditions such as endocrine disorders or coronary heart disease, use of psychotropic drugs, vigilance disorders, substance abuse, visual disorders, neurological disorders (including neuropathy and botulinum toxin use), or psychiatric disorders. The non-alexithymic control group studied here was characterized by a TAS-20 sum score < 45 and originally included 38 participants. For technical reasons, 5 subjects were excluded from this study. The reasons were misplaced Affdex measurement points due to unfavorable lighting conditions or glasses, although not every subject wearing glasses had to be excluded. Thus, 33 participants between the ages of 20 and 42 years (mean age = 25.24, SD = 5.73, SE = 0.99, 22 females, 11 males) were included. The clinically defined thresholds of the AQ-short (cut-off value = 18), BDI-II (cut-off value = 13), PI-20 (cut-off value = 65), PHQ-9 (cut-off value = 9), and TAS-20 (cut-off value = 51) were not exceeded by any of the subjects. The results of the subjects’ psychometric tests are shown in Table 1.

2.3 Stimulus material

The stimulus material consisted of video sequences of adult faces showing the five basic affects (fear, joy, sadness, surprise, anger). Each video began with a neutral face that continuously built up to a maximum affect expression (apex) over 2 seconds, which was then presented for one second. Original portraits of adult individuals were taken from the Karolinska Directed Emotional Faces image set [55]. Deindividualized affect-expressive portraits for each gender and affect (five basic affects and neutral) were developed from the most valid portraits for each affect category [56]. This was realized by digitally overlaying the individual faces, resulting in affect-prototypical facial patterns of the basic affects in a purified form. These averaged affect-prototypical portraits served as visual source material for the creation of video sequences of each basic affect and gender. To this end, a software-based morphing algorithm was used, which generated a naturalistic affect enrichment by interpolating video frames from neutral to maximal affect expression within 2000 ms. The final videos show dynamic sequences of naturalistic, sliding facial affect amplification (2000 ms), followed by a static presentation of the apex of each basic affect (1000 ms). Both the averaged portraits and the dynamic video sequences were created and edited using the software package Abrasoft Fantamorph Deluxe 5. The whole process of stimulus development and the proof of validity of the dynamic stimulus material were described by Müller et al. [57]. There, specific mimic responses could be detected for each basic affect.
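To make the interpolation step concrete, the sketch below generates a 3-second stimulus clip (a 2-second ramp from a neutral frame to an apex frame, followed by a 1-second apex hold). It is only a simplified illustration of the timing logic under our own assumptions: it uses a plain cross-dissolve with OpenCV and hypothetical file names, whereas the actual stimuli were produced with the feature-based morphing of Abrasoft Fantamorph.

```python
import cv2
import numpy as np

# Hypothetical input images; the study used averaged KDEF-based portraits.
neutral = cv2.imread("neutral_female.png").astype(np.float32)
apex = cv2.imread("joy_apex_female.png").astype(np.float32)

fps = 30                 # frame rate of the stimulus video
ramp_frames = 2 * fps    # 2000 ms of dynamic affect build-up
hold_frames = 1 * fps    # 1000 ms static apex presentation

h, w = neutral.shape[:2]
writer = cv2.VideoWriter("joy_stimulus.avi",
                         cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

for i in range(ramp_frames):
    alpha = (i + 1) / ramp_frames                        # 0 -> 1 over 2 s
    frame = cv2.addWeighted(neutral, 1.0 - alpha, apex, alpha, 0.0)
    writer.write(frame.astype(np.uint8))

for _ in range(hold_frames):                             # hold the apex for 1 s
    writer.write(apex.astype(np.uint8))

writer.release()
```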

2.4 Procedure

The study was approved by the Ethics Committee of the Medical Faculty of Heinrich Heine University under the registration number 2016116024. Recruitment, informed consent, and compensation were carried out as described in section 2.2. Before starting the experiment, all participants had to pass simple functional tests checking the reactivity and function of the facial nerve and visual perceptual ability. Subsequently, subjects completed the various psychometric instruments and clinical interviews (TAS-20, BDI-II, SCID, TSIA, PI-20, and AQ-short). Only participants whose test scores were below the defined clinical thresholds were admitted to the study. At the beginning of the experiment, subjects were shown the investigation cabin, and it was explained that affect-expressive faces would be presented as videos while "bodily signals" were measured simultaneously. Participants were told to watch the videos attentively and to empathize with the affects shown without imitating them. The texts and images were presented on a 24-inch TFT screen (AMW) with a resolution of 1920 x 1080 (60 Hz) at a distance of 1 m. Coordination of the experiment and stimulus presentation were controlled using PsychoPy v1.82.01 software [58]. The EMG activity was measured bipolarly with Ag/AgCl miniature electrodes (Easy Cap E220N-CS-120) according to the guidelines of Fridlund and Cacioppo [59]. The electrodes were filled with electrolyte paste and attached to the left and right zygomaticus and corrugator muscle regions. In addition, two reference electrodes were attached to the mastoids, and two further electrodes were attached to the temporal bone region for measurement of the electrooculogram (for later correction of artifacts). To ensure impedances below 10 kΩ [59], the skin of the subjects was cleaned with alcohol and rubbed with an abrasive electrode paste before attaching the electrodes. After these preparations, the experiment was started. The stimulus material was presented for 3 seconds as described above (videos of affect-expressive faces, 2 seconds of affect enrichment, 1 second of apex). A black fixation cross on a white background was presented for a mean inter-stimulus interval of 5 seconds before each video presentation. The videos were presented in randomized order. Each subject watched 40 videos (five affects, two age groups, two genders, two runs). Subjects were filmed throughout the procedure to enable offline Affdex measurement and to monitor their compliance with the instructions (vigilance, attention, involvement). The filming was performed with a digital camera, which took frontal video recordings of the subjects at a distance of 1 m. The resulting videos were first stored locally, later imported into the iMotions software, and analyzed there using Affdex.
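A minimal PsychoPy sketch of one presentation cycle is given below, using hypothetical video file names; it only illustrates the trial structure described above (randomized order, fixation cross during the inter-stimulus interval, 3-second stimulus video). Class names vary between PsychoPy versions, so this is a sketch rather than a reproduction of the original v1.82.01 script.

```python
import random
from psychopy import visual, core
from psychopy.constants import FINISHED

# Hypothetical file names; the experiment presented 40 videos
# (5 affects x 2 age groups x 2 genders x 2 runs).
videos = ["adult_female_joy.avi", "adult_female_anger.avi",
          "adult_male_joy.avi", "adult_male_anger.avi"]
random.shuffle(videos)                    # randomized presentation order

win = visual.Window(size=(1920, 1080), fullscr=True, color="white")
fixation = visual.TextStim(win, text="+", color="black", height=0.1)

for path in videos:
    # inter-stimulus interval: black fixation cross on white background (mean 5 s)
    fixation.draw()
    win.flip()
    core.wait(5.0)

    # 3 s stimulus: 2 s affect build-up followed by 1 s apex
    movie = visual.MovieStim(win, path)
    while movie.status != FINISHED:
        movie.draw()
        win.flip()

win.close()
core.quit()
```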

2.5 Measurement of facial EMG

EMG data were acquired from both sides of the face (left and right) for each muscle (zygomaticus and corrugator). EMG activity during stimulus presentation was recorded digitally with a sampling rate of 2000 Hz (digital polygraph EEG 1100 G; Nihon Kohden). The EMG signal was further processed offline using Brain Vision Analyzer. A high-pass filter at 10 Hz and a low-pass filter at 1000 Hz were applied. A notch filter (50 Hz) was also used to reduce electromagnetic interference. Before the start of the measurements, the subjects were asked to grimace in order to check the function of the measuring chain based on the initial EMG signals. The recorded signals were stored on a hard disk for further offline analyses and parametrization. Two independent reviewers checked the EMG measurements for artifacts (e.g., subject movement, electrode movement, voltage drifts). Subsequently, the EMG signal was rectified and integrated stepwise over 200 ms intervals across 5000 ms. For better comparability with the inter-stimulus interval, 1 s before stimulus presentation was included in the analysis. Because of the dynamic affect build-up during the first 2 s of stimulus presentation and the expected delayed facial response, an additional 1 s after stimulus presentation was also evaluated. For the subsequent analysis, a total of 25 intervals of 200 ms each was used, i.e., 1 s before stimulus presentation, 3 s during stimulus presentation, and 1 s after stimulus presentation. EMG activity was baseline-corrected, with the baseline defined as the mean of the last 1000 ms before stimulus presentation. The preprocessed EMG data were imported into the statistical software package R for further analysis.
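The following sketch illustrates the offline preprocessing chain described above (10 Hz high-pass, 1000 Hz low-pass, 50 Hz notch, rectification, 200 ms integration, baseline correction) for a single-trial EMG trace. The original processing was done in Brain Vision Analyzer; the SciPy-based version below, including the filter orders and notch Q factor, is our own assumption and only approximates the reported steps.

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

FS = 2000                 # sampling rate in Hz
WIN = int(0.2 * FS)       # one 200 ms integration window = 400 samples

def preprocess_emg(raw, fs=FS):
    """Band-limit, notch-filter, and full-wave rectify a raw EMG trace."""
    b, a = butter(4, [10, 999], btype="bandpass", fs=fs)   # 10 Hz - 1 kHz band
    x = filtfilt(b, a, raw)
    b50, a50 = iirnotch(50.0, Q=30.0, fs=fs)                # 50 Hz mains notch
    x = filtfilt(b50, a50, x)
    return np.abs(x)                                         # rectification

def integrate_and_baseline(rectified):
    """Integrate a 5 s epoch into 25 consecutive 200 ms bins and subtract the
    mean of the 1 s pre-stimulus baseline (the first five bins)."""
    n_bins = len(rectified) // WIN                           # 25 bins for 5 s
    integrals = np.array([rectified[i * WIN:(i + 1) * WIN].sum()
                          for i in range(n_bins)])
    baseline = integrals[:5].mean()                          # last 1000 ms before onset
    return integrals - baseline
```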

2.6 Measurement with Affdex

Affdex is a software program for automatic recognition of facial expressions based on the Facial Action Coding System [29]. First, the Viola-Jones algorithm [60] is used to detect faces and mark the area relevant to the facial expression with a rectangular frame. Within this frame, 34 relevant measurement points on the face are identified and marked, and histogram-of-oriented-gradients features are extracted from the relevant measurement area. Using support-vector-machine classifiers trained with 10,000 manually coded facial expressions, percentile ranks are obtained for each facial-expression-relevant movement. Subsequently, affect-expressive facial expressions are inferred from the combination of different facial movements using the Emotional Facial Action Coding System, and percentile ranks are also determined for the occurrence of one of the basic affects (anger, disgust, fear, joy, sadness, surprise, contempt) [61]. The FACS action units lip corner puller (action unit 12 according to FACS) and brow lowerer (action unit 4 according to FACS) examined in this study were renamed "smile" and "brow furrow" by Affectiva. However, we continue to use the official FACS nomenclature in this paper.
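The pipeline stages named above (Viola-Jones face detection, histogram-of-oriented-gradients features, support-vector-machine classification) can be sketched with open-source tools as follows. This is not Affdex's proprietary implementation; the landmark handling is omitted, and the patch size, HOG parameters, and classifier are our own assumptions, with the classifier assumed to be pre-trained.

```python
import cv2
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

# Viola-Jones face detector shipped with OpenCV
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def action_unit_score(frame_bgr: np.ndarray, clf: SVC) -> float:
    """Return a 0-100 score from a (hypothetical) pre-trained action-unit
    classifier for a single video frame; returns 0.0 if no face is found.
    The classifier is assumed to have been trained with probability=True."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return 0.0
    x, y, w, h = faces[0]                                  # first detected face
    roi = cv2.resize(gray[y:y + h, x:x + w], (96, 96))     # normalized face patch
    features = hog(roi, orientations=8, pixels_per_cell=(8, 8),
                   cells_per_block=(2, 2))                 # HOG descriptor
    return 100.0 * clf.predict_proba([features])[0, 1]     # express as a percentage
```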

During the EMG measurement, subjects’ faces were filmed with a video camera (C920 HD Pro Webcam) at a resolution of 1920x1080 (30 frames per second), which was located above the presentation screen and provided frontal footage of the subjects throughout the experiment. Because the camera was turned on a few minutes before the experiment began, the videos were first trimmed to the actual length of the experiment. During initial trial measurements with Affdex, it was noticed that measurement points in the eyebrow area partially jumped onto the EMG electrodes responsible for measuring the corrugator muscle. To avoid erroneous measurements, the electrodes were retouched using video editing software (DaVinci Resolve, Blackmagic Design) in consultation with iMotions technical support. For this purpose, the electrodes were covered with skin-colored patches that were tracked over the entire course of the video in every single frame (frame rate: 30 fps). As a result, the patches reliably masked only the area of the electrodes during facial movements of the subjects and did not affect any areas relevant for measurement.

The resulting videos were then imported into the iMotions software (iMotions version 7.2). Within iMotions, markers were added to the time segments in which the stimulus presentations took place, enabling an assignment to the affect-expressive stimuli. Because the automatic marker import of iMotions often led to inaccurate placement of the markers, the markers were placed manually. To ensure that the markers were placed at the correct times, a small red light behind the subjects, controlled by PsychoPy, was turned off each time a stimulus presentation began. The correct order of the stimulus presentation could be viewed in PsychoPy. The videos were then postprocessed by the Affdex algorithm and the data were exported. The resulting data sets were imported into R (version 4.1.0) for further parameterization and analysis.

2.7 Data reduction and analysis

Rectified individual EMG data were integrated over 200 ms intervals across 5000 ms (1000 ms before stimulus onset to 1000 ms after stimulus termination). The integrals were then averaged across the left and right sides of the face, across female and male stimuli, and across the first and second measurement, resulting in 25 x 200 ms averaged EMG integrals for each affect and subject.

The Affdex data were output in 40 ms intervals. These were first averaged over 200 ms. Subsequently, the data were averaged analogously to the EMG data across female and male stimulus material and across the first and second measurement. For the correlation calculations, the first and second measurement were not averaged. Affdex made no distinction between the left and right half of the face. Measurements of responses to the child stimuli were excluded.
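A short sketch of this temporal down-sampling is shown below, assuming a hypothetical Affdex export with one column per measure (e.g. "Joy", "Lip Corner Puller") and a timestamp column in milliseconds relative to stimulus onset; the 40 ms samples are simply averaged into the same 200 ms grid as the EMG integrals.

```python
import pandas as pd

def to_200ms_bins(affdex_df: pd.DataFrame) -> pd.DataFrame:
    """Average Affdex values exported in 40 ms steps into 200 ms bins.
    'affdex_df' is assumed to contain a 'timestamp_ms' column plus one
    column per Affdex measure (action units and affect probabilities)."""
    values = affdex_df.drop(columns="timestamp_ms")
    values.index = pd.to_timedelta(affdex_df["timestamp_ms"], unit="ms")
    return values.resample("200ms").mean()
```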

Spearman correlations were then calculated between EMG activity and Affdex measurements at each measurement time point. For the presentation of the joy stimulus, the correlations between the zygomaticus muscle and the lip corner puller action unit and the joy probability were calculated. For the presentation of the anger stimulus, the correlations between the corrugator muscle and the brow lowerer action unit and the anger probability were calculated. The correlation coefficients were tested for significance, and the significance level was set at α = 0.05. Because of repeated measures, Benjamini-Hochberg corrections were applied to p-values ≤ 0.05. The Affdex data, the EMG data, and the respective correlations are plotted together in Figs 1–4. Measurement time points at which Affdex and EMG were significantly correlated (α ≤ 0.05) are marked with an asterisk (*). In addition, cross-correlations were calculated between the following time series: zygomaticus muscle (EMG) and lip corner puller (Affdex); zygomaticus muscle (EMG) and joy (Affdex); corrugator muscle (EMG) and brow lowerer (Affdex); corrugator muscle (EMG) and anger (Affdex).
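The per-time-point correlation analysis and the lag analysis can be sketched as follows. The original analysis was carried out in R; the Python version below uses hypothetical array names ('emg' and 'affdex' as subjects x 25 arrays of baseline-corrected EMG integrals and Affdex values for the matching measure and stimulus), applies the Benjamini-Hochberg correction across the 25 time points, and ends with a simple lag loop that mirrors the idea behind the reported cross-correlations rather than reproducing their exact computation.

```python
import numpy as np
from scipy.stats import spearmanr
from statsmodels.stats.multitest import multipletests

def timepoint_correlations(emg: np.ndarray, affdex: np.ndarray, alpha: float = 0.05):
    """Spearman correlation across subjects at each of the 25 measurement points,
    with Benjamini-Hochberg correction of the resulting p-values."""
    rhos, pvals = zip(*(spearmanr(emg[:, t], affdex[:, t])
                        for t in range(emg.shape[1])))
    reject, p_adj, _, _ = multipletests(pvals, alpha=alpha, method="fdr_bh")
    return np.array(rhos), p_adj, reject

def lagged_correlations(emg_mean: np.ndarray, affdex_mean: np.ndarray, max_lag: int = 10):
    """Correlate the grand-average EMG course with the Affdex course shifted by
    0..max_lag bins; a peak at a positive lag means Affdex trails the EMG by
    lag x 200 ms."""
    coeffs = []
    for lag in range(max_lag + 1):
        x = emg_mean[:len(emg_mean) - lag]       # EMG values at time t
        y = affdex_mean[lag:]                    # Affdex values at time t + lag
        coeffs.append(np.corrcoef(x, y)[0, 1])
    return np.array(coeffs)
```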

Fig 1. Stimulus joy, measurement of zygomaticus muscle (EMG) and lip corner puller (Affdex).

Electromyographic activity [μV integrated over 25 x 200 ms intervals (μV x 200 ms) +/- standard error] of the zygomaticus muscle (blue line) and Affdex measurement (% +/- standard error) of the activity of the lip corner puller action unit (red line) in response to video clips of affect-expressing adult faces for the affect joy. Whiskers represent the standard error; the black line represents the Spearman correlation between electromyographic activity and Affdex at each measurement point; the symbol * indicates p ≤ 0.05.

https://doi.org/10.1371/journal.pone.0290569.g001

Fig 2. Stimulus anger, measurement of corrugator muscle (EMG) and brow lowerer action unit (Affdex).

Electromyographic activity [μV integrated over 25 x 200 ms intervals (μV x 200 ms) +/- standard error] of the corrugator muscle (blue line) and Affdex measurement (% +/- standard error) of the brow lowerer action unit (red line) in response to video clips of affect-expressing adult faces for the affect anger. Whiskers represent the standard error; the black line represents the Spearman correlation between electromyographic activity and Affdex at each measurement point; the symbol * indicates p ≤ 0.05.

https://doi.org/10.1371/journal.pone.0290569.g002

Fig 3. Stimulus joy, measurement of zygomaticus muscle (EMG) and affect joy (Affdex).

Electromyographic activity [μV integrated over 25 x 200 ms intervals (μV x 200 ms) +/- standard error] of the zygomaticus muscle (blue line) and Affdex measurement (% +/- standard error) of the affect joy (orange line) in response to video clips of affect-expressing adult faces for the affect joy. Whiskers represent the standard error; the black line represents the Spearman correlation between electromyographic activity and Affdex at each measurement point; the symbol * indicates p ≤ 0.05.

https://doi.org/10.1371/journal.pone.0290569.g003

Fig 4. Stimulus anger, measurement of corrugator muscle (EMG) and affect anger (Affdex).

Electromyographic activity [μV integrated over 25 x 200 ms intervals (μV x 200 ms) +/- standard error] of the corrugator muscle (blue line) and Affdex measurement (% +/- standard error) of the affect anger (orange line) in response to video clips of affect-expressing adult faces for the affect anger. Whiskers represent the standard error; the black line represents the Spearman correlation between electromyographic activity and Affdex at each measurement point; the symbol * indicates p ≤ 0.05.

https://doi.org/10.1371/journal.pone.0290569.g004

3 Results

Fig 1 shows the facial mimicry response of the observers, represented by the course of the EMG activity of the zygomaticus muscle and the probability of the lip corner puller action unit calculated by Affdex during the measurement interval for the joy video. In addition, the course of the Spearman correlation between EMG and Affdex at each measurement time point is shown. Fig 2 shows the course of the EMG activity of the corrugator muscle and the probability of the brow lowerer action unit calculated by Affdex, together with the course of the Spearman correlation between the two measures. Starting with the neutral face and extending to the apex, the EMG activity for both affects is congruent with the affect build-up of the stimuli. Approximately 400 ms after stimulus onset, EMG activity increases in parallel with the increasing affect expression of the stimulus. It reaches its maximum after 2000 ms, which corresponds to the onset of the apex of affect expression in the stimulus videos. In addition to the increase in mean values, an increase in variance is also evident. As expected, presentation of the joy stimulus led to an increase in zygomaticus muscle activity and presentation of the anger stimulus led to an increase in corrugator activity. The curve of the zygomaticus muscle reaches its maximum at approximately 1800 μV x 200 ms and drops to approximately 1200 μV x 200 ms by the end of the measurement interval. The curve of the corrugator muscle rises to a maximum of 1300 μV x 200 ms and drops to approximately 500 μV x 200 ms by the end of the measurement interval.

Table 2 contains the results of the Spearman correlation calculations between the zygomaticus muscle and lip corner puller, including alpha-error-corrected p-values. Table 3 contains the results of the Spearman correlation calculations between the corrugator muscle and brow lowerer, including alpha-error-corrected p-values.

Table 2. Stimulus joy, Spearman correlation between zygomaticus muscle (EMG) and lip corner puller (Affdex).

https://doi.org/10.1371/journal.pone.0290569.t002

Table 3. Stimulus anger, Spearman correlation between corrugator muscle (EMG) and brow lowerer action unit (Affdex).

https://doi.org/10.1371/journal.pone.0290569.t003

The activities determined by Affdex for lip corner puller and brow lowerer follow different courses. The activity of the lip corner puller action unit starts to increase 1200 ms after stimulus onset. The activity increases over a period of 2000 ms to its maximum of 6% (3000 ms after stimulus onset and 1000 ms after the stimulus apex). As with the EMG measurement, the variance increases in addition to the mean values. Thus, both the EMG of the zygomaticus muscle and the Affdex measurement of the lip corner puller action unit show an increase in activity upon presentation of the joy stimulus. However, lip corner puller activity proceeds with a latency of 800 ms compared with the course of the stimulus material and the EMG activity.

The brow lowerer activity calculated by Affdex shows an increase from 6% at stimulus onset to its maximum of 8% approximately 2500 ms after stimulus onset. Unlike the EMG activity, there is no change in variance, suggesting that this is not a stimulus-associated increase in brow lowerer activity but a random fluctuation.

The Spearman correlation between the EMG activity of the zygomaticus muscle and the activity of the lip corner puller action unit calculated by Affdex for the affect joy increases shortly after the onset of stimulus presentation, but without becoming significant. The Spearman correlation coefficients become significant (p ≤ 0.05) from 2200 ms after stimulus onset and increase to a maximum of approximately 0.5 at 3000 ms after stimulus onset. Zygomaticus muscle activity and the lip corner puller activity calculated by Affdex remain significantly correlated until the end of the measurement interval 4000 ms after stimulus onset.

The Spearman correlation between the EMG activity of the corrugator muscle and the brow lowerer activity calculated by Affdex for the affect anger shows no relevant increase. Corrugator muscle activity does not correlate significantly with the brow lowerer activity calculated by Affdex at any time point.

Fig 3 again shows the course of the EMG activity of the zygomaticus muscle, here together with the probability of joy calculated by Affdex and the course of the Spearman correlation between the two measures. Fig 4 shows the course of the EMG activity of the corrugator muscle and the probability of anger calculated by Affdex. The probability of the affect joy calculated by Affdex increases from 0% to 2.5% beginning 1400 ms after stimulus onset. The curve reaches its maximum 500 ms after the apex of the affect expression of the stimulus (2500 ms after stimulus onset). As with the EMG measurement and the action unit measurement, the variance increases in addition to the mean values. Thus, both the EMG of the zygomaticus muscle and the Affdex measure of joy probability show an increase upon presentation of the joy stimulus. However, the joy probability calculated by Affdex proceeds with a latency of 1000 ms compared with the course of the stimulus material and the EMG activity.

Table 4 contains the results of the Spearman correlation calculations between the zygomaticus muscle and joy, including alpha-error-corrected p-values. Table 5 contains the results of the Spearman correlation calculations between the corrugator muscle and anger, including alpha-error-corrected p-values.

Table 4. Stimulus joy, Spearman correlation between zygomaticus muscle (EMG) and affect joy (Affdex).

https://doi.org/10.1371/journal.pone.0290569.t004

Table 5. Stimulus anger, Spearman correlation between corrugator muscle (EMG) and affect anger (Affdex).

https://doi.org/10.1371/journal.pone.0290569.t005

The probability for affect anger calculated by Affdex shows no increase and remains at 0% throughout the stimulus presentation.

The Spearman correlation coefficients between the EMG activity of the zygomaticus muscle and the probability of joy calculated by Affdex during the presentation of the joy video start at a value of -0.2, rise to 0 at 1000 ms after stimulus onset, and increase from 1600 ms onward to a maximum of 0.4 at 2600 ms after stimulus onset. The Spearman correlation coefficients become significant (p ≤ 0.05) at 3400 ms, 3800 ms, and 4000 ms.

The Spearman correlation between the EMG activity of the corrugator muscle and the anger probability calculated by Affdex shows an increase to nearly 0.2 from 3800 ms after stimulus onset.

Fig 5 shows the results of the cross-correlation calculation between the EMG activity of the zygomaticus muscle and the lip corner puller activity determined by Affdex during the presentation of the joy stimulus. The largest cross-correlation coefficient was found at a lag of 5 and was 0.256, indicating a lagged correspondence in the temporal patterns of EMG and Affdex.

Fig 5. Stimulus joy, cross-correlation coefficients between zygomaticus muscle (EMG) and lip corner puller (Affdex).

Cross-correlation coefficients between the EMG activity of the zygomaticus muscle and the lip corner puller activity determined by Affdex during the presentation of the joy stimulus (black vertical lines). The blue dashed line represents the 95% confidence interval.

https://doi.org/10.1371/journal.pone.0290569.g005

Table 6 contains the cross-correlation coefficients of all measurements reported so far. For the cross-correlation between the zygomaticus muscle and the affect joy during the presentation of the joy stimulus, the largest value was also found at a lag of 5; the cross-correlation coefficient here was 0.174. During the presentation of the anger stimulus, the cross-correlation coefficients show no relevant increase.

4 Discussion

This study is the first in which the facial mimicry response of healthy subjects to dynamic affect-enhancing videos (joy and anger) was measured simultaneously using EMG and Affdex to compare the suitability of these two measurement methods.

EMG measurement has been the gold standard for measuring facial mimicry. However, it requires a complex measuring apparatus and experience in the application and interpretation of EMG signals. Affdex promises a time-saving and easy-to-interpret analysis of facial expressions. In addition, measurement electrodes in the face would not be necessary. This could open up a wide range of applications in affect research.

First, the EMG activity of the zygomaticus and corrugator muscles was directly compared with the Affdex action units lip corner puller and brow lowerer, which are based on FACS [43]. This allows a direct comparison of measurement sensitivity, as these action units represent the visible correlates of the underlying target muscles [29]. Second, EMG activity was compared with the affect probabilities measured by Affdex, as Affdex uses additional data beyond the lip corner puller and brow lowerer action units according to EMFACS [39].

We expected comparable results for Affdex and EMG measurements [41, 42]. However, there was also evidence for reduced measurement performance of Affdex for subtle affect expressions, as expected for facial mimicry [40].

In healthy subjects, it has been shown that facial mimicry can be induced by affective stimulus material. The muscle activity of the zygomaticus and corrugator muscles measured by EMG reflected the valence of the presented affects joy and anger [33, 62].

The Affdex measurement for the lip corner puller action unit and the affect joy also showed an increase and a significant correlation with the EMG measurement of the zygomaticus muscle during the presentation of the joy stimulus. However, the rise of the Affdex trace for lip corner puller did not begin until 1200 ms after stimulus onset, approximately 800 ms after the rise of the EMG trace. Sato et al. [63] demonstrated that human FACS coders detect the facial mimicry response to a dynamic happy stimulus 817 (±200) ms after stimulus onset. The Affdex trace for joy rose 200 ms later than the trace for lip corner puller. The Affdex trace for lip corner puller reached its maximum at an average of 6.02% and the Affdex trace for joy at an average of 2.51%. These values correspond to relatively low expressions. Kulke et al. [41] studied a healthy cohort who imitated faces with maximum affect expression. There, Affdex measured a maximum mean of 69.56% for lip corner puller and a maximum mean of 67.53% for joy when imitating joy. Thus, Affdex is generally capable of measuring the facial mimicry response to the joy stimulus; however, its reactivity starts much later. The Affdex measurement for the brow lowerer action unit showed no stimulus-associated change and no significant correlation with the EMG measurement of the corrugator muscle at any time during the measurement. The Affdex trace for the affect anger showed no deflection during stimulus presentation. Higher levels were found in a healthy cohort that imitated anger stimuli [41]: there, Affdex measured a maximum mean of 36.72% for brow lowerer and a maximum mean of 8.88% for anger. Affdex thus performed relatively poorly in our trials when measuring the facial mimicry response to the anger stimulus.

Since the measurements are time series, we also calculated cross-correlations. These additional calculations provided statistical evidence of a lagged correspondence in the temporal patterns of EMG and Affdex.

For the measurements of corrugator muscle, brow lowerer and anger, the cross-correlation coefficients show no relevant increase.

Similar studies have already indicated low sensitivity of automated affect detection for subtle affect expressions [40, 42, 64]. However, hedonic affect could be measured better than anhedonic affect [65], which is consistent with the results of the present study.

It remains unclear why Affdex detects the mimicry response for joy but not for anger, although the EMG measures muscle activity in both cases. Other studies have also shown weaker recognition performance of automated facial coding for anger compared to joy [40, 66]. One reason for this could be that the joy stimulus leads to higher and longer-lasting muscle activity than the anger stimulus. While zygomaticus muscle activity increases to about 1800 μV x 200 ms and remains above 1200 μV x 200 ms until the end of the measurement interval, corrugator muscle activity increases only to a maximum of about 1300 μV x 200 ms and decreases to about 500 μV x 200 ms by the end of the measurement interval. The course of EMG activity could provide clues to the activity of the action units measured by Affdex. It is possible that the muscle activity of the corrugator muscle is not high enough to activate the brow lowerer action unit to an extent that is measurable by Affdex. Other studies have also shown a stronger EMG response to hedonic stimuli than to anhedonic stimuli [42]. Even if EMG activity can be reliably measured, these are quantitatively extremely small increases in activation. It is conceivable that EMG can measure muscle activity that does not result in any visible change in the face.

Another explanation could be the simultaneous application of EMG skin electrodes and the Affdex measurement. Kulke et al. [41] found that when measuring imitated affect with Affdex and simultaneous EMG, the measurement result was only slightly worsened by the EMG electrodes on the face. We observed that the Affdex measurement points that are normally located at the eyebrows jumped over longer time intervals onto the EMG electrodes located on the forehead above the eyebrows. This occurred even though the electrodes did not cover the areas relevant to Affdex. Therefore, to achieve consistently functioning Affdex measurements in this study, the EMG electrodes were retouched in the video footage of the subjects after consultation with the iMotions support team. After retouching, the measurement points were consistently located on the eyebrows, so that the Affdex measurement should no longer have been affected. It is, of course, conceivable that the retouching obscured needed cues and thus impaired Affdex's anger and brow lowerer detection in particular. In that case, this study is not suitable for assessing Affdex's performance in measuring brow lowerer activity. The measurement points responsible for measuring the lip corner puller action unit and joy were in the correct positions throughout the measurement interval.

In conducting this study, disadvantages of Affdex compared to EMG were also noticed in addition to the advantages mentioned above. A total of 5 subjects were excluded due to incorrect measurements by Affdex caused by unfavorable lighting conditions or glasses. To prevent further measurement errors, the EMG electrodes had to be retouched as described above, which was technically challenging and very time-consuming. We performed the Affdex measurements on post-processed videos. The import of stimulus markers provided by iMotions for this procedure sometimes resulted in temporally offset markers. As a result, the markers had to be inserted manually to accurately mark the times at which stimuli were presented. This procedure was also very time-consuming.

Technical improvements could resolve these problems and significantly improve the application.

As mentioned earlier, retouching the electrodes was time-consuming. Future studies should either not use electrodes when measuring simultaneously with Affdex or place and cover them on the face in such a way that they do not interfere with the Affdex measurement.

Future studies could additionally check the subject videos with human FACS raters. This type of validity check would be very time-consuming but would clarify whether Affdex does not detect changes in mimic musculature that are visible to humans.

The present study focused on measuring the facial mimicry of the most commonly studied affects, joy and anger. Future studies could investigate other affects such as fear, disgust, sadness, and surprise.

4.1 Conclusion

The present study demonstrates that Affdex can measure a facial mimicry response for the affect joy. Despite the delayed detection compared to the established EMG measurement, Affdex shows a valid performance. Nevertheless, it still does not match the highly sensitive EMG and therefore needs further improvement for measuring subtle affect expressions. It remains unclear how well Affdex detects the facial mimicry response to an anger stimulus, because in this study the electrodes measuring the corrugator muscle probably interfered with the Affdex measurement. Should the measurement performance of Affdex improve significantly in the future and enable the measurement of subtle affect expressions, it could develop into a promising measurement instrument with a broad range of applications. Especially naturalistic experimental settings that require non-contact measurement of affective responses could benefit from Affdex. However, EMG was superior in capturing the temporal and dynamic course characteristics of affect-expressive mimicry, at least for the basic affects studied here. EMG thus remains the gold standard for measuring facial mimicry.

Acknowledgments

We would like to thank Lotte Wagner-Douglas, Claudius Rehagel and Alexandra Schwatlo for help with data collection and data analysis.

References

  1. 1. Hess U, Philippot P, Blairy S. Mimicry- Facts and fiction. The social context of nonverbal behavior. 1999:213–41.
  2. 2. Seibt B, Mühlberger A, Likowski KU, Weyers P. Facial mimicry in its social setting. Frontiers in psychology. 2015; 6:1122. Epub 2015/08/11. pmid:26321970.
  3. 3. Dimberg U, Thunberg M. Rapid facial reactions to emotional facial expressions. Scand J Psychol. 1998; 39:39–45. pmid:9619131.
  4. 4. Dimberg U, Thunberg M, Elmehed K. Unconscious facial reactions to emotional facial expressions. Psychological Science. 2000; 11:86–9. pmid:11228851.
  5. 5. Bargh JA, Chartrand TL. The Unbearable Automaticity of Being. American Psychologist. 1999; 54(7):228–49.
  6. 6. Neumann R, Schulz SM, Lozo L, Alpers GW. Automatic facial responses to near-threshold presented facial displays of emotion: imitation or evaluation. Biol Psychol. 2014; 96:144–9. Epub 2013/12/24. pmid:24370542.
  7. 7. Lundqvist LO. Facial EMG reactions to facial expressions: a case of facial emotional contagion. Scand J Psychol. 1995; 36:130–41. pmid:7644897.
  8. 8. Niedenthal PM, Barsalou LW, Winkielman P, Krauth-Gruber S, Ric F. Embodiment in attitudes, social perception, and emotion. Pers Soc Psychol Rev. 2005; 9:184–211. pmid:16083360.
  9. 9. Niedenthal PM, Wood A, Rychlowska M, Korb S, editors. Embodied Simulation in Decoding Facial Expression.In Fernández-Dols J.-M& Russel J. A.(Eds.), The Science of facial expression (pp. 397–414). Oxford University Press; 2017.
  10. 10. Oberman LM, Winkielman P, Ramachandran VS. Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions. Soc Neurosci. 2007; 2:167–78. pmid:18633815.
  11. 11. Avenanti Alessio. Blocking facial mimicry affects recognition of facial and body expressions. PLOS ONE. 2020. pmid:32078668
  12. Blairy S, Herrera P, Hess U. Mimicry and the Judgement of Emotional Facial Expressions. Journal of Nonverbal Behavior. 1999; 23:5–41.
  13. Gump BB, Kulik JA. Stress, Affiliation, and Emotional Contagion. Journal of Personality and Social Psychology. 1997; 72:305–19. pmid:9107002
  14. Hess U, Blairy S, Philippot P. Mimicry: Facts and Fiction. In: Hess U, Philippot P, Blairy S, Feldmann R, Coats E, editors. The social context of nonverbal behavior. Cambridge University Press; 1999. pp. 213–41.
  15. Cappella JN. Mutual influence in expressive behavior: Adult–adult and infant–adult dyadic interaction. Psychological Bulletin. 1981; 89:101–32. pmid:7232607
  16. Lanzetta JT, Englis BG. Expectations of cooperation and competition and their effects on observers’ vicarious emotional responses. Journal of Personality and Social Psychology. 1989; 56:543–54.
  17. Seibt B, Weyers P, Likowski KU, Pauli P, Mühlberger A, Hess U. Subliminal Interdependence Priming Modulates Congruent and Incongruent Facial Reactions to Emotional Displays. Social Cognition. 2013; 31:613–31.
  18. Hatfield E, Cacioppo JT, Rapson RL. Emotional contagion. 1st ed. Cambridge: Cambridge University Press; 1994.
  19. Drimalla H, Landwehr N, Hess U, Dziobek I. From face to face: the contribution of facial mimicry to cognitive and emotional empathy. Cognition & Emotion. 2019; 33:1672–86. Epub 2019/03/21. pmid:30898024.
  20. Hess U, Fischer A. Emotional Mimicry: Why and When We Mimic Emotions. Social and Personality Psychology Compass. 2014; 8:45–57.
  21. Chartrand TL, Bargh JA. The Chameleon Effect: The Perception-Behavior Link and Social Interaction. Journal of Personality and Social Psychology. 1999:893–910. pmid:10402679
  22. Baumeister RF, Leary MR. The Need to Belong: Desire for Interpersonal Attachments as a Fundamental Human Motivation. Psychological Bulletin. 1995:497–529. pmid:7777651
  23. Hinsz VB, Tomhave JA. Smile and (Half) the World Smiles with You, Frown and You Frown Alone. Personality and Social Psychology Bulletin. 1991:586–92.
  24. Dimberg U. Facial reactions to facial expressions. Psychophysiology. 1982; 19:643–7. pmid:7178381.
  25. Larsen JT, Norris CJ, Cacioppo JT. Effects of positive and negative affect on electromyographic activity over zygomaticus major and corrugator supercilii. Psychophysiology. 2003; 40:776–85. pmid:14696731.
  26. Vrana SR. The psychophysiology of disgust: differentiating negative emotional contexts with facial EMG. Psychophysiology. 1993; 30:279–86. pmid:8497557.
  27. Wingenbach TSH, Brosnan M, Pfaltz MC, Peyk P, Ashwin C. Perception of Discrete Emotions in Others: Evidence for Distinct Facial Mimicry Patterns. Sci Rep. 2020; 10:4692. Epub 2020/03/13. pmid:32170180.
  28. Yoshimura S, Sato W, Uono S, Toichi M. Impaired overt facial mimicry in response to dynamic facial expressions in high-functioning autism spectrum disorders. J Autism Dev Disord. 2015; 45:1318–28. pmid:25374131.
  29. Ekman P, Friesen WV. Facial Action Coding System (FACS). 1978 [updated 2 Dec 2021; cited 15 Dec 2021]. Available from: https://psycnet.apa.org/doiLanding?doi=10.1037%2Ft27734-000.
  30. Friesen WV, Ekman P. EMFACS-7: Emotional facial action coding system, Version 7. Unpublished manuscript. 1984.
  31. Oberman LM, Winkielman P, Ramachandran VS. Slow echo: facial EMG evidence for the delay of spontaneous, but not voluntary, emotional mimicry in children with autism spectrum disorders. Dev Sci. 2009; 12:510–20. pmid:19635079.
  32. McIntosh DN, Reichmann-Decker A, Winkielman P, Wilbarger JL. When the social mirror breaks: deficits in automatic, but not voluntary, mimicry of emotional facial expressions in autism. Dev Sci. 2006; 9:295–302. pmid:16669800.
  33. Franz M, Nordmann MA, Rehagel C, Schäfer R, Müller T, Lundqvist D. It is in your face-Alexithymia impairs facial mimicry. Emotion. 2021; 21:1537–49. Epub 2021/11/18. pmid:34793185.
  34. Argaud S, Delplanque S, Houvenaghel J-F, Auffret M, Duprez J, Vérin M, et al. Does Facial Amimia Impact the Recognition of Facial Emotions? An EMG Study in Parkinson’s Disease. PLOS ONE. 2016; 11:e0160329. Epub 2016/07/28. pmid:27467393.
  35. Varcin KJ, Bailey PE, Henry JD. Empathic deficits in schizophrenia: the potential role of rapid facial mimicry. J Int Neuropsychol Soc. 2010; 16:621–9. Epub 2010/04/07. pmid:20374674.
  36. Sestito M, Umiltà MA, de Paola G, Fortunati R, Raballo A, Leuci E, et al. Facial reactions in response to dynamic emotional stimuli in different modalities in patients suffering from schizophrenia: a behavioral and EMG study. Front Hum Neurosci. 2013; 7:368. Epub 2013/07/23. pmid:23888132.
  37. Matzke B, Herpertz SC, Berger C, Fleischer M, Domes G. Facial reactions during emotion recognition in borderline personality disorder: a facial electromyography study. Psychopathology. 2014; 47:101–10. Epub 2013/09/07. pmid:24021701.
  38. Wexler BE, Levenson L, Warrenburg S, Price LH. Decreased Perceptual Sensitivity to Emotion-Evoking Stimuli in Depression. Psychiatry Research. 1993:127–38. pmid:8022947
  39. McDuff D, Mahmoud A, Mavadati M, Amr M, Turcot J, el Kaliouby R. AFFDEX SDK: A Cross-Platform Real-Time Multi-Face Expression Recognition Toolkit. In: Kaye J, Druin A, Lampe C, Morris D, Hourcade JP, editors. Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems. New York, NY, USA: ACM; 2016. pp. 3723–6.
  40. Stöckli S, Schulte-Mecklenbeck M, Borer S, Samson AC. Facial expression analysis with AFFDEX and FACET: A validation study. Behav Res Methods. 2018; 50:1446–60. pmid:29218587.
  41. Kulke L, Feyerabend D, Schacht A. A Comparison of the Affectiva iMotions Facial Expression Analysis Software With EMG for Identifying Facial Expressions of Emotion. Frontiers in Psychology. 2020; 11:329. Epub 2020/02/28. pmid:32184749.
  42. Höfling TTA, Alpers GW, Gerdes ABM, Föhl U. Automatic facial coding versus electromyography of mimicked, passive, and inhibited facial response to emotional faces. Cognition & Emotion. 2021; 35:874–89. Epub 2021/03/25. pmid:33761825.
  43. Senechal T, McDuff D, el Kaliouby R. Facial Action Unit Detection Using Active Learning and an Efficient Non-linear Kernel Approximation. 2015 IEEE International Conference on Computer Vision Workshop (ICCVW). IEEE; 2015. pp. 10–8.
  44. Wittchen HU, Wunderlich U, Gruschwitz S, Zaudig M. SCID: Structured Clinical Interview for DSM-IV Axis I Disorders.
  45. Grabe HJ, Löbel S, Dittrich D, Bagby RM, Taylor GJ, Quilty LC, et al. The German version of the Toronto Structured Interview for Alexithymia: factor structure, reliability, and concurrent validity in a psychiatric patient sample. Compr Psychiatry. 2009; 50:424–30. Epub 2009/01/16. pmid:19683612.
  46. Montebarocci O, Surcinelli P. Correlations between TSIA and TAS-20 and their relation to self-reported negative affect: A study using a multi-method approach in the assessment of alexithymia in a nonclinical sample from Italy. Psychiatry Research. 2018; 270:187–93. pmid:30261408
  47. Freitag CM, Retz-Junginger P, Retz W, Seitz C, Palmason H, Meyer J. Evaluation der deutschen Version des Autismus-Spektrum-Quotienten (AQ) – die Kurzversion AQ-k [Evaluation of the German version of the Autism-Spectrum Quotient (AQ) – the short version AQ-k]. Klinische Psychologie und Psychotherapie. 2007; 36.
  48. Hautzinger M, Keller F, Kühner C. Beck Depressions-Inventar (BDI-II). 2006.
  49. Kroenke K, Spitzer RL, Williams JB. The PHQ-9: validity of a brief depression severity measure. Journal of General Internal Medicine. 2001; 16:606–13. pmid:11556941
  50. Shah P, Gaule A, Sowden S, Bird G, Cook R. The 20-item prosopagnosia index (PI20): a self-report instrument for identifying developmental prosopagnosia. R Soc Open Sci. 2015; 2:140343. Epub 2015/06/24. pmid:26543567.
  51. Bagby R, Parker JD, Taylor GJ. The twenty-item Toronto Alexithymia Scale—I. Item selection and cross-validation of the factor structure. Journal of Psychosomatic Research. 1994; 38:23–32. pmid:8126686
  52. Taylor GJ, Bagby R, Parker JD. Disorders of affect regulation: Alexithymia in medical and psychiatric illness. Cambridge: Cambridge University Press; 1997.
  53. Franz M, Popp K, Schaefer R, Sitte W, Schneider C, Hardt J, et al. Alexithymia in the German general population. Soc Psychiatry Psychiatr Epidemiol. 2008; 43:54–62. Epub 2007/10/12. pmid:17934682.
  54. Leiner DJ. SoSci Survey. 2014. Available online at: https://www.soscisurvey.de (accessed December 13, 2021).
  55. Lundqvist D, Litton JE. The Averaged Karolinska Directed Emotional Faces – AKDEF (CD ROM). Stockholm: Karolinska Institute, Department of Clinical Neuroscience; 1998.
  56. Goeleven E, De Raedt R, Leyman L, Verschuere B. The Karolinska Directed Emotional Faces: A validation study. Cognition and Emotion. 2008; 22:1094–118.
  57. Müller T, Schäfer R, Hahn S, Franz M. Adults’ facial reaction to affective facial expressions of children and adults. Int J Psychophysiol. 2019; 139:33–9. Epub 2019/01/26. pmid:30695699.
  58. Peirce JW. PsychoPy—Psychophysics software in Python. J Neurosci Methods. 2007; 162:8–13. Epub 2007/01/23. pmid:17254636.
  59. Fridlund AJ, Cacioppo JT. Guidelines for Human Electromyographic Research. Psychophysiology. 1986; 23:567–89. pmid:3809364
  60. Viola P, Jones M. Rapid object detection using a boosted cascade of simple features. Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. CVPR 2001. IEEE Comput. Soc; 2001. pp. I-511–I-518.
  61. McDuff D, Mahmoud A, Mavadati M, Amr M, Turcot J, el Kaliouby R. AFFDEX SDK. In: Kaye J, Druin A, Lampe C, Morris D, Hourcade JP, editors. Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems. New York, NY, USA: ACM; 2016. pp. 3723–6.
  62. Dimberg U, Thunberg M, Grunedal S. Facial reactions to emotional stimuli: Automatically controlled emotional responses. Cognition and Emotion. 2002; 16:449–71.
  63. Sato W, Yoshikawa S. Spontaneous facial mimicry in response to dynamic facial expressions. Cognition. 2007; 104:1–18. Epub 2006/06/14. pmid:16780824.
  64. Sato W, Hyniewska S, Minemoto K, Yoshikawa S. Facial Expressions of Basic Emotions in Japanese Laypeople. Frontiers in Psychology. 2019; 10:259. Epub 2019/02/12. pmid:30809180.
  65. Höfling TTA, Gerdes ABM, Föhl U, Alpers GW. Read My Face: Automatic Facial Coding Versus Psychophysiological Indicators of Emotional Valence and Arousal. Frontiers in Psychology. 2020; 11:1388. Epub 2020/06/19. pmid:32636788.
  66. Lewinski P, den Uyl TM, Butler C. Automated facial coding: Validation of basic emotions and FACS AUs in FaceReader. Journal of Neuroscience, Psychology, and Economics. 2014; 7:227–36.