Article

Changes in Computer-Analyzed Facial Expressions with Age

Hyunwoong Ko, Kisun Kim, Minju Bae, Myo-Geong Seo, Gieun Nam, Seho Park, Soowon Park, Jungjoon Ihm and Jun-Young Lee

1 Interdisciplinary Program in Cognitive Science, Seoul National University, Seoul 08826, Korea
2 Department of Psychiatry, SMG-SNU Boramae Medical Center, Seoul National University College of Medicine, Seoul 03080, Korea
3 Dental Research Institute, School of Dentistry, Seoul National University, Seoul 08826, Korea
4 Behavioral Neuroscience Program, School of Medicine, Boston University, Boston, MA 02101, USA
5 Division of Teacher Education, College of Liberal Arts and Interdisciplinary Studies, Kyonggi University, Suwon 16200, Korea
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Sensors 2021, 21(14), 4858; https://doi.org/10.3390/s21144858
Submission received: 17 April 2021 / Revised: 13 July 2021 / Accepted: 15 July 2021 / Published: 16 July 2021
(This article belongs to the Special Issue Emotion Intelligence Based on Smart Sensing)

Abstract

Facial expressions are well known to change with age, but the quantitative properties of facial aging remain unclear. In the present study, we investigated the differences in the intensity of facial expressions between older (n = 56) and younger (n = 113) adults. In laboratory experiments, posed facial expressions were elicited from the participants using stimuli depicting the six basic emotions and a neutral facial expression, and the intensities of their facial expressions were analyzed using a computer vision tool, the OpenFace software. Our results showed that the older adults produced stronger expressions for some negative emotions and for neutral faces. Furthermore, when making facial expressions, older adults used more facial muscles than younger adults across the emotions. These results may help in understanding the characteristics of facial expressions in aging and can provide empirical evidence for other fields involving facial recognition.

1. Introduction

Expression and recognition of emotions through facial expressions are fundamental functions of basic communication. Facial expressions are critical for communicating with one’s surroundings owing to their role in conveying the primary meaning of social information [1,2]. People can communicate and convey their emotions in diverse manners; however, facial expressions can be used in the most flexible way [3]. Investigating how facial movements are controlled and how people recognize others’ facial expressions is therefore an essential way to understand the nature of human beings as social beings, and it can also facilitate emotional functioning.
It has been well established that the skills of expressing and recognizing emotions through facial expressions change with age [4,5]. A previous study presented a variety of facial expressions to older and younger people and examined how they recognized them [6]. Both younger and older people recognized expressions of positive emotion, while older people were less accurate at recognizing negative facial expressions. In addition, the performance of the older group declined for recognizing sad facial expressions but improved for recognizing disgusted facial expressions [7,8,9]. Older people were also more inclined to judge that a person shown smiling felt happy [10]. A recent meta-analysis demonstrated that older adults performed worse on emotional face identification than younger adults [11].
Owing to physical aging, sarcopenia-related changes, such as atrophy of the facial skeleton, malposition of fat pads, and loss of soft tissue, occur most commonly in the areas of the maxilla, mandible, and anterior nasal spine [12]. A previous study showed that human facial aging follows a common pattern of morphological, chronological, and dermatological changes across various biomedical studies [13]. In terms of neuromuscular mechanisms, voluntary facial expressions (i.e., posed facial expressions) of the lower part of the face are predominantly controlled by the opposite hemisphere [14,15,16]. Specifically, aging of the orofacial motor cortex, which is involved in voluntary facial expressions, can cause a decline in cognitive control over the lower part of the face [17,18]. While facial aging is natural and inevitable for most people, multiple studies have suggested that there are markers of facial expression and recognition in neuropathological conditions including epilepsy [19], Parkinson’s disease [20], Alzheimer’s disease [21], and other neurocognitive disorders [22]. Despite this, research identifying the quantitative characteristics of facial aging remains limited.
Posed facial expressions, which are commonly elicited by asking a person to portray another’s facial expression, have distinct characteristics compared with spontaneous facial expressions in terms of the underlying neuromotor systems and display rules. Whereas posed facial expressions are generated cognitively within the pyramidal system, spontaneous facial expressions exhibit independent motor control and are driven by the extrapyramidal system [15,23]. The movements inherent to posed facial expressions tend to display intended emotions in the context of social interactions (i.e., display rules), while spontaneous facial expressions correspond to a primary emotional system [15,24]. Although several studies have pointed out limitations of posed facial expressions, namely their artificiality when produced by actors and their variability across experimental conditions [25,26,27], research leveraging posed facial expressions has clear advantages. In terms of interpretability, posed facial expressions are less ambiguous than spontaneous facial expressions [28] and are also universal across the basic emotions [29]. Such universality has also been confirmed in a recent study of an East Asian population [27]. Since a cumulative literature has studied posed facial expressions [30], they may be expected to be a valid indicator for investigating aging.
Quantitative measurement and analysis of facial expressions have been an active research topic in behavioral science. Among several approaches, the Facial Action Coding System (FACS) [31,32] is the most widely used. Sets of facial muscle movements that represent facial expressions, termed action units (AUs), can help make facial recognition-based analyses more standardized [33]. Since AUs were originally developed from basic emotion theory and were manually rated by highly trained coders, FACS-based AUs long had limited accessibility for standardization. Recently, automated computer vision and multidisciplinary studies of facial expression analysis have emerged [34]. These approaches make large-scale measurement of facial expressions more feasible; nevertheless, research on facial aging has largely remained within three-dimensional (3D) morphometric [13,35] or electromyography (EMG) studies [36,37]. In that regard, little is known about quantitative facial aging.
Given that facial expressions are crucial indicators of human health status [38,39], applying machine learning techniques to facial expressions, as with computer-aided diagnosis (CAD) in biomedical signal processing [40] and medical imaging [41], can contribute to digital health. Such techniques have been used for facial paralysis [42,43], face transplantation [44], pain detection through facial expression [45], and neurologic studies involving autism [46], Turner syndrome [47], and Parkinson’s disease [48]. Since language production and discourse decline with aging [49], identifying the characteristics of facial expressions in older adults is a promising and challenging research area in gerontology, as it could support diagnosis regardless of a patient’s communication skills. Moreover, the uniqueness of facial expressions has motivated continuing work on personal identification for health records [50]; improving the performance of CAD and identification based on facial expressions requires algorithms that provide interpretable results for facial expressions with aging. Although there has been much work on automatic facial expression recognition in computer vision research, the algorithms have been experimentally validated primarily on younger faces. For facial expressions to be better used as digital markers related to aging, quantitative differences in facial changes with aging should be studied.
The aim of this study was to identify the characteristics of facial expressions based on the basic emotion theory and to compare the differences in facial expressions between younger and older adults for each basic emotion and AU, respectively. Additionally, a feature-selection approach was used to identify multivariate patterns of the changes in facial expressions related to aging. Finally, the predictive ability for selected AUs was evaluated.

2. Materials and Methods

2.1. Ethics Statement

This study was approved by the Institutional Review Board of the SMG-SNU Boramae Medical Center (IRB No. 30-2017-63), and all participants provided written informed consent to participate in the study.

2.2. Participants

A total of 61 older adults and 115 younger adults were recruited for this study. The older adults were between 62 and 84 years old and were recruited from the Alzheimer’s disease research center of the SMG-SNU Boramae hospital. Healthy younger participants, aged between 18 and 39 years, were recruited from the university student participant pool. None of the participants had a history of psychiatric disorder, and those with major medical diseases, severe head injury, or visual impairment were excluded from both groups. In particular, none of the older adults met the DSM-IV diagnostic criteria for Alzheimer’s disease or depressive spectrum disorders [51]. All medical judgements were made by a board-certified psychiatrist (J.-Y.L.).
To screen for potential emotion-related problems such as depression, anxiety, and alexithymia, participants were asked to complete self-reported measures: the Beck Depression Inventory (BDI), the Beck Anxiety Inventory (BAI), and the Toronto Alexithymia Scale (TAS). The Korean version of the BDI comprises 21 questions evaluating the severity of depression, with scores ranging from 0 to 63 [52,53]. A higher score indicates more severe depressive symptoms, and the cutoff score is 18 in the Korean version [54]. The Korean version of the BAI uses 21 questions to measure the severity of anxiety, with scores ranging from 0 to 63 [55]. A higher BAI score indicates more severe anxiety symptoms, with a cutoff score of 19 [56]. The twenty-item TAS was developed and validated to measure the severity of alexithymia, with scores ranging from 20 to 100 [57,58]; a cutoff score of 61 was used for the Korean version [59]. The TAS comprises three subscales: difficulty identifying feelings, difficulty describing feelings, and externally oriented thinking. Neither group showed an abnormal level of emotional problems (Table 1).
Since the data of five older adults and two younger adults failed to pass the quality check, 169 of the 176 participants were included in the analysis. Table 1 summarizes the demographic and clinical characteristics of the participants. Significant group differences were found in age, education, left-handedness, BDI scores, and TAS scores. Except for age, these variables were adjusted for in further analyses.

2.3. Procedures

A series of photos depicting the six basic emotions and a neutral facial expression (seven stimuli in total), selected by the researchers from a photograph dataset used in a previous study [60], was presented to the participants. Instructions were given in both verbal and visual form, and the participants were asked to respond verbally to each stimulus. The participants then performed posed facial expressions for the given list of the six basic emotions and the neutral expression. For example, for the happy facial expression, a photograph of a person with a happy face was presented; participants were asked to identify the emotion conveyed and then to “make a happy face for 15 s towards the camera” while being video recorded. The facial stimuli were given once participants fully understood the instructions of the study. Examples of the stimuli are shown in Figure 1. Each facial stimulus was presented for a maximum of 7 s; the researcher moved on to the next stimulus once the participant made a verbal response. Facial expressions were recorded for 15 s per condition, for a total of 105 s per participant.

2.4. Data Acquisition

The participants’ posed facial expressions were video recorded with a Canon EOS 70D DSLR camera with a 50 mm prime lens at 720p resolution and a 60 fps frame rate. The camera was positioned on a fixed stand approximately 120–140 cm above the floor to capture the entire face of each participant. The posed facial expressions were recorded for 15 s after a clear instruction to imitate the previously recognized emotional face.
For each frame of the recorded videos, the presence and intensity of facial action units (AUs) were estimated using OpenFace 2.0, an open-source toolkit for facial behavior analysis that consists of four pipelines: (1) facial landmark detection and tracking, (2) head pose estimation, (3) eye gaze estimation, and (4) facial expression recognition [34]. For analyzing facial expressions, OpenFace 2.0 recognizes facial expressions by detecting AU intensity and presence according to FACS [31]. Rather than covering all the AUs listed in FACS, OpenFace 2.0 offers a subset of 18 AUs learned by cross-dataset training, specifically AUs 01, 02, 04, 05, 06, 07, 09, 10, 12, 14, 15, 17, 20, 23, 25, 26, 28, and 45. The occurrences and intensities of these AUs are estimated using machine learning algorithms; the methods for AU estimation and analysis are described in more detail elsewhere [61]. In the present study, AU intensities were used to derive measures of individual emotional facial expression, and scores for the six basic emotions were constructed according to the Emotional FACS (EMFACS) [62]. EMFACS is based on the FACS, which has proven reliability for the assessment of human facial movements [63,64]. The intensity of each AU was scored as the maximum across all video frames, an approach validated in prior work [65]. Examples of each AU and emotion combination are shown in Table 2.
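To make this step concrete, the per-frame AU intensities that OpenFace 2.0 writes to its output CSV (intensity columns named in the style of AU01_r) can be reduced to per-participant maxima and combined into emotion scores following the Table 2 mapping. The following is a minimal Python sketch; the aggregation rule for emotion scores (the mean of the constituent AU maxima) is an illustrative assumption, not the authors’ stated formula, while the AU-to-emotion mapping itself is taken from Table 2.

```python
import pandas as pd

# EMFACS-style mapping taken from Table 2; the aggregation rule below
# (mean of per-AU maxima) is an illustrative assumption.
EMOTION_AUS = {
    "angry":    ["AU04", "AU05", "AU07", "AU23"],
    "disgust":  ["AU09", "AU15"],
    "fear":     ["AU01", "AU02", "AU04", "AU05", "AU20", "AU26"],
    "happy":    ["AU06", "AU12"],
    "sad":      ["AU01", "AU04", "AU15"],
    "surprise": ["AU01", "AU02", "AU05", "AU26"],
}

def max_au_intensities(csv_path):
    """Per-AU maximum intensity across all frames of one OpenFace CSV.

    OpenFace 2.0 writes one row per video frame; intensity columns are
    named like 'AU01_r' and presence columns like 'AU01_c'.
    """
    frames = pd.read_csv(csv_path)
    frames.columns = frames.columns.str.strip()   # headers may carry spaces
    au_cols = [c for c in frames.columns
               if c.startswith("AU") and c.endswith("_r")]
    return frames[au_cols].max()                  # Series: one max per AU

def emotion_scores(au_max):
    """Combine AU maxima into one score per basic emotion (assumed rule)."""
    return {emotion: float(au_max[[au + "_r" for au in aus]].mean())
            for emotion, aus in EMOTION_AUS.items()}
```

Applied per participant and per recorded condition, this produces the AU-by-condition feature matrix used in the analyses described below.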

2.5. Statistical Analysis

Descriptive statistics for demographic variables were calculated as means and standard deviations. Group differences in AU intensity were compared with correction for multiple comparisons (Bonferroni correction). Chi-squared tests were used to compare categorical variables such as sex and usage of botulinum toxin (botox). The correlation between age and AU intensity was also investigated. To identify multivariate profiles of input features that accurately distinguished the older group, the adaptive least absolute shrinkage and selection operator (LASSO) algorithm was applied to the dataset [66]. The adaptive LASSO, a regularized regression method with an L1-norm penalty [67], is a popular technique for simultaneous estimation and consistent variable selection [66]. It performs regularization and feature selection jointly and provides model interpretability by excluding irrelevant features from the model. L1 regularization, which penalizes redundant model complexity, focuses on the most informative features, thus prevents overfitting of the data, and is supported by well-grounded theoretical analysis [68]. Upon implementing the adaptive LASSO, the regression coefficients of unimportant variables shrink to exactly 0. In that regard, the adaptive LASSO algorithm provided interpretable results related to the older adults. Owing to its high accessibility and low computational complexity compared with other feature selection models, this approach has recently been recommended in the behavioral sciences [69].
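As a concrete illustration of the feature-selection step, the two-step reweighting that defines the adaptive LASSO can be sketched with scikit-learn for a binary older-versus-younger outcome. The function name, the ridge pilot fit, and gamma = 1 are illustrative assumptions; the authors’ exact implementation is not specified in the text.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LogisticRegressionCV
from sklearn.preprocessing import StandardScaler

def fit_adaptive_lasso(X, y, gamma=1.0, n_folds=10, seed=0):
    """Two-step adaptive LASSO sketch for a binary outcome.

    Step 1 fits a pilot ridge (L2) logistic model; step 2 rescales each
    standardized feature by |beta_pilot|**gamma so that the subsequent
    L1 penalty shrinks weakly supported features more aggressively and
    drives their coefficients exactly to zero.
    """
    scaler = StandardScaler().fit(X)
    pilot = LogisticRegression(penalty="l2", max_iter=5000)
    pilot.fit(scaler.transform(X), y)
    w = np.abs(pilot.coef_.ravel()) ** gamma      # adaptive weights

    lasso = LogisticRegressionCV(
        penalty="l1", solver="liblinear", cv=n_folds,
        scoring="neg_log_loss",                   # minimize model deviance
        random_state=seed, max_iter=5000,
    ).fit(scaler.transform(X) * w, y)

    def transform(X_new):
        """Apply the same standardization and weights to new data."""
        return scaler.transform(X_new) * w

    coef = lasso.coef_.ravel() * w                # back to standardized scale
    return lasso, coef, transform
```

Features whose returned coefficients are exactly zero correspond to the variables eliminated by the penalty; the nonzero ones form a multivariate profile analogous to the one reported in Figure 5.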
To avoid overfitting and to evaluate the generalizability of the results from the ML algorithms, 10-fold cross-validation was applied during the variable selection process [70]. First, the data were randomly split into a training set (66.7% of the data) and a test set (33.3% of the data). All the ML models were fitted using the training set, and classifications were made separately on the training and test datasets. The optimal regularization parameter, lambda, was determined across 1000 iterations of 10-fold cross-validation so as to minimize the deviance of the model. Predictions were then made on the test set using the ML models trained on the training set. All reported p-values were adjusted for multiple comparisons.
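Continuing the sketch above, the split-train-evaluate protocol described in this section might look as follows. Here, X, y, and the stratified split are placeholders for the study’s actual data handling, and the single split shown stands in for the 1000-iteration tuning procedure.

```python
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# 2:1 train/test split; lambda is tuned by 10-fold CV inside the
# training set only, then performance is checked on the held-out set.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=1 / 3, stratify=y, random_state=42)

model, coef, transform = fit_adaptive_lasso(X_train, y_train, n_folds=10)
auc = roc_auc_score(y_test, model.predict_proba(transform(X_test))[:, 1])
print(f"held-out ROC AUC = {auc:.3f}; "
      f"{int((coef != 0).sum())} features selected")
```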

3. Results

3.1. The Differences in Facial Expression between the Older Adults and Younger Adults

Figure 2 and Figure 3 show the AU values of the older and younger adults for the neutral and emotional faces. The results, adjusted for multiple comparisons, are presented in Table 3. For AUs 06, 07, 10, 12, and 14, older adults showed higher intensities than younger adults. For AU 45, older adults showed lower intensities than younger adults.
To explore the relationship between age and each AU, a correlation analysis was conducted. The pattern of results was similar to that of the group comparisons (Figure 4). For AUs 06, 07, 10, 12, and 14, positive correlations with age were found, while negative correlations with age were found for AU 45 across the emotions.
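For readers who want to reproduce this style of univariate analysis, a simplified sketch follows. It uses Welch’s t-tests with a Bonferroni-adjusted threshold and Pearson correlations, whereas the reported comparisons were additionally adjusted for covariates; the data-frame layout (columns such as AU06_neu, an older indicator, and an age column) is assumed for illustration.

```python
from scipy import stats

# `df`: pandas DataFrame, one row per participant, holding maximum AU
# intensities per condition (e.g., 'AU06_neu'), a binary 'older' flag,
# and an 'age' column -- all placeholder names.
au_cols = [c for c in df.columns if c.startswith("AU")]
alpha = 0.05 / len(au_cols)                    # Bonferroni-adjusted threshold

for col in au_cols:
    young = df.loc[df["older"] == 0, col]
    old = df.loc[df["older"] == 1, col]
    t, p = stats.ttest_ind(young, old, equal_var=False)   # Welch's t-test
    r, _ = stats.pearsonr(df["age"], df[col])             # age-AU correlation
    if p < alpha:
        print(f"{col}: younger {young.mean():.2f} vs. older {old.mean():.2f}"
              f" (p = {p:.2g}); r(age) = {r:.2f}")
```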

3.2. Feature Selection for Predicting Age

The adaptive LASSO model was implemented to identify features that significantly distinguished the older group among the input variables. Demographics (education, sex, left-handedness, and botox usage), self-reported measures (TAS and BDI), and all AUs were assessed for their ability to classify the older adults. Figure 5 shows the multivariate profile distinguishing the older adults from the other participants in the current study. Demographics and self-reported measures were not significant in the adaptive LASSO result. Among the total of 119 AU features, only 11 remained significant: AU 10 in angry; AUs 02, 10, 14, and 45 in sad; AUs 05 and 14 in surprise; and AUs 06, 10, 20, and 45 in neutral. The receiver operating characteristic (ROC) curve yielded an area under the curve (AUC) of 0.924 for the adaptive LASSO model.

4. Discussion and Conclusions

The purpose of the present study was to investigate the differences in the facial expressions of older and younger adults and to examine, through AUs, how facial muscles contribute to aging effects for the six basic emotions and the neutral facial expression. Across the emotions and AUs, the older adults generally exhibited greater intensity in facial expression than the younger adults, although in some areas the older adults showed lower facial intensity than the younger adults.

4.1. Degenerative Changes in Facial Expression Differences with Age

The main findings show that the older adults had higher AU values than the younger adults for neutral and negative emotions (i.e., angry and sad). A growing body of literature has demonstrated that aging is associated with dramatic reductions in muscle strength (i.e., dynapenia) and motor control [71,72,73]. With advancing age, neuromuscular changes may result in deficits in the voluntary activation of facial muscles [73,74]. In that regard, the facial expressions of older adults can naturally differ from those of younger adults [75].
Given that the cortex, spinal cord, and neuromuscular junction are functionally coupled and jointly influence the voluntary activation of muscle fibers [76], voluntary facial expressions can be examined in light of neurological evidence [77]. For older adults to make facial expressions as intended, therefore, top-down processing in the brain must ensure that motor commands are correctly delivered to the facial muscles. In addition to facial aging due to sarcopenia, this suggests that changes in the motor cortex with aging can cause changes in the facial expressions of older adults [78,79].
Regarding our finding that the older adults expressed strong negative emotions, age differences have been reported between older and younger adults in discriminating negative emotion. A previous study demonstrated that older adults had more difficulty distinguishing low-intensity negative emotions [80]. Older adults may therefore tend to make exaggerated facial expressions because they themselves cannot readily identify low-intensity negative emotions.
Previous studies support the differences in AU intensity between the two groups. For the upper face, namely AUs 06 and 07, the older adults can show greater intensity than the younger adults: increased activity in the orbicularis oculi muscle [81], more deeply set eyes [82], and changes in the eyelids due to poor visual acuity [83] may have affected the changes in upper facial expression. For the lower face (AUs 10, 12, and 14), expression intensity may have been further accentuated by facial contours highlighted through the loss of subcutaneous volume around the nose and mouth in the older adults [84]. For AU 45, in contrast, the older adults showed lower intensities than the younger adults. Age-related changes in eye blinking may explain this result: eyelid kinematics change with aging [85], which is apparently reflected in the decreased AU 45 intensities, since deterioration of the orbicularis oculi muscle can reduce the rate of complete eye closure [86].
As for the adaptive LASSO, the results were similar to the comparisons between the two groups, except for AUs 02, 05, and 20. The increase in AU 02 in the sad condition, as mentioned above, may reflect increased activity in the eyebrow region and strong expression of negative emotion [80,81]. For AU 05 in the surprise condition, the age-related reduction of the muscles involved in eye activity may have weakened the construction of surprised facial expressions [85,86]. For AU 20, aging may lead to relaxation of the lip stretcher owing to decreased muscle mass around the mouth [17,87].

4.2. Limitations and Future Direction

There are several limitations to the current study. First, we employed only posed emotions. Given that the mechanisms of posed and spontaneous facial expressions differ [88], further studies are needed to compare these two distinct kinds of facial expression. Secondly, we did not employ physiological assessment. The OpenFace software, unlike EMG, cannot measure facial muscle activity at a physiological level of sensitivity. However, since the OpenFace library is based on FACS and provides reliable results built on recent technological advances, measurement error is unlikely to be a major problem. In addition, a recent study comparing computer vision and EMG demonstrated only a few differences between the two techniques with respect to assessing overt facial expressions, and found that computer vision outperformed human raters [89]. Thirdly, age was sampled as two discrete groups rather than continuously; future studies should be designed to provide normative data for facial aging with respect to demographics such as age and sex. Lastly, the class imbalance between the younger group and the older group is a potential limitation of the current study. This issue may not be critical if the ratio between the two classes is not too different. An experimental study showed that low class-imbalance ratios do not cause significant performance loss [90]; a class ratio of 40:60, which is similar to that of our study (Table 1), was associated with nearly zero performance loss. Another study, using metabolomics data, showed that the false positive rate can even decrease as the class-imbalance ratio rises, owing to the prevention of over-selection when identifying biomarker features with the LASSO algorithm [91]. Despite these studies, our findings should be interpreted with caution.
Notwithstanding the above limitations, our study has the following strengths. Our findings regarding posed emotions, which require conscious effort of the facial muscles, can serve as evidence for screening individuals who deliberately deceive others, especially in lie detection [92]. In situations where biophysiological assessment is limited, computer vision-based face recognition tools would be beneficial. In clinical settings, our findings can be used to detect frailty and other senile changes in muscle. For computer vision-based facial recognition, our findings may also provide researchers with empirical evidence on the characteristics of the aging human face, which could help in developing services and products for recognizing the faces of older adults. Notably, few facial expression recognition studies to date have compared the characteristics of younger and older faces. Our findings provide interpretable evidence and explainable features for aging faces, which could serve as an important basis for future CAD studies of older people.

4.3. Conclusions

Taken together, the present study is, to our knowledge, the first to investigate the differences in posed facial expressions between older and younger adults using a computer analysis method. Our findings provide evidence on facial expression intensity based on FACS-AU-derived emotional faces. The older adults expressed more intense expressions for neutral and negative emotions than the younger adults and tended to use more muscles when making facial expressions, although in some parts of the face the older adults showed weaker intensity than the younger adults. Our findings suggest that age-related changes in the muscles around the eyes and mouth can serve as indicators for identifying the aging face. The results of this study were obtained quantitatively from a normal population, which has several strengths compared with previous studies of facial expression based on EMG, 3D morphometry, or subjective rating, and they can serve as a basic methodology for analyzing and identifying the characteristics of facial aging. We hope that the various features of the posed emotions of older adults reported in this study can contribute to other scientific fields concerned with facial expressions, such as criminological research on lie detection, behavioral medicine, and computer vision research based on facial recognition. Future studies are needed to investigate other attributes of facial expressions involving dynamic emotions, natural environments, and diverse groups.

Author Contributions

J.-Y.L. and S.P. (Soowon Park) designed the study; S.P. (Soowon Park) and J.-Y.L. recruited participants and collected facial and clinical data; M.B., M.-G.S., G.N. and J.I. wrote the protocol and performed interpretation of data; H.K. and S.P. (Seho Park) contributed to facial behavioral data analyses and wrote the methodology; K.K. and H.K. undertook statistical data analyses; K.K. and H.K. wrote the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Education through the National Research Foundation of Korea (NRF), grant number NRF-2017R1D1A1A02018479.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki and the protocol was approved by the Institutional Review Board of SMG-SNU Boramae Medical Center (IRB No. 30-2017-63).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

We would like to thank the anonymous reviewers for their time and constructive comments.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Buck, R. Nonverbal communication of affect in children. J. Personal. Soc. Psychol. 1975, 31, 644–653. [Google Scholar] [CrossRef]
  2. Buck, R.W.; Savin, V.J.; Miller, R.E.; Caul, W.F. Communication of affect through facial expressions in humans. J. Personal. Soc. Psychol. 1972, 23, 362–371. [Google Scholar] [CrossRef] [PubMed]
  3. Crivelli, C.; Fridlund, A.J. Facial displays are tools for social influence. Trends Cogn. Sci. 2018, 22, 388–399. [Google Scholar] [CrossRef] [Green Version]
  4. Malatesta, C.Z.; Izard, C.E.; Culver, C.; Nicolich, M. Emotion communication skills in young, middle-aged, and older women. Psychol. Aging 1987, 2, 193–203. [Google Scholar] [CrossRef]
  5. Sullivan, S.; Ruffman, T. Emotion recognition deficits in the elderly. Int. J. Neurosci. 2004, 114, 403–432. [Google Scholar] [CrossRef] [PubMed]
  6. Ebner, N.C.; Johnson, M.K. Young and older emotional faces: Are there age group differences in expression identification and memory? Emotion 2009, 9, 329–339. [Google Scholar] [CrossRef] [Green Version]
  7. Calder, A.J.; Keane, J.; Manly, T.; Sprengelmeyer, R.; Scott, S.; Nimmo-Smith, I.; Young, A.W. Facial expression recognition across the adult life span. Neuropsychologia 2003, 41, 195–202. [Google Scholar] [CrossRef]
  8. MacPherson, S.E.; Phillips, L.H.; Della Sala, S. Age, executive function and social decision making: A dorsolateral prefrontal theory of cognitive aging. Psychol. Aging 2002, 17, 598–609. [Google Scholar] [CrossRef]
  9. Suzuki, A.; Hoshino, T.; Shigemasu, K.; Kawamura, M. Decline or improvement? Age-related differences in facial expression recognition. Biol. Psychol. 2007, 74, 75–84. [Google Scholar] [CrossRef]
  10. Slessor, G.; Miles, L.K.; Bull, R.; Phillips, L.H. Age-related changes in detecting happiness: Discriminating between enjoyment and nonenjoyment smiles. Psychol. Aging 2010, 25, 246–250. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  11. Gonçalves, A.R.; Fernandes, C.; Pasion, R.; Ferreira-Santos, F.; Barbosa, F.; Marques-Teixeira, J. Effects of age on the identification of emotions in facial expressions: A meta-analysis. PeerJ 2018, 6, e5278. [Google Scholar] [CrossRef]
  12. Fedok, F.G. The aging face. Facial Plast. Surg. 1996, 12, 107–115. [Google Scholar] [CrossRef]
  13. Windhager, S.; Mitteroecker, P.; Rupić, I.; Lauc, T.; Polašek, O.; Schaefer, K. Facial aging trajectories: A common shape pattern in male and female faces is disrupted after menopause. Am. J. Phys. Anthropol. 2019, 169, 678–688. [Google Scholar] [CrossRef] [Green Version]
  14. Müri, R.M. Cortical control of facial expression. J. Comp. Neurol. 2016, 524, 1578–1585. [Google Scholar] [CrossRef] [Green Version]
  15. Ross, E.D.; Prodan, C.I.; Monnot, M. Human facial expressions are organized functionally across the upper-lower facial axis. Neuroscientist 2007, 13, 433–446. [Google Scholar] [CrossRef]
  16. Ross, E.D.; Pulusu, V.K. Posed versus spontaneous facial expressions are modulated by opposite cerebral hemispheres. Cortex 2013, 49, 1280–1291. [Google Scholar] [CrossRef] [PubMed]
  17. Bilodeau-Mercure, M.; Kirouac, V.; Langlois, N.; Ouellet, C.; Gasse, I.; Tremblay, P. Movement sequencing in normal aging: Speech, oro-facial, and finger movements. Age 2015, 37, 1–13. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Avivi-Arber, L.; Sessle, B.J. Jaw sensorimotor control in healthy adults and effects of ageing. J. Oral Rehabil. 2018, 45, 50–80. [Google Scholar] [CrossRef] [PubMed]
  19. Balestrini, S.; Lopez, S.M.; Chinthapalli, K.; Sargsyan, N.; Demurtas, R.; Vos, S.; Altmann, A.; Suttie, M.; Hammond, P.; Sisodiya, S.M. Increased facial asymmetry in focal epilepsies associated with unilateral lesions. Brain Commun. 2021, 3, fcab068. [Google Scholar] [CrossRef]
  20. Sonawane, B.; Sharma, P. Review of automated emotion-based quantification of facial expression in Parkinson’s patients. Vis. Comput. 2021, 37, 1151–1167. [Google Scholar] [CrossRef]
  21. Burton, K.W.; Kaszniak, A.W. Emotional experience and facial expression in Alzheimer’s disease. Aging Neuropsychol. Cogn. 2006, 13, 636–651. [Google Scholar] [CrossRef] [PubMed]
  22. Zeghari, R.; König, A.; Guerchouche, R.; Sharma, G.; Joshi, J.; Fabre, R.; Robert, P.; Manera, V. Correlations between facial expressivity and apathy in elderly people with neurocognitive disorders: Exploratory study. JMIR Form. Res. 2021, 5, e24727. [Google Scholar] [CrossRef]
  23. Borod, J.C.; Haywood, C.S.; Koff, E. Neuropsychological aspects of facial asymmetry during emotional expression: A review of the normal adult literature. Neuropsychol. Rev. 1997, 7, 41–60. [Google Scholar] [CrossRef] [PubMed]
  24. Namba, S.; Makihara, S.; Kabir, R.S.; Miyatani, M.; Nakao, T. Spontaneous facial expressions are different from posed facial expressions: Morphological properties and dynamic sequences. Curr. Psychol. 2017, 36, 593–605. [Google Scholar] [CrossRef]
  25. Galati, D.; Scherer, K.R.; Ricci-Bitti, P.E. Voluntary facial expression of emotion: Comparing congenitally blind with normally sighted encoders. J. Personal. Soc. Psychol. 1997, 73, 1363. [Google Scholar] [CrossRef]
  26. Gosselin, P.; Kirouac, G.; Doré, F.Y. Components and recognition of facial expression in the communication of emotion by actors. J. Personal. Soc. Psychol. 1995, 68, 83. [Google Scholar] [CrossRef]
  27. Sato, W.; Hyniewska, S.; Minemoto, K.; Yoshikawa, S. Facial expressions of basic emotions in Japanese laypeople. Front. Psychol. 2019, 10, 259. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  28. Van Der Zant, T.; Nelson, N. Motion increases recognition of naturalistic postures but not facial expressions. J. Nonverbal Behav. 2021, 1–14. [Google Scholar]
  29. Elfenbein, H.A.; Ambady, N. On the universality and cultural specificity of emotion recognition: A meta-analysis. Psychol. Bull. 2002, 128, 203. [Google Scholar] [CrossRef] [Green Version]
  30. Aviezer, H.; Ensenberg, N.; Hassin, R.R. The inherently contextualized nature of facial emotion perception. Curr. Opin. Psychol. 2017, 17, 47–54. [Google Scholar] [CrossRef]
  31. Ekman, P.; Friesen, W. Facial Action Coding System (FACS): Manual; Consulting Psychologists Press: Palo Alto, CA, USA, 1978. [Google Scholar]
  32. Hamm, J.; Kohler, C.G.; Gur, R.C.; Verma, R. Automated facial action coding system for dynamic analysis of facial expressions in neuropsychiatric disorders. J. Neurosci. Methods 2011, 200, 237–256. [Google Scholar] [CrossRef] [Green Version]
  33. Kar, N.B.; Babu, K.S.; Sangaiah, A.K.; Bakshi, S. Face expression recognition system based on ripplet transform type II and least square SVM. Multimed. Tools Appl. 2019, 78, 4789–4812. [Google Scholar] [CrossRef]
  34. Baltrusaitis, T.; Zadeh, A.; Lim, Y.C.; Morency, L.P. Openface 2.0: Facial behavior analysis toolkit. In Proceedings of the 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018), Xi’an, China, 15–19 May 2018; pp. 59–66. [Google Scholar]
  35. Cotofana, S.; Assemi-Kabir, S.; Mardini, S.; Giunta, R.E.; Gotkin, R.H.; Moellhoff, N.; Avelar, L.E.T.; Mercado-Perez, A.; Lorenc, P.Z.; Frank, K. Understanding facial muscle aging: A surface electromyography study. Aesthetic Surg. J. 2021, sjab202. [Google Scholar] [CrossRef]
  36. Bailey, P.E.; Henry, J.D. Subconscious facial expression mimicry is preserved in older adulthood. Psychol. Aging 2009, 24, 995–1000. [Google Scholar] [CrossRef]
  37. Labuschagne, I.; Pedder, D.J.; Henry, J.D.; Terrett, G.; Rendell, P.G. Age differences in emotion regulation and facial muscle reactivity to emotional films. Gerontology 2020, 66, 74–84. [Google Scholar] [CrossRef]
  38. Wang, F.; Chen, H.; Kong, L.; Sheng, W. Real-time facial expression recognition on robot for healthcare. In Proceedings of the 2018 IEEE International Conference on Intelligence and Safety for Robotics (ISR), Shenyang, China, 24–27 August 2018; pp. 402–406. [Google Scholar]
  39. Stephen, I.D.; Hiew, V.; Coetzee, V.; Tiddeman, B.P.; Perrett, D.I. Facial shape analysis identifies valid cues to aspects of physiological health in Caucasian, Asian, and African populations. Front. Psychol. 2017, 8, 1883. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  40. Khan, M.A.; Kim, Y. Cardiac arrhythmia disease classification using LSTM deep learning approach. CMC Comput. Mater. Contin. 2021, 67, 427–443. [Google Scholar]
  41. Giger, M.L.; Suzuki, K. Computer-aided diagnosis. In Biomedical Information Technology; Academic Press: Cambridge, MA, USA, 2008; pp. 359–374. [Google Scholar]
  42. Parra-Dominguez, G.S.; Sanchez-Yanez, R.E.; Garcia-Capulin, C.H. Facial paralysis detection on images using key point analysis. Appl. Sci. 2021, 11, 2435. [Google Scholar] [CrossRef]
  43. Guarin, D.L.; Yunusova, Y.; Taati, B.; Dusseldorp, J.R.; Mohan, S.; Tavares, J.; Jowett, N. Toward an automatic system for computer-aided assessment in facial palsy. Facial Plast. Surg. Aesthetic Med. 2020, 22, 42–49. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  44. Dorante, M.I.; Kollar, B.; Obed, D.; Haug, V.; Fischer, S.; Pomahac, B. Recognizing emotional expression as an outcome measure after face transplant. JAMA Netw. Open 2020, 3, e1919247. [Google Scholar] [CrossRef]
  45. Roy, S.D.; Bhowmik, M.K.; Saha, P.; Ghosh, A.K. An approach for automatic pain detection through facial expression. Procedia Comput. Sci. 2016, 84, 99–106. [Google Scholar] [CrossRef] [Green Version]
  46. De Belen, R.A.J.; Bednarz, T.; Sowmya, A.; Del Favero, D. Computer vision in autism spectrum disorder research: A systematic review of published studies from 2009 to 2019. Transl. Psychiatry 2020, 10, 1–20. [Google Scholar] [CrossRef]
  47. Chen, S.; Pan, Z.X.; Zhu, H.J.; Wang, Q.; Yang, J.J.; Lei, Y.; Li, J.Q.; Pan, H. Development of a computer-aided tool for the pattern recognition of facial features in diagnosing Turner syndrome: Comparison of diagnostic accuracy with clinical workers. Sci. Rep. 2018, 8, 9317. [Google Scholar] [CrossRef] [Green Version]
  48. Jin, B.; Qu, Y.; Zhang, L.; Gao, Z. Diagnosing Parkinson disease through facial expression recognition: Video analysis. J. Med. Internet Res. 2020, 22, e18697. [Google Scholar] [CrossRef]
  49. Ardila, A.; Rosselli, M. Spontaneous language production and aging: Sex and educational effects. Int. J. Neurosci. 1996, 87, 71–78. [Google Scholar] [CrossRef]
  50. Jayanthy, S.; Anishkka, J.B.; Deepthi, A.; Janani, E. Facial Recognition and Verification System for Accessing Patient Health Records. In Proceedings of the 2019 International Conference on Intelligent Computing and Control Systems (ICCS), Madurai, India, 15–17 May 2019; pp. 1266–1271. [Google Scholar]
  51. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders: DSM-IV-TR; American Psychiatric Association: Washington, DC, USA, 2000. [Google Scholar]
  52. Beck, A.T.; Ward, C.H.; Mendelson, M.; Mock, J.; Erbaugh, J. An inventory for measuring depression. Arch. Gen. Psychiatry 1961, 4, 561–571. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  53. Sung, H.; Kim, J.; Park, Y.; Bai, D.; Lee, S.; Ahn, H. A study on the reliability and the validity of Korean version of the Beck Depression Inventory (BDI). J. Korean Soc. Biol. Ther. Psychiatry 2008, 14, 201–212. [Google Scholar]
  54. Lim, S.Y.; Lee, E.J.; Jeong, S.W.; Kim, H.C. The validation study of Beck Depression Scale 2 in Korean version. Anxiety Mood 2011, 7, 48–53. [Google Scholar]
  55. Beck, A.T.; Epstein, N.; Brown, G.; Steer, R.A. An inventory for measuring clinical anxiety: Psychometric properties. J. Couns. Clin. Psychol. 1988, 56, 893–897. [Google Scholar] [CrossRef]
  56. Julian, L.J. Measures of anxiety: State-Trait Anxiety Inventory (STAI), Beck Anxiety Inventory (BAI), and Hospital Anxiety and Depression Scale-Anxiety (HADS-A). Arthritis Care Res. 2011, 63, S467–S472. [Google Scholar] [CrossRef] [Green Version]
  57. Bagby, R.M.; Parker, J.D.A.; Taylor, G.J. The twenty-item Toronto Alexithymia Scale-I. Item selection and cross-validation of the factor structure. J. Psychosom. Res. 1994, 38, 23–32. [Google Scholar] [CrossRef]
  58. Lee, J.Y.; Rim, Y.H.; Lee, H.D. Development and validation of a Korean version of the 20-item Toronto Alexithymia Scale (TAS-20K). J. Korean Neuropsychiatr. Assoc. 1996, 35, 888–899. [Google Scholar]
  59. Seo, S.S.; Chung, U.S.; Rim, H.D.; Jeong, S.H. Reliability and validity of the 20-item Toronto Alexithymia Scale in Korean adolescents. Psychiatry Investig. 2009, 6, 173. [Google Scholar] [CrossRef] [PubMed]
  60. Park, S.; Kim, T.; Shin, S.A.; Kim, Y.K.; Sohn, B.K.; Park, H.J.; Youn, J.H.; Lee, J.Y. Behavioral and neuroimaging evidence for facial emotion recognition in elderly korean adults with mild cognitive impairment, Alzheimer’s disease, and frontotemporal dementia. Front. Aging Neurosci. 2017, 9, 389. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  61. Baltrušaitis, T.; Mahmoud, M.; Robinson, P. Cross-dataset learning and person-specific normalisation for automatic action unit detection. In Proceedings of the 2015 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), Ljubljana, Slovenia, 4–8 May 2015; pp. 1–6. [Google Scholar]
  62. Friesen, W.; Ekman, P. EMFACS-7: Emotional Facial Action Coding System; University of California at San Francisco: San Francisco, CA, USA, 1983; Unpublished manuscript. [Google Scholar]
  63. Sayette, M.A.; Cohn, J.F.; Wertz, J.M.; Perrott, M.A.; Parrott, D.J. A psychometric evaluation of the facial action coding system for assessing spontaneous expression. J. Nonverbal Behav. 2001, 25, 167–185. [Google Scholar] [CrossRef]
  64. Scherer, K.R. Handbook of Methods in Nonverbal Behavior Research; Cambridge University Press: Cambridge, UK, 1985. [Google Scholar]
  65. Olderbak, S.; Hildebrandt, A.; Pinkpank, T.; Sommer, W.; Wilhelm, O. Psychometric challenges and proposed solutions when scoring facial emotion expression codes. Behav. Res. Methods 2014, 46, 992–1006. [Google Scholar] [CrossRef] [Green Version]
  66. Zou, H. The adaptive lasso and its oracle properties. J. Am. Stat. Assoc. 2006, 101, 1418–1429. [Google Scholar] [CrossRef] [Green Version]
  67. Tikhonov, A.N. On the stability of inverse problems. Dokl. Akad. Nauk SSSR 1943, 39, 195–198. [Google Scholar]
  68. Vidaurre, D.; Bielza, C.; Larranaga, P. A survey of L1 regression. Int. Stat. Rev. 2013, 81, 361–387. [Google Scholar] [CrossRef]
  69. McNeish, D.M. Using lasso for predictor selection and to assuage overfitting: A method long overlooked in behavioral sciences. Multivar. Behav. Res. 2015, 50, 471–484. [Google Scholar] [CrossRef]
  70. Lever, J.; Krzywinski, M.; Altman, N. Points of significance: Model selection and overfitting. Nat. Methods 2016, 13, 703–704. [Google Scholar] [CrossRef]
  71. Clark, B.C.; Manini, T.M. Sarcopenia ≠ dynapenia. J. Gerontol. Ser. A 2008, 63, 829–834. [Google Scholar] [CrossRef] [PubMed]
  72. Enoka, R.M.; Christou, E.A.; Hunter, S.K.; Kornatz, K.W.; Semmler, J.G.; Taylor, A.M.; Tracy, B.L. Mechanisms that contribute to differences in motor performance between young and old adults. J. Electromyogr. Kinesiol. 2003, 13, 1–12. [Google Scholar] [CrossRef]
  73. Clark, B.C. Neuromuscular changes with aging and sarcopenia. J. Frailty Aging 2019, 8, 7–9. [Google Scholar] [PubMed]
  74. Klass, M.; Baudry, S.; Duchateau, J. Voluntary activation during maximal contraction with advancing age: A brief review. Eur. J. Appl. Physiol. 2007, 100, 543–551. [Google Scholar] [CrossRef]
  75. Oliviero, A.; Profice, P.; Tonali, P.A.; Pilato, F.; Saturno, E.; Dileone, M.; Ranieri, F.; Di Lazzaro, V. Effects of aging on motor cortex excitability. Neurosci. Res. 2006, 55, 74–77. [Google Scholar] [CrossRef]
  76. Gandevia, S.C. Spinal and supraspinal factors in human muscle fatigue. Physiol. Rev. 2001, 81, 1725–1789. [Google Scholar] [CrossRef]
  77. Manini, T.M.; Clark, B.C. Dynapenia and aging: An update. J. Gerontol. Ser. A 2012, 67, 28–40. [Google Scholar] [CrossRef] [Green Version]
  78. Morecraft, R.J.; Stilwell–Morecraft, K.S.; Rossing, W.R. The motor cortex and facial expression: New insights from neuroscience. Neurol. 2004, 10, 235–249. [Google Scholar] [CrossRef]
  79. Salat, D.H.; Buckner, R.L.; Snyder, A.Z.; Greve, D.N.; Desikan, R.S.; Busa, E.; Morris, J.C.; Dale, A.M.; Fischl, B. Thinning of the cerebral cortex in aging. Cereb. Cortex 2004, 14, 721–730. [Google Scholar] [CrossRef] [Green Version]
  80. Mienaltowski, A.; Johnson, E.R.; Wittman, R.; Wilson, A.T.; Sturycz, C.; Norman, J.F. The visual discrimination of negative facial expressions by younger and older adults. Vis. Res. 2013, 81, 12–17. [Google Scholar] [CrossRef] [Green Version]
  81. Yun, S.; Son, D.; Yeo, H.; Kim, S.; Kim, J.; Han, K.; Lee, S.; Lee, J. Changes of eyebrow muscle activity with aging: Functional analysis revealed by electromyography. Plast. Reconstr. Surg. 2014, 133, 455e–463e. [Google Scholar] [CrossRef]
  82. Hennekam, R.C. The external phenotype of aging. Eur. J. Med. Genet. 2020, 63, 103995. [Google Scholar] [CrossRef]
  83. Moon, J.H.; Oh, Y.H.; Kong, M.H.; Kim, H.J. Relationship between visual acuity and muscle mass in the Korean older population: A cross-sectional study using Korean National Health and Nutrition Examination Survey. BMJ Open 2019, 9, e033846. [Google Scholar] [CrossRef] [Green Version]
  84. Coleman, S.R.; Grover, R. The anatomy of the aging face: Volume loss and changes in 3-dimensional topography. Aesthetic Surg. J. 2006, 26, S4–S9. [Google Scholar] [CrossRef] [Green Version]
  85. Sun, W.S.; Baker, R.S.; Chuke, J.C.; Rouholiman, B.R.; Hasan, S.A.; Gaza, W.; Stava, M.W.; Porter, J.D. Age-related changes in human blinks. Passive and active changes in eyelid kinematics. Investig. Ophthalmol. Vis. Sci. 1997, 38, 92–99. [Google Scholar]
  86. Sforza, C.; Rango, M.; Galante, D.; Bresolin, N.; Ferrario, V.F. Spontaneous blinking in healthy persons: An optoelectronic study of eyelid motion. Ophthalmic Physiol. Opt. 2008, 28, 345–353. [Google Scholar] [CrossRef] [PubMed]
  87. Cecílio, F.; Regalo, S.; Palinkas, M.; Issa, J.; Siéssere, S.; Hallak, J.; Machado-de-Sousa, J.; Semprini, M. Ageing and surface EMG activity patterns of masticatory muscles. J. Oral Rehabil. 2010, 37, 248–255. [Google Scholar] [CrossRef] [PubMed]
  88. Motley, M.T.; Camden, C.T. Facial expression of emotion: A comparison of posed expressions versus spontaneous expressions in an interpersonal communication setting. West. J. Commun. (Incl. Commun. Rep.) 1988, 52, 1–22. [Google Scholar] [CrossRef]
  89. Perusquia-Hernández, M.; Ayabe-Kanamura, S.; Suzuki, K.; Kumano, S. The invisible potential of facial electromyography: A comparison of EMG and computer vision when distinguishing posed from spontaneous smiles. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, New York, NY, USA, 4–9 May 2019. [Google Scholar]
  90. Prati, R.C.; Batista, G.E.; Silva, D.F. Class imbalance revisited: A new experimental setup to assess the performance of treatment methods. Knowl. Inf. Syst. 2015, 45, 247–270. [Google Scholar] [CrossRef]
  91. Fu, G.H.; Yi, L.Z.; Pan, J. LASSO-based false-positive selection for class-imbalanced data in metabolomics. J. Chemom. 2019, 33, e3177. [Google Scholar] [CrossRef]
  92. Avola, D.; Cinque, L.; Foresti, G.L.; Pannone, D. Automatic deception detection in RGB videos using Facial Action Units. In Proceedings of the 13th International Conference on Distributed Smart Cameras, New York, NY, USA, 9–11 September 2019; pp. 1–6. [Google Scholar]
Figure 1. The facial stimuli representing the six basic emotions and the neutral emotion, adapted from [60].
Figure 2. Prevalence of AU values by groups for neutral face. AU, action unit.
Figure 3. Prevalence of emotional AU values by groups for emotional face. AU, action unit.
Figure 4. Correlation plot for age and AUs. AU, action unit; ang, angry; dis, disgust; fea, fear; hap, happy; neu, neutral; sur, surprise. p-values were adjusted for multiple comparisons.
Figure 5. The adaptive LASSO results. AU, action unit; ang, angry; neu, neutral; sur, surprise.
Table 1. Demographic characteristics across the groups.

                          Younger Adults (n = 113)   Older Adults (n = 56)
                          Mean ± SD                  Mean ± SD
Age                       21.9 ± 2.91                72.2 ± 4.72
Education                 14.5 ± 1.10                9.8 ± 4.47
Sex, n (%)
  Male                    57 (50.4)                  27 (48.2)
  Female                  56 (49.6)                  29 (51.8)
Usage of botox, n (%)     2 (1.8)                    1 (1.7)
Left-handed, n (%)        8 (7.1)                    1 (1.7)
BDI                       10.7 ± 6.88                14.4 ± 11.04
BAI                       25.1 ± 4.28                25.3 ± 6.22
TAS                       45.3 ± 10.52               50.3 ± 8.95

Note. Botox, botulinum toxin; BDI, Beck Depression Inventory; BAI, Beck Anxiety Inventory; TAS, Toronto Alexithymia Scale; SD, standard deviation. Bold indicates statistically significant differences.
Table 2. Action unit descriptions and combination of each emotion.

No.   FACS Name              Facial Muscle (Location)
1     Inner brow raiser      Frontalis, pars medialis (U)
2     Outer brow raiser      Frontalis, pars lateralis (U)
4     Brow lowerer           Depressor glabellae, depressor supercilii, corrugator (U)
5     Upper lid raiser       Levator palpebrae superioris (U)
6     Cheek raiser           Orbicularis oculi, pars orbitalis (U)
7     Lid tightener          Orbicularis oculi, pars palpebralis (U)
9     Nose wrinkler          Levator labii superioris alaeque nasi (L)
10    Upper lip raiser       Levator labii superioris, caput infraorbitalis (L)
11    Nasolabial deepener    Zygomaticus minor (L)
12    Lip corner puller      Zygomaticus major (L)
14    Dimpler                Buccinator (L)
15    Lip corner depressor   Depressor anguli oris (triangularis) (L)
17    Chin raiser            Mentalis (L)
20    Lip stretcher          Risorius (L)
23    Lip tightener          Orbicularis oris (L)
25    Lips part              Depressor labii, relaxation of mentalis, orbicularis oris (L)
26    Jaw drop               Masseter; relaxed temporalis and internal pterygoid (L)
45    Blink                  Levator palpebrae superioris, orbicularis oculi (U)

Emotion     AU combination
Angry       04 + 05 + 07 + 23
Disgust     09 + 15
Fear        01 + 02 + 04 + 05 + 20 + 26
Happy       06 + 12
Sad         01 + 04 + 15
Surprise    01 + 02 + 05 + 26

Note. AU, action unit; FACS, facial action coding system; L, lower face; U, upper face.
Table 3. AU comparisons by groups for six basic emotions.

Variables      Younger Adults   Older Adults    Direction   Location   p-Value
               Mean ± SD        Mean ± SD
AU06 (ang)     0.42 ± 0.58      1.36 ± 0.85     Y < O       U          <0.001
AU06 (dis)     0.58 ± 0.54      1.22 ± 0.73     Y < O       U          0.0276
AU06 (neu)     0.05 ± 0.14      0.62 ± 0.45     Y < O       U          <0.001
AU06 (sad)     0.29 ± 0.45      1.11 ± 0.59     Y < O       U          <0.001
AU06 (sur)     0.25 ± 0.50      0.88 ± 0.59     Y < O       U          <0.001
AU07 (neu)     0.94 ± 0.70      1.80 ± 0.83     Y < O       U          <0.001
AU07 (sad)     1.50 ± 0.94      2.37 ± 1.03     Y < O       U          <0.001
AU07 (sur)     1.29 ± 0.94      2.27 ± 0.97     Y < O       U          0.0105
AU10 (ang)     0.43 ± 0.58      1.25 ± 0.61     Y < O       L          <0.001
AU10 (dis)     0.49 ± 0.50      1.11 ± 0.62     Y < O       L          <0.001
AU10 (fea)     0.38 ± 0.49      0.95 ± 0.58     Y < O       L          <0.001
AU10 (neu)     0.03 ± 0.13      0.57 ± 0.47     Y < O       L          <0.001
AU10 (sad)     0.20 ± 0.34      0.95 ± 0.55     Y < O       L          <0.001
AU10 (sur)     0.26 ± 0.46      0.96 ± 0.61     Y < O       L          <0.001
AU12 (ang)     0.38 ± 0.56      1.23 ± 0.83     Y < O       L          <0.001
AU12 (neu)     0.06 ± 0.15      0.43 ± 0.40     Y < O       L          <0.001
AU12 (sad)     0.29 ± 0.43      0.79 ± 0.65     Y < O       L          <0.001
AU14 (ang)     0.41 ± 0.63      1.12 ± 0.81     Y < O       L          0.0255
AU14 (neu)     0.04 ± 0.15      0.31 ± 0.38     Y < O       L          <0.001
AU14 (sad)     0.20 ± 0.41      0.64 ± 0.60     Y < O       L          0.0036
AU45 (hap)     2.09 ± 0.70      1.23 ± 0.63     Y > O       U          0.0029
AU45 (neu)     2.41 ± 0.69      1.22 ± 0.55     Y > O       U          <0.001
AU45 (sad)     1.95 ± 0.75      1.18 ± 0.61     Y > O       U          0.0495
AU45 (sur)     2.34 ± 0.77      1.46 ± 0.73     Y > O       U          0.0022

Note: AU, action unit; bold indicates significant p-values; ang, angry; dis, disgust; fea, fear; hap, happy; neu, neutral; sur, surprise; L, lower face; U, upper face; Y, younger adults; O, older adults. Comparisons were adjusted for covariates; p-values were adjusted for multiple comparisons.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
