Facial expression research over the past 30 years has focused mainly on human subjects, while research on other animals, particularly non-primate mammals, has been neglected despite its importance from a phylogenetic perspective. Recently, however, several studies have systematically demonstrated the presence of facial expressions in mammals such as canids, ungulates, and rodents. Furthermore, evidence indicates that these species may receive facial expressions as communicative signals from conspecifics. This paper reviews research on facial expression in non-primate animals and discusses its significance for understanding the mechanisms and functions of human facial expressions.
Konrad Lorenz proposed that the “baby schema,” a set of infantile physical features such as large eyes and a small nose and mouth, is perceived as cute and induces caretaking behavior (Lorenz, 1943). Although a large number of empirical studies in humans have supported this idea, the baby schema has seldom been studied in the context of evolution. If the baby schema has adaptive value as originally suggested, it may affect perceivers’ cognition in various species in ways similar to humans. This article aims to reconsider the evolutionary origins of baby schema recognition from a comparative cognitive perspective by reviewing human studies and recent non-human primate studies.
People make trait inferences from facial appearance rapidly and spontaneously, and these judgments can have social consequences in a variety of important domains. This article reviews recent research on the perception of facial impressions, focusing on the perceptual determinants, social consequences, and accuracy of impressions such as attractiveness, trustworthiness, dominance, and competence. The review reveals that there is consensus in judgments of facial impressions, although inferences about internal traits and behavioral tendencies based on facial appearance are not always accurate, and that data-driven computational modeling makes it possible to identify the perceptual determinants of facial impressions.
In this paper, the author reviews recent studies on social impressions from faces and the effect of positive expression (smiling) on facial attractiveness. Smiling generally appears to contribute to positive evaluations of facial attractiveness: of 31 studies reviewed, 23 reported a positive effect of smiling on facial attractiveness, while 8 reported no effect. However, it is suggested that a smile is not a requirement (a basic/fundamental component) of facial attractiveness. Second, the author introduces recent theories of interpersonal attraction and discusses the similarities between recent models of facial impressions (Oosterhof & Todorov, 2008; Sutherland et al., 2013) and the fundamental model of interpersonal attraction, namely the importance of instrumentality toward the perceiver’s goals and motivational states (Finkel & Eastwick, 2015). Based on this theoretical discussion, the author proposes that researchers of facial attractiveness need to take into consideration the interaction of three components: the perceiver’s motivational state, facial features, and information about social intentions (emotional expressions and gaze direction).
Viewing others’ facial expressions often evokes congruent facial responses in the observer. This phenomenon, referred to as “spontaneous facial reactions,” is considered to act as “social glue” by fostering affiliation. However, the underlying mechanisms are still under debate. The present article reviews the literature on spontaneous facial reactions and introduces both classical and contemporary theoretical frameworks. Finally, the article examines the subject from a developmental perspective and proposes possible neural mechanisms formed through social interactions during infancy that underlie spontaneous facial reactions.
We present an overview of research into the processing of facial expressions, focusing on older adults’ ability to recognize, perceive, and mimic the facial expressions of others. Although some researchers argue that older adults exhibit a so-called positivity effect in facial expression recognition (a behavioral tendency to be better at processing positive than negative or neutral stimuli), recent findings suggest that the effect is not as robust as previously thought. Instead, it appears that older adults consistently show a compromised ability, compared with young adults, to recognize negative facial expressions such as anger, fear, or sadness. Different theories have been proposed to explain this pattern, but none gives a comprehensive account of the current findings. We argue that during the early stages of facial expression detection, older adults are adept at detecting negative facial expressions and do not show a positivity effect. Compared with expression recognition, facial mimicry (which appears to reflect automatic processing of facial expressions) is more likely to be preserved throughout adulthood. We discuss the dissociation between automatic and conscious processing of facial expressions. We conclude the review by emphasizing the need to investigate further the stages of facial expression processing in older adults.
The effect of posture on impressions of facial expressions of emotion was investigated. Thirty participants were shown pictures of men and women displaying one of four facial expressions (smile, surprise, anger, or neutral) combined with one of eight postures. They rated the pictures using the semantic differential technique. A factor analysis yielded two factors: “Evaluation” and “Activity.” The placement of the pictures in the factor space indicated that impressions of facial expressions were affected by the person’s posture. Moreover, multiple regression analyses indicated that both factors were positively correlated with familiarity with the person.
Smiles are one of the most ubiquitous facial expressions. They are often interpreted as a signalling cue of positive emotion. However, like any other facial expression, smiles can also be voluntarily fabricated, masked, or inhibited to serve different communication goals. This review discusses the automatic identification of smile genuineness. First, emotions and their bodily manifestations are introduced. Second, an overview of the literature on different types of smiles is provided. Afterwards, the different techniques used to investigate smile production are described. These techniques range from human video-coding and bio-signal inspection to novel sensors that, together with automated machine-learning techniques, aim to investigate facial expression characteristics beyond human perception. Next, a general summary of the spatio-temporal shape of a smile is provided. Finally, remaining challenges regarding individual and cultural differences are discussed.
Facial expression images and movies are used as stimuli in many research fields, especially in psychological research. Facial expression databases are helpful for this purpose because they reduce the burden of collecting facial images. Many facial expression databases are available nowadays, though the number suitable for psychological experiments is limited. In this report, I introduce facial expression databases that include a validation process, as these are eligible for use in psychological studies. Validation ensures the quality of a database and provides data that can serve as standards against which experimental results are compared. Recent developments in facial expression databases show that researchers have been trying to include emotions other than the six basic facial expressions and/or to offer stimuli with additional information such as gestures or voices. These trends diversify the methods of validation.
A conventional view holds that a typical facial expression develops through phases of onset, apex, and offset, and that the facial configuration at the peak moment of the expression corresponds directly to an emotional state. However, such a static view may be too simple; affective expressions are dynamic and complex. The present article focuses on empirical evidence demonstrating that facial temporal dynamics are a carrier of information about emotional states. Our study suggests that, contrary to the conventional view, facial expressions are dynamic changes in which multiple components with different spatio-temporal characteristics appear simultaneously. The implications for future research on dynamic facial expressions are discussed.
This paper surveys studies, conducted mainly in the machine learning domain, aimed at building computational models of human cognitive processes involved in facial expression perception. By aggregating sets of labels obtained through online crowdsourcing, the ground truth of each item or instance is estimated while maintaining a psychological perspective. These models are closely connected with theories established in psychology, such as signal detection theory and item response theory. In that respect, the boundary between the machine learning and psychology fields seems to have narrowed considerably. The purpose of this paper is to clarify the relationship between these two research areas in order to promote the development of both.
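The label-aggregation step described in the abstract above can be illustrated with a minimal sketch. The function name, the example data, and the confidence measure are all hypothetical; the simplest aggregation rule, majority voting, is shown here, whereas the studies surveyed typically use more elaborate probabilistic models informed by signal detection or item response theory.

```python
from collections import Counter

def aggregate_labels(annotations):
    """Estimate a 'ground truth' label for each item by majority vote
    over crowdsourced annotations (hypothetical example)."""
    truth = {}
    for item, labels in annotations.items():
        counts = Counter(labels)
        # The most frequent label is taken as the estimated ground truth;
        # its relative frequency serves as a crude confidence score.
        label, n = counts.most_common(1)[0]
        truth[item] = (label, n / len(labels))
    return truth

# Hypothetical crowdsourced emotion labels for three face images
annotations = {
    "face_001": ["happy", "happy", "happy", "surprised", "happy"],
    "face_002": ["angry", "angry", "disgusted", "angry", "angry"],
    "face_003": ["sad", "neutral", "sad", "sad", "neutral"],
}

print(aggregate_labels(annotations))
```

A natural refinement, and one closer to the psychologically grounded models discussed in the paper, is to weight each annotator by an estimated reliability instead of counting every vote equally.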