Probabilistic learning of emotion categories

J Exp Psychol Gen. 2019 Oct;148(10):1814-1827. doi: 10.1037/xge0000529. Epub 2018 Dec 20.

Abstract

Although the configurations of facial muscles that humans perceive vary continuously, we often represent emotions as categories. This suggests that, as in other domains of categorical perception such as speech and color perception, humans become attuned to features of emotion cues that map onto meaningful thresholds for these signals given their environments. However, little is known about the learning processes underlying the representation of these salient social signals. In Experiment 1, we tested the role of statistical distributions of facial cues in the maintenance of an emotion category in both children (6-8 years old) and adults (18-22 years old). Children and adults learned the boundary between neutral and angry expressions when provided with explicit feedback (supervised learning). However, after we exposed participants to different statistical distributions of facial cues, they rapidly shifted their category boundaries for each emotion during a testing phase. In Experiments 2 and 3, we replicated this finding and also tested the extent to which learners are able to track statistical distributions for multiple actors. Not only did participants form actor-specific categories, but the distributions of facial cues also influenced participants' trait judgments about the actors. Taken together, these data are consistent with the view that the way humans construe emotion (in this case, anger) is not only flexible, but reflects complex learning about the distributions of the myriad cues individuals experience in their social environments.

MeSH terms

  • Adolescent
  • Child
  • Cues
  • Emotions / physiology*
  • Facial Expression*
  • Facial Muscles
  • Female
  • Humans
  • Judgment
  • Learning / physiology*
  • Male
  • Social Environment
  • Social Perception*
  • Speech
  • Young Adult