The present work investigates pupillary reactions induced by exposure to faces with different levels of trustworthiness. Participants' (N = 69) pupillary changes were recorded while they viewed white male faces with a neutral expression varying in facial trustworthiness. Results suggest that reward processing and pupil mimicry are relevant mechanisms driving participants' pupil reactions. However, when both factors were included in one statistical model, pupil mimicry appeared to be a stronger predictor of participants' pupil dilation than reward processing. Results are discussed in light of pupillometry evidence.
Observing others' emotions triggers physiological arousal in infants as well as in adults, reflected in dilated pupil sizes. This study is the first to examine parents' and infants' pupil responses to dynamic negative emotional facial expressions. Moreover, the links between pupil responses and negative emotional dispositions were explored among infants and parents. Infants' and one of their parents' pupil responses to negative versus neutral faces were measured via eye tracking in 222 infants (5- to 7-month-olds, n = 77; 11- to 13-month-olds, n = 78; and 17- to 19-month-olds, n = 67) and 229 parents. One parent contributed to the pupil data, whereas both parents were invited to fill in questionnaires on their own and their infant's negative emotional dispositions. Infants did not respond differentially to negative expressions, whereas parents showed stronger pupil responses to negative versus neutral expressions. There was a positive association between infants' and their parents' mean pupil responses, and significant links between mothers' and fathers' stress levels and their infants' pupil responses. We conclude that a direct association between pupil responses in parents and offspring is already observable in infancy in typical development. Stress in parents is related to their infants' pupillary arousal to negative emotions.
Parent-to-child transmission of information processing biases to threat is a potential causal mechanism in the family aggregation of anxiety symptoms and traits. This study is the first to investigate the link between infants' and parents' attention bias to dynamic threat-relevant (versus happy) emotional expressions. Moreover, the associations between infant attention and anxiety dispositions in infants and parents were explored. Using a cross-sectional design, we tested 211 infants in three age groups: 5- to 7-month-olds (n = 71), 11- to 13-month-olds (n = 73), and 17- to 19-month-olds (n = 67), as well as 216 parents (153 mothers). Infant and parental dwell times to angry and fearful versus happy facial expressions were measured via eye tracking. The parents also reported on their anxiety and stress. Ratings of infant temperamental fear and distress were averaged across both parents. Parents and infants tended to show an attention bias for fearful faces, with marginally longer dwell times to fearful versus happy faces. Parents dwelled longer on angry versus happy faces, whereas infants showed an avoidant pattern, with longer dwell times to happy versus angry expressions. There was a significant positive association between infant and parent attention to emotional expressions. Parental anxiety dispositions were not related to their own or their infant's attention bias. No significant link emerged between infants' temperament and attention bias. We conclude that an association between parental and infant attention may already be evident in the early years of life, whereas a link between anxiety dispositions and attention biases may not hold in community samples.
Objectives: For children to understand the emotional behavior of others, the first two steps involve emotion encoding and emotion interpreting, according to the Social Information Processing model. Access to daily social interactions is a prerequisite for a child acquiring these skills, and barriers to communication such as hearing loss impede this access. Therefore, it could be challenging for children with hearing loss to develop these two skills. The present study aimed to understand the effect of prelingual hearing loss on children's emotion understanding by examining how they encode and interpret nonverbal emotional cues in dynamic social situations. Design: Sixty deaf or hard-of-hearing (DHH) children and 71 typically hearing (TH) children (3-10 years old, mean age 6.2 years, 54% girls) watched videos of prototypical social interactions between a target person and an interaction partner. At the end of each video, the target person did not face the camera, rendering their facial expressions out of view to participants. Afterward, participants were asked to interpret the emotion they thought the target person felt at the end of the video. As participants watched the videos, their encoding patterns were examined by an eye tracker, which measured the amount of time participants spent looking at the target person's head and body and at the interaction partner's head and body. These regions were preselected for analyses because they had been found to provide cues for interpreting people's emotions and intentions. Results: When encoding emotional cues, both the DHH and TH children spent more time looking at the head of the target person and at the head of the interaction partner than they spent looking at the body or actions of either person.
Yet, compared with the TH children, the DHH children looked at the target person's head for a shorter time (b = -0.03, p = 0.030), and at the target person's body (b = 0.04, p = 0.006) and at the interaction partner's head (b = 0.03, p = 0.048) for a longer time. The DHH children were also less accurate when interpreting emotions than their TH peers (b = -0.13, p = 0.005), and their lower scores were associated with their distinctive encoding pattern. Conclusions: The findings suggest that children with limited auditory access to the social environment tend to collect visually observable information to compensate for ambiguous emotional cues in social situations. These children may have developed this strategy to support their daily communication. Yet, to fully benefit from such a strategy, these children may need extra support for gaining better social-emotional knowledge.