The present study examined emotional facial perception (happy and angry) in 7-, 9- and 11-year-old children from Caucasian and multicultural environments with an offset task for two ethnic groups of faces (Asian and Caucasian). In this task, participants were required to respond to a dynamic facial expression video when they believed that the first emotion presented had disappeared. Moreover, using an eye-tracker, we evaluated the ocular behavior patterns used to process these different faces. The analyses of offset reaction times do not show an emotional other-race effect (i.e., a facility in discriminating own-race faces over other-race ones) in Caucasian children for Caucasian vs. Asian faces, but an effect of emotion appeared in the oldest children. Furthermore, eye-tracking revealed emotion- and race-related effects on processing strategies that evolve between ages 7 and 11. This study strengthens the case for including eye-tracking in developmental and emotional processing studies, showing that even a “silent” effect can be detected and shrewdly analyzed through objective means.
Emotional facial perception is a complex process that develops in childhood from the earliest days of life. The ability to recognize facial expressions emerges early, at around the age of 7 months [1–3]. Despite this early emergence of emotional skills, facial expression perception continues to improve throughout childhood until about age 14, in parallel with frontal cortical maturation. Kolb, Wilson and Taylor showed that happy faces were perceived as well by 6-year-old children as by adults, whereas other emotional faces were perceived poorly until adolescence, showing the importance of considering the development of each emotional facial expression individually. Thus, these recognition abilities improve considerably with age throughout childhood and (pre)adolescence [5, 6]. Based on several different tasks, studies provide a consensus regarding the developmental trajectory of facial expression recognition: overall, 4- to 5-year-old children performed as well as 6- to 9-year-olds for happiness, but not for sadness, anger, or fear, in that chronological order [7–9]. Surprise and disgust are recognized later, between ages 6 and 10. Nevertheless, this consensus is debatable due to the diversity of tasks [6, 11–13]. Therefore, with age, children are expected to increase their expertise in facial expression processing through automation of emotional interpretation.
In addition to emotional information, which may enrich face processing, the ethnicity of the face can interfere with discrimination. Indeed, it is well established in the literature that other-race faces are more difficult to recognize than own-race ones, the so-called Other-Race Effect (ORE). Different ethnic groups have shown this robust effect, with faster and more accurate recognition of own-race faces [16–24]. This narrowing toward own-race face processing emerges early, during the first year, even from the third month of life [25–33]. Nevertheless, the specialization of face processing depends on early visual experience, with the ORE actually more related to the ethnicity of the child’s main caregivers than to the child’s own. Accordingly, children with parents of different ethnicities, or those who are heavily exposed to more than one ethnic group, develop expertise for more than one group [28, 35, 36]. Thanks to cerebral plasticity, familiarization or training with faces from a new ethnic group allows children to maintain abilities for other races [19, 37, 38], and this flexibility has been observed in 3- to 14-year-old adopted children [20, 39]. Valentine framed the development of face expertise in terms of vectors in a multidimensional perceptual space in which an average of all our experienced faces composes a ‘prototypical face’. Faces are thus encoded as vectors according to their deviation from this prototypical face. Predominant exposure to faces of a specific species, gender, or race early in life causes the dimensions of one’s prototype to become “tuned” towards such faces. Faces close to this prototype are therefore easily categorized. Caroo [41, 42] had already observed this environmental facilitation through a significant own-race advantage in recognition, and a positive effect of interracial experience on the recognition of other-race faces.
As with their discrimination, ethnic familiarity facilitates the processing of emotional facial expressions [43–47]. In their meta-analysis, Elfenbein and Ambady showed that emotions were universally recognized above chance level but that there was a pervasive in-group facilitation. This experience-dependent advantage decreases when participants live in more ethnically diverse regions or report out-group experiences. Moreover, ethnic group may influence emotional processing itself, and children across cultures may display distinct patterns of socio-emotional functioning in early childhood. For example, Chinese children displayed more self-regulation in compliance and delay tasks than North American children did [50–52]. Cross-cultural differences have been demonstrated in social interaction and may affect children’s emotional sensitivity and developmental course. In adults, this cultural particularity was observed by Ishii and collaborators using an offset task, in which participants were required to respond to a dynamic facial expression video when they believed that the first emotion presented had disappeared. Japanese adults perceived the offset of happiness faster than US participants did. The authors interpreted this sensitivity to the disappearance of positive emotional expression as related to an Asian cultural anxiety and sensitivity to others’ expectations [50–52, 55, 56].
Less is known about how faces are processed depending on different developmental, emotional or ethnic factors. However, eye-trackers could constitute a major source of information on facial and emotional processing. Few studies have examined ocular behavior toward faces, and even fewer have considered cultural or emotional factors. Eye-tracking studies during childhood have shown patterns of preferential fixations according to emotional expression, distributed over internal features such as the eyes and mouth. For example, Dollion and collaborators showed that infants looked preferentially at the mouth when processing happy faces, whereas they oriented toward the eye and eyebrow areas for angry and sad faces. It has thus been suggested that our predisposition for face processing also differs across cultures in the strategy employed to extract visual information from faces. Among the rare cultural eye-tracking studies, a difference in fixation patterns between Caucasian and East Asian people has emerged. For example, Liu and collaborators demonstrated that Asian infants (4 to 9 months old) predominantly look around the nose, avoiding a direct gaze toward the eyes. By contrast, Caucasian infants (6 to 10 months old) preferentially look at the eye area to process faces. In adults, Blais and collaborators showed that, to categorize and recognize faces, Caucasians looked at the eye and mouth areas whereas Asians mostly used the central area, around the nose. These data are consistent with a theory previously supported by Kitayama and collaborators, suggesting principally holistic face processing in Asian adults compared to analytic processing in Caucasian adults. Surprisingly, an eye-tracking study [62, 63] revealed that, even though the environment modulates expertise for the types of faces experienced, this social experience does not abolish cultural diversity in eye movement strategies.
In this study, we aimed to analyze reaction times and/or ocular behavior to evaluate whether the development of face processing as a function of emotion and face ethnicity can be observed in children between 7 and 11 years old, an age range sensitive to gains in emotional understanding. To this end, we chose an emotional offset task as developed by Niedenthal et al., which has already been shown to be adaptable to studies of ethnicity effects. In a previous offset study with Asian and Caucasian faces, the analyses of reaction times revealed a clear emotional ORE in Vietnamese children but not in Swiss ‘Multicultural’ children, demonstrating the importance of an integrated environment in face processing. In the present study, we examined whether a larger Swiss Caucasian child population would be sensitive to the ethnicity of faces, through an emotional ORE, and to emotion, through a developmental difference in performance between happy and angry expressions. Based on previous results, we did not expect facilitation in reaction times in this integrated population, but suggested that a developmental emotional effect could emerge in angry face processing, given the developmental course of this emotion.
Using an eye-tracker, we evaluated the ocular behavior patterns used between ages 7 and 11 to process these different faces. Ocular movement analysis could provide finer-grained data depending on the ethnicity and emotion of the face processed. Happy faces should be processed more easily by younger children, in parallel with the development of emotional understanding in this age range. If analytic fixation patterns are already developed, we could observe an interaction between face and area of interest, i.e., the mouth and zygomaticus for happy faces and the eyes and corrugators for angry faces.
A total of 88 children (48 girls, 40 boys) aged 7, 9 and 11 years were recruited in 3 Swiss multicultural public schools in the canton of Geneva and participated in the present experiment (Table 1). Parents had previously signed an informed consent form and completed a socio-economic status questionnaire with questions about cultural origin and familial environment. Children’s birth date and term, country of birth, residence or education since birth, mother tongue and the ethnic types present in the child’s immediate environment (Asian, Caucasian and/or other) were controlled. Children who were themselves neither Asian nor Caucasian, or who lived or had lived in an environment that was neither Asian nor Caucasian, were included in a so-called “multicultural” group. The questionnaire indicated 39 Caucasian and 49 ‘Multicultural’ environments, i.e., neither Caucasian nor Asian children. Seven subjects were excluded from the looking-time analysis due to loss of eye detection by the eye-tracker or recording failures. Thus, 81 children were analyzed for eye-tracking looking times (46 girls, 23 of whom were Caucasian; 35 boys, 14 Caucasian).
Table 1. Left four columns: reaction-time sample; right four columns: eye-tracking looking-time sample.

| Age Group (Mean±SD) | Environment | N | Total (N girls) | Age Group (Mean±SD) | Environment | N | Total (N girls) |
|---|---|---|---|---|---|---|---|
| 7 y.o. (6.68±0.48) | Caucasian | 13 | 25 (14) | 7 y.o. (6.65±0.49) | Caucasian | 12 | 20 (12) |
| 9 y.o. (8.58±0.50) | Caucasian | 11 | 25 (14) | 9 y.o. (8.61±0.50) | Caucasian | 10 | 23 (14) |
| 11 y.o. (10.50±0.62) | Caucasian | 15 | 38 (20) | 11 y.o. (10.50±0.62) | Caucasian | 15 | 38 (20) |
This study was carried out in accordance with the latest revision of the World Medical Association’s Code of Ethics (Declaration of Helsinki), and was approved by the Institutional Review Board at the University of Geneva (Commission d’Ethique FAPSE).
The stimuli presented consisted of 5-second video clips showing gradual changes in the emotional expressions of adult faces in frontal pose. Caucasian face stimuli were constructed from 20 adult face pictures (50% female), selected and standardized as black-and-white pictures subtending 5°×6.8° of visual angle. The same photos had been successfully used to create similar stimuli in previous studies [54, 65–67]. Asian face stimuli were constructed in the same way from 20 adult faces (50% female) selected from the Asian Emotion Database [68, 69]. The Asian database pictures were selected to correspond as closely as possible to the Caucasian faces in terms of size, resolution, contrast and luminance, and were likewise converted to black and white. To create morphed images depicting the continuum between two faces (pictures of a person with happy and angry expressions), the positions of the features in one photograph are moved toward their positions in the other photograph, as if the image lay on a rubber sheet. Each angry- and happy-face picture was paired and morphed progressively from one emotion to the other with morphing software (Morpheus Photo Morpher, version 3.17), producing 500 frames per stimulus pair, afterwards converted into movies at 100 frames per second with ffmpeg (http://ffmpeg.org/) [66, 67]. The 5-second movie clips always showed a full-blown expression of happiness or anger that gradually morphed into the other expression. Four blocks (2 Asian and 2 Caucasian face blocks, counterbalanced) of 20 trials were performed, separated by short breaks.
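As an illustration of the frames-to-movie step, the following Python sketch builds the kind of ffmpeg invocation that turns 500 numbered morph frames into a 5-second, 100 fps clip. The frame naming pattern and output file name are hypothetical; the study only specifies the frame count, the frame rate and the use of ffmpeg.

```python
import subprocess  # only needed when actually invoking ffmpeg

def build_ffmpeg_command(frame_pattern, out_path, fps=100):
    """Assemble an ffmpeg command converting numbered image frames
    into a movie: 500 frames at 100 fps yield a 5-second clip."""
    return [
        "ffmpeg",
        "-framerate", str(fps),  # input frame rate
        "-i", frame_pattern,     # e.g. pair01_frame%03d.png (hypothetical naming)
        "-pix_fmt", "yuv420p",   # widely compatible pixel format
        out_path,
    ]

cmd = build_ffmpeg_command("pair01_frame%03d.png", "pair01_happy_to_angry.mp4")
# To render for real: subprocess.run(cmd, check=True)
```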
The experiment took place in a quiet room at the children’s school where they were tested individually with a computer task. Eye movements were recorded with an eye-tracker SMI RED 250 (SensoMotoric Instruments GmbH, Teltow, Germany). The experiment started with a 9-point calibration phase at different locations covering the whole surface of the screen. This phase was repeated until a satisfactory calibration (less than 2° of deviation in the x and y axes) was achieved. Video clips of adult faces showing one emotion (e.g. Happiness) moving towards another (e.g. Anger) were presented to each child. The children were instructed to report when they no longer perceived the initial first emotional expression by pressing the “Space” button of the computer with their writing hand’s index finger. Each child completed 80 trials divided into 4 blocks for a total of 12 to 15 minutes.
Analyses focused on 1) behavioral offset reaction times, i.e., the time to press the button for the emotional offset evaluation; and 2) eye-tracking recorded mean looking times toward specific areas of the faces (Areas Of Interest, AOIs). AOIs were delineated on the basis of Ekman and Friesen’s Facial Action Coding System, in which the contractions or relaxations of the face are broken down into action units. We used the action units relevant to the emotional expressions of happiness and anger, and defined ‘universal’ AOIs that cover the entire eye or mouth areas for each face regardless of the movements associated with emotional expression in the movies, as in Berdasco-Muñoz, Nazzi and Yeung. In this view, the ‘Eyes AOI’ contained the orbicularis oculi involved in the cheek raise specific to joy (AU6) as well as the eyelid tension and the opening between the upper eyelid and the eyebrows specific to anger (AU5 and AU7). The ‘Mouth AOI’ included the orbicularis oris activated in the closing tension of the lips proper to the expression of anger (AU4 and AU23, respectively). The mouth area also included the zygomaticus major, used to stretch the corners of the lips into a smile (AU12). All AOIs are identical to those shown in Fig 1 and have a size of 1.2 x 3.5°. These extended ‘Eyes’ and ‘Mouth’ areas allowed us to analyze the videos with constant AOIs, as demonstrated in Berdasco-Muñoz, Nazzi and Yeung. Looking times were extracted from the SMI software as Net Dwell Times over the first half of each video morph timeline (2500 ms), in order to evaluate ocular movements during the first emotion offset evaluation, corresponding to response decision making.
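To make the Net Dwell Time measure concrete, here is a minimal Python sketch of how fixation time inside a rectangular AOI over the first 2500 ms could be accumulated. The `Fixation` record, the coordinate units and the AOI bounds are illustrative assumptions; the actual values come from the SMI software.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    t_start: float  # ms from video onset
    t_end: float    # ms from video onset
    x: float        # gaze position (illustrative units)
    y: float

def net_dwell_time(fixations, aoi, window_end=2500.0):
    """Sum fixation durations falling inside a rectangular AOI during the
    first half of the morph (0-2500 ms), mirroring a Net Dwell Time
    measure. `aoi` is (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = aoi
    total = 0.0
    for f in fixations:
        if x0 <= f.x <= x1 and y0 <= f.y <= y1:
            # clip each fixation to the analysis window
            start = max(f.t_start, 0.0)
            end = min(f.t_end, window_end)
            if end > start:
                total += end - start
    return total
```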
Reaction and looking times greater than two standard deviations from the mean were excluded from analysis (<2% of trials). Statistical analyses were conducted using Statistica 13. The significance threshold was .05, and planned contrasts were performed to decompose significant interactions. Effect sizes are given as partial eta-squared (ηp2) for ANOVAs.
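The two-standard-deviation trimming step can be sketched as follows; this is a simple one-pass criterion on a list of times, and the toy values are invented for illustration, not data from the study.

```python
from statistics import mean, stdev

def trim_outliers(values, k=2.0):
    """Keep only values within k standard deviations of the sample mean,
    as applied here to reaction and looking times (k = 2)."""
    m, s = mean(values), stdev(values)
    return [v for v in values if abs(v - m) <= k * s]

# toy reaction times in ms; the 9000 ms trial is an outlier
rts = [2900, 3100, 3050, 2980, 3020, 2950, 3080, 2995, 3025, 9000]
clean = trim_outliers(rts)  # the 9000 ms trial is dropped
```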
A Stimulus Face ethnicity (Asian/Caucasian) x Emotion (Angry to Happy / Happy to Angry) repeated-measures ANOVA was performed on mean RTs for the 88 children, with Age Group (7, 9, 11) and Participant Environment (Caucasian, Multicultural) as between-subject factors.
There is no main effect of Stimulus Face ethnicity (F(1,82) = 1.44, p = .234, ηp2 = .017), nor any interaction with Age Group (F(1,82) = 0.14, p = .867, ηp2 = .003) (Fig 2A) or Participant Environment (F(1,82) = 0.96, p = .33, ηp2 = .012).
There is a trend toward a main effect of Emotion (F(1,82) = 3.54, p = .063, ηp2 = .041), but a significant interaction between Emotion and Age Group is observed (Fig 2B). Indeed, Angry offset reaction times decrease with age more than Happy ones do (F(2,82) = 5.87, p = .0042, ηp2 = .125).
Planned comparisons show that offset perception is significantly shorter for Angry faces than for Happy ones in the 11-year-old group only (2905 vs. 3141 ms, F(1,82) = 17.25, p<.0001). Moreover, the oldest children have shorter reaction times than the youngest ones for the Angry to Happy expression (2905 vs. 3257 ms, F(1,82) = 6.14, p = .015).
An interaction among all factors, Face ethnicity, Emotion, Age Group and Environment, is also observed (F(2,82) = 3.2, p = .046, ηp2 = .072). Contrast analysis shows a significant effect of emotion in 10–12-year-old children, with shorter RTs for Anger offset compared to Happiness offset, particularly for Caucasian-environment children processing Caucasian faces (2782 vs. 3225 ms, F(1,82) = 15.03, p<.001), but not for Multicultural children (3056 vs. 3171 ms, F(1,82) = 1.55, p = .217) or for Caucasian-environment children processing Asian faces (2840 vs. 3017 ms, F(1,82) = 2.44, p = .122).
Looking times were analyzed on the first half of the videos in order to observe ocular behavior 1) depending on the first emotional expression of the morph; and 2) before motor response. The significance threshold was .05; effect sizes are given in partial eta-squared ηp2 for ANOVAs main effects. Planned comparisons were performed for significant interaction contrasts.
A Stimulus Face ethnicity (Asian/Caucasian) x Emotion (Angry to Happy / Happy to Angry) x AOI (Mouth/Eyes) repeated-measures ANOVA, with Age Group (7, 9, 11) and Participant Environment (Caucasian, Multicultural) as between-subject factors, was performed on mean looking times over the first half of the morphs for the 81 children.
We observe a main effect of AOI: the mouth area is looked at longer than the eyes (1047 vs. 812 ms, F(1,75) = 5.37, p = .023, ηp2 = .067), and several interactions are demonstrated.
First, Face ethnicity interacts with children’s Age Group (Fig 2C), with an inversion of looking times between Caucasian and Asian faces appearing in the older group, i.e., faster Caucasian face processing in 10–12-year-old children (F(2,75) = 3.17, p = .048, ηp2 = .078).
The second interaction concerns Face ethnicity and AOI (F(2,75) = 16.32, p = .0001, ηp2 = .178): we observed a significant difference between mouth and eye looking times for Caucasian face processing (1123 vs. 751 ms, F(1,75) = 12.39, p = .0007) that was not observed for Asian faces (972 vs. 872 ms, F(1,75) = 0.84, p = .3618). There is no main effect of children’s Environment (F(1,75) = 2.58, p = .362, ηp2 = .033).
There is no main effect of Emotion on mean looking times (F(1,75) = 1.33, p = .253, ηp2 = .017), nor an interaction between Emotion and Age Group (Fig 2D) (F(1,75) = 0.95, p = .391, ηp2 = .024).
The last interaction involves Face ethnicity, Emotion and AOI (Fig 3) (F(1,75) = 5.68, p = .019, ηp2 = .070). This interaction shows that the Face ethnicity effect is observed particularly in the Anger offset evaluation, with significant differences in AOI looking times (Fig 3). Children looked longer at the Mouth for Caucasian faces than for Asian ones (1132 vs. 931 ms, p<.0001) and longer at the Eyes for Asian faces than for Caucasian ones (890 vs. 763 ms, p<.0001).
Moreover, the processing of Caucasian faces shows a significant difference between Mouth and Eyes AOIs for both emotions (Angry: 1132 vs. 736 ms, F(1,75) = 17.03, p<.001; Happy: 1114 vs. 766 ms, F(1,75) = 1, p = .006).
By contrast, looking times for Asian emotional expressions did not differ between AOIs, although the Happy expression elicited longer looking times toward the mouth than the Angry one did (1012 vs. 931 ms, F(1,75) = 7.63, p = .007).
A second repeated-measures ANOVA was performed on AOI looking-time percentages in order to assess the proportion of looking time devoted to each AOI. We observe an effect of Face ethnicity (F(1,75) = 15, p<.001, ηp2 = .166): Asian faces were scanned equally between the mouth and eye areas, whereas Caucasian faces were scanned more from the mouth (Fig 4A).
The average percentage of looking time to each AOI was calculated across conditions and compared with the percentage expected by chance (.50) using two-tailed t-tests. Results are presented in Fig 4B as the percentage of preferential AOI looking time (% Eyes AOI minus the .50 expected by chance). A positive percentage therefore corresponds to a preference for the Eyes AOI, while a negative percentage corresponds to a preferential orientation towards the Mouth AOI. Results showed that 7-year-old children preferentially looked at the Mouth AOI to process the Angry (-15.2%, p = .005) and Happy (-14.1%, p = .042) expressions of Caucasian faces, whereas no preferential orientation was found for Asian faces.
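The preferential-looking measure and its test against chance can be sketched as follows. The per-child looking times below are invented for illustration, and in practice the p-value would be read from a t distribution with n-1 degrees of freedom (e.g. via `scipy.stats.ttest_1samp`).

```python
from math import sqrt
from statistics import mean, stdev

def eyes_preference(eyes_ms, mouth_ms):
    """Proportion of AOI looking time on the Eyes AOI minus chance (.50):
    positive = eyes preference, negative = mouth preference."""
    return eyes_ms / (eyes_ms + mouth_ms) - 0.50

def one_sample_t(scores, mu=0.0):
    """t statistic of a one-sample test of the mean score against mu."""
    n = len(scores)
    return (mean(scores) - mu) / (stdev(scores) / sqrt(n))

# hypothetical per-child looking times (ms) to the Eyes and Mouth AOIs
pairs = [(700, 1100), (650, 1200), (800, 1000), (720, 1150)]
scores = [eyes_preference(e, m) for e, m in pairs]
t = one_sample_t(scores)  # negative t: gaze biased toward the mouth
```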
This study aimed to evaluate the developmental course of emotional expression discrimination depending on the ethnicity of faces. To this end, 7-, 9- and 11-year-old children in multicultural public schools performed an offset task, in order to detect their sensitivity to facial emotional cues through reaction times and/or ocular behavior.
There was no significant difference in reaction-time performance between Caucasian and Asian emotional expression processing. This result is consistent with a previous study using this task, which showed an emotional other-race effect in Vietnamese children but not in their Swiss counterparts. This observation was made in an integrated Swiss population, in a city composed of 48% foreign residents [71, 72]. The ability to discriminate emotions may be more generalized in people living in an environment with heterogeneous ethnicities, as previously suggested in adults by the meta-analysis of Elfenbein and Ambady. Nevertheless, reaction times show that, even if Happy offset evaluation does not evolve between age groups, Anger offset detection becomes shorter in the older group. This observation is consistent with emotional development, with a stabilized processing of happiness early in development but a later development of the understanding of more complex and less experienced emotions such as anger, which is known to be efficiently processed between ages 8 and 12.
Eye-tracking data were analyzed to probe these observations in depth. We found that gaze behavior towards emotional faces showed several differences depending on Age, Emotion and Face ethnicity. A main finding is the effect of Face ethnicity across age groups. Indeed, we observe a decrease in Caucasian face processing times with age, suggesting the emergence of an eye-tracked emotional Other-Race Effect in children despite its absence in offset reaction times. Caucasian faces were also processed more differentially by AOI, with longer looking times at the mouth area than the eye area, which was not observed for Asian faces. Asian face processing was more oriented toward the mouth area for the Happy evaluation than for the Angry evaluation. This observation, made on net looking times, was supported by the analysis of the percentage of preferential AOI looking times. This finding could be consistent with the study by Kitayama and collaborators, which showed a preferential focal-feature strategy used by Caucasians to process faces, whereas Asian people used more holistic and global strategies, as supported by the findings of Blais and collaborators. Here, it was the Caucasian faces that were analyzed more by preferential features, with Asian faces treated more centrally, even by the same population. However, we also see that, with age, Caucasian faces tend to be processed more equitably among AOIs. An interpretation in terms of holistic processing must therefore be approached with caution, as it may reflect the development of better processing skills for the ocular region. Indeed, the youngest children show a better early use of information from the mouth, associated with earlier processing skills for happy expressions.
Altogether, our results show a developmental course for emotional facial expression, with the later improvement of anger processing compared to the already well-established happy face processing, consistent with the early development of happy emotion preference and understanding [4–6]. It has previously been shown that happiness is correctly conceptualized from age 6, with efficient component processing. Indeed, happiness is not sensitive to the inversion effect, which bypasses holistic treatment; thus, the mouth as an analytic component for smile processing could be sufficient. The eye-tracking data corroborate the preferential orientation towards the mouth area to process happiness. On the other hand, anger did not orient children preferentially toward the eye area and the specific activations of the corrugator, but we saw through reaction times that anger processing is still developing at this period. Angry faces could require holistic processing, which could develop gradually during childhood [57, 58, 61, 75, 76], or a better orienting to the Eyes AOI to process corrugator areas. Nevertheless, our results show that, more than an Asian vs. Caucasian interaction strategy, Kitayama’s results of Asian holistic vs. Caucasian analytic processing could reflect how to optimize Asian or Caucasian face processing. Thus, even Caucasian children would do better to look centrally when processing Asian faces and at specific areas in Caucasian faces. Also, the multicultural environment of the Swiss population could have fostered the emergence of these optimal strategies, while preserving an Other-Race Effect that is behaviorally invisible but psychophysically observable through the eye-tracker. Our study reveals that, in an integrated multicultural environment, the behavioral emotional Other-Race Effect is not systematically found, suggesting that own-race bias can be hidden by varied inter-ethnic experiences.
Nevertheless, as suggested by Kelly and collaborators, even if the environment modulates expertise for the types of faces experienced, this social experience does not seem to abolish cultural diversity in eye movement strategies. Indeed, even though a behavioral emotional ORE was not found in our sample, the eye-tracker revealed a silent ORE through ocular movements. This tool allowed us to observe a developmental course in emotional processing, confirming an earlier understanding of happiness compared to anger, and showed longer looking times towards the mouth area. Previous studies have suggested a Caucasian analytic strategy in face processing contrasted with an Asian ‘central’ holistic strategy. From age 7, children already display fixation patterns on internal features consistent with adults of their cultural group. Thus, our results may above all reflect a developmental period in which the gaze is oriented principally around the mouth, owing to abilities developed for happiness processing, and during which children learn to balance their gaze between the eyes and mouth according to the emotion. Nevertheless, we highlight that even Caucasian children seem to process Asian faces, particularly the more complex anger emotion, with a more central or balanced strategy in order to find their cultural emotional cues. These results point to a flexible and crucial emotional developmental period, during which component and holistic strategies evolve depending on expertise and familiarity with face types to optimize performance. Nevertheless, understanding the holistic vs. component strategies used depending on age or ethnicity requires substantial further investigation and specific studies, such as manipulating the inversion effect with eye-tracker recording and fine gaze analysis with adapted stimuli.
In conclusion, this study strengthens the case for developing eye-tracking research in developmental and emotional processing studies, showing that even a “silent” effect can be detected and shrewdly analyzed through objective means.
We would like to thank the participants as well as the teachers and principals who made this study possible. Thanks to Anna Crelier and Christine Conti for conducting experiments as part of their master’s degree.