An EEG experimental protocol is designed to clarify the interplay between conscious and non-conscious representations of emotional faces in patients with Asperger's syndrome. The results suggest that patients with Asperger's syndrome have deficits in the non-conscious representation of emotional faces, but perform comparably to healthy controls in the conscious representation.
Several neuroimaging studies have suggested that the low spatial frequency content in an emotional face mainly activates the amygdala, pulvinar, and superior colliculus especially with fearful faces1-3. These regions constitute the limbic structure in non-conscious perception of emotions and modulate cortical activity either directly or indirectly2. In contrast, the conscious representation of emotions is more pronounced in the anterior cingulate, prefrontal cortex, and somatosensory cortex for directing voluntary attention to details in faces3,4. Asperger's syndrome (AS)5,6 represents an atypical mental disturbance that affects sensory, affective and communicative abilities, without interfering with normal linguistic skills and intellectual ability. Several studies have found that functional deficits in the neural circuitry important for facial emotion recognition can partly explain social communication failure in patients with AS7-9. In order to clarify the interplay between conscious and non-conscious representations of emotional faces in AS, an EEG experimental protocol is designed with two tasks involving emotionality evaluation of either photograph or line-drawing faces. A pilot study is introduced for selecting face stimuli that minimize the differences in reaction times and scores assigned to facial emotions between the pretested patients with AS and IQ/gender-matched healthy controls. Information from the pretested patients was used to develop the scoring system used for the emotionality evaluation. Research into facial emotions and visual stimuli with different spatial frequency contents has reached discrepant findings depending on the demographic characteristics of participants and task demands2. The experimental protocol is intended to clarify deficits in patients with AS in processing emotional faces when compared with healthy controls by controlling for factors unrelated to recognition of facial emotions, such as task difficulty, IQ and gender.
Facial emotion recognition is one of the most important brain processes engaged in social communications. A variety of mental disorders are related to problems with explicit detection of facial emotions4-6. A photograph of a face contains a spectrum of spatial information that can be filtered for either the high spatial frequency (HSF) or low spatial frequency (LSF) content. HSF is related to highly detailed parts of an image, such as the edges of a face, while LSF conveys coarser, holistic information, such as the overall configuration of a face7. Any face recognition task simultaneously induces conscious and non-conscious processes8-12, and the non-conscious process participates in the 150-250 msec post onset interval or even earlier13. In healthy controls, the non-conscious process is generally faster than the conscious process14,15. Several neuroimaging studies have suggested that the LSF content in a facial stimulus (or motivationally significant stimulus) mainly activates the amygdala, pulvinar, and superior colliculus, especially with fearful faces3,16. These regions constitute the limbic structure in non-conscious perception of emotions and modulate cortical activity either directly or indirectly1. In contrast, conscious representation of emotions is more pronounced in the anterior cingulate, prefrontal cortex, and somatosensory cortex for directing voluntary attention to details in the face9,17,18.
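To make the HSF/LSF distinction concrete, the sketch below illustrates one common way of separating the two kinds of content from a face photograph by Gaussian filtering in MATLAB. The file name and filter width are illustrative assumptions; the protocol's line-drawing stimuli were prepared with graphics software, not with this script.

```matlab
% Illustrative separation of LSF and HSF content in a face photograph.
% 'face.png' and the Gaussian width are hypothetical choices.
img = im2double(imread('face.png'));       % load the face image
if size(img, 3) == 3
    img = rgb2gray(img);                   % reduce to a single luminance channel
end
sigma = 8;                                 % Gaussian width in pixels (illustrative)
g = fspecial('gaussian', 6*sigma + 1, sigma);
lsf = imfilter(img, g, 'replicate');       % low-pass: coarse, holistic configuration
hsf = img - lsf;                           % high-pass residual: edges and fine detail
figure;
subplot(1, 3, 1); imshow(img);       title('Original');
subplot(1, 3, 2); imshow(lsf);       title('LSF content');
subplot(1, 3, 3); imshow(hsf + 0.5); title('HSF content');
```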
Asperger's syndrome (AS)19,20 represents an atypical mental disturbance that affects sensory, affective and communicative abilities, without interfering with normal linguistic skills and intellectual ability. Several studies have found that functional deficits in the neural circuitry important for facial emotion recognition can partly explain the social communication failure in AS21-25. Behavioral disorders observed in children with AS can be diagnosed in the first three years of life26, a period during which their voluntary (or conscious) control over behaviors is not fully developed27. In adults with AS, the behavioral disorders can be compensated for through attention regulation28. Difficulty in processing details within a certain spatial frequency range may indicate a disruption at different information-processing stages. So far, no study has directly addressed evoked potentials and oscillatory activity in patients with AS during facial emotion recognition involving face stimuli in specific spatial frequency ranges. It is important to examine the functional trajectory in patients with AS, compared with healthy controls, while they process facial stimuli with different spatial frequency contents, controlling for task demands and demographic effects such as gender and IQ.
In order to clarify the interplay between conscious and non-conscious representations of emotional faces, an EEG experimental protocol is designed for comparing brain evoked potentials and oscillatory activity between patients with AS and IQ/gender-matched healthy controls. A cohort of pilot participants was recruited prior to the EEG experiment for assistance with selection of the experimental stimuli and development of a scoring system in order to facilitate an evaluation of performance in patients with AS. The protocol consists of two tasks involving emotionality evaluation of either photograph or line-drawing faces. The differences between the two groups can be assessed by computing ERPs and event-related spectral perturbations (ERSPs). In the next section, the details of the experimental protocol are elaborated on, including the pilot study and EEG data processing/analysis methods, followed by the main analysis results. Finally, the critical steps in the protocol and its significance with respect to existing methods are discussed. The limitations of the protocol and its possible extension to patients with other emotional disorders are also pointed out.
Ethics Statement: Procedures involving human participants have been approved by the human participant research ethics committee/Institutional Review Board at the Academia Sinica, Taiwan.
1. Stimuli and Experimental Program Preparation
Figure 1. Examples of emotional face stimuli. (A) Photograph faces in which the hair and ears have been masked out with the black background color, and (B) line-drawing faces edited from (A) using graphics software. The faces show neutral, happy, and angry emotions from the top to bottom rows, respectively. Please click here to view a larger version of this figure.
Figure 2. A screenshot of a face stimulus in the program. The size of the face is configured to fit the height of the screen. The empty area is filled with black. Please click here to view a larger version of this figure.
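The supplemental program itself is not reproduced here; the following minimal sketch, assuming MATLAB with Psychtoolbox-3, shows how a face can be scaled to the screen height and centered on a black background as described above. The stimulus file name, presentation duration, and trigger placement are hypothetical.

```matlab
% Minimal Psychtoolbox-3 sketch: one face scaled to the screen height,
% centered on a black background. File name and timing are hypothetical.
Screen('Preference', 'SkipSyncTests', 1);            % demo only; keep sync tests on for real recordings
screenId = max(Screen('Screens'));
[win, winRect] = Screen('OpenWindow', screenId, 0);  % background color 0 = black
img = imread('face_stimulus.png');                   % hypothetical stimulus file
tex = Screen('MakeTexture', win, img);
[imgH, imgW, ~] = size(img);
scale = RectHeight(winRect) / imgH;                  % fit the face to the screen height
dstRect = CenterRect([0 0 imgW*scale imgH*scale], winRect);
Screen('DrawTexture', win, tex, [], dstRect);
onsetTime = Screen('Flip', win);                     % stimulus onset; send the EEG trigger here
WaitSecs(1.0);                                       % illustrative presentation duration
Screen('CloseAll');
```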
Figure 3. A screenshot of the scoring system for emotionality evaluation. The scoring bar is designed to have no tick marks. The participant drags the mouse to select the score assigned to a face and presses the GO button to finish the task. Please click here to view a larger version of this figure.
2. EEG Recording Procedure
3. Processing EEG Data
Note: The software commands provided in this section are specific for EEGLAB.
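As a rough orientation to the kind of EEGLAB processing referred to in this section, the sketch below runs a typical pipeline of band-pass filtering, epoching around stimulus onset, baseline removal, ICA-based artifact rejection, and per-condition ERP averaging. The data set name, event codes, filter band, epoch limits, and rejected component indices are assumptions for illustration, not the protocol's actual settings.

```matlab
% Sketch of a typical EEGLAB pipeline for this kind of protocol: filtering,
% epoching around face onset, baseline removal, ICA artifact rejection, and
% a per-condition ERP. File name, event codes, and parameters are assumptions.
[ALLEEG, EEG, CURRENTSET] = eeglab;                        % start EEGLAB
EEG = pop_loadset('filename', 'subject01.set');            % hypothetical continuous data set
EEG = pop_eegfiltnew(EEG, 1, 40);                          % 1-40 Hz band-pass (assumed)
EEG = pop_epoch(EEG, {'angry', 'neutral', 'happy'}, [-0.2 0.8]);  % epochs around stimulus onset (sec)
EEG = pop_rmbase(EEG, [-200 0]);                           % baseline: 200 msec pre-stimulus
EEG = pop_runica(EEG, 'icatype', 'runica');                % ICA decomposition
EEG = pop_subcomp(EEG, [1 2]);                             % remove artifact components (indices assumed)
% ERP for one condition: average over epochs whose time-locking event is 'angry'
angryEpochs = unique([EEG.event(strcmp({EEG.event.type}, 'angry')).epoch]);
erpAngry = mean(EEG.data(:, :, angryEpochs), 3);           % channels x time points
```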
4. Statistical Analysis
Figure 4. The channel partition. The channels are divided into eleven regions. LF: left-frontal (10 channels), MF: midline-frontal (14), RF: right-frontal (10), LT: left-temporal (13), RT: right-temporal (13), LC: left-central (9), MC: midline-central (14), RC: right-central (9), LP: left occipital-parietal (9), MP: midline occipital-parietal (12), RP: right occipital-parietal (9). Please click here to view a larger version of this figure.
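The statistical analysis compares ERSPs averaged within the eleven scalp regions and within 100-msec windows. The following sketch, using EEGLAB's newtimef, shows how a regional ERSP and a delta/theta summary value for one window might be computed; the channel groupings, frequency range, and window boundaries are illustrative assumptions.

```matlab
% Sketch of regional ERSP computation feeding the statistical comparison.
% Channel groupings, frequency range, and window limits are illustrative.
chanRegion = {'MF', [5 6 11 12]; 'MP', [60 61 62 63]};     % hypothetical channel indices per region
for r = 1:size(chanRegion, 1)
    chans = chanRegion{r, 2};
    erspSum = 0;
    for c = chans
        % newtimef returns ERSP (dB change from baseline) as a freqs x times matrix
        [ersp, ~, ~, times, freqs] = newtimef(EEG.data(c, :, :), EEG.pnts, ...
            [EEG.xmin EEG.xmax]*1000, EEG.srate, 0, ...
            'freqs', [1 30], 'baseline', [-200 0], ...
            'plotersp', 'off', 'plotitc', 'off');
        erspSum = erspSum + ersp;
    end
    regionalERSP.(chanRegion{r, 1}) = erspSum / numel(chans);  % mean ERSP over the region's channels
end
% Mean delta/theta (1-7 Hz) power in the 350-450 msec window for one region;
% such values per participant, region, and window enter the MANOVA.
fIdx = freqs >= 1 & freqs <= 7;
tIdx = times >= 350 & times <= 450;
deltaThetaMF = mean(mean(regionalERSP.MF(fIdx, tIdx)));
```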
The average verbal and performance IQ scores for the control and AS groups are listed in Table 1, along with the average reaction times and the average scores assigned to the emotionality of faces in the two groups. In the table, none of the group differences achieves statistical significance except for the neutral faces in the line-drawing task, where the AS group has an average score near zero (p < 0.001)13. Interestingly, the AS group still has slightly longer reaction times than the control group in responding to angry and happy faces, and shorter reaction times in responding to neutral faces, even under the experimental control of gender, IQ and face stimuli. Asperger's syndrome has been associated with impairments in the amygdala and its associated limbic structures37-39, which are known to be involved in the memory of emotions other than the neutral emotion40,41. These limbic structures associated with the non-conscious process may play an important role in interpreting the behavioral responses of patients with AS.
Table 1. Behavioral data of the scores on the Wechsler Adult Intelligence Scale-III, reaction times, and average emotionality scores assigned to face stimuli in the photograph and line-drawing tasks. This table is a modified version of Table 1 in Tseng et al.13
As shown in Figure 5, the N400 component in the control group is pronounced in the frontal, temporal and occipital-parietal regions in both the photograph and line-drawing tasks, but the amplitude of this component is smaller in the line-drawing task. In the AS group, the N400 is visible in the midline frontal region but invisible in other regions in the photograph task, and becomes visible in all frontal regions in the line-drawing task. The MANOVA task-by-group interaction effect is significant in the 350-450 msec post onset interval (p = 0.019). The two groups also show significant differences in early perception in the photograph task42, and have comparable ERP patterns in the line-drawing task; that is, the task-by-group interaction effect is also significant in the 50-150 msec post onset interval (p = 0.035). Photograph and line-drawing faces show the largest ERP difference in the temporal and occipital-parietal regions in the 250-550 msec interval.
Figure 5. ERP plots. ERP plots in the right frontal, right temporal and right occipital-parietal regions in the control (blue) and AS (red) groups in the (A) photograph and (B) line-drawing tasks. Locations of EEG channels are shown in the upper left-hand side of each plot. The vertical axis shows the ERP voltage (µV) and the horizontal axis shows the time in msec. This figure is a modified version of Figure 2 in Tseng et al.13 Please click here to view a larger version of this figure.
As shown in Figures 6 and 7, delta/theta synchronization in the control group is pronounced in the 50-800 msec post onset interval in both tasks. The occipital-parietal regions display the strongest synchronization, followed by the central and temporal regions and then by the frontal regions in the early 50-350 msec interval, and the regional differences disappear after 350 msec. The occipital-parietal regions also demonstrate the strongest alpha/beta desynchronization in the 200-800 msec interval. In general, the photographs have an additive effect over the line-drawings in delta/theta synchronization, but the line-drawings induce stronger alpha/beta desynchronization. The AS group shows delta/theta synchronization more comparable to that of the control group in the line-drawing task, and no apparent additive effect associated with the photograph faces. The MANOVA task-by-group interaction effect is significant in the 50-150, 250-350, and 350-450 msec post onset intervals (p = 0.043, 0.003 and 0.015, respectively). The group effect is also significant in the 150-250, 250-350, and 350-450 msec intervals (p = 0.033, 0.011 and 0.022, respectively). The AS group displays stronger delta/theta synchronization in the occipital-parietal regions in the 150-250 msec interval, as well as in the midline regions in the 350-450 msec interval, when compared against other scalp regions. The alpha/beta desynchronization in the AS group is similar to (and slightly stronger than) that of the control group in both tasks, but the differences between the two tasks tend to be smaller in the AS group. The MANOVA group and task-by-group effects are statistically insignificant in high-frequency oscillations.
Figure 6. ERSP plots in the photograph task. ERSP plots for the (A) control and (B) AS groups in the photograph task. The red color denotes power increase (synchronization), and the blue color denotes power decrease (desynchronization) compared with the baseline. This figure is a modified version of Figure 3 in Tseng et al.13 Please click here to view a larger version of this figure.
Figure 7. ERSP plots in the line-drawing task. ERSP plots for the (A) control and (B) AS groups in the line-drawing task. This figure is a modified version of Figure 3 in Tseng et al.13 Please click here to view a larger version of this figure.
The ERP results suggest a group difference in the early perception (50-150 msec) and later semantic recognition (350-450 msec) of emotional faces in the photograph task. The AS group has a smaller P1 amplitude in the photograph task and a slightly larger P1 amplitude in the line-drawing task when compared with the control group. The P1 amplitude differences between the two tasks may reflect the uniqueness of patients with AS in the perception of photographs and line-drawings43. The N400 has been shown to be strongly affected by the emotional content, familiarity and global/local features in faces44. In our study, the N400 (350-450 msec) in the frontal and temporal regions is highly visible in the control group but almost invisible in the AS group in the photograph task. In facial emotion recognition, the N400 may be interpreted as a process of searching for a link between a face and its semantic interpretation (angry, neutral and happy). In the control group, the ERP difference between the two tasks in the 350-450 msec interval is consistent with findings by others: the amygdala is more active to intact fearful faces or to fearful faces containing only LSF contents3,45. Because most of the LSF content is removed from the line-drawings, the N400 in the control group is much smaller in the occipital-parietal regions and almost invisible in the temporal regions compared with that in the photograph task.
Because information processing of line-drawings depends less on the non-conscious function of the amygdala, patients with AS show ERP patterns more comparable to those of the healthy controls in the later (350-450 msec) stages of emotional face recognition. Interestingly, the AS group can accomplish the emotionality evaluation tasks correctly without a visible N400 in the photograph task. It is reasonable to hypothesize that information processing through the amygdala and its associated limbic structures plays a crucial role in generating the N400, which may impact the efficiency of information processing in patients with AS but has no effect on their response accuracy.
It has been shown that emotional face recognition engages early and later changes in delta/theta oscillations8, which are considered the brain activity associated with cortical-limbic projections during stimulus estimation46-48. Delta/theta synchronization is more associated with non-conscious than with conscious face recognition46. The findings on ERSPs further indicate that the AS group has much weaker synchronization in delta/theta rhythms in the early and later stages of emotional face recognition. It is reasonable to hypothesize that weaker delta/theta synchronization reflects a disturbance in non-conscious processing of emotional expressions and a failure in the limbic-cortical projection in patients with AS. Delta/theta synchronization is slightly more pronounced in the midline frontal, midline central and midline occipital-parietal regions relative to other scalp regions in the AS group in the 350-450 msec post onset interval in both tasks. These midline regions are closely related to the cortical structure of conscious representation of emotional significance18.
Because the cognitive or conscious pathway is still mediated by limbic structures such as the thalamus, we may hypothesize that the AS group relies on the conscious pathway more than the non-conscious pathway in responding to the photographs and line-drawings. In the control group, the delta/theta power is strongest in the occipital-parietal regions time-locked to stimulus onset and increases in the frontal regions at a later stage in the photograph task. The spatial distribution of the delta/theta power in the line-drawing task becomes closer to that of the AS group. We hypothesize that the control group engages both the conscious and non-conscious pathways in the photograph task, and relies on the conscious pathway in the line-drawing task.
When comparing ERSPs between the two tasks, the control group data additionally suggest an additive effect of LSF content on delta/theta synchronization in the 250-450 msec post onset interval, independent of brain regions and of mechanisms elicited by facial emotions. The LSF content in a face seems to place a constant load on the information flow, which may be easily bypassed through voluntary attention to details in a face, as suggested by the patients with AS who can evaluate facial emotions successfully in the photograph task. Strong alpha and beta oscillations have been referred to as indicators of functional processes in the neocortex associated with attention, semantic long-term memory, and cognitive estimation of stimuli49,50. In a face recognition task, alpha/beta desynchronization reflects the level of voluntary attention to visual stimuli and is associated with cognitive appraisal of facial emotions15,18,51. In this study, there is no evidence supporting a task or group effect in the higher frequency oscillations (alpha and beta), except for regional differences between the occipital-parietal regions and other regions. Alpha desynchronization reflects attention and a release from inhibited processes in complicated tasks52, whereas beta oscillation is seldom observed in emotion-related tasks53,54. Beta desynchronization in the AS group is generally stronger than that in the control group in both tasks, but the group difference is insignificant. The ERSPs suggest that the AS group has much weaker delta/theta power but slightly stronger alpha/beta power when compared with the control group. We hypothesize that patients with AS may direct their attention to important details in faces through cognitive appraisal of visual stimuli to compensate for sensory and affective deficits.
In summary, the recognition of facial emotions in healthy controls induces both conscious and non-conscious processes9,18,51. The reaction-time differences between the two tasks tend to be larger in the control group than in the AS group. We hypothesize that the healthy controls engage the conscious process more than the non-conscious one in responding to the line-drawings and exert both processes in responding to the photographs, whereas patients with AS rely only on the conscious process in responding to both types of faces.
Supplemental Code File: Example Program. Please click here to download this file.
The literature features studies on recognition of facial emotions in patients with autism based on analysis of EEG reactions44, and on recognition of high- and low-spatial-frequency contents in visual stimuli43. To the best of our knowledge, however, there is a lack of existing work on brain oscillatory activity in tasks that combine emotion recognition with distinct spatial frequency contents. Our protocol is a first step towards estimating the influence of emotionality (positive, neutral and negative faces) and spatial frequency information (photographs and line-drawings) on recognition of emotions in patients with AS compared with healthy controls. Our analysis of EEG reactions in the spatial, temporal and frequency domains allows the affective and cognitive functions to be separated to a degree, contributing to the scientific understanding of the AS disorder. In this study, the experimental protocol provides an approach to minimizing factors unrelated to the recognition of emotions; that is, reaction times and scores assigned to the emotionality of faces are kept as similar as possible between the two groups by a carefully designed pilot study. Participants are also matched on IQ and gender in both the pilot study and the EEG experiment. While previous EEG studies on AS have focused on the P1 and N17055, the protocol in this study makes a contribution by demonstrating a significant difference in the N400 component between the AS and control groups.
Ekman's emotional faces elicit stronger lower-frequency oscillations in the healthy controls compared with faces in other databases (e.g., some well-validated Taiwanese emotional face sets). It is highly recommended to conduct a pilot EEG study to validate the emotional face stimuli in patients and healthy controls before the EEG experiment. Patients with AS have been reported to have difficulty using HSF information in the eye regions56. For this reason, the selected Ekman face stimuli contain emotional expressions identifiable by exposed/unexposed teeth or furrowed/smoothed eyebrows. Studies on other types of patients might consider other face features when replacing the stimuli used in the protocol. The scoring system must be designed so that patients can easily perform the emotionality evaluation task, which can be achieved by interviewing the patients recruited in the pilot study; that is, the ordered continuum without any tick marks except for the central and end points is designed according to feedback from the pilot patients. The labels at the end points of the scoring system can be modified (for example, friendly versus inimical) and should be chosen to maximize the emotional responses, especially in the controls.
In the literature, AS has been associated with impairments in the amygdala and its related limbic structures37-39, which are involved in the memory and retrieval of information relevant to emotions other than the neutral emotion40,41. Further, the amygdala is sensitive to the LSF content in a photographed face3. The two tasks in the protocol are designed according to the existing findings on deficits in adults with AS, and the stimuli and scoring system were additionally designed for use with this population of patients. Clinical applications of the protocol to other adult patients with a similar type of impairment, such as autism spectrum disorders57, can be conducted with minor modifications to the face stimuli and the scoring system.
It should be noted that the protocol is not intended for clinical diagnosis of children younger than 7 years old, whose conscious (or voluntary) control over behaviors may not be fully developed26. Further, the technique does not yield clear diagnostic results in patients with psychiatric comorbidity following brain injuries, tumors or other disturbances of cerebral hemodynamics. Several studies have found a relationship between aggression and hormonal changes in women during the menstrual cycle58,59. It is also well known that the administration of ethanol or narcotic drugs alters emotional reactions60. These types of changes may cause fluctuations in EEG reactions to emotional stimuli in both healthy controls and patients with AS. Therefore, it is not recommended to apply the protocol to women during menstruation or while suffering from premenstrual syndrome, or to participants under alcohol or drug intoxication. Neuroimaging studies on conscious and non-conscious pathways of emotions may apply the protocol to demographically matched healthy controls and patients with AS by varying the degrees of coarseness and neutrality in emotional face stimuli.
Patients with AS belong to a relatively high trait-anxiety group13,36, and their eye-blink and motion artifacts can be severe. It is desirable to have experienced data processors and efficient algorithms for removing EEG artifacts before addressing any scientific or clinical issues. The experimental protocol represents an effort toward research into the conscious and non-conscious representations of emotions in the brain. The protocol has been validated by recruiting IQ/gender-matched controls and patients with AS in the EEG experiment. The reaction time and response accuracy are additional supplements to the psychological and behavioral diagnoses. The technique is independent of the subjective mood of the participant during the experiment and therefore allows for tracking the dynamics of a patient's state during and after psychological or pharmacological therapy. The technique can be applied to patients suffering from other kinds of affective pathology, such as anxiety disorders, depression, burnout syndrome, and emotional disturbances in post-traumatic stress. Further modifications to the protocol are encouraged for use in other social and emotional disorder groups. A well-designed pilot study with interviews of controls and patients would help with the validation of a modified version of the protocol.
The authors have nothing to disclose.
This research was supported by grants MOST102-2410-H-001-044 and MOST103-2410-H-001-058-MY2 to M. Liou, and RSF-14-15-00202 to A.N. Savostyanov. The support of the Russian Science Foundation (RSF) was used for the elaboration of the experimental paradigm of face recognition.
Name | Company | Catalog Number | Comments
Synamps 2/RT 128-channel EEG/EP/ERP | Neuroscan | |
Quik-CapEEG 128 electrodes | Neuroscan | |
Gel | Quik-Gel | |
FASTRAK 3D digitizer | Polhemus | |