
Conscious and Non-conscious Representations of Emotional Faces in Asperger’s Syndrome

Published: July 31, 2016
doi: 10.3791/53962

Summary

An EEG experimental protocol is designed to clarify the interplay between conscious and non-conscious representations of emotional faces in patients with Asperger's syndrome. Results obtained with this protocol suggest that patients with Asperger's syndrome have deficits in the non-conscious representation of emotional faces, but perform comparably to healthy controls in the conscious representation.

Abstract

Several neuroimaging studies have suggested that the low spatial frequency content in an emotional face mainly activates the amygdala, pulvinar, and superior colliculus especially with fearful faces1-3. These regions constitute the limbic structure in non-conscious perception of emotions and modulate cortical activity either directly or indirectly2. In contrast, the conscious representation of emotions is more pronounced in the anterior cingulate, prefrontal cortex, and somatosensory cortex for directing voluntary attention to details in faces3,4. Asperger's syndrome (AS)5,6 represents an atypical mental disturbance that affects sensory, affective and communicative abilities, without interfering with normal linguistic skills and intellectual ability. Several studies have found that functional deficits in the neural circuitry important for facial emotion recognition can partly explain social communication failure in patients with AS7-9. In order to clarify the interplay between conscious and non-conscious representations of emotional faces in AS, an EEG experimental protocol is designed with two tasks involving emotionality evaluation of either photograph or line-drawing faces. A pilot study is introduced for selecting face stimuli that minimize the differences in reaction times and scores assigned to facial emotions between the pretested patients with AS and IQ/gender-matched healthy controls. Information from the pretested patients was used to develop the scoring system used for the emotionality evaluation. Research into facial emotions and visual stimuli with different spatial frequency contents has reached discrepant findings depending on the demographic characteristics of participants and task demands2. The experimental protocol is intended to clarify deficits in patients with AS in processing emotional faces when compared with healthy controls by controlling for factors unrelated to recognition of facial emotions, such as task difficulty, IQ and gender.

Introduction

Facial emotion recognition is one of the most important brain processes engaged in social communication. A variety of mental disorders are related to problems with explicit detection of facial emotions4-6. A photograph of a face contains a spectrum of spatial information that can be filtered for either the high spatial frequency (HSF) or low spatial frequency (LSF) content. HSF is related to highly detailed parts of an image, such as the edges of a face, while LSF is related to coarser or less well-defined parts, such as the holistic configuration of a face7. Any face recognition task simultaneously induces conscious and non-conscious processes8-12, and the participation of the non-conscious process occurs in the 150-250 msec post onset interval or even earlier13. In healthy controls, the non-conscious process is generally faster than the conscious process14,15. Several neuroimaging studies have suggested that the LSF in a facial stimulus (or motivationally significant stimulus) mainly activates the amygdala, pulvinar, and superior colliculus especially with fearful faces3,16. These regions constitute the limbic structure in non-conscious perception of emotions and modulate cortical activity either directly or indirectly1. In contrast, conscious representation of emotions is more pronounced in the anterior cingulate, prefrontal cortex, and somatosensory cortex for directing voluntary attention to details in the face9,17,18.

Asperger's syndrome (AS)19,20 represents an atypical mental disturbance that affects sensory, affective and communicative abilities, without interfering with normal linguistic skills and intellectual ability. Several studies have found that functional deficits in the neural circuitry important for facial emotion recognition can partly explain the social communication failure in AS21-25. Behavioral disorders observed in children with AS can be diagnosed in the first three years of life26, a period during which their voluntary (or conscious) control over behaviors is not fully developed27. In adults with AS, the behavioral disorders can be compensated for through attention regulation28. Difficulty in processing details within a certain spatial frequency range may indicate a disruption in different information processing stages. So far, no study has directly addressed evoked potentials and oscillatory activity in patients with AS during facial emotion recognition involving face stimuli in specific spatial frequency ranges. It is therefore important to examine the functional trajectory in patients with AS, compared with healthy controls, while they process facial stimuli with different spatial frequency contents, controlling for task demands and demographic effects such as gender and IQ.

In order to clarify the interplay between conscious and non-conscious representations of emotional faces, an EEG experimental protocol is designed for comparing brain evoked potentials and oscillatory activity between patients with AS and IQ/gender-matched healthy controls. A cohort of pilot participants was recruited prior to the EEG experiment for assistance with selection of the experimental stimuli and development of a scoring system in order to facilitate an evaluation of performance in patients with AS. The protocol consists of two tasks involving emotionality evaluation of either photograph or line-drawing faces. The differences between the two groups can be assessed by computing ERPs and event-related spectral perturbations (ERSPs). In the next section, the details of the experimental protocol are elaborated on, including the pilot study and EEG data processing/analysis methods, followed by the main analysis results. Finally, the critical steps in the protocol and its significance with respect to existing methods are discussed. The limitations of the protocol and its possible extension to patients with other emotional disorders are also pointed out.

Protocol

Ethics Statement: Procedures involving human participants have been approved by the human participant research ethics committee/Institutional Review Board at the Academia Sinica, Taiwan.

1. Stimuli and Experimental Program Preparation

  1. Prepare a pool of more than 60 emotional face photographs29 categorized into three facial expressions (angry, happy, and neutral). Use graphics software to mask out the hair and ears in each photograph with a black background, as shown in Figure 1A, so that participants can concentrate on the facial features (a scripted alternative is sketched after Figure 1).
    1. Open a photograph in the graphics software. Use the selection toolbox to draw an elliptical region and adjust the region size so that the ears and most hair do not fall in the ellipse.
    2. Invert the selected region. Click "delete" to remove the unwanted region of the photograph and replace it with the black background color.

Figure 1. Examples of emotional face stimuli. (A) photograph faces where the hair and ears have been masked out in the black background color, and (B) line-drawing faces that are edited from (A) by graphics software. The faces show neutral, happy, and angry emotions respectively from the top to bottom rows.
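For labs that prefer scripted batch editing over manual selection, the elliptical masking of step 1.1 can be approximated in MATLAB, the same environment used for data analysis later in the protocol. This is a minimal sketch, not part of the published protocol; the file names and ellipse radii are placeholder assumptions and should be tuned per photograph.

% Approximate the elliptical hair/ear mask of step 1.1 (placeholder file names).
img = im2double(imread('face_original.jpg'));        % original photograph
if size(img, 3) == 3, img = rgb2gray(img); end       % ensure a single grey channel
[h, w] = size(img);
[x, y] = meshgrid(1:w, 1:h);
cx = w/2;  cy = h/2;                                 % ellipse centered on the face
rx = 0.38*w;  ry = 0.48*h;                           % radii chosen to exclude ears/hair
mask = ((x - cx)/rx).^2 + ((y - cy)/ry).^2 <= 1;     % logical elliptical region
img(~mask) = 0;                                      % black background outside the ellipse
imwrite(img, 'face_masked.jpg');

The masked output should still be inspected as in step 1.1.2 to confirm that no hair or ear regions remain inside the ellipse.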

  2. Conduct a pilot study. Recruit pilot participants to select suitable stimuli from the photograph pool.
    Note: The pilot participants should not participate in the EEG experiment.
    1. Configure the stimulus presentation program beginning with the first computer screen presenting the task instruction, followed by 5 familiarization trials. Begin each trial with a fixation cross, followed by a face stimulus, and by an emotionality evaluation task. See Supplemental Code File for an example program.
      Note: The real pilot trials immediately follow the familiarization trials by selecting face photographs in a random order from the pool.
      1. Create an experimental program, including the instruction screens and a central eye-fixation screen. Create the face stimulus screen as illustrated in Figure 2 by configuring the photograph size to be 18.3 x 24.4 cm (width x height) with a black background color, given a 41 x 25.6 cm computer screen with a resolution of 1,680 x 1,050 pixels. See Supplemental Code File for an example program.
      2. Create a scoring system for emotionality evaluation in the program as illustrated in Figure 3. Place a horizontal line ranging from -100 to +100 on a continuous scale in the center of the screen without any tick-marks, except for the central and endpoints. Prepare the program such that participants can freely evaluate the emotionality of a photograph face by dragging the scoring cursor to the left for very angry (-100) or to the right for very happy (+100), and then pressing the GO button.
        Note: The scoring line is designed without any tick-marks because patients with AS can easily get stuck in placing the cursor between ticks during emotionality evaluation. Therefore, a continuous scale is preferred for patients.
      3. Make sure the program records a participant's behavioral results (e.g., reaction time and emotionality scores), which are used as criteria for choosing photographs from the pool (see step 1.3.1 and the selection sketch after Figure 3).
    2. Recruit pilot participants (5 control and 5 AS pilot participants). Diagnose clinical patients according to Gillberg30 and DSM-IV criteria26 and administer the clinical derived short-form of Wechsler Adult Intelligence Scale (WAIS-III)31. Match the controls to their AS counterparts as closely as possible on gender, and on verbal/performance IQ scores.
    3. Run the experimental procedure in the pilot study for each individual participant. After completing the emotional face recognition task, interview each pilot AS participant about reasonable durations for the central eye-fixation and stimulus presentation periods, the difficulty of the task, the ease of using the scoring system, and the maximum number of trials over which he/she can stay concentrated; use this feedback to reconfigure the program for the EEG experiment (see step 1.3.2).

Figure 2. A screenshot of a face stimulus in the program. The size of the face is configured to fit the height of the screen. The empty area is filled in with the black color.

Figure 3. A screenshot of the scoring system for emotionality evaluation. The scoring bar is designed to have no tick mark. The participant needs to drag the mouse to select the score assigned to a face and press the GO button to finish the task.
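Once the pilot behavioral data from step 1.2.3 have been collected, the photograph selection described in step 1.3.1 can be scripted. The MATLAB sketch below uses synthetic placeholder data and an equal-weight ranking of the two criteria (group differences in mean reaction time and mean emotionality score); the variable names, weighting, and pool size are assumptions, not part of the published protocol.

% Sketch for step 1.3.1: rank photographs by the combined AS-vs-control difference
% in mean reaction time and mean emotionality score (synthetic placeholder data).
rng(1);
nPhotos    = 60;                                          % pool size (placeholder)
pilotRT    = 800 + 200*rand(nPhotos, 10);                 % msec; columns 1-5 = AS, 6-10 = controls
pilotScore = -100 + 200*rand(nPhotos, 10);                % emotionality scores
emotion    = repmat({'angry'; 'happy'; 'neutral'}, nPhotos/3, 1);

dRT      = abs(mean(pilotRT(:, 1:5), 2)    - mean(pilotRT(:, 6:10), 2));
dScore   = abs(mean(pilotScore(:, 1:5), 2) - mean(pilotScore(:, 6:10), 2));
combined = dRT/max(dRT) + dScore/max(dScore);             % equal-weight group difference

selected = [];
for cat = {'angry', 'happy', 'neutral'}
    idx = find(strcmp(emotion, cat{1}));                  % photographs of this emotion
    [~, order] = sort(combined(idx));                     % smallest group difference first
    selected = [selected; idx(order(1:10))];              % keep the 10 most comparable
end
% Balance male and female faces (5 each per emotion) manually before finalizing.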

  3. Program for Task 1: Photograph Session.
    1. Select from the pool 30 photographs, comprising 10 each for happy, angry, and neutral facial expressions (5 male and 5 female faces for each type of expression), that give the most comparable mean reaction times and mean emotionality scores between the 5 AS and 5 control pilot participants.
    2. Update the experimental program configurations by incorporating feedback from the pilot patients, such as the optimal central eye-fixation period (i.e., 1,000 msec), duration of stimulus presentation (i.e., 1,000 msec), inter-stimulus interval (i.e., randomly assigned between 4 and 7 sec), and scale of the scoring system (i.e., -100 to 100). Add five familiarization trials prior to the 30 experimental trials in the program.
      1. Change the number of stimuli and time intervals in an external configuration text file associated with the experimental program.
        Note: The text file can be modified to fit different experimental conditions without intervention of software engineers.
      2. Do not count the five photographs used for familiarization trials toward the 30 selected photographs. Do not use the EEGs or the behavioral data recorded in familiarization trials in the data analysis.
  4. Program for Task 2: Line-drawing Session.
    1. Create line-drawing pictures of the 35 photographs (5 for familiarization trials, 30 for experimental trials) used in Task 1 by tracing the edges of each face. Use graphics software to modify the grey scale photographs into black-and-white line-drawings as shown in Figure 1B.
      Note: The steps below describe one possible way of producing line-drawings in graphics software; a scripted approximation is sketched at the end of this section.
      1. In the graphics software, adjust the brightness/contrast of the photograph so that the original grey scale intensity in the majority of pixels falls to either black or white.
      2. Apply "sketch effect" in the "effect" or "filter" menu of the software to a grey scale photograph so that only the contours of the high spatial frequency content are preserved, and apply "distress effect" to increase the dilation of the contour lines.
      3. Use any brush tool to enhance the contours and use an eraser tool to clean up unwanted parts. Make sure to keep important facial features by checking back and forth between the original photograph and its line-drawing counterpart.
    2. Make a copy of the program of Task 1 in step 1.3 to create a program for Task 2 and replace the 35 photographs in Task 1 with the corresponding line-drawings.
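As a scripted alternative to the manual brightness/contrast, sketch-effect, and distress-effect steps above, edge detection followed by dilation can approximate a line-drawing. The sketch below assumes MATLAB with the Image Processing Toolbox; the file names and thresholds are placeholders, and the result should still be checked against the original photograph as in step 1.4.1.3.

% Approximate line-drawing generation (cf. the sketch/distress effects in step 1.4.1).
% Requires the Image Processing Toolbox; file names and thresholds are placeholders.
img = im2double(imread('face_masked.jpg'));          % masked grey scale photograph
if size(img, 3) == 3, img = rgb2gray(img); end
img = imadjust(img);                                 % stretch brightness/contrast
bw  = edge(img, 'canny', [0.05 0.2]);                % keep high spatial frequency contours
bw  = imdilate(bw, strel('disk', 1));                % thicken the contour lines
lineDrawing = imcomplement(bw);                      % black lines on a white background
imwrite(lineDrawing, 'face_linedrawing.png');
% Inspect the result and manually restore any missing facial features (step 1.4.1.3).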

2. EEG Recording Procedure

  1. Preparations
    1. Recruit 10 healthy controls and 10 patients with AS for EEG experiments based on the guidelines of the local human participant research ethics committee/Institutional Review Board.
    2. Administer the short-form of WAIS-III31 to the patients with AS individually prior to the experiments, and find the controls who match the patients as closely as possible on gender and on the verbal/performance IQ scores.
  2. EEG Recording
    1. Seat the participant in a comfortable chair in a sound insulated (dimly lit) chamber and adjust the chair position so that the computer screen is 60 cm in front of the participant. After a tutorial on the experimental procedure, have the participant fill out the consent forms along with a few questions on his/her handedness.
    2. Use an EEG cap with 132 Ag/AgCl electrodes (including 122 EEG electrodes in the 10-10 system, bipolar VEOG, HEOG, EKG, and EMG electrodes, and six facial-muscle channels) to record EEGs. Connect the cap to two 64-channel amplifiers with a 0.1-100 Hz analog band-pass filter to digitize raw EEGs at a 1,000 Hz sampling rate.
    3. Fit the standard 128-channel EEG cap to each participant's head. Adjust the cap so that the electrode labeled "reference" is placed at the "Cz" position, which is located relative to the anterior/posterior midline landmarks (i.e., middle of the nasion to inion distance), and to the left/right landmarks (i.e., middle of the left/right tragi), according to the EEG international 10/10 system.
    4. Gently use a blunt needle to inject conductive gel into all the electrodes. Stir with the needle slowly inside the electrode to ensure good gel contact between the scalp and the electrode (i.e., to keep the impedance below 5 kΩ). Constantly check the condition of gel contact at the electrodes labeled "reference" and "ground" on the EEG cap to make sure the impedance measurement is correct.
      1. Observe the electrode impedance by viewing the electrode impedance screen supported by the EEG recording software (e.g. SCAN 4.5 in this study) that usually goes with the EEG system. On the screen, the electrodes are shown in colors, and different colors indicate the levels of impedance.
    5. Place one HEOG electrode at the canthus of one eye (positive site), and the second electrode at the canthus of the other eye (negative site), one VEOG electrode above and the other one below the left eye, bipolar EKG electrodes on the back of the left and right hands, and bipolar EMG electrodes in the area between the thumb and index finger of the right hand, and the six facial electrodes around the eyebrow and cheek.
    6. Record in a notebook those bad channels in which the impedance is higher than 5 kΩ, or directly save the screen showing impedance at all electrodes. Use this as future reference for discarding bad channels at the stage of EEG data processing.
    7. Record resting-state EEGs for 12 min after instructing the participant to close his/her eyes. During this time, double-check the quality of the live EEG stream shown on the screen by the EEG recording software.
      Note: There should be clear alpha waves distributed in the occipital channels during the eyes-closed condition compared with the eyes-open condition. If the alpha waves are too noisy (ignoring the bad channels) or distorted, return to step 2.2.4 and adjust the gel contact.
    8. Start the two experimental tasks in a counter-balanced order across participants. Record EEGs by clicking the record icon on the screen supported by the recording software.
      1. After reading the task instruction shown on the screen, have each participant perform the 5 familiarization trials, followed by the 30 task trials. Use the same procedure for both photograph and line-drawing tasks. In the task instruction, encourage participants to assign a score to emotionality of a face stimulus as quickly as possible.
      2. IMPORTANT: Check programs prepared in steps 1.3.2 and 1.4.2 for correctly sending events time-locked to the onset of central eye-fixation, face stimulus presentation, and pressing of the GO button to the recording software during emotionality evaluation. Those onset times are coded as numeric and can be checked on the screen supported by the recording software.
        Note: The participant can take a break in between the two tasks. There is no EEG recording during the break.
    9. Use a digitizer (e.g. the Polhemus FASTRAK 3D digitizer in this study) to record the 3D positions of electrodes and save it in a file (e.g. .3dd or .dat file) for co-registering EEG caps across participants in data analysis.
    10. After the EEG experiment, have the participant fill out a 35-question inventory on his/her behaviors and feelings during the EEG experiment (e.g., experienced negative emotions, almost fell asleep), and provide payment for participating in the experiment.
    11. Bring the participant to the washroom to clean/dry his/her hair.
    12. Clean and sterilize the EEG cap according to clinical instructions.

3. Processing EEG Data

Note: The software commands provided in this section are specific to EEGLAB. A consolidated example script is sketched at the end of this section.

  1. Filter the EEG signals using a high-pass filter of 1 Hz and a low-pass filter of 50 Hz by calling the pop_eegfilt.m function32.
    Note: Use a 40 Hz low-pass filter in countries with a 50 Hz electrical grid frequency.
  2. Discard bad channels with impedance higher than 5 kΩ after checking the electrode impedance recorded in step 2.2.6. Also discard channels whose power spectra differ markedly from those of neighboring channels, based on visual inspection of the power spectrum features (e.g., the maximum value, the curvature, etc.) in each channel.
    1. Calculate and plot the power spectrum of the EEG signal by calling the pop_spectopo.m function32.
  3. Re-reference the EEG signals to the average of the brain channels, excluding the bad channels, by calling the pop_reref.m function.
  4. Segment EEGs into stimulus-locked epochs, each of which ranges from -2.0 sec pre- to 1.5 sec post-stimulus onset. Correct for baseline (-2.0 to -1.2 sec before the stimulus onset) by removing the average of baseline values from each epoch.
    1. Call the pop_epoch.m and pop_rmbase.m functions, respectively. Choose the interval of baseline prior to the central eye-fixation period and the onset of the face stimulus.
  5. Mark bad epochs that appear to contain artifacts. Discard the bad epochs while retaining the epochs contaminated by eye blinks. The epochs with artifacts usually look noisy or have extremely high peak values (e.g., higher than 100 µV) compared with typical epochs.
    1. Call the pop_rejmenu.m function to launch a semi-automatic procedure. An interactive window will pop up for the user to re-confirm the auto-selected bad epochs via visual inspection. Though a majority of epochs are contaminated by eye blinks, these epochs can be tentatively retained for later removal by independent component analysis (ICA)33 in step 3.8.
  6. After discarding bad channels and bad epochs, run ICA on the pruned EEG data using the pop_runica.m function.
  7. Among the estimated independent components (ICs), identify artifacts resulting from eye movement/blink, muscle activity, heartbeat, and line noise32.
    Note: A significantly high correlation (R2 >0.9) between IC scores of a component and those of all reference channels (VEOG, HEOG, EKG, and facial channels) indicates that this component is mainly contributed by artifacts. The estimated IC scores explained by the artifacts can be cleaned up using multiple regression analysis.
  8. Remove artifact ICs and estimate the clean EEGs which are derived by the product of the ICA mixing matrix and artifact-cleaned IC score matrix. Save the clean EEGs for further analysis.
    1. Keep the residuals of predicting artifact ICs (R2 >0.9) from the reference VEOG, HEOG, EKG and facial channels in the IC score matrix. Remove other artifact ICs by the pop_subcomp.m function. The function returns the artifact-cleaned EEGs.
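The EEGLAB calls in steps 3.1-3.8 can be strung together into one script. The following is a minimal sketch assuming EEGLAB is installed and on the MATLAB path; the file names, bad-channel indices, stimulus event code, and artifact-component indices are placeholders to be replaced with the values obtained in steps 2.2.6, 2.2.8.2, and 3.7, and the regression-based retention of blink-related residuals described in step 3.8.1 is omitted for brevity.

% Minimal EEGLAB sketch for steps 3.1-3.8 (placeholder file names, channels, events, ICs).
eeglab;                                              % initialize EEGLAB
EEG = pop_loadset('filename', 'subject01_raw.set');

% 3.1 Band-pass filter 1-50 Hz (use a 40 Hz low-pass in 50 Hz mains countries).
EEG = pop_eegfilt(EEG, 1, 0);                        % high-pass at 1 Hz
EEG = pop_eegfilt(EEG, 0, 50);                       % low-pass at 50 Hz

% 3.2 Inspect power spectra and discard bad channels noted during recording.
figure; pop_spectopo(EEG, 1, [0 EEG.xmax*1000], 'EEG');
badChannels = [17 63];                               % placeholder indices from step 2.2.6
EEG = pop_select(EEG, 'nochannel', badChannels);

% 3.3 Re-reference to the average of the remaining brain channels.
EEG = pop_reref(EEG, []);

% 3.4 Epoch from -2.0 to 1.5 sec around face onset and remove the baseline.
EEG = pop_epoch(EEG, {'21'}, [-2.0 1.5]);            % '21' = placeholder stimulus event code
EEG = pop_rmbase(EEG, [-2000 -1200]);                % baseline prior to central eye-fixation

% 3.5 Semi-automatic rejection of bad epochs (interactive; blink epochs are retained).
pop_rejmenu(EEG, 1);

% 3.6-3.8 ICA decomposition and removal of artifact components.
EEG = pop_runica(EEG, 'extended', 1);
artifactICs = [1 4 9];                               % placeholder ICs identified in step 3.7
EEG = pop_subcomp(EEG, artifactICs, 0);
EEG = pop_saveset(EEG, 'filename', 'subject01_clean.set');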

4. Statistical Analysis

  1. Partition EEG channels into eleven homogeneous regions to reduce the number of statistical comparisons in ERP and ERSP analyses, that is, left- (10 channels), midline- (14), and right-frontal (10); left- (13) and right-temporal (13); left- (9), midline- (14) and right-central (9); left- (9), midline- (12) and right-occipital parietal (9) as shown in Figure 4. These regions are defined according to the functional anatomy of cortex34. Functional homogeneity of EEG signals in these regions has been validated in different experiments13,35,36.

Figure 4. The channel partition. The channels are divided into eleven regions. LF: left-frontal (10 channels), MF: midline-frontal (14), RF: right-frontal (10), LT: left-temporal (13), RT: right-temporal (13), LC: left-central (9), MC: midline-central (14), RC: right-central (9), LP: left-occipital parietal (9), MP: midline-occipital parietal (12), RP: right-occipital parietal (9).

  2. Load the clean EEGs from step 3.8. Compute the channel ERP by averaging signals across epochs in each channel, and the regional ERP by averaging channel ERPs within the same region (a consolidated sketch of steps 4.2-4.4 appears at the end of this section).
    Note: When EEGs are loaded using the pop_loadset.m function in EEGLAB, the signals are stored in the structure variable "EEG.data" in a channel-by-time-by-epoch array.
    1. In the Matlab command window, compute the channel ERP by averaging EEG.data across epochs for every channel (e.g., channelERP = mean(EEG.data,3)). Compute the regional ERP by averaging the channel ERPs within each region according to the partition in 4.1 (e.g., regionalERP = mean(channelERP(index,:),1), where "index" stands for the channel indices in a given region).
  3. Compute the channel ERSPs by applying a time-frequency transform (e.g., Wavelet transform) to epoch signals in each channel, and regional ERSPs by averaging channel ERSPs in the same region.
    1. Perform the time-frequency transform by calling the pop_newtimef.m function.
      Note: In this study, the "wavelet cycles" entry is set to [1, 0.5] and "baseline" is set to [-2,000, -1,200] msec. The resulting channel ERSPs will be stored in a frequency-by-time-by-channel array.
    2. In the Matlab command window, compute the regional ERSP by averaging ERSPs across channels within each region according to the partition in 4.1 (e.g., regionalERSP = mean(channelERSP(:,:,index),3), where "channelERSP" is the output from the pop_newtimef.m function, and "index" stands for the channel indices in a given region).
  4. Calculate mean values in different time intervals (e.g., 50-150, 150-250, 250-350, 350-450 msec) for regional ERPs. Calculate mean values in different time-frequency intervals (e.g., 50-150, 150-250, 250-350, 350-450, 450-800 msec in 1-7 Hz, and 200-800 msec in 8-30 Hz) for regional ERSPs.
  5. Apply MANOVA in statistical software (e.g., IBM SPSS) to the mean values of regional ERPs and ERSPs to evaluate main effects for the task (photograph vs. line-drawing), region (eleven scalp regions), and group (AS vs. control), as well as the interaction effects among the task, region, and group.
    1. In the statistical analysis, consider gender (male vs. female) as a covariate, and estimate the main and interaction effects by holding the gender effect constant.
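Steps 4.2-4.4 can be combined into a short analysis script before exporting the mean values to the statistical software in step 4.5. The sketch below assumes the cleaned epochs from step 3.8 and uses placeholder channel indices for one scalp region from the partition in step 4.1; the region, window boundaries, and file name are assumptions.

% Regional ERP/ERSP sketch for steps 4.2-4.4 (placeholder channel indices and file name).
EEG = pop_loadset('filename', 'subject01_clean.set');
regionChannels = [5 6 7 12 13 14 20 21 22];            % e.g., one occipital-parietal region

% 4.2 Channel and regional ERPs.
channelERP  = mean(EEG.data, 3);                       % average over epochs: channel x time
regionalERP = mean(channelERP(regionChannels, :), 1);  % average over channels in the region

% 4.3 Channel and regional ERSPs using the wavelet transform in pop_newtimef.m.
nChan = numel(regionChannels);
for k = 1:nChan
    [ersp, ~, ~, times, freqs] = pop_newtimef(EEG, 1, regionChannels(k), ...
        [EEG.xmin EEG.xmax]*1000, [1 0.5], 'baseline', [-2000 -1200], ...
        'plotersp', 'off', 'plotitc', 'off');
    if k == 1, channelERSP = zeros([size(ersp) nChan]); end
    channelERSP(:, :, k) = ersp;                       % frequency x time x channel
end
regionalERSP = mean(channelERSP, 3);

% 4.4 Mean regional ERP amplitude in, e.g., the 350-450 msec window.
win = EEG.times >= 350 & EEG.times <= 450;             % EEG.times is in msec for epoched data
meanERP_350_450 = mean(regionalERP(win));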

Representative Results

The average verbal and performance IQ scores are listed in Table 1 for the control and AS groups along with the average reaction times and average scores assigned to emotionality of faces of the two groups. In the table, none of the group differences achieves statistical significance except for the neutral faces in the line-drawing task, where the AS group has an average score near zero (p <0.001)13. Interestingly, the AS group still has slightly longer reaction times than the control group in responding to angry and happy faces, and shorter reaction times in responding to neutral faces, even under the experimental control of gender, IQ and face stimuli. Asperger's syndrome has been associated with impairments in the amygdala and its associated limbic structures37-39, which are known to be involved in memory of emotions other than the neutral emotion40,41. These limbic structures, associated with the non-conscious process, may play an important role in the interpretation of behavioral responses in patients with AS.

Table 1. Behavioral data of the scores on the Wechsler Adult Intelligence Scale-III, reaction times, and average emotionality scores assigned to face stimuli in the photograph and line-drawing tasks. This table is a modified version of Table 1 in Tseng et al.13

As shown in Figure 5, the N400 component in the control group is pronounced in the frontal, temporal and occipital-parietal regions in both photograph and line-drawing tasks, but the amplitude of this component is smaller in the line-drawing task. In the AS group, the N400 is visible in the midline frontal region, but invisible in other regions in the photograph task, and becomes visible in all frontal regions in the line-drawing task. The MANOVA task-by-group interaction effect is significant in the 350-450 msec post onset interval (p = 0.019). The two groups also show significant differences in the early perception in the photograph task42, and have comparable ERP patterns in the line-drawing task; that is, the task-by-group interaction effect is also significant in the 50-150 msec post onset interval (p = 0.035). Photograph and line-drawing faces reach the largest ERP difference in the temporal and occipital-parietal regions in the 250-550 msec interval.

Figure 5. ERP plots. ERP plots in the right frontal, right temporal and right occipital-parietal regions in the control (blue) and AS (red) groups in the (A) photograph and (B) line-drawing tasks. Locations of EEG channels are shown in the upper left-hand side of each plot. The vertical axis shows the ERP voltage (µV) and the horizontal axis shows the time in msec. This figure is a modified version of Figure 2 in Tseng et al.13

As shown in Figures 6 and 7, delta/theta synchronization in the control group is pronounced in the 50-800 msec post onset interval in both tasks. The occipital-parietal regions display the strongest synchronization, followed by the central and temporal regions and then by the frontal regions in the early 50-350 msec interval, and the regional differences disappear after 350 msec. The occipital-parietal regions also demonstrate the strongest alpha/beta desynchronization in the 200-800 msec interval. In general, the photographs have an additive effect over line-drawings in delta/theta synchronization, but the line-drawings induce stronger alpha/beta desynchronization. The AS group has delta/theta synchronization more comparable to that of the control group in the line-drawing task, and no apparent additive effect associated with the photograph faces. The MANOVA task-by-group interaction effect is significant in the 50-150, 250-350, and 350-450 msec post onset intervals (p = 0.043, 0.003 and 0.015, respectively). The group effect is also significant in the 150-250, 250-350, and 350-450 msec intervals (p = 0.033, 0.011 and 0.022, respectively). The AS group displays stronger delta/theta synchronization in the occipital-parietal regions in the 150-250 msec interval as well as the midline regions in the 350-450 msec interval when compared against other scalp regions. The alpha/beta desynchronization in the AS group is similar to that of the control group (and slightly stronger) in both tasks, but the differences between the two tasks tend to be smaller in the AS group. The MANOVA group and task-by-group effects are statistically insignificant in high-frequency oscillations.

Figure 6. ERSP plots in the photograph task. ERSP plots for the (A) control and (B) AS groups in the photograph task. The red color denotes power increase (synchronization), and the blue color denotes power decrease (desynchronization) compared with the baseline. This figure is a modified version of Figure 3 in Tseng et al.13

Figure 7. ERSP plots in the line-drawing task. ERSP plots for the (A) control and (B) AS groups in the line-drawing task. This figure is a modified version of Figure 3 in Tseng et al.13

The ERP results suggest a group difference in the early perception (50-150 msec) and later semantic recognition (350-450 msec) of emotional faces in the photograph task. The AS group has a smaller P1 amplitude in the photograph task and a slightly larger P1 amplitude in the line-drawing task when compared with the control group. The amplitude differences in the P1 between the two tasks may reflect the uniqueness of patients with AS in the perception of photographs and line-drawings43. The N400 is shown to be strongly affected by the emotional content, familiarity and global/local features in faces44. In our study, the N400 (350-450 msec) in the frontal and temporal regions is highly visible in the control group but almost invisible in the AS group in the photograph task. In facial emotion recognition, the N400 may be interpreted as a process of searching for a link between a face and its semantic interpretation (angry, neutral and happy). In the control group, the ERP difference between the two tasks in the 350-450 msec interval is consistent with findings by others. The amygdala is more responsive to intact fearful faces or to fearful faces containing only LSF content3,45. As most of the LSF content is removed from the line-drawings, the control group accordingly shows that the N400 in the line-drawing task is much smaller in the occipital-parietal region and almost invisible in the temporal regions compared with that in the photograph task.

Because information processing of line-drawings depends less on the non-conscious function in the amygdala, patients with AS show ERP patterns more comparable to those of the healthy controls in the later (350-450 msec) stages of emotional face recognition. Interestingly, the AS group can accomplish the emotionality evaluation tasks correctly without the visible N400 in the photograph task. It is reasonable to hypothesize that information processing through the amygdala and its associated limbic structures plays a crucial role in triggering the amplitude of the N400, which may impact the efficiency of information processing in patients with AS but has no effect on their response accuracy.

It has been shown that emotional face recognition engages early and later changes in delta/theta oscillations8, which are considered the brain activity associated with cortical-limbic projections during stimulus estimation46-48. Delta/theta synchronization is more associated with non-conscious than with conscious face recognition46. The findings on ERSPs further indicate that the AS group has much weaker synchronization in delta/theta rhythms in the early and later stages of emotional face recognition. It is reasonable to hypothesize that weaker delta/theta synchronization reflects a disturbance in non-conscious processing of emotional expressions and a failure in the limbic-cortical projection in patients with AS. Delta/theta synchronization is slightly more pronounced in the midline frontal, midline central and midline occipital-parietal regions relative to other scalp regions in the AS group in the 350-450 msec post onset interval in both tasks. These midline regions are closely related to the cortical structure of conscious representation of emotional significance18.

Because the cognitive or conscious pathway is still mediated by limbic structures such as the thalamus, we may hypothesize that the AS group relies on the conscious pathway more than the non-conscious pathway in responding to the photographs and line-drawings. In the control group, the delta/theta power is strongest in the parietal-occipital regions time-locked to stimulus onset and increases in the frontal regions at a later stage in the photograph task. The spatial distribution of the delta/theta power in the line-drawing task becomes closer to that of the AS group. We hypothesize that the control group engages the conscious and non-conscious pathways in the photograph task, and relies on the conscious pathway in the line-drawing task.

When comparing ERSPs between the two tasks, the results from the control group additionally suggest an additive effect of LSF contents on delta/theta synchronization in the 250-450 msec post onset interval, independent of brain regions and of mechanisms elicited by facial emotions. The LSF content in a face seems to place a constant load on the information flow, which may be easily bypassed through voluntary attention to details in a face, as suggested by the patients with AS, who can evaluate facial emotions successfully in the photograph task. Strong alpha and beta oscillations have been referred to as indicators of functional processes in the neocortex associated with attention, semantic long-term memory, and cognitive estimation of stimuli49,50. In a face recognition task, alpha/beta desynchronization reflects the level of voluntary attention to visual stimuli and is associated with cognitive appraisal of facial emotions15,18,51. In this study, there is no evidence supporting a task or group effect in higher frequency oscillations (alpha and beta), except for regional differences when comparing the parietal-occipital region with other regions. Alpha desynchronization reflects attention and a release from inhibited processes in complicated tasks52, whereas beta oscillation is seldom observed in emotion-related tasks53,54. Beta desynchronization in the AS group is generally stronger than that in the control group in both tasks, but the group difference is insignificant. The ERSPs suggest that the AS group has much weaker delta/theta power, but slightly stronger alpha/beta power, when compared with the control group. We hypothesize that patients with AS may direct their attention to some important details in faces by use of cognitive appraisal of visual stimuli to compensate for sensory and affective deficits.

In summary, the recognition of facial emotions in healthy controls induces both conscious and non-conscious processes9,18,51. The reaction-time differences between the two tasks tend to be larger in the control group than those in the AS group. We hypothesize that the healthy controls engage the conscious process more than the non-conscious one in responding to the line-drawings and exert both processes in responding to the photographs, whereas patients with AS rely only on the conscious process in responding to both types of faces.

Supplemental Code File: Example Program.

Discussion

The literature features studies on recognition of facial emotions in patients with autism by analysis of EEG reactions44, and on recognition of high- and low-spatial frequency contents using visual stimuli43. To the best of our knowledge, however, there is a lack of existing work on the brain oscillatory activity that combines emotion recognition with distinct spatial frequency contents. Our protocol is a first step towards estimating the influence of emotionality (positive, neutral and negative faces) and spatial frequency information (photographs and line-drawings) on recognition of emotions in patients with AS compared with healthy controls. Our analysis of EEG reactions in the spatial, temporal and frequency domains allows the affective and cognitive functions to be separated to a degree, advancing the scientific understanding of the AS disorder. In this study, the experimental protocol provides an approach to minimizing factors unrelated to the recognition of emotions; that is, reaction times and scores assigned to emotionality of faces are kept as similar as possible between the two groups by a carefully designed pilot study. Participants are also matched on IQ and gender in both the pilot study and EEG experiment. While previous EEG studies on AS have focused on the P1 and N17055, the protocol in this study makes a contribution in demonstrating a significant difference in the N400 component between the AS and control groups.

Ekman's emotional faces elicit stronger lower frequency oscillations in the healthy controls compared with faces in other databases (e.g., some well-validated Taiwanese emotional faces). It is highly recommended to conduct a pilot EEG study to validate emotional face stimuli used in patients and healthy controls before the EEG experiment. Patients with AS have difficulty using HSF information in the eye regions56. For this reason, the selected Ekman's face stimuli contain emotional expressions identifiable by exposed/unexposed teeth or furrowed/smoothed eyebrows. Studies on other types of patients might consider other face features while replacing stimuli used in the protocol. The scoring system must be designed to facilitate patients performing the emotionality evaluation task, which can be achieved by interviewing the patients recruited in the pilot study; that is, the ordered continuum without any tick marks except for the central and end points is designed according to feedback from the pilot patients. The labels at the end points of the scoring system can be modified, for example, friendly versus inimical, which must be chosen to maximize the emotional responses especially in the controls.

In the literature, AS has been associated with impairments in the amygdala and its related limbic structures37-39, which are involved in memory and retrieval of information relevant to emotions, except for the neutral emotion40,41. Further, the amygdala is sensitive to the LSF content in a photographed face3. The two tasks in the protocol are designed according to the existing findings on deficits in adults with AS, and the stimuli and scoring system were additionally designed for use with this population of patients. Clinical applications of the protocol to other adult patients with a similar type of impairment, such as autism spectrum disorders57, can be conducted with a minor modification in the face stimuli and the scoring system.

It should be noted that the protocol is not intended for clinical diagnosis of children younger than 7 years old, whose conscious (or voluntary) control over behaviors may not be fully developed26. Further, the technique does not yield clear diagnostic results in patients with psychiatric comorbidity following brain injuries, tumors or other disruptions of cerebral hemodynamics. Several studies have found a relationship between aggression and hormonal changes in women during the menstrual cycle58,59. It is also well-known that the administration of ethanol or narcotic drugs changes emotional reactions60. These types of changes may cause fluctuations in EEG reactions to emotional stimuli in both healthy controls and patients with AS. Therefore, it is not recommended to apply the protocol to women during menstruation or when suffering from premenstrual syndrome, or to patients under alcohol or drug intoxication. Neuroimaging studies on conscious and non-conscious pathways of emotions may apply the protocol to demographically matched healthy controls and patients with AS by varying the degrees of coarseness and neutrality in emotional face stimuli.

Patients with AS belong to a relatively high trait-anxiety group13,36, and their eye-blink and motion artifacts can be severe. It is desirable to have experienced data processors and efficient algorithms for removing EEG artifacts before addressing any scientific or clinical issues. The experimental protocol represents an effort toward research into the conscious and non-conscious representations of emotions in the brain. The protocol has been validated by recruiting IQ/gender-matched controls and patients with AS in the EEG experiment. The reaction time and response accuracy are additional supplements to the psychological and behavioral diagnoses. The technique is independent of the subjective mood of the participant during the experiment and, therefore, allows for tracking the dynamics of a patient's state during and after psychological or pharmacological therapy. The technique can be applied to patients suffering from other kinds of affective pathology, such as anxiety disorders, depression, burnout syndrome, and emotional disturbance in post-traumatic stress. Further modifications of the protocol are encouraged for use in other social and emotional disorder groups. A well-designed pilot study with interviews of controls and patients would help with validation of a modified version of the protocol.

Disclosures

The authors have nothing to disclose.

Acknowledgements

This research was supported by grants MOST102-2410-H-001-044 and MOST103-2410-H-001-058-MY2 to M. Liou, and RSF-14-15-00202 to A.N. Savostyanov. The support of the Russian Science Foundation (RSF) was used for the elaboration of the face recognition experimental paradigm.

Materials

Synamps 2/RT 128-channel EEG/EP/ERP system (Neuroscan)
Quik-CapEEG 128-electrode cap (Neuroscan)
Quik-Gel conductive gel
FASTRAK 3D digitizer (Polhemus)

References

  1. Tamietto, M., De Gelder, B. Neural bases of the non-conscious perception of emotional signals. Nat Rev Neurosci. 11, 697-709 (2010).
  2. Harms, M. B., Martin, A., Wallace, G. L. Facial Emotion Recognition in Autism Spectrum Disorders: A Review of Behavioral and Neuroimaging Studies. Neuropsychol Rev. 20, 290-322 (2010).
  3. Vuilleumier, P., Armony, J. L., Driver, J., Dolan, R. J. Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nat Neurosci. 6, 624-631 (2003).
  4. Phan, K. L., Wager, T., Taylor, S. F., Liberzon, I. Functional neuroanatomy of emotion: A meta-analysis of emotion activation studies in PET and fMRI. Neuroimage. 16, 331-348 (2002).
  5. Kano, M., et al. Specific brain processing of facial expressions in people with alexithymia: an (H2O)-O-15-PET study. Brain. 126, 1474-1484 (2003).
  6. Williams, L. M., et al. Fronto-limbic and autonomic disjunctions to negative emotion distinguish schizophrenia subtypes. Psychiat Res-Neuroim. 155, 29-44 (2007).
  7. Goffaux, V., et al. From coarse to fine? Spatial and temporal dynamics of cortical face processing. Cereb Cortex. (2010).
  8. Balconi, M., Lucchiari, C. EEG correlates (event-related desynchronization) of emotional face elaboration: A temporal analysis. Neurosci Lett. 392, 118-123 (2006).
  9. Balconi, M., Lucchiari, C. Consciousness and emotional facial expression recognition – Subliminal/Supraliminal stimulation effect on n200 and p300 ERPs. J Psychophysiol. 21, 100-108 (2007).
  10. Balconi, M., Pozzoli, U. Face-selective processing and the effect of pleasant and unpleasant emotional expressions on ERP correlates. Int J Psychophysiol. 49, 67-74 (2003).
  11. Balconi, M., Pozzoli, U. Event-related oscillations (EROs) and event-related potentials (ERPs) comparison in facial expression recognition. J Neuropsychol. 1, 283-294 (2007).
  12. Balconi, M., Pozzoli, U. Arousal effect on emotional face comprehension Frequency band changes in different time intervals. Physiol Behav. 97, 455-462 (2009).
  13. Tseng, Y. L., Yang, H. H., Savostyanov, A. N., Chien, V. S., Liou, M. Voluntary attention in Asperger’s syndrome: Brain electrical oscillation and phase-synchronization during facial emotion recognition. Res Autism Spectr Disord. 13, 32-51 (2015).
  14. Goffaux, V., Rossion, B. Faces are "spatial": holistic face perception is supported by low spatial frequencies. J Exp Psychol Hum Percept Perform. 32, 1023 (2006).
  15. Knyazev, G. G., Bocharov, A. V., Levin, E. A., Savostyanov, A. N., Slobodskoj-Plusnin, J. Y. Anxiety and oscillatory responses to emotional facial expressions. Brain Res. 1227, 174-188 (2008).
  16. Adolphs, R. Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav Cogn Neurosci Rev. 1, 21-62 (2002).
  17. Acar, Z. A., Makeig, S. Neuroelectromagnetic Forward Head Modeling Toolbox. J Neurosci Methods. 190, 258-270 (2010).
  18. Balconi, M. Neuropsychology of facial expressions. The role of consciousness in processing emotional faces. Neuropsychol Trends. 11, 19-40 (2012).
  19. Gross, T. F. The perception of four basic emotions in human and nonhuman faces by children with autism and other developmental disabilities. J Abnorm Child Psychol. 32, 469-480 (2004).
  20. Behrmann, M., Thomas, C., Humphreys, K. Seeing it differently: visual processing in autism. Trends in cognitive sciences. 10, 258-264 (2006).
  21. Holroyd, S., Baron-Cohen, S. Brief report: How far can people with autism go in developing a theory of mind?. J Autism Dev Disord. 23, 379-385 (1993).
  22. Duverger, H., Da Fonseca, D., Bailly, D., Deruelle, C. Theory of mind in Asperger syndrome. Encephale. 33, 592-597 (2007).
  23. Wallace, S., Sebastian, C., Pellicano, E., Parr, J., Bailey, A. Face processing abilities in relatives of individuals with ASD. Autism Res. 3, 345-349 (2010).
  24. Weigelt, S., Koldewyn, K., Kanwisher, N. Face identity recognition in autism spectrum disorders: a review of behavioral studies. Neurosci Biobehav Rev. 36, 1060-1084 (2012).
  25. Wilson, C., Brock, J., Palermo, R. Attention to social stimuli and facial identity recognition skills in autism spectrum disorder. J Intellect Disabil Res. 54, 1104-1115 (2010).
  26. American Psychiatric Association. The Diagnostic and Statistical Manual of Mental Disorders: DSM-5. (2013).
  27. Dahlgren, S., Gillberg, C. Symptoms in the first two years of life: a preliminary population study of infantile autism. European Archives of Psychiatry and Neurological Sciences. (1989).
  28. Basar-Eroglu, C., Kolev, V., Ritter, B., Aksu, F., Basar, E. EEG, auditory evoked potentials and evoked rhythmicities in three-year-old children. Int J Neurosci. 75, 239-255 (1994).
  29. Ekman, P., Friesen, W. V. Pictures of Facial Affect. (1976).
  30. Gillberg, C. Autism and Asperger's Syndrome. 122-146 (1991).
  31. Chiang, S. K., Tam, W. C., Pan, N. C., Chang, C. C., Chen, Y. C., Pyng, L. Y., Lin, C. Y. The appropriateness of Blyler’s and four subtests of the short form of the Wechsler Adult Intelligence Scale-III for chronic schizophrenia. Taiwanese J Psychiatr. 21, 26-36 (2007).
  32. Delorme, A., Makeig, S. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J Neurosci Methods. 134, 9-21 (2004).
  33. Makeig, S., Bell, A. J., Jung, T. P., Sejnowski, T. J. Independent component analysis of electroencephalographic data. Adv Neural Inf Process Syst. 8, 145-151 (1996).
  34. Başar, E. Brain Function and Oscillations: Volume I: Brain Oscillations, Principles and Approaches. (2012).
  35. Tsai, A. C., et al. Recognizing syntactic errors in Chinese and English sentences: Brain electrical activity in Asperger’s syndrome. Res Autism Spectr Disord. 7, 889-905 (2013).
  36. Savostyanov, A. N., et al. EEG-correlates of trait anxiety in the stop-signal paradigm. Neurosci Lett. 449, 112-116 (2009).
  37. Ashwin, C., Baron-Cohen, S., Wheelwright, S., O’Riordan, M., Bullmore, E. T. Differential activation of the amygdala and the ‘social brain’ during fearful face-processing in Asperger Syndrome. Neuropsychologia. 45, 2-14 (2007).
  38. Kevin, K. Y., Cheung, C., Chua, S. E., McAlonan, G. M. Can Asperger syndrome be distinguished from autism? An anatomic likelihood meta-analysis of MRI studies. J Psychiatry Neurosci. 36, 412 (2011).
  39. Piggot, J., et al. Emotional attribution in high-functioning individuals with autistic spectrum disorder: A functional imaging study. J Am Acad Child Adolesc Psychiatry. 43, 473-480 (2004).
  40. Ilyutchenok, R. Y. Emotions and conditioning mechanisms. Integr Physiol Behav Sci. 16, 194-203 (1981).
  41. Kleinhans, N. M., et al. fMRI evidence of neural abnormalities in the subcortical face processing system in ASD. Neuroimage. 54, 697-704 (2011).
  42. Toivonen, M., Rama, P. N400 during recognition of voice identity and vocal affect. Neuroreport. 20, 1245-1249 (2009).
  43. Deruelle, C., Rondan, C., Gepner, B., Tardif, C. Spatial frequency and face processing in children with autism and Asperger syndrome. J Autism Dev Disord. 34, 199-210 (2004).
  44. Bentin, S., Deouell, L. Y. Structural encoding and identification in face processing: ERP evidence for separate mechanisms. Cogn Neuropsychol. 17, 35-55 (2000).
  45. Vuilleumier, P., Pourtois, G. Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia. 45, 174-194 (2007).
  46. Basar, E., Guntekin, B., Oniz, A. Principles of oscillatory brain dynamics and a treatise of recognition of faces and facial expressions. Prog Brain Res. 159, 43-62 (2006).
  47. Basar, E., Schmiedt-Fehr, C., Oniz, A., Basar-Eroglu, C. Brain oscillations evoked by the face of a loved person. Brain Res. 1214, 105-115 (2008).
  48. Başar, E. Brain Function and Oscillations: Volume II: Integrative Brain Function, Neurophysiology and Cognitive Processes. (2012).
  49. Anokhin, A., Vogel, F. EEG alpha rhythm frequency and intelligence in normal adults. Intelligence. 23, 1-14 (1996).
  50. Klimesch, W. EEG alpha and theta oscillations reflect cognitive and memory performance: a review and analysis. Brain Res Rev. 29, 169-195 (1999).
  51. Knyazev, G. G., Slobodskoj-Plusnin, J. Y., Bocharov, A. V. Event-Related Delta and Theta Synchronization during Explicit and Implicit Emotion Processing. Neuroscience. 164, 1588-1600 (2009).
  52. Klimesch, W., Sauseng, P., Hanslmayr, S. EEG alpha oscillations: The inhibition-timing hypothesis. Brain Res Rev. 53, 63-88 (2007).
  53. Knyazev, G. G., Slobodskoj-Plusnin, J. Y. Behavioural approach system as a moderator of emotional arousal elicited by reward and punishment cues. Pers Individ Dif. 42, 49-59 (2007).
  54. Balconi, M., Brambilla, E., Falbo, L. Appetitive vs. defensive responses to emotional cues. Autonomic measures and brain oscillation modulation. Brain Res. 1296, 72-74 (2009).
  55. Dakin, S., Frith, U. Vagaries of visual perception in autism. Neuron. 48, 497-507 (2005).
  56. Curby, K. M., Schyns, P. G., Gosselin, F., Gauthier, I. Face-selective fusiform activation in Asperger’s Syndrome: A matter of tuning to the right (spatial) frequency. (2003).
  57. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders. (1994).
  58. Dougherty, D. M., Bjork, J. M., Moeller, F. G., Swann, A. C. The influence of menstrual-cycle phase on the relationship between testosterone and aggression. Physiol Behav. 62, 431-435 (1997).
  59. Van Goozen, S. H., Wiegant, V. M., Endert, E., Helmond, F. A., Van de Poll, N. E. Psychoendocrinological assessment of the menstrual cycle: the relationship between hormones, sexuality, and mood. Arch Sex Behav. 26, 359-382 (1997).
  60. Winward, J. L., Bekman, N. M., Hanson, K. L., Lejuez, C. W., Brown, S. A. Changes in emotional reactivity and distress tolerance among heavy drinking adolescents during sustained abstinence. Alcohol Clin Exp Res. 38, 1761-1769 (2014).


Cite This Article
Chien, V. S. C., Tsai, A. C., Yang, H. H., Tseng, Y., Savostyanov, A. N., Liou, M. Conscious and Non-conscious Representations of Emotional Faces in Asperger’s Syndrome. J. Vis. Exp. (113), e53962, doi:10.3791/53962 (2016).
