Language: The N400 in Semantic Incongruity

JoVE Science Education
Neuropsychology
Language: The N400 in Semantic Incongruity


13:37 min

April 30, 2023

Overview

Source: Laboratories of Sarah I. Gimbel and Jonas T. Kaplan — University of Southern California

Understanding language is one of the most complex cognitive tasks that humans are capable of. Given the incredible number of ways individual words can be combined to form meaning in sentences, it is crucial that the brain is able to identify when words form coherent combinations and when an anomaly appears that undermines meaning. Extensive research has shown that certain scalp-recorded electrical events are sensitive to deviations in this kind of expectation. Importantly, these electrical signatures of incongruity are specific to unexpected meanings, and are therefore different from the brain’s general responses to other kinds of anomalies.

The neurophysiological correlates of semantic incongruity have been experimentally examined through the use of paradigms that present semantically congruent and incongruent ends to sentences. Originally introduced in 1980, the semantic incongruity task presents the participant with a series of sentences that end with either a congruent or incongruent word. To test whether the response is driven by semantic incongruity rather than by surprise more generally, some sentences included words presented in a different size.1 The semantically incongruent end to a sentence has been shown to elicit specific electrical events that are recordable at the scalp, known as event-related potentials (ERPs). An ERP is the measured brain response resulting from a specific sensory, cognitive, or motor event. ERPs are measured using electroencephalography (EEG), a noninvasive means of evaluating brain function in patients with disease and in normally functioning individuals. A specific ERP component found across the scalp, known as the N400, shows greater amplitude in response to semantically incongruent events. The N400 is a negative-going deflection in the EEG signal that occurs between approximately 250 and 400 ms after stimulus onset. In general, early potentials reflect sensory-motor processing, while later potentials like the N400 reflect cognitive processing.

In this video, we show how to administer a semantic incongruity task using EEG. The video will cover the setup and administration of EEG, and the analysis of ERPs related to both control and target stimuli in the semantic incongruity task. In this task, participants are set up with the EEG electrodes, then brain activity is recorded while they view control sentences and semantically incongruent sentences. The EEG procedure is similar to that of Habibi et al.,1 and the task is modeled after Kutas and Hillyard.2 When the ERPs are averaged across the congruous and incongruous sentences, the neural correlates of each event can be compared in a selected time window.
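
To make the averaging step concrete, here is a minimal NumPy sketch of how condition-wise ERPs could be formed from single-trial epochs and compared in a chosen time window. The sampling rate, trial counts, window bounds, and random placeholder arrays are illustrative assumptions, not data or code from the original study.

```python
import numpy as np

# Hypothetical epoched data at one electrode (e.g., Pz), time-locked to the
# onset of the final word of each sentence.
fs = 500                                  # sampling rate in Hz (assumed)
times = np.arange(-0.2, 0.8, 1 / fs)      # -200 ms baseline to +800 ms

rng = np.random.default_rng(0)
congruent = rng.normal(size=(80, times.size))    # placeholder trials x time
incongruent = rng.normal(size=(80, times.size))  # placeholder trials x time

# An ERP is simply the mean across trials at each time point.
erp_congruent = congruent.mean(axis=0)
erp_incongruent = incongruent.mean(axis=0)

# Compare conditions in an assumed N400 window (250-550 ms after the final word).
window = (times >= 0.25) & (times <= 0.55)
difference = erp_incongruent[window].mean() - erp_congruent[window].mean()
print(f"Mean amplitude difference in window: {difference:.3f}")
```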

Procedure

1. Participant recruitment

  1. Recruit 20 participants for the experiment.
  2. Make sure that the participants have been fully informed of the research procedures and have signed all the appropriate consent forms.

2. Data collection

Figure 1
Figure 1: Electrode placement. Placement of the face electrodes used to detect EOG artifacts and of the mastoid reference electrodes (A). Diagram of the measurement from directly between the eyebrows to just under the bump at the back of the head; the FPz electrode of the cap is placed 10% of this distance above the mid-eye mark (B).

  1. EEG preparation (Note: These steps are for use with the Neuroscan 4.3 system with Synamps 2 amplifier and a 64-channel quick cap.)
    1. Participants in an EEG study should not have any hair products (e.g., gel, mousse, or leave-in conditioner) in their hair prior to their participation.
    2. Fill two to four 10-ml syringes with conductive electrode gel (e.g., Quick-gel). Stirring the gel before use is suggested to release air bubbles.
    3. Brush hair and scalp thoroughly (about 5 min).
    4. Clean the head with alcohol and cotton gauze. Also clean the skin at the electrode sites: the two mastoids (behind each ear), above and below the left eye for the VEO (vertical electro-oculogram) electrodes, and the outer side of each eye for the HEO (horizontal electro-oculogram) electrodes (Figure 1, left).
    5. Using two-sided adhesive disks, place the electrodes.
    6. Measure the head from the nasion (directly between the eyebrows, at mid-eye level) to the inion (just below the bump at the back of the head). This distance determines the cap size (small, medium, or large). To place the cap, mark a point on the forehead 10% of the measured distance above the nasion and make sure that the mid-frontal electrode (FPz) sits on this mark.
    7. Attach the face electrodes to their respective cords on the cap.
    8. Start filling the electrodes with gel, using the blunt needle tip to scrape the hair aside underneath the electrode, so the electrode is in direct contact with the scalp. Be mindful not to injure the skin.
      1. Lifting up the electrode a bit makes it easier to insert the gel. In most cases, there will be hair underneath the electrode. Moving it out of the way will allow for better impedance.
    9. Take the participant to the soundproof room and plug in the cap and individual electrodes.
    10. Check the impedance of the electrode-scalp connection and keep it below 10 kΩ. If the impedance is high, make sure the electrode contains conductive gel and is in contact with the scalp.
      1. Impedance is the opposition to the flow of an alternating current. High impedance may increase noise in the data and should be minimized before the study begins.
      2. In most cases, hair is in the way of the electrode; moving it aside should lower the impedance.
    11. Once the impedance is acceptable for all electrodes and EEG traces are void of noise, data collection can begin.
  2. EEG data collection
    1. Prepare the participant to do the task.
      1. Place the participant in a chair 75 cm from the computer screen, in a sound and light-attenuated room (acoustically and electrically shielded).
      2. Tell the participant that he/she will be reading seven-word sentences, presented one word at a time on the screen in front of them. They are to read each sentence in order to answer questions about the contents of the sentences at the end of the experiment.
        1. Each of the seven words of the sentence is shown individually for 100 ms, with a 1000-ms interstimulus interval between words (see the presentation-timing sketch after this list).
          1. Words are presented one at a time in the center of the screen to minimize eye movements during the experiment.
        2. The participant first sees 10 practice sentences.
        3. Show the 240 stimuli in a random order, in two presentations of 120 sentences. One-third of the sentences are normal sentences, where the last word is semantically congruent with the rest of the sentence. One-third of the sentences are incongruent sentences, where the last word is semantically incongruent with the rest of the sentence. One-third of the sentences are normal sentences where the last word is presented in a larger font than the rest of the sentence.
  3. Start the system and record EEG continuously throughout the presentation of the functional task.
  4. Amplify the EEG with a gain of 1,024 and a band-pass of 0.01-100 Hz.
  5. Trials contaminated by eye blinks or other artifacts (approximately 15% of trials) will be rejected off-line.
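
For illustration, the sketch below shows one way the word-by-word timing described above (100 ms per word, 1000 ms blank interval, words centered on the screen) could be scripted with PsychoPy. It is a simplified sketch under assumptions: the window settings and example sentences are placeholders, no EEG triggers are sent, and a production script would time stimuli to screen refreshes rather than with core.wait.

```python
from psychopy import visual, core

# Assumed display settings; a real script would run full screen, sync timing
# to the monitor refresh, and send an EEG trigger at each word onset.
win = visual.Window(fullscr=False, color='black', units='pix')
word_stim = visual.TextStim(win, text='', color='white', height=40)

# Two example seven-word sentences (one congruous, one incongruous).
sentences = [
    "She scratched her dog behind its ear.".split(),
    "She dipped her chicken finger in boots.".split(),
]

for sentence in sentences:
    for word in sentence:
        word_stim.text = word
        word_stim.draw()
        win.flip()            # word appears in the center of the screen
        core.wait(0.100)      # word visible for 100 ms
        win.flip()            # clear the screen
        core.wait(1.000)      # 1000 ms blank interstimulus interval

win.close()
core.quit()
```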

3. Data analysis

  1. Offline, re-reference the data to the averaged mastoids.
  2. Baseline-correct each epoch using the 200 ms preceding stimulus onset.
  3. To correct for movement artifacts, exclude from the average any epoch with a signal change exceeding 150 µV at any EEG electrode.
  4. Digitally filter the data offline (bandpass 0.05-20 Hz).
  5. Use the ERP averages from the Pz recording site to display the time course for each sentence type (Figure 2).
    1. The peak (amplitude and latency) of the parietal N400 is obtained automatically from all electrodes (a code sketch of these analysis steps follows the statistical-analysis step below).

Figure 2
Figure 2: Results of the semantic incongruity task. The participant is presented with a sentence, one word at a time. Each word appears for 100 ms, followed by 1 s of a blank screen. Participants see congruous sentences (red), incongruous sentences (blue), and sentences where the last word is presented in a larger size (green). Only the incongruous sentences produce the N400 response when the last word is presented. When the last word is congruous but larger in size, there is a later P560 response.

  6. Statistical analysis
    1. Plot ERP averages from the parietal Pz electrode for the congruent, incongruent, and deviant-size conditions.
    2. For peak amplitudes and latencies, use F-tests for each latency range to determine whether there is a difference between target and control stimuli.
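
As a rough guide, the data-analysis and statistical-analysis steps above (mastoid re-referencing, 0.05-20 Hz filtering, baseline correction, ±150 µV epoch rejection, Pz averaging, N400 peak measurement, and the F-test) could be strung together along the following lines using MNE-Python and SciPy. This is a minimal sketch under assumptions: the file name, mastoid and Pz channel labels, event codes, and the 250-550 ms peak window are placeholders rather than the protocol's actual Neuroscan configuration.

```python
import mne
from scipy import stats

# Placeholder recording; the real data come from a Neuroscan Synamps 2
# system with a 64-channel quick cap.
raw = mne.io.read_raw_cnt("subject01.cnt", preload=True)

# Re-reference offline to the averaged mastoids (assumed channel names).
raw.set_eeg_reference(["M1", "M2"])

# Offline band-pass filter, 0.05-20 Hz.
raw.filter(l_freq=0.05, h_freq=20.0)

# Epoch around the onset of the final word, with a 200 ms pre-stimulus
# baseline and a +/-150 uV rejection threshold (values are in volts).
events, event_id = mne.events_from_annotations(raw)
conditions = {"congruent": event_id.get("congruent", 1),
              "incongruent": event_id.get("incongruent", 2),
              "size_deviant": event_id.get("size_deviant", 3)}
epochs = mne.Epochs(raw, events, event_id=conditions,
                    tmin=-0.2, tmax=0.8, baseline=(None, 0),
                    reject=dict(eeg=150e-6), preload=True)

# Average per condition and measure the negative peak at Pz in an assumed
# 250-550 ms window (latency in seconds, amplitude in volts).
for name in conditions:
    evoked = epochs[name].average().pick(["Pz"])
    ch, latency, amplitude = evoked.get_peak(tmin=0.25, tmax=0.55,
                                             mode="neg",
                                             return_amplitude=True)
    print(f"{name}: latency {latency:.3f} s, amplitude {amplitude * 1e6:.2f} uV")

# Single-trial mean amplitudes at Pz in the same window, compared across
# the three sentence types with a one-way F-test.
def window_means(name):
    data = epochs[name].copy().pick(["Pz"]).crop(tmin=0.25, tmax=0.55).get_data()
    return data.mean(axis=(1, 2))  # one mean amplitude per trial

f_val, p_val = stats.f_oneway(window_means("congruent"),
                              window_means("incongruent"),
                              window_means("size_deviant"))
print(f"F = {f_val:.2f}, p = {p_val:.4f}")
```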

Results

During the semantic incongruity task, in which participants viewed congruous sentences, incongruous sentences, and sentences where the last word was presented in a larger size, there was a negative-going N400 response only for the incongruous sentences (Figure 2, blue). Sentences with a surprising element (a larger last word) that was not semantically incongruous did not show an N400 response, but did show an increased P560 response (Figure 2, green). The N400 response started about 250 ms after the presentation of the last word of the sentence and peaked about 400 ms after stimulus onset.

These results show that electrical activity in the brain, and particularly in the parietal lobe, registers when a semantically incongruous word is presented as part of a sentence. This electrical event reflects the neural processes that identify the interruption of ongoing sentence processing by a semantically inappropriate word. The N400 seems to provide useful information about the timing, classification, and interactions of cognitive processes involved in natural language processing and comprehension.

Applications and Summary

This study demonstrates some of the advantages of the ERP approach, in particular, its high temporal resolution. In this paradigm, to simulate natural reading, word stimuli are presented very briefly in succession. Because of the excellent temporal resolution of EEG, we are able to discern electrical responses to the stimuli individually.

As a marker of semantic processing, the N400 can be a useful tool in understanding the development of language from childhood to adulthood. Study of this component shows that even in 19-month-old babies, there is a semantic incongruity effect when they hear words that don't match pictures they are seeing.3 This demonstrates the very early presence of a mechanism for matching words to their proper context. However, while young adolescents show an N400 that discriminates between congruent and incongruent language, the response profile of this component is not yet as nuanced as that of adults; for example, it is not as sensitive to different degrees of incongruity.4 These studies demonstrate the sensitivity of this ERP component as an index of semantic processing.

References

  1. Habibi, A., Wirantana, V., & Starr, A. (2014). Cortical activity during perception of musical rhythm: Comparing musicians and non-musicians. Psychomusicology, 24, 125-135.
  2. Kutas, M., & Hillyard, S. A. (1980). Reading senseless sentences: Brain potentials reflect semantic incongruity. Science, 207(4427), 203-205.
  3. Friedrich, M., & Friederici, A. D. (2004). N400-like semantic incongruity effect in 19-month-olds: Processing known words in picture contexts. Journal of Cognitive Neuroscience, 16(8), 1465-1477.
  4. Benau, E. M., Morris, J., & Couperus, J. W. (2011). Semantic processing in children and adults: Incongruity and the N400. Journal of Psycholinguistic Research, 40, 225-239.

Transcript

Understanding language involves complex cognitive processes, and—given the incredible number of word choices and arrangements that can form a single sentence—the brain must be able to distinguish between coherent and incoherent combinations.

A person’s comprehension of a sentence, whether spoken—like when a mother tells her son that she’s going to the store—or written in a book, depends, in part, on what the brain anticipates the next word in the sentence to be.

For example, if someone begins to read “It was a dark and stormy…” at the beginning of a book, it is expected that “night” will be the following term.

However, occasionally unexpected words are encountered—like “…and the mad scientist was painting his laboratory the color raccoon…”—that disrupt the sentence’s meaning.

In this instance, the anomalous term is raccoon, as it refers to a type of animal, rather than an expected color, like black.

Such semantic incongruities—the senseless sentences—elicit unique electrical signals in the brain—responses known as event-related potentials, ERPs for short—that may provide insight into how the brain either retrieves the definition of, or reprocesses, the troublesome word in an attempt to comprehend the sentence.

This video explains how the technique of electroencephalography, or EEG, can be used to measure ERPs during semantic incongruity tasks, in which participants are shown sentences ending with unexpected words.

We demonstrate how to design stimuli, and collect and analyze data, specifically focusing on a unique component of ERPs, named N400 to reflect its characteristics.

In this experiment, EEG is used to measure brain activity in participants shown semantically coherent and incoherent stimuli, in order to investigate language processing and comprehension.

These stimuli consist of three kinds of sentences: congruous, incongruous, and size-deviant. Although each is composed of seven words, they differ in the nature of their last terms.

The final words in congruous sentences, like “She scratched her dog behind its ear,” pose no problems with meaning, and appear in the same font type—and size—as the words preceding them.

Importantly, these sentences serve as controls to gauge how the brain responds to coherent word combinations.

In contrast, incongruous sentences, like “She dipped her chicken finger in boots,” possess last terms that are semantically anomalous.

Here, boots conflicts with the meaning of the rest of the words—it is expected that chicken fingers would be dipped in a condiment like mustard, not in articles of clothing. Thus, these stimuli evaluate how surprising, incoherent language is processed.

The final type of sentence is called size-deviant; these contain last words that are surprising in appearance—they are in a larger font—but not in congruity.

For example, if in the sentence “He put his hand in his mitten,” the term mitten is written in bigger letters, it still makes semantic sense.

These stimuli are critical, as they are meant to distinguish whether the brain’s response to the last word in a sentence is the result of general surprise—the shock of an inconsistent text size—or is specific to unexpected meanings.

After participants are prepared for EEG, they are told to carefully read sentences that appear on a computer screen, as questions will be asked about them later on.

In reality, no quiz is given at the end of the experiment; however, these instructions ensure that subjects will pay attention to the upcoming stimuli.

During the task, participants are sequentially shown—in the correct order—the seven words that make up a single sentence.

Each term appears individually in the center of the monitor—to reduce eye movements that could interfere with data collection—for 100 ms, and is followed by 1000 ms of blank screen.

EEG information is continuously recorded over 120 such trials, each of which consists of a unique sentence. Specifically, stimuli are shown at the same frequency—40 times—but in a random order. Then, the task is repeated a second time, so participants must read a total of 240 sentences altogether.

Afterwards, EEG data are processed to visualize average ERPs for each type of sentence—from each electrode—and scientists search for the N400 component in these waveforms.

The “N” in this term indicates that the peak is negative, and the “400” represents its latency—that it occurs roughly 400 ms after the last-word stimulus is shown to the participant.

Based on previous experience, it is expected that the amplitude of N400 will increase in response to semantically inconsistent events, and will be recorded from all scalp electrodes.

However, this response will likely be most prominent at the Pz electrode, positioned in the midline of the scalp above the parietal lobes—regions of which are known to be involved in processing and integrating written language.

Prior to beginning the experiment, recruit a participant who is a native English speaker, and explain to them the two main components of the procedure: that they will be wearing electrodes, and be shown sentences on a computer screen. Then, collect from them all of the necessary, signed consent forms.

Next, outfit the participant with scalp and face electrodes. For more details on this procedure, check out the methods described elsewhere in this collection. Once in the testing space, verify impedance values across all electrodes.

Upon confirming that the EEG traces are void of noise, instruct the participant to sit so that their eyes are approximately 75 cm away from the screen.

Emphasize that they should read and pay careful attention to the sentences that appear word-by-word on this display, as questions will be asked about their content later on.

To ensure that the participant understands the task, show them ten practice sentences, but do not collect data during this time. Afterwards, start the EEG system to commence continuous recording.

Proceed with the functional task by presenting 120 trials—consisting of 40 congruous, 40 incongruous, and 40 deviant-size sentences—in a random order. Then, repeat this process with an additional set of 120 stimuli to guarantee that enough data are collected.

Once data have been recorded for all 240 stimuli, process it as described in JoVE’s ERPs and the Oddball Task video.

To analyze the data, first plot the average waveforms for the timecourses of congruous, incongruous, and deviant-size stimuli collected from the Pz recording site. On the x-axis of this graph—representing time in ms—indicate when each word in a sentence is shown.

Afterwards, locate the N400 peaks, and for each, calculate its average amplitude—defined as the distance between the lowest point of the peak and the baseline value of 0 µV, also represented by the horizontal axis.

Then, calculate the latency of this component—how long in ms it takes for it to appear in the waveform after the last word in a sentence is shown.

For the ranges of these amplitudes and latencies, proceed to use F-tests to determine whether there is a difference between target and control stimuli.

Notice that the N400 response was only observed after participants were shown the last word of an incongruous sentence, indicating that this electrical event reflects neural processing—particularly involving the parietal lobes—that identify an interruption in sentence processing caused by an incoherent term.

Importantly, although N400 was not observed in waveforms collected using deviant-size stimuli, another unique component—P560, a positive peak with a latency of 560 ms—was.

This indicates that the brain responds differently to unexpected visual stimuli and semantically inconsistent terms, and suggests that N400 is a unique electrical signature of language incongruity.

Now that you know how semantic inconsistency can be used to elicit the N400 component in ERPs, let’s look at other ways researchers are examining this unique electrical signal to study language processing and comprehension.

Some researchers aim to determine when the ability to identify incoherent language develops, and whether this skill changes with age.

Such work has involved showing young children—outfitted with EEG caps—representations of recognizable objects, like a camera.

However, the trick is that when the child looks at this depiction, they’re told it’s something different—for example, a cat. Thus, this is a modified version of the semantic incongruity task, as the spoken word doesn’t match the meaning of the visible item.

Measurements of the brain’s electrical responses to these tasks demonstrated that children exhibit an enhanced N400-esque response to incongruous item-word pairs—one that lasts for several hundred ms—compared to congruous sets.

Importantly, this suggests that even at an early age, humans are able to identify and process semantic incongruity.

Other researchers are assessing whether ERPs can be used to better understand language deficits associated with certain psychiatric disorders, such as schizophrenia.

Paradoxically, previous work has shown that individuals with pronounced schizophrenia-like characteristics, such as anxiety or the inability to feel pleasure, demonstrate a heightened N400 response to congruous word pairs—like animal and goat—compared to people with milder symptoms.

However, when these participants were treated with an antipsychotic drug called olanzapine, the amplitude of this congruity-caused N400 component decreased compared to individuals given a placebo, suggesting a possible therapy that could treat the disjointed speech sometimes observed in such disorders.

You’ve just watched JoVE’s video on how congruous and incongruous sentences can be used to investigate language processing. At this point, you should know how to present stimuli to participants, and collect and interpret ERP data. We hope you also now understand how the N400 component is being used to investigate other aspects of language comprehension, such as how it can be affected in behavioral disorders.

Thanks for watching!