9.4 Neurolinguistics: Using EEG to Investigate Syntax and Semantics
Video Script
When we started talking about semantics, we observed that a sentence’s syntax influences its semantics, because of the principle of compositionality. For example, we saw that a given string of words can have two different meanings if it has two different syntactic structures. And yet, we also observed that semantics is independent of syntax. A noun phrase that has the semantic thematic role of Agent often occupies the syntactic position of Subject, but not all Agents are Subjects, and not all Subjects are Agents!
The division of labour between thematic roles and grammatical roles is some evidence that syntax and semantics are represented differently in our minds. There’s also evidence from neuroimaging showing that our brains process semantic information differently from syntactic information. This evidence comes from electroencephalography, or EEG. Electroencephalography uses electrodes to measure electrical activity at a person’s scalp, from which scientists can draw conclusions about the person’s neural activity. The particular EEG technique used in neurolinguistics is the ERP, or event-related potential, which measures the timing of the neural response to a particular event, like a sound or a word.
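To make the averaging behind an ERP concrete, here is a minimal sketch in Python. Everything in it is simulated, and every number (the sampling rate, the event times, the epoch window, the shape and size of the response) is an illustrative assumption, not a value from any real experiment.

```python
# A toy sketch of how an event-related potential (ERP) is computed:
# epochs of EEG time-locked to an event are averaged, so random
# background activity cancels out and the event-locked response remains.
import numpy as np

fs = 250                                     # assumed sampling rate in Hz
signal = np.random.randn(fs * 60)            # 60 s of simulated scalp "EEG" (noise)
event_samples = np.arange(fs, fs * 55, fs * 2)   # hypothetical event onsets

# Embed a small negative deflection peaking ~400 ms after each event,
# standing in for an N400-like response buried in noise.
deflection = -2.0 * np.hanning(int(0.2 * fs))    # 200 ms wide bump
for onset in event_samples:
    start = onset + int(0.4 * fs) - len(deflection) // 2
    signal[start:start + len(deflection)] += deflection

# Epoch: cut a window from -100 ms to +800 ms around each event.
pre, post = int(0.1 * fs), int(0.8 * fs)
epochs = np.stack([signal[e - pre:e + post] for e in event_samples])

# The key step: averaging across epochs. Noise averages toward zero;
# the time-locked response does not.
erp = epochs.mean(axis=0)
times_ms = (np.arange(-pre, post) / fs) * 1000
print("Most negative voltage at about %.0f ms" % times_ms[np.argmin(erp)])
```

Run on this simulated data, the printed latency lands near 400 milliseconds, because that is where the toy response was planted; with real recordings, many trials are averaged for exactly the same reason.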
When we’re observing ERPs, we always do so by comparing responses to different kinds of events, and the usual comparison is between events that are expected and events that are unexpected. For example, a sentence like, “She takes her coffee with cream and …” sets up a very strong expectation in your mind of what the next word will be. If the next word that arrives in the sentence matches your mind’s expectation, then the electrical response at your scalp will look something like this: the baseline condition. But if the next word that shows up violates your mind’s expectation, your brain’s response looks quite different: we observe a spike in negative voltage about 400 milliseconds after that unexpected word appears.
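That expected-versus-unexpected comparison can also be sketched in code. The helper below, `toy_erp`, is a made-up name that builds a hand-constructed averaged waveform for each condition; subtracting the expected condition from the unexpected one yields a difference wave whose negative peak sits near 400 milliseconds. The waveforms, amplitudes, and trial counts are all toy assumptions, not real data.

```python
# Sketch of the condition comparison: average each condition separately,
# then take the difference wave (unexpected minus expected) to isolate
# the brain's response to the violation.
import numpy as np

fs = 250
times_ms = np.arange(-100, 800, 1000 / fs)   # -100 ms to +800 ms window

def toy_erp(n400_amplitude, n_trials=30):
    """Average n_trials noisy epochs sharing an N400-like component."""
    component = n400_amplitude * np.exp(-((times_ms - 400) ** 2) / (2 * 60 ** 2))
    trials = component + np.random.randn(n_trials, len(times_ms))
    return trials.mean(axis=0)

expected = toy_erp(n400_amplitude=0.0)     # e.g. "...cream and sugar"
unexpected = toy_erp(n400_amplitude=-4.0)  # an anomalous final word

difference = unexpected - expected
peak = times_ms[np.argmin(difference)]
print(f"Negative peak in the difference wave at ~{peak:.0f} ms")
```

The difference wave is the standard way of presenting these effects, because it subtracts away everything the two conditions have in common and leaves only the response to the unexpected word.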
This response is called an N400. The N in N400 stands for a negative voltage, and the 400 indicates that this spike in negative voltage shows up, on average, about 400 milliseconds after the event. The N400 was first observed in 1980 by Kutas & Hillyard and has been replicated hundreds of times since then. It’s clear from all these studies that the particular kind of event that leads to an N400 response is a word that is unexpected in the semantic context.
The N400 is the brain’s response to an unexpected or surprising event, but not every kind of surprise will produce an N400. In other words, we have expectations about things besides the meanings of sentences. Think about a simple sentence like, “The bread was …” If that sentence finishes with “eaten”, that fits our mind’s expectation, and this is the baseline brain response. Now, what expectation do you have for this sentence: “The ice cream was in the …”? You probably expect a noun to come next, to follow the preposition and determiner. But if what comes next is not a noun but a verb participle, like “eaten”, this violates your mind’s expectation. Notice that the word eaten is semantically consistent with ice cream, but it is not consistent with the syntax of the sentence: determiners are followed by nouns, not verbs. So the brain’s response is a positive voltage about 600 milliseconds after that unexpected word: a P600.
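Since the two signatures differ in polarity and latency, the distinction can be caricatured with a tiny labelling function. The function name `label_component`, the time windows, and the example waveform below are all hypothetical choices for illustration, not standard criteria from the ERP literature.

```python
# Toy labeller for the two signatures: an N400-like response is a
# negative peak around 400 ms; a P600-like response is a positive
# peak around 600 ms. Windows here are illustrative assumptions.
import numpy as np

def label_component(times_ms, difference):
    """Label the dominant deflection in a difference wave."""
    idx = np.argmax(np.abs(difference))
    latency, amplitude = times_ms[idx], difference[idx]
    if amplitude < 0 and 300 <= latency <= 500:
        return f"N400-like: negative peak at {latency:.0f} ms"
    if amplitude > 0 and 500 <= latency <= 800:
        return f"P600-like: positive peak at {latency:.0f} ms"
    return "no clear N400/P600 signature"

times_ms = np.arange(-100, 800, 4.0)
# Hand-built P600-like difference wave: positive bump centred at 600 ms.
p600_wave = 3.0 * np.exp(-((times_ms - 600) ** 2) / (2 * 80 ** 2))
print(label_component(times_ms, p600_wave))
```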
When we’re using language in real time — either reading or listening — our mind sets up expectations about what’s going to happen next. If what happens next violates our semantic expectations, the brain’s response is an N400. And if what happens next violates our syntactic expectations, the brain’s response is a P600.
These two different brain responses give us further evidence that syntax is independent of semantics in our brains!