As you read this blog post, your brain is interpreting the squiggles on the screen into words that form a sentence you can understand. Equally, when in conversation with another person, your ears gather the sound waves and send them off to your brain for processing. Just how is this achieved, though? A study from the Netherlands suggests that we try to anticipate the end of a sentence before we’ve heard it, in order to arrive at the full meaning as early as possible.
Jos Van Berkum, a psychologist at the Max Planck Institute for Psycholinguistics, used event-related brain potentials (ERPs) to investigate. An ERP is a measure of the brain’s electrical response to a stimulus, such as the arrival of a word or sound from the ear, and tracking these measures in the brains of participants resulted in some interesting discoveries.
In one experiment, participants were told one of two versions of the same story, about a lecturer and a professor being summoned to the faculty dean. The lecturer is guilty of plagiarism, whilst the professor has faked research data. In one version of the story the two cheating academics are referred to in an ambiguous manner – e.g. “the two lecturers”, rather than the lecturer and the professor – and the potential confusion that this entails resulted in a change in the ERP reading. Listeners had to work harder to figure out what was going on.
A second test used the same story, but to investigate a different effect. This time, the dean tells the lecturer that there is ample reason to sack/promote him. Those who heard the “promote” version showed a spike in the ERP known as the N400 effect, so called because it peaks roughly 400 milliseconds after the word is spoken. This effect seems to reflect something about language comprehension – the word “promote” is unexpected, given the lecturer’s conduct!
It seems that the brain is constantly working out what the next word in a sentence could be, using heuristics and a wide variety of information sources. If the predicted word and the actual word clash, the N400 spike shows the brain readjusting.
Van Berkum suggests that further research in this area is required, particularly in developing computer models of how meaning is constructed from words and connecting these models to neuroimaging techniques such as ERPs. Without new tools, he says, our understanding of the brain’s interpretation of language “won’t get much further”.
Jos J.A. Van Berkum (2008). Understanding Sentences in Context: What Brain Waves Can Tell Us. Current Directions in Psychological Science, 17 (6), 376–380. DOI: 10.1111/j.1467-8721.2008.00609.x