Language Paths

Speaking and understanding speech rely largely on the same parts of the brain, a new study has found.

"You know what'll be super fun, guys? I'll talk and you can listen."

There’s just one exception: the brain areas that control mouth movements aren’t used in understanding speech – not even when we mentally repeat to ourselves what others are saying.

This all might sound pretty obvious, but it’s actually a major breakthrough in a century-long debate over how the brain deals with speech – and as you’re about to see, the conclusion is far less self-evident than it sounds.

To understand why, we’ve got to take a quick trip back to the 1800s, when a few neuroscientists made some important discoveries. They found that patients with damage to a part of the left inferior frontal gyrus¹ (IFG) called Broca’s area have trouble coming up with words, while patients with damage to a part of the left superior temporal gyrus (STG) called Wernicke’s area babble like nervous politicians:

“A month ago, quite a little, I’ve done a lot well, I impose a lot, while, on the other hand, you know what I mean, I have to run around, look it over, trebbin and all that sort of stuff.”    (from a 1974 case study)

And so the scientists reached a simple conclusion: Broca’s area was probably crucial for producing words, and Wernicke’s for understanding them. But as later researchers discovered, the interaction between these areas (and others) was much more complex, and didn’t always follow such a straightforward model. Some scientists began to suspect that producing and understanding speech might actually be somewhat overlapping processes.

But when it came to studying the neural correlates of speech production, many studies had trouble getting a clear picture at all:

Most studies of how speech works in the brain focused on comprehension. That’s mostly because it’s easier to image the brains of people who are listening quietly; talking makes the head move, which is a problem when you’re measuring the brain.

But now, thanks to new fMRI technology that can precisely map a brain even while its owner is moving, a team led by psycholinguists Laura Menenti and Peter Hagoort was able to study the brain processes behind both comprehension and speech, according to a report in the journal Psychological Science.

The team studied volunteers as they spoke and listened to sentences with all sorts of grammatical structures:

The authors accomplished this with a picture of an action—a man strangling a woman, say—with one person colored green and one colored red to indicate their order in the sentence. This prompted people to say either “The man is strangling the woman” or “The woman is strangled by the man.”

Well, those sound like delightful pictures. Anyway, the fMRI data showed that different brain areas were crucial for different speech-related tasks – figuring out a sentence’s meaning, thinking of words, and assembling grammar – but that the same areas were activated for each of these tasks whether a person was speaking or listening:

Effects of primary processing load (indicative of sensory and motor processes) overlapped in auditory cortex and left inferior frontal cortex, but not in motor cortex, where processing load affected activity only in speaking.

In short, the IFG and the auditory cortex help out with both speaking and understanding, but the motor cortex is only involved in speaking. (No word, as far as I can tell, on what the STG is doing during all this.) Though some of these pathways might use different routes, the areas involved are largely the same.

This is a pretty amazing discovery, because it means that damage to a certain brain area isn’t always going to affect all speaking or comprehension ability – it may just affect a certain aspect of both those abilities, such as grammar or vocabulary:

Our data suggest that these problems would be expected to always at least partly coincide. On the other hand, our data confirm the idea that many different processes in the language system, such as understanding meaning or grammar, can, at least partly, be damaged independently of each other.

Looking back, it’s not hard to see why those early neuroscientists came to the conclusions they did – damage to the IFG and STG does affect our language skills, but not at all in the way they expected. It just goes to show that there’s usually more than one way of interpreting the data.

So the next time you find yourself struggling for the right word, tell your friends, “I can’t help it – my IFG’s acting up again!”

___________

1. Terms like “cortex” and “gyrus” can be a little confusing, so here’s a quick rundown: “gyrus” just means a ridge on the cerebrum. “Cortex” (Latin for “bark” or “rind”) means the outermost layer of a certain part of the brain, where the cell bodies of neurons are – in other words, the “gray matter.” So the cerebral cortex is the outermost layer of the cerebrum.


2 Responses to “Language Paths”

  1. Laura Menenti says:

    Thanks for the nice summary!

  2. kaonyx says:

    It’s fairly obvious why mouth movements should not be implicated in cognition itself; eating, for example, is far too important to be interrupted by thinking. LOL. It doesn’t have to be interrupted by speaking either – we can eat and talk at the same time, even if it is considered bad manners.

    As far as the colocation of function in the brain goes, it is fascinating that a higher-level abstraction (such as a control function) can be the common denominator. In the case of the auditory cortex, perhaps there is some semantic pre-processing going on here. That suggests that listening, reading, writing, speaking, thinking, self-reflection and even lip-reading all co-opt the auditory circuits at some point.

    I have been reading about schizophrenia; an interesting theory suggests that it manifests as a form of “dysmetria”. Seemingly unrelated aspects of brain function may have a common basis in that they represent similar kinds of control circuits. Circuits used to mediate fine motor control may be similar in concept to circuits that mediate certain emotional controls (such as the fear response), and they tend to be co-located in the brain. I could imagine an evolutionary process doing that.

    But as well as that kind of architectural abstraction, we also have to understand that language is not entirely abstract in itself, but can have perceptual grounding. So we find the processing of verbs associated with the activation of motor regions in the brain, for example.

    Anyway, thanks for an interesting post. The simple idea of being able to observe the brain while movement is allowed is a radical jump in scientific progress. This is quite exciting, really – finally we are starting to get a clearer picture of the brain and how it operates.

