Why does human conversation come so easily? A new study chalks it up to a sort of "mind meld" between participants. Researchers have found that the brains of speakers and listeners become synchronized as they converse and that this "neural coupling" is key to effective communication.
Scientists have traditionally considered talking and listening to be two independent processes. The idea is that speech is produced in some parts of the brain, including a region known as Broca's area, and understood in others, including a region known as Wernicke's area. But recent studies suggest that there's actually much more overlap. For example, partners in a conversation will unconsciously begin imitating each other, adopting similar grammatical structures, speaking rates, and even bodily postures.
This overlap helps people establish a "common ground" during conversation and may even help them predict what the other is going to say next, argue psychologist Martin Pickering of the University of Edinburgh and psychologist Simon Garrod of the University of Glasgow, both in the United Kingdom. Some researchers think that so-called mirror neurons, which fire when one individual observes the actions of another, might be involved in these interactions.
To test some of these hypotheses, a team led by Princeton University psychologist Uri Hasson used functional magnetic resonance imaging (fMRI) to compare the brain activation patterns of speakers and listeners. Graduate student Lauren Silbert placed her head in an MRI machine and related a 15-minute, unrehearsed story about various adventures she had while in high school, including two boys fighting over who was going to take her to the prom and an encounter with a police officer after a car accident. The team recorded Silbert's story, using a specially designed microphone that filtered out the loud noise of the MRI machine, and then played it back to 11 subjects while their brains were also scanned.
The researchers found considerable synchronization between Silbert's brain-activation patterns and those of her listeners as the story unfolded. For example, as Silbert spoke about her prom experience, the same areas lit up in her brain as in the brains of her listeners. In most brain regions, the activation pattern in the listeners' brains came a few seconds after that seen in Silbert's brain. But a few brain areas, including one in the frontal lobe, actually lit up before the corresponding areas in Silbert's brain did, perhaps reflecting listeners anticipating what she was going to say next, the team says.
To ensure that the coupling was not an experimental artifact, the team also asked the 11 English-speaking subjects to listen to a Russian speaker tell a story that they could not understand. This time, the researchers saw no neural coupling, they report online today in the Proceedings of the National Academy of Sciences. The researchers also found a positive correlation between the strength of the coupling and how well the subjects could recall the details of the story.
The study is "groundbreaking," says neuroscientist Riitta Hari of the Aalto University School of Science and Technology in Espoo, Finland. "It shows how closely the brain functions of a speaker and a listener are connected during successful communication" and demonstrates that "listeners are active players and not only passive receivers." Pickering agrees that the paper is "very important and original," adding that "it provides strong evidence that the neural coupling underlies the way that speakers come to understand each other."
But neuroscientist Michael Arbib of the University of Southern California in Los Angeles says that the findings would be more convincing if the team had scanned the brains of two people as they conversed in real time. Hasson says that's the next step: "We are working on a setup which will enable us to record two subjects simultaneously conversing across two scanners."