The dark, thumping cavern of an MRI scanner can be a lonely place. How can scientists interested in the neural activity underlying social interactions capture an engaged, conversing brain while its owner is so isolated? Two research teams are advancing a curious solution: squeezing two people into one scanner.
One such MRI setup is under development with new funding from the U.S. National Science Foundation (NSF), and another has undergone initial testing described in a preprint last month. These designs have yet to prove that their scientific payoff justifies their cost and complexity, plus the requirement that two people endure a constricted almost-hug, in some cases for 1 hour or more. But the two groups hope to open up new ways to study how brains exchange subtle social and emotional cues bound up in facial expressions, eye contact, and physical touch. The tool could “greatly expand the range of investigations possible,” says Winrich Freiwald, a neuroscientist at Rockefeller University. “This is really exciting.”
Functional magnetic resonance imaging (fMRI), which measures blood oxygenation to estimate neural activity, is already a common tool for studying social processes. But compared with real social interaction, these experiments are “reduced and artificial,” says Lauri Nummenmaa, a neuroscientist at the University of Turku in Finland. Participants often look at static photos of faces or listen to recordings of speech while lying in a scanner. Photos, however, can’t show the subtle flow of emotions across people’s faces, and recordings don’t allow the give and take of real conversation.
So researchers have crafted real-time encounters in the scanner. In 2002, neuroscientist Read Montague and colleagues at Baylor College of Medicine published the first of many studies to record simultaneously from people in separate, linked MRI machines. The approach can capture neural activity as people play an online game or communicate through an audio or video feed.
Even with that approach, “there’s a huge amount of interpersonal information filtered out,” says Ray Lee, a neuroscientist and MRI physicist at Columbia University. So over the past decade, he has been refining an fMRI setup for two. It requires a specialized pair of head coils that allows researchers to read separate signals from two adjacent brains. These cagelike metal coils encircle participants’ heads as they lie on their sides with their legs touching in the MRI magnet and gaze at each other through a window. In 2012, while at Princeton University, Lee and colleagues published the first paper on the device, which he estimates would cost $200,000 to provide to another lab.
A second two-person fMRI scanner, developed by Nummenmaa and colleagues in the lab of neuroscientist Riitta Hari at Aalto University in Finland, uses a different type and shape of head coil, but places participants in the same near-cuddling pose. (The team tried a less intimate, sphinxlike pose: bellies down, face to face. But it was “pretty bad for the neck,” Hari notes.)
In a 10 December 2019 bioRxiv preprint, the team describes an early test of the technology: recording neural activity while pairs of friends or intimate partners took turns tapping each other on the lips. With that task, the researchers could verify that the scanner picked up brain activity corresponding to both the touch of the taps and the sight of the tapping finger—along with the sound of recorded instructions.
Lee’s first research questions are also relatively simple: How does brain activity in the shared scanner differ from activity during a remote video connection? What brain networks light up when people make eye contact? He is still analyzing data and submitting publications from his 2012 setup, but in the fall of 2019, his team received nearly $1 million from NSF to design a coil with improved signal quality and scan more brains.
“If Ray can get this system working … he’s got a lot of room to grow,” says Ellen Carpenter, a neuroscientist and program director at NSF. Future studies could, for example, observe the brain as it picks up social cues and decides when and how to convey empathy to a scanner mate, she says.
Of course, researchers can already observe socializing brains by imaging them one at a time. A person being scanned can talk with or even touch a person directly outside the scanner. “Do you actually gain something from observing in real time how the activity is changing in the two brains?” Freiwald wonders. One issue is that fMRI is slow. The changes in blood oxygen that it measures happen on the scale of seconds, which means that in some cases, the precise relationship between the timing of neural firing in the two brains could elude the scanner.
Others say the cozy MRI setup itself could limit the research. “This is more than face to face,” says Uri Hasson, a neuroscientist at Princeton. “You lie next to very particular people in your life.” With other people, the experience could feel threatening. “Have you ever stood in front of a stranger 3 inches from their nose? Probably not on purpose,” says Montague, now at Virginia Polytechnic Institute and State University. “I don’t know where it’s going,” he says of the approach. “On the other hand, I’m a big proponent of the renegade maverick that [does] what they want to do.”
Despite their limitations, Lee thinks two-person MRI scanners will capture aspects of socializing brains long overlooked by neuroimaging. He plans to look at differences in the brain dynamics of children with and without autism as they make eye contact and interact with a parent in the scanner. He expects his first subjects will be sliding into the magnet together by this fall.