The blind comic book star Daredevil has a highly developed sense of hearing that allows him to “see” his environment with his ears. But you don’t need to be a superhero to pull a similar stunt, according to a new study. Researchers have identified the neural architecture used by the brain to turn subtle sounds into a mind’s-eye map of your surroundings.
The study appears to be “very solid work,” says Lore Thaler, a psychologist at Durham University in the United Kingdom who studies echolocation, the ability of bats and other animals to use sound to locate objects.
Everyone has an instinctive sense of the world around them—even if they can’t always see it, says Santani Teng, a postdoctoral researcher at the Massachusetts Institute of Technology (MIT) in Cambridge who studies auditory perception in both blind and sighted people.
“We all kind of have that intuition,” says Teng over the phone. “For instance, you can tell I’m not in a gymnasium right now. I’m in a smaller space, like an office.”
That office belongs to Aude Oliva, principal research scientist for MIT’s Computational Perception & Cognition laboratory. She and Teng, along with two other colleagues, wanted to quantify how well people can use sounds to judge the size of the room around them, and whether that ability could be detected in the brain.
In the new study, the researchers took recordings of three different sounds: a hand clap, a metallic pole strike, and a bouncing tennis ball. Then, using an audio processing program, they merged them with ambient sound recorded from three rooms sized 50, 130, and 600 cubic meters. Essentially, they created sounds inside “virtual rooms,” Teng explains.
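The paper's exact audio pipeline isn't described here, but the standard technique for placing a dry recording inside a "virtual room" is convolution reverb: the sound is convolved with a room impulse response whose decay length reflects the room's size. A minimal sketch under that assumption (the helper names and the synthetic exponentially decaying noise impulse response are illustrative, not taken from the study):

```python
import numpy as np

def make_impulse_response(rt60, fs=16000):
    """Synthetic room impulse response: exponentially decaying noise.
    A larger room is modeled here by a longer decay time (rt60, seconds)."""
    n = int(rt60 * fs)
    t = np.arange(n) / fs
    # Amplitude falls by 60 dB (a factor of 1000) over rt60 seconds.
    envelope = 10 ** (-3 * t / rt60)
    rng = np.random.default_rng(0)
    return envelope * rng.standard_normal(n)

def place_in_room(dry_sound, rt60, fs=16000):
    """Render a dry sound in a 'virtual room' by convolving it
    with that room's impulse response."""
    ir = make_impulse_response(rt60, fs)
    return np.convolve(dry_sound, ir)

# A short click (a stand-in for a hand clap) rendered in two room sizes.
click = np.zeros(100)
click[0] = 1.0
small = place_in_room(click, rt60=0.3)  # small room: short reverb tail
large = place_in_room(click, rt60=1.2)  # large room: long reverb tail
```

The same dry click comes out with a short tail in the "small room" and a much longer one in the "large room," which is the cue the listeners would later be asked to discriminate.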
Next, the team recruited 14 sighted participants to listen to these sounds through earbuds while seated in a magnetoencephalography (MEG) scanner, which measures the magnetic fields produced by electrical activity in the brain. The researchers presented two sounds back to back and asked the participants to quickly say whether they sounded the same. Sometimes this meant listening to a hand clap followed by a ball bounce. Other times, it meant listening to the same sound twice, but in different-sized rooms. And sometimes it meant listening to changes in both the sound and the size of the room. Over the course of about an hour, each listener compared and contrasted 150 combinations (some were repeated), with the MEG machine recording their brain response all the while.
Though they’d never be mistaken for Daredevil, the listeners did pretty well: They could tell the difference between the small and medium rooms 55% of the time, the medium and large rooms 70% of the time, and the small and large rooms 90% of the time. Accuracy in telling apart the hand clap, pole strike, and ball bounce ranged between about 75% and 100%, the team reports this month in a study on the preprint server bioRxiv.
What most interested the researchers, though, was that the MEG recordings clearly showed the participants' brains handled the sound-source task (clap versus pole versus ball) differently from the room-size task. When listeners correctly differentiated between the sound sources, signals spiked in the brain's temporal lobe, a region known to be involved in visual and auditory processing, about 130 milliseconds after hearing the sound. When they accurately judged the size of the room, though, the spikes came later, at 386 milliseconds, suggesting the brain processes an object's sound and its acoustic environment on different timescales. “We think we’ve found a neural signature of the brain decoding the size of space,” Teng says.
What exactly is the brain listening for when it is figuring out a room’s size? As with echolocation masters like bats and dolphins, Teng thinks the answer is echoes. Specifically, the brain gauges how long it takes for a sound’s reverberation to trail off. A bigger room produces a more sustained echo, and the brain can detect even subtle differences. “I don’t want to claim that’s the only thing the brain is paying attention to,” he says, “but it seems to be the most important part.”
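How long a reverberation takes to trail off can be made precise with a standard acoustics measure: integrate the remaining energy in the tail backward in time (the Schroeder method) and time how long the resulting decay curve takes to fall by a fixed number of decibels. A hedged sketch of that idea; `decay_time` is an illustrative helper, not the study's analysis:

```python
import numpy as np

def decay_time(signal, fs, drop_db=20.0):
    """Estimate how long a sound's reverberation takes to trail off:
    Schroeder backward integration of the energy decay curve, then the
    time for that curve to fall by drop_db decibels."""
    energy = np.cumsum(signal[::-1] ** 2)[::-1]   # energy remaining after each sample
    edc_db = 10 * np.log10(energy / energy[0] + 1e-12)
    below = np.nonzero(edc_db <= -drop_db)[0]
    return below[0] / fs if below.size else len(signal) / fs

# Two synthetic reverb tails: fast decay (small room) vs. slow decay (large room).
fs = 16000
t = np.arange(fs) / fs
rng = np.random.default_rng(1)
small_tail = 10 ** (-3 * t / 0.3) * rng.standard_normal(fs)
large_tail = 10 ** (-3 * t / 1.2) * rng.standard_normal(fs)
```

Under this measure the large room's tail yields a clearly longer decay time than the small room's, which is the kind of difference Teng suggests the brain is sensitive to.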
Thaler agrees, but she would also like to see this idea tested further. For example, by using audio software to artificially strip the reverberation from a room recording, the researchers could test how accurate listeners are when they have only the direct sound to go on.
Teng hasn’t run this particular experiment with blind people yet, but he thinks it’s possible they’d be faster and more accurate than sighted people at determining a room’s size based on sound. “When you’re an organism, human or otherwise, without vision, your brain privileges nonvisual information,” he says. “So would a blind person be more accurate in this experiment? Based on the evidence, my bet would be yes.”