When it comes to prosthetic hands, you can't beat the one Luke Skywalker receives in The Empire Strikes Back. Not only did that robotic limb allow him to wield a lightsaber with great dexterity, but each of his fingers also twitched when a robot poked them. Although real-life brain-controlled prosthetics that enable an amputee to, say, pick up a pencil continue to improve, limbs that can actually feel touch sensations have remained a challenge. Now, by implanting electrodes into both the motor and the sensory areas of the brain, researchers have created a virtual prosthetic hand that monkeys control using only their minds, and that enables them to feel virtual textures.
Neuroscientist Miguel Nicolelis of Duke University in Durham, North Carolina, whose group has been developing so-called brain-machine interfaces, says that one of the pitfalls in these systems is that "no one's been able to close the loop" between controlling a limb and feeling a physical touch. So he and a group of researchers decided to create a "brain-machine-brain" interface using a virtual system. The researchers implanted two sets of tiny electrodes into a monkey's brain: one set in the motor control center, and the other in the part of the somatosensory cortex that processes the sensation of physical touch from the left hand. Using the first set, the monkey could control a virtual monkey arm on a computer screen and sweep the hand over virtual disks with different "textures." Meanwhile, the second set of electrodes fed a series of electrical pulses into the touch center of its brain. A low frequency of pulses indicated a rough texture, whereas a high frequency indicated a fine texture, and the monkey quickly learned to tell the difference.
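The frequency-coding scheme described above can be illustrated with a minimal sketch. This is not the researchers' actual stimulation code; the function names and the specific frequencies and durations here are assumptions chosen purely for illustration of the idea that rougher textures map to lower-frequency pulse trains and finer textures to higher-frequency ones.

```python
# Illustrative sketch (not the study's code): encoding a virtual
# texture as the timing of a train of electrical pulses, where a
# rough texture yields a sparse (low-frequency) train and a fine
# texture a dense (high-frequency) train.

def pulse_train(frequency_hz, duration_s):
    """Return pulse onset times (in seconds) for a regular train."""
    period = 1.0 / frequency_hz
    n_pulses = int(duration_s * frequency_hz)
    return [i * period for i in range(n_pulses)]

# Hypothetical texture-to-frequency mapping for two virtual disks.
TEXTURE_FREQ_HZ = {"rough": 10, "fine": 100}

def encode_texture(texture, contact_duration_s=0.5):
    """Pulse times delivered while the virtual hand touches a disk."""
    return pulse_train(TEXTURE_FREQ_HZ[texture], contact_duration_s)

rough = encode_texture("rough")  # 5 pulses over 0.5 s
fine = encode_texture("fine")    # 50 pulses over 0.5 s
```

The point of the sketch is only that the two textures are distinguished by pulse rate alone; everything else about the stimulus (amplitude, electrode placement, exact frequencies) is outside what the article describes.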
By giving the monkey rewards when it identified the right texture, the researchers discovered that it took as few as four training sessions for the animal to consistently distinguish the textures from one another, even when the researchers switched the order of the visually identical disks on the screen. The researchers then implanted the electrodes into the sensory region that receives tactile sensations from the foot in a different monkey; this monkey, too, acted as if the virtual appendage (in this case, the foot) were its own, moving it to correctly identify the textures, the team reports online today in Nature.
Although the monkeys are all adults, the motor and sensory regions of their brains are amazingly plastic, Nicolelis says: the combination of seeing an appendage that they control and feeling a physical touch tricks them into thinking that the virtual appendage is their own "within minutes." And throughout the experiment, the monkeys' own general sense of touch didn't seem to be affected. "The brain," Nicolelis says, "is creating a sixth sense."
"It's definitely a milestone in brain-computer interfaces," says neuroscientist Sliman Bensmaia of the University of Chicago, who is developing touch-feedback systems for human prosthetics. Too many of the robotic arms now being developed, even very advanced ones, he says, ignore the importance of touch. "Sensory feedback is critical to doing anything," he says. Without it, even mundane tasks like picking up a cup require a great deal of concentration so the wearer does not drop or crush it.
The new work is still an early step, however, he says. A biological arm receives countless inputs not only from texture but also from temperature and its position in space.
Nicolelis says his group is currently working on fine-tuning the sensory feedback as well as exploring ways to link the brain and computer wirelessly. After many years of working on brain-computer interfaces, he says, "we're getting very close to where they may be clinically useful" for paralyzed patients, not just in the lab. Doctors could benefit as well: touch feedback might allow surgeons, for instance, to perform microscopic surgery, among countless other applications. "The brain," Nicolelis says, "has evolved capabilities that go way beyond the body."