Walk this way. The researchers used a virtual reality model in which optic flow didn't match the direction of a doorway in the distance.

Going With the Optic Flow

James Bond is racing his armor-plated Aston Martin along a mountain road, swerving wildly and trying to reach a narrow tunnel. Does he make it? New research suggests that if he succeeds, it's due to two strategies: keeping his eye on the target and watching how the guardrails whiz past. The research could eventually lead to a better understanding of visual impairments that affect mobility, and may help robots get around more easily.

Scientists have two theories for how people steer toward a target. You can simply fix your eyes on it, aim your body, and go. Or you can align your steps to the moving pattern that flashes across your retina as you walk. (If you're a Star Trek fan, think of this so-called optic flow as the blur of stars as the Enterprise jumps into warp drive.) But since both approaches get you to the same goal, scientists haven't been able to tease them apart.

In an experimental breakthrough, cognitive scientist William Warren of Brown University in Providence, Rhode Island, created a virtual reality system that allows him to separate the two strategies. He and his colleagues asked subjects to walk toward a virtual doorway and presented them with an artificial optic flow pattern that was shifted slightly away from their walking direction. If subjects used the shifted optic flow to take aim, the team predicted, they would miss the door. If they relied only on the visual direction of the door, they would head right toward it.

As it turned out, how heavily subjects relied on optic flow depended on how much of it the scene provided. When the virtual scene contained only the doorway, subjects had no trouble heading straight for it. But when the scene was enriched with flow-generating objects such as posts and textured floors, the shifted flow pulled subjects off course, and they ended up steering away from the door.
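One way to picture this result is as a weighted blend of the two heading cues, with the weight on optic flow growing as the scene supplies more of it. The sketch below is an illustrative toy model under that assumption, not the fitted model from the paper; the function name, the linear weighting, and the specific numbers are all hypothetical.

```python
def steering_direction(target_angle, flow_angle, flow_density):
    """Toy weighted-cue model of heading choice (an assumption for illustration).

    target_angle : egocentric direction of the doorway, in radians
    flow_angle   : heading indicated by the (artificially shifted) optic flow
    flow_density : 0.0 (doorway alone) .. 1.0 (rich scene of posts, textured floor)

    The chosen heading is a weighted average of the two cues, with optic
    flow weighted more heavily as the scene provides more flow.
    """
    w_flow = flow_density  # hypothetical: flow's weight grows with its amount
    return (1.0 - w_flow) * target_angle + w_flow * flow_angle

# Doorway straight ahead (0 rad); flow shifted off-target by 0.1 rad.
sparse_scene = steering_direction(0.0, 0.1, flow_density=0.1)  # nearly on target
rich_scene = steering_direction(0.0, 0.1, flow_density=0.9)    # pulled off target
```

In a sparse scene the walker heads almost straight for the door; in a flow-rich scene the same shifted flow drags the heading well away from it, matching the pattern the subjects showed.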

The research, reported in the 1 February issue of Nature Neuroscience, is "undoubtedly a step forward," says John Wann of the University of Reading in the United Kingdom. "It's been difficult to create simulations where you could control optic flow and heading information." Now, says Wann, the question is how humans switch between the kinds of information they use to choose their direction.

Related sites
William Warren's home page
Brown University Visual Environment Navigation Lab
John Wann's home page
