AUSTIN—Babies are as primed to learn a visual language as they are a spoken one. That’s the conclusion of research presented here today at the annual meeting of AAAS, which publishes Science.

Parents and scientists know babies are learning sponges that can pick up any language they’re born into. But much less is known about whether that extends to visual language. To find out whether infants are sensitive to visual language, Rain Bosworth, a psychologist at the University of California, San Diego, tracked the eye movements of 6-month-olds and 1-year-olds as they watched a video of a woman performing self-grooming gestures, such as tucking her hair behind her ear, and signing. The 6-month-olds watched the signs 20% more than the 1-year-olds did. That suggests young babies can distinguish language from nonlanguage even when it isn’t spoken, but that 1-year-olds no longer can.

That pattern is consistent with what researchers know about how babies learn spoken language. Six-month-olds are still homing in on their native language and remain sensitive to languages they’re not exposed to, but by 12 months that sensitivity is more or less gone, Bosworth says.

The researchers also tracked babies’ gazes as they observed a signer “fingerspelling,” spelling out words with individually signed letters, executed either cleanly or sloppily. Again, the 6-month-olds, who had never seen sign language before, favored the well-formed letters, whereas the 12-month-olds showed no preference.

Together, the findings suggest there is a critical developmental window for picking up even non-spoken languages. Because 95% of deaf children are born to hearing parents, they are at risk for developmental delays unless they get that language exposure early on, the scientists say.