THURSDAY, May 24 (HealthDay News) -- Infants can tell the difference between two languages without hearing the spoken words, simply by watching the face of the adult who is talking, a Canadian study says.
"It is important, because it tells us how babies are prepared to learn multiple languages," said Whitney Weikum, a doctoral candidate in neural sciences at the University of British Columbia who led the experiment.
Working under the supervision of Janet Werker, a professor of psychology at the university, Weikum had three groups of infants, ages 4, 6 and 8 months, from monolingual and bilingual Canadian homes watch silent video clips of an adult speaking either French or English.
"The baby watches the screen and sees the faces of the people talking," Weikum said. "When the baby's looking time declines, the computer switches and starts a clip of an adult talking the other language. The baby notices the switch and starts watching the screen again."
That ability to tell the languages apart can diminish with age, depending on which languages are spoken in the home, the study found. Eight-month-olds from bilingual French-English homes still returned their attention to the screen when the language changed. But babies from homes where only one language was spoken lost the ability by about 8 months of age.
One point made by the study, published in the May 25 issue of the journal Science, is that "language is multimodal," Weikum said. "Studies have shown that aural cues are important. This now shows the importance of visual cues."
Laura-Ann Petitto, a cognitive neuroscientist who is director of the Cognitive Neuroscience Laboratory for Language, Bilingualism and Child Development at Dartmouth College, said: "This is a landmark study about the ways that babies use multiple cues to enable them to distinguish between languages. The study suggests that, at an abstract and deep level, the learning brain might not be tied to speech itself."
It has been known that young deaf babies use visual cues to help them learn language, Petitto said, "but we never dreamed that a hearing baby can also be learning language using visual cues."
Petitto said the study has "important implications," because "it supports the belief that the brain can use multiple cues in language processing and suggests that multiple cues in teaching languages can be beneficial."
The findings also have practical applications for remedial speech teaching, Petitto said. "Various remedial tools use multi-stimuli," she said. "This is wonderful confirmation that the multiple cues that we give babies are actually useful."
Peter Gordon, associate professor of speech and language pathology at Teachers College, Columbia University, said an interesting follow-up study would be to add another language to the mix.
"If we gave them, say, Russian or Chinese, a language that they are not adapted to, we would predict that they would be like the monolingual group," he said.
For more on speech and language development, visit the U.S. National Institute on Deafness and Other Communication Disorders.
SOURCES: Whitney Weikum, doctoral candidate, University of British Columbia, Vancouver, Canada; Laura-Ann Petitto, Ph.D., cognitive neuroscientist, Dartmouth College, Hanover, N.H.; Peter Gordon, associate professor of speech and language pathology, Columbia University Teachers College, New York City; May 25, 2007, Science