Evidence for children's online integration of simultaneous information from speech and iconic gestures: An ERP study
Sekine, K., Schoechl, C., Mulder, K., Holler, J., Kelly, S., Furman, R., & Ozyurek, A.
Evidence for children's online integration of simultaneous information from speech and iconic gestures: An ERP study. Language, Cognition and Neuroscience. Advance online publication.
Children perceive iconic gestures along with the speech they hear. Previous studies have shown
that children integrate information from both modalities. Yet it is not known whether children
integrate the two types of information simultaneously, as soon as they become available, as
adults do, or whether they initially process them separately and integrate them later. Using electrophysiological
measures, we examined the online neurocognitive processing of gesture-speech integration in
6- to 7-year-old children. We focused on the N400 event-related potential component, which
is modulated by semantic integration load. Children watched video clips of matching or
mismatching gesture-speech combinations, which varied the semantic integration load. The
ERPs showed that the amplitude of the N400 was larger in the mismatching condition than in
the matching condition. This finding provides the first neural evidence that by the age of 6
or 7, children integrate multimodal semantic information in an online fashion comparable to
that of adults.