Gestural enhancement of degraded speech comprehension engages the language network, motor and visual cortex as reflected by a decrease in the alpha and beta band
Face-to-face communication involves the integration of speech
and visual information, such as iconic co-speech gestures.
Iconic gestures in particular, which illustrate object attributes,
actions and space, can enhance speech comprehension in
adverse listening conditions (e.g. Holle et al., 2010). Using
magnetoencephalography (MEG), we aimed at identifying
the networks and the neuronal dynamics associated with
enhancing (degraded) speech comprehension by gestures. Our
central hypothesis was that gestures enhance degraded speech
comprehension, and that decreases in alpha and beta power
reflect engagement, whereas increases in gamma reflect active
processing in task-relevant networks (Jensen & Mazaheri, 2010;
Jokisch & Jensen, 2007). Participants (n = 30) were presented with videos of an actress uttering Dutch action verbs. Speech was presented either clear or degraded by 6-band noise-vocoding, and was accompanied either by an iconic gesture depicting the action (clear speech + gesture, C-SG; degraded speech + gesture, D-SG) or by no gesture (clear speech only, C-S; degraded speech only, D-S).
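As an illustration of the degradation manipulation, the following is a minimal sketch of how 6-band noise-vocoding is commonly implemented, here in Python with NumPy/SciPy; the band edges, filter orders and envelope cutoff are illustrative assumptions, not the actual stimulus parameters.

import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(speech, fs, n_bands=6, f_lo=100.0, f_hi=8000.0, env_cut=30.0):
    # Logarithmically spaced band edges between f_lo and f_hi (assumed values).
    edges = np.logspace(np.log10(f_lo), np.log10(f_hi), n_bands + 1)
    noise = np.random.default_rng(0).standard_normal(len(speech))
    env_sos = butter(4, env_cut, btype="low", fs=fs, output="sos")
    vocoded = np.zeros(len(speech))
    for lo, hi in zip(edges[:-1], edges[1:]):
        band_sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
        speech_band = sosfiltfilt(band_sos, speech)
        noise_band = sosfiltfilt(band_sos, noise)
        # Amplitude envelope of the speech band (Hilbert magnitude, low-passed),
        # used to modulate noise filtered into the same band.
        envelope = np.clip(sosfiltfilt(env_sos, np.abs(hilbert(speech_band))), 0.0, None)
        vocoded += envelope * noise_band
    # Match the overall RMS level of the original speech.
    return vocoded * np.sqrt(np.mean(speech ** 2) / (np.mean(vocoded ** 2) + 1e-12))

# Example: vocode one second of a synthetic 220 Hz test tone at 44.1 kHz.
fs = 44100
t = np.arange(fs) / fs
degraded = noise_vocode(np.sin(2 * np.pi * 220 * t), fs)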
We quantified changes in time-frequency representations of oscillatory power as the video unfolded.
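A minimal sketch of how such a sensor-level time-frequency analysis could be set up with MNE-Python is given below; the epochs file name, condition labels, frequency range and baseline window are hypothetical placeholders rather than the actual analysis settings.

import numpy as np
import mne
from mne.time_frequency import tfr_morlet

epochs = mne.read_epochs("sub-01_task-gesture-epo.fif")  # hypothetical file name
freqs = np.arange(4.0, 101.0, 2.0)   # 4-100 Hz covers the alpha, beta and gamma bands
n_cycles = freqs / 2.0               # frequency-dependent window length

tfrs = {}
for cond in ["C-S", "C-SG", "D-S", "D-SG"]:
    tfrs[cond] = tfr_morlet(epochs[cond], freqs=freqs, n_cycles=n_cycles,
                            return_itc=False, average=True, decim=2)
    # Express power as percent change relative to a pre-video baseline.
    tfrs[cond].apply_baseline(baseline=(-0.5, 0.0), mode="percent")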
The sources of the task-specific modulations were identified using a beamformer approach.
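The sketch below outlines one way to localize band-limited power with a DICS beamformer in MNE-Python; the forward solution, the beta frequency range and the time windows are assumptions made for illustration, not the actual source analysis settings.

import mne
from mne.time_frequency import csd_morlet
from mne.beamformer import make_dics, apply_dics_csd

epochs = mne.read_epochs("sub-01_task-gesture-epo.fif")   # hypothetical file name
fwd = mne.read_forward_solution("sub-01-fwd.fif")         # hypothetical file name

freqs = list(range(16, 25))   # beta band, 16-24 Hz (assumed range)
# Cross-spectral densities over the whole epoch, an active window and a baseline.
csd = csd_morlet(epochs, frequencies=freqs, tmin=-1.0, tmax=1.0)
csd_active = csd_morlet(epochs, frequencies=freqs, tmin=0.0, tmax=1.0)
csd_baseline = csd_morlet(epochs, frequencies=freqs, tmin=-1.0, tmax=0.0)

# Common spatial filters estimated from the whole-epoch CSD.
filters = make_dics(epochs.info, fwd, csd.mean(),
                    pick_ori="max-power", reduce_rank=True)
stc_active, _ = apply_dics_csd(csd_active.mean(), filters)
stc_baseline, _ = apply_dics_csd(csd_baseline.mean(), filters)
# Relative change in beta power per source location.
relative_change = (stc_active.data - stc_baseline.data) / stc_baseline.data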
Gestural enhancement, calculated by comparing (D-SG vs D-S) with (C-SG vs C-S), revealed significant interactions between the presence of a gesture and the degradation of speech, particularly in the alpha, beta and gamma bands.
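One common way to express this interaction is as a double difference of power, (D-SG minus D-S) minus (C-SG minus C-S), which can then be tested non-parametrically across subjects; the array names, placeholder data and test settings below are purely illustrative assumptions.

import numpy as np
from mne.stats import permutation_cluster_1samp_test

# Placeholder data: subject-level TFR power per condition with shape
# (n_subjects, n_channels, n_freqs, n_times); real values would come from
# the time-frequency analysis above.
rng = np.random.default_rng(0)
n_subjects, n_channels, n_freqs, n_times = 30, 10, 20, 50
power = {cond: rng.standard_normal((n_subjects, n_channels, n_freqs, n_times))
         for cond in ["C-S", "C-SG", "D-S", "D-SG"]}

# Gestural enhancement interaction: the gesture effect under degraded speech
# minus the gesture effect under clear speech, per subject.
interaction = (power["D-SG"] - power["D-S"]) - (power["C-SG"] - power["C-S"])

# Cluster-based permutation test of the interaction against zero.
t_obs, clusters, cluster_pv, _ = permutation_cluster_1samp_test(
    interaction, n_permutations=1000, tail=0)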
Gestural enhancement was reflected by a beta decrease in motor areas, indicative of engagement of the motor system during gesture observation, especially when speech was degraded. A beta band decrease was also observed
in the language network including left inferior frontal gyrus,
a region involved in semantic unification operations, and left
superior temporal regions. This suggests a higher semantic
unification load when a gesture is presented together with
degraded versus clear speech. We also observed a gestural
enhancement effect in the alpha band in visual areas. This
suggests that visual areas are more engaged when a gesture is
present, most likely reflecting the allocation of visual attention,
especially when speech is degraded, which is in line with the
functional inhibition hypothesis (see Jensen & Mazaheri, 2010).
Finally, we observed gamma band effects in left temporal areas,
suggesting facilitated binding of speech and gesture into a
unified representation, especially when speech is degraded. In
conclusion, our results support earlier claims on the recruitment
of a left-lateralized network including motor areas, STS/
MTG and LIFG in speech-gesture integration and gestural
enhancement of speech (see Ozyurek, 2014). Our findings
provide novel insight into the neuronal dynamics associated
with speech-gesture integration: decreases in alpha and beta
power reflect the engagement of respectively the visual and
language/motor networks, whereas a gamma band increase reflects the integrations in left prefrontal cortex. In future work
we will characterize the interaction between these networks by
means of functional connectivity analysis.
Publication type: Poster
Publication date: 2016