Sara Mazzini


  • The Communicative Brain, Mazzini*, S., Seijdel*, N., & Drijvers*, L. (2023). Autistic individuals benefit from gestures during degraded speech comprehension. PsyArXiv. doi:10.31234/


    *All authors contributed equally to this work
    Iconic co-speech gestures enhance degraded speech comprehension in neurotypical adults. Nonetheless, the benefit of gestures in comprehending degraded speech has not been investigated in neurodivergent populations, such as autistic individuals. Previous research demonstrated atypical audiovisual and speech-gesture integration in autistic individuals, suggesting that integrating speech and gestures may be more challenging and less beneficial for speech comprehension in adverse listening conditions in comparison to neurotypicals. Conversely, autistic individuals could also benefit from additional cues to comprehend speech in noise, as they encounter difficulties in filtering relevant information from noise. In the present study, we investigated gestural enhancement of degraded speech in neurotypical and autistic adults. Participants were presented with videos of an actress uttering a Dutch action verb and had to complete a 4-alternative forced choice task. The action verb was produced in either clear or degraded speech and accompanied by a matching gesture or without a gesture. We observed a gestural enhancement effect in both neurotypical and autistic individuals, and no difference in the size of this effect between the groups. Our findings suggest that despite the previously reported differences in audiovisual integration and gesture interpretation, autistic individuals do benefit from gestures in degraded speech comprehension, similarly to neurotypicals. These findings provide relevant insights to improve communication practices with autistic individuals and to develop new interventions for speech comprehension.
  • The Communicative Brain, Seijdel*, N., Mazzini*, S., & Drijvers*, L. (2023). Environmental noise affects audiovisual gain during speech comprehension in adverse listening conditions. OSF Preprints. doi:10.31219/


    *All authors contributed equally to this work
    Face-to-face communication involves both auditory speech and visual bodily information (e.g. visible speech, gestures), and often occurs in noisy settings. In such settings, meaningful hand gestures have been shown to facilitate speech comprehension. However, individual differences exist in how much listeners benefit from these gestures. One factor that could impact how a listener processes and integrates audiovisual information is their experience with environmental noise. For example, growing up or going to school in high-noise environments (such as in the proximity of frequent heavy traffic or airports) has been shown to influence language abilities, such as reading comprehension and speech perception. In the current study, we investigated whether a listener’s experience with environmental noise affects how much they can benefit from gestures in adverse listening conditions. In an online experiment, 40 participants watched video clips of an actress articulating an action verb, accompanied by an iconic gesture or no gesture, in clear or degraded speech. Participants identified the verb and rated the noise levels of their current and previous living environments based on their geographical locations. We hypothesized that high-noise environments would hinder task performance and audiovisual gain. The results indicated that the amount of environmental noise, in current and previous living environments, influenced how much participants benefited from gestures during degraded speech comprehension. Individuals from lower-noise environments benefitted more from visual semantic information than those from high-noise environments. Individuals from higher-noise environments performed better overall, and benefitted less from gestures, potentially because they developed adaptive mechanisms to cope with auditory degradation.