Environmental noise affects audiovisual gain during speech comprehension in adverse listening conditions

Seijdel*, N., Mazzini*, S., & Drijvers*, L. (2023). Environmental noise affects audiovisual gain during speech comprehension in adverse listening conditions. OSF Preprints. doi:10.31219/osf.io/wbv9r.
* = all authors contributed equally
Face-to-face communication involves both auditory speech and visual bodily information (e.g., visible speech, gestures), and often occurs in noisy settings. In such settings, meaningful hand gestures have been shown to facilitate speech comprehension. However, listeners differ in how much they benefit from these gestures. One factor that could affect how a listener processes and integrates audiovisual information is their experience with environmental noise. For example, growing up or going to school in high-noise environments (such as near heavy traffic or airports) has been shown to influence language abilities, including reading comprehension and speech perception. In the current study, we investigated whether a listener’s experience with environmental noise affects how much they can benefit from gestures in adverse listening conditions. In an online experiment, 40 participants watched video clips of an actress articulating an action verb, accompanied by an iconic gesture or no gesture, in clear or degraded speech. Participants identified the verb and rated the noise levels of their current and previous living environments based on their geographical locations. We hypothesized that high-noise environments would hinder task performance and audiovisual gain. The results indicated that the amount of environmental noise in current and previous living environments influenced how much participants benefited from gestures during degraded speech comprehension. Individuals from lower-noise environments benefited more from visual semantic information than those from high-noise environments. Individuals from higher-noise environments performed better overall but benefited less from gestures, potentially because they had developed adaptive mechanisms to cope with auditory degradation.
Publication type
Preprint
Publication date
2023