Linda Drijvers

Presentations

  • Mazzini, S., Seijdel, N., & Drijvers, L. (2023). Gestural enhancement of degraded speech comprehension in Autism Spectrum Disorder. Talk presented at the 8th Gesture and Speech in Interaction (GESPIN 2023). Nijmegen, The Netherlands. 2023-09-13 - 2023-09-15.
  • Ter Bekke, M., Holler, J., & Drijvers, L. (2023). Do listeners use speakers’ iconic hand gestures to predict upcoming words?. Talk presented at the 9th biennial Joint Action Meeting (JAM). Budapest, Hungary. 2023-07-10 - 2023-07-12.
  • Drijvers, L., & Holler, J. (2022). Spatial orientation influences cognitive processing in conversation. Talk presented at the 18th NVP Winter Conference on Brain and Cognition. Egmond aan Zee, The Netherlands. 2022-04-28 - 2022-04-30.
  • Drijvers, L. (2022). How does the brain integrate speech and gestures?. Talk presented at the IMPRS Conference 2022. Nijmegen, The Netherlands. 2022-06-01 - 2022-06-03.
  • Drijvers, L. (2022). Multimodal language in the brain. Talk presented at a Psycholinguistics Colloquium, Humboldt University. online. 2022-01-24.
  • Drijvers, L. (2022). Towards a multimodal view on the neurobiology of language. Talk presented at Neurobiology of Language: Key Issues and Ways Forward II. online. 2022-03-17 - 2022-03-18.
  • Drijvers, L. (2021). The multimodal facilitation effect. Talk presented at ESLP 2021 (Embodied & Situated Language Processing). online. 2021-09-20 - 2021-09-29.
  • Drijvers, L. (2020). Rapid Invisible Frequency Tagging for language research. Talk presented at Neuroxillations, University of Oxford. online. 2020-06-22.
  • Drijvers, L. (2020). Studying multimodal language processing in interactive settings with Rapid Invisible Frequency Tagging. Talk presented at Speech Science Forum, University College London. online. 2020-12-17.
  • Ter Bekke, M., Drijvers, L., & Holler, J. (2020). The predictive potential of hand gestures during conversation: An investigation of the timing of gestures in relation to speech. Talk presented at the 7th Gesture and Speech in Interaction (GESPIN 2020). online. 2020-09-07 - 2020-09-09.
  • Drijvers, L. (2019). Handbewegingen en het brein: Hoe je hersenen ervoor zorgen dat je iemand kunt horen en zien praten [Hand movements and the brain: How your brain ensures that you can hear and see someone talk]. Talk presented at Werkoverleg Amsterdamse Psycholinguisten, University of Amsterdam. Amsterdam, The Netherlands. 2019-04-18.
  • Drijvers, L. (2019). Speech-gesture integration in clear and adverse listening conditions. Talk presented at the Adverse Listening Condition Workshop, VU Medical Center. Amsterdam, The Netherlands. 2019-10-03.
  • Drijvers, L. (2019). The neural mechanisms of how iconic gestures boost degraded speech comprehension in native and non-native listeners. Talk presented at the Gesture-Sign Workshop Prague 2019: Converging the Perspectives on Theories, Methods, and Applications. Prague, Czech Republic. 2019-05-16 - 2019-05-17.
  • Blokpoel, M., Dingemanse, M., Kachergis, G., Bögels, S., Drijvers, L., Eijk, L., Ernestus, M., De Haas, N., Holler, J., Levinson, S. C., Lui, R., Milivojevic, B., Neville, D., Ozyurek, A., Rasenberg, M., Schriefers, H., Trujillo, J. P., Winner, T., Toni, I., & Van Rooij, I. (2018). Ambiguity helps higher-order pragmatic reasoners communicate. Talk presented at the 14th biennial conference of the German Society for Cognitive Science, GK (KOGWIS 2018). Darmstadt, Germany. 2018-09-03 - 2018-09-06.
  • Drijvers, L. (2018). How do native and non-native listeners integrate speech and gestures?. Talk presented at the Naturalis Museum / University of Leiden. Leiden, The Netherlands. 2018-01-26.
  • Drijvers, L. (2018). Handbewegingen en het brein [Hand movements and the brain]. Talk presented at the NEMO Science Night, NEMO Museum. Amsterdam, The Netherlands. 2018-11-18.
  • Drijvers, L. (2018). Neural dynamics underlying speech-gesture integration in native and non-native listeners. Talk presented at the SpAM - Speech in the Age of Multimodal Humanities Conference. Pisa, Italy. 2018-10-11.
  • Drijvers, L. (2018). On the neural integration of gestures and speech in adverse listening conditions. Talk presented at the Donders Centre for Cognition Language Division. Nijmegen, The Netherlands. 2018-01-08.
  • Drijvers, L. (2018). Oscillatory dynamics underlying speech-gesture integration. Talk presented at the Max Planck Institute for Human Cognitive and Brain Sciences. Leipzig, Germany. 2018-03-02.
  • Drijvers, L. (2018). Rapid-frequency tagging in speech-gesture integration. Talk presented at the Centre for Human Brain Health, University of Birmingham. Birmingham, UK. 2018-04-17.
  • Drijvers, L. (2018). Speech-gesture integration studied by rapid-frequency tagging. Talk presented at the Attention & Oscillations Workshop, Centre for Human Brain Health. Birmingham, UK. 2018-11-16.
  • Drijvers, L. (2017). Communicating science to the masses. Talk presented at IMPRS Introduction days. Nijmegen, The Netherlands. 2017-09-25.
  • Drijvers, L. (2017). How do we hear and see speech in a noisy bar?. Talk presented at Neerlandistiek in het Nieuws, Faculty of Arts, Radboud University. Nijmegen, The Netherlands. 2017-01-26.
  • Drijvers, L. (2017). How does our brain hear and see language?. Talk presented at the School of Psychology, University of Birmingham. Birmingham, UK. 2017.
  • Drijvers, L. (2017). How does our brain hear and see language?. Talk presented at Radboud Summerschool 'From Molecule to Brain'. Nijmegen, The Netherlands. 2017-08-15.
  • Drijvers, L. (2017). How do gestures contribute to understanding language?. Talk presented at OBA Amsterdam. Amsterdam, The Netherlands. 2017-02-28.
  • Drijvers, L. (2017). The neural mechanisms of how iconic gestures boost degraded speech comprehension. Talk presented at the workshop Types of iconicity in language use, development, and processing. Nijmegen, The Netherlands. 2017-07-06 - 2017-07-07.
  • Drijvers, L., Ozyurek, A., & Jensen, O. (2016). Gestural enhancement of degraded speech comprehension engages the language network, motor and visual cortex as reflected by a decrease in the alpha and beta band. Talk presented at Sensorimotor Speech Processing Symposium. London, UK. 2016-08-16.
  • Drijvers, L., Ozyurek, A., & Jensen, O. (2016). Gestural enhancement of degraded speech comprehension engages the language network, motor cortex and visual cortex. Talk presented at the 2nd Workshop on Psycholinguistic Approaches to Speech Recognition in Adverse Conditions (PASRAC). Nijmegen, The Netherlands. 2016-10-31 - 2016-11-01.
  • Drijvers, L. (2017). Left-temporal alpha and beta suppression predicts L2 listeners' benefit of gestures during clear and degraded speech comprehension. Talk presented at the Donders Discussions 2017. Nijmegen, The Netherlands. 2017-10-26 - 2017-10-27.
  • Drijvers, L., Ozyurek, A., & Jensen, O. (2016). Oscillatory and temporal dynamics show engagement of the language network, motor system and visual cortex during gestural enhancement of degraded speech. Talk presented at the Donders Discussions 2016. Nijmegen, The Netherlands. 2016-11-23 - 2016-11-24.
  • Drijvers, L., & Ozyurek, A. (2016). Visible speech enhanced: What do iconic gestures and lip movements contribute to degraded speech comprehension?. Talk presented at the 7th Conference of the International Society for Gesture Studies (ISGS7). Paris, France. 2016-07-18 - 2016-07-22.

    Abstract

    Natural, face-to-face communication consists of an audiovisual binding that integrates speech and visual information, such as iconic co-speech gestures and lip movements. Especially in adverse listening conditions, such as noise, this visual information can enhance speech comprehension. However, the contributions of lip movements and iconic gestures to understanding speech in noise have mostly been studied separately. Here, we investigated the contribution of iconic gestures and lip movements to degraded speech comprehension in a joint context. In a free-recall task, participants watched short videos of an actress uttering an action verb. This verb could be presented in clear speech, severely degraded speech (2-band noise-vocoding) or moderately degraded speech (6-band noise-vocoding), and participants could view the actress with her lips blocked, with her lips visible, or with her lips visible while making an iconic co-speech gesture. Additionally, we presented these clips without audio, with just the lip movements present or with just lip movements and gestures present, to investigate how much information listeners could get from visual input alone. Our results reveal that when listeners perceive degraded speech in a visual context, they benefit more from gestural information than from lip movements alone. This benefit is larger at moderate noise levels, where auditory cues are still moderately reliable, than at severe noise levels, where auditory cues are no longer reliable. As a result, listeners are only able to benefit from this additive effect of ‘double’ multimodal enhancement by iconic gestures and lip movements when there are enough auditory cues present to map lip movements to the phonological information in the speech signal.
  • Van Leeuwen, T. M., Dingemanse, M., Lockwood, G., & Drijvers, L. (2016). Color associations in nonsynaesthetes and synaesthetes: A large-scale study in Dutch. Talk presented at Synesthesia and Cross-Modal Perception. Dublin, Ireland. 2016-04-22.
  • Drijvers, L., & Ozyurek, A. (2015). Visible speech enhanced: What do gestures and lips contribute to speech comprehension in noise?. Talk presented at the Nijmegen-Tilburg Multi-modality workshop. Tilburg, The Netherlands. 2015-10-22.
  • Schubotz, L., Drijvers, L., Holler, J., & Ozyurek, A. (2015). The cocktail party effect revisited in older and younger adults: When do iconic co-speech gestures help?. Talk presented at Donders Discussions 2015. Nijmegen, The Netherlands. 2015-11-05.
