David Peeters

Presentations

Displaying 1 - 42 of 42
  • Bosker, H. R., & Peeters, D. (2020). How hands help us hear: Evidence for a manual McGurk Effect. Talk presented at Sinn und Bedeutung 25. London, UK. 2020-09-03 - 2020-09-05.
  • Bosker, H. R., & Peeters, D. (2020). Seeing a beat gesture can change what speech sounds you hear. Talk presented at the 26th Architectures and Mechanisms for Language Processing Conference (AMLap 2020). Potsdam, Germany. 2020-09-03 - 2020-09-05.
  • Callaghan, E., Peeters, D., & Hagoort, P. (2019). Prediction: When, where & how? An investigation into spoken language prediction in naturalistic virtual environments. Poster presented at the Eleventh Annual Meeting of the Society for the Neurobiology of Language (SNL 2019), Helsinki, Finland.
  • Misersky, J., Peeters, D., & Flecken, M. (2019). Moving through virtual space: Does grammar guide event perception? Talk presented at the Workshop Crosslinguistic Perspectives on Processing and Learning (X-PPL). Zurich, Switzerland. 2019-11-04 - 2019-11-05.
  • Peeters, D. (2019). Bilingual switching between languages and listeners: Insights from virtual reality. Talk presented at the Conference on Multilingualism (COM 2019). Leiden, The Netherlands. 2019-09-01 - 2019-09-03.
  • Peeters, D. (2019). On the selection and use of spatial demonstratives. Talk presented at the Deictic Communication (DCOMM) – Theory and Application Conference. Norwich, UK. 2019-07-08 - 2019-07-09.
  • Peeters, D. (2018). The role of shared space in the choice of spatial demonstratives. Poster presented at the 10th Dubrovnik Conference on Cognitive Science (DUCOG 2018), Dubrovnik, Croatia.
  • Peeters, D. (2018). The power of pointing in linking language to the world. Talk presented at Spatial Cognition in a Multimedia and Intercultural World: The 7th International Conference on Spatial Cognition (ICSC 2018). Rome, Italy. 2018-09-10 - 2018-09-14.
  • Peeters, D. (2018). Tight link between pointing gestures and spatial demonstratives is unidirectional. Talk presented at the 8th Conference of the International Society for Gesture Studies (ISGS 8: "Gesture and Diversity"). Cape Town, South Africa. 2018-07-04 - 2018-07-08.
  • Misersky, J., Peeters, D., & Flecken, M. (2017). The virtual reality of events of motion (VROEM). Poster presented at the workshop 'Event Representations in Brain, Language & Development' (EvRep), Nijmegen, The Netherlands.
  • Peeters, D. (2017). A standardized database of 3-D objects for virtual reality research. Poster presented at the 3rd Virtual Social Interaction Workshop, Bielefeld, Germany.
  • Peeters, D. (2017). Introducing virtual reality as the method of choice for experimental pragmatics. Talk presented at the Workshop 'Revising formal semantic and pragmatic theories from a neurocognitive perspective'. Bochum, Germany. 2017-06-19 - 2017-06-20.
  • Peeters, D. (2017). Virtual Reality revolution in the language sciences. Talk presented at the Workshop Key Questions and New Methods in the Language Sciences. Berg en Dal, The Netherlands. 2017-06-14 - 2017-06-17.
  • Tromp, J., Peeters, D., Meyer, A. S., & Hagoort, P. (2017). Combining Virtual Reality and EEG to study semantic and pragmatic processing in a naturalistic environment. Talk presented at the workshop 'Revising formal Semantic and Pragmatic theories from a Neurocognitive Perspective' (NeuroPragSem, 2017). Bochum, Germany. 2017-06-19 - 2017-06-20.
  • Peeters, D. (2016). Behavioral and neural correlates of bilingual language switching in virtual reality. Talk presented at the Leiden Institute for Brain and Cognition. Leiden, The Netherlands. 2016-06-23.
  • Peeters, D. (2016). Behavioral and neural correlates of bilingual language switching in virtual reality. Talk presented at the 2nd Virtual Social Interaction Workshop. Manchester, UK. 2016-07-12 - 2016-07-13.
  • Peeters, D. (2016). Behavioral and neural correlates of bilingual language switching in virtual reality. Poster presented at the Eighth Annual Meeting of the Society for the Neurobiology of Language (SNL 2016), London, UK.

    Abstract

    In everyday life bilinguals often switch between their languages as a function of the language background of their interlocutor. A conversation with a colleague in one's second language (L2) may, for instance, be followed by a phone call with a friend in one's native language (L1). The neurocognitive mechanisms supporting such bilingual language switching capacities have traditionally often been studied using cued-picture naming paradigms: participants named pictures that appeared on a computer screen in either their first or their second language as a function of the background color of the screen. Recently this cued-picture naming paradigm has been criticized for being unnatural, not reflecting everyday out-of-the-lab language switching. We made use of recent advances in virtual reality technology to overcome this limitation by investigating bilingual language switching in a contextually rich, ecologically valid setting while maintaining experimental control. Three separate picture naming experiments were carried out with TrialType (switch, non-switch) and Language (Dutch, English) as independent variables in a 2x2 design. In each experiment, 24 different Dutch-English late bilingual participants from the same student population named pictures in their L1 Dutch or their L2 English. Experiment 1 was a baseline experiment using the traditional cued-picture naming paradigm. In Experiments 2 and 3, participants named pictures for two virtual agents in a virtual environment that was rendered via a head-mounted display, creating a fully immersive virtual experience. Before the start of these two experiments, one virtual agent indicated in Dutch that she only understood Dutch, and the other indicated in English that she only understood English. The virtual agents sat behind a virtual monitor on which pictures appeared that participants named in Dutch or English as a function of the virtual agent that looked at them at picture onset. 
The physical appearance of the virtual agents in relation to their language identity (Dutch or English), and their position behind the virtual monitor (left vs. right), were fully counterbalanced across participants. In Experiment 3, participants' electroencephalogram (EEG) was recorded. Linear mixed effects regression models of the picture naming latencies revealed similar symmetrical switch costs in all three experiments. Switching languages led to significantly slower reaction times than not switching languages, but adding the interaction term (TrialType x Language) to the model did not improve the model fit. Data-driven cluster-based permutation tests on the EEG data collected in Experiment 3, time-locked to picture onset, revealed a more negative ERP wave for switch compared to non-switch trials, which was most pronounced between 540 ms and 700 ms after picture onset, reflecting a language-independent neural marker of language switching preceding speech onset. As in the behavioral data, no interaction with Language was found. These results confirm the ecological validity of the cued-picture naming paradigm to study bilingual language switching and open up a wide range of possibilities to use virtual reality technology in the study of language production and comprehension in bilingual and other communicative settings.
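The analysis described in this abstract — a linear mixed-effects regression of naming latencies with TrialType and Language as fixed effects, testing whether adding their interaction improves the fit — can be sketched as follows. This is an illustrative reconstruction on simulated data; all variable names, effect sizes, and the data itself are hypothetical, not the study's actual code or measurements.

```python
# Illustrative sketch (not the authors' code): fit a linear mixed-effects
# model of simulated picture-naming latencies with a random intercept per
# participant, then compare a model with and without the TrialType x
# Language interaction. Effect sizes below are made up for demonstration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_trials = 24, 40  # 24 participants, as in each experiment
rows = []
for subj in range(n_subj):
    subj_offset = rng.normal(0, 30)  # random intercept per participant
    for _ in range(n_trials):
        trial_type = rng.choice(["nonswitch", "switch"])
        language = rng.choice(["L1_Dutch", "L2_English"])
        # Simulated symmetrical switch cost of ~50 ms, no interaction
        rt = 700 + subj_offset + (50 if trial_type == "switch" else 0) \
             + rng.normal(0, 40)
        rows.append((subj, trial_type, language, rt))
df = pd.DataFrame(rows, columns=["subject", "trial_type", "language", "rt"])

# Model without the interaction term (ML estimation so fits are comparable)...
m0 = smf.mixedlm("rt ~ trial_type + language", df,
                 groups=df["subject"]).fit(reml=False)
# ...and with it: if the interaction does not improve the fit, switch costs
# are symmetrical (the same in both languages), as reported in the abstract.
m1 = smf.mixedlm("rt ~ trial_type * language", df,
                 groups=df["subject"]).fit(reml=False)
print(m0.params["trial_type[T.switch]"])  # estimated switch cost in ms
```

Comparing the two fits (e.g. by log-likelihood) mirrors the abstract's test of whether switch costs differ between L1 and L2.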
  • Peeters, D. (2016). Processing consequences of onomatopoeic iconicity in spoken language comprehension. Talk presented at the 38th Annual Meeting of the Cognitive Science Society (CogSci2016). Philadelphia, US. 2016-08-10 - 2016-08-13.
  • Peeters, D. (2016). The power of pointing in linking language to the world. Talk presented at the workshop 'Linking Language to Motor Concepts - Cognitive Correlates of Grasping Actions'. Vienna, Austria. 2016-06-10.
  • Peeters, D. (2016). Virtual Reality revolution in the language sciences. Talk presented at the Tilburg Center for Cognition and Communication. Tilburg, The Netherlands. 2016-09-07.
  • Peeters, D., Snijders, T. M., Hagoort, P., & Ozyurek, A. (2015). The neural integration of pointing gesture and speech in a visual context: An fMRI study. Poster presented at the 7th Annual Society for the Neurobiology of Language Conference (SNL 2015), Chicago, USA.
  • Peeters, D., Snijders, T. M., Hagoort, P., & Ozyurek, A. (2015). The role of left inferior frontal gyrus in the integration of pointing gestures and speech. Talk presented at the 4th GESPIN - Gesture & Speech in Interaction Conference. Nantes, France. 2015-09-02 - 2015-09-04.
  • Tromp, J., Peeters, D., Hagoort, P., & Meyer, A. S. (2015). Combining EEG and virtual reality: The N400 in a virtual environment. Talk presented at the 4th edition of the Donders Discussions (DD, 2015). Nijmegen, Netherlands. 2015-11-05 - 2015-11-06.

    Abstract

    A recurring criticism in the field of psycholinguistics is the lack of ecological validity of experimental designs. For example, many experiments on sentence comprehension are conducted in enclosed booths, where sentences are presented word by word on a computer screen. In addition, participants are very often instructed to make judgments that relate directly to the experimental manipulation. Thus, the contexts in which these processes are studied are quite restricted, which calls into question the generalizability of the results to more naturalistic environments. A possible solution to this problem is the use of virtual reality (VR) in psycholinguistic experiments. By immersing participants in a virtual environment, ecological validity can be increased while experimental control is maintained. In the current experiment we combine electroencephalography (EEG) and VR to look at semantic processing in a more naturalistic setting. During the experiment, participants move through a visually rich virtual restaurant. Tables and avatars are placed in the restaurant, and participants are instructed to stop at each table and look at the object (e.g. a plate with a steak) in front of the avatar. The avatar then produces an utterance to accompany the object (e.g. “I think this steak is very nice”), in which the noun either matches (e.g. steak) or mismatches (e.g. mandarin) the item on the table. Based on previous research, we predict a modulation of the N400, which should be larger in the mismatch than in the match condition. Implications of the use of virtual reality for experimental research will be discussed.
  • Peeters, D., Chu, M., Holler, J., Hagoort, P., & Ozyurek, A. (2014). Behavioral and neurophysiological correlates of communicative intent in the production of pointing gestures. Poster presented at the Annual Meeting of the Society for the Neurobiology of Language [SNL2014], Amsterdam, the Netherlands.
  • Peeters, D., Chu, M., Holler, J., Hagoort, P., & Ozyurek, A. (2014). Behavioral and neurophysiological correlates of communicative intent in the production of pointing gestures. Talk presented at the 6th Conference of the International Society for Gesture Studies (ISGS6). San Diego, Cal. 2014-07-08 - 2014-07-11.
  • Peeters, D. (2014). Behavioral and neurophysiological correlates of communicative intent in the production of pointing gestures. Talk presented at The 4th Nijmegen Gesture Centre Workshop: Communicative intention in gesture and action. Nijmegen, The Netherlands. 2014-06-04 - 2014-06-05.
  • Peeters, D., Azar, Z., & Ozyurek, A. (2014). The interplay between joint attention, physical proximity, and pointing gesture in demonstrative choice. Talk presented at the 36th Annual Meeting of the Cognitive Science Society (CogSci2014). Québec City, Canada. 2014-07-23 - 2014-07-26.
  • Grainger, J., Peeters, D., Runnqvist, E., & Bertrand, D. (2013). No more cued pictures! Asymmetric switch costs in bilingual language production induced by reading words. Talk presented at the Annual Meeting of the Psychonomic Society. Toronto, Canada. 2013-11-14 - 2013-11-17.
  • Peeters, D., Runnqvist, E., Bertrand, D., & Grainger, J. (2013). Bilingual language switching across modalities: RT and ERP effects. Talk presented at The Workshop on Neurobilingualism. Groningen, The Netherlands. 2013-08-25 - 2013-08-27.
  • Peeters, D., Runnqvist, E., Bertrand, D., & Grainger, J. (2013). Bilingual language switching across modalities: RT and ERP effects. Talk presented at The conference 'Cross-linguistic priming in bilinguals: Perspectives and constraints'. Nijmegen, The Netherlands. 2013-09-09 - 2013-09-11.
  • Peeters, D., Chu, M., Holler, J., Ozyurek, A., & Hagoort, P. (2013). Getting to the point: The influence of communicative intent on the form of pointing gestures. Talk presented at the 35th Annual Meeting of the Cognitive Science Society (CogSci 2013). Berlin, Germany. 2013-08-01 - 2013-08-03.
  • Peeters, D., Chu, M., Holler, J., Ozyurek, A., & Hagoort, P. (2013). The influence of communicative intent on the form of pointing gestures. Poster presented at the Fifth Joint Action Meeting (JAM5), Berlin, Germany.
  • Peeters, D., Ozyurek, A., & Hagoort, P. (2012). Behavioral and neural correlates of deictic reference. Poster presented at the 18th Annual Conference on Architectures and Mechanisms for Language Processing [AMLaP 2012], Riva del Garda, Italy.
  • Peeters, D., Grainger, J., & Dijkstra, T. (2012). Cognate processing in L1 and L2 sentence context: A first ERP study. Poster presented at 25th Annual CUNY Conference on Human Sentence Processing, New York City, NY.
  • Peeters, D., Grainger, J., & Dijkstra, T. (2012). Processing the same words in different languages: An ERP study. Poster presented at Psycholinguistics in Flanders [PiF2012], Berg en Dal, The Netherlands.
  • Peeters, D., Ozyurek, A., & Hagoort, P. (2012). The comprehension of exophoric reference: An ERP study. Poster presented at the Fourth Annual Neurobiology of Language Conference (NLC), San Sebastian, Spain.

    Abstract

    An important property of language is that it can be used exophorically, for instance in referring to entities in the extra-linguistic context of a conversation using demonstratives such as “this” and “that”. Despite large-scale cross-linguistic descriptions of demonstrative systems, the mechanisms underlying the comprehension of such referential acts are poorly understood. Therefore, we investigated the neural mechanisms underlying demonstrative comprehension in situated contexts. Twenty-three participants were presented on a computer screen with pictures containing a speaker and two similar objects. One of the objects was close to the speaker, whereas the other was either distal from the speaker but optically close to the participant (“sagittal orientation”), or distal from both (“lateral orientation”). The speaker pointed to one object, and participants heard sentences spoken by the speaker containing a proximal (“this”) or distal (“that”) demonstrative, and a correct or incorrect noun-label (i.e., a semantic violation). EEG was recorded continuously and time-locked to the onset of demonstratives and nouns. Semantic violations on the noun-label yielded a significant, widespread N400 effect, regardless of the objects’ orientation. Comparing the comprehension of proximal to distal demonstratives in the sagittal orientation yielded a similar N400 effect, both for the close and the far referent. Interestingly, no demonstrative effect was found when objects were oriented laterally. Our findings suggest a similar time-course for demonstrative and noun-label processing. However, the comprehension of demonstratives depends on the spatial orientation of potential referents, whereas noun-label comprehension does not. These findings reveal new insights about the mechanisms underlying everyday demonstrative comprehension.
  • Peeters, D., & Ozyurek, A. (2012). The role of contextual factors in the use of demonstratives: Differences between Turkish and Dutch. Talk presented at the 6th Lodz Symposium: New Developments in Linguistic Pragmatics. Lodz, Poland. 2012-05-26 - 2012-05-28.

    Abstract

    An important feature of language is that it enables human beings to refer to entities, actions and events in the external world. In everyday interaction, one can refer to concrete entities in the extra-linguistic physical environment of a conversation by using demonstratives such as this and that. Traditionally, the choice of which demonstrative to use has been explained in terms of the distance of the referent [1]. In contrast, recent observational studies in different languages have suggested that factors such as joint attention also play an important role in demonstrative choice [2][3]. These claims have never been tested in a controlled setting and across different languages. Therefore, we tested demonstrative choice in a controlled elicitation task in two languages that previously have only been studied observationally: Turkish and Dutch. In our study, twenty-nine Turkish and twenty-four Dutch participants were presented with pictures including a speaker, an addressee and an object (the referent). They were asked which demonstrative they would use in the depicted situations. Besides the distance of the referent, we manipulated the addressee’s focus of visual attention, the presence of a pointing gesture, and the sentence type. A repeated measures analysis of variance showed that, in addition to the distance of the referent, the focus of attention of the addressee on the referent and the type of sentence in which a demonstrative was used influenced demonstrative choice in Turkish. In Dutch, only the distance of the referent and the sentence type influenced demonstrative choice. Our cross-linguistic findings show that in different languages, people take into account both similar and different aspects of triadic situations to select a demonstrative. These findings reject descriptions of demonstrative systems that explain demonstrative choice in terms of one single variable, such as distance.
The controlled study of referring acts in triadic situations is a valuable extension to observational research, in that it gives us the possibility to look more specifically into the interplay between language, attention, and other contextual factors influencing how people refer to entities in the world. References: [1] Levinson, S. C. (1983). Pragmatics. Cambridge: Cambridge University Press. [2] Diessel, H. (2006). Demonstratives, joint attention and the emergence of grammar. Cognitive Linguistics, 17(4), 463-489. [3] Küntay, A. C., & Özyürek, A. (2006). Learning to use demonstratives in conversation: What do language specific strategies in Turkish reveal? Journal of Child Language, 33, 303-320.
  • Peeters, D., & Ozyurek, A. (2012). The role of contextual factors in the use of demonstratives: Differences between Turkish and Dutch. Poster presented at The IMPRS Relations in Relativity Workshop, Max Planck Institute for Psycholinguistics, Nijmegen, the Netherlands.
  • Peeters, D., & Ozyurek, A. (2011). Demonstrating the importance of joint attention in the use of demonstratives: The case of Turkish. Poster presented at The 4th Biennial Conference of Experimental Pragmatics [XPRAG 2011], Barcelona, Spain.
  • Peeters, D., Dijkstra, T., & Grainger, J. (2011). The cognate facilitation effect is modulated by the word frequency of both readings. Talk presented at Psycholinguistics in Flanders 2011 [PIF2011]. Antwerp, Belgium. 2011-05-25 - 2011-05-26.
  • Peeters, D., Dijkstra, T., & Grainger, J. (2011). The cognate facilitation effect is modulated by the word frequency of both readings. Poster presented at Workshop on Bilingualism: Neurolinguistic and Psycholinguistic Perspectives, Aix-en-Provence, France.

    Abstract

    When a word is similar in orthography and meaning between the two languages of a bilingual, i.e., when it is a cognate, its recognition is generally facilitated compared to matched control words [1]. There have been contrasting views in the literature on how to explain this facilitation effect for completely identical cognates, such as FILM for Dutch and English [2][3]. Do identical cognates have one or two orthographic representations in the bilingual brain? To answer this question, we selected four groups of cognates with either a low or high frequency in the first and/or second language of French-English bilinguals and matched them with English control words. The bilinguals performed an English lexical decision task while their RTs and ERPs were recorded. The behavioral data showed facilitatory effects of cognate status and English L2 frequency. Further analysis of the identical cognates revealed significant main effects of both English and French frequency. Cognate facilitation was larger for cognates with a low English frequency compared to cognates with a high English frequency. The electrophysiological data showed a decreased negativity for cognates compared to control words in the N400 time-window. These effects were more prominent for low-frequency English cognates than for high-frequency English cognates. Interestingly, for cognates with a low English frequency and a high French frequency, an effect was found in an early time-window (100-150 ms after stimulus onset). These results shed light on the representation of identical cognates in the bilingual brain and question the representational locus of word frequency effects. [1] Dijkstra, T., Miwa, K., Brummelhuis, B., Sappelli, M., & Baayen, H. (2010). How cross-language similarity and task demands affect cognate recognition. Journal of Memory and Language, 62, 284-301. [2] Voga, M., & Grainger, J. (2007). Cognate status and cross-script translation priming. Memory & Cognition, 35(5), 938-952. [3] Dijkstra, T., & Van Heuven, W. J. B. (2002). The architecture of the bilingual word recognition system: From identification to decision. Bilingualism: Language and Cognition, 5, 175-197.
  • Peeters, D. (2011). The representation and processing of identical cognates. Talk presented at Donders Discussions 2011. Nijmegen, The Netherlands. 2011-10-13 - 2011-10-14.

    Abstract

    Across the languages of a bilingual, translation equivalents can have the same orthographic form and shared meaning (e.g., TABLE in French and English). How such words, called identical cognates, are processed and represented in the bilingual brain is not well understood. I will present a study of late French-English bilinguals who processed identical cognates and control words in an L2 (English) lexical decision task. Both behavioral and electrophysiological data were collected. Reaction times to identical cognates were shorter than for non-cognate controls and depended on both English and French frequency. Cognates with a low English frequency showed a larger cognate advantage than those with a high English frequency. In addition, N400 amplitude was found to be sensitive to cognate status and both the English and French frequency of the cognate words. Theoretical consequences for the processing and representation of identical cognates are discussed.
