Presentations

  • Peeters, D., Snijders, T. M., Hagoort, P., & Ozyurek, A. (2015). The role of left inferior frontal gyrus in the integration of pointing gestures and speech. Talk presented at the 4th GESPIN - Gesture & Speech in Interaction Conference. Nantes, France. 2015-09-02 - 2015-09-04.
  • Peeters, D., Snijders, T. M., Hagoort, P., & Ozyurek, A. (2015). The neural integration of pointing gesture and speech in a visual context: An fMRI study. Poster presented at the 7th Annual Society for the Neurobiology of Language Conference (SNL 2015), Chicago, USA.
  • Tromp, J., Peeters, D., Hagoort, P., & Meyer, A. S. (2015). Combining EEG and virtual reality: The N400 in a virtual environment. Talk presented at the 4th edition of the Donders Discussions (DD, 2015). Nijmegen, Netherlands. 2015-11-05 - 2015-11-06.

    Abstract

    A recurring criticism in the field of psycholinguistics is the lack of ecological validity of experimental designs. For example, many experiments on sentence comprehension are conducted in enclosed booths, where sentences are presented word by word on a computer screen. In addition, participants are very often instructed to make judgments that relate directly to the experimental manipulation. Thus, the contexts in which these processes are studied are quite restricted, which calls into question the generalizability of the results to more naturalistic environments. A possible solution to this problem is the use of virtual reality (VR) in psycholinguistic experiments. By immersing participants in a virtual environment, ecological validity can be increased while experimental control is maintained.
    In the current experiment we combine electroencephalography (EEG) and VR to look at semantic processing in a more naturalistic setting. During the experiment, participants move through a visually rich virtual restaurant. Tables and avatars are placed in the restaurant, and participants are instructed to stop at each table and look at the object (e.g. a plate with a steak) in front of the avatar. The avatar then produces an utterance accompanying the object (e.g. “I think this steak is very nice”), in which the noun either matches (e.g. steak) or mismatches (e.g. mandarin) the item on the table. Based on previous research, we predict a modulation of the N400, which should be larger in the mismatch than in the match condition. Implications of the use of virtual reality for experimental research will be discussed.
