Publications

  • Wassenaar, M., & Hagoort, P. (2007). Thematic role assignment in patients with Broca's aphasia: Sentence-picture matching electrified. Neuropsychologia, 45(4), 716-740. doi:10.1016/j.neuropsychologia.2006.08.016.

    Abstract

    An event-related brain potential experiment was carried out to investigate on-line thematic role assignment during sentence–picture matching in patients with Broca's aphasia. Subjects were presented with a picture that was followed by an auditory sentence. The sentence either matched the picture or mismatched the visual information depicted. Sentences differed in complexity, and ranged from simple active semantically irreversible sentences to passive semantically reversible sentences. ERPs were recorded while subjects were engaged in sentence–picture matching. In addition, reaction time and accuracy were measured. Three groups of subjects were tested: Broca patients (N = 10), non-aphasic patients with a right hemisphere (RH) lesion (N = 8), and healthy age-matched controls (N = 15). The results of this study showed that, in neurologically unimpaired individuals, thematic role assignment in the context of visual information was an immediate process. This is in contrast to patients with Broca's aphasia, who demonstrated no signs of on-line sensitivity to the picture–sentence mismatches. The syntactic contribution to the thematic role assignment process seemed to be diminished, given the reduction and even absence of P600 effects. Nevertheless, Broca patients showed some off-line behavioral sensitivity to the sentence–picture mismatches. The long response latencies of Broca's aphasics make it likely that off-line response strategies were used.
  • Weber, A., & Cutler, A. (2003). Perceptual similarity co-existing with lexical dissimilarity [Abstract]. Abstracts of the 146th Meeting of the Acoustical Society of America. Journal of the Acoustical Society of America, 114(4 Pt. 2), 2422. doi:10.1121/1.1601094.

    Abstract

    The extreme case of perceptual similarity is indiscriminability, as when two second-language phonemes map to a single native category. An example is the English had–head vowel contrast for Dutch listeners; Dutch has just one such central vowel, transcribed [ɛ]. We examine whether the failure to discriminate in phonetic categorization implies indiscriminability in other—e.g., lexical—processing. Eyetracking experiments show that Dutch-native listeners instructed in English to "click on the panda" look (significantly more than native listeners) at a pictured pencil, suggesting that pan- activates their lexical representation of pencil. The reverse, however, is not the case: "click on the pencil" does not induce looks to a panda, suggesting that pen- does not activate panda in the lexicon. Thus prelexically undiscriminated second-language distinctions can nevertheless be maintained in stored lexical representations. The problem of mapping a resulting unitary input to two distinct categories in lexical representations is solved by allowing input to activate only one second-language category. For Dutch listeners to English, this is English [ɛ], as a result of which no vowels in the signal ever map to words containing [æ]. We suggest that the choice of category is here motivated by a more abstract, phonemic, metric of similarity.
  • Weber, A., Broersma, M., & Aoyagi, M. (2011). Spoken-word recognition in foreign-accented speech by L2 listeners. Journal of Phonetics, 39, 479-491. doi:10.1016/j.wocn.2010.12.004.

    Abstract

    Two cross-modal priming studies investigated the recognition of English words spoken with a foreign accent. Auditory English primes were either typical of a Dutch accent or typical of a Japanese accent in English and were presented to both Dutch and Japanese L2 listeners. Lexical-decision times to subsequent visual target words revealed that foreign-accented words can facilitate word recognition for L2 listeners if at least one of two requirements is met: the foreign-accented production is in accordance with the language background of the L2 listener, or the foreign accent is perceptually confusable with the standard pronunciation for the L2 listener. If neither one of the requirements is met, no facilitatory effect of foreign accents on L2 word recognition is found. Taken together, these findings suggest that linguistic experience with a foreign accent affects the ability to recognize words carrying this accent, and there is furthermore a general benefit for L2 listeners for recognizing foreign-accented words that are perceptually confusable with the standard pronunciation.
  • Wheeldon, L. (2003). Inhibitory form priming of spoken word production. Language and Cognitive Processes, 18(1), 81-109. doi:10.1080/01690960143000470.

    Abstract

    Three experiments were designed to examine the effect on picture naming of the prior production of a word related in phonological form. In Experiment 1, the latency to produce Dutch words in response to pictures (e.g., hoed, hat) was longer following the production of a form-related word (e.g., hond, dog) in response to a definition on a preceding trial, than when the preceding definition elicited an unrelated word (e.g., kerk, church). Experiment 2 demonstrated that the inhibitory effect disappears when one unrelated word is produced intervening between the prime and target productions (e.g., hond-kerk-hoed). The size of the inhibitory effect was not significantly affected by the frequency of the prime words or the target picture names. In Experiment 3, facilitation was observed for word pairs that shared offset segments (e.g., kurk-jurk, cork-dress), whereas inhibition was observed for shared onset segments (e.g., bloed-bloem, blood-flower). However, no priming was observed for prime and target words with shared phonemes but no mismatching segments (e.g., oom-boom, uncle-tree; hek-heks, fence-witch). These findings are consistent with a process of phoneme competition during phonological encoding.
  • Whitehouse, A. J., Bishop, D. V., Ang, Q., Pennell, C. E., & Fisher, S. E. (2011). CNTNAP2 variants affect early language development in the general population. Genes, Brain and Behavior, 10, 451-456. doi:10.1111/j.1601-183X.2011.00684.x.

    Abstract

    Early language development is known to be under genetic influence, but the genes affecting normal variation in the general population remain largely elusive. Recent studies of disorder reported that variants of the CNTNAP2 gene are associated both with language deficits in specific language impairment (SLI) and with language delays in autism. We tested the hypothesis that these CNTNAP2 variants affect communicative behavior, measured at 2 years of age in a large epidemiological sample, the Western Australian Pregnancy Cohort (Raine) Study. Single-point analyses of 1149 children (606 males, 543 females) revealed patterns of association which were strikingly reminiscent of those observed in previous investigations of impaired language, centered on the same genetic markers, and with a consistent direction of effect (rs2710102, p = .0239; rs759178, p = .0248). Based on these findings we performed analyses of four-marker haplotypes of rs2710102-rs759178-rs17236239-rs2538976, and identified significant association (haplotype TTAA, p = .049; haplotype GCAG, p = .0014). Our study suggests that common variants in the exon 13-15 region of CNTNAP2 influence early language acquisition, as assessed at age 2, in the general population. We propose that these CNTNAP2 variants increase susceptibility to SLI or autism when they occur together with other risk factors.

    Additional information

    Whitehouse_Additional_Information.doc
  • Willems, R. M., Ozyurek, A., & Hagoort, P. (2007). When language meets action: The neural integration of gesture and speech. Cerebral Cortex, 17(10), 2322-2333. doi:10.1093/cercor/bhl141.

    Abstract

    Although generally studied in isolation, language and action often co-occur in everyday life. Here we investigated one particular form of simultaneous language and action, namely speech and gestures that speakers use in everyday communication. In a functional magnetic resonance imaging study, we identified the neural networks involved in the integration of semantic information from speech and gestures. Verbal and/or gestural content could be integrated easily or less easily with the content of the preceding part of speech. Premotor areas involved in action observation (Brodmann area [BA] 6) were found to be specifically modulated by action information "mismatching" to a language context. Importantly, an increase in integration load of both verbal and gestural information into prior speech context activated Broca's area and adjacent cortex (BA 45/47). A classical language area, Broca's area, is not only recruited for language-internal processing but also when action observation is integrated with speech. These findings provide direct evidence that action and language processing share a high-level neural integration system.
  • Willems, R. M., Labruna, L., D'Esposito, M., Ivry, R., & Casasanto, D. (2011). A functional role for the motor system in language understanding: Evidence from theta-burst transcranial magnetic stimulation. Psychological Science, 22, 849-854. doi:10.1177/0956797611412387.

    Abstract

    Does language comprehension depend, in part, on neural systems for action? In previous studies, motor areas of the brain were activated when people read or listened to action verbs, but it remains unclear whether such activation is functionally relevant for comprehension. In the experiments reported here, we used off-line theta-burst transcranial magnetic stimulation to investigate whether a causal relationship exists between activity in premotor cortex and action-language understanding. Right-handed participants completed a lexical decision task, in which they read verbs describing manual actions typically performed with the dominant hand (e.g., “to throw,” “to write”) and verbs describing nonmanual actions (e.g., “to earn,” “to wander”). Responses to manual-action verbs (but not to nonmanual-action verbs) were faster after stimulation of the hand area in left premotor cortex than after stimulation of the hand area in right premotor cortex. These results suggest that premotor cortex has a functional role in action-language understanding.

    Additional information

    Supplementary materials Willems.pdf
  • Willems, R. M., Clevis, K., & Hagoort, P. (2011). Add a picture for suspense: Neural correlates of the interaction between language and visual information in the perception of fear. Social, Cognitive and Affective Neuroscience, 6, 404-416. doi:10.1093/scan/nsq050.

    Abstract

    We investigated how visual and linguistic information interact in the perception of emotion. We borrowed a phenomenon from film theory which states that presentation of an otherwise neutral visual scene intensifies the percept of fear or suspense induced by a different channel of information, such as language. Our main aim was to investigate how neutral visual scenes can enhance responses to fearful language content in parts of the brain involved in the perception of emotion. Healthy participants’ brain activity was measured (using functional magnetic resonance imaging) while they read fearful and less fearful sentences presented with or without a neutral visual scene. The main idea is that the visual scenes intensify the fearful content of the language by subtly implying and concretizing what is described in the sentence. Activation levels in the right anterior temporal pole were selectively increased when a neutral visual scene was paired with a fearful sentence, compared to reading the sentence alone, as well as to reading of non-fearful sentences presented with the same neutral scene. We conclude that the right anterior temporal pole serves a binding function for emotional information across domains such as visual and linguistic information.
  • Willems, R. M., Benn, Y., Hagoort, P., Tonia, I., & Varley, R. (2011). Communicating without a functioning language system: Implications for the role of language in mentalizing. Neuropsychologia, 49, 3130-3135. doi:10.1016/j.neuropsychologia.2011.07.023.

    Abstract

    A debated issue in the relationship between language and thought is how our linguistic abilities are involved in understanding the intentions of others (‘mentalizing’). The results of both theoretical and empirical work have been used to argue that linguistic, and more specifically, grammatical, abilities are crucial in representing the mental states of others. Here we contribute to this debate by investigating how damage to the language system influences the generation and understanding of intentional communicative behaviors. Four patients with pervasive language difficulties (severe global or agrammatic aphasia) engaged in an experimentally controlled non-verbal communication paradigm, which required signaling and understanding a communicative message. Despite their profound language problems, they were able to engage in recipient design as well as intention recognition, showing similar indicators of mentalizing as have been observed in the neurologically healthy population. Our results show that aspects of the ability to communicate remain present even when core capacities of the language system are dysfunctional.
  • Willems, R. M., & Hagoort, P. (2007). Neural evidence for the interplay between language, gesture, and action: A review. Brain and Language, 101(3), 278-289. doi:10.1016/j.bandl.2007.03.004.

    Abstract

    Co-speech gestures embody a form of manual action that is tightly coupled to the language system. As such, the co-occurrence of speech and co-speech gestures is an excellent example of the interplay between language and action. There are, however, other ways in which language and action can be thought of as closely related. In this paper we will give an overview of studies in cognitive neuroscience that examine the neural underpinnings of links between language and action. Topics include neurocognitive studies of motor representations of speech sounds, action-related language, sign language and co-speech gestures. It will be concluded that there is strong evidence for the interaction between speech and gestures in the brain. This interaction, however, shares general properties with other domains in which there is interplay between language and action.
  • Willems, R. M., & Casasanto, D. (2011). Flexibility in embodied language understanding. Frontiers in Psychology, 2, 116. doi:10.3389/fpsyg.2011.00116.

    Abstract

    Do people use sensori-motor cortices to understand language? Here we review neurocognitive studies of language comprehension in healthy adults and evaluate their possible contributions to theories of language in the brain. We start by sketching the minimal predictions that an embodied theory of language understanding makes for empirical research, and then survey studies that have been offered as evidence for embodied semantic representations. We explore four debated issues: first, does activation of sensori-motor cortices during action language understanding imply that action semantics relies on mirror neurons? Second, what is the evidence that activity in sensori-motor cortices plays a functional role in understanding language? Third, to what extent do responses in perceptual and motor areas depend on the linguistic and extra-linguistic context? And finally, can embodied theories accommodate language about abstract concepts? Based on the available evidence, we conclude that sensori-motor cortices are activated during a variety of language comprehension tasks, for both concrete and abstract language. Yet, this activity depends on the context in which perception and action words are encountered. Although modality-specific cortical activity is not a sine qua non of language processing even for language about perception and action, sensori-motor regions of the brain appear to make functional contributions to the construction of meaning, and should therefore be incorporated into models of the neurocognitive architecture of language.
  • Willems, R. M. (2011). Re-appreciating the why of cognition: 35 years after Marr and Poggio. Frontiers in Psychology, 2, 244. doi:10.3389/fpsyg.2011.00244.

    Abstract

    Marr and Poggio’s levels of description are one of the best-known theoretical constructs of twentieth century cognitive science. The construct entails that behavior can and should be considered at three different levels: computation, algorithm, and implementation. In this contribution the focus is on the computational level of description, the level that describes the “why” of cognition. I argue that the computational level should be taken as a starting point in devising experiments in cognitive (neuro)science. Instead, the starting point in empirical practice often is a focus on the stimulus or on some capacity of the cognitive system. The “why” of cognition tends to be ignored when designing research, and is not considered in subsequent inference from experimental results. The overall aim of this manuscript is to show how re-appreciation of the computational level of description as a starting point for experiments can lead to more informative experimentation.
  • Willems, R. M. (2007). The neural construction of a Tinkertoy [‘Journal club’ review]. The Journal of Neuroscience, 27, 1509-1510. doi:10.1523/JNEUROSCI.0005-07.2007.
  • Wittenburg, P. (2003). The DOBES model of language documentation. Language Documentation and Description, 1, 122-139.
  • Womelsdorf, T., Schoffelen, J.-M., Oostenveld, R., Singer, W., Desimone, R., Engel, A. K., & Fries, P. (2007). Modulation of neuronal interactions through neuronal synchronization. Science, 316, 1609-1612. doi:10.1126/science.1139597.

    Abstract

    Brain processing depends on the interactions between neuronal groups. Those interactions are governed by the pattern of anatomical connections and by yet unknown mechanisms that modulate the effective strength of a given connection. We found that the mutual influence among neuronal groups depends on the phase relation between rhythmic activities within the groups. Phase relations supporting interactions between the groups preceded those interactions by a few milliseconds, consistent with a mechanistic role. These effects were specific in time, frequency, and space, and we therefore propose that the pattern of synchronization flexibly determines the pattern of neuronal interactions.
  • Zeshan, U. (2003). Aspects of Türk İşaret Dili (Turkish Sign Language). Sign Language and Linguistics, 6(1), 43-75. doi:10.1075/sll.6.1.04zes.

    Abstract

    This article provides a first overview of some striking grammatical structures in Türk İşaret Dili (Turkish Sign Language, TID), the sign language used by the Deaf community in Turkey. The data are described with a typological perspective in mind, focusing on aspects of TID grammar that are typologically unusual across sign languages. After giving an overview of the historical, sociolinguistic and educational background of TID and the language community using this sign language, five domains of TID grammar are investigated in detail. These include a movement derivation signalling completive aspect, three types of nonmanual negation — headshake, backward head tilt, and puffed cheeks — and their distribution, cliticization of the negator NOT to a preceding predicate host sign, an honorific whole-entity classifier used to refer to humans, and a question particle, its history and current status in the language. A final evaluation points out the significance of these data for sign language research and looks at perspectives for a deeper understanding of the language and its history.
  • Ziegler, A., DeStefano, A. L., König, I. R., Bardel, C., Brinza, D., Bull, S., Cai, Z., Glaser, B., Jiang, W., Lee, K. E., Li, C. X., Li, J., Li, X., Majoram, P., Meng, Y., Nicodemus, K. K., Platt, A., Schwarz, D. F., Shi, W., Shugart, Y. Y., Stassen, H. H., Sun, Y. V., Won, S., Wang, W., Wahba, G., Zagaar, U. A., & Zhao, Z. (2007). Data mining, neural nets, trees–problems 2 and 3 of Genetic Analysis Workshop 15. Genetic Epidemiology, 31(Suppl 1), S51-S60. doi:10.1002/gepi.20280.

    Abstract

    Genome-wide association studies using thousands to hundreds of thousands of single nucleotide polymorphism (SNP) markers and region-wide association studies using a dense panel of SNPs are already in use to identify disease susceptibility genes and to predict disease risk in individuals. Because these tasks become increasingly important, three different data sets were provided for the Genetic Analysis Workshop 15, thus allowing examination of various novel and existing data mining methods for both classification and identification of disease susceptibility genes, gene by gene or gene by environment interaction. The approach most often applied in this presentation group was random forests because of its simplicity, elegance, and robustness. It was used for prediction and for screening for interesting SNPs in a first step. The logistic tree with unbiased selection approach appeared to be an interesting alternative to efficiently select interesting SNPs. Machine learning, specifically ensemble methods, might be useful as pre-screening tools for large-scale association studies because they can be less prone to overfitting, can be less computer processor time intensive, can easily include pair-wise and higher-order interactions compared with standard statistical approaches and can also have a high capability for classification. However, improved implementations that are able to deal with hundreds of thousands of SNPs at a time are required.
  • Zwitserlood, I. (2011). Gebruiksgemak van het eerste Nederlandse Gebarentaal woordenboek kan beter [Book review]. Levende Talen Magazine, 4, 46-47.

    Abstract

    Review: User friendliness of the first dictionary of Sign Language of the Netherlands can be improved
  • Zwitserlood, I. (2011). Gevraagd: medewerkers verzorgingshuis met een goede oog-handcoördinatie. Het meten van NGT-vaardigheid. Levende Talen Magazine, 1, 44-46.

    Abstract

    (Needed: staff for residential care home with good eye-hand coordination. Measuring NGT-skills.)
  • Zwitserlood, I. (2011). Het Corpus NGT en de dagelijkse lespraktijk. Levende Talen Magazine, 6, 46.

    Abstract

    (The Corpus NGT and the daily practice of language teaching)
  • Zwitserlood, I. (2011). Het Corpus NGT en de opleiding leraar/tolk NGT. Levende Talen Magazine, 1, 40-41.

    Abstract

    (The Corpus NGT and teacher NGT/interpreter NGT training)
