
This content is archived and may be outdated.

Neurobiology of Language - News
Genetic effects on language processing
The genetic FOXP2-CNTNAP2 pathway has been shown to be involved in the language capacity. Miriam Kos and colleagues investigated whether a common variant of CNTNAP2 (rs7794745) is relevant for syntactic and semantic processing in the general population.
Individual variation in semantic processing
It is well-known that, within ERP paradigms of sentence processing, semantically anomalous words elicit N400 effects. Less clear, however, is what happens after the N400. Miriam Kos and colleagues address this in a recent publication in Frontiers in Psychology.
Syntax is shared between speaking and listening
Do speaking and listening have a lot in common? Is there one integrated system for speaking and listening to sentences, or are there two separate systems? In a recent paper, Segaert and colleagues focused on syntactic processing, a crucial aspect of both speaking and listening, necessary to determine the syntactic structure of a sentence.
The neurobiology of conversation
In a paper recently published in Frontiers in Human Neuroscience, Laura Menenti, Martin Pickering, and Simon Garrod review neural evidence for the interactive alignment account of dialogue.
Run first read later?
When we read, various meanings of a word are activated. Is the emotional meaning activated first, to ensure a fast response for survival purposes? Or do we first need to know what a word refers to before we can evaluate whether it is good or bad?
Information processing gradient in the frontal lobes
In a recent paper in the Philosophical Transactions of the Royal Society, Julia Uddén and Jörg Bahlmann suggest an information processing gradient in the frontal lobe.
Self-monitoring in speech production
To prevent errors and choose the right words to say, we must constantly monitor ourselves while we speak. Classic theories of monitoring have suggested that we monitor ourselves through our speech comprehension system. More recent accounts, however, suggest that we may be able to self-monitor by detecting when two incompatible responses are simultaneously active. Dan Acheson, Peter Hagoort, and colleagues address this issue in a paper published in Brain and Language.
The suppression of repetition enhancement
Repetition suppression is the reduction of neural responses to the repetition of stimulus features. Repetition suppression in the BOLD response as measured in fMRI studies is generally thought to underlie behavioral facilitation effects like priming.
Less is not more: Neural responses to missing and superfluous accents in context
When speakers accent repeated instead of new focus information in context, listeners detect the mismatch immediately and experience semantic processing difficulties (N400).
Embodied Cognition: Taking the next step
How does our body relate to language understanding? Embodied cognition argues that thinking is not confined to what happens in our heads, and that the body plays a crucial role in cognition.
PhD project: Jolien ten Velden
For most people speech feels like an effortless process. We think of what we want to say and the words flow out of our mouths. Our speech, however, is usually full of errors.
PhD project: Franziska Hartung
I am interested in simulation and language processing. During the last decade, research on semantic representation and language comprehension has shown that people simulate the semantic content of words they hear; for example, parts of the sensorimotor system can become active when people are presented with action words.
New websites for our research group
The Neurobiology of Language department will frequently update you about our research, new publications, PhD defenses, new lab members, and ongoing discussions.
Implicit acquisition of grammars
A recent hypothesis in empirical brain research on language is that the fundamental difference between animal and human communication systems is captured by the kinds of sequences these different species can process.
Rapid interactions between lexical semantic and word form analysis during word recognition in context
During reading, people often actively predict the next word before seeing it. How fast, and at which level, can such contextually driven anticipation influence the processing of the upcoming word?
Spatio-temporal metaphors and mental representations of time
Many cultures talk about time in terms of space. However, the particular ways in which time is spatialized differ across languages.
Thinking for speaking in early and late bilinguals
Do bilingual speakers of typologically different languages have multiple, perhaps conflicting, modes of thought, each corresponding to a particular language?
The role of cognitive abilities and empathy in speech-accompanying gestures
When speaking, why do some people produce hand gestures more frequently than others? In a recent study, Mingyuan Chu and colleagues found that individual differences in gesture production are related to cognitive abilities and the level of empathy.
Speech and music shape the listening brain
Speech and music are domains with different representations. Nonetheless, they both use sounds as their building blocks. An interesting question is whether experience or training with sounds in one domain can influence sound processing outside that particular domain.
The MOUS project
The Neurobiology of Language Department recently started an ambitious multidisciplinary project called 'The Mother Of all Unification Studies', or MOUS for short.
New post-doc: Diana Dimitrova
I am a postdoctoral researcher in the Neurobiology of Language Department (Donders Centre for Cognitive Neuroimaging) and in the Multilingualism Group (Centre for Language Studies).
PhD project: Jolien Francken
How do we perceive the world? I cannot look into your head, but in my experience I see and hear everything around me, and I perceive things as they really are. But is this in fact true?
PhD project: Nietzsche Lam
When reading or listening to a sentence (in your native language), have you ever stopped to consider the complexity of this seemingly effortless task?
PhD project: Richard Kunert
My research is concerned with something so difficult that in millions of years of evolution only humans have ever achieved it, something which is so easy that even toddlers can do it without much effort, something so widespread that no culture is known to exist without it: using language as well as music.
PhD project: Ashley Lewis
My research is concerned with the electrophysiology of language processing. I investigate neuronal oscillations using EEG and MEG while participants are reading.
Neural mechanisms of communicative innovation
Our everyday conversations appear to revolve around our linguistic abilities. But closer inspection reveals that an effective conversation involves more than formulating grammatically correct and semantically coherent sentences.
Summer lab visit: Franklin Chang
Research in psycholinguistics is often focused on particular aspects of language processing in relative isolation, e.g., adult production, comprehension or language acquisition. In his research, Franklin Chang (University of Liverpool) is working towards a framework that links these facets of language behavior in a unified theory (P-chain; Dell & Chang, in press).
A non-judgemental attitude to language
Implicit knowledge - "knowing how" - underlies much of our behavioral repertoire, for instance how to speak our native language. This behavior is incredibly complex, and almost everyone masters it without being able to explain exactly how they do it or how they learned to do it.
PhD project: Evelien Heyselaar
During the course of a conversation, your brain does a lot of things simultaneously: it needs to keep track of what your partner is saying and why they are saying it, while at the same time preparing what you are going to say, so that you are ready with a response as soon as your partner stops speaking.
Four new PhD students starting in our group
From September 2013 Evelien Heyselaar, Lotte Schoot, Matthias Franken and Gwilym Lockwood will be working on their PhD projects.
PhD project: Matthias Franken
My research is concerned with how people acquire new speech articulations. I investigate the mechanisms people use when learning the pronunciation of a new language.
PhD project: Gwilym Lockwood
In my research, I hope to bridge two fields - sound-symbolism and synaesthesia - which have been sadly neglected until relatively recently.
Lin Wang investigates how our brain processes person names
I'm interested in the processing differences between person names and common nouns in the brain. Person names differ from common nouns in many aspects.
Miriam Kos defends PhD on October 23
Individuals vary in the way they process semantic and syntactic aspects of language, Miriam Kos (Neurobiology of Language Department) discovered in her dissertation "On the waves of language - Electrophysiological reflections on semantic and syntactic processing". She will defend her thesis on October 23rd at 12.30, in the Radboud University Aula.
Recognizing the emotional valence of person names
When you see or hear the name 'van Gogh', how quickly can you judge whether you like him or not? In order to make your judgment, you first need to recognize this name, then identify the person who bears it, and finally access the information associated with this person (e.g., that van Gogh is a famous Dutch post-impressionist painter).
PhD project: Lotte Schoot
In most of the studies within our department, we investigate the brain responses of speakers who are using language without actually talking to someone. Or, when we study language comprehension processes, we look at the brains of listeners who have no idea who is talking to them.
We're at the Society for Neurobiology of Language conference
From 6-8 November, the fifth annual meeting of the SNL takes place in San Diego, CA. Many NBL researchers present their work here.
Research staff position at the Neurobiology of Language Department
The Max Planck Institute for Psycholinguistics (Nijmegen, the Netherlands) is offering a research staff position in the field of Neurobiology of Language.
Frontiers in Neuroscience for Young Minds
At the SfN meeting 2013, a new scientific journal was launched with an editorial board of kids. The current issue features articles about “The Brain and Friends” and “The Brain and Talking/Texting”.
Call to scientists: stop excluding left-handed people from scientific studies!
Left-handed people really do have different brains and genes from right-handed people. Yet left-handed people are almost never included as study subjects in scientific research. Therefore in an article in Nature Reviews Neuroscience, Roel Willems and his colleagues from the Donders Institute and Max Planck Institute in Nijmegen call for more research into left-handed people. The article was published online on 12 February 2014.
PhD project: Danchao Cai
Some people are born with perfect/absolute pitch. What’s the neurological and genetic basis behind this?
PhD project: Alina Lartseva
About 1 in 100 children is diagnosed with Autism Spectrum Disorder (ASD). Some of these children will need intensive support and care for their whole life, while others will learn how to overcome their difficulties and will be able to study, work, and live independently. Despite its relatively high prevalence, we still know very little about what causes autism, which cognitive functions are impaired, or how we can treat it.
The influence of communicative intent on the kinematics of pointing gestures
In everyday communication, people not only use speech but also hand gestures to convey information. One intriguing question in gesture research has been why gestures take the specific form they do.
Peter Hagoort becomes member of the Koninklijke Hollandsche Maatschappij der Wetenschappen
Peter Hagoort has been elected a member of the Koninklijke Hollandsche Maatschappij der Wetenschappen (the Royal Holland Society of Sciences and Humanities).
Synchronization of speech and gesture
Language and action systems are highly interlinked. A critical piece of evidence is that speech and its accompanying gestures are well synchronized temporally. The underlying mechanism responsible for the synchronization is still under debate.
The influence of communicative context on syntactic priming
If I describe a scene by saying 'the mistress is kissed by the man' (using a sentence in the passive voice) you are more likely to describe the following event with another passive sentence, like 'the man is hit by his wife', rather than an active sentence such as 'the wife hits her husband'.
MPI scientific retreat
From March 31st until April 2nd 2014, six members of the Neurobiology of Language department attended the MPI retreat in Münster. At this annual retreat, representatives of the different MPI departments join forces to think about the future of the institute.
Roel Willems and Franziska Hartung visit the new MPI in Frankfurt
Why do people perceive music and literature as varying in their beauty based on factors such as culture, society, historical period and individual taste? In 2012, the new Max Planck Institute for Empirical Aesthetics was founded to answer questions regarding how humans perceive, experience and evaluate aesthetics.
Symposium: Towards a neuroscience of mutual understanding
On September 1, 2014 Arjen Stolk, Peter Hagoort and Ivan Toni organize a symposium on mechanisms of mutual understanding.
VENI grant awarded to Tineke Snijders
How do babies make sense of all the sounds they hear? Before they are able to link sound to meaning, they need to learn the locations of word boundaries in continuous speech.
VENI grants awarded to NBL alumni
In addition to the VENI grant awarded to current NBL lab member Tineke Snijders, two NBL alumni also obtained a prestigious VENI grant from the Netherlands Organisation for Scientific Research (NWO): Caroline Junge and Tessa van Leeuwen.
SNL 2014 in Amsterdam starts today!
Today is the first day of the sixth annual Society for the Neurobiology of Language conference, this year held in Amsterdam, The Netherlands, partly organized by our Neurobiology of Language department. You can follow the meeting on Twitter via @SNLmtg or hashtag #snlmtg2014 and we will provide you with blog posts live from the conference.
SNL2014 day 1: Pim Levelt on the 'sleeping beauties' of psycholinguistics
The Society for the Neurobiology of Language meeting held at the 'Beurs van Berlage' in Amsterdam from August 27-29 was opened yesterday with a keynote lecture by the Netherlands' most famous psycholinguist: Pim Levelt.
SNL2014 day 2: Pascal Fries’ Fairy Tale
Pascal Fries has an almost mythological status at the Donders Institute. He worked there from 2001 to 2009 and then moved to Frankfurt to start his own research center, the Ernst Strüngmann Institute for Neuroscience (in cooperation with the Max Planck Society). Many stories exist about him: that he already became a professor at the age of 30 (not completely true, it 'only' happened when he was 36), that he only published in the best scientific journals (true), and that he is a genius, brilliant, a child prodigy.
SNL2014 day 2: Constance Scharff on songbirds
People are not the only animals that communicate through language. On the second day of the Society for the Neurobiology of Language Conference, professor Constance Scharff (Freie Universität Berlin) gave a lecture in which she discussed the similarities between the human language system and that of the zebra finch, a songbird, to understand more about the relationship between language and the brain.
SNL2014 day 3: Michael Tomasello about the evolution of communication
Michael Tomasello is a developmental psychologist and director of the Max Planck Institute for Evolutionary Anthropology in Leipzig. He researches differences between human and primate communication. In his lecture at the Society for Neurobiology of Language conference in Amsterdam, he argued that at an early stage of development, children already acquire communicative skills that can never be learned by primates.
SNL2014 day 3: What is 'Neurobiology of Language' anyway?
On the third day of the Society for Neurobiology of Language conference, a debate titled “What is ‘neurobiology of language’ anyway?” was on the program. At first glance, this seems a strange question, given that the researchers who were present have committed their lives to exactly this field of research. Does this mean that they actually don’t have a clue what it is all about?
Looking back on SNL2014
A week ago, the Society for Neurobiology of Language conference 2014 took place in Amsterdam. Today we look back on the conference by means of interviews with researchers who participated.
New PhD students and post-docs starting in the NBL lab
From autumn 2014 Daniel Sharoh and Bohan Dai will be working on their PhD projects and Monique Flecken and Geertje van Bergen will join our lab to execute their VENI projects. Kirsten Weber will come back to the lab as a post-doc.
The role of beat gesture and pitch accent in semantic processing
In face-to-face communication, speakers often emphasize information with verbal (e.g., pitch accent, clefts) and nonverbal cues (e.g., hand gestures, facial expressions). The present study investigated how listeners integrate pitch accent and beat gesture (i.e., small baton-like hand movements produced along with the rhythmical pulsation of speech) to comprehend a message.
MSc project in NBL group on story reading
Emiel van den Hoven will be a MSc student in the NBL group during the coming year.
What happens in the brain when your tongue twists?
Producing language is one of the most common actions we perform. Like most actions, when we speak we rarely make mistakes, yet sometimes we produce speech errors such as saying the wrong word or mixing up the sounds in words. How do we monitor ourselves to detect when such errors occur?
PhD project: David Peeters
One of the most important functions of language is that it allows us to refer to the things in the world around us. We continuously do so, for instance by using spatial demonstratives in combination with a perfectly timed manual pointing gesture (“look at that guy!”).
PhD project: Flora Vanlangendonck
Speaking is something you typically do with other people. Yet, you don’t talk to everyone in the same way all the time.
PhD project: Bohan Dai
In a multi-speaker context, humans have the ability to recognize and follow an individual speaker while ignoring other speakers and background noise. Listeners can even do this when the target speech is presented together with other sounds that are very similar, or when the target sound is more difficult to identify than the other sounds being heard. This remarkable human ability – the so-called "cocktail party effect" – has been studied for over half a century.
New post-doc: Zheng Ye
Temporal connectives such as ‘before’ and ‘after’ give us the freedom to describe a sequence of events in different orders.
Nodes and networks in the neural architecture for language
In a recent paper in Current Opinion in Neurobiology Peter Hagoort presented his view on the neural architecture of the human language system.
Cerebral coherence between communicators marks the emergence of meaning
When we interact with another person, we consider what we mutually know. A new study suggests this knowledge is continuously and simultaneously adjusted in our minds as the interaction unfolds.
New post-doc: Monique Flecken
People who speak different languages may talk differently about a situation. This is the case because the concepts encoded in the grammar and the lexicon of a given language may make specific things more salient than others. Do people also perceive and process situations differently, before and while speaking, and while not speaking about them? What happens when you learn an additional language with a grammar that is different from your native language?
Neural evidence for the role of shared space in online comprehension of spatial demonstratives
A fundamental property of language is that it allows us to refer to the things around us, for instance by using spatial demonstratives such as this and that in English. In a recent paper published in Cognition, David Peeters and colleagues present two ERP experiments that were carried out to investigate the neural mechanisms involved in the comprehension of such demonstrative terms in a visual everyday context.
New post-doc: Geertje van Bergen
This wordle contains the top 100 most frequently used words in spoken Dutch, 20 of which fall into the category of discourse markers (e.g., ja, maar, uh, wel, ook). Discourse markers are linguistic elements that do not have any propositional meaning, but mark the relation between an utterance and the prior context.
From commonsense to science, and back
Commonsense cognitive concepts (CCCs) are the concepts used in daily life to explain, predict and interpret behaviour. CCCs are also used to convey neuroscientific results, not only to wider audiences but also to the scientific inner circle. In a recent article, Prof. Marc Slors from the Philosophy of Mind department of the RU Nijmegen and Jolien Francken show that translations from CCCs to brain activity, and from brain data to CCCs are made in implicit, loose and unsystematic ways.
The Behavioral and Neural Effects of Language on Motion Perception
Perception does not function as an isolated module but is tightly linked with other cognitive functions, for example the language faculty.
New post-doc: Anne Kösem
How does the brain segment the continuous speech signal into meaningful words and syllables? A recent model proposes that speech parsing results from the temporal alignment of neural oscillations to the rhythmic structure of speech, by a process called neural entrainment.
PhD project: Daniel Sharoh
Language processing is facilitated by complex, dynamic neural networks and involves interactions among populations of neurons spanning vast areas of cerebral real estate. Previous work has shown which brain areas are implicated in word and sentence processing, and which regions show greater sensitivity to increased semantic, syntactic or phonological demands. But as no man is an island, no functional brain region acts in isolation.
Peter Hagoort on the future of linguistics
At the 47th annual meeting of the European Linguistics Society (Societas Linguistica Europaea), Peter Hagoort was a plenary speaker during a round table discussion about the future of linguistics: “Quo Vadis Linguistics in the 21st century”. Below you can read a summary of his contribution to the discussion: "Linguistics quo vadis? An outsider perspective"
Auditory brain activity during speech imitation
Although speech production and speech perception have traditionally been investigated separately, in recent decades it has become clear that production and perception interact in complex ways. For example, the sound of our own speech provides useful feedback to our speech production. An important finding is the reduction of the auditory cortical response to one’s own (self-produced) speech, compared to externally generated speech.
New post-doc: Kirsten Weber
We generally do not process words in isolation but in rich contexts, such as sentences and larger discourse. From these contexts we acquire constraints and biases that shape our quick and efficient language processing, and at the same time lead to ambiguities and occasional misinterpretations. For example, we would expect the sentence fragment "the girl gave" to finish as "a flower to the boy" and not as "the boy a flower," although both are possible.
New Publication: A predictive coding framework for rapid neural dynamics during sentence-level language comprehension
Predictive coding implementations of Bayesian hierarchical inference within cortical hierarchies have been steadily growing in popularity within the cognitive neuroscience community over the last decade or so. At the same time, work in electrophysiology has related high-frequency oscillatory activity (typically in the gamma frequency range) to the feedforward flow of information, and low-frequency oscillatory activity (typically in the beta or alpha frequency ranges) to the feedback flow of information within and between cortical hierarchies. This has led to the development of the so-called ‘canonical microcircuit’ and the suggestion that it might be replicated throughout the cortex and constitute one general form of information processing in the brain.
Vidi Grants Awarded to Two NBL Researchers!
Drs. Jan-Mathijs Schoffelen and Roel Willems have been awarded the prestigious Vidi grant. The grant consists of a maximum of 800,000 EUR over the course of five years, which will enable them to set up their own research teams to pursue their research interests.
New post-doc: David Peeters
Advances in technology constantly change the ways in which we can investigate the neurobiological underpinnings of language. In my post-doc project I will make use of virtual reality (VR) to study our linguistic and communicative capacities in rich, visual contexts.
Fast oscillatory dynamics during language comprehension
Neural oscillations play an important role in the dynamic formation of functional networks in the brain. Such networks are important for communication between brain regions and for segregating different types of information (at different frequencies) being sent from region to region within the brain. Language processing involves multiple types of information (e.g., syntactic, semantic, phonological) represented at various different levels and likely involves the representation and exchange of information within such frequency-specific functional networks. In a recent article in a special issue of the journal Brain and Language on Electrophysiology of Language we reviewed the literature on beta and gamma frequency oscillatory dynamics found during language comprehension beyond the level of processing single words (sentence-level processing and beyond).
Neural overlap in processing music and speech: a commentary
When you listen to some music and when you read a book, does your brain use the same resources? This question goes to the heart of how the brain is organized – does it make a difference between cognitive domains like music and language? In a new commentary Richard Kunert highlights a successful approach which helps to answer this question.
Ideophones in Japanese modulate the P2 and late positive complex responses
This article is about the interaction between sound-symbolism and sensory processing. Sound-symbolism is the non-arbitrary link between sound and meaning. In Dutch and other European languages, this only covers onomatopoeia, but many other languages and language families around the world have a wealth of sound-symbolic words describing all sorts of meanings (e.g. the Japanese word nurunuru, which means "slimy"). These words are known as ideophones.
How language influences our perception
How does language change what we see? In our new paper, published in the open-access journal Neuroscience of Consciousness, we used an inventive way to investigate at which level of processing linguistic material modulates visual perception.
Language processing in a conversation context
On Monday February 20th, Lotte Schoot will defend her thesis entitled "Language processing in a conversation context" at the Radboud Aula at 10:30am. All interested parties are welcome to attend. Below is a short summary of the content of her thesis.
Neurobiology of Language

What is the neurobiological infrastructure for the uniquely human capacity for language? The focus of the Neurobiology of Language Department is on the study of language production, language comprehension, and language acquisition from a cognitive neuroscience perspective.

Director: Peter Hagoort

Secretary: Carolin Lorenz

 
