
Questions and Answers


Is there something you have always wanted to know about language? We might have an answer! On this page we answer questions about various aspects of language asked by people outside of the language researcher community.

How do gender articles affect cognition?

Languages organize their nouns into classes in a variety of ways. Some don’t have any noun classes (e.g., English: every noun is just ‘it’), some have two (e.g., French: every noun is either masculine or feminine), and some have as many as 16 (e.g., Swahili: there are different classes for animate objects, inanimate objects, tools, fruits...). Some languages with two noun classes differentiate between masculine and feminine (e.g., French), others between common and neuter (e.g., Dutch). Clearly, all these languages differ in terms of what might be called their ‘grammatical gender system’. While Western European languages might give the impression that grammatical gender (e.g., whether nouns are masculine or feminine) primarily affects the articles placed in front of nouns (e.g., de versus het in Dutch), these differences often affect the noun itself and other words connected to it as well. Polish, for example, doesn’t even have articles (as if the English word ‘the’ didn’t exist) but still uses an intricate gender system which requires adjectives to agree with nouns. The reasons for these differences between languages remain mysterious.


Given that the gender system of a language permeates all sentences, one might wonder whether it goes further and also influences how people think in general. On the face of it, this appears unlikely. A grammatical gender system is just a set of rules for how words change when combined; there is no ‘deeper’ meaning to these rules. Nonetheless, a series of experiments has come up with surprising results.

In the 1980s, Alexander Guiora and colleagues noticed that two- to three-year-old Hebrew-speaking children (whose language does differentiate between masculine and feminine nouns) are about half a year ahead of English-speaking children in their gender identity development. It is as if the gender distinction found in Hebrew nouns gives these children a hint about a similar distinction in the natural world.

Adults, too, seem to make use of grammatical gender even when it doesn’t make any sense to do so. Roberto Cubelli and colleagues asked people to judge whether two objects belonged to the same category (e.g., tools or furniture) or not. When the grammatical gender of the objects’ names matched, people were faster in their judgements than when it did not. The task didn’t require people to name the objects, and yet they still appeared to use the arbitrary grammatical classification system of their native language.

Edward Segel and Lera Boroditsky found the influence of grammatical gender even outside the laboratory - in an encyclopedia of classical paintings. They looked at all the gendered depictions of naturally asexual concepts like love, justice, and time. These asexual entities (e.g., time) tended to be personified by male characters if the grammatical gender was masculine in the painter’s language (e.g., French: ‘le temps’), and by female characters if it was feminine (e.g., German: ‘die Zeit’). The depicted gender agreed with the grammatical gender in 78% of cases for painters whose mother tongue was ‘gendered’, like Italian, French and German. On top of that, the effect held even when looking only at concepts whose grammatical gender differs across the studied languages.

These and similar studies show how a grammatical classification system for nouns can affect the view its speakers have of the world. By forcing people to think in certain categories, it appears to shape general thinking habits. This illustrates quite nicely that thought is influenced by what you must say, rather than by what you can say. The grammatical gender effect on cognition highlights the fact that language is not an isolated skill but a central part of how the mind works.

Written by Richard Kunert and Gwilym Lockwood

Further reading:

Segel, E., & Boroditsky, L. (2011). Grammar in art. Frontiers in Psychology, 1, 244. doi: 10.3389/fpsyg.2010.00244

Is it unavoidable that regularly using a foreign language will influence our native language?

Most people who try to learn a second language, or who interact with non-native speakers, notice that the way non-native speakers speak their second language is influenced by their native language. They are likely to have a foreign accent, and they might use inappropriate words or incorrect grammatical structures because those words or structures are used that way in their native language. A lesser-known yet common phenomenon is the reverse: the influence of a foreign language we learn on our native language.


People who start using a foreign language regularly (for example, after moving to a different country) often find themselves struggling to recall words when using their native language. Other common influences are the borrowing of words or collocations (two or more words that often go together). For example, Dutch speakers of English might insert English words for which there is no literal Dutch translation, such as native, into a Dutch conversation. Or they may find themselves using a literal translation of the collocation ‘taking a picture’ when speaking their native language, even if their native language does not use the verb take to express this action. Studies from the past couple of decades show that such influence occurs at all linguistic levels - as described above, people may borrow words or expressions from their second language, but they might also borrow grammatical structures or develop a non-native accent in their native language.

In general, research has shown that all the languages we speak are always co-activated. This means that when a Dutch person speaks German, not only their German but also their Dutch, as well as any other language they speak, is automatically activated at the same time. This co-activation likely promotes cross-linguistic influence.

So will learning a foreign language necessarily influence one's native language at all linguistic levels? To a degree, but there are large individual differences. The influence is larger the more dominant the use of the foreign language is, and in particular if it is regularly used with native speakers of that language (as when moving to a foreign country). The influence also increases with time, so immigrants, for example, are likely to show more influence after 20 years abroad than after 2, although there is also a burst of influence in the initial period of using a foreign language regularly. Some studies also suggest that differences among people in certain cognitive abilities, like the ability to suppress irrelevant information, affect the magnitude of the influence of the second language on the native language. It is important to note though that some of these influences are relatively minor, and might not even be detectable in ordinary communication.

By Shiri Lev-Ari and Hans Rutger Bosker

Further reading:

Cook, V. (Ed.). (2003). Effects of the second language on the first. Clevedon: Multilingual Matters.

How does dyslexia arise?

When a child has significant difficulties learning to read and/or spell despite normal general intelligence and sensory abilities, he or she may be diagnosed with developmental dyslexia. The condition was first described in the 1890s and referred to as 'congenital word blindness', because it was thought to result from problems with the processing of visual symbols. Over the years it has become clear that visual deficits are not the core feature for most people with dyslexia. In many cases, subtle underlying difficulties with aspects of language seem to contribute. To learn to read, a child needs to understand how words are made up of individual sound units (phonemes), and must become adept at matching those phonemes to arbitrary written symbols (graphemes). Although the overall language proficiency of people with dyslexia usually appears normal, they often perform poorly on tests that involve manipulating phonemes and processing phonology, even when no reading or writing is involved.


Since dyslexia is defined as a failure to learn to read that is not explained by an obvious known cause, it may not be one single syndrome but rather a cluster of different disorders involving distinct mechanisms. However, it has proved hard to separate dyslexia clearly into subtypes. Studies have uncovered quite a few convincing behavioural markers (not only phonological deficits) that tend to be associated with the reading problems, and there is much debate about how these features fit together into a coherent account. To give just one example, many people with dyslexia are less accurate when asked to rapidly name a visually presented series of objects or colours. Some researchers now believe that dyslexia results from the convergence of several different cognitive deficits co-occurring in the same person.

It is well established that dyslexia clusters in families and that inherited factors must play a substantial role in susceptibility. Nevertheless, there is no doubt that the genetic basis is complex and heterogeneous, involving multiple different genes of small effect size, interacting with the environment. Genetic mapping efforts have already enabled researchers to pinpoint a number of interesting candidate genes, such as DYX1C1, KIAA0319, DCDC2, and ROBO1, and with dramatic advances in DNA sequencing technology there is much promise for discovering others in the coming years. The neurobiological mechanisms that go awry in dyslexia are largely unknown. A prominent theory posits a disruption of neuronal migration, an early developmental process in which brain cells move towards their final locations. Indirect supporting evidence for this hypothesis comes from studies of post-mortem brain material in humans and investigations of the functions of some candidate genes in rats. But there are still many open questions to answer before we can fully understand the causal mechanisms that lead to this elusive syndrome.

by Simon Fisher

Further reading:

Carrion-Castillo, A., Franke, B., & Fisher, S. E. (2013). Molecular genetics of dyslexia: an overview. Dyslexia, 19, 214–240.

Demonet, J. F., Taylor, M. J., & Chaix, Y. (2004). Developmental dyslexia. Lancet, 363, 1451–1460.

Fisher, S. E., & Francks, C. (2006). Genes, cognition and dyslexia: learning to read the genome. Trends in Cognitive Sciences, 10, 250–257.

What is the connection between movement and language?

Speaking requires planning and executing rapid sequences of movements. Several muscle systems are involved in the production of speech sounds: not only the tongue, lips and jaw, but also the larynx and the respiratory muscles work together in coordination when we speak. As with any other movement, motor planning and sensorimotor control are essential for speaking.

In children, a tight relation between fine motor skills and language proficiency has been demonstrated. That is why speech therapists encourage sensory-rich activities like finger painting, water or sand play, and manipulation of small objects (coloring, buttoning, etc.) in children with speech delays. Such activities help to form the neural connections that are necessary for planning movement sequences and controlling fine-grained movements. For the same reason, hand exercises can be beneficial as part of complex therapy for speech and language recovery after a stroke or brain damage, in cases where language problems are caused by impaired articulation or motor control.


Another connection between movement and language lies in the domain of co-speech gestures. People often gesture when they speak, and understanding these gestures is important for grasping the speaker’s intended message. Gesture may even become essential for communication in situations where verbal language use is constrained (for example, in a noisy environment, or when speakers of different languages communicate). People are usually remarkably fluent in extracting the intended meaning from someone’s hand and body movements. Interestingly, recent research demonstrates that similar brain areas are involved in constructing meaning from linguistic and gestural input. Finally, the sign languages that deaf individuals use to communicate show that language itself can be manifested in body movements: of the hands and arms, and in facial expressions.

 Written by Irina Simanova & David Peeters

Further reading:  

Why a Long Island Speech Therapist Incorporates Movement and Sensory Activities into Speech Therapy Sessions

McNeill, D. (2012). How Language Began: Gesture and Speech in Human Evolution. New York: Cambridge University Press.

How do we form the sounds of speech?

The vast majority of speech sounds are produced by creating a stream of air which flows from the lungs through the mouth or nose. We use this stream of air to form specific sounds with our vocal folds and/or by changing the configuration of our mouths.


When we produce consonants, a constriction is made somewhere in the mouth, either by stopping the air stream entirely (for example with our lips when saying 'p' or with our tongues when saying 't') or by leaving a very narrow gap which makes the air hiss as it passes (for example with our lips and teeth when saying 'f' or with our tongues when saying 's').


We also use our vocal folds to differentiate consonants. When we bring our vocal folds close together, the stream of air makes them vibrate, which sounds like a hum; when they are apart, they do not vibrate. You can feel this difference by putting your finger on your Adam's apple when you say 'ssss' and 'zzzz' - can you feel how 'zzzz' is voiced and 'ssss' is not? When we produce vowels, we change the shape of our mouths by moving our tongues, lips and jaw.

The different shapes of the vocal tract act as different acoustic filters, altering the hum produced by the vocal folds in different ways. For example, we move our tongues right to the front of our mouths and make our lips wide to make an 'ie' sound, and we move our tongues to the back of our mouths and make our lips round to make an 'oe' sound. For an 'aaa' sound, we move our tongue to the bottom of our mouth, lower the jaw and open our lips wide.
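For readers who like to tinker, this "hum plus filter" idea can be sketched in a few lines of code. The sketch below is only an illustration of the general principle, not a description of any particular speech synthesizer: a train of clicks stands in for the vibrating vocal folds, and simple resonant filters stand in for the shape of the vocal tract. The formant frequencies used for 'ie' and 'aaa' are rough textbook-style approximations chosen for illustration.

```python
import math

def resonator_coeffs(freq_hz, bandwidth_hz, sample_rate):
    # A two-pole resonator modelling one vocal-tract resonance (formant):
    # y[n] = b * x[n] + a1 * y[n-1] + a2 * y[n-2]
    r = math.exp(-math.pi * bandwidth_hz / sample_rate)
    a1 = 2.0 * r * math.cos(2.0 * math.pi * freq_hz / sample_rate)
    a2 = -r * r
    b = 1.0 - a1 - a2  # unity gain at 0 Hz
    return b, a1, a2

def apply_formant(signal, freq_hz, bandwidth_hz, sample_rate):
    # Filter the signal through one resonance.
    b, a1, a2 = resonator_coeffs(freq_hz, bandwidth_hz, sample_rate)
    out = [0.0, 0.0]
    for x in signal:
        out.append(b * x + a1 * out[-1] + a2 * out[-2])
    return out[2:]

def synthesize_vowel(formants, pitch_hz=120, duration_s=0.5, sample_rate=16000):
    # Source: an impulse train approximating the vocal-fold "buzz".
    n = int(duration_s * sample_rate)
    period = int(sample_rate / pitch_hz)
    source = [1.0 if i % period == 0 else 0.0 for i in range(n)]
    # Filter: pass the buzz through each formant resonance in turn.
    for freq in formants:
        source = apply_formant(source, freq, 80.0, sample_rate)
    return source

# Same buzz, different "mouth shapes" (first two formants, approximate):
ie = synthesize_vowel([300.0, 2300.0])   # front vowel, roughly 'ie'
aa = synthesize_vowel([800.0, 1200.0])   # open vowel, roughly 'aaa'
```

Written to a sound file, the two signals share the same pitch but sound like different vowels, which is exactly the point of the source-filter idea: the vocal folds set the hum, and the mouth shape decides which vowel we hear.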

Finally, there are other specific ways of creating speech sounds, such as moving the stream of air through the nose to create nasal sounds like 'm', or creating a small pressure vacuum with the tongue before releasing it with a sharp popping sound, which is how people produce click sounds in some African languages.

 Written by Matthias Sjerps, Matthias Franken & Gwilym Lockwood 

Further reading:

Ladefoged, P. (1996). Elements of acoustic phonetics (2nd ed.).

What is the difference between sleep-talking and talking while awake?

People do all kinds of things while sleeping. They move around, mumble, laugh, and some also sometimes whisper or produce speech while asleep. Sleep-talking (or: somniloquy) happens at all ages and may occur during all phases of sleep. But what exactly is the difference between sleep-talking and normal everyday speech?

Image: Paul Sapiano

Sleep-talk contains more speech errors than everyday speech. For instance, sleep-talkers can have trouble retrieving a word (word-finding problems) or may switch individual sounds within a word (for example, beatag instead of teabag). While this of course also occurs during normal speech, it happens more frequently during sleep. Sleep-talk sometimes resembles the speech produced by aphasic patients. It also resembles the speech sometimes produced by patients suffering from schizophrenia, in that there is less of a connection between utterances, which may lead to relatively incoherent discourse. Finally, sleep-talk may be less well articulated (mumbling) than everyday speech and contain incomprehensible words or newly invented words (neologisms).

However, perhaps the most striking thing is how similar sleep-talk is to speech produced while awake. People produce full sentences while sleeping, and the grammatical structure of their utterances is often perfectly correct. There are even anecdotal reports of people being more eloquent and creative during sleep than when awake, for instance when speaking a second language.

Sleep-talking does not necessarily indicate a psychological disorder or psychopathology. However, it may co-occur with sleep-disorder syndromes such as somnambulism (walking around while sleeping). Also, people who have experienced a traumatic event (such as soldiers who have fought in a war) have been found to talk more in their sleep than non-traumatized control subjects. Besides such environmental factors, there is also a genetic component to sleep-talking: if your parents are regular sleep-talkers, there is a higher chance that you are a sleep-talker yourself.

In conclusion, in linguistic terms sleep-talk differs less from talking while awake than one might suspect. The main remaining difference may lie in the popular belief that we have less control over what we say during sleep than during the day. Or, as The Romantics put it in their 1984 hit: "I hear the secrets that you keep; When you're talking in your sleep; and I know that I'm right, cause I hear it in the night". Whether this is really the case has not been researched scientifically.

 Written by David Peeters and Roel M. Willems

Further reading:

Arkin, A. (1981). Sleep talking. Psychology and psychophysiology. Hillsdale, NJ: Lawrence Erlbaum Associates.

About MPI

This is the MPI

The Max Planck Institute for Psycholinguistics is an institute of the German Max Planck Society. Our mission is to undertake basic research into the psychological, social and biological foundations of language. The goal is to understand how our minds and brains process language, how language interacts with other aspects of mind, and how we can learn languages of quite different types.

The institute is situated on the campus of the Radboud University. We participate in the Donders Institute for Brain, Cognition and Behaviour, and have particularly close ties to that institute's Centre for Cognitive Neuroimaging. We also participate in the Centre for Language Studies. A joint graduate school, the IMPRS in Language Sciences, links the Donders Institute, the CLS and the MPI.

Questions and Answers


This project was coordinated by:

Katrien Segaert 
Katerina Kucera
Judith Holler

Sean Roberts
Agnieszka Konopka
Gwilym Lockwood
Elma Hilbrink
Joost Rommers
Mark Dingemanse
Connie de Vos