Multimodal Language Department

What does our body reveal about human language?

Language can be expressed and perceived not only through speech or written text but also through visible bodily expressions (hands, body, and face). All spoken languages use gestures along with speech, and in deaf communities all aspects of language can be expressed through the visible body in sign languages. However, the unique contribution of such visible expressions to our understanding of the human language faculty is less well understood. The Multimodal Language Department aims to understand how the visual features of language, whether accompanying speech or in sign languages, constitute a fundamental aspect of the human language capacity, contributing to its uniquely flexible and adaptive nature. The department's ambition is to establish multimodality as the conventional view of language and linguistics.

To this end, we conduct fieldwork on how gestures are used in spoken languages with different linguistic structures (such as word order or prosody), as well as in different sign languages, to identify both universal and diverse patterns. The department also aims to understand how neural, cognitive, and linguistic processing mechanisms, the requirements of language use in interaction, and language transmission (for instance, learning constraints) shape the multimodal structures of language. The overall aim is therefore to unravel the cognitive and social foundations of the human capacity for language by treating its multimodal and crosslinguistic diversity as a fundamental design feature.

Our researchers combine multiple methods, such as corpus and computational linguistics, experimental methods, machine learning, AI, and virtual reality, to investigate multimodal language structure, use, processing, and transmission. We work with users of a variety of signed and spoken languages around the world, as well as with individuals who have different access to sensory experience, such as deaf and blind language users, people in different age groups, and people with autism spectrum disorder.


Asli Ozyurek

Multimodal Language Department
+31 24 3521304
Asli [dot] Ozyurek [at] mpi [dot] nl


News
  • Photo: Sharice Clough beside her research poster "Spatial narratives from remote and recent memory in individuals with Alzheimer’s disease and healthy older adults: A multimodal perspective".
    10 June 2024

    Sharice Clough presented at DUCOG Conference

    Sharice presented preliminary findings from a study examining the multimodal language production of healthy older adults and individuals with Alzheimer's disease producing spatial narratives from...

  • Photo: A researcher fine-tunes the Furhat robot head, which displays a projected face, in a lab setting.
    14 May 2024

    Chinmaya Mishra in the news!

    Chinmaya Mishra's research on automating robots' gaze and emotions is now featured on several websites, showcasing how robots can better interact with us. Catch the full story online for a glimpse at...
