As we listen to speech or read a text, the input we receive is often highly ambiguous. Ambiguities arise when a word or sentence can be interpreted in multiple ways, or when the correct interpretation hinges on subtle distinctions that get lost in the sloppy way we articulate casual speech or the jerky, erratic way we move our eyes as we read. Luckily, we are mostly unaware of these ambiguities. No one thinks of furniture when a scientific paper refers to a table, or mistakes the request to "be quiet" for the name "Beek Whyat". The reason ambiguities remain unnoticed is clear: context. However, just how the brain integrates expectations derived from context with sensory input to arrive at understanding remains obscure. This question lies at the heart of my PhD project. I am interested in all linguistic levels, from word contexts in letter perception to paragraph contexts in sentence processing, and in the effects of expectations on perception more generally. I use computational models borrowed from cognitive (neuro)science and AI to substantiate my work. I am co-supervised by Floris de Lange and Peter Hagoort, and am also part of the Predictive Brain Lab at Donders.