I am a cognitive neuroscientist interested in how we (learn to) communicate with each other, before and beyond words. How do we use verbal and non-verbal communication to exchange meanings with other individuals? How do we combine communicative signals from the different senses?
I have always been fascinated by how our brain merges information from our complex multisensory world, and how (a)typical development shapes this crucial ability.
Accordingly, for my master's thesis at CIMeC (Italy), I worked with Prof. Olivier Collignon on cross-modal plasticity in the auditory cortex of congenitally deaf individuals. In 2020 I completed a PhD in computational neuroscience with Prof. Uta Noppeney at the University of Birmingham (UK). There, I studied multimodal integration within the framework of Bayesian Causal Inference, the state-of-the-art principled computational account of how the brain binds information across the senses. Having approached perception from a Bayesian perspective, I started wondering how priors are established and how they evolve as a function of experience. Hence, I worked as a postdoc in the Predictive Brain Lab of Prof. Floris de Lange at the Donders Institute, where I studied how predictions are established through statistical learning and how they influence perceptual processing in the brain.
As a postdoc in the Neurobiology of Language Department, I now plan to bring my expertise in multimodal integration and prediction to the realm of communication and language, studying both the adult and the developing human brain.
For updates on my work, please also see my Google Scholar profile.