Fitz, H., Uhlmann, M., Van den Broek, D., Duarte, R., Hagoort, P., & Petersson, K. M. (2020). Neuronal spike-rate adaptation supports working memory in language processing. Proceedings of the National Academy of Sciences of the United States of America, 117(34), 20881-20889. doi:10.1073/pnas.2000222117.
Abstract
Language processing involves the ability to store and integrate pieces of information in working memory over short periods of time. According to the dominant view, information is maintained through sustained, elevated neural activity. Other work has argued that short-term synaptic facilitation can serve as a substrate of memory. Here, we propose an account where memory is supported by intrinsic plasticity that downregulates neuronal firing rates. Single-neuron responses are dependent on experience, and we show through simulations that these adaptive changes in excitability provide memory on timescales ranging from milliseconds to seconds. On this account, spiking activity writes information into coupled dynamic variables that control adaptation and move at slower timescales than the membrane potential. From these variables, information is continuously read back into the active membrane state for processing. This neuronal memory mechanism does not rely on persistent activity, excitatory feedback, or synaptic plasticity for storage. Instead, information is maintained in adaptive conductances that reduce firing rates and can be accessed directly without cued retrieval. Memory span is systematically related to both the time constant of adaptation and baseline levels of neuronal excitability. Interference effects within memory arise when adaptation is long-lasting. We demonstrate that this mechanism is sensitive to context and serial order, which makes it suitable for temporal integration in sequence processing within the language domain. We also show that it enables the binding of linguistic features over time within dynamic memory registers. This work provides a step towards a computational neurobiology of language.
Chang, F., & Fitz, H. (2014). Computational models of sentence production: A dual-path approach. In M. Goldrick & M. Miozzo (Eds.), The Oxford handbook of language production (pp. 70-89). Oxford: Oxford University Press.
Abstract
Sentence production is the process we use to create language-specific sentences that convey particular meanings. In production, there are complex interactions between meaning, words, and syntax at different points in sentences. Computational models can make these interactions explicit, and connectionist learning algorithms have been useful for building such models. Connectionist models use domain-general mechanisms to learn internal representations, and these mechanisms can also explain evidence of long-term syntactic adaptation in adult speakers. This paper will review work showing that these models can generalize words in novel ways and learn typologically different languages like English and Japanese. It will also present modeling work which shows that connectionist learning algorithms can account for complex sentence production in children and adult production phenomena like structural priming, heavy NP shift, and conceptual/lexical accessibility.
Fitz, H. (2014). Computermodelle für Spracherwerb und Sprachproduktion [Computer models of language acquisition and language production]. Forschungsbericht 2014 - Max-Planck-Institut für Psycholinguistik. In Max-Planck-Gesellschaft Jahrbuch 2014. München: Max Planck Society for the Advancement of Science. Retrieved from http://www.mpg.de/7850678/Psycholinguistik_JB_2014?c=8236817.
Abstract
Relative clauses are a syntactic device to create complex sentences, and they make language structurally productive. Despite a considerable number of experimental studies, it is still largely unclear how children learn relative clauses and how these are processed in the language system. Researchers at the MPI for Psycholinguistics used a computational learning model to gain novel insights into these issues. The model explains the differential development of relative clauses in English as well as cross-linguistic differences.