Publications

  • Frank, S. L., & Fitz, H. (2016). Reservoir computing and the Sooner-is-Better bottleneck [Commentary on Christiansen & Chater]. Behavioral and Brain Sciences, 39, e73. doi:10.1017/S0140525X15000783.

    Abstract

    Prior language input is not lost but integrated with the current input. This principle is demonstrated by “reservoir computing”: Untrained recurrent neural networks project input sequences onto a random point in high-dimensional state space. Earlier inputs can be retrieved from this projection, albeit less reliably so as more input is received. The bottleneck is therefore not “Now-or-Never” but “Sooner-is-Better.” (A minimal code sketch of this reservoir idea appears after the publication list below.)
  • Poletiek, F. H., Fitz, H., & Bocanegra, B. R. (2016). What baboons can (not) tell us about natural language grammars. Cognition, 151, 108-112. doi:10.1016/j.cognition.2015.04.016.

    Abstract

    Rey et al. (2012) present data from a study with baboons that they interpret as supporting the idea that center-embedded structures in human language have their origin in low-level memory mechanisms and associative learning. Critically, the authors claim that the baboons showed a behavioral preference for center-embedded sequences over other types of sequences. We argue that the baboons’ response patterns suggest that two mechanisms are involved: first, they can be trained to associate a particular response with a particular stimulus, and, second, when faced with two conditioned stimuli in a row, they respond to the most recent one first, copying behavior they had been rewarded for during training. Although Rey et al.’s (2012) experiment shows that the baboons’ behavior is driven by low-level mechanisms, it is not clear how the animal behavior reported bears on the phenomenon of center-embedded structures in human syntax. Hence, (1) natural language syntax may indeed have been shaped by low-level mechanisms, and (2) the baboons’ behavior is driven by low-level stimulus–response learning, as Rey et al. propose. But is the second evidence for the first? We discuss in what ways this study can and cannot provide evidence for explaining the origin of center-embedded recursion in human grammar. More generally, their study provokes an interesting reflection on the use of animal studies to understand features of the human linguistic system.
  • Chang, F., & Fitz, H. (2014). Computational models of sentence production: A dual-path approach. In M. Goldrick, & M. Miozzo (Eds.), The Oxford handbook of language production (pp. 70-89). Oxford: Oxford University Press.

    Abstract

    Sentence production is the process we use to create language-specific sentences that convey particular meanings. In production, there are complex interactions between meaning, words, and syntax at different points in sentences. Computational models can make these interactions explicit, and connectionist learning algorithms have been useful for building such models. Connectionist models use domain-general mechanisms to learn internal representations, and these mechanisms can also explain evidence of long-term syntactic adaptation in adult speakers. This paper reviews work showing that these models can generalize words in novel ways and learn typologically different languages like English and Japanese. It also presents modeling work showing that connectionist learning algorithms can account for complex sentence production in children and for adult production phenomena like structural priming, heavy NP shift, and conceptual/lexical accessibility.
  • Fitz, H. (2014). Computermodelle für Spracherwerb und Sprachproduktion [Computer models of language acquisition and language production]. Forschungsbericht 2014 - Max-Planck-Institut für Psycholinguistik. In Max-Planck-Gesellschaft Jahrbuch 2014. München: Max Planck Society for the Advancement of Science. Retrieved from http://www.mpg.de/7850678/Psycholinguistik_JB_2014?c=8236817.

    Abstract

    Relative clauses are a syntactic device to create complex sentences, and they make language structurally productive. Despite a considerable number of experimental studies, it is still largely unclear how children learn relative clauses and how these are processed in the language system. Researchers at the MPI for Psycholinguistics used a computational learning model to gain novel insights into these issues. The model explains the differential development of relative clauses in English as well as cross-linguistic differences.
  • Fitz, H. (2001). Church's Thesis: A philosophical critique of modern computability theory. Master's thesis, Freie Universität Berlin.
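
The reservoir-computing principle summarized in the Frank & Fitz (2016) abstract above can be illustrated with a small simulation. The sketch below is not taken from the paper; the reservoir size, weight scaling, input statistics, and probe delays are all illustrative assumptions. It drives an untrained, randomly connected recurrent network with random input sequences and then fits only a linear readout to recover the input presented several steps earlier, showing that earlier inputs remain retrievable from the final state but typically become less so as more input intervenes.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, seq_len, n_seqs = 100, 60, 1000           # reservoir size, sequence length, number of sequences

# Fixed, untrained (random) weights: this is the "reservoir".
W_in = rng.uniform(-0.5, 0.5, n_res)             # input weights for a scalar input
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def final_state(u):
    """Drive the reservoir with input sequence u; return the last hidden state."""
    x = np.zeros(n_res)
    for u_t in u:
        x = np.tanh(W_in * u_t + W @ x)
    return x

inputs = rng.uniform(-1.0, 1.0, (n_seqs, seq_len))
states = np.array([final_state(u) for u in inputs])

train, test = slice(0, 800), slice(800, None)
for delay in (1, 5, 10, 25):
    target = inputs[:, -1 - delay]               # the input presented `delay` steps before the end
    # Fit a linear readout by ordinary least squares on the training sequences only.
    readout, *_ = np.linalg.lstsq(states[train], target[train], rcond=None)
    pred = states[test] @ readout
    corr = np.corrcoef(pred, target[test])[0, 1]
    print(f"delay {delay:2d}: held-out readout correlation {corr:.2f}")
```

The design point this sketch highlights is that the recurrent weights stay fixed and random; only the linear readout is trained, which is what distinguishes reservoir computing (e.g., echo state networks) from fully trained recurrent networks.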
