Chang, F., Bauman, M., Pappert, S., & Fitz, H. (2015). Do lemmas speak German?: A verb position effect in German structural priming. Cognitive Science, 39(5), 1113-1130. doi:10.1111/cogs.12184.
Abstract
Lexicalized theories of syntax often assume that verb-structure regularities are mediated by lemmas, which abstract over variation in verb tense and aspect. German syntax seems to challenge this assumption, because verb position depends on tense and aspect. To examine how German speakers link these elements, a structural priming study was performed which varied syntactic structure, verb position (encoded by tense and aspect), and verb overlap. Abstract structural priming was found, both within and across verb position, but priming was larger when the verb position was the same between prime and target. Priming was boosted by verb overlap, but there was no interaction with verb position. The results can be explained by a lemma model where tense and aspect are linked to structural choices in German. Since the architecture of this lemma model is not consistent with results from English, a connectionist model was developed which could explain the cross-linguistic variation in the production system. Together, these findings support the view that language learning plays an important role in determining the nature of structural priming in different languages.
Chang, F., & Fitz, H. (2014). Computational models of sentence production: A dual-path approach. In M. Goldrick, & M. Miozzo (Eds.), The Oxford handbook of language production (pp. 70-89). Oxford: Oxford University Press.
Abstract
Sentence production is the process we use to create language-specific sentences that convey particular meanings. In production, there are complex interactions between meaning, words, and syntax at different points in sentences. Computational models can make these interactions explicit, and connectionist learning algorithms have been useful for building such models. Connectionist models use domain-general mechanisms to learn internal representations, and these mechanisms can also explain evidence of long-term syntactic adaptation in adult speakers. This paper will review work showing that these models can generalize words in novel ways and learn typologically-different languages like English and Japanese. It will also present modeling work which shows that connectionist learning algorithms can account for complex sentence production in children and adult production phenomena like structural priming, heavy NP shift, and conceptual/lexical accessibility.
Fitz, H. (2014). Computermodelle für Spracherwerb und Sprachproduktion [Computational models of language acquisition and language production]. Forschungsbericht 2014 - Max-Planck-Institut für Psycholinguistik. In Max-Planck-Gesellschaft Jahrbuch 2014. München: Max Planck Society for the Advancement of Science. Retrieved from http://www.mpg.de/7850678/Psycholinguistik_JB_2014?c=8236817.
Abstract
Relative clauses are a syntactic device to create complex sentences and they make language structurally productive. Despite a considerable number of experimental studies, it is still largely unclear how children learn relative clauses and how these are processed in the language system. Researchers at the MPI for Psycholinguistics used a computational learning model to gain novel insights into these issues. The model explains the differential development of relative clauses in English as well as cross-linguistic differences.
Brouwer, H., Fitz, H., & Hoeks, J. C. (2010). Modeling the noun phrase versus sentence coordination ambiguity in Dutch: Evidence from Surprisal Theory. In Proceedings of the 2010 Workshop on Cognitive Modeling and Computational Linguistics, ACL 2010 (pp. 72-80). Association for Computational Linguistics.
Abstract
This paper investigates whether surprisal theory can account for differential processing difficulty in the NP-/S-coordination ambiguity in Dutch. Surprisal is estimated using a Probabilistic Context-Free Grammar (PCFG), which is induced from an automatically annotated corpus. We find that our lexicalized surprisal model can account for the reading time data from a classic experiment on this ambiguity by Frazier (1987). We argue that syntactic and lexical probabilities, as specified in a PCFG, are sufficient to account for what is commonly referred to as an NP-coordination preference. -
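The core quantity in the abstract above, surprisal, has a simple definition: the surprisal of a word is the negative log probability of that word given its preceding context. As a minimal illustrative sketch (the probabilities below are invented for illustration, not taken from the paper's PCFG):

```python
import math

def surprisal(p):
    """Surprisal in bits for a word whose conditional probability
    given the preceding context is p: s = -log2(p)."""
    return -math.log2(p)

# Hypothetical conditional probabilities for the word that disambiguates
# a coordination toward the NP- versus the S-reading:
p_np_reading = 0.8  # preferred continuation -> low surprisal
p_s_reading = 0.2   # dispreferred continuation -> high surprisal

print(surprisal(p_np_reading))  # lower value, predicting easier processing
print(surprisal(p_s_reading))   # higher value, predicting longer reading times
```

On this view, a preference such as the NP-coordination preference falls out of the probabilities alone: the less expected continuation carries higher surprisal and is predicted to be harder to read.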
Fitz, H. (2010). Statistical learning of complex questions. In S. Ohlsson, & R. Catrambone (Eds.), Proceedings of the 32nd Annual Conference of the Cognitive Science Society (pp. 2692-2698). Austin, TX: Cognitive Science Society.
Abstract
The problem of auxiliary fronting in complex polar questions occupies a prominent position within the nature versus nurture controversy in language acquisition. We employ a model of statistical learning which uses sequential and semantic information to produce utterances from a bag of words. This linear learner is capable of generating grammatical questions without exposure to these structures in its training environment. We also demonstrate that the model outperforms n-gram learners on this task. Implications for nativist theories of language acquisition are discussed.