Falk Huettig

Presentations

  • Favier, S., Meyer, A. S., & Huettig, F. (2018). How does literacy influence syntactic processing in spoken language? Talk presented at Psycholinguistics in Flanders (PiF 2018). Gent, Belgium. 2018-06-04 - 2018-06-05.
  • Garrido Rodriguez, G., Huettig, F., Norcliffe, E., Brown, P., & Levinson, S. C. (2018). Participant assignment to thematic roles in Tzeltal: Eye tracking evidence from sentence comprehension in a verb-initial language. Talk presented at Architectures and Mechanisms for Language Processing (AMLaP 2018). Berlin, Germany. 2018-09-06 - 2018-09-08.
  • Huettig, F. (2018). How learning to read changes mind and brain [keynote]. Talk presented at Architectures and Mechanisms for Language Processing-Asia (AMLaP-Asia 2018). Telangana, India. 2018-02-01 - 2018-02-03.
  • Araújo, S., Huettig, F., & Meyer, A. S. (2016). What's the nature of the deficit underlying impaired naming? An eye-tracking study with dyslexic readers. Talk presented at IWORDD - International Workshop on Reading and Developmental Dyslexia. Bilbao, Spain. 2016-05-05 - 2016-05-07.

    Abstract

    Serial naming deficits have been identified as core symptoms of developmental dyslexia. A prominent hypothesis is that naming delays are due to inefficient phonological encoding, yet the exact nature of this underlying impairment remains largely underspecified. Here we used recordings of eye movements and word onset latencies to examine at which processing level the dyslexic naming deficit emerges: at an early stage of lexical encoding, or later, at the level of phonetic or motor planning. Twenty-three dyslexic and 25 control adult readers were tested on a serial object naming task for 30 items and an analogous reading task, in which phonological neighborhood density and word frequency were manipulated. Results showed that both word properties influenced early stages of phonological activation (first fixation and first-pass duration) equally in both groups of participants. Moreover, in the control group any difficulty appeared to be resolved early in the reading process, while for dyslexic readers a processing disadvantage for low-frequency words and for words with sparse neighborhoods also emerged in a measure that included late stages of output planning (eye-voice span). Thus, our findings suggest suboptimal phonetic and/or articulatory planning in dyslexia.
  • Eisner, F., Kumar, U., Mishra, R. K., Nand Tripathi, V., Guleria, A., Prakash Singh, J., & Huettig, F. (2016). Literacy acquisition drives hemispheric lateralization of reading. Talk presented at Architectures and Mechanisms for Language Processing (AMLaP 2016). Bilbao, Spain. 2016-09-01 - 2016-09-03.

    Abstract

    Reading functions beyond early visual processing are known to be lateralized to the left hemisphere, but how left-lateralization arises during literacy acquisition is an open question. Bilateral processing or rightward asymmetries have previously been associated with developmental dyslexia. However, it is unclear at present to what extent this lack of left-lateralization reflects differences in reading ability. In this study, a group of illiterate adults in rural India (N=29) participated in a literacy training program over the course of six months. fMRI measures were obtained before and after training on a number of different visual stimulus categories, including written sentences, false fonts, and object categories such as houses and faces. This training group was matched on demographic and socioeconomic variables to an illiterate no-training group and to low- and highly-literate control groups, who were also scanned twice but received no training (total N=90). In a cross-sectional analysis before training, reading ability was positively correlated with increased BOLD responses in a left-lateralized network including the dorsal and ventral visual streams for text and false fonts, but not for other types of visual stimuli. A longitudinal analysis of learning effects in the training group showed that beginning readers engage bilateral networks more than proficient readers. Lateralization of BOLD responses was further examined by calculating laterality indices in specific regions. We observed training-related changes in lateralization for processing written stimuli in a number of subregions in the dorsal and ventral visual streams, as well as in the cerebellum. Together with the cross-sectional results, these data suggest a causal relationship between reading ability and the degree of hemispheric asymmetry in processing written materials.
  • Eisner, F., Kumar, U., Mishra, R. K., Nand Tripathi, V., Guleria, A., Prakash Singh, J., & Huettig, F. (2016). Literacy acquisition drives hemispheric lateralization of reading. Talk presented at the 31st International Congress of Psychology (ICP2016). Yokohama, Japan. 2016-07-24 - 2016-07-29.

    Abstract

    Reading functions beyond early visual processing are known to be lateralized to the left hemisphere, but how left-lateralization arises during literacy acquisition is an open question. Bilateral processing or rightward asymmetries have previously been associated with developmental dyslexia. However, it is unclear at present to what extent this lack of left-lateralization reflects differences in reading ability. In this study, a group of illiterate adults in rural India (N=29) participated in a literacy training program over the course of six months. fMRI measures were obtained before and after training on a number of different visual stimulus categories, including written sentences, false fonts, and object categories such as houses and faces. This training group was matched on demographic and socioeconomic variables to an illiterate no-training group and to low- and highly-literate control groups, who were also scanned twice but received no training (total N=90). In a cross-sectional analysis before training, reading ability was positively correlated with increased BOLD responses in a left-lateralized network including the dorsal and ventral visual streams for text and false fonts, but not for other types of visual stimuli. A longitudinal analysis of learning effects in the training group showed that beginning readers engage bilateral networks more than proficient readers. Lateralization of BOLD responses was further examined by calculating laterality indices in specific regions. We observed training-related changes in lateralization for processing written stimuli in a number of subregions in the dorsal and ventral visual streams, as well as in the cerebellum. Together with the cross-sectional results, these data suggest a causal relationship between reading ability and the degree of hemispheric asymmetry in processing written materials.
  • Huettig, F. (2016). Is prediction necessary to understand language? Talk presented at the RefNet Round Table conference. Aberdeen, Scotland. 2016-01-15 - 2016-01-16.

    Abstract

    Many psycholinguistic experiments suggest that prediction is an important characteristic of language processing. Some recent theoretical accounts in the cognitive sciences (e.g., Clark, 2013; Friston, 2010) and psycholinguistics (e.g., Dell & Chang, 2014) appear to suggest that prediction is even necessary to understand language. I will evaluate this proposal. I will first discuss several arguments that may appear to be in line with the notion that prediction is necessary for language processing. These arguments include that prediction provides a unified theoretical principle of the human mind and that it pervades cortical function. I will discuss whether evidence of human abilities to detect statistical regularities is necessarily evidence for predictive processing and evaluate suggestions that prediction is necessary for language learning. Five arguments are then presented that question the claim that all language processing is predictive in nature. I point out that not all language users appear to predict language and that suboptimal input often makes prediction very challenging. Prediction, moreover, is strongly context-dependent and impeded by resource limitations. I will also argue that it may be problematic that most experimental evidence for predictive language processing comes from 'prediction-encouraging' experimental set-ups. Finally, I will discuss possible ways that may lead to a further resolution of this debate. I conclude that languages can be learned and understood in the absence of prediction. Claims that all language processing is predictive in nature are premature.
  • Huettig, F. (2016). The effect of learning to read on the neural systems for vision and language: A longitudinal approach with illiterate participants. Talk presented at the Psychology Department, University of Brussels. Brussels, Belgium. 2016-10.
  • Huettig, F., Kumar, U., Mishra, R. K., Tripathi, V., Guleria, A., Prakash Singh, J., & Eisner, F. (2016). The effect of learning to read on the neural systems for vision and language: A longitudinal approach with illiterate participants. Talk presented at the International meeting of the Psychonomic Society. Granada, Spain. 2016-05-05 - 2016-05-08.

    Abstract

    How do human cultural inventions such as reading result in neural re-organization? In this first longitudinal study with young completely illiterate adult participants, we measured brain responses to speech, text, and other categories of visual stimuli with fMRI before and after a group of illiterate participants in India completed a literacy training program in which they learned to read and write Devanagari script. A literate and an illiterate no-training control group were matched to the training group in terms of socioeconomic background and were recruited from the same societal community in two villages of a rural area near Lucknow, India. This design permitted investigating effects of literacy cross-sectionally across groups before training (N=86) as well as longitudinally (training group N=25). The two analysis approaches yielded converging results: Literacy was associated with enhanced, left-lateralized responses to written text along the ventral stream (including lingual gyrus, fusiform gyrus, and parahippocampal gyrus), dorsal stream (intraparietal sulcus), and (pre-) motor systems (pre-central sulcus, supplementary motor area) and thalamus (pulvinar). Significantly reduced responses were observed bilaterally in the superior parietal lobe (precuneus) and in the right angular gyrus. These effects corroborate and extend previous findings from cross-sectional studies. However, effects of literacy were specific to written text and (to a lesser extent) to false fonts. We did not find any evidence for effects of literacy on responses in the auditory cortex in our Hindi-speaking participants. This raises questions about the extent to which phonological representations are altered by literacy acquisition.
  • Ostarek, M., & Huettig, F. (2016). Sensory representations are causally involved in cognition but only when the task requires it. Talk presented at the 3rd Attentive Listener in the Visual World (AttLis) workshop. Potsdam, Germany. 2016-05-10 - 2016-05-11.
  • Smith, A. C., Monaghan, P., & Huettig, F. (2016). The multimodal nature of spoken word processing in the visual world: Testing the predictions of alternative models of multimodal integration. Talk presented at the 15th Neural Computation and Psychology Workshop: Contemporary Neural Network Models (NCPW15). Philadelphia, PA, USA. 2016-08-08 - 2016-08-09.
  • Speed, L., Chen, J., Huettig, F., & Majid, A. (2016). Do classifier categories affect or reflect object concepts? Talk presented at the 38th Annual Meeting of the Cognitive Science Society (CogSci 2016). Philadelphia, PA, USA. 2016-08-10 - 2016-08-13.

    Abstract

    We conceptualize objects based on sensory and motor information gleaned from real-world experience. But to what extent is such conceptual information structured according to higher level linguistic features too? Here we investigate whether classifiers, a grammatical category, shape the conceptual representations of objects. In three experiments native Mandarin speakers (speakers of a classifier language) and native Dutch speakers (speakers of a language without classifiers) judged the similarity of a target object (presented as a word or picture) with four objects (presented as words or pictures). One object shared a classifier with the target, the other objects did not, serving as distractors. Across all experiments, participants judged the target object as more similar to the object with the shared classifier than distractor objects. This effect was seen in both Dutch and Mandarin speakers, and there was no difference between the two languages. Thus, even speakers of a non-classifier language are sensitive to object similarities underlying classifier systems, and using a classifier system does not exaggerate these similarities. This suggests that classifier systems simply reflect, rather than affect, conceptual structure.
  • Eisner, F., Kumar, U., Mishra, R. K., Nand Tripathi, V., Guleria, A., Singh, P., & Huettig, F. (2015). The effect of literacy acquisition on cortical and subcortical networks: A longitudinal approach. Talk presented at the 7th Annual Meeting of the Society for the Neurobiology of Language. Chicago, US. 2015-10-15 - 2015-10-17.

    Abstract

    How do human cultural inventions such as reading result in neural re-organization? Previous cross-sectional studies have reported extensive effects of literacy on the neural systems for vision and language (Dehaene et al [2010, Science], Castro-Caldas et al [1998, Brain], Petersson et al [1998, NeuroImage], Carreiras et al [2009, Nature]). In this first longitudinal study with completely illiterate participants, we measured brain responses to speech, text, and other categories of visual stimuli with fMRI before and after a group of illiterate participants in India completed a literacy training program in which they learned to read and write Devanagari script. A literate and an illiterate no-training control group were matched to the training group in terms of socioeconomic background and were recruited from the same societal community in two villages of a rural area near Lucknow, India. This design permitted investigating effects of literacy cross-sectionally across groups before training (N=86) as well as longitudinally (training group N=25). The two analysis approaches yielded converging results: Literacy was associated with enhanced, mainly left-lateralized responses to written text along the ventral stream (including lingual gyrus, fusiform gyrus, and parahippocampal gyrus), dorsal stream (intraparietal sulcus), and (pre-) motor systems (pre-central sulcus, supplementary motor area), thalamus (pulvinar), and cerebellum. Significantly reduced responses were observed bilaterally in the superior parietal lobe (precuneus) and in the right angular gyrus. These positive effects corroborate and extend previous findings from cross-sectional studies. However, effects of literacy were specific to written text and (to a lesser extent) to false fonts. Contrary to previous research, we found no direct evidence of literacy affecting the processing of other types of visual stimuli such as faces, tools, houses, and checkerboards. Furthermore, unlike in some previous studies, we did not find any evidence for effects of literacy on responses in the auditory cortex in our Hindi-speaking participants. We conclude that learning to read has a specific and extensive effect on the processing of written text along the visual pathways, including low-level thalamic nuclei, high-level systems in the intraparietal sulcus and the fusiform gyrus, and motor areas. The absence of an effect of literacy on responses in the auditory cortex in particular raises questions about the extent to which phonological representations in the auditory cortex are altered by literacy acquisition or recruited online during reading.
  • de Groot, F., Huettig, F., & Olivers, C. N. (2015). Semantic influences on visual attention. Talk presented at the 15th NVP Winter Conference. Egmond aan Zee, The Netherlands. 2015-12-17 - 2015-12-19.

    Abstract

    To what extent is visual attention driven by the semantics of individual objects, rather than by their visual appearance? To investigate this we continuously measured eye movements, while observers searched through displays of common objects for an aurally instructed target. On crucial trials, the target was absent, but the display contained objects that were either semantically or visually related to the target. We hypothesized that timing is crucial in the occurrence and strength of semantic influences on visual orienting, and therefore presented the target instruction either before, during, or after (memory-based search) picture onset. When the target instruction was presented before picture onset we found a substantial, but delayed, bias in orienting towards semantically related objects as compared to visually related objects. However, this delay disappeared when the visual information was presented before the target instruction. Furthermore, the temporal dynamics of the semantic bias did not change in the absence of visual competition. These results point to cascaded but independent influences of semantic and visual representations on attention. In addition, the results of the memory-based search studies suggest that visual and semantic biases only arise when the visual stimuli are present. Although we consistently found that people fixate at locations previously occupied by the target object (a replication of earlier findings), we did not find such biases for visually or semantically related objects. Overall, our studies show that the question whether visual orienting is driven by semantic content is better rephrased as when visual orienting is driven by semantic content.
  • de Groot, F., Huettig, F., & Olivers, C. (2015). When meaning matters: The temporal dynamics of semantic influences on visual attention. Talk presented at the 23rd Annual Workshop on Object Perception, Attention, and Memory. Chicago, USA. 2015-10-19.
  • Hintz, F., Meyer, A. S., & Huettig, F. (2015). Context-dependent employment of mechanisms in anticipatory language processing. Talk presented at the 15th NVP Winter Conference. Egmond aan Zee, The Netherlands. 2015-12-17 - 2015-12-19.
  • Huettig, F. (2015). Cause or effect? What commonalities between illiterates and individuals with dyslexia can tell us about dyslexia. Talk presented at the Reading in the Forest workshop. Annweiler, Germany. 2015-10-26 - 2015-10-28.

    Abstract

    I will discuss recent research with illiterates and individuals with dyslexia which suggests that many cognitive 'deficiencies' proposed as possible causes of dyslexia are simply a consequence of decreased reading experience. I will argue that in order to make further progress towards an understanding of the causes of dyslexia it is necessary to appropriately distinguish between cause and effect.
  • Huettig, F. (2015). Effekte der Literalität auf die Kognition [Effects of literacy on cognition]. Talk presented at Die Abschlußtagung des Verbundprojekts Alpha plus Job [final conference of the joint project Alpha plus Job]. Bamberg, Germany. 2015-01.
  • Huettig, F. (2015). The effect of learning to read on the neural systems for vision and language: A longitudinal approach with illiterate participants. Talk presented at the Individual differences in language processing across the adult life span workshop. Nijmegen, The Netherlands. 2015-12-10 - 2015-12-11.
  • Huettig, F. (2015). The effect of learning to read on the neural systems for vision and language: A longitudinal approach with illiterate participants. Talk presented at the Psychology Department, University of York. York, UK. 2015-11.
  • Huettig, F. (2015). The effect of learning to read on the neural systems for vision and language: A longitudinal approach with illiterate participants. Talk presented at the Psychology Department, University of Leeds. Leeds, UK. 2015-11.
  • Huettig, F. (2015). The effect of learning to read on the neural systems for vision and language: A longitudinal approach with illiterate participants. Talk presented at the Psychology Department, University of Glasgow. Glasgow, Scotland. 2015-11.
  • Huettig, F. (2015). The effect of learning to read on the neural systems for vision and language: A longitudinal approach with illiterate participants. Talk presented at the Psychology Department, University of Edinburgh. Edinburgh, Scotland. 2015-09.
  • Huettig, F., Kumar, U., Mishra, R. K., Tripathi, V., Guleria, A., Prakash Singh, J., & Eisner, F. (2015). The effect of learning to read on the neural systems for vision and language: A longitudinal approach with illiterate participants. Talk presented at the 21st Annual Conference on Architectures and Mechanisms for Language Processing (AMLaP 2015). Valletta, Malta. 2015-09-03 - 2015-09-05.

    Abstract

    How do human cultural inventions such as reading result in neural re-organization? In this first longitudinal study with young completely illiterate adult participants, we measured brain responses to speech, text, and other categories of visual stimuli with fMRI before and after a group of illiterate participants in India completed a literacy training program in which they learned to read and write Devanagari script. A literate and an illiterate no-training control group were matched to the training group in terms of socioeconomic background and were recruited from the same societal community in two villages of a rural area near Lucknow, India. This design permitted investigating effects of literacy cross-sectionally across groups before training (N=86) as well as longitudinally (training group N=25). The two analysis approaches yielded converging results: Literacy was associated with enhanced, left-lateralized responses to written text along the ventral stream (including lingual gyrus, fusiform gyrus, and parahippocampal gyrus), dorsal stream (intraparietal sulcus), and (pre-) motor systems (pre-central sulcus, supplementary motor area) and thalamus (pulvinar). Significantly reduced responses were observed bilaterally in the superior parietal lobe (precuneus) and in the right angular gyrus. These effects corroborate and extend previous findings from cross-sectional studies. However, effects of literacy were specific to written text and (to a lesser extent) to false fonts. We did not find any evidence for effects of literacy on responses in the auditory cortex in our Hindi-speaking participants. This raises questions about the extent to which phonological representations are altered by literacy acquisition.
  • Huettig, F., Kumar, U., Mishra, R. K., Tripathi, V., Guleria, A., Prakash Singh, J., & Eisner, F. (2015). The effect of learning to read on the neural systems for vision and language: A longitudinal approach with illiterate participants. Talk presented at the 19th Meeting of the European Society for Cognitive Psychology (ESCoP 2015). Paphos, Cyprus. 2015-09-17 - 2015-09-20.

    Abstract

    How do human cultural inventions such as reading result in neural re-organization? In this first longitudinal study with young completely illiterate adult participants, we measured brain responses to speech, text, and other categories of visual stimuli with fMRI before and after a group of illiterate participants in India completed a literacy training program in which they learned to read and write Devanagari script. A literate and an illiterate no-training control group were matched to the training group in terms of socioeconomic background and were recruited from the same societal community in two villages of a rural area near Lucknow, India. This design permitted investigating effects of literacy cross-sectionally across groups before training (N=86) as well as longitudinally (training group N=25). The two analysis approaches yielded converging results: Literacy was associated with enhanced, left-lateralized responses to written text along the ventral stream (including lingual gyrus, fusiform gyrus, and parahippocampal gyrus), dorsal stream (intraparietal sulcus), and (pre-) motor systems (pre-central sulcus, supplementary motor area) and thalamus (pulvinar). Significantly reduced responses were observed bilaterally in the superior parietal lobe (precuneus) and in the right angular gyrus. These effects corroborate and extend previous findings from cross-sectional studies. However, effects of literacy were specific to written text and (to a lesser extent) to false fonts. We did not find any evidence for effects of literacy on responses in the auditory cortex in our Hindi-speaking participants. This raises questions about the extent to which phonological representations are altered by literacy acquisition.
  • Mani, N., Daum, M., & Huettig, F. (2015). “Pro-active” in Many Ways: Evidence for Multiple Mechanisms in Prediction. Talk presented at the Biennial Meeting of the Society for Research in Child Development (SRCD 2015). Philadelphia, Pennsylvania, USA. 2015-03-19 - 2015-03-21.
  • Smith, A. C., Monaghan, P., & Huettig, F. (2016). The effects of orthographic transparency on the reading system: Insights from a computational model of reading development. Talk presented at the Experimental Psychology Society, London Meeting. London, U.K. 2016-01-06 - 2016-01-08.
  • Hintz, F., Meyer, A. S., & Huettig, F. (2014). Mechanisms underlying predictive language processing. Talk presented at the 56. Tagung experimentell arbeitender Psychologen [TeaP, Conference on Experimental Psychology]. Giessen, Germany. 2014-03-31 - 2014-04-02.
  • Hintz, F., Meyer, A. S., & Huettig, F. (2014). The influence of verb-specific featural restrictions, word associations, and production-based mechanisms on language-mediated anticipatory eye movements. Talk presented at the 27th annual CUNY conference on human sentence processing. Ohio State University, Columbus/Ohio (US). 2014-03-13 - 2014-03-15.
  • Huettig, F., & Guerra, E. (2014). Context-dependent mapping of linguistic and color representations challenges strong forms of embodiment. Talk presented at the 20th Architectures and Mechanisms for Language Processing Conference (AMLAP 2014). Edinburgh, UK. 2014-09-03 - 2014-09-06.

    Abstract

    A central claim of embodied theories of cognition is that sensory representations are routinely activated and influence language processing even in the absence of relevant sensory input (cf. Pulvermüller, 2005; Wassenburg & Zwaan, 2010). We tested the influence of color representations during language processing in three visual world eye-tracking experiments. The method is particularly well suited to investigate this issue because the availability of relevant visual input can be manipulated. We made use of the phenomenon that when participants hear a word that refers to a visual object or printed word they quickly direct their eye gaze to objects or printed words which are similar (e.g. semantically or visually) to the heard word. We used a look and listen task which has previously been shown to be sensitive to such relationships between spoken words and visual items. In Experiment 1, on experimental trials, participants listened to sentences containing a critical target word associated with a prototypical color (e.g. '...spinach...') as they inspected a visual display with four words printed in black font. One of the four printed words was associated with the same prototypical color (e.g. green) as the spoken target word (e.g. FROG). On experimental trials, the spoken target word did not have a printed word counterpart (SPINACH was not present in the display). In filler trials (70% of trials) the target was present in the display and attracted significantly more overt attention than the unrelated distractors. In experimental trials color competitors were not looked at more than the distractors. In Experiment 2 the printed words were replaced with line drawings of the objects. In order to direct the attentional focus of our participants toward color features we used a within-participants counter-balanced design and alternated color and greyscale trials randomly throughout the experiment. Therefore, on one trial our participants heard a word such as 'spinach' and saw a frog (colored in green) in the visual display. On the next trial, however, they saw a banana (in greyscale) on hearing 'canary' (bananas and canaries are typically yellow), etc. The presence (or absence) of color was thus a salient property of the experiment. Participants looked more at color competitors than unrelated distractors on hearing the target word in the color trials but not in the greyscale trials, i.e. on hearing 'spinach' they looked at the green frog but not the greyscale frog. Experiment 3 was identical to Experiment 2, except that the visual display was removed at sentence onset, after a longer preview. This experiment examined whether the continued presence of color in the immediate visual environment was necessary for the observation of color-mediated eye movements. Eye movements directed towards the now blank screen were recorded as the sentence unfolded (cf. Spivey & Geng, 2001). In the filler trials, participants looked significantly more at the locations where the targets, rather than the distractors, had been previously presented as the target words acoustically unfolded. In the experimental trials, the locations where the color competitors had previously been presented did not attract increased attention (neither in color nor greyscale trials). These data demonstrate that language-mediated eye movements are only influenced by color relations between spoken words and visually displayed items if color is present in the immediate visual environment. We conclude that color representations are unlikely to be routinely activated in language processing. Our findings provide strong constraints for embodied theories of cognition which assume that sensory representations influence language processing even in the absence of relevant sensory input. These results fit best with the notion that the main role of sensory representations in language processing is a different one, namely to contextualize language in the immediate environment, connecting language to the here and now.
  • Huettig, F. (2015). Does prediction in language comprehension involve language production? Talk presented at the Comprehension=Production? workshop. Nijmegen, the Netherlands. 2015-03-26 - 2015-03-28.

    Abstract

    The notion that predicting upcoming linguistic information in language comprehension makes use of the production system has recently received much attention (e.g., Chang et al., 2006; Dell & Chang, 2014; Federmeier, 2007; Pickering & Garrod, 2007, 2013; Van Berkum et al., 2005). So far there has been little experimental evidence for a relation between prediction and production. I will discuss the results of several recent eye-tracking experiments with toddlers (Mani & Huettig, 2012) and adults (Rommers et al., submitted; Hintz et al., in prep.) which provide some support for the view that production abilities are linked to language-mediated anticipatory eye movements. These data, however, also indicate that production-based prediction is situation-dependent and only one of many mechanisms supporting prediction. Taken together, these results suggest that multiple-mechanism accounts are required to provide a complete picture of anticipatory language processing.
  • Huettig, F. (2014). How embodied is language processing? Talk presented at the 2nd Attentive Listener in the Visual World workshop. Hyderabad, India. 2014-11-03 - 2014-11-05.
  • Huettig, F. (2014). How literacy acquisition affects the illiterate mind. Talk presented at the Low Educated Second Language and Literacy Acquisition (LESLLA). Nijmegen, Netherlands. 2014-08-28 - 2014-08-30.
  • Huettig, F. (2014). Literacy influences on predictive language processing and visual search. Talk presented at the Priming across Modalities: The Influence of Orthography on Sign and Spoken Language Processing workshop. Haifa, Israel. 2014-04.
  • Huettig, F. (2014). The context-dependent influence of colour representations during language-vision interactions constrains theories of conceptual processing. Talk presented at the Color in Concepts workshop. Düsseldorf, Germany. 2014-06-02 - 2014-06-03.
  • Smith, A. C., Monaghan, P., & Huettig, F. (2014). A comprehensive model of spoken word recognition must be multimodal: Evidence from studies of language mediated visual attention. Talk presented at the 36th Annual Conference of the Cognitive Science Society [CogSci 2014]. Quebec, Canada. 2014-07-23 - 2014-07-26.
  • Smith, A. C., Monaghan, P., & Huettig, F. (2014). Examining strains and symptoms of the ‘Literacy Virus’: The effects of orthographic transparency on phonological processing in a connectionist model of reading. Talk presented at the 36th Annual Conference of the Cognitive Science Society [CogSci 2014]. Quebec, Canada. 2014-07-23 - 2014-07-26.
  • Smith, A. C., Monaghan, P., & Huettig, F. (2014). Examining the effects of orthographic transparency on phonological and semantic processing within a connectionist implementation of the triangle model of reading. Talk presented at the 14th Neural Computation and Psychology Workshop [NCPW 14]. Lancaster, U.K. 2014-08-21 - 2014-08-23.
  • Huettig, F., Singh, N., & Mishra, R. (2010). Language-mediated prediction is contingent upon formal literacy. Talk presented at Brain, Speech and Orthography Workshop. Brussels, Belgium. 2010-10-15 - 2010-10-16.

    Abstract

    A wealth of research has demonstrated that prediction is a core feature of human information processing. Much less is known, however, about the nature and the extent of predictive processing abilities. Here we investigated whether high levels of language expertise attained through formal literacy are related to anticipatory language-mediated visual orienting. Indian low and high literates listened to simple spoken sentences containing a target word (e.g., "door") while at the same time looking at a visual display of four objects (a target, i.e. the door, and three distractors). The spoken sentences were constructed to encourage anticipatory eye movements to visual target objects. High literates started to shift their eye gaze to the target object well before target word onset. In the low literacy group this shift of eye gaze occurred more than a second later, well after the onset of the target. Our findings suggest that formal literacy is crucial for the fine-tuning of language-mediated anticipatory mechanisms, abilities which proficient language users can then exploit for other cognitive activities such as language-mediated visual orienting.
  • Huettig, F. (2010). Looking, language, and memory. Talk presented at Language, Cognition, and Emotion Workshop. Delhi, India. 2010-12-06 - 2010-12-06.
  • Huettig, F. (2010). Toddlers’ language-mediated visual search: They need not have the words for it. Talk presented at International Conference on Cognitive Development 2010. Allahabad, India. 2010-12-10 - 2010-12-13.

    Abstract

    Eye movements made by listeners during language-mediated visual search reveal a strong link between visual processing and conceptual processing. For example, upon hearing the word for a missing referent with a characteristic colour (e.g., “strawberry”), listeners tend to fixate a colour-matched distractor (e.g., a red plane) more than a colour-mismatched distractor (e.g., a yellow plane). We ask whether these shifts in visual attention are mediated by the retrieval of lexically stored colour labels. Do children who do not yet possess verbal labels for the colour attribute that spoken and viewed objects have in common exhibit language-mediated eye movements like those made by older children and adults? That is, do toddlers look at a red plane when hearing “strawberry”? We observed that 24-month-olds lacking colour-term knowledge nonetheless recognised the perceptual-conceptual commonality between named and seen objects. This indicates that language-mediated visual search need not depend on stored labels for concepts.
