Candice Frances

Publications

  • Gaspard III, J. C., Bauer, G. B., Mann, D. A., Boerner, K., Denum, L., Frances, C., & Reep, R. L. (2017). Detection of hydrodynamic stimuli by the postcranial body of Florida manatees (Trichechus manatus latirostris). Journal of Comparative Physiology A: Neuroethology, Sensory, Neural, and Behavioral Physiology, 203, 111-120. doi:10.1007/s00359-016-1142-8.

    Abstract

    Manatees live in shallow, frequently turbid waters. The sensory means by which they navigate in these conditions are unknown. Poor visual acuity, lack of echolocation, and modest chemosensation suggest that other modalities play an important role. Rich innervation of sensory hairs that cover the entire body and enlarged somatosensory areas of the brain suggest that tactile senses are good candidates. Previous tests of detection of underwater vibratory stimuli indicated that they use passive movement of the hairs to detect particle displacements in the vicinity of a micron or less for frequencies from 10 to 150 Hz. In the current study, hydrodynamic stimuli were created by a sinusoidally oscillating sphere that generated a dipole field at frequencies from 5 to 150 Hz. Go/no-go tests of manatee postcranial mechanoreception of hydrodynamic stimuli indicated excellent sensitivity, but about an order of magnitude less than the facial region. When the vibrissae were trimmed, detection thresholds were elevated, suggesting that the vibrissae were an important means by which detection occurred. Manatees were also highly accurate in two-choice directional discrimination: greater than 90% correct at all frequencies tested. We hypothesize that manatees utilize vibrissae as a three-dimensional array to detect and localize low-frequency hydrodynamic stimuli.
  • Tzekov, R., Quezada, A., Gautier, M., Biggins, D., Frances, C., Mouzon, B., Jamison, J., Mullan, M., & Crawford, F. (2014). Repetitive mild traumatic brain injury causes optic nerve and retinal damage in a mouse model. Journal of Neuropathology and Experimental Neurology, 73(4), 345-361. doi:10.1097/NEN.0000000000000059.

    Abstract

    There is increasing evidence that long-lasting morphologic and functional consequences can be present in the human visual system after repetitive mild traumatic brain injury (r-mTBI). The exact location and extent of the damage in this condition are not well understood. Using a recently developed mouse model of r-mTBI, we assessed the effects on the retina and optic nerve using histology and immunohistochemistry, electroretinography (ERG), and spectral-domain optical coherence tomography (SD-OCT) at 10 and 13 weeks after injury. Control mice received repetitive anesthesia alone (r-sham). We observed decreased optic nerve diameters and increased cellularity and areas of demyelination in optic nerves in r-mTBI versus r-sham mice. There were concomitant areas of decreased cellularity in the retinal ganglion cell layer and an approximately 67% decrease in brain-specific homeobox/POU domain protein 3A-positive retinal ganglion cells in retinal flat mounts. Furthermore, SD-OCT demonstrated a detectable thinning of the inner retina; ERG demonstrated a decrease in the amplitude of the photopic negative response without any change in a- or b-wave amplitude or timing. Thus, the ERG and SD-OCT data correlated well with changes detected by morphometric, histologic, and immunohistochemical methods, thereby supporting the use of these noninvasive methods in the assessment of visual function and morphology in clinical cases of mTBI.
  • Nomi, J. S., Frances, C., Nguyen, M. T., Bastidas, S., & Troup, L. J. (2013). Interaction of threat expressions and eye gaze: an event-related potential study. NeuroReport, 24, 813-817. doi:10.1097/WNR.0b013e3283647682.

    Abstract

    The current study examined the interaction of fearful, angry, happy, and neutral expressions with left, straight, and right eye gaze directions. Human participants viewed faces consisting of various expression and eye gaze combinations while event-related potential (ERP) data were collected. The results showed that angry expressions modulated the mean amplitude of the P1, whereas fearful and happy expressions modulated the mean amplitude of the N170. No influence of eye gaze on mean amplitudes for the P1 and N170 emerged. Fearful, angry, and happy expressions began to interact with eye gaze to influence mean amplitudes in the time window of 200–400 ms. The results suggest that early processing of expressions influences ERPs independently of eye gaze, whereas expression and gaze interact to influence later ERPs.
