Computational Models of Multisensory Processing

Hun Ki Lim, Leslie P. Keniston, M. Alex Meredith, Krzysztof J. Cios
Behavior and perception depend heavily on sensory information processed by the brain, and it is becoming increasingly clear that multisensory processing underlies a wide variety of behavioral and perceptual phenomena. As shown in Figure 2.1, multiple stimuli emanate from an environmental event such as a falling tree hitting the ground, a bird singing, or a lightning strike. Receptors that are sensitive to those physical energies (e.g., light/eyes, sound/ears) transduce those stimuli into neural responses that are relayed to the brain. When projections carrying different unisensory messages synapse onto an individual neuron in a convergent area (multisensory convergence), the recipient neuron can become multisensory. As a consequence, multisensory neurons respond to their combined inputs in a manner that is significantly different (i.e., multisensory integration/interaction; either enhancement or depression) from the response elicited by either input alone. We have modeled this processing using networks of biologically realistic neuron models, together with synaptic plasticity rules, to account for the full range of multisensory effects.
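To make the convergence-and-integration idea concrete, the minimal Python sketch below (purely illustrative, not the project's actual model) drives a single leaky integrate-and-fire neuron with two hypothetical unisensory inputs. The names visual_drive and auditory_drive, and all parameter values, are assumptions chosen for illustration; the enhancement index follows the standard comparison of the combined response with the strongest unisensory response.

def lif_spike_count(drive, t_sim=1.0, dt=1e-4, tau=0.02, v_thresh=1.0):
    """Spike count of a single leaky integrate-and-fire neuron (Euler method)
    driven by a constant input current `drive` for `t_sim` seconds.
    Illustrative parameters only."""
    v, spikes = 0.0, 0
    for _ in range(int(t_sim / dt)):
        v += dt / tau * (drive - v)       # membrane integration with leak
        if v >= v_thresh:                 # threshold crossing -> emit a spike
            spikes += 1
            v = 0.0                       # reset membrane potential
    return spikes

# Hypothetical synaptic drives from two converging unisensory pathways
visual_drive, auditory_drive = 1.2, 0.6

r_v  = lif_spike_count(visual_drive)                   # visual stimulus alone
r_a  = lif_spike_count(auditory_drive)                 # auditory stimulus alone
r_va = lif_spike_count(visual_drive + auditory_drive)  # combined (convergent) input

# Multisensory enhancement index: combined response relative to the
# strongest unisensory response, expressed as a percentage
enhancement = 100.0 * (r_va - max(r_v, r_a)) / max(r_v, r_a)
print(f"visual: {r_v}, auditory: {r_a}, combined: {r_va}, "
      f"enhancement: {enhancement:.0f}%")

In this toy setting the auditory drive alone is subthreshold, yet adding it to the visual drive raises the combined spike count well above the best unisensory response, illustrating enhancement; a weaker or inhibitory second input would instead produce depression.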
 
Partially supported by NIH grant 1R01NS064675-01.