
Bialek: How do we characterize the neural coding of more naturalistic signals?

NEC Research Institute

We should all admit that experiments on neural coding, and neurobiology more generally, are usually done in a rather unnatural setting. Sensory signals are typically chosen from a small, discrete set, and these signals occur at well-defined times against an otherwise quiet background. There is seldom an attempt to give these signals any interesting dynamics, and individual stimuli are repeated many times. The problem is that if we present a reasonably natural stimulus, with complex time dependencies, we do not really know how to quantify the response of the neuron.

I will argue that information theory provides a productive framework for approaching this problem. In particular, information theory gives a sharp definition to one crucial question in the study of neural coding: how much of the detailed temporal structure of the spike train is carrying a signal, and how much is just noise which should be averaged away? Mathematically, the ``detailed temporal structure'' is quantified by the spike train entropy, the notion of ``carrying a signal'' is quantified by the information transmission rate, and the ``noise'' can be quantified by a conditional or noise entropy. I will discuss different techniques for measuring these quantities in real experiments, emphasizing the need to control systematic errors, and introduce a new and completely model-independent approach.

The bottom line is that in several systems we have clear evidence for information transmission exceeding half of the spike train entropy, even when the spike train is sampled at roughly millisecond resolution. These examples range from primary sensory neurons in invertebrates (the cricket cercal mechanoreceptors) and vertebrates (frog vibratory and auditory afferents) to a central neuron which integrates thousands of synaptic inputs and sits four synapses away from the sensory periphery (a motion-sensitive neuron in the fly visual system). In the frog auditory system, information rates are increased, and noise entropies are dramatically reduced, by shaping the ensemble of input signals to have the spectrum of naturally occurring frog calls. These informational results put quantitative force behind the ethologists' view that the brain is tuned to the structure of stimuli which actually occur in nature, and they have clear connections to theoretical approaches that focus on efficient coding and other information-theoretic optimization principles. Since this talk is scheduled for after dinner, I will close with some (hopefully not too philosophical) musings along these lines.
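To make the three quantities concrete, here is a minimal sketch, not taken from the talk, of one way the ``direct'' word-counting estimate of spike train entropy, noise entropy, and information rate can be set up. The function names, the toy data, and the particular binning choices (about 1 ms bins, non-overlapping words) are illustrative assumptions, not the speaker's actual analysis code.

import numpy as np

def word_entropy(words):
    # Entropy (bits) of the empirical distribution over binary spike "words".
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_rate(spikes, word_len, dt):
    # spikes   : (n_trials, n_bins) 0/1 array, repeated presentations of the
    #            same stimulus, binned at temporal resolution dt (seconds).
    # word_len : number of bins per word.
    # Returns (total entropy rate, noise entropy rate, information rate), bits/s.
    n_trials, n_bins = spikes.shape
    n_words = n_bins // word_len
    # Cut each trial into non-overlapping words of word_len bins.
    words = spikes[:, :n_words * word_len].reshape(n_trials, n_words, word_len)

    # Total entropy: word distribution pooled over trials and times,
    # i.e. the "detailed temporal structure" of the spike train.
    s_total = word_entropy(words.reshape(-1, word_len))

    # Noise entropy: variability across repeats at a fixed time in the
    # stimulus, averaged over times (the part not locked to the signal).
    s_noise = np.mean([word_entropy(words[:, t, :]) for t in range(n_words)])

    t_word = word_len * dt
    return s_total / t_word, s_noise / t_word, (s_total - s_noise) / t_word

# Purely synthetic usage: 50 repeats, 2 s at 1 ms resolution.
rng = np.random.default_rng(0)
spikes = (rng.random((50, 2000)) < 0.05).astype(int)
print(information_rate(spikes, word_len=10, dt=0.001))

In this picture, the statement that information transmission exceeds half of the spike train entropy is just (S_total - S_noise) / S_total > 1/2; in practice one must also extrapolate in data-set size and word length to control the systematic errors emphasized above.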


