Computation and Neural Systems Program, Caltech
The stochastic nature of single neurons
The goal of our research is a systematic investigation of information processing at the distinct biophysical stages within a neuron. We decompose the neuron into separate stages along the pathway from the synapse to the axon hillock and compute the information capacity of each stage separately. To do so, it is essential to characterize the sources of noise that distort the synaptic signal at each stage. The modules we consider are the synapse, the dendritic tree and the spike-initiation zone.
Synaptic transmission at cortical synapses is known to be both highly unreliable and variable. We model the synapse as a cascade of two channels: a binary channel, which abstracts the probabilistic release of neurotransmitter-filled vesicles (routinely observed at central synapses), and a random-amplitude semi-continuous channel, which models the trial-to-trial variability of excitatory postsynaptic currents (EPSCs), also a very common property of synapses in the brain. Both signal estimation and signal detection theory are used to compute lower bounds on the capacity of this simple model of a central synapse. We find that, for biologically plausible parameter values, single synapses transmit information poorly, but a small amount of redundancy, in the form of multiple parallel synapses between neurons, is sufficient for a significant improvement in performance.
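The effect of probabilistic release and of parallel redundant synapses can be illustrated with a minimal sketch. Treating release as a Z-channel (a presynaptic spike triggers release only with probability s, while silence is transmitted noiselessly) and ignoring amplitude variability, the capacity can be computed by a grid search over the input distribution; the release probability s = 0.3 and the channel model itself are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

def z_channel_capacity(s, n_grid=1000):
    """Capacity in bits of a Z-channel: a spike (X=1) yields release
    (Y=1) with probability s; X=0 is transmitted noiselessly.
    Maximizes I(X;Y) over the input spike probability q = P(X=1)."""
    def h(p):
        # binary entropy, clipped to stay finite at p = 0 or 1
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    q = np.linspace(1e-6, 1 - 1e-6, n_grid)
    # For the Z-channel: I(X;Y) = H(Y) - H(Y|X) = h(q*s) - q*h(s)
    mutual_info = h(q * s) - q * h(s)
    return mutual_info.max()

# Redundancy via n independent parallel release sites: at least one
# vesicle is released with probability 1 - (1 - s)^n.
s = 0.3  # assumed single-site release probability
for n in [1, 2, 4, 8]:
    s_eff = 1 - (1 - s) ** n
    print(f"n = {n}: capacity = {z_channel_capacity(s_eff):.3f} bits")
```

Even this toy calculation reproduces the qualitative conclusion above: a single unreliable site transmits well under one bit per spike, while a handful of parallel sites drives the effective release probability, and hence the capacity, sharply upward.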
Information is further lost as the neural signal propagates electrotonically along a weakly active dendrite, owing to neuronal noise sources distributed along its length. We model the dendrite as a passive linear cable with distributed current noise sources. The noise sources we consider are thermal noise, channel noise arising from the stochastic gating of voltage-dependent ionic channels, and synaptic noise due to spontaneous background activity. We characterize these noise sources by analytical expressions for their power spectral densities. Cable theory allows us to derive the magnitudes of the signal and noise contributions to the membrane voltage at different distances from the synaptic location, which in turn yields bounds on the capacity of this simplified model of the dendritic channel under the same coding paradigms discussed above. For our choice of parameters, we find that synaptic background activity is the dominant source of membrane noise and that the maximum dendritic length over which information can be reliably transmitted is limited by signal-to-noise considerations.
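The signal-to-noise limit on dendritic length can be sketched with a deliberately simplified calculation: in an infinite passive cable the steady-state voltage decays as exp(-x/lambda), so comparing the attenuated synaptic signal against a fixed membrane-noise floor gives a maximum reliable distance. All numerical values below (length constant, EPSP amplitude, noise level) are illustrative assumptions, not the parameters of the study.

```python
import numpy as np

lam = 500.0        # cable length constant lambda, in um (assumed)
v0 = 2.0           # EPSP peak amplitude at the synapse, in mV (assumed)
sigma_noise = 0.2  # RMS membrane-voltage noise, in mV (assumed; the
                   # dominant contribution is synaptic background activity)

def snr(x):
    """Amplitude signal-to-noise ratio at distance x (um) from the
    synapse, using the steady-state attenuation exp(-x/lambda) of an
    infinite passive cable."""
    return v0 * np.exp(-x / lam) / sigma_noise

# Distance at which the SNR falls to 1: x_max = lambda * ln(v0 / sigma)
x_max = lam * np.log(v0 / sigma_noise)
print(f"SNR at synapse: {snr(0):.1f}")
print(f"maximum reliable distance: {x_max:.1f} um")  # ~1151.3 um here
```

This neglects the frequency dependence of cable attenuation and the spatial distribution of the noise sources, both of which the full power-spectral-density treatment accounts for, but it captures why the transmissible dendritic length is set by signal-to-noise considerations rather than by the cable's physical extent.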
This modular, reductionist approach to the problem of neural coding allows us to create a quantitative picture of the properties of neuronal noise sources and their effect on the information capacity of individual cortical neurons. We are presently conducting experiments to measure noise levels in cortical pyramidal cells and incorporating model noise sources into compartmental models with realistic neuronal geometries.