A typical pyramidal neuron in the cortex receives synaptic input from many other neurons. We define the activity in each of these input neurons as the ``signal'', and the variability due to the unreliability of synaptic transmission as the ``noise''.
How much information does the output spike train provide about the input spike trains? More formally, what is the mutual information between the ensemble of input spike trains $X$ and the output spike train ensemble $Y$? We assume that both $X$ and $Y$ are completely specified by the activity (i.e. the precise list of spike times) in each spike train; that is, all the information in the spike trains can be represented by the list of spike times, and there is no extra information contained in properties such as spike height or width. Characteristics of the spike train such as the mean or instantaneous rate can be derived from this representation; if such a derived property turns out to be the relevant one, then this formulation can be specialized appropriately.
The mutual information is defined
[Shannon and Weaver, 1948] in terms of the entropy $H(X)$ of the
ensemble of input spike trains $X$, the entropy $H(Y)$ of the output spike
train ensemble $Y$, and their joint entropy $H(X,Y)$,
\begin{equation}
I(X;Y) = H(X) + H(Y) - H(X,Y).
\end{equation}
The entropies $H(X)$, $H(Y)$, and $H(X,Y)$ depend only on the probability distribution $P(X)$ of the input ensemble, the distribution $P(Y)$ of the output ensemble, and the joint distribution $P(X,Y)$, respectively.
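As a concrete numerical illustration (not part of the original text), the entropies and the mutual information can be computed directly from any discrete joint distribution; the function names and the toy two-by-two distribution below are hypothetical choices for the sketch:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution; zero entries contribute nothing."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with marginals taken from the joint P(X,Y)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)   # marginal P(X) (rows index X)
    py = joint.sum(axis=0)   # marginal P(Y) (columns index Y)
    return entropy(px) + entropy(py) - entropy(joint)

# Toy joint distribution over two binary variables (rows: X, columns: Y).
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(mutual_information(joint))  # roughly 0.278 bits
```

Here the marginals are uniform, so $H(X) = H(Y) = 1$ bit, while the correlation between $X$ and $Y$ makes $H(X,Y) < 2$ bits, leaving a positive mutual information.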
Note that since the joint distribution is symmetric, $P(X,Y) = P(Y,X)$, the mutual information is also symmetric, $I(X;Y) = I(Y;X)$. Note also that if the inputs and outputs are completely independent, then the mutual information is 0, since the joint entropy is just the sum of the individual entropies, $H(X,Y) = H(X) + H(Y)$. This is completely reasonable, since in this case the inputs provide no information about the outputs.
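Both properties can be checked numerically. The sketch below, using the definition $I(X;Y) = H(X) + H(Y) - H(X,Y)$ and hypothetical toy distributions, verifies that transposing the joint distribution leaves the mutual information unchanged, and that a product (independent) joint distribution gives zero information:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability entries are ignored."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a discrete joint distribution."""
    joint = np.asarray(joint, dtype=float)
    return entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint)

# Symmetry: I(X;Y) from P(X,Y) equals I(Y;X) from the transposed P(Y,X).
joint = np.array([[0.3, 0.2],
                  [0.1, 0.4]])
assert np.isclose(mutual_information(joint), mutual_information(joint.T))

# Independence: P(X,Y) = P(X)P(Y) implies H(X,Y) = H(X) + H(Y), so I = 0.
px, py = np.array([0.7, 0.3]), np.array([0.6, 0.4])
independent = np.outer(px, py)
assert np.isclose(mutual_information(independent), 0.0)
```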