
Information rate of spike trains

A typical pyramidal neuron in the cortex receives synaptic input from on the order of $10^4$ other neurons. We define the activity in each of these input neurons as the ``signal'', and the variability due to the unreliability of synaptic transmission as the ``noise''.

How much information does the output spike train $y(t)$ provide about the input spike trains $x_i(t)$? More formally, what is the mutual information $I(X;Y)$ between the ensemble of input spike trains $X$ and the ensemble of output spike trains $Y$? We assume that both $X$ and $Y$ are completely specified by the activity (i.e. the precise list of spike times) in each spike train; that is, all the information in a spike train can be represented by its list of spike times, and no extra information is contained in properties such as spike height or width. Characteristics of the spike train such as the mean or instantaneous firing rate can be derived from this representation; if such a derived property turns out to be the relevant one, then this formulation can be specialized appropriately.
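A minimal sketch of this representation: a spike train reduces to a sorted list of spike times, from which derived properties such as the mean and instantaneous rate follow directly. The spike times below are hypothetical, chosen only for illustration.

```python
import numpy as np

# A spike train represented purely as a list of spike times (seconds).
# These example times are hypothetical.
spike_times = np.array([0.012, 0.145, 0.301, 0.476, 0.512, 0.890])
duration = 1.0  # observation window in seconds

# Mean firing rate: total spike count divided by the window length.
mean_rate = len(spike_times) / duration  # spikes per second

# Instantaneous rate: spike counts in small bins, divided by bin width.
bin_width = 0.1
edges = np.arange(0.0, duration + bin_width, bin_width)
counts, _ = np.histogram(spike_times, bins=edges)
inst_rate = counts / bin_width  # spikes per second in each bin
```

Any statistic of interest (interspike intervals, binned counts, rates) is a function of the list of times, which is why the mutual information can be formulated on the spike times themselves.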

The mutual information $I(X;Y)$ is defined [Shannon and Weaver, 1948] in terms of the entropy $H(X)$ of the ensemble of input spike trains, the entropy $H(Y)$ of the ensemble of output spike trains, and their joint entropy $H(X,Y)$:

  I(X;Y) = H(X) + H(Y) - H(X,Y).

The entropies $H(X)$, $H(Y)$ and $H(X,Y)$ depend only on the probability distributions $P(x)$, $P(y)$, and the joint distribution $P(x,y)$, respectively.
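This definition can be sketched numerically. The joint distribution below is a hypothetical example over two binary variables, not data from the model; the entropies are computed in bits.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability distribution (zero entries ignored)."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution P(x, y) over two binary variables.
P_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

P_x = P_xy.sum(axis=1)  # marginal P(x)
P_y = P_xy.sum(axis=0)  # marginal P(y)

# Mutual information: I(X;Y) = H(X) + H(Y) - H(X,Y)
I = entropy(P_x) + entropy(P_y) - entropy(P_xy)
```

Here $H(X) = H(Y) = 1$ bit and $H(X,Y) \approx 1.722$ bits, so $I(X;Y) \approx 0.278$ bits: the correlation in the joint distribution is exactly what the mutual information measures.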

Note that since the joint distribution is symmetric, $P(x,y) = P(y,x)$, the mutual information is also symmetric, $I(X;Y) = I(Y;X)$. Note also that if the inputs $X$ and outputs $Y$ are completely independent, then the mutual information is 0, since the joint entropy is then just the sum of the individual entropies, $H(X,Y) = H(X) + H(Y)$. This is entirely reasonable, since in this case the inputs provide no information about the outputs.
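The independence property can be checked directly: if the joint distribution is the outer product of the marginals, the joint entropy equals the sum of the individual entropies and the mutual information vanishes. The marginals below are arbitrary illustrative choices.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability distribution (zero entries ignored)."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Independent case: the joint distribution factorizes, P(x,y) = P(x) P(y).
P_x = np.array([0.3, 0.7])
P_y = np.array([0.6, 0.4])
P_xy = np.outer(P_x, P_y)

# H(X,Y) = H(X) + H(Y), so the mutual information is 0
# (up to floating-point error).
I = entropy(P_x) + entropy(P_y) - entropy(P_xy)
```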



Tony Zador
Fri Nov 28 10:17:14 PST 1997