
An informative upper bound


The assumption that successive ISIs are independent (i.e., that the spike train is a renewal process) leads to an exact expression for the mutual information (rather than the upper bound provided by the reconstruction method), subject only to error in estimating the ISI distribution. Here we review the well-known result that a Poisson process (the special case in which the ISI distribution is exponential) yields the maximum-entropy spike train, and give the simple closed-form expression for the entropy in this case.

The upper bound on the possible information transmitted in this model is straightforward to calculate [MacKay and McCulloch, 1952]. The output is a binary string--we have disallowed the possibility of multiple spikes per bin. If the conditional entropy is zero (i.e., if there is no noise whatsoever), then all of the entropy is information, and the upper bound on the entropy is equal to the upper bound $I_{\max}$ on the information.

The probability of observing a spike in a bin of length $\Delta t$ depends on the firing rate $R$ as $p = R\,\Delta t$, and the probability of not observing a spike is $1 - R\,\Delta t$. If spikes are independent--that is, if the probability of observing a spike in one bin does not depend on whether there was a spike in any neighboring bin, so that the spike train is a Poisson process--then the entropy/bin is $H = -\left[ R\Delta t \log_2 (R\Delta t) + (1 - R\Delta t) \log_2 (1 - R\Delta t) \right]$. At low firing rates, $R\Delta t \ll 1$, and $\log_2(1 - R\Delta t) \approx -R\Delta t \log_2 e$, so the entropy per bin is approximately $R\Delta t \log_2\!\frac{e}{R\Delta t}$. The entropy rate (entropy per time) is then the entropy per bin divided by the time per bin $\Delta t$, or
\begin{equation}
H' \;=\; R \log_2\!\frac{e}{R\,\Delta t} \quad \mbox{bits/s}.
\end{equation}
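The low-rate approximation above can be checked numerically against the exact binary-bin entropy; a minimal sketch (the firing rate and bin size below are illustrative values, not taken from the text):

```python
import math

def entropy_per_bin(R, dt):
    """Exact entropy (bits) of one binary bin with spike probability p = R*dt."""
    p = R * dt
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

def entropy_rate_approx(R, dt):
    """Low-rate approximation: H' = R * log2(e / (R*dt)), in bits per second."""
    return R * math.log2(math.e / (R * dt))

# Illustrative values: 10 spikes/s, 1 ms bins, so R*dt = 0.01 << 1.
R, dt = 10.0, 0.001
exact_rate = entropy_per_bin(R, dt) / dt   # exact entropy rate, bits/s
approx_rate = entropy_rate_approx(R, dt)   # closed-form approximation
```

For these values the two rates agree to better than one percent, as expected when $R\Delta t \ll 1$.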
This upper bound on the information encoded in a discretized spike train is achieved if (1) there is no noise; (2) spikes are independent; and (3) the spike rate is low compared with the inverse bin size, $R \ll 1/\Delta t$. It shows that the information rate increases almost linearly with the firing rate, but the information per spike, $\log_2\!\frac{e}{R\,\Delta t}$, decreases logarithmically.
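The logarithmic decrease in information per spike can be made concrete: doubling the rate $R$ reduces $\log_2\!\frac{e}{R\Delta t}$ by exactly one bit per spike, while the total rate $R \log_2\!\frac{e}{R\Delta t}$ still grows. A short sketch (rates and bin size are again illustrative):

```python
import math

def bits_per_spike(R, dt):
    """Information per spike in the low-rate limit: log2(e / (R*dt))."""
    return math.log2(math.e / (R * dt))

dt = 0.001  # 1 ms bins (illustrative)
# Doubling R costs exactly 1 bit/spike, since log2(e/(2R*dt)) = log2(e/(R*dt)) - 1,
# but the information *rate* R * bits_per_spike(R, dt) still increases.
low = bits_per_spike(10.0, dt)   # bits/spike at 10 spikes/s
high = bits_per_spike(20.0, dt)  # bits/spike at 20 spikes/s
```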



Tony Zador
Fri Nov 28 10:17:14 PST 1997