One approach to this dilemma [Bialek et al., 1991, Bialek et al., 1993] is
to compute a strict lower bound on the mutual information using the
*reconstruction method*. The idea is to ``decode'' the output and
use it to ``reconstruct'' the input that gave rise to it. The error
between the reconstructed and actual inputs is then a measure of the
fidelity of transmission and, with a few testable assumptions, can be
related to the information. Formally, this method is based on an
expression mathematically equivalent to eq. (5),
involving the conditional entropy of the signal given
the spike train:

    I(s; {t_i}) = H[s] - H[s | {t_i}]                        (6)

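The decomposition in eq. (6) can be checked numerically on a toy discrete example. The sketch below uses an illustrative binary joint distribution over a "signal" and a "spike count" (the particular probabilities are assumptions, not the model of the text), and verifies that the entropy difference H[s] - H[s|r] agrees with the direct sum-form definition of the mutual information.

```python
import math

# Toy joint distribution p(s, r) over a binary "signal" s and a
# binary "spike count" r -- illustrative values only.
p_joint = {
    (0, 0): 0.40, (0, 1): 0.10,
    (1, 0): 0.15, (1, 1): 0.35,
}

def H(probs):
    """Entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginals p(s) and p(r).
p_s = {s: sum(p for (si, _), p in p_joint.items() if si == s) for s in (0, 1)}
p_r = {r: sum(p for (_, ri), p in p_joint.items() if ri == r) for r in (0, 1)}

# Conditional entropy H[s | r] = sum_r p(r) * H[s | r].
H_s_given_r = sum(
    p_r[r] * H([p_joint[(s, r)] / p_r[r] for s in (0, 1)])
    for r in (0, 1)
)

# Mutual information two ways; the identity in eq. (6) says they agree.
I_diff = H(p_s.values()) - H_s_given_r                # H[s] - H[s|r]
I_sum = sum(
    p * math.log2(p / (p_s[s] * p_r[r]))
    for (s, r), p in p_joint.items()
)
print(abs(I_diff - I_sum) < 1e-12)  # True
```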
In the present context, the quantity reconstructed is the summed
synaptic input. Its entropy is just the entropy of
that time series, and can be evaluated directly from
the Poisson synthesis equation (Eq. 3). Intuitively,
eq. (6) says that the information gained about
the spike train by observing the stimulus is just the initial
uncertainty about the synaptic drive (in the absence of knowledge of
the spike train) minus the uncertainty that remains about the signal
once the spike train is known. The reconstruction method estimates the
input from the output, and then bounds the entropy of the reconstruction
errors from above by assuming the errors are Gaussian. This method, which can provide a
lower bound on the mutual information, has been used with much success
in a variety of experimental preparations
[de Ruyter van Steveninck and Bialek, 1988, Bialek et al., 1991, Rieke et al., 1997, de Ruyter van Steveninck and Laughlin, 1996].
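In practice (e.g., as described in Rieke et al., 1997), the Gaussian assumption turns the bound into a spectral formula: the lower-bound information rate can be written in terms of the stimulus-reconstruction coherence as -∫ df log2(1 - γ²(f)). The sketch below illustrates this on synthetic data; the smoothing kernel, noise level, and sampling rate are all illustrative assumptions standing in for a real decoder and spike train.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 1000.0                  # sampling rate in Hz (illustrative)
n = 2 ** 16
stimulus = rng.standard_normal(n)

# Stand-in "reconstruction": the stimulus passed through a smoothing
# filter plus additive noise, mimicking an imperfect decoding.
kernel = np.exp(-np.arange(50) / 10.0)
kernel /= kernel.sum()
recon = np.convolve(stimulus, kernel, mode="same")
recon += 0.5 * rng.standard_normal(n)

# Magnitude-squared coherence between stimulus and reconstruction.
f, gamma2 = coherence(stimulus, recon, fs=fs, nperseg=1024)

# Gaussian lower bound on the information rate (bits/s):
#   I_LB = -integral df log2(1 - gamma^2(f)),
# approximated here by a simple Riemann sum over the frequency grid.
rate = -np.sum(np.log2(1.0 - gamma2)) * (f[1] - f[0])
print(f"lower-bound information rate: {rate:.1f} bits/s")
```

Because the coherence is estimated by averaging over many short segments, γ²(f) stays strictly below 1 and the integrand remains finite.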

Fri Nov 28 10:17:14 PST 1997