Amit Manwani and Christof Koch
Computation and Neural Systems Program, Caltech
Synapses are indispensable components of neuronal communication, yet little quantitative characterization of synapses as communication devices has been carried out so far. This issue is particularly relevant to cortical synapses, which are known to be unreliable. Tools from information and communication theory can be used to understand how reliably synapses convey information between cortical neurons. This poster documents our first attempts at studying synaptic transmission from an information-theoretic perspective. We compute lower bounds on the capacity of a simple model of a cortical synapse under two explicit coding paradigms. In the first scenario (``signal estimation''), we assume that the signal is encoded in the mean firing rate of a Poisson neuron. Tools from statistical estimation theory are used to compute the best possible linear estimate of the signal; the quality of this estimate provides a lower bound on the synaptic capacity for signal estimation. In the second scenario (``signal detection''), we assume that spikes carry information in a binary format and that the task of the postsynaptic neuron is to detect the presence or absence of a presynaptic spike. An optimal detector is derived using signal detection theory, and its performance provides a lower bound on the synaptic capacity for signal detection. We also show how both approaches generalize to the case of multiple synapses.
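The signal-estimation paradigm can be illustrated with a minimal numerical sketch. This is not the poster's actual model: the stimulus, firing-rate mapping, filter length, and all parameter values below are illustrative assumptions. A slowly varying signal modulates the rate of a Poisson spike train, and a finite-window least-squares filter (the optimal linear estimator in the mean-square sense for this window) reconstructs the signal from the spikes; the fraction of signal variance recovered is one common quality measure.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.001          # time bin (s); assumed value
n = 50_000          # number of bins (50 s of simulated data)
L = 100             # linear filter length in bins; assumed value

# slowly varying stimulus: white noise smoothed over ~100 ms
s = np.convolve(rng.standard_normal(n), np.ones(100) / 100, mode="same")

# nonnegative firing rate modulated by the stimulus (baseline 40 Hz; assumed)
rate = 40.0 * (1.0 + s / np.abs(s).max())
spikes = rng.poisson(rate * dt)   # Poisson spike counts per bin

# design matrix of lagged spike counts; drop the first L rows so that
# np.roll's wrap-around does not contaminate the fit
X = np.stack([np.roll(spikes, k) for k in range(L)], axis=1)[L:]
y = s[L:]

# least-squares (optimal linear) filter and the resulting estimate
h, *_ = np.linalg.lstsq(X, y, rcond=None)
s_hat = X @ h

# reconstruction quality: fraction of signal variance captured
mse = np.mean((y - s_hat) ** 2)
coding_fraction = 1.0 - mse / np.var(y)
```

In the estimation-theoretic framework, a reconstruction error of this kind translates into an effective signal-to-noise ratio and hence a lower bound on the information rate the synapse supports.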
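The signal-detection paradigm can likewise be sketched numerically. Again this is an illustrative toy model, not the poster's derivation: the release probability, EPSP amplitude, noise level, and threshold rule below are all assumptions. A presynaptic spike triggers vesicle release only probabilistically, the postsynaptic observation is the EPSP corrupted by Gaussian noise, and a simple amplitude threshold decides whether a spike occurred. The mutual information of the induced binary channel lower-bounds the synaptic capacity for signal detection.

```python
import numpy as np

rng = np.random.default_rng(1)

# assumed parameters (illustrative only)
p_rel = 0.5      # probability of vesicle release given a presynaptic spike
a = 1.0          # mean EPSP amplitude
sigma = 0.3      # additive Gaussian noise (same units as a)
n = 100_000      # number of trials

# spikes occur on half the trials; release is unreliable
spike = rng.random(n) < 0.5
release = spike & (rng.random(n) < p_rel)
obs = release * a + sigma * rng.standard_normal(n)

# threshold detector at a/2 (a simple, not provably optimal, rule here)
decide = obs > a / 2

p01 = np.mean(decide[~spike])    # false-alarm rate
p10 = np.mean(~decide[spike])    # miss rate (includes release failures)
p_err = 0.5 * (p01 + p10)        # mean error probability

# mutual information (bits/use) of the resulting binary channel
def h2(q):
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

py1 = 0.5 * (1 - p10) + 0.5 * p01         # P(detector says "spike")
info = h2(py1) - 0.5 * (h2(p01) + h2(p10))
```

Because release failures cap the hit rate at roughly the release probability, the channel is strongly asymmetric; pooling observations across several such synapses, as in the multi-synapse generalization, raises the attainable information.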