We have estimated the mutual information between the synaptic drive and the resulting output spike train in a model neuron. We have adopted a framework in which the time at which individual spikes occur carries information about the input. In this formulation, the exact sequence of action potentials arriving at each of the presynaptic terminals is the ``signal'', and the ``noise'' is any variability in the response to repeated trials on which precisely the same sequence is presented. We found that the information was a smooth function of both synaptic reliability and connection redundancy: no sharp transition was observed from an ``unreliable'' to a ``reliable'' mode. However, connection redundancy can only compensate for synaptic unreliability under the assumption that the fine temporal structure of individual spikes carries information. If only the number of spikes in some relatively long time window carries information (a ``mean rate'' code), an increase in the fidelity of synaptic transmission results in a seemingly paradoxical decrease in the information available in the spike train.
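The interplay between reliability and redundancy described above can be illustrated with a toy calculation (not the model analyzed in this paper): a single binary input spike drives $n$ redundant synapses, each releasing independently with probability $p$, and the output fires if at least one synapse releases. All names and the specific model are illustrative assumptions; the sketch merely shows that mutual information rises smoothly with either reliability or redundancy, with no sharp transition.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(q * log2(q) for q in probs if q > 0)

def mutual_info(p_release, n_synapses, p_spike=0.5):
    """I(X;Y) for a toy model (an illustrative assumption, not the
    paper's model): input spike X occurs with probability p_spike and
    drives n_synapses redundant synapses, each releasing independently
    with probability p_release; output Y = 1 iff at least one releases.
    With no spontaneous release, Y = 1 only when X = 1."""
    p_out_given_spike = 1 - (1 - p_release) ** n_synapses
    p_y1 = p_spike * p_out_given_spike
    h_y = entropy([p_y1, 1 - p_y1])
    # H(Y|X): output is deterministic (Y=0) when X=0, noisy when X=1.
    h_y_given_x = p_spike * entropy([p_out_given_spike,
                                     1 - p_out_given_spike])
    return h_y - h_y_given_x  # I(X;Y) = H(Y) - H(Y|X), in bits

# Redundancy compensates for unreliability: five synapses at p = 0.3
# transmit more information about the input spike than one.
single = mutual_info(0.3, 1)
redundant = mutual_info(0.3, 5)
```

In this sketch, sweeping `p_release` or `n_synapses` produces a smooth increase in $I(X;Y)$ toward the 1-bit ceiling set by the input entropy, consistent with the absence of any abrupt "unreliable" to "reliable" transition reported above.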