Institute for Theoretical Computer Science
Technische Universität Graz
A simple extension of standard neural network models is introduced that allows us to model neural computations involving both firing rates and firing correlations. Such an extension appears to be useful, since firing correlations have been shown to play a significant computational role in many biological neural systems, whereas standard neural network models can describe neural computations only in terms of firing rates.
The resulting extended neural network models remain relatively simple, so that their computational power can be analyzed theoretically. We prove rigorous separation results showing that the use of firing correlations in addition to firing rates can drastically increase the computational power of a neural network.
As a by-product, one of our separation results also settles an old conjecture concerning just standard neural network models: we prove that some feedforward high-order sigmoidal neural nets have strictly more computational power than any feedforward first-order sigmoidal neural net of the same size.
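To give intuition for the last claim, the following is a minimal illustrative sketch (not taken from the paper; the weight values are chosen here for illustration). A high-order sigmoidal unit augments the usual weighted sum with product terms such as x1*x2. A single such unit computes XOR, whereas no single first-order sigmoidal unit can, since XOR is not linearly separable; the script checks this empirically by brute-forcing a coarse grid of first-order weights.

```python
import itertools
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

INPUTS = [(0, 0), (0, 1), (1, 0), (1, 1)]
XOR = [0, 1, 1, 0]

def first_order(w0, w1, w2, x):
    # First-order sigmoidal unit: sigmoid of a weighted sum of the inputs.
    return sigmoid(w0 + w1 * x[0] + w2 * x[1])

def high_order(w0, w1, w2, w12, x):
    # High-order sigmoidal unit: adds a product (second-order) term x1*x2.
    return sigmoid(w0 + w1 * x[0] + w2 * x[1] + w12 * x[0] * x[1])

def computes_xor(outputs):
    # Threshold the sigmoid outputs at 1/2 and compare with XOR.
    return [1 if o > 0.5 else 0 for o in outputs] == XOR

# A single high-order unit computes XOR (hypothetical example weights):
ho_outputs = [high_order(-2.0, 4.0, 4.0, -8.0, x) for x in INPUTS]
print(computes_xor(ho_outputs))  # True

# Brute-force a coarse weight grid: no first-order unit computes XOR.
grid = [w / 2.0 for w in range(-20, 21)]
found = any(
    computes_xor([first_order(w0, w1, w2, x) for x in INPUTS])
    for w0, w1, w2 in itertools.product(grid, repeat=3)
)
print(found)  # False
```

This toy example only shows why product terms add expressive power per unit; the separation proved in the paper concerns whole feedforward nets of the same size and is established rigorously, not by search.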