Learning and memory properties in fully connected networks
Abstract
This paper summarises recent results of theoretical analysis and numerical simulation in fully connected networks of the Little-Hopfield class. The theoretical analysis is based on methods of statistical mechanics as applied to spin-glass problems, and the numerical work involves massively parallel simulations on the ICL Distributed Array Processor (DAP). Specific applications include: (i) exact results for the fraction of nominal vectors which are perfectly stored by the usual Hebbian rule; (ii) a numerical estimate of the position of the second phase transition in the Hopfield model, at which there is effectively total loss of memory capacity; (iii) a numerical study of the nature of the spurious states in the model; (iv) an exploration of the performance of a learning algorithm, including the exact storage of up to 512 (random) nominal vectors in a 512-node model; (v) a theoretical study of the phase transitions in generalisations where the energy function is a monomial in the state vectors.
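The Hebbian storage rule referred to in (i) can be sketched as follows. This is a minimal illustration, not the authors' code: the network size, number of patterns, and the synchronous-update stability test are hypothetical choices made for the example, and "perfectly stored" is taken to mean that a nominal vector is a fixed point of the deterministic dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # number of nodes (illustrative size, not from the paper)
p = 5    # number of random nominal vectors to store

# Random +/-1 nominal vectors xi^mu, mu = 1..p
xi = rng.choice([-1, 1], size=(p, N))

# Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero diagonal
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def is_fixed_point(v):
    """A vector is perfectly stored if one synchronous update leaves it unchanged."""
    return np.array_equal(np.sign(J @ v), v)

stored = sum(is_fixed_point(v) for v in xi)
print(f"{stored} of {p} nominal vectors are perfectly stored")
```

At this low loading ratio (p/N = 0.05, well below the Hopfield capacity), essentially all nominal vectors are expected to be fixed points; the fraction that fail grows as p/N increases, which is the quantity studied exactly in (i).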
Publication: Neural Networks for Computing
Pub Date: August 1986
DOI: 10.1063/1.36221
Bibcode: 1986AIPC..151...65B
Keywords: 02.50.Ey; 05.20.-y; Stochastic processes; Classical statistical mechanics