Stochastic neural networks with the weighted Hebb rule
Abstract
Neural networks with synaptic weights constructed according to the weighted Hebb rule are studied in the presence of noise (finite temperature), when the number of stored patterns is finite. Although for arbitrary weights not all of the stored patterns are global minima, there exists a temperature range in which only the stored patterns are minima of the free energy. In particular, a detailed analysis reveals that in the presence of a single extra pattern stored with an appropriate weight in the synaptic rule, the temperature at which the spurious minima of the free energy are eliminated is significantly lower than for a similar network without this extra pattern. The convergence time of the network, together with the overlaps of the equilibria of the network with the stored patterns, can thereby be improved considerably.
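The weighted Hebb rule referred to in the abstract can be sketched as follows. This is a minimal illustration, not the paper's analysis: the network size, number of patterns, and per-pattern weights are arbitrary choices, and the dynamics shown is simple zero-temperature (noise-free) synchronous updating rather than the finite-temperature stochastic dynamics studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N, p = 100, 3                          # neurons, stored patterns (illustrative sizes)
xi = rng.choice([-1, 1], size=(p, N))  # random +/-1 patterns xi^mu
w = np.array([1.0, 1.0, 2.0])          # per-pattern weights w_mu (hypothetical choice)

# Weighted Hebb rule: J_ij = (1/N) * sum_mu w_mu * xi_i^mu * xi_j^mu, with J_ii = 0
J = (xi.T * w) @ xi / N
np.fill_diagonal(J, 0.0)

def update(s, J):
    """One synchronous zero-temperature update: s_i <- sign(sum_j J_ij s_j)."""
    return np.sign(J @ s)

# For small p/N each stored pattern should be (close to) a fixed point at T = 0:
# its overlap m_mu = (1/N) s . xi^mu stays near 1 after an update.
for mu in range(p):
    s = update(xi[mu], J)
    overlap = (s @ xi[mu]) / N
    print(f"pattern {mu}: overlap after one update = {overlap:.2f}")
```

At finite temperature one would instead flip each spin stochastically with probability governed by the local field and the noise level, which is where the temperature range discussed in the abstract becomes relevant.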
Publication: Physics Letters A
Pub Date: August 1994
DOI: 10.1016/0375-9601(94)90570-3
arXiv: arXiv:cond-mat/9308005
Bibcode: 1994PhLA..191..127M
Keywords: Condensed Matter; Nonlinear Sciences - Adaptation and Self-Organizing Systems
E-Print: 14 pages, OKHEP-93-004