Associative content-addressable networks with exponentially many robust stable states
Abstract
The brain must robustly store a large number of memories, corresponding to the many events encountered over a lifetime. However, the number of memory states in existing neural network models either grows weakly with network size or recall fails catastrophically with vanishingly little noise. We construct an associative content-addressable memory with exponentially many stable states and robust error correction. The network possesses expander graph connectivity on a restricted Boltzmann machine architecture. The expansion property allows simple neural network dynamics to perform on par with modern error-correcting codes. Appropriate networks can be constructed with sparse random connections, glomerular nodes, and associative learning using low dynamic-range weights. Thus, sparse quasi-random structures---characteristic of important error-correcting codes---may provide for high-performance computation in artificial neural networks and the brain.
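The abstract does not spell out the network dynamics, but the "simple neural network dynamics" it compares to error-correcting codes are in the spirit of bit-flip decoding on a sparse bipartite (expander) graph. The sketch below illustrates that idea on the small (7,4) Hamming parity-check matrix; the matrix, the function name `bit_flip_decode`, and the zero-temperature flip rule are assumptions for illustration, not the paper's construction.

```python
# A minimal sketch of bit-flip decoding on a sparse parity-check graph.
# The (7,4) Hamming matrix stands in for the expander graphs the paper
# uses; real expander codes are much larger and sparser.
import numpy as np

H = np.array([
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])

def bit_flip_decode(H, word, max_iters=10):
    """Repeatedly flip the bit involved in the most unsatisfied checks."""
    word = word.copy()
    for _ in range(max_iters):
        syndrome = H @ word % 2       # 1 where a parity check fails
        if not syndrome.any():
            break                     # all checks satisfied: done
        # for each bit, count the failing checks it participates in
        counts = syndrome @ H
        word[np.argmax(counts)] ^= 1  # flip the most-implicated bit
    return word

# A single flipped bit in the all-zeros codeword is corrected:
noisy = np.zeros(7, dtype=int)
noisy[6] = 1
print(bit_flip_decode(H, noisy))  # -> [0 0 0 0 0 0 0]
```

Each check node plays the role of a constraint (glomerular) unit and each variable node a neuron; with expander connectivity, this greedy local rule provably corrects a constant fraction of errors, which is the property the abstract attributes to the network's attractor dynamics.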
- Publication: arXiv e-prints
- Pub Date: April 2017
- DOI: 10.48550/arXiv.1704.02019
- arXiv: arXiv:1704.02019
- Bibcode: 2017arXiv170402019C
- Keywords: Quantitative Biology - Neurons and Cognition; Computer Science - Neural and Evolutionary Computing
- E-Print: 42 pages, 8 figures