Evaluating complexity and resilience trade-offs in emerging memory inference machines
Abstract
Neuromorphic-style inference works well only when limited hardware resources are used to their fullest, i.e. when accuracy continues to scale with parameters and network complexity even in the presence of disturbance. In this work, we use realistic crossbar simulations to show that compact implementations of deep neural networks are unexpectedly susceptible to collapse under multiple system disturbances. We propose a middle path toward high performance and strong resilience using the Mosaics framework, specifically by re-using synaptic connections in a recurrent neural network implementation that possesses a natural form of noise immunity.
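The kind of disturbance study described above can be illustrated with a minimal sketch: perturb stored weights with multiplicative noise, emulating device-level conductance variation in a crossbar, and measure how classification accuracy degrades. This is a generic illustration with invented data and noise levels, not the paper's actual simulation setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_weights(w, sigma):
    """Apply multiplicative Gaussian noise to weights, a common stand-in
    for device-to-device conductance variation in crossbar arrays."""
    return w * (1.0 + sigma * rng.standard_normal(w.shape))

# Toy linearly separable task; the generating vector doubles as the
# "trained" weights (illustration only).
X = rng.standard_normal((200, 8))
true_w = rng.standard_normal(8)
y = (X @ true_w > 0).astype(int)

def accuracy(w):
    """Fraction of points classified the same as the clean model."""
    return np.mean((X @ w > 0).astype(int) == y)

for sigma in (0.0, 0.1, 0.5):
    accs = [accuracy(perturb_weights(true_w, sigma)) for _ in range(100)]
    print(f"sigma={sigma}: mean accuracy {np.mean(accs):.3f}")
```

At `sigma=0` the clean accuracy is recovered exactly; increasing `sigma` shifts decision boundaries and accuracy falls off, which is the kind of degradation curve such crossbar studies sweep over.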
- Publication:
- arXiv e-prints
- Pub Date:
- February 2020
- DOI:
- 10.48550/arXiv.2003.10396
- arXiv:
- arXiv:2003.10396
- Bibcode:
- 2020arXiv200310396B
- Keywords:
- Computer Science - Neural and Evolutionary Computing;
- Computer Science - Machine Learning;
- Statistics - Machine Learning