Artificial Neural Networks generated by Low Discrepancy Sequences
Abstract
Artificial neural networks can be represented by paths. When these paths are generated as random walks on a dense network graph, we find that the resulting sparse networks allow for deterministic initialization and even weights with fixed sign. Such networks can be trained sparse from scratch, avoiding the expensive procedure of training a dense network and compressing it afterwards. Although the networks are sparse, their weights are accessed as contiguous blocks of memory. In addition, enumerating the paths using deterministic low discrepancy sequences, for example the Sobol' sequence, amounts to connecting the layers of neural units by progressive permutations, which naturally avoids bank conflicts in parallel computer hardware. We demonstrate that the artificial neural networks generated by low discrepancy sequences can achieve an accuracy within reach of their dense counterparts at a much lower computational complexity.
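To make the permutation idea concrete, here is a minimal sketch (not the authors' implementation) of how a one-dimensional low discrepancy sequence can define a deterministic permutation that connects two layers of units. It uses the base-2 van der Corput sequence (the radical inverse underlying the Sobol' sequence), which for a power-of-two layer width reduces to a bit-reversal permutation; the function names are illustrative only.

```python
def radical_inverse_base2(i, bits):
    """Reverse the lowest `bits` bits of i (base-2 van der Corput radical inverse)."""
    r = 0
    for _ in range(bits):
        r = (r << 1) | (i & 1)  # shift reversed bits in from the right
        i >>= 1
    return r

def low_discrepancy_permutation(n):
    """For n a power of two, enumerate the van der Corput sequence scaled to
    integers, which yields a permutation of 0..n-1 (bit reversal)."""
    bits = n.bit_length() - 1
    return [radical_inverse_base2(i, bits) for i in range(n)]

# Connect unit j of one layer to unit perm[j] of the next layer:
perm = low_discrepancy_permutation(8)
print(perm)  # [0, 4, 2, 6, 1, 5, 3, 7]
```

Because consecutive indices map to well-separated targets, such a deterministic stride pattern is the kind of structure that can avoid memory bank conflicts on parallel hardware, as the abstract describes.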
Publication: arXiv e-prints
Pub Date: March 2021
DOI: 10.48550/arXiv.2103.03543
arXiv: arXiv:2103.03543
Bibcode: 2021arXiv210303543K
Keywords: Computer Science - Machine Learning