Bitwise Neural Networks
Abstract
Based on the assumption that there exists a neural network that efficiently represents a set of Boolean functions between all binary inputs and outputs, we propose a process for developing and deploying neural networks whose weight parameters, bias terms, input, and intermediate hidden layer output signals are all binary-valued, and require only basic bit logic for the feedforward pass. The proposed Bitwise Neural Network (BNN) is especially suitable for resource-constrained environments, since it replaces either floating- or fixed-point arithmetic with significantly more efficient bitwise operations. Hence, the BNN requires less spatial complexity, less memory bandwidth, and less power consumption in hardware. In order to design such networks, we propose to add a few training schemes, such as weight compression and noisy backpropagation, which result in a bitwise network that performs almost as well as its corresponding real-valued network. We test the proposed network on the MNIST dataset, represented using binary features, and show that BNNs result in competitive performance while offering dramatic computational savings.
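The core idea, replacing multiply-accumulate with bit logic, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes ±1 values are packed into integer bit fields (bit 1 for +1, bit 0 for -1), so that the ±1 dot product of an input and a weight vector reduces to an XNOR followed by a popcount. All function and variable names here are illustrative.

```python
def popcount(x: int) -> int:
    """Count set bits in a non-negative integer."""
    return bin(x).count("1")

def bitwise_neuron(x_bits: int, w_bits: int, n: int, b: int = 0) -> int:
    """One binary neuron over n packed +/-1 inputs.

    XNOR marks positions where input and weight agree; each agreement
    contributes +1 to the dot product and each disagreement -1, so the
    +/-1 dot product equals 2*matches - n. The binary activation is
    the sign of the pre-activation (bias b assumed integer-valued).
    """
    mask = (1 << n) - 1
    matches = popcount(~(x_bits ^ w_bits) & mask)
    pre_activation = 2 * matches - n + b
    return 1 if pre_activation >= 0 else -1

# Example: x = (+1, -1, +1), w = (+1, +1, +1) -> dot product = 1 -> +1
print(bitwise_neuron(0b101, 0b111, 3))
```

Because the whole layer reduces to XNOR and popcount, a 64-input neuron costs a handful of word-level bit operations on typical hardware instead of 64 multiply-accumulates, which is the source of the savings the abstract describes.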
 Publication:

arXiv e-prints
 Pub Date:
 January 2016
 arXiv:
 arXiv:1601.06071
 Bibcode:
 2016arXiv160106071K
 Keywords:

 Computer Science - Machine Learning;
 Computer Science - Artificial Intelligence;
 Computer Science - Neural and Evolutionary Computing
 E-Print:
 This paper was presented at the International Conference on Machine Learning (ICML) Workshop on Resource-Efficient Machine Learning, Lille, France, Jul. 6-11, 2015