Deep Learning with Eigenvalue Decay Regularizer
Abstract
This paper extends our previous work on regularization of neural networks using Eigenvalue Decay by employing a soft approximation of the dominant eigenvalue, so that its derivatives with respect to the synaptic weights can be computed and back-propagation applied, which is a primary requirement for deep learning. Moreover, we extend our previous theoretical analysis to deep neural networks and multiclass classification problems. Our method is implemented as an additional regularizer in Keras, a modular neural networks library written in Python, and evaluated on the benchmark data sets Reuters Newswire Topics Classification, the IMDB database for binary sentiment classification, the MNIST database of handwritten digits, and the CIFAR-10 data set for image classification.
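The abstract describes the regularizer only at a high level. As a rough illustration of how an Eigenvalue-Decay-style penalty can be attached to a layer in Keras, the sketch below uses a few unrolled power-iteration steps as a differentiable surrogate for the dominant eigenvalue of W^T W. The class name `EigenvalueDecay`, the `coef` and `power_iterations` parameters, and the power-iteration surrogate are illustrative assumptions; the paper's actual soft approximation and its Keras integration may differ, and this sketch targets the modern `tf.keras` API rather than the 2016 Keras release.

```python
import tensorflow as tf
from tensorflow import keras


class EigenvalueDecay(keras.regularizers.Regularizer):
    """Penalizes a differentiable surrogate of the dominant eigenvalue of W^T W.

    Hypothetical sketch: not the paper's exact soft approximation.
    """

    def __init__(self, coef=1e-4, power_iterations=3):
        self.coef = coef                      # regularization strength (illustrative default)
        self.power_iterations = power_iterations

    def __call__(self, w):
        # W^T W is symmetric PSD; its largest eigenvalue is the squared spectral norm of W.
        wtw = tf.matmul(w, w, transpose_a=True)
        n = w.shape[-1]                       # kernel has a static shape inside a layer
        v = tf.ones((n, 1), dtype=w.dtype)
        # A few unrolled power-iteration steps give a smooth, back-propagatable
        # estimate of the dominant eigenvector.
        for _ in range(self.power_iterations):
            v = tf.matmul(wtw, v)
            v = v / (tf.norm(v) + 1e-12)
        # Rayleigh quotient with the (unit-norm) estimated eigenvector
        # approximates the dominant eigenvalue.
        dominant_eig = tf.reduce_sum(v * tf.matmul(wtw, v))
        return self.coef * dominant_eig

    def get_config(self):
        return {"coef": self.coef, "power_iterations": self.power_iterations}


# Usage: attach it to a layer's kernel like any built-in Keras regularizer.
layer = keras.layers.Dense(64, activation="relu",
                           kernel_regularizer=EigenvalueDecay(coef=1e-4))
```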
- Publication: arXiv e-prints
- Pub Date: April 2016
- DOI: 10.48550/arXiv.1604.06985
- arXiv: arXiv:1604.06985
- Bibcode: 2016arXiv160406985L
- Keywords: Computer Science - Machine Learning