On the Stability of Deep Networks
Abstract
In this work we study the properties of deep neural networks (DNNs) with random weights. We formally prove that these networks perform a distance-preserving embedding of the data. Based on this result, we draw conclusions about the required size of the training data and the structure of the networks. A longer version of this paper, with more results and details, can be found in (Giryes et al., 2015). In particular, in the longer version we formally prove that DNNs with random Gaussian weights perform a distance-preserving embedding of the data, with a special treatment for in-class and out-of-class data.
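The distance-preservation claim can be illustrated numerically. The sketch below (an illustration of the general idea, not the paper's construction; the width `m`, the 1/sqrt(m) weight scaling, and the use of a single ReLU layer are assumptions chosen for the demo) passes two points through one layer with i.i.d. Gaussian weights and compares their distance before and after:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_relu_layer(X, m):
    """One layer with i.i.d. Gaussian weights (variance 1/m) and ReLU.

    The 1/sqrt(m) scaling is an assumed normalization so that output
    norms stay bounded as the width m grows.
    """
    n = X.shape[1]
    W = rng.normal(0.0, 1.0 / np.sqrt(m), size=(n, m))
    return np.maximum(X @ W, 0.0)

# Two random points on the unit sphere in R^50 (hypothetical test data)
x = rng.normal(size=50); x /= np.linalg.norm(x)
y = rng.normal(size=50); y /= np.linalg.norm(y)

Z = random_relu_layer(np.stack([x, y]), m=4096)

d_in = np.linalg.norm(x - y)
d_out = np.linalg.norm(Z[0] - Z[1])
print(f"input distance:  {d_in:.3f}")
print(f"output distance: {d_out:.3f}")
print(f"ratio:           {d_out / d_in:.3f}")
```

With a wide layer, the output distance concentrates around a fixed, angle-dependent multiple of the input distance, which is the sense in which the random embedding preserves the metric up to a controlled distortion.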
 Publication:
 arXiv e-prints
 Pub Date:
 December 2014
 arXiv:
 arXiv:1412.5896
 Bibcode:
 2014arXiv1412.5896G
 Keywords:
 Statistics - Machine Learning;
 Computer Science - Information Theory;
 Computer Science - Machine Learning;
 Computer Science - Neural and Evolutionary Computing;
 Mathematics - Metric Geometry
 E-Print:
 4 pages