Algorithms and SQ Lower Bounds for PAC Learning One-Hidden-Layer ReLU Networks
Abstract
We study the problem of PAC learning one-hidden-layer ReLU networks with $k$ hidden units on $\mathbb{R}^d$ under Gaussian marginals in the presence of additive label noise. For the case of positive coefficients, we give the first polynomial-time algorithm for this learning problem for $k$ up to $\tilde{O}(\sqrt{\log d})$. Previously, no polynomial-time algorithm was known, even for $k = 3$. This answers an open question posed by~\cite{Kliv17}. Importantly, our algorithm does not require any assumptions about the rank of the weight matrix, and its complexity is independent of the matrix's condition number. On the negative side, for the more general task of PAC learning one-hidden-layer ReLU networks with arbitrary real coefficients, we prove a Statistical Query lower bound of $d^{\Omega(k)}$. Thus, we provide a separation between the two classes in terms of efficient learnability. Our upper and lower bounds are general, extending to broader families of activation functions.
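To make the learning setting concrete, here is a minimal sketch (not the paper's algorithm) of how data in this model is generated: a one-hidden-layer ReLU network with positive coefficients, inputs drawn from a standard Gaussian, and additive label noise. All parameter values (d, k, m, the noise level, the coefficient range) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative parameters (assumed, not from the paper):
# d = input dimension, k = number of hidden units, m = sample size.
rng = np.random.default_rng(0)
d, k, m = 50, 3, 10_000

# Ground-truth network f(x) = sum_i a_i * ReLU(<w_i, x>), with
# positive coefficients a_i > 0, matching the abstract's
# positive-coefficient setting. No rank or condition-number
# assumption is placed on the weight matrix W.
W = rng.normal(size=(k, d))
W /= np.linalg.norm(W, axis=1, keepdims=True)  # unit-norm rows
a = rng.uniform(0.5, 1.5, size=k)              # positive coefficients

def relu_net(X):
    """Evaluate the one-hidden-layer ReLU network on each row of X."""
    return np.maximum(W @ X.T, 0.0).T @ a

# Gaussian marginals: x ~ N(0, I_d); labels corrupted by additive noise.
X = rng.normal(size=(m, d))
y = relu_net(X) + 0.1 * rng.normal(size=m)

# PAC learning goal: given only (X, y), output a hypothesis h whose
# squared error E[(h(x) - f(x))^2] over the Gaussian is small.
```

The paper's positive result says this goal is achievable in polynomial time for $k$ up to $\tilde{O}(\sqrt{\log d})$; with arbitrary real coefficients $a_i$, the SQ lower bound of $d^{\Omega(k)}$ rules out comparably efficient statistical-query learners.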
Publication: arXiv e-prints
Pub Date: June 2020
arXiv: arXiv:2006.12476
Bibcode: 2020arXiv200612476D
Keywords: Computer Science - Machine Learning; Computer Science - Data Structures and Algorithms; Mathematics - Statistics Theory; Statistics - Machine Learning