Learning subgaussian classes: Upper and minimax bounds
Abstract
We obtain sharp oracle inequalities for the empirical risk minimization procedure in the regression model under the assumption that the target Y and the model F are subgaussian. The bound we obtain is sharp in the minimax sense if F is convex. Moreover, under mild assumptions on F, the error rate of ERM remains optimal even if the procedure is allowed to perform with constant probability. A part of our analysis is a new proof of minimax results for the gaussian regression model.
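As an illustrative sketch (not taken from the paper), the empirical risk minimization procedure studied here selects, given a sample $(X_i, Y_i)_{i=1}^n$, the function in the class $F$ minimizing the empirical squared risk. The example below uses a gaussian regression model and the convex class of linear functions, for which ERM reduces to ordinary least squares; the data, dimensions, and target vector are hypothetical.

```python
import numpy as np

# Hypothetical setup: gaussian regression model Y = <w*, X> + noise,
# with F the (convex) class of linear functions f_w(x) = <w, x>.
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))            # gaussian (hence subgaussian) design
w_star = np.array([1.0, -0.5, 0.3, 0.0, 2.0])
Y = X @ w_star + rng.normal(size=n)    # target Y is subgaussian

def empirical_risk(w):
    """Empirical squared risk (1/n) * sum_i (f_w(X_i) - Y_i)^2."""
    return float(np.mean((X @ w - Y) ** 2))

# ERM over the class of all linear functions is ordinary least squares.
w_erm, *_ = np.linalg.lstsq(X, Y, rcond=None)

# By construction, no other member of F has smaller empirical risk.
print(empirical_risk(w_erm) <= empirical_risk(w_star))
```

The oracle inequalities in the paper control how close the *true* (population) risk of this minimizer comes to the best achievable risk over $F$; the sketch only exhibits the procedure itself.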
Publication: arXiv e-prints
Pub Date: May 2013
arXiv: arXiv:1305.4825
Bibcode: 2013arXiv1305.4825L
Keywords: Mathematics - Statistics Theory
E-Print: learning theory, empirical processes, minimax rates; in Topics in Learning Theory, Société Mathématique de France (S. Boucheron and N. Vayatis, Eds.), 2016