The Directional Bias Helps Stochastic Gradient Descent to Generalize in Kernel Regression Models
Abstract
We study the Stochastic Gradient Descent (SGD) algorithm in nonparametric statistics, kernel regression in particular. The directional bias property of SGD, which is known in the linear regression setting, is generalized to kernel regression. More specifically, we prove that SGD with a moderate and annealing stepsize converges along the direction of the eigenvector that corresponds to the largest eigenvalue of the Gram matrix. In contrast, Gradient Descent (GD) with a moderate or small stepsize converges along the direction that corresponds to the smallest eigenvalue. These facts are referred to as the directional bias properties; they may help explain why an SGD-computed estimator has a potentially smaller generalization error than a GD-computed estimator. The application of our theory is demonstrated by simulation studies and a case study based on the Fashion-MNIST dataset.
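The GD side of the directional bias stated above is easy to see in a toy experiment: for the least-squares objective (1/2n)||y - Kc||², the residual r_t = y - Kc_t evolves as r_{t+1} = (I - (η/n)K²)r_t, so the error component along the largest-eigenvalue eigenvector of the Gram matrix K decays fastest and the component along the smallest-eigenvalue eigenvector decays slowest. The following minimal sketch illustrates only this GD property (not the paper's SGD result or its experiments); the RBF kernel, bandwidth, sample size, and stepsize are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy kernel regression: fit coefficients c of f(x) = sum_j c_j k(x, x_j)
# by minimizing (1/2n) ||y - K c||^2, where K is the Gram matrix.
n = 20
X = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(n)

# RBF Gram matrix (the bandwidth 0.3 is an arbitrary choice for this sketch)
K = np.exp(-((X - X.T) ** 2) / (2 * 0.3**2))

# Eigendecomposition of the Gram matrix (eigh returns ascending eigenvalues)
evals, evecs = np.linalg.eigh(K)
v_min, v_max = evecs[:, 0], evecs[:, -1]

# Full-batch GD with a stepsize below the stability threshold; the residual
# r_t = y - K c_t evolves as r_{t+1} = (I - (eta/n) K^2) r_t, so the
# component along v_max shrinks by a factor 0.5 per step while the
# component along v_min barely moves.
c = np.zeros(n)
eta = 0.5 * n / evals[-1] ** 2
for _ in range(300):
    c += (eta / n) * K @ (y - K @ c)

r = y - K @ c
a_min, a_max = abs(r @ v_min), abs(r @ v_max)
print(a_min, a_max)  # remaining residual aligns far more with v_min than v_max
```

The residual recursion, rather than the coefficient error, is tracked here because it avoids inverting the (typically ill-conditioned) Gram matrix while exhibiting the same per-eigendirection decay factors.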
Publication: arXiv e-prints
Pub Date: April 2022
arXiv: arXiv:2205.00061
Bibcode: 2022arXiv220500061L
Keywords: Statistics - Machine Learning; Computer Science - Machine Learning
DOI: 10.1109/ISIT50566.2022.9834388