A maximum principle argument for the uniform convergence of graph Laplacian regressors
Abstract
We study asymptotic consistency guarantees for a nonparametric regression problem with Laplacian regularization. In particular, we consider samples $(x_1, y_1), \dots, (x_n, y_n)$ from some distribution on the product $\mathcal{M} \times \mathbb{R}$, where $\mathcal{M}$ is an $m$-dimensional manifold embedded in $\mathbb{R}^d$. A geometric graph on the point cloud $\{x_1, \dots, x_n \}$ is constructed by connecting points that are within some specified distance $\varepsilon_n$. A suitable semilinear equation involving the resulting graph Laplacian is used to obtain a regressor for the observed values of $y$. We establish probabilistic error rates for the uniform difference between the regressor constructed from the observed data and the Bayes regressor (or trend) associated to the ground-truth distribution. We give the explicit dependence of the rates on the connectivity parameter $\varepsilon_n$, the regularization strength $\beta_n$, and the number of data points $n$. Our argument relies on a simple, yet powerful, maximum principle for the graph Laplacian. We also address a simple extension of the framework to a semi-supervised setting.
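The pipeline described in the abstract, building an $\varepsilon_n$-neighborhood graph on the point cloud, forming its graph Laplacian, and regularizing a fit to the observed $y$-values, can be sketched numerically. The snippet below is a minimal illustration, not the paper's estimator: it uses an unweighted $\varepsilon$-graph and the linear (ridge-type) special case of Laplacian regularization, solving $(I + \beta L)u = y$; all function names and parameter values are hypothetical choices for the sketch.

```python
import numpy as np

def epsilon_graph_laplacian(x, eps):
    """Unweighted epsilon-neighborhood graph Laplacian L = D - W."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    W = ((d <= eps) & (d > 0)).astype(float)   # connect points within distance eps
    return np.diag(W.sum(axis=1)) - W

def laplacian_regressor(L, y, beta):
    """Minimize ||u - y||^2 + beta * u @ L @ u; the first-order
    condition is the linear system (I + beta * L) u = y."""
    return np.linalg.solve(np.eye(len(y)) + beta * L, y)

# Toy data: a point cloud on the unit circle (a 1-dimensional manifold
# embedded in R^2) with a noisy trend; eps and beta are illustrative.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 200)
x = np.stack([np.cos(t), np.sin(t)], axis=1)
y = np.cos(t) + 0.3 * rng.normal(size=200)   # observed values: trend + noise

L = epsilon_graph_laplacian(x, eps=0.3)
u = laplacian_regressor(L, y, beta=0.5)      # the smoothed regressor
```

In the paper's asymptotic regime, $\varepsilon_n$ and $\beta_n$ vary jointly with $n$; the sketch above fixes both for a single sample.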
 Publication:

arXiv e-prints
 Pub Date:
 January 2019
 arXiv:
 arXiv:1901.10089
 Bibcode:
 2019arXiv190110089G
 Keywords:

 Statistics - Machine Learning;
 Computer Science - Machine Learning;
 Mathematics - Analysis of PDEs;
 Mathematics - Statistics Theory