Eigen-convergence of Gaussian kernelized graph Laplacian by manifold heat interpolation
Abstract
This work studies the spectral convergence of the graph Laplacian to the Laplace-Beltrami operator when the graph affinity matrix is constructed from $N$ random samples on a $d$-dimensional manifold embedded in a possibly high-dimensional space. By analyzing Dirichlet form convergence and constructing candidate approximate eigenfunctions via convolution with the manifold heat kernel, we prove that, with a Gaussian kernel, one can set the kernel bandwidth parameter $\epsilon \sim (\log N/ N)^{1/(d/2+2)}$ such that the eigenvalue convergence rate is $N^{-1/(d/2+2)}$ and the eigenvector convergence rate in 2-norm is $N^{-1/(d+4)}$; when $\epsilon \sim N^{-1/(d/2+3)}$, both eigenvalue and eigenvector rates are $N^{-1/(d/2+3)}$. These rates hold up to a $\log N$ factor and are proved for finitely many low-lying eigenvalues. The result applies to the unnormalized and random-walk graph Laplacians when data are uniformly sampled on the manifold, as well as to the density-corrected graph Laplacian (where the affinity matrix is normalized by the degree matrix from both sides) with non-uniformly sampled data. As an intermediate result, we prove new pointwise and Dirichlet form convergence rates for the density-corrected graph Laplacian. Numerical results are provided to verify the theory.
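The construction described in the abstract can be illustrated with a minimal numerical sketch: sample points uniformly from a simple manifold (here the unit circle, $d = 1$), build the Gaussian kernel affinity matrix, form the unnormalized graph Laplacian, and inspect its low-lying spectrum under the bandwidth scaling $\epsilon \sim (\log N/N)^{1/(d/2+2)}$. This is not the paper's experiment; the kernel normalization (the $4\epsilon$ in the exponent) and the overall rescaling of the Laplacian are common conventions assumed here, and they may differ from the paper's by multiplicative constants.

```python
import numpy as np

def low_lying_laplacian_spectrum(N=1000, seed=0):
    """Sketch: unnormalized graph Laplacian on N uniform samples from the
    unit circle (d = 1), using the bandwidth scaling
    eps ~ (log N / N)^{1/(d/2+2)} quoted in the abstract."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, N)
    X = np.column_stack([np.cos(theta), np.sin(theta)])  # samples in R^2

    d = 1
    eps = (np.log(N) / N) ** (1.0 / (d / 2 + 2))

    # Gaussian kernel affinity matrix W_ij = exp(-||x_i - x_j||^2 / (4 eps)).
    # The 4*eps convention is one common choice, assumed here.
    sq_dist = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dist / (4.0 * eps))
    D = np.diag(W.sum(axis=1))

    # Unnormalized graph Laplacian, rescaled so its low-lying spectrum tracks
    # the Laplace-Beltrami spectrum up to a multiplicative constant.
    L = (D - W) / (eps * N)
    return np.linalg.eigvalsh(L)[:5]
```

On the unit circle the Laplace-Beltrami eigenvalues are $0, 1, 1, 4, 4, \ldots$, so the computed low-lying spectrum should show a near-zero eigenvalue followed by pairs of nearly equal values, matching that multiplicity structure up to the unknown constant.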
 Publication:
  arXiv e-prints
 Pub Date:
 January 2021
 arXiv:
 arXiv:2101.09875
 Bibcode:
 2021arXiv210109875C
 Keywords:
  Mathematics - Statistics Theory;
  Computer Science - Machine Learning;
  Statistics - Machine Learning