Robust, randomized preconditioning for kernel ridge regression
Abstract
This paper introduces two randomized preconditioning techniques for robustly solving kernel ridge regression (KRR) problems with a medium to large number of data points ($10^4 \leq N \leq 10^7$). The first method, RPCholesky preconditioning, accurately solves the full-data KRR problem in $O(N^2)$ arithmetic operations, assuming sufficiently rapid polynomial decay of the kernel matrix eigenvalues. The second method, KRILL preconditioning, offers an accurate solution to a restricted version of the KRR problem involving $k \ll N$ selected data centers at a cost of $O((N + k^2) k \log k)$ operations. The proposed methods solve a broad range of KRR problems and overcome the failure modes of previous KRR preconditioners, making them ideal for practical applications.
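To make the setting concrete, the following is a minimal sketch of the general approach the abstract describes: solve the regularized system $(K + \lambda I)\alpha = y$ by preconditioned conjugate gradients, using a rank-$k$ randomly pivoted partial Cholesky factor $K \approx FF^\top$ to build the preconditioner $(FF^\top + \lambda I)^{-1}$ via the Woodbury identity. The toy data, kernel, rank, and regularization parameter below are illustrative assumptions, not values from the paper, and this sketch omits the paper's algorithmic refinements.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

rng = np.random.default_rng(0)

# Toy data and an RBF kernel matrix (illustrative choices, not from the paper)
N, k, lam = 500, 50, 1e-2
X = rng.standard_normal((N, 2))
y = rng.standard_normal(N)
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq)

# Randomly pivoted partial Cholesky: K ~= F @ F.T with rank k.
# Each pivot is sampled proportionally to the current diagonal residual.
F = np.zeros((N, k))
d = np.diag(K).copy()
for i in range(k):
    p = rng.choice(N, p=d / d.sum())
    g = K[:, p] - F[:, :i] @ F[p, :i]
    F[:, i] = g / np.sqrt(g[p])
    d = np.maximum(d - F[:, i] ** 2, 0.0)
    d[p] = 0.0  # this pivot is now fully resolved

# Preconditioner (F F^T + lam I)^{-1}, applied via the Woodbury identity
U = np.linalg.solve(F.T @ F + lam * np.eye(k), F.T)
M = LinearOperator((N, N), matvec=lambda v: (v - F @ (U @ v)) / lam)

# Preconditioned CG on the regularized system (K + lam I) alpha = y
A = LinearOperator((N, N), matvec=lambda v: K @ v + lam * v)
alpha, info = cg(A, y, M=M)
```

Each CG iteration costs one $O(N^2)$ kernel matrix-vector product plus an $O(Nk)$ preconditioner application; a good low-rank preconditioner keeps the iteration count small, which is the source of the quoted complexities.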
Publication: arXiv e-prints
Pub Date: April 2023
DOI: 10.48550/arXiv.2304.12465
arXiv: arXiv:2304.12465
Bibcode: 2023arXiv230412465D
Keywords: Mathematics - Numerical Analysis; Statistics - Machine Learning; 68W20; 65F10; 65F55
E-Print: 20 pages, 9 figures