Robust, randomized preconditioning for kernel ridge regression
Abstract
This paper investigates two randomized preconditioning techniques for solving kernel ridge regression (KRR) problems with a medium to large number of data points ($10^4 \leq N \leq 10^7$), and it introduces two new methods with state-of-the-art performance. The first method, RPCholesky preconditioning, accurately solves the full-data KRR problem in $O(N^2)$ arithmetic operations, assuming sufficiently rapid polynomial decay of the kernel matrix eigenvalues. The second method, KRILL preconditioning, offers an accurate solution to a restricted version of the KRR problem involving $k \ll N$ selected data centers at a cost of $O((N + k^2) k \log k)$ operations. The proposed methods solve a broad range of KRR problems, making them ideal for practical applications.
- Publication: arXiv e-prints
- Pub Date: April 2023
- DOI: 10.48550/arXiv.2304.12465
- arXiv: arXiv:2304.12465
- Bibcode: 2023arXiv230412465D
- Keywords: Mathematics - Numerical Analysis; Statistics - Machine Learning; 68W20; 65F10; 65F55
- E-Print: 29 pages, 11 figures
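
As an illustration of the first method named in the abstract, below is a minimal sketch of RPCholesky-preconditioned conjugate gradient for the full-data KRR system $(K + \mu I)\alpha = y$. It follows the general recipe of randomly pivoted partial Cholesky (build a rank-$k$ approximation $K \approx FF^\top$, then use the Nyström-type preconditioner $FF^\top + \mu I$, inverted via the Woodbury identity) rather than the paper's reference implementation; the RBF kernel, the sizes $N$, $k$, $\mu$, and all function names are illustrative assumptions.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg


def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)


def rpcholesky(K, k, rng):
    """Randomly pivoted partial Cholesky: returns F with K ~= F @ F.T, rank <= k."""
    n = K.shape[0]
    F = np.zeros((n, k))
    d = np.diag(K).astype(float).copy()           # residual diagonal of K
    for i in range(k):
        p = rng.choice(n, p=d / d.sum())          # pivot sampled prop. to residual diagonal
        g = K[:, p] - F[:, :i] @ F[p, :i]         # residual column at the pivot
        F[:, i] = g / np.sqrt(max(g[p], 1e-12))
        d = np.maximum(d - F[:, i] ** 2, 0.0)     # update residual diagonal
    return F


# Synthetic full-data KRR problem: solve (K + mu*I) alpha = y.
rng = np.random.default_rng(0)
N, k, mu = 2000, 100, 1e-3                        # illustrative sizes, not the paper's
X = rng.standard_normal((N, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)
K = rbf_kernel(X, X)

# Nystrom-type preconditioner P = F F^T + mu*I, inverted via the Woodbury identity.
F = rpcholesky(K, k, rng)
C = np.linalg.cholesky(mu * np.eye(F.shape[1]) + F.T @ F)   # small k x k Cholesky factor


def apply_Pinv(v):
    """P^{-1} v = (v - F (mu*I + F^T F)^{-1} F^T v) / mu."""
    w = np.linalg.solve(C.T, np.linalg.solve(C, F.T @ v))
    return (v - F @ w) / mu


A = K + mu * np.eye(N)
M = LinearOperator((N, N), matvec=apply_Pinv)
alpha, info = cg(A, y, M=M)                       # preconditioned conjugate gradient
print("CG converged" if info == 0 else f"CG returned info={info}")
print(f"relative residual: {np.linalg.norm(A @ alpha - y) / np.linalg.norm(y):.2e}")
```

With a good rank-$k$ approximation, the preconditioned system is well conditioned and CG converges in few iterations; each iteration costs one dense matrix-vector product with $K$, which is where the $O(N^2)$ operation count in the abstract comes from.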