PAC-Bayes Generalisation Bounds for Heavy-Tailed Losses through Supermartingales
Abstract
While PAC-Bayes is now an established learning framework for light-tailed losses (\emph{e.g.}, subgaussian or subexponential), its extension to the case of heavy-tailed losses remains largely uncharted and has attracted growing interest in recent years. We contribute PAC-Bayes generalisation bounds for heavy-tailed losses under the sole assumption of bounded variance of the loss function. Under that assumption, we extend previous results from \citet{kuzborskij2019efron}. Our key technical contribution is exploiting an extension of Markov's inequality for supermartingales. Our proof technique unifies and extends different PAC-Bayesian frameworks by providing bounds for unbounded martingales as well as bounds for batch and online learning with heavy-tailed losses.
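The "extension of Markov's inequality for supermartingales" invoked here is, in the time-uniform concentration literature, commonly known as Ville's maximal inequality. As a hedged sketch (assuming this is the inequality meant, which the abstract does not name explicitly):

```latex
% Ville's maximal inequality: if (Z_t)_{t \ge 0} is a nonnegative
% supermartingale adapted to a filtration (\mathcal{F}_t)_{t \ge 0},
% then for any threshold a > 0,
\[
  \mathbb{P}\!\left( \exists\, t \ge 0 : Z_t \ge a \right)
  \;\le\; \frac{\mathbb{E}[Z_0]}{a}.
\]
% Markov's inequality is the special case of a single fixed time.
% In PAC-Bayes arguments one typically takes Z_t to be an
% exponential process built from the loss martingale, so the bound
% holds simultaneously over all sample sizes t.
```

Because the bound holds for the entire trajectory at once, it yields time-uniform guarantees, which is what makes it suitable for the online-learning setting mentioned in the abstract.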
- Publication: arXiv e-prints
- Pub Date: October 2022
- DOI:
- arXiv: arXiv:2210.00928
- Bibcode: 2022arXiv221000928H
- Keywords: Statistics - Machine Learning; Computer Science - Machine Learning; Mathematics - Statistics Theory
- E-Print: New Section 3 on Online PAC-Bayes