On Estimating $L_2^2$ Divergence
Abstract
We give a comprehensive theoretical characterization of a nonparametric estimator for the $L_2^2$ divergence between two continuous distributions. We first bound the rate of convergence of our estimator, showing that it is $\sqrt{n}$-consistent provided the densities are sufficiently smooth. In this smooth regime, we then show that our estimator is asymptotically normal, construct asymptotic confidence intervals, and establish a Berry-Esséen style inequality characterizing the rate of convergence to normality. We also show that this estimator is minimax optimal.
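For context, the abstract's claims can be read against the standard definitions: the $L_2^2$ divergence being estimated is the squared $L_2$ distance between the two densities, and the $\sqrt{n}$-consistency, asymptotic-normality, and confidence-interval statements take the usual form sketched below. Here $\hat{T}_n$ denotes a generic estimator and $\sigma^2$ its limiting variance; both are left abstract, since the specific construction and variance are the paper's own.

$$T(p,q) \;=\; \int \bigl(p(x) - q(x)\bigr)^2 \, dx$$

$$\sqrt{n}\,\bigl(\hat{T}_n - T(p,q)\bigr) \;\xrightarrow{d}\; \mathcal{N}(0,\sigma^2), \qquad \hat{T}_n \pm z_{\alpha/2}\,\frac{\hat{\sigma}_n}{\sqrt{n}}$$

The second display is only the generic shape of such results: asymptotic normality of the centered, $\sqrt{n}$-scaled estimator, and the resulting two-sided asymptotic confidence interval at level $1-\alpha$.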
- Publication: arXiv e-prints
- Pub Date: October 2014
- DOI: 10.48550/arXiv.1410.8372
- arXiv: arXiv:1410.8372
- Bibcode: 2014arXiv1410.8372K
- Keywords: Statistics - Machine Learning