Speculate-Correct Error Bounds for k-Nearest Neighbor Classifiers
Abstract
We introduce the speculate-correct method to derive error bounds for local classifiers. Using it, we show that k-nearest neighbor classifiers, in spite of their famously fractured decision boundaries, have exponential error bounds with an O(sqrt((k + ln n) / n)) error bound range for n in-sample examples.
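The abstract's bound range is O(sqrt((k + ln n) / n)). A minimal sketch of how that quantity behaves, with the constant factor set to 1 (the paper's actual constants are not given here, and the helper name is hypothetical):

```python
import math

def knn_bound_range(k: int, n: int) -> float:
    """Illustrative O(sqrt((k + ln n) / n)) error-bound range.

    The hidden constant is taken as 1 for illustration only; it does
    not reproduce the paper's exact bounds.
    """
    return math.sqrt((k + math.log(n)) / n)

# For fixed k, the range shrinks as the number of in-sample examples n grows:
for n in (1_000, 10_000, 100_000):
    print(f"n={n:>7}  range ~ {knn_bound_range(5, n):.4f}")
```

Note that for fixed n the range grows with k, so larger neighborhoods widen the bound while larger samples tighten it.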
- Publication: arXiv e-prints
- Pub Date: October 2014
- arXiv: arXiv:1410.2500
- Bibcode: 2014arXiv1410.2500B
- Keywords: Computer Science - Machine Learning; Computer Science - Information Theory; Statistics - Machine Learning