'Less Than One'-Shot Learning: Learning N Classes From M < N Samples
Abstract
Deep neural networks require large training sets but suffer from high computational cost and long training times. Training on much smaller training sets while maintaining nearly the same accuracy would be very beneficial. In the few-shot learning setting, a model must learn a new class given only a small number of samples from that class. One-shot learning is an extreme form of few-shot learning where the model must learn a new class from a single example. We propose the 'less than one'-shot learning task where models must learn $N$ new classes given only $M<N$ examples and we show that this is achievable with the help of soft labels. We use a soft-label generalization of the k-Nearest Neighbors classifier to explore the intricate decision landscapes that can be created in the 'less than one'-shot learning setting. We analyze these decision landscapes to derive theoretical lower bounds for separating $N$ classes using $M<N$ soft-label samples and investigate the robustness of the resulting systems.
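The abstract's soft-label generalization of k-Nearest Neighbors can be illustrated with a minimal sketch. The core idea, as described above, is that each training sample carries a probability distribution over classes rather than a single hard label; a query point is classified by summing the soft-label distributions of its k nearest neighbors and taking the argmax. The function name, the toy data, and the specific aggregation details below are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def soft_label_knn_predict(X_train, soft_labels, x, k=3):
    """Classify x by summing the soft labels of its k nearest training points.

    X_train: (M, d) array of training samples.
    soft_labels: (M, N) array; row i is a probability distribution over N classes.
    x: (d,) query point.
    Returns the index of the class with the largest summed soft-label mass.
    """
    # Euclidean distances from the query to every training sample.
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    # Aggregate the soft-label distributions of the k nearest neighbors.
    scores = soft_labels[nearest].sum(axis=0)
    return int(np.argmax(scores))

# Toy 'less than one'-shot setup: M=2 samples encode N=3 classes via soft labels.
X_train = np.array([[0.0], [1.0]])
soft_labels = np.array([
    [0.6, 0.4, 0.0],  # sample at 0: mostly class 0, some mass on class 1
    [0.0, 0.4, 0.6],  # sample at 1: mostly class 2, some mass on class 1
])
# A query midway between the two samples is assigned the "extra" class 1,
# even though no training sample has class 1 as its dominant label.
print(soft_label_knn_predict(X_train, soft_labels, np.array([0.5]), k=2))  # → 1
```

This hypothetical example shows how two soft-label points can carve out a decision region for a third class between them, which is the mechanism the paper exploits to separate $N$ classes with $M<N$ samples.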
 Publication:

arXiv e-prints
 Pub Date:
 September 2020
 arXiv:
 arXiv:2009.08449
 Bibcode:
 2020arXiv200908449S
 Keywords:

 Computer Science - Machine Learning;
 Statistics - Machine Learning
 E-Print:
 Sucholutsky, I. and Schonlau, M. 2021. 'Less Than One'-Shot Learning: Learning N Classes From M < N Samples.