Learning Mahalanobis Metric Spaces via Geometric Approximation Algorithms
Abstract
Learning Mahalanobis metric spaces is an important problem that has found numerous applications. Several algorithms have been designed for this problem, including Information Theoretic Metric Learning (ITML) [Davis et al. 2007] and Large Margin Nearest Neighbor (LMNN) classification [Weinberger and Saul 2009]. We consider a formulation of Mahalanobis metric learning as an optimization problem, where the objective is to minimize the number of violated similarity/dissimilarity constraints. We show that for any fixed ambient dimension, there exists a fully polynomial-time approximation scheme (FPTAS) with nearly-linear running time. This result is obtained using tools from the theory of linear programming in low dimensions. We also discuss improvements of the algorithm in practice, and present experimental results on synthetic and real-world data sets. Our algorithm is fully parallelizable and performs favorably in the presence of adversarial noise.
Publication: arXiv e-prints
Pub Date: May 2019
arXiv: arXiv:1905.09989
Bibcode: 2019arXiv190509989I
Keywords: Computer Science - Machine Learning; Statistics - Machine Learning
E-Print: 8 pages, 5 figures. Under review