Signed Graph Metric Learning via Gershgorin Disc Perfect Alignment
Abstract
Given a convex and differentiable objective $Q(\M)$ for a real symmetric matrix $\M$ in the positive definite (PD) cone, used to compute Mahalanobis distances, we propose a fast general metric learning framework that is entirely projection-free. We first assume that $\M$ resides in a space $\cS$ of generalized graph Laplacian matrices corresponding to balanced signed graphs. $\M \in \cS$ that is also PD is called a graph metric matrix. Unlike low-rank metric matrices common in the literature, $\cS$ includes the important diagonal-only matrices as a special case. The key theorem to circumvent full eigendecomposition and enable fast metric matrix optimization is Gershgorin disc perfect alignment (GDPA): given $\M \in \cS$ and diagonal matrix $\S$, where $S_{ii} = 1/v_i$ and $\v$ is $\M$'s first eigenvector, we prove that Gershgorin disc left-ends of similarity transform $\B = \S \M \S^{-1}$ are perfectly aligned at the smallest eigenvalue $\lambda_{\min}$. Using this theorem, we replace the PD cone constraint in the metric learning problem with tightest possible linear constraints per iteration, so that the alternating optimization of the diagonal / off-diagonal terms in $\M$ can be solved efficiently as linear programs via the Frank-Wolfe method. We update $\v$ using Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) with warm start as entries in $\M$ are optimized successively. Experiments show that our graph metric optimization is significantly faster than cone-projection schemes, and produces competitive binary classification performance.
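The GDPA theorem can be checked numerically on a toy example. Below is a minimal sketch using a small positive-edge graph (a trivially balanced signed graph); the weights and diagonal perturbation are illustrative, not taken from the paper:

```python
import numpy as np

# Adjacency of a 3-node positive-edge graph (illustrative weights).
W = np.array([[0., 1., 2.],
              [1., 0., 3.],
              [2., 3., 0.]])
L = np.diag(W.sum(axis=1)) - W           # combinatorial Laplacian
M = L + np.diag([0.5, 0.2, 0.3])         # generalized graph Laplacian (PD here)

# First eigenvector v (for smallest eigenvalue lmin); strictly positive
# after a sign fix, by Perron-Frobenius.
evals, evecs = np.linalg.eigh(M)
lmin, v = evals[0], evecs[:, 0]
v = v * np.sign(v[0])

# Similarity transform B = S M S^{-1} with S = diag(1/v_i).
S = np.diag(1.0 / v)
B = S @ M @ np.linalg.inv(S)

# Gershgorin disc left-ends of B: diagonal minus off-diagonal row sums.
left_ends = np.diag(B) - np.sum(np.abs(B - np.diag(np.diag(B))), axis=1)
print(left_ends)   # all entries coincide with lmin (GDPA)
```

Since $b_{ii} - \sum_{j \neq i} |b_{ij}| = (\M\v)_i / v_i = \lambda_{\min}$ for such matrices, every disc left-end lands exactly at the smallest eigenvalue, which is what lets the PD constraint be replaced by per-row linear constraints.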
Publication: arXiv e-prints
Pub Date: June 2020
arXiv: arXiv:2006.08816
Bibcode: 2020arXiv200608816Y
Keywords: Computer Science - Machine Learning; Statistics - Machine Learning
E-Print: code available: https://github.com/bobchengyang/SGML