Riemannian optimization using three different metrics for Hermitian PSD fixed-rank constraints: an extended version
Abstract
We consider smooth optimization problems with a Hermitian positive semidefinite fixed-rank constraint, where a quotient geometry with three Riemannian metrics $g^i(\cdot, \cdot)$ $(i=1,2,3)$ is used to represent this constraint. Taking the nonlinear conjugate gradient method (CG) as an example, we show that CG on the quotient geometry with metric $g^1$ is equivalent to CG in the factor-based optimization framework, often called the Burer-Monteiro approach. We also show that CG on the quotient geometry with metric $g^3$ is equivalent to CG on the commonly used embedded geometry. We call two CG methods equivalent if they produce an identical sequence of iterates $\{X_k\}$. In addition, we show that if the limit point of the sequence $\{X_k\}$ generated by an algorithm has lower rank, that is, each $X_k \in \mathbb{C}^{n\times n}$, $k = 1, 2, \ldots$, has rank $p$ while the limit point $X_*$ has rank $r < p$, then the condition number of the Riemannian Hessian under metric $g^1$ can be unbounded, whereas those under the other two metrics stay bounded. Numerical experiments show that the Burer-Monteiro CG method has a slower local convergence rate when the limit point has reduced rank, compared with CG on the quotient geometry under the other two metrics. This slower convergence can thus be attributed to the large condition number of the Hessian near a minimizer.
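To make the factor-based (Burer-Monteiro) idea mentioned above concrete, here is a minimal sketch, not taken from the paper: a rank-1 PSD matrix $X = y y^{\mathsf T}$ closest in Frobenius norm to a tiny real symmetric matrix $A$ is found by plain gradient descent on the factor $y$. All names, the target matrix, the step size, and the iteration count are illustrative choices, not the paper's algorithm or data.

```python
def outer(y):
    # Rank-1 product y y^T as a nested list
    return [[a * b for b in y] for a in y]

def matvec(M, v):
    # Matrix-vector product for nested-list matrices
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

A = [[2.0, 1.0], [1.0, 2.0]]   # symmetric PSD target (illustrative)
y = [1.0, 0.0]                 # initial factor (illustrative)
step = 0.05

for _ in range(500):
    Y = outer(y)
    # Residual R = y y^T - A
    R = [[Y[i][j] - A[i][j] for j in range(2)] for i in range(2)]
    # Gradient of f(y) = ||y y^T - A||_F^2 is 4 (y y^T - A) y
    g = [4.0 * gi for gi in matvec(R, y)]
    y = [yi - step * gi for yi, gi in zip(y, g)]

X = outer(y)
# Since A has top eigenpair (3, (1,1)/sqrt(2)), the best rank-1 PSD
# approximation is [[1.5, 1.5], [1.5, 1.5]], which X approaches.
```

Optimizing over the factor $y$ keeps every iterate PSD of rank at most one by construction; the abstract's point is that the conditioning of this parametrization (metric $g^1$) can degrade when the limit point drops rank, while the other two metrics avoid this.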
Publication:
arXiv e-prints
Pub Date:
April 2022
arXiv:
arXiv:2204.07830
Bibcode:
2022arXiv220407830Z
Keywords:
Mathematics - Optimization and Control;
Mathematics - Numerical Analysis