Block-Simultaneous Direction Method of Multipliers: A proximal primal-dual splitting algorithm for nonconvex problems with multiple constraints
Abstract
We introduce a generalization of the linearized Alternating Direction Method of Multipliers to optimize a real-valued function $f$ of multiple arguments with potentially multiple constraints $g_\circ$ on each of them. The function $f$ may be nonconvex as long as it is convex in every argument, while the constraints $g_\circ$ need to be convex but not smooth. If $f$ is smooth, the proposed Block-Simultaneous Direction Method of Multipliers (bSDMM) can be interpreted as a proximal analog to inexact coordinate descent methods under constraints. Unlike alternative approaches for joint solvers of multiple-constraint problems, we do not require the linear operators $L$ of a constraint function $g(L\ \cdot)$ to be invertible or linked to each other. bSDMM is well-suited for a range of optimization problems, in particular for data analysis, where $f$ is the likelihood function of a model and $L$ could be a transformation matrix describing, e.g., finite differences or basis transforms. We apply bSDMM to the Nonnegative Matrix Factorization task of a hyperspectral unmixing problem and demonstrate convergence and effectiveness of multiple constraints on both matrix factors. The algorithms are implemented in Python and released as an open-source package.
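To make the building block concrete: bSDMM generalizes ADMM-type splitting, which alternates a smooth minimization of $f$, a proximal step on the constraint $g$, and a dual update. Below is a minimal sketch of the standard (not the paper's) scaled ADMM for the single-constraint special case of nonnegative least squares, with $L$ the identity and $g$ the indicator of the nonnegative orthant; all function and variable names here are our own illustration, not the released package's API.

```python
import numpy as np

def admm_nnls(A, b, rho=1.0, n_iter=200):
    """Standard scaled ADMM for min 0.5*||Ax - b||^2  s.t.  x >= 0.

    Illustrative single-constraint special case (L = identity,
    g = indicator of the nonnegative orthant); the paper's bSDMM
    extends this pattern to multiple constraints g(L .) per argument.
    """
    n = A.shape[1]
    z = np.zeros(n)  # auxiliary (constrained) variable, z ~ x
    u = np.zeros(n)  # scaled dual variable
    AtA = A.T @ A
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-update: smooth quadratic subproblem, closed form
        x = np.linalg.solve(AtA + rho * np.eye(n), Atb + rho * (z - u))
        # z-update: proximal step of g = projection onto x >= 0
        z = np.maximum(x + u, 0.0)
        # dual ascent on the consensus constraint x = z
        u = u + x - z
    return z

A = np.eye(2)
b = np.array([1.0, -1.0])
print(admm_nnls(A, b))  # approx. [1, 0]: the negative component is clipped
```

Replacing the closed-form x-update with a gradient (linearized) step, and stacking one (z, u) pair per constraint and per argument, gives the flavor of the block-simultaneous scheme the paper develops.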
 Publication:

arXiv e-prints
 Pub Date:
 August 2017
 arXiv:
 arXiv:1708.09066
 Bibcode:
 2017arXiv170809066M
 Keywords:

 Mathematics - Optimization and Control;
 Computer Science - Computer Vision and Pattern Recognition;
 Computer Science - Machine Learning
 E-Print:
 13 pages, 4 figures