Leveraging Two Reference Functions in Block Bregman Proximal Gradient Descent for Nonconvex and Non-Lipschitz Problems
Abstract
In applications of signal processing and data analytics, there is a wide class of nonconvex problems whose objective function does not satisfy the common global Lipschitz-continuous-gradient assumption (e.g., the nonnegative matrix factorization (NMF) problem). Recently, this type of problem with certain special structures has been solved by the Bregman proximal gradient (BPG) method. This inspires us to propose a new blockwise two-reference Bregman proximal gradient (B2B) method, which adopts two reference functions so that a closed-form solution of the Bregman projection is obtained. Based on relative smoothness, we prove the global convergence of the proposed algorithms for various block selection rules. In particular, we establish a global convergence rate of $O(\frac{\sqrt{s}}{\sqrt{k}})$ for the greedy and randomized block updating rules for B2B, which is $O(\sqrt{s})$ times faster than the cyclic variant, i.e., $O(\frac{s}{\sqrt{k}})$, where $s$ is the number of blocks and $k$ is the number of iterations. Multiple numerical results are provided to illustrate the superiority of the proposed B2B over state-of-the-art works in solving NMF problems.
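To illustrate the kind of closed-form Bregman update the abstract refers to, the following is a minimal sketch (not the paper's B2B method): a Bregman proximal gradient step on a toy nonnegative least-squares subproblem, using the negative-entropy reference $h(x) = \sum_i x_i \log x_i$, for which the Bregman projection reduces to the closed-form multiplicative update $x \leftarrow x \odot \exp(-t \nabla f(x))$. The problem data, step size, and iteration count below are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only (assumed setup, not the paper's B2B algorithm):
# one-block Bregman proximal gradient with the negative-entropy reference
# h(x) = sum_i x_i * log(x_i). Its Bregman projection has the closed form
# x <- x * exp(-t * grad f(x)), which keeps x > 0 automatically.
# Toy problem: min_{x > 0} 0.5 * ||A x - b||^2 (an NMF-like subproblem).

def bregman_pg(A, b, x0, step=1e-2, iters=500):
    x = x0.copy()
    for _ in range(iters):
        grad = A.T @ (A @ x - b)      # gradient of the smooth part f
        x = x * np.exp(-step * grad)  # closed-form entropic Bregman step
    return x

rng = np.random.default_rng(0)
A = rng.random((20, 5))
x_true = rng.random(5)
b = A @ x_true                        # consistent system, positive solution
x = bregman_pg(A, b, np.ones(5))
print(np.linalg.norm(A @ x - b))      # residual shrinks from its initial value
```

Because the update is multiplicative, no explicit projection onto the nonnegative orthant is needed, which is the practical appeal of choosing a reference function matched to the constraint geometry.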
 Publication:

arXiv e-prints
 Pub Date:
 December 2019
 arXiv:
 arXiv:1912.07527
 Bibcode:
 2019arXiv191207527G
 Keywords:
 Mathematics - Optimization and Control;
 Computer Science - Machine Learning
 E-Print:
 Submitted to TSP