A Lagrange-Newton Algorithm for Sparse Nonlinear Programming
Abstract
The sparse nonlinear programming (SNP) problem has wide applications in signal and image processing, machine learning, pattern recognition, finance, and management. However, the computational challenge posed by SNP has not yet been well resolved, owing to the nonconvex and discontinuous $\ell_0$-norm involved. In this paper, we address this numerical challenge by developing a fast Newton-type algorithm. As a theoretical cornerstone, we establish a first-order optimality condition for SNP based on the concept of strong $\beta$-Lagrangian stationarity via the Lagrangian function, and reformulate it as a system of nonlinear equations called the Lagrangian equations. The nonsingularity of the corresponding Jacobian is discussed, based on which the Lagrange-Newton algorithm (LNA) is then proposed. Under mild conditions, we establish the local quadratic convergence of LNA and an estimate of its iteration complexity. To further demonstrate the efficiency and superiority of the proposed algorithm, we apply LNA to two specific application problems arising from compressed sensing and sparse high-order portfolio selection, in which significant benefits accrue from the restricted Newton step in LNA.
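The core mechanics described above, i.e. reformulating a stationarity condition as a system of nonlinear equations $F(z)=0$ and applying Newton's method to it, can be sketched as follows. This is a minimal illustration only: it uses a toy equality-constrained quadratic problem rather than the paper's sparsity-constrained setting, and the function names (`F`, `J`, `newton`) and the toy problem itself are illustrative assumptions, not the authors' LNA.

```python
import numpy as np

# Hedged sketch: Newton's method on a system of "Lagrangian equations"
# F(z) = 0, illustrated on a toy equality-constrained quadratic problem
#   min x1^2 + x2^2   s.t.   x1 + x2 = 1,
# whose Lagrangian stationarity conditions are linear. The paper's LNA
# instead handles the l0-constrained case via a restricted Newton step;
# this loop only shows the generic Newton mechanics on z = (x1, x2, lam).

def F(z):
    x1, x2, lam = z
    return np.array([2.0 * x1 + lam,   # stationarity w.r.t. x1
                     2.0 * x2 + lam,   # stationarity w.r.t. x2
                     x1 + x2 - 1.0])   # primal feasibility

def J(z):
    # Jacobian of F; its nonsingularity is what drives the
    # (locally) quadratic convergence of a Newton iteration.
    return np.array([[2.0, 0.0, 1.0],
                     [0.0, 2.0, 1.0],
                     [1.0, 1.0, 0.0]])

def newton(z, tol=1e-10, max_iter=20):
    for _ in range(max_iter):
        Fz = F(z)
        if np.linalg.norm(Fz) < tol:
            break
        z = z - np.linalg.solve(J(z), Fz)  # Newton step: J(z) d = F(z)
    return z

z_star = newton(np.zeros(3))
# converges to x = (0.5, 0.5) with multiplier lam = -1
```

Because the toy system is linear, a single Newton step already lands on the exact solution; in the genuinely nonlinear SNP setting, the same iteration converges quadratically only locally and under the Jacobian nonsingularity conditions discussed in the paper.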
- Publication:
- arXiv e-prints
- Pub Date:
- April 2020
- DOI:
- 10.48550/arXiv.2004.13257
- arXiv:
- arXiv:2004.13257
- Bibcode:
- 2020arXiv200413257Z
- Keywords:
- Mathematics - Optimization and Control;
- 90C30;
- 49M15;
- 90C46