Generalized Damped Newton Algorithms in Nonsmooth Optimization with Applications to Lasso Problems
Abstract
The paper proposes and develops new globally convergent algorithms of the generalized damped Newton type for solving important classes of nonsmooth optimization problems. These algorithms are based on the theory and calculations of second-order subdifferentials of nonsmooth functions, employing the machinery of second-order variational analysis and generalized differentiation. First we develop a globally superlinearly convergent damped Newton-type algorithm for the class of continuously differentiable functions with Lipschitzian gradients, which are nonsmooth of second order. Then we design such a globally convergent algorithm to solve a class of nonsmooth convex composite problems with extended-real-valued cost functions, which typically arise in machine learning and statistics. Finally, the obtained algorithmic developments and justifications are applied to solving a major class of Lasso problems with detailed numerical implementations. We present the results of numerical experiments and compare the performance of our main algorithm applied to Lasso problems with those achieved by other first-order and second-order methods.
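To make the "damped Newton" idea concrete, here is a minimal sketch of a classical damped Newton iteration with Armijo backtracking on a smooth, strongly convex test function. This is not the paper's generalized damped Newton algorithm, which replaces the Hessian with second-order subdifferential constructions to handle nonsmooth and extended-real-valued problems; the function names and parameter values below are purely illustrative.

```python
import numpy as np

def damped_newton(f, grad, hess, x0, tol=1e-10, max_iter=100,
                  sigma=1e-4, beta=0.5):
    """Classical damped Newton sketch: Newton direction plus Armijo
    backtracking line search (the 'damping') on a smooth convex f."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Newton direction: solve H(x) d = -grad f(x)
        d = np.linalg.solve(hess(x), -g)
        # Damped step: backtrack until the Armijo decrease condition holds
        t = 1.0
        while f(x + t * d) > f(x) + sigma * t * (g @ d):
            t *= beta
        x = x + t * d
    return x

# Illustrative smooth strongly convex test problem (not from the paper):
# f(x) = (1/4) * sum(x_i^4) + (1/2) * ||x||^2, unique minimizer x = 0.
f = lambda x: 0.25 * np.sum(x**4) + 0.5 * np.sum(x**2)
grad = lambda x: x**3 + x
hess = lambda x: np.diag(3.0 * x**2 + 1.0)

x_star = damped_newton(f, grad, hess, np.array([2.0, -3.0]))
```

The damping (unit step shortened by backtracking when needed) is what yields global convergence; near the solution the full Newton step is accepted and the local superlinear rate takes over.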
Publication: arXiv e-prints
Pub Date: January 2021
arXiv: arXiv:2101.10555
Bibcode: 2021arXiv210110555D
Keywords: Mathematics - Optimization and Control