We consider minimization of indefinite quadratics with either trust-region (norm) constraints or cubic regularization. Despite the nonconvexity of these problems, we prove that, under mild assumptions, gradient descent converges to their global solutions, and we give a non-asymptotic rate of convergence for the cubic variant. We also consider Krylov subspace solutions and establish sharp convergence guarantees to the solutions of both trust-region and cubic-regularized problems. Our rates mirror the behavior of these methods on convex quadratics and eigenvector problems, highlighting their scalability. When we use Krylov subspace solutions to approximate the cubic-regularized Newton step, our results recover the strongest known convergence guarantees to approximate second-order stationary points of general smooth nonconvex functions.
- Pub Date: March 2020
- Mathematics - Optimization and Control
- This is a SIAM Review preprint covering our papers "Gradient Descent Finds the Cubic-Regularized Non-Convex Newton Step" (SIOPT, 2019) and "Analysis of Krylov Subspace Solutions of Regularized Nonconvex Quadratic Problems" (NeurIPS, 2018)
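The cubic-regularized subproblem the abstract refers to is minimization of f(x) = ½xᵀAx + bᵀx + (ρ/3)‖x‖³ with A possibly indefinite. The following is a minimal sketch of plain gradient descent on this objective, not the paper's analyzed variant; the matrix, vector, step size, and iteration count are illustrative assumptions, and the optimality check (ρ‖x‖ ≥ −λ_min(A)) is the standard global-optimality condition for this problem.

```python
import numpy as np

def cubic_reg_gradient_descent(A, b, rho, eta=0.05, steps=10000):
    """Gradient descent on f(x) = 0.5 x'Ax + b'x + (rho/3)||x||^3.

    Illustrative sketch: assumes a generic b (nonzero component along
    the negative eigenspace of A) and a small enough step size eta.
    """
    x = np.zeros_like(b)
    for _ in range(steps):
        # grad f(x) = Ax + b + rho ||x|| x
        grad = A @ x + b + rho * np.linalg.norm(x) * x
        x = x - eta * grad
    return x

# Toy indefinite instance: A has eigenvalues {1, -1}, so f is nonconvex.
A = np.diag([1.0, -1.0])
b = np.array([1.0, 1.0])
rho = 1.0
x = cubic_reg_gradient_descent(A, b, rho)
grad_norm = np.linalg.norm(A @ x + b + rho * np.linalg.norm(x) * x)
```

Despite the indefiniteness of A, the iterate settles at a point where the gradient vanishes and ρ‖x‖ exceeds −λ_min(A) = 1, which certifies a global minimizer of this nonconvex subproblem.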