Second-order Conditional Gradient Sliding
Abstract
Constrained second-order convex optimization algorithms are the method of choice when a high accuracy solution to a problem is needed, due to their local quadratic convergence. These algorithms require the solution of a constrained quadratic subproblem at every iteration. We present the \emph{Second-Order Conditional Gradient Sliding} (SOCGS) algorithm, which uses a projection-free algorithm to solve the constrained quadratic subproblems inexactly. When the feasible region is a polytope the algorithm converges quadratically in primal gap after a finite number of linearly convergent iterations. Once in the quadratic regime the SOCGS algorithm requires $\mathcal{O}(\log(\log 1/\varepsilon))$ first-order and Hessian oracle calls and $\mathcal{O}(\log (1/\varepsilon) \log(\log 1/\varepsilon))$ linear minimization oracle calls to achieve an $\varepsilon$-optimal solution. This algorithm is useful when the feasible region can only be accessed efficiently through a linear optimization oracle, and computing first-order information of the function, although possible, is costly.
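To illustrate the kind of projection-free subproblem solve the abstract refers to, here is a minimal sketch of a vanilla Frank-Wolfe (conditional gradient) loop minimizing a quadratic over the probability simplex using only a linear minimization oracle. This is a hypothetical illustration of the general technique, not the authors' SOCGS implementation; the function names, the simplex feasible region, and the step-size rule are assumptions.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the probability simplex:
    argmin over vertices of <grad, v> puts all mass on the
    coordinate with the most negative gradient entry."""
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def frank_wolfe_quadratic(H, b, x0, iters=200):
    """Approximately minimize q(x) = 0.5 x^T H x + b^T x over the
    simplex with plain Frank-Wolfe. SOCGS solves its Newton-type
    quadratic subproblems with projection-free methods of this
    flavor; this loop is only an illustrative sketch."""
    x = x0.copy()
    for t in range(iters):
        grad = H @ x + b          # gradient of the quadratic
        v = lmo_simplex(grad)     # one LMO call per iteration
        gamma = 2.0 / (t + 2)     # standard open-loop step size
        x = (1 - gamma) * x + gamma * v
    return x

# Hypothetical small example: a positive definite quadratic in R^3
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
H = A.T @ A + np.eye(3)           # positive definite Hessian
b = rng.standard_normal(3)
x0 = np.ones(3) / 3               # start at the simplex center
x = frank_wolfe_quadratic(H, b, x0)
```

Each iteration touches the feasible region only through the LMO, which is exactly the access model the abstract assumes: linear optimization over the polytope is cheap while projections may not be.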
 Publication: arXiv e-prints
 Pub Date: February 2020
 arXiv: arXiv:2002.08907
 Bibcode: 2020arXiv200208907C
 Keywords:
 - Mathematics - Optimization and Control;
 - Computer Science - Machine Learning;
 - Statistics - Machine Learning