Unifying Framework for Accelerated Randomized Methods in Convex Optimization
Abstract
In this paper, we consider smooth convex optimization problems with simple constraints and inexactness in the oracle information, such as the value, partial derivatives, or directional derivatives of the objective function. We introduce a unifying framework that allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. We focus on accelerated random block-coordinate descent, accelerated random directional search, and an accelerated random derivative-free method, and, using our framework, provide their versions for problems with inexact oracle information. Our contribution also includes accelerated random block-coordinate descent with inexact oracle and entropy proximal setup, as well as a derivative-free version of this method. Moreover, we present an extension of our framework to strongly convex optimization problems. We also discuss an extension to the case of an inexact model of the objective function.
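As an illustration of the kind of method the abstract describes, the following is a minimal sketch of accelerated random (block-)coordinate descent for a smooth convex function with coordinate-wise Lipschitz gradients, in the APPROX/ACDM style with uniform sampling. This is a generic textbook variant, not the specific algorithm of the paper, and the quadratic test problem, function names, and parameters are illustrative assumptions.

```python
import numpy as np

def accelerated_coordinate_descent(grad_i, L, x0, n_iters, rng=None):
    """Accelerated random coordinate descent (APPROX-style, uniform
    sampling) for smooth convex f.  `grad_i(y, i)` is the coordinate
    oracle returning the i-th partial derivative of f at y; L[i] is
    the Lipschitz constant of that partial derivative."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(x0)
    x = x0.astype(float).copy()
    z = x.copy()
    theta = 1.0 / n
    for _ in range(n_iters):
        y = (1 - theta) * x + theta * z          # extrapolation point
        i = rng.integers(n)                      # random coordinate
        z_new = z.copy()
        z_new[i] -= grad_i(y, i) / (n * theta * L[i])
        x = y + n * theta * (z_new - z)          # momentum-coupled update
        z = z_new
        # theta_{k+1} solves t^2 = (1 - t) * theta_k^2
        theta = 0.5 * (np.sqrt(theta**4 + 4 * theta**2) - theta**2)
    return x

# Hypothetical test problem: f(x) = 1/2 x^T Q x - b^T x, Q positive definite.
Q = np.diag([1.0, 4.0, 2.0]) + 0.1 * np.ones((3, 3))
b = np.array([1.0, -2.0, 0.5])
L = np.diag(Q)                                   # coordinate Lipschitz constants
grad_i = lambda y, i: (Q @ y - b)[i]
x_star = np.linalg.solve(Q, b)                   # exact minimizer for comparison
x = accelerated_coordinate_descent(grad_i, L, np.zeros(3), 20000,
                                   rng=np.random.default_rng(0))
```

The coupling of the `x`, `y`, `z` sequences is what yields the accelerated O(n²/k²) rate in expectation, versus O(n/k) for plain random coordinate descent; the paper's framework additionally accounts for inexactness in the coordinate oracle `grad_i`.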
- Publication:
- arXiv e-prints
- Pub Date:
- July 2017
- DOI:
- 10.48550/arXiv.1707.08486
- arXiv:
- arXiv:1707.08486
- Bibcode:
- 2017arXiv170708486D
- Keywords:
- Mathematics - Optimization and Control;
- Mathematics - Numerical Analysis;
- 90C25;
- 90C30;
- 90C06;
- 90C56;
- 68Q25;
- 65K05;
- 49M27;
- 68W20;
- 65Y20;
- 68W40;
- G.1.6
- E-Print:
Compared to the previous version, we add an extension of our framework to strongly convex optimization problems. We also discuss an extension to the case of an inexact model of the objective function and to non-accelerated methods