Mini-batch Stochastic Approximation Methods for Nonconvex Stochastic Composite Optimization
Abstract
This paper considers a class of constrained stochastic composite optimization problems whose objective function is the sum of a differentiable (possibly nonconvex) component and a nondifferentiable (but convex) component. To solve these problems, we propose a randomized stochastic projected gradient (RSPG) algorithm, in which a proper mini-batch of samples is taken at each iteration, with the batch size depending on the total budget of stochastic samples allowed. The RSPG algorithm also employs a general distance function, which allows it to take advantage of the geometry of the feasible region. The complexity of this algorithm is established in a unified setting, which shows nearly optimal complexity for convex stochastic programming. A post-optimization phase is also proposed to significantly reduce the variance of the solutions returned by the algorithm. In addition, based on the RSPG algorithm, we also discuss a stochastic gradient-free algorithm that uses only stochastic zeroth-order information. Some preliminary numerical results are also provided.
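The abstract describes the algorithm only at a high level; the following is a minimal, hedged sketch of one possible instantiation. The objective, step size, mini-batch size, and the box constraint are illustrative assumptions, not the paper's exact scheme; the Euclidean projection stands in for the general prox-mapping with a distance-generating function.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_grad(x, batch_size):
    # Noisy gradient of the toy objective f(x) = 0.5 * ||x||^2:
    # each sample adds Gaussian noise, and averaging a mini-batch
    # of samples reduces the variance of the estimate.
    noise = rng.normal(scale=1.0, size=(batch_size, x.size)).mean(axis=0)
    return x + noise

def project_box(x, lo=-1.0, hi=1.0):
    # Euclidean projection onto a box: the simplest instance of the
    # prox-mapping with a general distance function mentioned above.
    return np.clip(x, lo, hi)

def rspg_sketch(x0, n_iters=200, batch_size=32, step=0.1):
    # Mini-batch stochastic projected gradient iterations; the
    # randomized scheme returns a randomly chosen iterate rather
    # than the last one, which we mimic here.
    x = x0.copy()
    iterates = []
    for _ in range(n_iters):
        g = stochastic_grad(x, batch_size)
        x = project_box(x - step * g)
        iterates.append(x.copy())
    return iterates[rng.integers(len(iterates))]

x_out = rspg_sketch(np.full(5, 0.9))
```

Every iterate is feasible by construction, since each step ends with a projection onto the constraint set; the mini-batch averaging is what controls the variance of the gradient estimate as the sample budget grows.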
 Publication:

arXiv e-prints
 Pub Date:
 August 2013
 arXiv:
 arXiv:1308.6594
 Bibcode:
 2013arXiv1308.6594G
 Keywords:

 Mathematics - Optimization and Control
 E-Print:
 32 pages