A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization
Abstract
This paper studies the lower bound complexity for the optimization problem whose objective function is the average of $n$ individual smooth convex functions. We consider algorithms that access a gradient and proximal oracle for each individual component. For the strongly convex case, we prove that such an algorithm cannot reach an $\varepsilon$-suboptimal point in fewer than $\Omega((n+\sqrt{\kappa n})\log(1/\varepsilon))$ iterations, where $\kappa$ is the condition number of the objective function. This lower bound is tighter than previous results and exactly matches the upper bound of the existing proximal incremental first-order oracle algorithm Point-SAGA. We develop a novel construction to show the above result, which partitions the tridiagonal matrix of classical examples into $n$ groups. This construction is amenable to the analysis of proximal oracles and also extends naturally to the general convex and average smooth cases.
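As an illustration only (not part of the paper), the iteration lower bound $\Omega((n+\sqrt{\kappa n})\log(1/\varepsilon))$ can be evaluated numerically up to its hidden constant. The sketch below, with assumed sample values for $n$, $\kappa$, and $\varepsilon$, shows how the $\sqrt{\kappa n}$ term dominates for ill-conditioned problems.

```python
import math

def lower_bound_iters(n: int, kappa: float, eps: float) -> float:
    """Evaluate (n + sqrt(kappa * n)) * log(1/eps), i.e. the
    Omega(...) lower bound with the hidden constant taken as 1."""
    return (n + math.sqrt(kappa * n)) * math.log(1.0 / eps)

# Hypothetical sample values: n components, condition number kappa,
# target suboptimality eps.
n, kappa, eps = 1000, 1.0e4, 1e-6
print(lower_bound_iters(n, kappa, eps))
```

For these sample values, $\sqrt{\kappa n} \approx 3162$ already exceeds $n = 1000$, illustrating why the $\sqrt{\kappa n}$ term, rather than $n$ alone, governs the bound when $\kappa \gg n$.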
 Publication:

arXiv e-prints
 Pub Date:
 August 2019
 arXiv:
 arXiv:1908.08394
 Bibcode:
 2019arXiv190808394X
 Keywords:

 Mathematics - Optimization and Control;
 Computer Science - Machine Learning;
 Statistics - Machine Learning