Stochastic Approximation of Smooth and Strongly Convex Functions: Beyond the $O(1/T)$ Convergence Rate

Published 27 Jan 2019 in cs.LG and stat.ML (arXiv:1901.09344v1)

Abstract: Stochastic approximation (SA) is a classical approach for stochastic convex optimization. Previous studies have demonstrated that the convergence rate of SA can be improved by introducing either a smoothness or a strong convexity condition. In this paper, we make use of smoothness and strong convexity simultaneously to boost the convergence rate. Let $\lambda$ be the modulus of strong convexity, $\kappa$ be the condition number, $F_*$ be the minimal risk, and $\alpha>1$ be some small constant. First, we demonstrate that, in expectation, an $O(1/[\lambda T^\alpha] + \kappa F_*/T)$ risk bound is attainable when $T = \Omega(\kappa^\alpha)$. Thus, when $F_*$ is small, the convergence rate could be faster than $O(1/[\lambda T])$ and approaches $O(1/[\lambda T^\alpha])$ in the ideal case. Second, to further benefit from a small risk, we show that, in expectation, an $O(1/2^{T/\kappa}+F_*)$ risk bound is achievable. Thus, the excess risk decreases exponentially until reaching $O(F_*)$, and if $F_*=0$, we obtain global linear convergence. Finally, we emphasize that our proof is constructive and each risk bound is equipped with an efficient stochastic algorithm attaining that bound.
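
The abstract states that each risk bound is attained by an efficient stochastic algorithm, but the algorithms themselves are not described here. As a reference point, the sketch below illustrates the classical $O(1/[\lambda T])$ baseline that the paper improves on: SGD with step size $\eta_t = 1/(\lambda t)$ and suffix averaging over the last half of the iterates, run on a toy least-squares problem. Everything in this sketch (the objective, the sampling model, and the names sgd_suffix_avg and grad_oracle) is an illustrative assumption, not the paper's construction.

import numpy as np

def sgd_suffix_avg(grad_oracle, x0, lam, T, rng):
    """Classical SGD baseline for a lam-strongly-convex objective.

    Uses the standard step size eta_t = 1/(lam * t) and averages the last
    half of the iterates (suffix averaging), which is known to yield an
    O(1/(lam * T)) expected excess risk -- the baseline rate this paper
    improves on when the minimal risk F_* is small.
    """
    x = x0.copy()
    avg = np.zeros_like(x0)
    n_avg = 0
    for t in range(1, T + 1):
        g = grad_oracle(x, rng)        # unbiased stochastic gradient at x
        x = x - g / (lam * t)          # step size 1/(lam * t)
        if t > T // 2:                 # average only the suffix of iterates
            avg += x
            n_avg += 1
    return avg / n_avg

# Illustrative usage on a hypothetical least-squares problem; with noise
# in the labels, the minimal risk F_* is small but nonzero.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 5
    x_star = rng.normal(size=d)

    def grad_oracle(x, rng):
        # One fresh sample (a, b) with b = <a, x_star> + noise; returns the
        # gradient of the per-sample squared loss 0.5 * (<a, x> - b)^2.
        a = rng.normal(size=d)
        b = a @ x_star + 0.1 * rng.normal()
        return (a @ x - b) * a

    # For a ~ N(0, I), the risk is 1-strongly convex, so lam = 1.0.
    x_hat = sgd_suffix_avg(grad_oracle, np.zeros(d), lam=1.0, T=10_000, rng=rng)
    print("estimation error:", np.linalg.norm(x_hat - x_star))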

Citations (29)

Authors (2): Lijun Zhang, Zhi-Hua Zhou
