Accelerated Stochastic Gradient Descent for Minimizing Finite Sums (1506.03016v2)
Published 9 Jun 2015 in stat.ML and cs.LG
Abstract: We propose an optimization method for minimizing finite sums of smooth convex functions. Our method combines accelerated gradient descent (AGD) with the stochastic variance reduced gradient (SVRG) method in a mini-batch setting. Unlike SVRG, our method can be applied directly to both non-strongly convex and strongly convex problems. We show that it achieves a lower overall complexity than recently proposed methods that support non-strongly convex problems, while also converging quickly on strongly convex problems. Our experiments demonstrate the effectiveness of the method.
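
The abstract describes combining an SVRG-style variance-reduced gradient estimator with accelerated (momentum) updates in a mini-batch setting. Below is a minimal sketch of that general idea, not the paper's exact algorithm: the function names, step size `eta`, mini-batch size, inner-loop length, and momentum schedule are all illustrative assumptions.

```python
import numpy as np

def accelerated_minibatch_svrg(grad_batch, x0, n, n_epochs=20, m=None,
                               batch_size=10, eta=0.1, rng=None):
    """Sketch of an accelerated, mini-batch SVRG-style loop for
    min_x (1/n) * sum_i f_i(x).

    grad_batch(x, idx) should return the average gradient of f_i over the
    indices in `idx`. Hyperparameters are illustrative, not the paper's
    tuned schedule.
    """
    rng = np.random.default_rng() if rng is None else rng
    m = n if m is None else m            # inner-loop length per epoch (assumption)
    x = x0.copy()
    y = x0.copy()
    for _ in range(n_epochs):
        x_snapshot = x.copy()
        # full gradient at the snapshot point, reused for variance reduction
        full_grad = grad_batch(x_snapshot, np.arange(n))
        for k in range(m):
            idx = rng.integers(0, n, size=batch_size)
            # variance-reduced mini-batch gradient at the extrapolated point y
            v = grad_batch(y, idx) - grad_batch(x_snapshot, idx) + full_grad
            x_new = y - eta * v
            # Nesterov-style extrapolation (illustrative momentum schedule)
            beta = k / (k + 3.0)
            y = x_new + beta * (x_new - x)
            x = x_new
    return x

# Example usage on a synthetic least-squares problem (illustrative):
# A, b = np.random.randn(1000, 20), np.random.randn(1000)
# grad_batch = lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
# x_hat = accelerated_minibatch_svrg(grad_batch, np.zeros(20), n=1000)
```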
- Atsushi Nitanda