IS-ASGD: Accelerating Asynchronous SGD using Importance Sampling (1706.08210v3)

Published 26 Jun 2017 in cs.DC

Abstract: Variance reduction (VR) techniques for accelerating the convergence rate of the stochastic gradient descent (SGD) algorithm have recently been developed with great effort. Two VR variants, stochastic variance-reduced gradient (SVRG-SGD) and importance sampling (IS-SGD), have achieved remarkable progress. Meanwhile, asynchronous SGD (ASGD) is becoming increasingly important due to the ever-growing scale of optimization problems. Applying VR to ASGD to accelerate its convergence rate has therefore attracted much interest, and SVRG-ASGD methods have been proposed. However, we found that SVRG performs unsatisfactorily in accelerating ASGD when datasets are sparse and large-scale. In such cases, SVRG-ASGD's per-iteration computation cost is orders of magnitude higher than that of ASGD, which makes it very slow. On the other hand, IS achieves an improved convergence rate with little extra computation cost and is invariant to the sparsity of the dataset. This advantage makes it well suited for accelerating ASGD on large-scale sparse datasets. In this paper, we propose a novel IS-combined ASGD for effective convergence rate acceleration, namely IS-ASGD. We theoretically prove the superior convergence bound of IS-ASGD. Experimental results also support our claims.
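To illustrate the importance-sampling idea the abstract builds on, below is a minimal sketch of IS-SGD on a least-squares problem, not the paper's IS-ASGD algorithm or code. It assumes sampling probabilities proportional to per-example norms (a common surrogate for per-example Lipschitz constants); all names (X, y, w, lr) are illustrative.

```python
import numpy as np

# Minimal sketch of importance-sampling SGD (IS-SGD) for least squares.
# Assumption: p_i is proportional to ||x_i||, a common surrogate for the
# per-example Lipschitz constant; this is not taken from the paper.
rng = np.random.default_rng(0)
n, d = 1000, 20
X = rng.standard_normal((n, d)) * rng.exponential(1.0, size=(n, 1))  # heterogeneous rows
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# Importance-sampling distribution over examples, normalized to sum to 1.
row_norms = np.linalg.norm(X, axis=1)
p = row_norms / row_norms.sum()

w = np.zeros(d)
lr = 1e-3
for _ in range(5000):
    i = rng.choice(n, p=p)            # draw example i with probability p_i
    g = (X[i] @ w - y[i]) * X[i]      # per-example gradient of 0.5*(x_i·w - y_i)^2
    w -= lr * g / (n * p[i])          # reweight by 1/(n p_i) so the update stays unbiased
print("final loss:", 0.5 * np.mean((X @ w - y) ** 2))
```

The 1/(n p_i) reweighting keeps the stochastic gradient an unbiased estimate of the full gradient while the non-uniform sampling reduces its variance, which is the mechanism IS-ASGD exploits inside an asynchronous setting.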

Authors (4)
  1. Fei Wang (574 papers)
  2. Jun Ye (179 papers)
  3. Weichen Li (7 papers)
  4. Guihai Chen (74 papers)
Citations (1)
