S-ADDOPT: Decentralized stochastic first-order optimization over directed graphs (2005.07785v3)

Published 15 May 2020 in cs.LG, cs.SY, eess.SY, math.OC, and stat.ML

Abstract: In this report, we study decentralized stochastic optimization to minimize a sum of smooth and strongly convex cost functions when the functions are distributed over a directed network of nodes. In contrast to existing work, we use gradient tracking to improve certain aspects of the resulting algorithm. In particular, we propose the S-ADDOPT algorithm, which assumes a stochastic first-order oracle at each node, and show that for a constant step-size α, each node converges linearly to an error ball around the optimal solution whose size is controlled by α. For decaying step-sizes of O(1/k), we show that S-ADDOPT reaches the exact solution sublinearly at O(1/k) and that its convergence is asymptotically network-independent. The asymptotic behavior of S-ADDOPT is thus comparable to that of centralized stochastic gradient descent. Numerical experiments on both strongly convex and non-convex problems illustrate the convergence behavior and performance of the proposed algorithm.
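
To make the update structure concrete, below is a minimal NumPy sketch of an S-ADDOPT-style iteration: a push-sum (ADD-OPT) recursion over a column-stochastic weight matrix, with a gradient tracker driven by a stochastic first-order oracle. The oracle `sgrad`, the toy quadratic costs, and the exact ordering of the mixing and descent steps are illustrative assumptions, not taken verbatim from the paper.

```python
import numpy as np

def s_addopt(A, sgrad, z0, alpha, num_iters):
    """Sketch of an S-ADDOPT-style recursion (assumptions noted above).

    A: (n, n) column-stochastic weights of the directed graph.
    sgrad(i, z): stochastic gradient of node i's local cost at z, shape (d,).
    z0: (n, d) initial iterates, one row per node.
    alpha: constant step-size, or a callable k -> alpha_k for O(1/k) decay.
    """
    n = z0.shape[0]
    x = z0.copy()                                   # raw push-sum iterates
    y = np.ones(n)                                  # push-sum scaling weights
    z = x / y[:, None]                              # de-biased iterates
    g = np.stack([sgrad(i, z[i]) for i in range(n)])
    w = g.copy()                                    # gradient tracker

    for k in range(num_iters):
        step = alpha(k) if callable(alpha) else alpha
        x = A @ x - step * w                        # mix, then descend
        y = A @ y                                   # propagate scaling weights
        z = x / y[:, None]                          # correct push-sum bias
        g_new = np.stack([sgrad(i, z[i]) for i in range(n)])
        w = A @ w + g_new - g                       # track avg stochastic grad
        g = g_new
    return z

# Toy usage (hypothetical problem): 4 nodes on a directed ring with
# self-loops; local costs f_i(z) = 0.5 * ||z - t_i||^2, so the network
# optimum is the mean of the targets t_i.
rng = np.random.default_rng(0)
n, d = 4, 3
A = 0.5 * (np.eye(n) + np.roll(np.eye(n), 1, axis=0))  # column-stochastic
targets = rng.normal(size=(n, d))

def sgrad(i, z):
    return (z - targets[i]) + 0.1 * rng.normal(size=d)  # noisy local gradient

z_final = s_addopt(A, sgrad, np.zeros((n, d)), alpha=0.05, num_iters=2000)
print(np.linalg.norm(z_final - targets.mean(axis=0), axis=1))  # per-node error
```

With a constant step-size the per-node error plateaus inside a ball whose radius shrinks with α; passing a callable `alpha` instead reproduces the O(1/k) decaying schedule discussed in the abstract.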

Authors (4)
  1. Muhammad I. Qureshi (5 papers)
  2. Ran Xin (25 papers)
  3. Soummya Kar (147 papers)
  4. Usman A. Khan (56 papers)
Citations (29)
