
Provably Accelerated Decentralized Gradient Method Over Unbalanced Directed Graphs (2107.12065v2)

Published 26 Jul 2021 in math.OC, cs.DC, cs.LG, cs.SY, eess.SP, and eess.SY

Abstract: We consider the decentralized optimization problem, where a network of $n$ agents aims to collaboratively minimize the average of their individual smooth and convex objective functions through peer-to-peer communication in a directed graph. To tackle this problem, we propose two accelerated gradient tracking methods, namely APD and APD-SC, for non-strongly convex and strongly convex objective functions, respectively. We show that APD and APD-SC converge at the rates $O\left(\frac{1}{k^2}\right)$ and $O\left(\left(1 - C\sqrt{\frac{\mu}{L}}\right)^k\right)$, respectively, up to constant factors depending only on the mixing matrix. APD and APD-SC are the first decentralized methods over unbalanced directed graphs that achieve the same provable acceleration as centralized methods. Numerical experiments demonstrate the effectiveness of both methods.
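To make the setting concrete, the following is a minimal sketch of plain (non-accelerated) gradient tracking over a directed ring, the template that APD and APD-SC build acceleration on top of. This is not the paper's algorithm; the local quadratic objectives, the ring topology, the row-stochastic matrix `R`, the column-stochastic matrix `C`, and the stepsize `alpha` are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: push-pull-style gradient tracking on a directed ring
# of n agents, each holding a local quadratic f_i(x) = 0.5 * (x - b_i)^2.
# The global optimum of the average objective is mean(b).
# NOTE: this is a generic gradient-tracking template, not the paper's
# APD / APD-SC methods, which add Nesterov-type acceleration on top.

n = 5
rng = np.random.default_rng(0)
b = rng.normal(size=n)          # local targets (assumed data)

# Directed ring: agent i also hears from agent i-1 (mod n).
R = 0.5 * (np.eye(n) + np.roll(np.eye(n), 1, axis=1))  # row-stochastic mixing
C = 0.5 * (np.eye(n) + np.roll(np.eye(n), 1, axis=0))  # column-stochastic mixing

x = np.zeros(n)                 # local estimates of the minimizer
g = x - b                       # gradient trackers, initialized to local grads
alpha = 0.1                     # assumed stepsize

for _ in range(500):
    x_new = R @ x - alpha * g           # mix estimates, step along the tracker
    g = C @ g + (x_new - b) - (x - b)   # track the network-average gradient
    x = x_new

# After mixing, every agent's estimate is close to the global optimum mean(b).
consensus_error = np.max(np.abs(x - b.mean()))
```

The column-stochastic update of `g` preserves the sum of the trackers, so each agent's tracker follows the average gradient even though the directed graph is unbalanced; APD/APD-SC keep this tracking structure but apply accelerated (momentum-based) updates to `x`.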

References (1)
Citations (3)
