
Accelerating Gradient Tracking with Periodic Global Averaging (2403.11293v2)

Published 17 Mar 2024 in math.OC

Abstract: Decentralized optimization algorithms have recently attracted increasing attention due to their wide applicability across science and engineering. In these algorithms, a collection of agents collaborates to minimize the average of a set of heterogeneous cost functions in a decentralized manner. State-of-the-art decentralized algorithms such as Gradient Tracking (GT) and Exact Diffusion (ED) communicate at every iteration; yet communication between agents is often expensive, resource-intensive, and slow. To this end, several strategies have been developed to balance communication overhead against the convergence rate of decentralized methods. In this paper, we introduce GT-PGA, which augments GT with periodic global averaging (PGA). With the additional PGA steps, the influence of poor network connectivity on the GT algorithm can be compensated for, or controlled, by a careful selection of the global averaging period. Under the stochastic, nonconvex setup, our analysis quantifies the crucial trade-off between the connectivity of the network topology and the PGA period. Thus, with a suitable choice of the PGA period, GT-PGA improves the convergence rate of vanilla GT. Numerical experiments support our theory, and simulation results reveal that the proposed GT-PGA accelerates practical convergence, especially when the network is sparse.
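The abstract does not spell out the update equations, so the following is a minimal sketch of one common form of gradient tracking in which the gossip mixing step is replaced by exact global averaging every `H` iterations. The function name `gt_pga`, the step size, the quadratic local costs, and the ring mixing matrix in the usage example are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def gt_pga(grads, W, x0, alpha=0.1, H=10, iters=500):
    """Gradient tracking with periodic global averaging (illustrative sketch).

    grads: list of n per-agent gradient functions, grads[i](x_i) -> (d,)
    W:     n x n doubly stochastic mixing matrix of the network
    x0:    (n, d) initial agent states
    H:     global averaging period (every H-th step averages over all agents)
    """
    n = len(grads)
    x = x0.copy()                                     # (n, d) agent iterates
    g = np.stack([grads[i](x[i]) for i in range(n)])  # current local gradients
    y = g.copy()                                      # trackers, y^0 = grad^0
    for k in range(1, iters + 1):
        if k % H == 0:
            # Periodic global averaging: exact consensus on states and trackers.
            x_mix = np.tile(x.mean(axis=0), (n, 1))
            y_mix = np.tile(y.mean(axis=0), (n, 1))
        else:
            # Ordinary decentralized gossip mixing over the network.
            x_mix, y_mix = W @ x, W @ y
        x_new = x_mix - alpha * y_mix                 # descent along trackers
        g_new = np.stack([grads[i](x_new[i]) for i in range(n)])
        y = y_mix + g_new - g                         # gradient-tracking update
        x, g = x_new, g_new
    return x
```

On simple quadratic costs f_i(x) = 0.5 * ||x - b_i||^2 over a ring topology, all agents converge to the minimizer of the average cost, i.e. the mean of the b_i; since both mixing modes preserve the network-wide averages of x and y, the tracker invariant (mean of y equals mean of the current local gradients) holds throughout.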
