Asynchronous Distributed Optimization via Dual Decomposition and Block Coordinate Subgradient Methods (2102.07953v1)

Published 16 Feb 2021 in math.OC, cs.SY, and eess.SY

Abstract: We study the problem of minimizing a sum of potentially non-differentiable convex cost functions with partially overlapping dependencies in an asynchronous manner, where communication in the network is not coordinated. We analyze the behavior of an asynchronous algorithm based on dual decomposition and block coordinate subgradient methods under assumptions weaker than those used in the literature. At the same time, we allow different agents to use local stepsizes with no global coordination. Sufficient conditions are provided for almost sure convergence to the solution of the optimization problem. Under additional assumptions, we establish a sublinear convergence rate, which strengthens to a linear rate when the problem is strongly convex with Lipschitz gradients. We also extend available results in the literature by allowing multiple and potentially overlapping blocks to be updated simultaneously, with non-uniform and potentially time-varying probabilities assigned to different blocks. A numerical example is provided to illustrate the effectiveness of the algorithm.
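
To make the moving parts of the abstract concrete, here is a minimal sketch of the general idea, not the paper's exact algorithm or test problem. It uses a hypothetical three-agent path graph with quadratic local costs (invented here for a clean closed-form local step; the paper's setting also covers non-differentiable costs, in which case the dual update becomes a subgradient step). Each edge-wise consensus constraint is dualized into one dual "block"; blocks wake up asynchronously with non-uniform probabilities and use local, uncoordinated stepsizes, mirroring the setting described in the abstract.

```python
import numpy as np

# Hypothetical toy instance (not from the paper): agents i = 0, 1, 2 on a
# path graph minimize f_i(x) = 0.5 * a_i * (x - b_i)^2 subject to consensus,
# written edge-wise as x_0 = x_1 and x_1 = x_2.

rng = np.random.default_rng(0)

a = np.array([1.0, 2.0, 1.0])   # local curvature of each agent's cost
b = np.array([1.0, 3.0, 5.0])   # each agent's local target
# consensus optimum: weighted average (1*1 + 2*3 + 1*5) / 4 = 3

def local_argmin(lam):
    """Each agent minimizes f_i(x) + (net dual coefficient) * x in closed form."""
    c = np.array([ lam[0],             # agent 0 sees +lam_0 (edge 0-1)
                  -lam[0] + lam[1],    # agent 1 sees -lam_0 + lam_1
                  -lam[1]])            # agent 2 sees -lam_1 (edge 1-2)
    return b - c / a                   # argmin of 0.5*a_i*(x - b_i)^2 + c_i*x

lam   = np.zeros(2)                    # one dual block per edge constraint
steps = np.array([0.10, 0.05])         # local stepsizes, no global coordination
probs = np.array([0.7, 0.3])           # non-uniform block activation probabilities

for _ in range(5000):
    x = local_argmin(lam)
    # Asynchrony: each block updates independently with its own probability,
    # so zero, one, or both blocks may be updated in the same round.
    awake = rng.random(2) < probs
    grad = np.array([x[0] - x[1], x[1] - x[2]])   # dual (sub)gradient per edge
    lam[awake] += steps[awake] * grad[awake]

print("primal estimates:", np.round(local_argmin(lam), 4))  # ~[3, 3, 3]
print("dual variables:  ", np.round(lam, 4))                # ~[-2, -2]
```

In this smooth, strongly convex toy the dual iterates contract toward the optimum, which is the regime where the paper's linear-rate result would apply; with nonsmooth local costs the same loop runs as a subgradient method with the weaker guarantees described above.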

Citations (5)
