
A Mixing-Accelerated Primal-Dual Proximal Algorithm for Distributed Nonconvex Optimization (2304.02830v2)

Published 6 Apr 2023 in math.OC, cs.SY, and eess.SY

Abstract: In this paper, we develop a distributed mixing-accelerated primal-dual proximal algorithm, referred to as MAP-Pro, which enables nodes in multi-agent networks to cooperatively minimize the sum of their nonconvex, smooth local cost functions in a decentralized fashion. The proposed algorithm is constructed upon minimizing a computationally inexpensive augmented-Lagrangian-like function and incorporates a time-varying mixing polynomial to expedite information fusion across the network. The convergence results derived for MAP-Pro include a sublinear rate of convergence to a stationary solution and, under the Polyak-{\L}ojasiewicz (P-{\L}) condition, a linear rate of convergence to the global optimal solution. Additionally, the well-known Chebyshev acceleration scheme may be embedded in MAP-Pro, which generates a specific sequence of mixing polynomials with given degrees and further enhances the convergence performance of MAP-Pro. Finally, we illustrate the competitive convergence speed and communication efficiency of MAP-Pro via a numerical example.
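
The abstract does not spell out the update equations, but the Chebyshev acceleration it mentions is a standard device for speeding up consensus with a fixed mixing matrix. The sketch below, in Python/NumPy, shows how a degree-K Chebyshev mixing polynomial can be applied to stacked node states; the function name `chebyshev_mixing`, the ring network, and the choice of `rho` are illustrative assumptions for this sketch, not the paper's implementation.

```python
import numpy as np

def chebyshev_mixing(W, X, K, rho):
    """Apply a degree-K Chebyshev mixing polynomial P_K(W) to the node states X
    (one row per node).

    P_K(t) = T_K(t / rho) / T_K(1 / rho), where T_K is the Chebyshev polynomial
    of the first kind and rho bounds the magnitude of the non-unit eigenvalues
    of the symmetric, doubly stochastic mixing matrix W.  Since P_K(1) = 1,
    exact averages are preserved, while disagreement components are damped by
    roughly 1 / T_K(1 / rho).
    """
    # Scalar recurrence for the normalizer a_k = T_k(1 / rho).
    a_prev, a_curr = 1.0, 1.0 / rho
    # Matrix recurrence for Y_k = T_k(W / rho) X.
    Y_prev, Y_curr = X, (W @ X) / rho
    for _ in range(K - 1):
        a_prev, a_curr = a_curr, (2.0 / rho) * a_curr - a_prev
        Y_prev, Y_curr = Y_curr, (2.0 / rho) * (W @ Y_curr) - Y_prev
    return Y_curr / a_curr

# Hypothetical usage: a ring of 5 nodes with a lazy Metropolis mixing matrix.
n, d = 5, 3
W = np.zeros((n, n))
for i in range(n):
    for j in ((i - 1) % n, (i + 1) % n):
        W[i, j] = 1.0 / 3.0
    W[i, i] = 1.0 - W[i].sum()
rho = np.sort(np.abs(np.linalg.eigvalsh(W)))[-2]  # second-largest |eigenvalue|
X = np.random.randn(n, d)
X_mixed = chebyshev_mixing(W, X, K=3, rho=rho)
print(np.abs(X_mixed - X.mean(axis=0)).max())  # residual disagreement
```

Presumably, inside a primal-dual proximal iteration this accelerated mixing step would replace a plain multiplication by W, trading a few extra communication rounds per iteration for faster information fusion; the exact way MAP-Pro interleaves the mixing polynomial with its primal and dual updates is given in the paper itself.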
