Linear Convergence of Distributed Compressed Optimization with Equality Constraints (2503.02468v1)

Published 4 Mar 2025 in eess.SY and cs.SY

Abstract: In this paper, the distributed strongly convex optimization problem is studied with spatio-temporal compressed communication and equality constraints. For the case where each agent holds a distributed local equality constraint, a distributed saddle-point algorithm is proposed by employing distributed filters to derive errors of the transmitted states for spatio-temporal compression purposes. It is shown that the resulting distributed compressed algorithm achieves linear convergence. Furthermore, the algorithm is generalized to the case where each agent holds a portion of the global equality constraint, i.e., the constraints across agents are coupled. By introducing an additional design freedom, the global equality constraint is shown to be equivalent to the one where each agent holds an equality constraint, for which the proposed distributed compressed saddle-point algorithm can be adapted to achieve linear convergence. Numerical simulations validate the effectiveness of the proposed algorithms.
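
To make the setup concrete, the sketch below illustrates the general class of method the abstract describes: a distributed primal-dual (saddle-point) iteration in which each agent enforces a local equality constraint and communicates only compressed innovations of its state through a local filter. All specifics here are assumptions for illustration, not the paper's design: quadratic local costs f_i(x) = 0.5||x - c_i||^2, a ring communication graph, a top-k compressor, and hand-picked step sizes.

```python
# Hypothetical sketch of a distributed compressed saddle-point iteration.
# Assumptions (not from the paper): quadratic local costs, local constraints
# A_i x = b_i, a ring graph, a top-k compressor, and illustrative step sizes.
import numpy as np

np.random.seed(0)
n_agents, dim, k = 4, 6, 2            # agents, decision dimension, top-k entries kept

# Local problem data (illustrative)
c = [np.random.randn(dim) for _ in range(n_agents)]        # cost centers
A = [np.random.randn(2, dim) for _ in range(n_agents)]     # local constraints A_i x = b_i
b = [Ai @ np.random.randn(dim) for Ai in A]

# Ring graph with simple doubly stochastic mixing weights
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i + 1) % n_agents] = 0.25
    W[i, (i - 1) % n_agents] = 0.25

def top_k(v, k):
    """Keep the k largest-magnitude entries, zero the rest (a standard compressor)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

x = [np.zeros(dim) for _ in range(n_agents)]      # primal variables
lam = [np.zeros(2) for _ in range(n_agents)]      # dual variables for A_i x = b_i
h = [np.zeros(dim) for _ in range(n_agents)]      # filter states shared with neighbors

alpha, beta, gamma = 0.1, 0.1, 0.5                # step sizes / filter gain (illustrative)

for t in range(500):
    # Each agent compresses only the innovation between its state and its filter,
    # so neighbors can update identical filter copies from the compressed message.
    q = [top_k(x[i] - h[i], k) for i in range(n_agents)]
    h = [h[i] + gamma * q[i] for i in range(n_agents)]

    new_x = []
    for i in range(n_agents):
        # Neighbors' states are only known through the shared filters, not exactly.
        consensus = sum(W[i, j] * (h[j] - h[i]) for j in range(n_agents))
        grad = x[i] - c[i]                                   # gradient of 0.5*||x - c_i||^2
        new_x.append(x[i] + alpha * consensus - alpha * (grad + A[i].T @ lam[i]))
    # Dual ascent on each local equality constraint (saddle-point structure)
    lam = [lam[i] + beta * (A[i] @ new_x[i] - b[i]) for i in range(n_agents)]
    x = new_x

print("max constraint violation:",
      max(np.linalg.norm(A[i] @ x[i] - b[i]) for i in range(n_agents)))
print("max disagreement:",
      max(np.linalg.norm(x[i] - x[0]) for i in range(n_agents)))
```

The filter states h_i play the role of the "derived errors of the transmitted states": agents exchange compressed differences x_i - h_i rather than raw states, which is what allows aggressive compression without losing linear convergence in the paper's analysis. The paper's second setting (a coupled global equality constraint split across agents) is not reproduced here.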
