
Improving the communication in decentralized manifold optimization through single-step consensus and compression (2407.08904v1)

Published 12 Jul 2024 in math.OC

Abstract: We are concerned with decentralized optimization over a compact submanifold, where each agent's local loss function is defined by its own local dataset. A key challenge in decentralized optimization is mitigating the communication bottleneck, which primarily involves two strategies: achieving consensus and applying communication compression. Existing projection/retraction-type algorithms rely on multi-step consensus to attain both consensus and optimality. Due to the nonconvex nature of the manifold constraint, it has remained an open question whether the requirement for multi-step consensus can be reduced to single-step consensus. We address this question by carefully analyzing the smoothness structure and the asymptotic 1-Lipschitz continuity associated with the manifold constraint. Furthermore, we integrate these insights with a communication compression strategy to propose a communication-efficient gradient algorithm for decentralized manifold optimization problems, significantly reducing per-iteration communication costs. Additionally, we establish an iteration complexity of $\mathcal{O}(\epsilon^{-1})$ for finding an $\epsilon$-stationary point, which matches the complexity in the Euclidean setting. Numerical experiments demonstrate the efficiency of the proposed method in comparison to state-of-the-art approaches.
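To make the iteration structure concrete, here is a minimal sketch of one round of a decentralized projected-gradient step with single-step consensus and compressed communication. This is an illustration of the general scheme, not the paper's exact algorithm: the Stiefel manifold, polar retraction, and top-k sparsification are stand-in choices, and the mixing matrix `W` is assumed doubly stochastic.

```python
import numpy as np

def polar_retraction(X):
    """Retract a matrix onto the Stiefel manifold St(n, p) via polar decomposition."""
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ Vt

def topk(M, k):
    """Top-k sparsification: keep only the k largest-magnitude entries.
    A simple stand-in for the paper's compression operator."""
    out = np.zeros_like(M)
    idx = np.argsort(np.abs(M), axis=None)[-k:]
    out.flat[idx] = M.flat[idx]
    return out

def single_step_consensus_grad(Xs, grads, W, alpha, k):
    """One iteration for all agents: each agent transmits a compressed copy
    of its iterate, performs a SINGLE mixing step with the weight matrix W
    (rather than multiple consensus rounds), takes a local gradient step,
    and retracts back onto the manifold."""
    n = len(Xs)
    msgs = [topk(X, k) for X in Xs]  # only compressed iterates are communicated
    new_Xs = []
    for i in range(n):
        mixed = sum(W[i, j] * msgs[j] for j in range(n))  # single-step consensus
        new_Xs.append(polar_retraction(mixed - alpha * grads[i]))
    return new_Xs
```

The per-iteration communication cost here is one compressed matrix per agent per round, in contrast with multi-step consensus schemes that require several exchanges of full iterates each iteration.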
