Improving the communication in decentralized manifold optimization through single-step consensus and compression (2407.08904v1)
Abstract: We are concerned with decentralized optimization over a compact submanifold, where each agent's local loss function is defined by its own local dataset. A key challenge in decentralized optimization is mitigating the communication bottleneck, which primarily involves two strategies: performing consensus and applying communication compression. Existing projection/retraction-type algorithms rely on multi-step consensus to attain both consensus and optimality. Due to the nonconvex nature of the manifold constraint, it has remained an open question whether the requirement of multi-step consensus can be reduced to single-step consensus. We answer this question affirmatively by carefully exploiting the smoothness structure and the asymptotic 1-Lipschitz continuity associated with the manifold constraint. Combining these insights with a communication compression strategy, we propose a communication-efficient gradient algorithm for decentralized manifold optimization that significantly reduces the per-iteration communication cost. Additionally, we establish an iteration complexity of $\mathcal{O}(\epsilon^{-1})$ for finding an $\epsilon$-stationary point, which matches the complexity in the Euclidean setting. Numerical experiments demonstrate the efficiency of the proposed method in comparison to state-of-the-art approaches.
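To make the communication pattern concrete, below is a minimal NumPy sketch of a single-step-consensus, compression-based scheme on the Stiefel manifold. It is not the paper's exact algorithm: the local losses `f_i`, the ring-graph mixing matrix `W`, the `top_k` compressor, and all step sizes are illustrative assumptions. Each iteration performs one local gradient step, broadcasts only a compressed correction, runs exactly one gossip (consensus) round, and retracts back onto the manifold.

```python
# Minimal sketch of a single-step-consensus, compressed scheme on the Stiefel
# manifold St(n, p) = {X : X^T X = I_p}. Illustrates the communication pattern
# only; NOT the paper's exact algorithm. Losses, topology, compressor, and
# step sizes are illustrative assumptions.
import numpy as np

n, p, agents, k = 8, 2, 4, 6          # problem size, #agents, entries sent per round
alpha, gamma = 0.05, 0.8              # gradient / consensus step sizes (assumed)
rng = np.random.default_rng(0)

# Assumed local losses f_i(X) = -tr(X^T A_i X), Euclidean gradient -2 A_i X.
A = []
for _ in range(agents):
    M = rng.standard_normal((n, n))
    A.append((M + M.T) / 2)

# Symmetric doubly stochastic mixing matrix for a ring graph (assumed topology).
W = np.zeros((agents, agents))
for i in range(agents):
    W[i, i] = 0.5
    W[i, (i - 1) % agents] += 0.25
    W[i, (i + 1) % agents] += 0.25

def proj_stiefel(Y):
    """Retraction: project Y onto St(n, p) via the polar decomposition."""
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def top_k(M, k):
    """Compressor: keep the k largest-magnitude entries of M, zero the rest."""
    thresh = np.partition(np.abs(M).ravel(), M.size - k)[M.size - k]
    return np.where(np.abs(M) >= thresh, M, 0.0)

X0 = proj_stiefel(rng.standard_normal((n, p)))
X = [X0.copy() for _ in range(agents)]        # local iterates (feasible start)
ref = [X0.copy() for _ in range(agents)]      # reference copies all agents track

for t in range(300):
    # 1) Local Euclidean gradient step (retraction is deferred to step 3).
    Xh = [X[i] + 2 * alpha * A[i] @ X[i] for i in range(agents)]
    # 2) Communication: broadcast only a compressed correction to the references.
    Q = [top_k(Xh[i] - ref[i], k) for i in range(agents)]
    ref = [ref[i] + Q[i] for i in range(agents)]
    # 3) SINGLE gossip (consensus) round on the references, then retract.
    X = [proj_stiefel(Xh[i] + gamma * sum(W[i, j] * (ref[j] - ref[i])
                                          for j in range(agents)))
         for i in range(agents)]

err = max(np.linalg.norm(X[i] - X[0]) for i in range(agents))
print(f"consensus error after 300 iterations: {err:.2e}")
```

Note the communication saving: each agent sends `k` entries per iteration instead of the full `n * p` matrix, and only one mixing round with `W` is performed, rather than the multiple rounds that multi-step consensus schemes would require.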