
Decentralized Conjugate Gradient and Memoryless BFGS Methods (2409.07122v3)

Published 11 Sep 2024 in math.OC

Abstract: This paper proposes a new decentralized conjugate gradient (NDCG) method and a decentralized memoryless BFGS (DMBFGS) method for nonconvex and strongly convex decentralized optimization, respectively, where the problem is to minimize a finite sum of continuously differentiable functions over a fixed, connected, undirected network. Gradient tracking techniques are applied in both methods to enhance their convergence properties and numerical stability. In particular, we show global convergence of NDCG with constant stepsize for general nonconvex smooth decentralized optimization. Our new DMBFGS method uses a scaled memoryless BFGS technique and requires only gradient information to approximate second-order information of the component functions in the objective. We also establish global convergence and a linear convergence rate of DMBFGS with constant stepsize for strongly convex smooth decentralized optimization. Our numerical results show that NDCG and DMBFGS are very efficient in terms of both iteration and communication cost compared with other state-of-the-art methods for solving smooth decentralized optimization.
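The abstract leans on two standard building blocks: gradient tracking, which both NDCG and DMBFGS use to estimate the network-average gradient, and a scaled memoryless BFGS update, which DMBFGS uses to approximate second-order information from gradients alone. The sketch below shows the generic gradient-tracking mechanism on a toy decentralized quadratic problem; it is not the paper's NDCG or DMBFGS iteration, and every name in it (the mixing matrix W, the stepsize alpha, the local data A_i, b_i) is an assumption chosen for illustration.

```python
# Minimal gradient-tracking sketch on a toy decentralized quadratic
# problem. Illustrative only: the paper's NDCG/DMBFGS updates differ.
import numpy as np

rng = np.random.default_rng(0)
n_agents, d = 4, 3

# Local strongly convex quadratics f_i(x) = 0.5 * ||A_i x - b_i||^2.
A = [rng.standard_normal((6, d)) for _ in range(n_agents)]
b = [rng.standard_normal(6) for _ in range(n_agents)]

def grad(i, x):
    return A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic mixing matrix for a ring network (fixed, connected,
# undirected, matching the network model in the abstract).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

alpha = 0.02                       # constant stepsize (assumed value)
x = np.zeros((n_agents, d))        # row i = agent i's iterate
y = np.array([grad(i, x[i]) for i in range(n_agents)])  # tracker init

for _ in range(500):
    x_new = W @ x - alpha * y      # consensus mixing + descent step
    # Tracker update: each y_i follows the network-average gradient.
    y = W @ y + np.array([grad(i, x_new[i]) - grad(i, x[i])
                          for i in range(n_agents)])
    x = x_new

print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
```

For the second-order side, the abstract says DMBFGS builds curvature information with a scaled memoryless BFGS technique using only gradients. A standard (centralized) form of that idea applies one BFGS update to a scaled identity H0 = tau*I with tau = s^T y / y^T y, so the direction -H g can be computed from dot products alone, with no stored matrix. The paper's exact scaling and its decentralized coupling may differ; this is the textbook version.

```python
import numpy as np

def scaled_memoryless_bfgs_direction(g, s, y):
    """Return -H g, where H is the BFGS update of H0 = tau * I from a
    single pair (s, y), with self-scaling tau = s^T y / y^T y.
    Requires the curvature condition s^T y > 0. Matrix-free."""
    sy = s @ y
    tau = sy / (y @ y)
    Hg = (tau * g
          - tau * ((y @ g) / sy) * s
          - tau * ((s @ g) / sy) * y
          + (tau * (y @ y) / sy**2 + 1.0 / sy) * (s @ g) * s)
    return -Hg

# Usage check: H is positive definite when s^T y > 0, so the result is a
# descent direction for any nonzero gradient g.
rng = np.random.default_rng(1)
g, s = rng.standard_normal(5), rng.standard_normal(5)
y = s + 0.1 * rng.standard_normal(5)   # keeps s^T y > 0 here
d = scaled_memoryless_bfgs_direction(g, s, y)
print("descent direction:", d @ g < 0)
```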
