Decentralized Conjugate Gradient and Memoryless BFGS Methods (2409.07122v3)
Abstract: This paper proposes a new decentralized conjugate gradient (NDCG) method and a decentralized memoryless BFGS (DMBFGS) method for nonconvex and strongly convex decentralized optimization problems, respectively, in which a finite sum of continuously differentiable functions is minimized over a fixed, connected, undirected network. Both methods apply gradient tracking techniques to enhance their convergence properties and numerical stability. In particular, we show global convergence of NDCG with a constant stepsize for general nonconvex smooth decentralized optimization. The new DMBFGS method uses a scaled memoryless BFGS technique and requires only gradient information to approximate second-order information about the component functions of the objective. We also establish global convergence and a linear convergence rate for DMBFGS with a constant stepsize for strongly convex smooth decentralized optimization. Our numerical results show that NDCG and DMBFGS are highly efficient in terms of both iteration and communication costs compared with other state-of-the-art methods for smooth decentralized optimization.
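
To make the two building blocks named in the abstract concrete, the sketch below illustrates (i) a generic gradient-tracking iteration and (ii) a scaled memoryless BFGS direction built from a single curvature pair using only inner products, so no n-by-n matrix is ever stored. This is a minimal illustration of the generic techniques under stated assumptions, not the paper's exact NDCG or DMBFGS updates; the mixing matrix `W`, the stepsize `alpha`, and all function names here are assumptions introduced for the example.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Direction d = -H g, where H is the BFGS update of the scaled
    identity gamma*I built from only the most recent pair (s, y).
    Illustrative sketch; the paper's DMBFGS direction may differ."""
    sy = s @ y
    if sy <= 1e-12:                 # safeguard: no useful curvature
        return -g                   # fall back to steepest descent
    gamma = sy / (y @ y)            # self-scaling of the initial matrix
    a = (s @ g) / sy
    # Expand H g so only inner products appear, with
    # H = gamma*(I - s y^T/sy)(I - y s^T/sy) + s s^T/sy:
    Hg = gamma * (g - a * y - ((y @ g - a * (y @ y)) / sy) * s) + a * s
    return -Hg

def gradient_tracking(W, grads, x0, alpha=0.05, iters=200):
    """Gradient-tracking loop: each agent i mixes neighbors' iterates
    through the doubly stochastic matrix W and maintains a tracker t_i
    of the network-average gradient. A plain tracked-gradient step is
    shown; NDCG/DMBFGS would replace -alpha*t[i] with their own
    conjugate-gradient or quasi-Newton directions."""
    m = len(grads)                          # number of agents
    x = [x0.copy() for _ in range(m)]
    g_old = [grads[i](x[i]) for i in range(m)]
    t = [g.copy() for g in g_old]           # trackers start at local gradients
    for _ in range(iters):
        # consensus step on the iterates plus a step along the tracker
        x = [sum(W[i, j] * x[j] for j in range(m)) - alpha * t[i]
             for i in range(m)]
        t_new = []
        for i in range(m):
            g_new = grads[i](x[i])
            # t_i <- sum_j W_ij t_j + grad_i(x_i^new) - grad_i(x_i^old)
            t_new.append(sum(W[i, j] * t[j] for j in range(m))
                         + g_new - g_old[i])
            g_old[i] = g_new
        t = t_new
    return x
```

As a usage sketch, in a decentralized least-squares test each `grads[i]` would return `A_i.T @ (A_i @ x - b_i)` for local data `(A_i, b_i)`; the consensus step drives the agents' iterates together while the tracking step keeps each `t[i]` following the average gradient of the whole sum, which is what allows constant-stepsize convergence in gradient-tracking methods.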