Communication Efficient Distributed Newton Method with Fast Convergence Rates (2305.17945v1)

Published 29 May 2023 in math.OC

Abstract: We propose a communication- and computation-efficient second-order method for distributed optimization. Each iteration of our method requires only $\mathcal{O}(d)$ communication complexity, where $d$ is the problem dimension. We also provide a theoretical analysis showing that the proposed method has a convergence rate similar to that of classical second-order optimization algorithms. Concretely, our method finds $\big(\epsilon, \sqrt{dL\epsilon}\,\big)$-second-order stationary points of nonconvex problems in $\mathcal{O}\big(\sqrt{dL}\,\epsilon^{-3/2}\big)$ iterations, where $L$ is the Lipschitz constant of the Hessian. Moreover, it enjoys local superlinear convergence under the strong convexity assumption. Experiments on both convex and nonconvex problems show that the proposed method performs significantly better than the baselines.
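The abstract does not spell out the algorithm, but a standard way to achieve $\mathcal{O}(d)$ communication per round in a distributed Newton-type method is to exchange only $d$-dimensional vectors (gradients and Hessian-vector products inside an inexact Newton-CG solve) rather than $\mathcal{O}(d^2)$ Hessian blocks. For context, an $(\epsilon_g, \epsilon_H)$-second-order stationary point is conventionally a point $x$ with $\|\nabla f(x)\| \le \epsilon_g$ and $\lambda_{\min}(\nabla^2 f(x)) \ge -\epsilon_H$. Below is a minimal sketch of the generic Hessian-vector-product pattern on a toy least-squares problem; all names are hypothetical, and this is an illustration of the communication idea, not the paper's method.

```python
# Hypothetical sketch: distributed Newton-CG where every exchanged message
# is a d-dimensional vector (gradient or Hessian-vector product), so each
# communication round costs O(d). Not the algorithm from the paper.
import numpy as np

def newton_cg_step(hvp, grad, tol=1e-8, max_iter=50):
    """Solve H s = -grad by conjugate gradient, using only Hessian-vector
    products; each hvp call corresponds to one O(d) communication round."""
    s = np.zeros_like(grad)
    r = -grad.copy()              # residual r = -grad - H @ s with s = 0
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Hp = hvp(p)
        alpha = rs_old / (p @ Hp)
        s += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return s

# Toy quadratic objective f(x) = ||A x - b||^2 / (2 n), sharded over m workers.
rng = np.random.default_rng(0)
d, n, m = 20, 200, 4
A = rng.normal(size=(n, d))
b = rng.normal(size=n)
shards = np.array_split(np.arange(n), m)

def local_grad(i, x):             # worker i sends a d-vector: O(d)
    Ai, bi = A[shards[i]], b[shards[i]]
    return Ai.T @ (Ai @ x - bi) / n

def local_hvp(i, v):              # worker i sends H_i @ v, also a d-vector
    Ai = A[shards[i]]
    return Ai.T @ (Ai @ v) / n

x = np.zeros(d)
for t in range(5):
    g = sum(local_grad(i, x) for i in range(m))
    hvp = lambda v: sum(local_hvp(i, v) for i in range(m))
    x = x + newton_cg_step(hvp, g)
    print(f"iter {t}: ||grad|| = {np.linalg.norm(g):.3e}")
```

Note that in this pattern the per-iteration cost is $\mathcal{O}(d)$ per CG step rather than per outer iteration, so the paper's flat $\mathcal{O}(d)$ bound per iteration presumably comes from a different mechanism; the sketch only illustrates why vector-valued messages avoid the $\mathcal{O}(d^2)$ cost of shipping full Hessians.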

Citations (5)
