Subspace Quasi-Newton Method with Gradient Approximation (2406.01965v1)

Published 4 Jun 2024 in math.OC

Abstract: In recent years, various subspace algorithms have been developed to handle large-scale optimization problems. Although existing subspace Newton methods require fewer iterations to converge in practice, their matrix operations and full gradient computations become bottlenecks at large scale. We propose a subspace quasi-Newton method that is restricted to a deterministic subspace and uses a gradient approximation based on random matrix theory. Our method requires neither full gradients nor Hessian matrices, yet it achieves, in expectation, the same order of worst-case iteration complexity as existing subspace methods for both convex and nonconvex cases. Numerical experiments confirm the advantage of our algorithm in computation time.
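The abstract describes the method only at a high level, so the following is a minimal sketch of the general idea rather than the paper's algorithm: a fixed (deterministic) subspace basis P, a finite-difference estimate of the gradient restricted to that subspace (standing in for the paper's random-matrix-theory-based gradient approximation, which is not specified here), and a BFGS update maintained in the subspace so that no full gradient or Hessian is ever formed. All names (subspace_qn_sketch, sub_grad, the step size lr, the difference step h) are illustrative assumptions, not from the paper.

```python
import numpy as np

def subspace_qn_sketch(f, x0, d=10, h=1e-6, lr=1.0, iters=100, seed=0):
    """Illustrative subspace quasi-Newton loop (a sketch, not the paper's method).

    - P: a fixed n x d orthonormal subspace basis, chosen once (deterministic
      across iterations after construction).
    - The subspace gradient is estimated with forward finite differences,
      so no full gradient of f is computed.
    - B: a d x d BFGS approximation to the Hessian restricted to the subspace.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    rng = np.random.default_rng(seed)
    P, _ = np.linalg.qr(rng.standard_normal((n, d)))  # fixed subspace basis
    B = np.eye(d)                                     # initial subspace Hessian model
    fx = f(x)

    def sub_grad(x, fx):
        # Directional forward differences: d extra function evaluations,
        # yielding an estimate of P^T grad f(x).
        return np.array([(f(x + h * P[:, i]) - fx) / h for i in range(d)])

    g = sub_grad(x, fx)
    for _ in range(iters):
        p = -np.linalg.solve(B, g)   # quasi-Newton direction in the subspace
        s = lr * p
        x_new = x + P @ s            # map the subspace step back to R^n
        fx_new = f(x_new)
        g_new = sub_grad(x_new, fx_new)
        y, sy = g_new - g, s @ (g_new - g)
        if sy > 1e-12:               # standard BFGS curvature safeguard
            Bs = B @ s
            B = B + np.outer(y, y) / sy - np.outer(Bs, Bs) / (s @ Bs)
        x, fx, g = x_new, fx_new, g_new
    return x, fx

# Example: minimize a simple quadratic in R^1000 using only function values.
A = np.diag(np.linspace(1.0, 10.0, 1000))
x_star, f_star = subspace_qn_sketch(lambda x: 0.5 * x @ (A @ x), np.ones(1000))
```

Per iteration this costs d + 1 function evaluations and O(d^2) linear algebra in the subspace instead of O(n) full-gradient work, which is the cost profile the abstract emphasizes.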

Citations (1)
