
Sub-sampled Trust-Region Methods with Deterministic Worst-Case Complexity Guarantees (2507.17556v1)

Published 23 Jul 2025 in math.OC

Abstract: In this paper, we develop and analyze sub-sampled trust-region methods for solving finite-sum optimization problems. These methods employ subsampling strategies to approximate the gradient and Hessian of the objective function, significantly reducing the overall computational cost. We propose a novel adaptive procedure for deterministically adjusting the sample size used for gradient (or gradient and Hessian) approximations. Furthermore, we establish worst-case iteration complexity bounds for obtaining approximate stationary points. More specifically, for given $\varepsilon_g, \varepsilon_H \in (0,1)$, it is shown that an $\varepsilon_g$-approximate first-order stationary point is reached in at most $\mathcal{O}(\varepsilon_g^{-2})$ iterations, whereas an $(\varepsilon_g,\varepsilon_H)$-approximate second-order stationary point is reached in at most $\mathcal{O}(\max\{\varepsilon_g^{-2}\varepsilon_H^{-1},\varepsilon_H^{-3}\})$ iterations. Finally, numerical experiments illustrate the effectiveness of our new subsampling technique.
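The abstract describes the general template: at each iteration, the gradient (and possibly the Hessian) is estimated from a subsample of the finite sum, a trust-region subproblem is solved with these estimates, and the step is accepted or rejected based on a reduction ratio. The sketch below illustrates one such iteration in Python; the uniform sampling, Cauchy-point subproblem solver, and fixed acceptance/radius constants are illustrative assumptions and do not reproduce the paper's deterministic adaptive sample-size procedure.

```python
# Minimal sketch of one sub-sampled trust-region iteration for a finite-sum
# objective f(x) = (1/n) * sum_i f_i(x). Illustrative only: the uniform
# sampling, Cauchy-point subproblem solver, and fixed acceptance/radius
# constants are assumptions, not the paper's adaptive deterministic rule.
import numpy as np


def subsampled_tr_step(x, fi, grad_fi, hess_fi, n, sample_size, radius, rng):
    """One trust-region step using a sampled gradient and Hessian.

    fi(x, i), grad_fi(x, i), hess_fi(x, i) evaluate the i-th component
    function, its gradient, and its Hessian (hypothetical callables).
    """
    # Subsample component indices and form gradient/Hessian estimates.
    idx = rng.choice(n, size=sample_size, replace=False)
    g = np.mean([grad_fi(x, i) for i in idx], axis=0)
    H = np.mean([hess_fi(x, i) for i in idx], axis=0)

    gnorm = np.linalg.norm(g)
    if gnorm < 1e-12:          # sampled gradient is (numerically) zero
        return x, radius

    # Approximate trust-region subproblem solution via the Cauchy point:
    # minimize g^T s + 0.5 s^T H s over s = -t g with ||s|| <= radius.
    gHg = g @ H @ g
    t_max = radius / gnorm
    t = min(gnorm**2 / gHg, t_max) if gHg > 0 else t_max
    s = -t * g

    # Reduction ratio on the full objective (a sub-sampled method could
    # also estimate f itself from a sample).
    f = lambda z: np.mean([fi(z, i) for i in range(n)])
    predicted = -(g @ s + 0.5 * s @ H @ s)
    actual = f(x) - f(x + s)
    rho = actual / predicted if predicted > 0 else -np.inf

    # Standard trust-region acceptance test and radius update.
    if rho >= 0.1:
        x = x + s
        radius = min(2.0 * radius, 100.0)
    else:
        radius *= 0.5
    return x, radius
```

A driver loop would call this step repeatedly; the paper's contribution is a deterministic rule for adjusting `sample_size` along the way, together with the stated $\mathcal{O}(\varepsilon_g^{-2})$ and $\mathcal{O}(\max\{\varepsilon_g^{-2}\varepsilon_H^{-1},\varepsilon_H^{-3}\})$ worst-case iteration bounds.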
