
First- and Second-Order High Probability Complexity Bounds for Trust-Region Methods with Noisy Oracles (2205.03667v3)

Published 7 May 2022 in math.OC

Abstract: In this paper, we present convergence guarantees for a modified trust-region method designed to minimize objective functions whose value, gradient, and Hessian estimates are computed with noise. These estimates are produced by generic stochastic oracles, which are not assumed to be unbiased or consistent. We introduce these oracles and show that they are more general, and rest on more relaxed assumptions, than the stochastic oracles used in the prior literature on stochastic trust-region methods. Our method uses a relaxed step-acceptance criterion and a cautious trust-region radius updating strategy, which together allow us to derive exponentially decaying tail bounds on the iteration complexity for convergence to points satisfying approximate first- and second-order optimality conditions. Finally, we present two sets of numerical results: we first explore the tightness of our theoretical bounds on an example with adversarial zeroth- and first-order oracles, and we then investigate the performance of the modified trust-region algorithm on standard noisy derivative-free optimization problems.
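To make the two algorithmic modifications in the abstract concrete, here is a minimal sketch of a single trust-region iteration with noisy oracles. It is not the paper's algorithm: the Cauchy-point subproblem solver, the `eps_f` noise cushion in the acceptance test, the radius-vs-gradient expansion condition, and all names and parameter values (`eta`, `gamma`, the oracle signatures) are illustrative assumptions, chosen only to show how a relaxed acceptance criterion and a cautious radius update fit together.

```python
import numpy as np

def noisy_tr_step(x, f_oracle, g_oracle, h_oracle, delta,
                  eta=0.1, eps_f=1e-3, gamma=2.0):
    """One trust-region iteration with noisy oracles (illustrative sketch).

    f_oracle, g_oracle, h_oracle return noisy estimates of f, its gradient,
    and its Hessian; they need not be unbiased or consistent. eps_f is a
    hypothetical cushion absorbing zeroth-order noise in the acceptance test.
    """
    g = g_oracle(x)
    H = h_oracle(x)

    # Cauchy-point minimizer of the quadratic model over the trust region
    # (a simple, standard subproblem solver; the paper's may differ).
    gHg = g @ H @ g
    gnorm = np.linalg.norm(g)
    tau = min(1.0, gnorm**3 / (delta * gHg)) if gHg > 0 else 1.0
    s = -tau * delta * g / (gnorm + 1e-16)

    # Relaxed acceptance: compare the noisy estimated decrease against the
    # predicted model decrease, padded by eps_f to tolerate oracle noise.
    pred = -(g @ s + 0.5 * s @ H @ s)      # predicted model decrease
    ared = f_oracle(x) - f_oracle(x + s)   # noisy actual decrease
    if ared + 2.0 * eps_f >= eta * pred:
        # Cautious radius update: expand only when the gradient estimate
        # is large relative to the current radius; otherwise hold steady.
        delta_new = gamma * delta if gnorm >= delta else delta
        return x + s, delta_new
    return x, delta / gamma  # rejected step: shrink the radius

# Illustrative use on a noisy quadratic:
rng = np.random.default_rng(0)
f_noisy = lambda x: 0.5 * float(x @ x) + 1e-3 * rng.standard_normal()
g_noisy = lambda x: x + 1e-3 * rng.standard_normal(x.shape)
h_noisy = lambda x: np.eye(x.size)
x, delta = np.ones(5), 1.0
for _ in range(50):
    x, delta = noisy_tr_step(x, f_noisy, g_noisy, h_noisy, delta)
```

The key design point visible in the sketch is that both modifications guard against noise in opposite directions: the `eps_f` cushion keeps good steps from being rejected because of an unlucky function estimate, while the cautious expansion rule keeps a lucky estimate from inflating the radius too aggressively.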

