About some works of Boris Polyak on convergence of gradient methods and their development (2311.16743v2)

Published 28 Nov 2023 in math.OC

Abstract: The paper reviews the state of the art in subgradient and accelerated methods of convex optimization, including their behavior in the presence of disturbances and with access to different kinds of information about the objective function (function values, gradients, stochastic gradients, higher derivatives). For nonconvex problems, the Polyak-Lojasiewicz condition is considered and the main results are surveyed. The behavior of numerical methods in the presence of sharp minima is also discussed. The purpose of this survey is to show the influence of the works of B.T. Polyak (1935 -- 2023) on gradient optimization methods and related topics on the modern development of numerical optimization methods.
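
As context for the abstract, here is a minimal sketch (standard textbook formulation, not quoted from the paper itself) of the Polyak-Lojasiewicz (PL) condition and the linear convergence rate it yields for plain gradient descent on an L-smooth function f with minimum value f^*:

\[
  \underbrace{\tfrac{1}{2}\,\lVert \nabla f(x) \rVert^2 \;\ge\; \mu\,\bigl(f(x) - f^\ast\bigr)}_{\text{PL condition, some } \mu > 0}
  \quad\text{and}\quad
  x_{k+1} = x_k - \tfrac{1}{L}\,\nabla f(x_k)
  \;\Longrightarrow\;
  f(x_{k+1}) - f^\ast \;\le\; \Bigl(1 - \tfrac{\mu}{L}\Bigr)\bigl(f(x_k) - f^\ast\bigr).
\]

The PL inequality does not require convexity of f, which is why it is central to the nonconvex results the survey covers.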

Citations (1)
