Convergence of First-Order Algorithms with Momentum from the Perspective of an Inexact Gradient Descent Method (2505.03050v1)
Abstract: This paper introduces a novel inexact gradient descent method with momentum (IGDm), which serves as a general framework for various first-order methods with momentum. These include, in particular, the inexact proximal point method (IPPm), the extragradient method (EGm), and sharpness-aware minimization (SAMm). Asymptotic convergence properties of IGDm are established under both global and local assumptions on the objective functions, together with constructive convergence rates that depend on Polyak-Łojasiewicz-Kurdyka (PLK) conditions imposed on the objective function. Global convergence of EGm and SAMm for general smooth functions and of IPPm for weakly convex functions is derived in this way. Moreover, local convergence properties of EGm and SAMm for locally smooth functions, as well as of IPPm for prox-regular functions, are established. Numerical experiments on derivative-free optimization problems are conducted to confirm the efficiency of the momentum effects in the developed methods under inexact gradient computations.
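As a hedged illustration only (this generic scheme and the symbols $t_k$, $\beta_k$, $g_k$, $\varepsilon_k$ are assumptions introduced here and need not coincide with the paper's exact IGDm iteration), an inexact gradient step combined with heavy-ball momentum can be sketched as
\[
x_{k+1} = x_k - t_k\, g_k + \beta_k\,(x_k - x_{k-1}), \qquad \|g_k - \nabla f(x_k)\| \le \varepsilon_k,
\]
where $t_k > 0$ is a stepsize, $\beta_k \in [0,1)$ a momentum parameter, and $\varepsilon_k$ a tolerance on the gradient error; the paper's framework recovers momentum variants of other first-order methods by particular choices of the approximate gradient $g_k$.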