Adaptive Accelerated Gradient Method for Smooth Convex Optimization (2512.20478v1)
Published 23 Dec 2025 in math.OC
Abstract: We propose an adaptive accelerated gradient method for solving smooth convex optimization problems. The method determines the step size adaptively through a local estimate of the smoothness constant, which is assumed to be unknown, without resorting to line search procedures. The sequence generated by this method converges weakly to a minimizer of the objective function, and the function values converge at a fast rate of $\mathcal{O}\left( \frac{1}{k^2} \right)$. Moreover, if the objective function is strongly convex, the function values converge at a linear rate.
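The abstract does not spell out the update rules, so the following is a minimal sketch of the kind of scheme it describes: a Nesterov-style momentum recursion whose step size comes from a locally estimated smoothness constant rather than a line search. The momentum schedule, the estimator $L_k \approx \|\nabla f(y_k) - \nabla f(x_k)\| / \|y_k - x_k\|$, and all names (`adaptive_accelerated_gradient`, `grad`, `L0`) are illustrative assumptions, not the paper's actual method or notation.

```python
import numpy as np

def adaptive_accelerated_gradient(grad, x0, iters=500, L0=1.0):
    """Sketch of an adaptive accelerated gradient method.

    Assumptions (not from the paper): classical Nesterov momentum,
    and a local smoothness estimate from observed gradient variation
    in place of a known Lipschitz constant or a line search.
    """
    x, x_prev = x0.copy(), x0.copy()
    t, L = 1.0, L0
    for _ in range(iters):
        # Nesterov extrapolation point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t**2)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        g = grad(y)
        # Local estimate of the smoothness constant from two gradients
        # (a common heuristic; the paper's estimator may differ).
        d = y - x
        if np.linalg.norm(d) > 1e-12:
            L = max(L0, np.linalg.norm(g - grad(x)) / np.linalg.norm(d))
        x_prev = x
        x = y - g / L  # gradient step with adaptive step size 1/L
        t = t_next
    return x

# Example: minimize the smooth convex quadratic f(x) = 0.5 * x^T A x
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M.T @ M  # positive semidefinite, so f is smooth and convex
x_min = adaptive_accelerated_gradient(lambda x: A @ x, rng.standard_normal(20))
```

Under such a scheme the estimate of $L$ only ever grows toward the true smoothness constant, so the step size $1/L$ stays safe without the extra function evaluations a backtracking line search would require.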