Nesterov's Accelerated Proximal Gradient Method with Backtracking for Multiobjective Optimization (2507.06737v1)
Published 9 Jul 2025 in math.OC
Abstract: In this paper, we propose a novel extrapolation coefficient scheme within the Nesterov framework and develop an accelerated proximal gradient algorithm. We establish that the algorithm achieves a sublinear convergence rate. The proposed scheme only requires the Lipschitz constant estimate sequence to satisfy mild initial conditions, under which a key equality property can be derived to support the convergence analysis. Numerical experiments are provided to demonstrate the effectiveness and practical performance of the proposed method.
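The paper's multiobjective scheme and its novel extrapolation coefficients are not reproduced here. As a generic illustration of the single-objective building blocks it accelerates, here is a minimal FISTA-style sketch of an accelerated proximal gradient method with backtracking on the Lipschitz-constant estimate, applied to a composite objective f + g. All function names and parameters are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (used when g is the l1 norm).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def accel_prox_grad(grad_f, f, prox_g, x0, L0=1.0, eta=2.0,
                    max_iter=200, tol=1e-8):
    """Accelerated proximal gradient with backtracking (FISTA-style sketch).

    Minimizes f(x) + g(x), where f is smooth (grad_f, f given) and g has an
    inexpensive proximal operator prox_g(v, step). L0 is an initial estimate
    of the Lipschitz constant of grad_f; eta > 1 is the backtracking factor.
    """
    x = x0.copy()
    y = x0.copy()   # extrapolated point
    t = 1.0         # momentum parameter
    L = L0
    for _ in range(max_iter):
        gy = grad_f(y)
        fy = f(y)
        # Backtracking: grow L until the quadratic upper bound of f
        # holds at the candidate prox point.
        while True:
            x_new = prox_g(y - gy / L, 1.0 / L)
            d = x_new - y
            if f(x_new) <= fy + gy @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        # Nesterov extrapolation update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x, t = x_new, t_new
    return x
```

For example, with f(x) = 0.5 * ||x - b||^2 and g(x) = ||x||_1, the minimizer is the soft-thresholding of b, which the sketch recovers quickly. The backtracking loop only requires an initial estimate L0 and monotonically increases L, mirroring the mild conditions on the Lipschitz-estimate sequence mentioned in the abstract.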