Time Scaling Makes Accelerated Gradient Flow and Proximal Method Faster in Multiobjective Optimization (2508.07254v1)
Abstract: This paper extends a class of single-objective gradient flows and accelerated proximal methods to multiobjective optimization in Euclidean spaces. The proposed gradient flow is a second-order differential equation comprising a second-order term, a first-order term with asymptotically vanishing damping, and a gradient term with time scaling. We prove the existence of trajectory solutions to the equation and, through Lyapunov analysis, show that with appropriate parameter choices the trajectories achieve a sublinear convergence rate faster than $O(1/t^2)$. For the proposed proximal algorithm, we similarly obtain a sublinear convergence rate faster than $O(1/k^2)$.
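To make the dynamics concrete, here is a minimal numerical sketch of the kind of system the abstract describes: a second-order ODE with $\alpha/t$ vanishing damping and a time-scaled gradient term, driven by the standard multiobjective steepest-descent direction (the min-norm element of the convex hull of the objective gradients). The specific form $\ddot{x}(t) + \frac{\alpha}{t}\dot{x}(t) + \beta(t)\,d(x(t)) = 0$ with $\beta(t) = t^{\gamma}$, the parameter values, the toy bi-objective problem, and the integrator are all illustrative assumptions; the paper's actual equation and parameter conditions may differ.

```python
import numpy as np

# Toy bi-objective problem (assumed for illustration):
# f_i(x) = 0.5 * ||x - A_i||^2, whose Pareto set is the segment [A1, A2].
A1, A2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

def grads(x):
    """Gradients of the two objectives at x."""
    return x - A1, x - A2

def min_norm_direction(g1, g2):
    """Min-norm element of conv{g1, g2}: the common descent direction
    used in multiobjective steepest descent (closed form for 2 objectives)."""
    diff = g1 - g2
    denom = diff @ diff
    lam = 0.5 if denom < 1e-12 else np.clip((g2 @ (g2 - g1)) / denom, 0.0, 1.0)
    return lam * g1 + (1.0 - lam) * g2

# Hypothesized dynamics (assumed form, not quoted from the paper):
#   x''(t) + (alpha / t) * x'(t) + t**gamma * d(x(t)) = 0
alpha, gamma = 3.0, 0.5   # illustrative parameter choices
t, h = 1.0, 1e-3          # start time t0 > 0 and step size
x, v = np.array([2.0, 2.0]), np.zeros(2)

for _ in range(20_000):
    d = min_norm_direction(*grads(x))
    # Semi-implicit Euler: update velocity from the ODE, then position.
    v += h * (-(alpha / t) * v - t**gamma * d)
    x += h * v
    t += h

print("approximate Pareto-critical point:", x)  # lands near the segment [A1, A2]
```

Under these assumptions the trajectory converges to a Pareto-critical point; the growing weight $t^{\gamma}$ on the gradient term is what the title's "time scaling" refers to, and it is the mechanism the paper credits for rates beyond $O(1/t^2)$.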