Multiobjective Accelerated Gradient-like Flow with Asymptotic Vanishing Normalized Gradient (2507.20183v1)
Abstract: In this paper, we extend the gradient system with a unit-norm gradient term proposed by Wang et al.~\cite{wang2021search} to multiobjective optimization, studying the following system: $$ \ddot x(t)+\frac{\alpha}{t}\dot x(t)+\frac{\alpha-\beta}{t^p}\frac{\|\dot x(t)\|}{\|\proj_{C(x(t))}(0)\|}\proj_{C(x(t))}(0)+\proj_{C(x(t))}(-\ddot x(t))=0 $$ where $C(x(t))=\mathbf{conv}\{\nabla f_i(x(t)):i=1,\cdots,m\}$, the $f_i:\mathbb{R}^n\to\mathbb{R}$ are continuously differentiable convex functions, and $\alpha\ge\beta\ge 3$. Under certain assumptions, we establish the existence of trajectory solutions for this system. Using a merit function, we characterize the convergence of the trajectory solutions: for $p>1$ and $\alpha>\beta\ge 3$, we obtain a convergence rate of $O(1/t^2)$; when, in addition, $\beta>3$, the trajectory solutions converge to a weak Pareto solution of the multiobjective optimization problem $\min_{x}(f_1(x),\cdots,f_m(x))^\top$. For $p=1$ and $\alpha>\beta\ge 3$, we derive a convergence rate of $O(\ln^2 t/t^2)$. We further generalize Wang et al.'s FISC-nes algorithm to multiobjective optimization, achieving a convergence rate of $O(\ln^2 k/k^2)$. Numerical experiments demonstrate that our system and algorithm exhibit competitive performance.
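The driving term of the system is $\proj_{C(x)}(0)$, the minimum-norm element of the convex hull of the objective gradients (the common steepest-descent direction in multiobjective optimization). As a minimal illustrative sketch, not the paper's implementation: for $m=2$ objectives this projection has a closed form, since minimizing $\|t\,g_1+(1-t)\,g_2\|^2$ over $t\in[0,1]$ is a one-dimensional quadratic. The helper name `min_norm_in_hull` is hypothetical; the general $m$-gradient case would instead require a small quadratic program over the simplex.

```python
import numpy as np

def min_norm_in_hull(g1, g2):
    """Minimum-norm element of conv{g1, g2}, i.e. proj_{C(x)}(0) for m = 2.

    Minimizes ||t*g1 + (1-t)*g2||^2 over t in [0, 1]. Writing the combination
    as g2 + t*(g1 - g2), setting the derivative to zero gives
    t = -<g2, g1 - g2> / ||g1 - g2||^2, clipped to [0, 1].
    """
    d = g1 - g2
    denom = float(d @ d)
    if denom == 0.0:  # gradients coincide: the hull is a single point
        return g1.copy()
    t = float(np.clip(-(g2 @ d) / denom, 0.0, 1.0))
    return t * g1 + (1.0 - t) * g2
```

For example, with $g_1=(1,0)$ and $g_2=(0,1)$ the minimizer is the midpoint $(0.5,0.5)$; if one gradient dominates the other in the same direction, the clipping returns the shorter gradient, and a zero result certifies that $0\in C(x)$, i.e. a Pareto-critical point.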