Smoothing Accelerated Proximal Gradient Method with Fast Convergence Rate for Nonsmooth Multi-objective Optimization
Abstract: This paper proposes a Smoothing Accelerated Proximal Gradient Method with an extrapolation term (SAPGM) for nonsmooth multi-objective optimization. By combining smoothing techniques with the accelerated algorithm for multi-objective optimization of Tanabe et al., our method achieves a fast convergence rate. Specifically, we establish that the convergence rate of the proposed method can be improved to $o(\ln^{\sigma} k / k)$ by incorporating an extrapolation term $\frac{k-1}{k + \alpha - 1}$ with $\alpha > 3$. Moreover, we prove that the sequence of iterates converges to a Pareto optimal solution of the original problem. Furthermore, we present an effective strategy for solving the subproblem via its dual representation and validate the efficacy of the proposed method through a series of numerical experiments.
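To illustrate the extrapolation term $\frac{k-1}{k+\alpha-1}$ mentioned in the abstract, here is a minimal single-objective sketch of an accelerated proximal gradient loop using that coefficient. This is an assumption-laden toy version: the paper's actual method handles multiple objectives through a merit-function subproblem and a smoothing parameter update, neither of which is shown here. The names `sapgm_sketch`, `grad_f`, and `prox_g` are illustrative, not from the paper.

```python
import numpy as np

def sapgm_sketch(x0, grad_f, prox_g, step, alpha=4.0, iters=200):
    # Accelerated proximal gradient with extrapolation weight
    # (k - 1) / (k + alpha - 1), alpha > 3, as stated in the abstract.
    # Single-objective toy version only; the paper's SAPGM solves a
    # multi-objective subproblem at each iteration.
    x_prev = x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        beta = (k - 1) / (k + alpha - 1)   # extrapolation coefficient
        y = x + beta * (x - x_prev)        # extrapolated point
        x_prev, x = x, prox_g(y - step * grad_f(y), step)
    return x

# Toy problem: minimize 0.5*(x - 1)^2 + |x|, whose minimizer is x = 0.
grad_f = lambda x: x - 1.0                                   # gradient of the smooth part
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # l1 prox (soft-thresholding)
x_star = sapgm_sketch(np.array([5.0]), grad_f, prox_g, step=1.0)
```

On this toy problem the optimality condition $x - 1 + \partial|x| \ni 0$ is satisfied at $x = 0$, so the iterates should approach zero.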