Smoothing Accelerated Proximal Gradient Method with Fast Convergence Rate for Nonsmooth Multi-objective Optimization (2312.01609v5)

Published 4 Dec 2023 in math.OC

Abstract: This paper proposes a Smoothing Accelerated Proximal Gradient Method with Extrapolation Term (SAPGM) for nonsmooth multiobjective optimization. By combining smoothing methods with the accelerated algorithm for multiobjective optimization by Tanabe et al., our method achieves a fast convergence rate. Specifically, we establish that the convergence rate of the proposed method can be enhanced to $o(\ln^{\sigma} k/k)$ by incorporating an extrapolation term $\frac{k-1}{k+\alpha-1}$ with $\alpha > 3$. Moreover, we prove that the iterate sequence converges to a Pareto optimal solution of the primal problem. Furthermore, we present an effective strategy for solving the subproblem through its dual representation, and we validate the efficacy of the proposed method through a series of numerical experiments.
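The abstract's key ingredient, the extrapolation coefficient $\frac{k-1}{k+\alpha-1}$ with $\alpha > 3$, can be illustrated on a simpler single-objective analog. The sketch below is not the paper's SAPGM (which handles multiple objectives and a smoothing parameter); it is a minimal accelerated proximal gradient loop for L1-regularized least squares, assuming that extrapolation rule, with a hypothetical default `alpha=4.0`:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accel_prox_grad(A, b, lam, alpha=4.0, step=None, iters=500):
    """Accelerated proximal gradient for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    using the extrapolation coefficient (k-1)/(k + alpha - 1), alpha > 3,
    as in the abstract. Single-objective illustration only."""
    n = A.shape[1]
    if step is None:
        # 1/L, where L = ||A||_2^2 is the gradient's Lipschitz constant
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    x_prev = x.copy()
    for k in range(1, iters + 1):
        beta = (k - 1) / (k + alpha - 1)   # extrapolation term from the abstract
        y = x + beta * (x - x_prev)        # extrapolated point
        grad = A.T @ (A @ y - b)           # gradient of the smooth part at y
        x_prev = x
        x = soft_threshold(y - step * grad, step * lam)  # proximal step
    return x
```

With `A` the identity the solution is `soft_threshold(b, lam)`, which the iteration reaches in one step; the choice `alpha > 3` is exactly the condition the abstract imposes to obtain the $o(\ln^{\sigma} k/k)$ rate.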
