
An effective subgradient algorithm via Mifflin's line search for nonsmooth nonconvex multiobjective optimization (2406.14905v1)

Published 21 Jun 2024 in math.OC

Abstract: We propose a descent subgradient algorithm for unconstrained nonsmooth nonconvex multiobjective optimization problems. To find a descent direction, we present an iterative process that efficiently approximates the Goldstein subdifferential of each objective function. To this end, we develop a new variant of Mifflin's line search in which the subgradients are arbitrary, and whose finite convergence is proved under a semismooth assumption. To reduce the number of subgradient evaluations, we employ a backtracking line search that identifies the objectives requiring an improvement in the current approximation of the Goldstein subdifferential; for the remaining objectives, no new subgradients are computed. Unlike bundle-type methods, the proposed approach can handle nonconvexity without the need for algorithmic adjustments. Moreover, the quadratic subproblems have a simple structure, and hence the method is easy to implement. We analyze the global convergence of the proposed method and prove that any accumulation point of the generated sequence satisfies a necessary Pareto optimality condition. Furthermore, our convergence analysis addresses a theoretical challenge in a recently developed subgradient method. Through numerical experiments on a diverse range of nonsmooth test problems, we demonstrate the practical capability and efficiency of the proposed method.
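To make the ingredients of the abstract concrete, the following is a minimal, hypothetical single-objective sketch (not the authors' multiobjective method): it approximates the Goldstein ε-subdifferential by sampling subgradients in an ε-ball, extracts the minimum-norm element of their convex hull via Frank–Wolfe (standing in for the paper's quadratic subproblem), and takes an Armijo backtracking step in place of Mifflin's line search. The test function `f(x) = |x1| + 2|x2|` and all parameter values are illustrative assumptions.

```python
# Hypothetical sketch of one Goldstein-subdifferential descent step
# (single-objective simplification; not the paper's actual algorithm).
import random

def f(x):
    # Illustrative nonsmooth convex test function, f(x) = |x1| + 2|x2|.
    return abs(x[0]) + 2.0 * abs(x[1])

def subgrad(x):
    # An arbitrary subgradient of f at x (sign convention at kinks is free).
    s = lambda t: 1.0 if t >= 0 else -1.0
    return [s(x[0]), 2.0 * s(x[1])]

def sample_goldstein(x, eps, m, rng):
    # Approximate the Goldstein eps-subdifferential by collecting
    # subgradients at m random points in the eps-ball around x.
    gs = [subgrad(x)]
    for _ in range(m):
        y = [xi + rng.uniform(-eps, eps) for xi in x]
        gs.append(subgrad(y))
    return gs

def min_norm_element(gs, iters=200):
    # Frank-Wolfe on the simplex: minimize ||sum_i lam_i * g_i||^2,
    # a simple stand-in for the quadratic subproblem.
    n, dim = len(gs), len(gs[0])
    lam = [1.0 / n] * n
    for k in range(iters):
        v = [sum(lam[i] * gs[i][j] for i in range(n)) for j in range(dim)]
        # Partial derivative w.r.t. lam_i is 2<g_i, v>; shift mass to the minimizer.
        scores = [sum(g[j] * v[j] for j in range(dim)) for g in gs]
        i_star = scores.index(min(scores))
        step = 2.0 / (k + 2.0)
        lam = [(1.0 - step) * l for l in lam]
        lam[i_star] += step
    return [sum(lam[i] * gs[i][j] for i in range(n)) for j in range(dim)]

def descent_step(x, eps=0.1, m=20, c=0.1, seed=0):
    rng = random.Random(seed)
    v = min_norm_element(sample_goldstein(x, eps, m, rng))
    vv = sum(t * t for t in v)
    if vv < 1e-12:
        return x  # x is (approximately) eps-stationary
    d = [-t for t in v]
    t_step = 1.0
    # Armijo backtracking along the descent direction.
    while f([x[j] + t_step * d[j] for j in range(2)]) > f(x) - c * t_step * vv:
        t_step *= 0.5
    return [x[j] + t_step * d[j] for j in range(2)]

x = [1.0, 1.0]
x_new = descent_step(x)
print(f(x), f(x_new))  # the objective value decreases
```

In the paper's multiobjective setting, the backtracking test additionally decides *which* objectives need their subdifferential approximation enriched with new subgradients, so the minimum-norm subproblem is solved over subgradient sets of several objectives at once rather than over the single set sampled here.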
