
Quasi-Newton Method for Set Optimization Problems with Set-Valued Mapping Given by Finitely Many Vector-Valued Functions (2501.04711v1)

Published 29 Dec 2024 in math.OC

Abstract: In this article, we propose a quasi-Newton method for finding weakly minimal solutions of unconstrained set optimization problems with respect to the lower set-less ordering. The set-valued objective mapping under consideration is given by finitely many twice continuously differentiable vector-valued functions. Using the concept of a partition, we formulate a family of vector optimization problems and derive a necessary optimality condition for weakly minimal points. Evaluating this condition requires an approximate Hessian of each objective function, which is computed by a quasi-Newton scheme for vector optimization problems. The proposed method generates a sequence of iterates that converges to a point satisfying the derived necessary optimality condition: at each iteration, we find a descent direction for a suitably chosen vector optimization problem from this family and update the current iterate accordingly. The method is not a direct extension of the quasi-Newton method for vector optimization problems, since the selected vector optimization problem may vary across iterations. We analyze the well-definedness and convergence of the proposed method under a regularity condition on stationary points, a condition on nonstationary points, boundedness of the norm of the quasi-Newton direction, and the existence of a step length satisfying the Armijo condition. We also establish local superlinear convergence under uniform continuity of the Hessian approximation function.
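The abstract's core building blocks, a quasi-Newton (BFGS-style) Hessian approximation, a quasi-Newton descent direction, and an Armijo backtracking line search, can be illustrated with a minimal single-objective sketch. This is only an assumption-laden simplification for intuition: the paper's actual algorithm applies these ingredients per vector-valued component, over a partition-induced family of vector optimization problems, under the lower set-less ordering. The function names and the test problem below are illustrative, not from the paper.

```python
import numpy as np

def armijo_step(f, grad, x, d, beta=0.5, sigma=1e-4):
    """Backtracking line search: shrink t until the Armijo condition holds."""
    t = 1.0
    while f(x + t * d) > f(x) + sigma * t * (grad @ d):
        t *= beta
    return t

def bfgs_quasi_newton(f, grad_f, x0, tol=1e-8, max_iter=100):
    """Quasi-Newton iteration with a BFGS Hessian approximation (illustrative)."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(len(x))  # initial Hessian approximation
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        d = -np.linalg.solve(B, g)      # quasi-Newton direction: B d = -grad
        t = armijo_step(f, g, x, d)     # step length satisfying Armijo
        s = t * d
        x_new = x + s
        y = grad_f(x_new) - g
        if s @ y > 1e-12:               # curvature condition keeps B positive definite
            B = (B
                 - np.outer(B @ s, B @ s) / (s @ (B @ s))
                 + np.outer(y, y) / (y @ s))
        x = x_new
    return x

# Toy convex quadratic; the unique minimizer is (1, -2).
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 2) ** 2
grad_f = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
x_star = bfgs_quasi_newton(f, grad_f, np.array([5.0, 5.0]))
```

In the set-valued setting described above, the analogous direction-finding step is carried out for the vector optimization problem selected at the current iterate, which is why the selected subproblem, and hence the Hessian approximations used, can change from one iteration to the next.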

