On the global convergence of an inexact quasi-Newton conditional gradient method for constrained nonlinear systems (1806.01669v1)

Published 5 Jun 2018 in math.OC

Abstract: In this paper, we propose a globally convergent method for solving constrained nonlinear systems. The method combines an efficient Newton conditional gradient method with a derivative-free and nonmonotone linesearch strategy. The global convergence analysis of the proposed method is established under suitable conditions, and some preliminary numerical experiments are given to illustrate its performance.
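As a rough illustration of the kind of scheme the abstract describes, the sketch below pairs an inexact quasi-Newton step, computed approximately by a conditional-gradient (Frank-Wolfe) loop over the feasible set, with a derivative-free, nonmonotone linesearch on the residual norm. This is a minimal sketch under assumptions, not the paper's algorithm: the interfaces `F`, `JF_approx` (any Jacobian approximation, e.g. Broyden-style), and `lmo` (a linear-minimization oracle for the constraint set) are hypothetical names introduced here for illustration.

```python
import numpy as np

def solve_constrained_system(F, JF_approx, x0, lmo, max_iter=50, tol=1e-8,
                             inner_iters=20, memory=5, eta=1e-4, beta=0.5):
    """Illustrative sketch of an inexact quasi-Newton conditional gradient
    iteration for F(x) = 0 with x restricted to a convex set C.
    `lmo(g)` is an assumed linear-minimization oracle returning
    argmin_{v in C} <g, v>; `JF_approx(x)` returns a quasi-Newton
    approximation of the Jacobian of F at x."""
    x = x0.copy()
    hist = [np.linalg.norm(F(x))]            # recent merit values (nonmonotone memory)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        B = JF_approx(x)
        # Inexact quasi-Newton step via conditional gradient (Frank-Wolfe):
        # approximately minimize ||F(x) + B (z - x)|| over z in C.
        z = x.copy()
        for k in range(inner_iters):
            r = Fx + B @ (z - x)             # residual of the linearized model
            g = B.T @ r                      # gradient of 0.5 * ||residual||^2
            v = lmo(g)                       # feasible point minimizing <g, .>
            gamma = 2.0 / (k + 2.0)          # standard open-loop Frank-Wolfe step
            z = (1.0 - gamma) * z + gamma * v
        d = z - x                            # feasible search direction
        # Derivative-free, nonmonotone linesearch on ||F||: accept the first
        # step that improves on the maximum merit value in the recent history.
        f_max = max(hist)
        t = 1.0
        while np.linalg.norm(F(x + t * d)) > (1.0 - eta * t) * f_max and t > 1e-12:
            t *= beta
        x = x + t * d
        hist.append(np.linalg.norm(F(x)))
        hist = hist[-memory:]                # keep a short nonmonotone window
    return x
```

For a simple box constraint l <= x <= u, for example, the oracle can be written as lmo = lambda g: np.where(g < 0, u, l), since the linear objective <g, v> is minimized componentwise at a bound.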
