Normalized Wolfe-Powell-type local minimax method for finding multiple unstable solutions of nonlinear elliptic PDEs (2108.05102v2)

Published 11 Aug 2021 in math.NA and cs.NA

Abstract: The local minimax method (LMM) proposed in [Y. Li and J. Zhou, SIAM J. Sci. Comput., 23(3), 840--865 (2001)] and [Y. Li and J. Zhou, SIAM J. Sci. Comput., 24(3), 865--885 (2002)] is an efficient method for finding multiple solutions of nonlinear elliptic partial differential equations (PDEs) with certain variational structures. The steepest descent direction and the Armijo-type step-size search rules adopted in [Y. Li and J. Zhou, SIAM J. Sci. Comput., 24(3), 865--885 (2002)] play a significant role in the performance and convergence analysis of traditional LMMs. In this paper, a new algorithmic framework for LMMs is established based on general descent directions and two normalized (strong) Wolfe-Powell-type step-size search rules. The resulting algorithm, named the normalized Wolfe-Powell-type LMM (NWP-LMM), is introduced, and its feasibility and global convergence are rigorously justified for general descent directions. As a special case, the global convergence of the NWP-LMM combined with preconditioned steepest descent (PSD) directions is also verified; consequently, the new framework extends that of traditional LMMs. In addition, conjugate gradient-type (CG-type) descent directions are employed to speed up the NWP-LMM. Finally, extensive numerical results for several semilinear elliptic PDEs are reported to profile their multiple unstable solutions, and different algorithms in the LMM family are compared to demonstrate the effectiveness and robustness of our algorithms. In practice, the NWP-LMM combined with CG-type directions performs much better than its known LMM companions.
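
For context, the two step-size rules at the core of the NWP-LMM generalize the classical (strong) Wolfe-Powell conditions from line-search optimization. As a reference point only, here is the textbook strong Wolfe-Powell pair for a step size $\alpha$ along a descent direction $p$ at an iterate $u$ of a functional $E$, with line-search parameters $0 < c_1 < c_2 < 1$; the normalized variants used in the paper, which adapt these conditions to the local minimax setting, are defined in the paper itself and not reproduced here.

```latex
% Textbook strong Wolfe--Powell conditions (for illustration only;
% NOT the paper's normalized variant): choose alpha > 0 so that
% (i) sufficient decrease and (ii) a two-sided curvature bound hold.
\begin{aligned}
  E(u + \alpha p) &\le E(u) + c_1 \alpha \,\langle E'(u), p \rangle,
    && \text{(sufficient decrease)}\\
  \bigl|\langle E'(u + \alpha p), p \rangle\bigr| &\le
    c_2 \,\bigl|\langle E'(u), p \rangle\bigr|,
    && \text{(strong curvature)}
\end{aligned}
```

Dropping the absolute values in the curvature condition yields the weak Wolfe-Powell rule, while the Armijo-type rule used in traditional LMMs keeps only the sufficient-decrease inequality.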

Citations (3)
