Regularization Properties of the Krylov Iterative Solvers CGME and LSMR For Linear Discrete Ill-Posed Problems with an Application to Truncated Randomized SVDs (1812.04762v2)

Published 12 Dec 2018 in math.NA

Abstract: For the large-scale linear discrete ill-posed problem $\min\|Ax-b\|$ or $Ax=b$ with $b$ contaminated by Gaussian white noise, there are four commonly used Krylov solvers: LSQR and its mathematically equivalent CGLS, the Conjugate Gradient (CG) method applied to $A^TAx=A^Tb$; CGME, the CG method applied to $\min\|AA^Ty-b\|$ or $AA^Ty=b$ with $x=A^Ty$; and LSMR, the minimal residual (MINRES) method applied to $A^TAx=A^Tb$. These methods have intrinsic regularizing effects, where the number $k$ of iterations plays the role of the regularization parameter. In this paper, we establish a number of regularization properties of CGME and LSMR, including the filtered SVD expansion of CGME iterates, and prove that the best 2-norm filtering regularized solutions by CGME and LSMR are, respectively, less accurate than and at least as accurate as those by LSQR. We also prove that the semi-convergence of CGME and LSMR always occurs no later and no sooner than that of LSQR, respectively. As a byproduct, using the analysis approach for CGME, we improve a fundamental result on the accuracy of the truncated rank-$k$ approximate SVD of $A$ generated by randomized algorithms, and reveal how the truncation step damages the accuracy. Numerical experiments justify our results on CGME and LSMR.
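The regularizing role of the iteration count $k$ can be seen on a small test problem. The following sketch (not the paper's own experiment; the Hilbert-matrix setup, noise level, and iteration range are illustrative assumptions) runs SciPy's LSMR with an increasing iteration cap on a noisy, severely ill-conditioned system and records the relative error at each $k$; the error typically decreases and then rises again once noise is amplified, which is the semi-convergence phenomenon the paper analyzes.

```python
import numpy as np
from scipy.linalg import hilbert
from scipy.sparse.linalg import lsmr

# Illustrative ill-posed test problem: the Hilbert matrix is severely
# ill-conditioned, so the noise in b dominates the small singular values.
n = 64
A = hilbert(n)
x_true = np.ones(n)
rng = np.random.default_rng(0)
b = A @ x_true + 1e-6 * rng.standard_normal(n)  # Gaussian white noise

# Run LSMR with an increasing iteration cap; atol=btol=conlim=0 disables
# the built-in stopping tests so maxiter alone controls the iteration
# count k, which acts as the regularization parameter.
errors = []
for k in range(1, 21):
    x_k = lsmr(A, b, atol=0, btol=0, conlim=0, maxiter=k)[0]
    errors.append(np.linalg.norm(x_k - x_true) / np.linalg.norm(x_true))

# The index of the smallest error marks the semi-convergence point.
k_best = int(np.argmin(errors)) + 1
```

Plotting `errors` against `k` gives the characteristic dip-then-rise curve; stopping at `k_best` plays the same role as choosing a truncation level in TSVD.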
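For the randomized-SVD byproduct, the truncation step in question is the final rank-$k$ cut of the sketched factorization. A minimal sketch of the basic randomized algorithm (in the Halko-Martinsson-Tropp style; the function name, oversampling parameter `p`, and test matrix here are illustrative, not taken from the paper):

```python
import numpy as np

def randomized_truncated_svd(A, k, p=5, seed=None):
    """Rank-k truncated SVD of A via a Gaussian sketch.

    p is an oversampling parameter; the final truncation from rank k+p
    down to rank k is the step whose damage to accuracy the paper's
    improved bound quantifies.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + p))   # Gaussian test matrix
    Q, _ = np.linalg.qr(A @ Omega)            # orthonormal basis for the sketched range
    B = Q.T @ A                               # small (k+p) x n projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    # Truncate to rank k: keep the k dominant triplets of the sketch.
    return Q @ Ub[:, :k], s[:k], Vt[:k, :]

# Usage: compare against the optimal rank-k error sigma_{k+1}.
A = np.random.default_rng(1).standard_normal((200, 100))
U, s, Vt = randomized_truncated_svd(A, k=10, seed=2)
err = np.linalg.norm(A - (U * s) @ Vt, 2)
sigma = np.linalg.svd(A, compute_uv=False)
# err can never beat sigma[10], the optimal rank-10 approximation error.
```

The gap between `err` and `sigma[10]` measures how far the randomized, truncated approximation falls short of the optimal truncated SVD, which is the quantity the paper's improved result bounds.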
