Randomized algorithms for Tikhonov regularization in linear least squares (2203.07329v1)
Abstract: We describe two algorithms based on sketching that efficiently solve regularized linear least squares systems. The algorithms compute preconditioners for $\min \|Ax-b\|_2^2 + \lambda \|x\|_2^2$, where $A\in\mathbb{R}^{m\times n}$ and $\lambda>0$ is a regularization parameter, such that LSQR converges in $\mathcal{O}(\log(1/\epsilon))$ iterations for $\epsilon$ accuracy. We focus on the setting where the optimal regularization parameter is unknown, so the system must be solved for a number of parameters $\lambda$. Our algorithms are applicable in both the underdetermined ($m\ll n$) and the overdetermined ($m\gg n$) setting. Firstly, we propose a Cholesky-based sketch-to-precondition algorithm that uses a `partly exact' sketch and requires only one sketch for a set of $N$ regularization parameters $\lambda$. The complexity of solving for $N$ parameters is $\mathcal{O}(mn\log(\max(m,n)) + N(\min(m,n)^3 + mn\log(1/\epsilon)))$. Secondly, we introduce an algorithm that uses a sketch of size $\mathcal{O}(\text{sd}_{\lambda}(A))$ for the case where the statistical dimension $\text{sd}_{\lambda}(A)\ll\min(m,n)$. The proposed scheme does not require computing the Gram matrix, making it more stable than existing algorithms in this context. We can solve for $N$ values of $\lambda_i$ in $\mathcal{O}(mn\log(\max(m,n)) + \min(m,n)\,\text{sd}_{\min_i\lambda_i}(A)^2 + Nmn\log(1/\epsilon))$ operations.
- Maike Meier (4 papers)
- Yuji Nakatsukasa (69 papers)
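
A minimal, illustrative Python sketch of the general sketch-to-precondition idea the abstract builds on (not the authors' exact algorithm): one sketch $SA$ of $A$ is reused for every regularization parameter, a preconditioner $R_\lambda$ is obtained from a QR factorization of the stacked matrix $[SA;\ \sqrt{\lambda} I]$, and LSQR is run on the right-preconditioned augmented system. The function name, parameters, and the Gaussian sketch are assumptions chosen for brevity; a subsampled trigonometric transform would be needed to attain the $\mathcal{O}(mn\log(\max(m,n)))$ sketching cost quoted above.

```python
import numpy as np
from scipy.linalg import qr, solve_triangular
from scipy.sparse.linalg import LinearOperator, lsqr

def tikhonov_sketch_precond_lsqr(A, b, lambdas, sketch_factor=2, tol=1e-10):
    """Solve min ||A x - b||_2^2 + lam ||x||_2^2 for each lam in lambdas
    (overdetermined case m >> n), reusing a single sketch of A."""
    m, n = A.shape
    s = sketch_factor * n                     # sketch size, s = O(n)
    S = np.random.randn(s, m) / np.sqrt(s)    # Gaussian sketch (illustrative only)
    SA = S @ A                                # one sketch reused for all lambdas

    solutions = []
    for lam in lambdas:
        # Preconditioner from the sketched, regularized matrix [SA; sqrt(lam) I].
        stacked = np.vstack([SA, np.sqrt(lam) * np.eye(n)])
        R = qr(stacked, mode='economic')[1]   # upper-triangular n x n factor

        # Right-preconditioned augmented operator [A; sqrt(lam) I] R^{-1}.
        def mv(y, lam=lam, R=R):
            x = solve_triangular(R, y)
            return np.concatenate([A @ x, np.sqrt(lam) * x])

        def rmv(z, lam=lam, R=R):
            w = A.T @ z[:m] + np.sqrt(lam) * z[m:]
            return solve_triangular(R, w, trans='T')

        op = LinearOperator((m + n, n), matvec=mv, rmatvec=rmv)
        b_aug = np.concatenate([b, np.zeros(n)])
        y = lsqr(op, b_aug, atol=tol, btol=tol)[0]
        solutions.append(solve_triangular(R, y))  # undo preconditioning: x = R^{-1} y
    return solutions

# Hypothetical usage on a random test problem:
# rng = np.random.default_rng(0)
# A = rng.standard_normal((2000, 100)); b = rng.standard_normal(2000)
# xs = tikhonov_sketch_precond_lsqr(A, b, lambdas=[1e-2, 1e-1, 1.0])
```

Because the sketch $SA$ is independent of $\lambda$, only the cheap QR of the $(s+n)\times n$ stacked matrix and the LSQR iterations are repeated per parameter, which mirrors the one-sketch-for-$N$-parameters structure described in the abstract.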