Theoretical and Computable Optimal Subspace Expansions for Matrix Eigenvalue Problems
Abstract: Consider the optimal subspace expansion problem for the matrix eigenvalue problem $Ax=\lambda x$: Which vector $w$ in the current subspace $\mathcal{V}$, after being multiplied by $A$, provides an optimal subspace expansion for approximating a desired eigenvector $x$ in the sense that $x$ has the smallest angle with the expanded subspace $\mathcal{V}_w=\mathcal{V}+{\rm span}\{Aw\}$, i.e., $w_{opt}=\arg\max_{w\in\mathcal{V}}\cos\angle(\mathcal{V}_w,x)$? This problem is important because many iterative methods construct nested subspaces that successively expand $\mathcal{V}$ to $\mathcal{V}_w$. Ye (Linear Algebra Appl., 428 (2008), pp. 911--918) derived an expression of $w_{opt}$ for general $A$, but it could not be exploited to construct a computable (nearly) optimally expanded subspace. He instead derived a maximization characterization of $\cos\angle(\mathcal{V}_w,x)$ for a {\em given} $w\in\mathcal{V}$ when $A$ is Hermitian. We generalize Ye's maximization characterization to the general case and find its maximizer. Our main contributions consist of explicit expressions of $w_{opt}$, $(I-P_V)Aw_{opt}$, and the optimally expanded subspace $\mathcal{V}_{w_{opt}}$ for general $A$, where $P_V$ is the orthogonal projector onto $\mathcal{V}$. These results are fully exploited to obtain computable optimally expanded subspaces within the framework of the standard, harmonic, refined, and refined harmonic Rayleigh--Ritz methods. We show how to implement the proposed subspace expansion approaches efficiently. Numerical experiments demonstrate the effectiveness of our computable optimal expansions.
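To make the objective $\cos\angle(\mathcal{V}_w,x)$ concrete, the following is a minimal numerical sketch, not the paper's algorithm: for a random symmetric $A$, a target eigenvector $x$, and a subspace $\mathcal{V}$ with orthonormal basis $V$, it compares the angle of $x$ with $\mathcal{V}$ against the angle with $\mathcal{V}_w=\mathcal{V}+{\rm span}\{Aw\}$ for two candidate vectors $w\in\mathcal{V}$. All dimensions, the random seed, and the candidate choices are illustrative assumptions.

\begin{verbatim}
# Minimal sketch (assumed setup, not the paper's method): evaluate
# cos angle(V_w, x) for the expansion V_w = V + span{A w}, w in V.
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 4
A = rng.standard_normal((n, n))
A = (A + A.T) / 2          # symmetric, so the target eigenvector is real

# Target eigenvector x: the one for the largest eigenvalue of A.
_, evecs = np.linalg.eigh(A)
x = evecs[:, -1]

# Orthonormal basis V of a k-dimensional search subspace.
V, _ = np.linalg.qr(rng.standard_normal((n, k)))

def cos_angle(basis, vec):
    """cos of the angle between vec and the column span of `basis`."""
    Q, _ = np.linalg.qr(basis)
    return np.linalg.norm(Q.T @ vec) / np.linalg.norm(vec)

print("cos angle(V,   x):", cos_angle(V, x))

# Two illustrative candidates w in V: the orthogonal projection of x
# onto V (a Ritz-like choice) and a random vector in V.
candidates = {
    "projected x": V @ (V.T @ x),
    "random in V": V @ rng.standard_normal(k),
}
for label, w in candidates.items():
    Vw = np.column_stack([V, A @ w])   # basis of V_w = V + span{A w}
    print(f"cos angle(V_w, x), w = {label}:", cos_angle(Vw, x))
\end{verbatim}

Running the sketch shows how strongly the quality of the expanded subspace depends on the choice of $w$, which is exactly the gap the paper's computable optimal expansions aim to close.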