Low Rank Approximation at Sublinear Cost (1906.04327v7)
Abstract: Low Rank Approximation (LRA) of an m-by-n matrix is a hot research subject, fundamental for Matrix and Tensor Computations and Big Data Mining and Analysis. Computations with LRA can be performed at sublinear cost, that is, by using much fewer than mn memory cells and arithmetic operations, but can we compute LRA itself at sublinear cost? Yes and no. No, because the spectral, Frobenius, and all other norms of the error matrix of the LRA output by any sublinear cost deterministic or randomized algorithm exceed their minimal values for LRA by infinitely large factors for worst case inputs, and even for inputs from the small families of our Appendix. Yes, because for about two decades Cross-Approximation (C-A) iterations, running at sublinear cost, have been consistently computing close LRA worldwide. We provide new insight into the coexistence of that "yes" and "no" by identifying C-A iterations as recursive sketching algorithms for LRA that use sampling test matrices and run at sublinear cost. As we prove, in good accordance with our numerical tests, already a single recursive step computes a close LRA, except for a narrow class of hard inputs, which tends to shrink in the recursive process. We also discuss enhancing the power of sketching by means of leverage scores.
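To make the abstract's central object concrete, here is a minimal sketch of a single cross-approximation (CUR-type) step. It is an illustrative assumption, not the paper's algorithm: it samples row and column indices uniformly at random (rather than via the adaptive or leverage-score sampling the paper studies), reads only O((m+n)r) entries of the input, and returns three factors whose product approximates the matrix.

```python
import numpy as np

def cross_approximation_step(A, r, rng):
    """One illustrative cross-approximation (CUR-type) step.

    Reads only r sampled rows and r sampled columns of A, i.e.
    O((m+n)r) of its mn entries (sublinear for small r), and returns
    factors C (m-by-r), U (r-by-r), R (r-by-n) with A ~ C @ U @ R.
    Uniform sampling is an assumption made here for simplicity.
    """
    m, n = A.shape
    I = rng.choice(m, size=r, replace=False)   # sampled row indices
    J = rng.choice(n, size=r, replace=False)   # sampled column indices
    C = A[:, J]                                # column sketch
    R = A[I, :]                                # row sketch
    U = np.linalg.pinv(A[np.ix_(I, J)])        # generator: pseudo-inverse of the r-by-r "cross"
    return C, U, R

# Usage: for a matrix of exact rank r, a randomly sampled cross is
# generically nonsingular, so the step recovers the matrix closely.
rng = np.random.default_rng(0)
m, n, r = 200, 150, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
C, U, R = cross_approximation_step(A, r, rng)
err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
```

The abstract's "no" side is visible in this sketch as well: an adversarial input whose dominant entries avoid the sampled rows and columns defeats any such fixed sampling scheme, which is why recursion and refined sampling (e.g. leverage scores) matter.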