
New Studies of Randomized Augmentation and Additive Preprocessing

Published 18 Dec 2014 in math.NA (arXiv:1412.5864v3)

Abstract:
1. A standard Gaussian random matrix has full rank with probability 1 and is well-conditioned with probability quite close to 1, converging to 1 rapidly as the matrix deviates from square shape and becomes more rectangular.
2. If we append sufficiently many standard Gaussian random rows or columns to any normalized matrix A, then the augmented matrix has full rank with probability 1 and is well-conditioned with probability close to 1, even if the matrix A is rank deficient or ill-conditioned.
3. We specify and prove these properties of augmentation and extend them to additive preprocessing, that is, to adding the product of two rectangular Gaussian matrices.
4. By applying our randomization techniques to a matrix that has numerical rank r, we accelerate the known algorithms for approximating its leading and trailing singular spaces, associated with its r largest and with all its remaining singular values, respectively.
5. Our algorithms use far fewer random parameters and run much faster when various random sparse and structured preprocessors replace Gaussian ones. Empirically, the outputs of the resulting algorithms are as accurate as the outputs under Gaussian preprocessing.
6. Our novel duality techniques provide formal support, so far missing, for these empirical observations, and they open the door to derandomization of our preprocessing and to further acceleration and simplification of our algorithms through more efficient sparse and structured preprocessors.
7. Our techniques and our progress can be applied to various other fundamental matrix computations, such as the celebrated low-rank approximation of a matrix by means of random sampling.
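As a rough illustration of points 1-3 above, the following NumPy sketch builds a rank-deficient normalized matrix, appends standard Gaussian rows to it, and also forms an additively preprocessed matrix by adding a product of two rectangular Gaussian matrices. All names, dimensions, and the random seed are hypothetical choices for the demonstration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# A rank-deficient matrix: a product of tall and wide Gaussian factors
# has rank r < n with probability 1. We normalize so that ||A||_2 = 1.
n, r = 100, 20
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
A /= np.linalg.norm(A, 2)

# Augmentation (point 2): append n - r standard Gaussian rows.
G = rng.standard_normal((n - r, n))
K = np.vstack([A, G])

# Additive preprocessing (point 3): add a product of two
# rectangular Gaussian matrices of width n - r.
U = rng.standard_normal((n, n - r))
V = rng.standard_normal((n, n - r))
C = A + U @ V.T

print(np.linalg.matrix_rank(A))  # r: rank deficient
print(np.linalg.matrix_rank(K))  # full rank with probability 1
print(np.linalg.cond(K))         # moderate with probability close to 1
```

The augmented matrix K and the preprocessed matrix C are full-rank with probability 1, and the abstract's claim is that K is moreover well-conditioned with probability close to 1, which the printed condition number lets one check empirically.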

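Point 7 mentions low-rank approximation by random sampling. A minimal sketch of that technique (a randomized range finder with slight oversampling; all dimensions and names are illustrative assumptions, not the paper's algorithm) looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)

# A matrix of exact rank r, for which random sampling recovers
# the range essentially exactly.
m, n, r = 200, 150, 10
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# Sample the range of A with a Gaussian test matrix, oversampling by 5.
G = rng.standard_normal((n, r + 5))
Q, _ = np.linalg.qr(A @ G)       # orthonormal basis for the sampled range

# Project A onto that range to obtain a low-rank approximation.
A_approx = Q @ (Q.T @ A)
err = np.linalg.norm(A - A_approx) / np.linalg.norm(A)
```

Since A has exact rank r here, the relative error `err` is at the level of rounding error; for a matrix with numerical rank r the error would instead be governed by the trailing singular values.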

Authors (2)
