Randomness Efficient Fast-Johnson-Lindenstrauss Transform with Applications in Differential Privacy and Compressed Sensing (1410.2470v6)
Abstract: The Johnson-Lindenstrauss property ({\sf JLP}) of random matrices has immense applications in computer science, ranging from compressed sensing, learning theory, and numerical linear algebra to privacy. This paper explores the properties and applications of a distribution of random matrices. Our distribution satisfies {\sf JLP} with desirable properties such as fast matrix-vector multiplication, sparsity, and optimal subspace embedding. We can sample a random matrix from this distribution using exactly $2n + n \log n$ random bits. We show that a random matrix picked from this distribution preserves differential privacy provided the input private matrix satisfies a certain spectral property. This improves the run-time of various differentially private mechanisms, such as those of Blocki {\it et al.} (FOCS 2012) and Upadhyay (ASIACRYPT 2013). Our final construction has bounded column sparsity, thereby answering an open problem stated in Blocki {\it et al.} (FOCS 2012). Using the results of Baraniuk {\it et al.} (Constructive Approximation: 28(3)), our result yields randomness-efficient matrices that satisfy the Restricted Isometry Property of optimal order for small sparsity with exactly linear random bits. We also show that other known distributions of sparse random matrices with the {\sf JLP} do not preserve differential privacy, thereby answering another of the open problems posed by Blocki {\it et al.} (FOCS 2012). Extending the work of Kane and Nelson (JACM: 61(1)), we also give a unified analysis of some known Johnson-Lindenstrauss transforms. Finally, we present a self-contained, simplified proof of an inequality on quadratic forms of Gaussian variables that we use in all our proofs.
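For intuition, the following is a minimal numerical sketch of the generic fast Johnson-Lindenstrauss pipeline the abstract alludes to (random diagonal signs, a Hadamard rotation, then subsampling), using a subsampled randomized Hadamard transform, one standard fast-JL construction. It is not the paper's randomness-efficient distribution: it spends fully independent random bits rather than the $2n + n \log n$ bits claimed above, and the function name and parameters below are illustrative.

```python
import numpy as np
from scipy.linalg import hadamard

def fjlt_embed(x, k, rng):
    """Sketch of a subsampled randomized Hadamard transform (a standard
    fast-JL map), NOT the paper's randomness-efficient construction:
    it uses fully independent random bits purely for illustration."""
    n = x.shape[0]                      # n must be a power of 2
    # D: diagonal of independent random signs (n random bits).
    signs = rng.choice([-1.0, 1.0], size=n)
    # H: normalized Walsh-Hadamard matrix (orthogonal). Materialized here
    # for clarity; a fast Walsh-Hadamard transform would give the
    # O(n log n) matrix-vector multiplication the abstract refers to.
    H = hadamard(n) / np.sqrt(n)
    y = H @ (signs * x)
    # S: uniform row subsampling, rescaled so squared norms are
    # preserved in expectation.
    rows = rng.choice(n, size=k, replace=False)
    return np.sqrt(n / k) * y[rows]

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
y = fjlt_embed(x, k=64, rng=rng)
print(np.linalg.norm(x), np.linalg.norm(y))  # close up to JL distortion
```

With $n = 1024$ and $k = 64$, the two printed norms should agree up to a modest multiplicative distortion, which is the {\sf JLP} guarantee that the paper's distribution achieves with far fewer random bits.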