SLRMA: Sparse Low-Rank Matrix Approximation for Data Compression (1507.01673v2)
Abstract: Low-rank matrix approximation (LRMA) is a powerful technique for signal processing and pattern analysis. However, its potential for data compression has not yet been fully investigated in the literature. In this paper, we propose sparse low-rank matrix approximation (SLRMA), an effective computational tool for data compression. SLRMA extends conventional LRMA by exploring both the intra- and inter-coherence of data samples simultaneously. With the aid of prescribed orthogonal transforms (e.g., the discrete cosine/wavelet transform and graph transform), SLRMA decomposes a matrix into the product of two smaller matrices, where one matrix is made of extremely sparse and orthogonal column vectors and the other consists of the transform coefficients. Technically, we formulate SLRMA as a constrained optimization problem, i.e., minimizing the approximation error in the least-squares sense, regularized by the $\ell_0$-norm and an orthogonality constraint, and solve it using the inexact augmented Lagrangian multiplier method. Through extensive tests on real-world data, such as 2D image sets and 3D dynamic meshes, we observe that (i) SLRMA converges well empirically; (ii) SLRMA can produce an approximation error comparable to that of LRMA but in a much sparser form; and (iii) SLRMA-based compression schemes significantly outperform the state-of-the-art in terms of rate-distortion performance.
- Junhui Hou (138 papers)
- Lap-Pui Chau (57 papers)
- Nadia Magnenat-Thalmann (7 papers)
- Ying He (102 papers)
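
As a rough illustration of the constrained problem described in the abstract (the symbols below are ours, not necessarily the paper's notation): let $\mathbf{X}$ denote the data matrix, $\mathbf{B}$ the factor whose columns are extremely sparse and mutually orthogonal, $\mathbf{C}$ the transform coefficients under the prescribed orthogonal transform, and $s$ a per-column sparsity budget. One plausible form of the objective is

$$
\min_{\mathbf{B},\,\mathbf{C}} \;\|\mathbf{X} - \mathbf{B}\mathbf{C}\|_F^2
\quad \text{s.t.}\quad \|\mathbf{b}_i\|_0 \le s \;\; \forall i,
\qquad \mathbf{B}^{\top}\mathbf{B} = \mathbf{I}.
$$

This is only a sketch consistent with the abstract's description; the paper's exact formulation and the update steps of its inexact augmented Lagrangian multiplier solver may differ.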