
Compressed Dictionary Learning (1805.00692v2)

Published 2 May 2018 in stat.ML and cs.LG

Abstract: In this paper we show that the computational complexity of the Iterative Thresholding and K-residual-Means (ITKrM) algorithm for dictionary learning can be significantly reduced by using dimensionality-reduction techniques based on the Johnson-Lindenstrauss lemma. The dimensionality reduction is efficiently carried out with the fast Fourier transform. We introduce the Iterative compressed-Thresholding and K-Means (IcTKM) algorithm for fast dictionary learning and study its convergence properties. We show that IcTKM can locally recover an incoherent, overcomplete generating dictionary of $K$ atoms from training signals of sparsity level $S$ with high probability. Fast dictionary learning is achieved by embedding the training data and the dictionary into $m < d$ dimensions, and recovery is shown to be locally stable with an embedding dimension which scales as low as $m = O(S \log^4 S \log^3 K)$. The compression effectively shatters the data dimension bottleneck in the computational cost of ITKrM, reducing it by a factor $O(m/d)$. Our theoretical results are complemented with numerical simulations which demonstrate that IcTKM is a powerful, low-cost algorithm for learning dictionaries from high-dimensional data sets.
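
The following is a minimal sketch of the idea the abstract describes: a fast Johnson-Lindenstrauss-style embedding (random sign flips, a fast orthogonal transform, and coordinate subsampling to $m < d$ dimensions) applied to both the signals and the dictionary, followed by thresholding in the compressed domain and an ITKrM-style residual-means atom update. It is not the authors' implementation; the function names are illustrative, and the DCT is used here as a real-valued stand-in for the FFT-based embedding in the paper.

```python
import numpy as np
from scipy.fft import dct  # real-valued O(d log d) transform, stands in for the FFT


def make_jl_embedding(d, m, rng):
    """Fast JL-style embedding R^d -> R^m: random sign flips, a fast orthogonal
    transform, and random subsampling of m coordinates. The same embedding must
    be applied to signals and dictionary so inner products are roughly preserved."""
    signs = rng.choice([-1.0, 1.0], size=d)
    rows = rng.choice(d, size=m, replace=False)

    def embed(A):  # A has shape (d, n_columns)
        mixed = dct(signs[:, None] * A, axis=0, norm="ortho")
        return np.sqrt(d / m) * mixed[rows]

    return embed


def ictkm_iteration(Y, D, S, m, rng):
    """One illustrative pass in the spirit of IcTKM (a sketch, not the paper's code):
    threshold in the compressed domain, update atoms by residual means in the
    original domain, then renormalise the atoms."""
    d, K = D.shape
    embed = make_jl_embedding(d, m, rng)
    Yc, Dc = embed(Y), embed(D)          # compressed signals and dictionary
    D_new = np.zeros_like(D)
    for n in range(Y.shape[1]):
        corr = Dc.T @ Yc[:, n]                      # compressed correlations
        support = np.argsort(-np.abs(corr))[:S]     # compressed thresholding
        Dsub = D[:, support]
        proj = Dsub @ np.linalg.pinv(Dsub) @ Y[:, n]  # projection onto selected atoms
        for k in support:
            s = np.sign(D[:, k] @ Y[:, n])          # sign of the correlation
            D_new[:, k] += s * (Y[:, n] - proj + D[:, k] * (D[:, k] @ Y[:, n]))
    norms = np.linalg.norm(D_new, axis=0)
    return D_new / np.maximum(norms, 1e-12)
```

Because the thresholding step only touches the $m$-dimensional compressed vectors, its cost per signal drops by roughly a factor $m/d$ relative to plain ITKrM, which is the speed-up the abstract refers to.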

Citations (4)
