
Correlation and Class Based Block Formation for Improved Structured Dictionary Learning (1708.01448v2)

Published 4 Aug 2017 in cs.CV

Abstract: In recent years, the creation of block-structured dictionaries has attracted considerable interest. Learning such dictionaries involves a two-step process: block formation and dictionary update. Both steps are important in producing an effective dictionary. Existing works mostly assume that the block structure is known a priori while learning the dictionary. To find the unknown block structure given a dictionary, sparse agglomerative clustering (SAC) is commonly used; it groups atoms based on their consistency in sparse coding with respect to the unstructured dictionary. This paper explores two innovations towards improving the reconstruction as well as the classification ability achieved with a block-structured dictionary. First, we propose a novel block structuring approach that makes use of the correlation among dictionary atoms. Unlike the SAC approach, which groups diverse atoms, the proposed approach forms blocks by grouping the most correlated atoms in the dictionary. The proposed block clustering approach yields significant reductions in redundancy and provides direct control over the block size when compared with the existing SAC-based block structuring. Second, motivated by works using a supervised, a priori known block structure, we also explore incorporating class information into the proposed block formation approach to further enhance the classification ability of the block dictionary. The reconstruction ability of the proposed innovations is assessed on synthetic data, while the classification ability is evaluated on a large-variability speaker verification task.
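The abstract only outlines the correlation-based grouping, so the following is a minimal sketch of what such a block formation step could look like, not the paper's exact algorithm. The function name `correlation_block_formation`, the greedy max-correlation growth rule, and the fixed `block_size` parameter are illustrative assumptions; the sketch assumes unit-norm atom columns so that the Gram matrix entries act as correlations.

```python
import numpy as np

def correlation_block_formation(D, block_size):
    """Greedy correlation-based block formation (illustrative sketch).

    D          : (n, K) dictionary with unit-norm atom columns (assumed).
    block_size : desired number of atoms per block (the "direct control"
                 on block size mentioned in the abstract).

    Returns a list of index arrays, one per block.
    """
    K = D.shape[1]
    corr = np.abs(D.T @ D)           # pairwise atom correlations
    np.fill_diagonal(corr, -np.inf)  # ignore self-correlation
    unassigned = set(range(K))
    blocks = []
    while unassigned:
        # seed the block with an arbitrary remaining atom
        seed = unassigned.pop()
        block = [seed]
        while len(block) < block_size and unassigned:
            rest = np.fromiter(unassigned, dtype=int)
            # pick the remaining atom most correlated with the current block
            scores = corr[np.ix_(block, rest)].max(axis=0)
            best = rest[np.argmax(scores)]
            unassigned.remove(best)
            block.append(best)
        blocks.append(np.array(block))
    return blocks

# Toy usage on a random unit-norm dictionary (synthetic, for illustration)
rng = np.random.default_rng(0)
D = rng.standard_normal((16, 32))
D /= np.linalg.norm(D, axis=0)
print([b.tolist() for b in correlation_block_formation(D, block_size=4)])
```

The paper's second innovation, class-informed block formation, would presumably restrict this grouping so that only atoms associated with the same class label can share a block; the abstract does not specify the mechanism, so that variant is not sketched here.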
