
An Incidence Geometry approach to Dictionary Learning

Published 28 Feb 2014 in cs.LG and stat.ML (arXiv:1402.7344v2)

Abstract: We study the Dictionary Learning (also known as Sparse Coding) problem of obtaining a sparse representation of data points by learning \emph{dictionary vectors} upon which the data points can be written as sparse linear combinations. We view this problem from a geometric perspective, treating the dictionary as the spanning set of a subspace arrangement, and focus on the case when the underlying hypergraph of the subspace arrangement is specified. For this Fitted Dictionary Learning problem, we completely characterize the combinatorics of the associated subspace arrangements (i.e.\ their underlying hypergraphs). Specifically, we prove a combinatorial rigidity-type theorem for a class of geometric incidence systems. The theorem characterizes the hypergraphs of subspace arrangements that generically yield (a) at least one dictionary, and (b) a locally unique dictionary (i.e.\ at most a finite number of isolated dictionaries) of the specified size. We are unaware of prior applications of combinatorial rigidity techniques in the setting of Dictionary Learning, or even in machine learning more broadly. We also provide a systematic classification of problems related to Dictionary Learning, together with various algorithms, their assumptions, and performance.
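
For concreteness, here is a minimal sketch of the problem the abstract describes, in standard notation (the symbols $Y$, $D$, $X$, and the sparsity level $s$ are illustrative choices, not the paper's own). Given data points $y_1, \dots, y_n \in \mathbb{R}^d$ stacked as the columns of $Y \in \mathbb{R}^{d \times n}$, dictionary learning asks for a dictionary $D \in \mathbb{R}^{d \times m}$ and a coefficient matrix $X \in \mathbb{R}^{m \times n}$ such that
\[
  Y \approx D X, \qquad \|x_j\|_0 \le s \quad \text{for every column } x_j \text{ of } X,
\]
so that each data point is an $s$-sparse linear combination of the dictionary vectors. The dictionary vectors used by a given data point span a subspace containing it, and the pattern recording which dictionary vectors each point uses is the hypergraph of the resulting subspace arrangement; in the Fitted Dictionary Learning problem studied here, that hypergraph is specified in advance.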

Citations (4)
