
Analyzing Linear Dynamical Systems: From Modeling to Coding and Learning

Published 3 Aug 2016 in cs.CV (arXiv:1608.01059v2)

Abstract: Encoding time-series with Linear Dynamical Systems (LDSs) leads to rich models with applications ranging from dynamic texture recognition to video segmentation, to name a few. In this paper, we propose to represent LDSs with infinite-dimensional subspaces and derive an analytic solution for obtaining stable LDSs. We then devise efficient algorithms for sparse coding and dictionary learning on the space of infinite-dimensional subspaces. In particular, we develop two solutions for sparsely encoding an LDS. In the first method, we map the subspaces into a Reproducing Kernel Hilbert Space (RKHS) and perform kernel sparse coding. In the second, we embed the infinite-dimensional subspaces into the space of symmetric matrices and formulate the sparse coding in the induced space. For dictionary learning, we encode time-series by introducing a novel concept, the two-fold LDS, and use it to derive an analytical form for updating the atoms of an LDS dictionary, where each atom is itself an LDS. Compared to several baselines and state-of-the-art methods, the proposed methods yield higher accuracies on various classification tasks, including video classification and tactile recognition.
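For context, the sketch below illustrates two standard building blocks that the abstract refers to; it is not the authors' implementation. It shows (a) a conventional SVD-based estimate of LDS parameters (A, C), as commonly used for dynamic textures, without the paper's analytic stability solution, and (b) sparse coding in a kernel-induced space solved with a generic ISTA loop over Gram matrices, standing in for the paper's RKHS and symmetric-matrix formulations. The function names (fit_lds, kernel_sparse_code), the linear kernel, and the toy data are assumptions for illustration only.

import numpy as np


def fit_lds(Y, n_states):
    # Standard SVD-based estimate of LDS parameters (A, C) from a data
    # matrix Y (features x time), as commonly used for dynamic textures.
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    C = U[:, :n_states]                          # observation matrix
    X = np.diag(s[:n_states]) @ Vt[:n_states]    # latent state sequence
    # Least-squares fit of the transition matrix A so that x_{t+1} ~= A x_t.
    A = X[:, 1:] @ np.linalg.pinv(X[:, :-1])
    return A, C


def kernel_sparse_code(K_dd, k_qd, lam=0.1, n_iter=500):
    # Sparse code of a query against dictionary atoms in a kernel-induced
    # space: minimize x^T K_dd x - 2 k_qd^T x + lam * ||x||_1 via ISTA,
    # where K_dd is the atom-atom Gram matrix and k_qd the query-atom kernel.
    x = np.zeros(K_dd.shape[0])
    step = 1.0 / (2.0 * np.linalg.norm(K_dd, 2) + 1e-12)   # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = 2.0 * (K_dd @ x - k_qd)
        z = x - step * grad
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Y = rng.standard_normal((30, 50))     # toy time-series: 30-dim, 50 frames
    A, C = fit_lds(Y, n_states=5)

    D = rng.standard_normal((20, 8))      # 8 toy dictionary descriptors
    q = rng.standard_normal(20)           # toy query descriptor
    K_dd, k_qd = D.T @ D, D.T @ q         # linear kernel, for simplicity only
    print(kernel_sparse_code(K_dd, k_qd, lam=0.5))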

