
Adaptive Recurrent Neural Network Based on Mixture Layer

Published 24 Jan 2018 in cs.LG, cs.AI, and stat.ML (arXiv:1801.08094v4)

Abstract: Although the Recurrent Neural Network (RNN) has been a powerful tool for modeling sequential data, its performance is inadequate when processing sequences with multiple patterns. In this paper, we address this challenge by introducing a novel mixture layer and constructing an adaptive RNN. The mixture-layer-augmented RNN (termed M-RNN) partitions patterns in training sequences into several clusters and stores the principal patterns as prototype vectors of components in a mixture model. By leveraging the mixture layer, the proposed method can adaptively update states according to the similarities between encoded inputs and prototype vectors, leading to a stronger capacity for assimilating sequences with multiple patterns. Moreover, our approach can be further extended by taking advantage of prior knowledge about the data. Experiments on both synthetic and real datasets demonstrate the effectiveness of the proposed method.
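The abstract describes the core mechanism at a high level: encode the input, compare it against learned prototype vectors, and let the resulting mixture weights condition the state update. Below is a minimal sketch of that idea in PyTorch. The class name MixtureLayerRNNCell, the number of components, the dot-product similarity, and the use of a GRU cell for the state update are all assumptions made for illustration; the paper's exact formulation may differ.

```python
# Hedged sketch of a mixture-layer-augmented RNN cell (not the paper's exact
# architecture): encoded inputs are compared to prototype vectors, and the
# resulting mixture read-out conditions the recurrent state update.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixtureLayerRNNCell(nn.Module):
    def __init__(self, input_size, hidden_size, num_components):
        super().__init__()
        # Prototype vectors: one per mixture component, learned with the model.
        self.prototypes = nn.Parameter(torch.randn(num_components, hidden_size))
        self.encode = nn.Linear(input_size, hidden_size)      # encodes x_t
        self.cell = nn.GRUCell(2 * hidden_size, hidden_size)  # state update

    def forward(self, x_t, h_prev):
        # 1) Encode the input.
        e_t = torch.tanh(self.encode(x_t))                    # (B, H)
        # 2) Similarity between the encoded input and each prototype
        #    (dot product here; cosine similarity is another plausible choice).
        sims = e_t @ self.prototypes.t()                      # (B, K)
        weights = F.softmax(sims, dim=-1)                     # mixture weights
        # 3) Read out a convex combination of the prototypes.
        m_t = weights @ self.prototypes                       # (B, H)
        # 4) Adaptive state update conditioned on the mixture read-out.
        h_t = self.cell(torch.cat([e_t, m_t], dim=-1), h_prev)
        return h_t, weights


# Usage: run the cell over a toy sequence.
if __name__ == "__main__":
    B, T, D, H, K = 4, 10, 8, 16, 3
    cell = MixtureLayerRNNCell(D, H, K)
    h = torch.zeros(B, H)
    for t in range(T):
        h, w = cell(torch.randn(B, D), h)
    print(h.shape, w.shape)  # torch.Size([4, 16]) torch.Size([4, 3])
```

In this sketch the prototypes play the role of stored principal patterns: sequences dominated by different patterns produce different mixture weights, so the state update adapts to the pattern currently present in the input.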

Citations (1)
