Feedforward Sequential Memory Networks: A New Structure to Learn Long-term Dependency (1512.08301v2)

Published 28 Dec 2015 in cs.NE

Abstract: In this paper, we propose a novel neural network structure, namely \emph{feedforward sequential memory networks (FSMN)}, to model long-term dependency in time series without using recurrent feedback. The proposed FSMN is a standard fully-connected feedforward neural network equipped with some learnable memory blocks in its hidden layers. The memory blocks use a tapped-delay line structure to encode the long context information into a fixed-size representation as a short-term memory mechanism. We have evaluated the proposed FSMNs in several standard benchmark tasks, including speech recognition and language modeling. Experimental results have shown that FSMNs significantly outperform conventional recurrent neural networks (RNN), including LSTMs, in modeling sequential signals like speech or language. Moreover, FSMNs can be learned much more reliably and faster than RNNs or LSTMs due to the inherent non-recurrent model structure.
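
To illustrate the tapped-delay line idea described in the abstract, here is a minimal sketch of an FSMN-style memory block in NumPy. It assumes the scalar-FSMN formulation, where the memory output at time t is a learned weighted sum of the current and N previous hidden activations; the function and variable names are hypothetical, not taken from the authors' implementation.

```python
import numpy as np

def fsmn_memory_block(H, a):
    """Tapped-delay line memory block (scalar FSMN sketch).

    H: (T, D) hidden-layer activations for a sequence of T frames.
    a: (N+1,) learnable tap coefficients over the current frame
       and the N previous frames.
    Returns M: (T, D), where M[t] = sum_{i=0..N} a[i] * H[t - i].
    """
    T, D = H.shape
    N = len(a) - 1
    M = np.zeros_like(H)
    for t in range(T):
        for i in range(min(N, t) + 1):
            M[t] += a[i] * H[t - i]
    return M

# Toy usage: 20 frames, 8 hidden units, taps over the current + 10 past frames.
H = np.random.randn(20, 8)
a = np.random.randn(11) * 0.1
M = fsmn_memory_block(H, a)
# The memory output M is then combined with H (e.g. fed alongside it into
# the next feedforward layer), giving fixed-size access to long context
# without any recurrent connection.
```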

Authors (6)
  1. Shiliang Zhang (132 papers)
  2. Cong Liu (169 papers)
  3. Hui Jiang (99 papers)
  4. Si Wei (19 papers)
  5. Lirong Dai (31 papers)
  6. Yu Hu (75 papers)
Citations (75)
