ARMA Cell: A Modular and Effective Approach for Neural Autoregressive Modeling (2208.14919v2)

Published 31 Aug 2022 in cs.LG, cs.NE, and stat.ML

Abstract: The autoregressive moving average (ARMA) model is a classical, and arguably one of the most studied approaches to model time series data. It has compelling theoretical properties and is widely used among practitioners. More recent deep learning approaches popularize recurrent neural networks (RNNs) and, in particular, Long Short-Term Memory (LSTM) cells that have become one of the best performing and most common building blocks in neural time series modeling. While advantageous for time series data or sequences with long-term effects, complex RNN cells are not always a must and can sometimes even be inferior to simpler recurrent approaches. In this work, we introduce the ARMA cell, a simpler, modular, and effective approach for time series modeling in neural networks. This cell can be used in any neural network architecture where recurrent structures are present and naturally handles multivariate time series using vector autoregression. We also introduce the ConvARMA cell as a natural successor for spatially-correlated time series. Our experiments show that the proposed methodology is competitive with popular alternatives in terms of performance while being more robust and compelling due to its simplicity.
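For context, a classical ARMA(p, q) model expresses the current observation as a linear combination of its own past values and past innovations,

x_t = c + \sum_{i=1}^{p} \phi_i x_{t-i} + \sum_{j=1}^{q} \theta_j \varepsilon_{t-j} + \varepsilon_t,

and the ARMA cell described in the abstract embeds this kind of recurrence in a neural network, with matrix-valued weights in the multivariate (vector autoregression) case. The sketch below illustrates one plausible reading of such a cell; the function name arma_cell_step, the lag hyperparameters, and the use of past prediction errors as the moving-average input are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    def arma_cell_step(x_lags, e_lags, W, V, bias, activation=np.tanh):
        # One step of a hypothetical ARMA(p, q)-style recurrent cell:
        #   x_lags: (p, d) array of the last p observations
        #   e_lags: (q, d) array of the last q residuals (observation - prediction)
        #   W:      (p, d, d) autoregressive weight matrices (vector autoregression)
        #   V:      (q, d, d) moving-average weight matrices
        #   bias:   (d,) offset
        ar_term = sum(W[i] @ x_lags[i] for i in range(len(x_lags)))
        ma_term = sum(V[j] @ e_lags[j] for j in range(len(e_lags)))
        return activation(ar_term + ma_term + bias)

    # Toy rolling forecast: univariate series (d = 1) with p = 2, q = 1.
    rng = np.random.default_rng(0)
    p, q, d = 2, 1, 1
    W = rng.normal(scale=0.1, size=(p, d, d))
    V = rng.normal(scale=0.1, size=(q, d, d))
    b = np.zeros(d)
    x_hist = [rng.normal(size=d) for _ in range(p)]  # seed observations
    e_hist = [np.zeros(d) for _ in range(q)]         # seed residuals
    for t in range(5):
        x_hat = arma_cell_step(np.stack(x_hist[-p:]), np.stack(e_hist[-q:]), W, V, b)
        x_t = rng.normal(size=d)                     # stand-in for the next observation
        x_hist.append(x_t)
        e_hist.append(x_t - x_hat)                   # residual feeds the MA term next step

Because the residuals are computed from the cell's own predictions, the moving-average term makes the recurrence stateful in the same way a classical ARMA filter is; a convolutional variant along the lines of the ConvARMA cell would presumably replace the matrix products with convolutions over a spatial grid.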
