EXPRTS: Exploring and Probing the Robustness of Time Series Forecasting Models

Published 6 Mar 2024 in cs.LG (arXiv:2403.03508v3)

Abstract: When deploying machine-learning-based time series forecasting models to real-world settings, one often encounters situations where the data distribution drifts. Such drifts expose the forecasting models to out-of-distribution (OOD) data, and machine learning models lack robustness in these settings. Robustness can be improved by using deep generative models or genetic algorithms to augment time series datasets, but these approaches lack interpretability and are computationally expensive. In this work, we develop a simple, interpretable framework for generating time series. Our method combines time-series decompositions with analytic functions and can generate time series with characteristics matching both in- and out-of-distribution data. This approach allows users to generate new time series in an interpretable fashion, which can be used to augment the dataset and improve forecasting robustness. We demonstrate our framework through EXPRTS, a visual analytics tool designed for univariate time series forecasting models and datasets. Different visualizations of the data distribution, forecasting errors, and single time series instances enable users to explore time series datasets, apply transformations, and evaluate forecasting model robustness across diverse scenarios. We show how our framework can generate meaningful OOD time series that improve model robustness, and we validate the effectiveness and usability of EXPRTS through three use cases and a user study.
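The abstract does not specify which decomposition or which analytic transformations the framework uses, so the sketch below is only an illustration of the general idea: decompose a series into trend, seasonal, and residual components (here with a naive moving-average decomposition rather than the paper's method), rescale components with simple analytic functions, and recombine to produce a transformed, potentially out-of-distribution series.

```python
import numpy as np

def decompose(series, period):
    """Naive additive decomposition: moving-average trend,
    periodic-mean seasonality, and a residual remainder."""
    n = len(series)
    # Centered moving average as a crude trend estimate.
    kernel = np.ones(period) / period
    trend = np.convolve(series, kernel, mode="same")
    detrended = series - trend
    # Seasonal component: mean of each position within the period.
    one_cycle = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(one_cycle, n // period + 1)[:n]
    residual = series - trend - seasonal
    return trend, seasonal, residual

def transform(series, period, trend_scale=1.0, season_scale=1.0):
    """Generate a new series by rescaling components (a simple
    analytic transformation) and recombining them additively."""
    trend, seasonal, residual = decompose(series, period)
    return trend_scale * trend + season_scale * seasonal + residual

# Toy example: linear trend plus monthly-style seasonality and noise.
t = np.arange(120)
rng = np.random.default_rng(0)
y = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.1 * rng.normal(size=120)

# Amplify the seasonal component to probe robustness to stronger seasonality.
y_ood = transform(y, period=12, season_scale=3.0)
```

Because each component is transformed by an explicit function, a user can read off exactly how the generated series differs from the original (e.g., "same trend, triple the seasonal amplitude"), which is the interpretability advantage the abstract claims over generative-model-based augmentation.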

