
Probing the Robustness of Time-series Forecasting Models with CounterfacTS (2403.03508v1)

Published 6 Mar 2024 in cs.LG

Abstract: A common issue for machine learning models applied to time-series forecasting is the temporal evolution of the data distributions (i.e., concept drift). Because most of the training data does not reflect such changes, the models present poor performance on the new out-of-distribution scenarios and, therefore, the impact of such events cannot be reliably anticipated ahead of time. We present and publicly release CounterfacTS, a tool to probe the robustness of deep learning models in time-series forecasting tasks via counterfactuals. CounterfacTS has a user-friendly interface that allows the user to visualize, compare and quantify time series data and their forecasts, for a number of datasets and deep learning models. Furthermore, the user can apply various transformations to the time series and explore the resulting changes in the forecasts in an interpretable manner. Through example cases, we illustrate how CounterfacTS can be used to i) identify the main features characterizing and differentiating sets of time series, ii) assess how the model performance depends on these characteristics, and iii) guide transformations of the original time series to create counterfactuals with desired properties for training and increasing the forecasting performance in new regions of the data distribution. We discuss the importance of visualizing and considering the location of the data in a projected feature space to transform time series and create effective counterfactuals for training the models. Overall, CounterfacTS aids in creating counterfactuals to efficiently explore the impact of hypothetical scenarios not covered by the original data in time-series forecasting tasks.
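CounterfacTS itself is an interactive GUI tool, so the snippet below is not its API. It is a minimal, illustrative sketch of the underlying idea the abstract describes: decompose a series into trend, seasonal, and residual components, rescale a component, and recombine it to obtain a counterfactual with altered properties. The function names (`make_series`, `counterfactual`) and the crude linear-fit/phase-average decomposition are assumptions standing in for the STL-style decomposition a real implementation would use.

```python
import numpy as np

def make_series(n=200, period=24, seed=0):
    """Synthetic series with a linear trend and a daily cycle (illustrative)."""
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    trend = 0.05 * t
    season = 2.0 * np.sin(2 * np.pi * t / period)
    noise = rng.normal(0, 0.3, n)
    return trend + season + noise

def counterfactual(series, trend_scale=1.0, season_scale=1.0, period=24):
    """Create a counterfactual by rescaling trend and seasonal components.

    A linear fit plus phase averaging stands in for a proper
    STL-style decomposition; the recombined series has the same
    residual but a modified trend/seasonality.
    """
    t = np.arange(len(series))
    # crude linear trend (stand-in for an STL trend component)
    coeffs = np.polyfit(t, series, 1)
    trend = np.polyval(coeffs, t)
    detrended = series - trend
    # seasonal component: mean of each phase of the period
    phase_means = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = phase_means[t % period]
    residual = detrended - seasonal
    return trend_scale * trend + season_scale * seasonal + residual

y = make_series()
# counterfactual with a 3x stronger trend, seasonality unchanged
y_cf = counterfactual(y, trend_scale=3.0)
```

A user would then feed `y_cf` to a trained forecaster and compare its predictions against those for `y`, or add such counterfactuals to the training set to cover regions of the feature space the original data misses.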

