
Interval Forecasts for Gas Prices in the Face of Structural Breaks -- Statistical Models vs. Neural Networks (2407.16723v1)

Published 23 Jul 2024 in cs.LG

Abstract: Reliable gas price forecasts are essential information for gas and energy traders, risk managers, and economists. However, in the run-up to the war in Ukraine, Europe began to suffer from substantially increased and volatile gas prices, which culminated in the aftermath of the Nord Stream 1 explosion. This shock changed both the trend and the volatility structure of the prices and has considerable effects on forecasting models. In this study we investigate whether modern machine learning methods such as neural networks are more resilient against such changes than statistical models such as autoregressive moving average (ARMA) models with conditional heteroskedasticity, or copula-based time series models. The focus lies on interval forecasting and the corresponding evaluation measures. As data, the Front Month prices from the Dutch Title Transfer Facility, currently the predominant European exchange, are used. We see that, during the shock period, most models underestimate the variance, while they overestimate the variance in the after-shock period. Furthermore, we find that, during the shock, the simpler models, i.e. an ARMA model with conditional heteroskedasticity and the multilayer perceptron (a neural network), perform best with regard to prediction interval coverage. Interestingly, the widely used long short-term memory (LSTM) network is outperformed by its competitors.
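Since the abstract evaluates models by prediction interval coverage, the following is a minimal sketch of how coverage (PICP) and mean interval width might be computed for one-step-ahead interval forecasts from any of the compared models (ARMA-GARCH, MLP, LSTM). The function name, the synthetic data, and the 95% nominal level are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def interval_metrics(y_true, lower, upper):
    """Prediction interval coverage probability (PICP) and mean interval width.

    y_true        : realized values, shape (T,)
    lower, upper  : interval bounds produced by some forecasting model
    """
    y_true, lower, upper = map(np.asarray, (y_true, lower, upper))
    covered = (y_true >= lower) & (y_true <= upper)   # hit indicator per time step
    picp = covered.mean()                             # empirical coverage
    mpiw = (upper - lower).mean()                     # mean prediction interval width
    return picp, mpiw

# Illustrative usage with synthetic numbers (not data from the paper):
rng = np.random.default_rng(0)
y = rng.normal(size=200)          # "realized" series
forecast = np.zeros(200)          # point forecasts from a hypothetical model
lo = forecast - 1.96              # nominal 95% interval assuming unit forecast variance
hi = forecast + 1.96
picp, mpiw = interval_metrics(y, lo, hi)
print(f"coverage = {picp:.3f}, mean width = {mpiw:.3f}")  # compare coverage to the nominal 0.95
```

A coverage below the nominal level corresponds to the variance underestimation the abstract reports for the shock period; coverage above it corresponds to the overestimation in the after-shock period.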

