
Generative Probabilistic Time Series Forecasting and Applications in Grid Operations (2402.13870v1)

Published 21 Feb 2024 in cs.LG, eess.SP, and stat.AP

Abstract: Generative probabilistic forecasting produces future time series samples according to the conditional probability distribution given past time series observations. Such techniques are essential in risk-based decision-making and planning under uncertainty with broad applications in grid operations, including electricity price forecasting, risk-based economic dispatch, and stochastic optimizations. Inspired by Wiener and Kallianpur's innovation representation, we propose a weak innovation autoencoder architecture and a learning algorithm to extract independent and identically distributed innovation sequences from nonparametric stationary time series. We show that the weak innovation sequence is Bayesian sufficient, which makes the proposed weak innovation autoencoder a canonical architecture for generative probabilistic forecasting. The proposed technique is applied to forecasting highly volatile real-time electricity prices, demonstrating superior performance across multiple forecasting measures over leading probabilistic and point forecasting techniques.


Summary

  • The paper introduces the GPF-WI framework, which uses a weak innovation autoencoder to extract independent and identically distributed (i.i.d.) innovation sequences for nonparametric time series forecasting.
  • The paper establishes that the weak innovation sequence is Bayesian sufficient, so decisions based solely on weak innovations incur no loss of optimality under uncertainty.
  • Application to volatile real-time electricity pricing shows superior forecasting accuracy compared to traditional probabilistic and point forecasting techniques.

Generative Probabilistic Forecasting via Weak Innovation Autoencoders: A New Pathway in Time Series Analysis

Introduction to Generative Probabilistic Forecasting

Generative probabilistic forecasting (GPF) is a pivotal technique for decision-making under uncertainty, especially in grid operations. It generates future time series samples according to the conditional probability distribution given past observations. Traditional approaches often rely on parametric models, which limits their ability to capture the complexity of nonparametric time series. To address nonparametric probabilistic forecasting, the paper introduces the weak innovation autoencoder (WIAE), inspired by Wiener and Kallianpur's foundational work on innovation representations.
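
Formally (a paraphrase of the abstract's setup, with notation chosen here for illustration rather than taken from the paper), given observations up to time t the goal is to draw samples of the next H values from their conditional distribution:

    \hat{x}_{t+1}, \ldots, \hat{x}_{t+H} \;\sim\; p(x_{t+1}, \ldots, x_{t+H} \mid x_t, x_{t-1}, \ldots),

where H is the forecast horizon. A point forecast reports only a single summary of this distribution, whereas generative samples support risk-based decisions such as risk-based economic dispatch.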

Weak Innovation Autoencoder (WIAE)

The WIAE is the cornerstone of the paper's approach to GPF. It addresses the difficulty of nonparametric time series forecasting by extracting independent and identically distributed (i.i.d.) innovation sequences from stationary time series. At each time step, the encoder distills the new information that is independent of past observations, producing what the authors call the weak innovation sequence. This broadens the class of time series that admit such a representation beyond what the strong innovation representation permits, while preserving Bayesian sufficiency for generative probabilistic forecasting.
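
A minimal sketch of what such an autoencoder might look like, assuming causal 1-D convolutions for the encoder and decoder; the module names, layer sizes, and framework choice (PyTorch) are illustrative assumptions, not the paper's exact architecture:

    # Hypothetical weak innovation autoencoder sketch (PyTorch).
    # Layer sizes and structure are illustrative assumptions, not the paper's design.
    import torch
    import torch.nn as nn

    class CausalConvStack(nn.Module):
        """Causal 1-D convolutions: the output at time t depends only on inputs up to t."""
        def __init__(self, in_ch, hidden, out_ch, kernel=3, layers=3):
            super().__init__()
            blocks, ch = [], in_ch
            for i in range(layers):
                out = out_ch if i == layers - 1 else hidden
                # left-pad so the convolution never sees future samples
                blocks += [nn.ConstantPad1d((kernel - 1, 0), 0.0),
                           nn.Conv1d(ch, out, kernel)]
                if i < layers - 1:
                    blocks.append(nn.ReLU())
                ch = out
            self.net = nn.Sequential(*blocks)

        def forward(self, x):              # x: (batch, channels, time)
            return self.net(x)

    class WeakInnovationAutoencoder(nn.Module):
        def __init__(self, hidden=32):
            super().__init__()
            self.encoder = CausalConvStack(1, hidden, 1)   # x_t -> innovation nu_t
            self.decoder = CausalConvStack(1, hidden, 1)   # nu_t -> reconstruction

        def forward(self, x):
            nu = self.encoder(x)           # trained to be (approximately) i.i.d.
            x_hat = self.decoder(nu)       # trained to match the distribution of x
            return nu, x_hat

Training would need a criterion that simultaneously drives the encoder output toward an i.i.d. reference distribution and the decoder output toward the data distribution (matching in distribution is what makes the representation "weak"); that objective is left unspecified in this sketch.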

Generative Probabilistic Forecasting with Weak Innovation (GPF-WI)

GPF-WI operationalizes the WIAE for time series forecasting. Rather than attempting to sample directly from the conditional distribution, which is computationally intractable for nonparametric models, GPF-WI feeds the decoder the past innovation observations together with independently drawn samples standing in for the unobserved future innovations. Because the innovation sequence is i.i.d., these fresh samples are statistically interchangeable with the true future innovations, and the decoder's output provides samples of the future trajectory.
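
A sketch of how the forecast itself could be generated with the toy modules above, assuming a standard Gaussian as the i.i.d. innovation distribution; the distribution, horizon, and tensor shapes are assumptions for illustration:

    # Hypothetical GPF-WI style forecasting loop for the toy WIAE above.
    import torch

    @torch.no_grad()
    def forecast(model, past, horizon=24, n_samples=100):
        """past: (1, 1, T) observed series; returns (n_samples, horizon) future samples."""
        nu_past = model.encoder(past)                      # past weak innovations
        ensemble = []
        for _ in range(n_samples):
            nu_future = torch.randn(1, 1, horizon)         # fresh i.i.d. innovations
            nu_full = torch.cat([nu_past, nu_future], dim=-1)
            x_full = model.decoder(nu_full)                # decode past + hypothetical future
            ensemble.append(x_full[0, 0, -horizon:])       # keep only the forecast window
        return torch.stack(ensemble)                       # Monte Carlo sample paths

Each decoded trajectory is one Monte Carlo sample from the approximate conditional distribution of the future; quantiles or other risk measures can then be read off the ensemble.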

Theoretical Implications and Bayesian Sufficiency

One of the theoretical bedrocks of this paper is the establishment of Bayesian sufficiency for weak innovation sequences in the context of probabilistic forecasting. The implication here is profound: decisions made based on weak innovation sequences do not result in a loss of optimality. This theoretical underpinning validates the use of WIAE in generative forecasting, ensuring that the forecasts generated are not only credible but also optimal from a Bayesian perspective.
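
In rough terms (a paraphrase with notation chosen here, not the paper's formal statement), Bayesian sufficiency of the weak innovations \nu_t means that conditioning on them is as informative about the future as conditioning on the raw observations:

    p(x_{t+1}, \ldots, x_{t+H} \mid \nu_t, \nu_{t-1}, \ldots) = p(x_{t+1}, \ldots, x_{t+H} \mid x_t, x_{t-1}, \ldots),

so a decision rule that is optimal given the past observations loses nothing by using only the weak innovation sequence.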

Practical Applications and Performance Evaluation

The practical significance of the proposed GPF-WI technique is underscored through its application to forecasting highly volatile real-time electricity prices. The empirical evaluation across various datasets showcases the method's superior performance in probabilistic forecasting measures compared to existing probabilistic and point forecasting techniques. This empirical evidence not only validates the efficacy of GPF-WI but also emphasizes its potential in improving decision-making processes in grid operations and beyond.

Concluding Remarks

The exploration of generative probabilistic forecasting through weak innovation autoencoders opens new avenues in the analysis of nonparametric time series. By leveraging the Bayesian sufficiency of weak innovation sequences, the proposed GPF-WI technique sets a new standard in forecasting, particularly in scenarios characterized by high volatility and uncertainty. The implications of this research extend beyond grid operations to other fields where decision-making under uncertainty is paramount. Potential extensions of the WIAE, including application to multivariate time series and exploration of more advanced neural network architectures, point to a promising future for generative probabilistic forecasting.
