
Deep Generative Methods for Producing Forecast Trajectories in Power Systems (2309.15137v1)

Published 26 Sep 2023 in cs.LG and cs.AI

Abstract: As renewables take a larger share of the electricity mix, power grid variability will increase, creating a need to robustify the system and guarantee its security. Transmission System Operators (TSOs) must therefore run simulations of the future operation of power systems, which then serve as inputs to decision-making processes. In this context, we investigate deep learning models for generating energy production and load forecast trajectories. To capture the spatiotemporal correlations in these multivariate time series, we adapt autoregressive networks and normalizing flows, demonstrating their effectiveness against the current copula-based statistical approach. We conduct extensive experiments on wind forecast data from the French TSO RTE and compare the different models using ad hoc evaluation metrics for time series generation.
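The core idea behind the flow-based approach described above can be illustrated with a minimal sketch: a lower-triangular (i.e. autoregressive) affine map turns independent Gaussian noise into temporally correlated trajectories, and the change-of-variables formula gives an exact log-density. The horizon length and the matrix parameters below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy affine autoregressive flow: x = L z + mu with L lower-triangular
# maps independent Gaussian noise z to temporally correlated trajectories
# (each row = one trajectory over T time steps). T, mu, L are assumed
# illustrative values, not the paper's configuration.
T = 24                                            # hypothetical hourly horizon
mu = np.zeros(T)
L = np.tril(0.3 * np.ones((T, T))) + np.eye(T)    # assumed flow parameters

def sample(n):
    """Draw n correlated trajectories by pushing noise through the flow."""
    z = rng.standard_normal((n, T))
    return z @ L.T + mu

def log_density(x):
    """Exact log p(x) via change of variables:
    log p(x) = log N(z; 0, I) - log|det L|, where z = L^{-1}(x - mu)."""
    z = np.linalg.solve(L, (x - mu).T).T
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi)).sum(axis=1)
    return log_pz - np.log(np.abs(np.diag(L))).sum()

trajs = sample(1000)
print(trajs.shape)  # → (1000, 24)
```

In a learned normalizing flow the affine parameters are produced by neural networks (as in MADE/MAF-style architectures), but the sampling and log-density mechanics are the same as in this sketch.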

