Advancing Long-Term Multi-Energy Load Forecasting with Patchformer: A Patch and Transformer-Based Approach (2404.10458v1)

Published 16 Apr 2024 in cs.LG and cs.AI

Abstract: To meet growing demands for long-term multi-energy load forecasting in real-world applications, this paper introduces Patchformer, a novel model that integrates patch embedding with an encoder-decoder Transformer architecture. Existing Transformer-based models struggle to capture intricate temporal patterns in long-term forecasting; Patchformer addresses this limitation with patch embedding, which separates multivariate time-series data into multiple univariate series and segments each of them into patches. This approach enhances the model's ability to capture both local and global semantic dependencies. Numerical analysis shows that Patchformer achieves better overall prediction accuracy in both multivariate and univariate long-term forecasting on a novel Multi-Energy dataset and on other benchmark datasets. The experiments also reveal that interdependence among energy-related products improves long-term forecasting performance for Patchformer and the compared models, with Patchformer outperforming the others, which represents a significant advance in handling the interdependence and complexity of long-term multi-energy forecasting. Finally, Patchformer is the only model whose performance consistently improves as the length of the past sequence grows, demonstrating its ability to capture long-range local semantic information.
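The patch-embedding step described in the abstract, splitting a multivariate series into independent univariate channels and segmenting each into fixed-length patches, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; the function name, patch length, and stride below are illustrative assumptions.

```python
import numpy as np

def patch_series(x, patch_len, stride):
    """Split a multivariate series x of shape (T, C) into per-channel patches.

    Each of the C univariate channels is treated independently (channel
    independence), then segmented into overlapping patches of length
    `patch_len` taken every `stride` steps, giving an array of shape
    (C, num_patches, patch_len). Each patch is a local window that a
    Transformer can embed as one token.
    """
    T, C = x.shape
    num_patches = (T - patch_len) // stride + 1
    out = np.empty((C, num_patches, patch_len))
    for c in range(C):  # one univariate series per channel
        for p in range(num_patches):
            out[c, p] = x[p * stride : p * stride + patch_len, c]
    return out

# Example: 96 time steps, 4 energy channels (e.g. electricity, heating,
# cooling, gas), patch length 16, stride 8
series = np.random.randn(96, 4)
patches = patch_series(series, patch_len=16, stride=8)
print(patches.shape)  # (4, 11, 16)
```

Because each patch becomes a single token, the attention sequence length shrinks from T to roughly T/stride per channel, which is what makes long look-back windows tractable for patch-based Transformers.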

Authors (3)
  1. Qiuyi Hong (1 paper)
  2. Fanlin Meng (14 papers)
  3. Felipe Maldonado (14 papers)
Citations (1)
