A Natural Gas Consumption Forecasting System for Continual Learning Scenarios based on Hoeffding Trees with Change Point Detection Mechanism (2309.03720v4)

Published 7 Sep 2023 in cs.LG and cs.AI

Abstract: Forecasting natural gas consumption, accounting for seasonality and trends, is crucial for planning its supply and consumption and for optimizing procurement costs, mainly for industrial entities. In times of threats to supply, however, it is also a critical element in guaranteeing delivery of this raw material to individual consumers, ensuring society's energy security. This article introduces a novel multistep-ahead forecasting approach for natural gas consumption that integrates change point detection for model collection selection, with continual learning capabilities based on data stream processing. The performance of forecasting models built on the proposed approach is evaluated in a complex real-world natural gas consumption forecasting use case. We employ Hoeffding tree predictors as forecasting models and the Pruned Exact Linear Time (PELT) algorithm for change point detection. Integrating change point detection enables selecting a different model collection for successive time frames. Three model collection selection procedures (with and without an error feedback loop) are defined and evaluated for forecasting scenarios with various densities of detected change points, and compared with change-point-agnostic baseline approaches. Our experiments show that fewer change points result in lower forecasting error regardless of the model collection selection procedure employed. Moreover, simpler model collection selection procedures that omit forecasting error feedback lead to more robust forecasting models suitable for continual learning tasks.
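To make the change point detection step concrete, below is a minimal pure-Python sketch of the PELT algorithm with a squared-error (mean-shift) segment cost. The cost function, penalty value, and toy series are illustrative assumptions, not the paper's actual configuration; in the paper's pipeline, the detected change points delimit the successive time frames for which different model collections are selected.

```python
def pelt(y, beta):
    """Return change point indices for the series y under penalty beta."""
    n = len(y)
    # Prefix sums let each segment cost be computed in O(1).
    s1 = [0.0] * (n + 1)
    s2 = [0.0] * (n + 1)
    for i, v in enumerate(y):
        s1[i + 1] = s1[i] + v
        s2[i + 1] = s2[i] + v * v

    def cost(a, b):
        # Sum of squared deviations of y[a:b] from its segment mean.
        return (s2[b] - s2[a]) - (s1[b] - s1[a]) ** 2 / (b - a)

    F = [0.0] * (n + 1)   # F[t]: optimal penalized cost of y[:t]
    F[0] = -beta
    last = [0] * (n + 1)  # argmin predecessor, for backtracking
    cands = [0]           # candidate last-change-points (pruned set)
    for t in range(1, n + 1):
        best, best_s = float("inf"), 0
        for s in cands:
            c = F[s] + cost(s, t) + beta
            if c < best:
                best, best_s = c, s
        F[t], last[t] = best, best_s
        # PELT pruning: drop candidates that can never become optimal again.
        cands = [s for s in cands if F[s] + cost(s, t) <= F[t]]
        cands.append(t)

    # Backtrack from the end of the series to recover the change points.
    cps, t = [], n
    while t > 0:
        s = last[t]
        if s > 0:
            cps.append(s)
        t = s
    return sorted(cps)
```

For example, `pelt([0.0] * 20 + [10.0] * 20, beta=5.0)` returns `[20]`, splitting the series at the level shift. The pruning step is what gives PELT its expected linear-time behavior while keeping the segmentation exact.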

