
Laxity-Aware Scalable Reinforcement Learning for HVAC Control (2306.16619v1)

Published 29 Jun 2023 in eess.SY, cs.AI, cs.SY, and math.OC

Abstract: Demand flexibility plays a vital role in maintaining grid balance, reducing peak demand, and lowering customers' energy bills. Given their highly shiftable load and significant contribution to a building's energy consumption, Heating, Ventilation, and Air Conditioning (HVAC) systems can provide valuable demand flexibility to the power system by adjusting their energy consumption in response to electricity prices and power system needs. To exploit this flexibility in both operation time and power, it is imperative to accurately model and aggregate the load flexibility of a large population of HVAC systems, as well as to design effective control algorithms. In this paper, we tackle the curse-of-dimensionality issue in modeling and control by utilizing the concept of laxity to quantify the urgency of each HVAC operation request. We further propose a two-level approach to energy optimization for a large population of HVAC systems. At the lower level, an aggregator collects HVAC load laxity information and uses a least-laxity-first (LLF) rule to allocate real-time power to individual HVAC systems based on the controller's total power. Due to the complex and uncertain nature of HVAC systems, we leverage a reinforcement learning (RL)-based controller to schedule the total power based on the aggregated laxity information and the electricity price. We evaluate the temperature control and energy cost saving performance of a large-scale group of HVAC systems in both single-zone and multi-zone scenarios, under varying climate and electricity market conditions. The experimental results indicate that the proposed approach outperforms centralized methods in the majority of test scenarios and performs comparably to a model-based method in some scenarios.
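The sketch below illustrates the lower-level mechanism described in the abstract: each HVAC operation request carries a laxity value, and an aggregator allocates the controller's total power with a least-laxity-first rule. It is a minimal illustration, not the authors' implementation; the laxity definition used here (slack time to deadline minus remaining required runtime) and all class names, fields, and numbers are assumptions introduced for exposition.

```python
# Minimal LLF-allocation sketch (illustrative only, not the paper's code).
from dataclasses import dataclass

@dataclass
class HvacRequest:
    unit_id: int
    deadline: float          # latest time (hours) by which the unit must finish running
    required_runtime: float  # remaining on-time (hours) needed to reach the comfort band
    rated_power: float       # kW drawn when the unit is on

def laxity(req: HvacRequest, now: float) -> float:
    """Assumed laxity definition: slack = time to deadline minus remaining required runtime."""
    return (req.deadline - now) - req.required_runtime

def llf_allocate(requests: list[HvacRequest], total_power: float, now: float) -> dict[int, float]:
    """Greedily allocate the controller's total power budget, most urgent (least laxity) first."""
    allocation = {r.unit_id: 0.0 for r in requests}
    remaining = total_power
    for req in sorted(requests, key=lambda r: laxity(r, now)):
        if remaining <= 0:
            break
        power = min(req.rated_power, remaining)
        allocation[req.unit_id] = power
        remaining -= power
    return allocation

# Example: three units competing for a 5 kW budget at t = 0 (hypothetical numbers).
requests = [
    HvacRequest(unit_id=1, deadline=2.0, required_runtime=1.8, rated_power=3.0),  # laxity 0.2
    HvacRequest(unit_id=2, deadline=3.0, required_runtime=1.0, rated_power=3.0),  # laxity 2.0
    HvacRequest(unit_id=3, deadline=1.5, required_runtime=1.4, rated_power=3.0),  # laxity 0.1
]
print(llf_allocate(requests, total_power=5.0, now=0.0))
# -> unit 3 (least laxity) gets 3 kW, unit 1 gets the remaining 2 kW, unit 2 gets 0.
```

In the paper's two-level scheme, the total_power argument would come from the RL controller, which acts on the aggregated laxity information and the electricity price rather than on per-unit states.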

