
Privacy-Preserving Load Forecasting via Personalized Model Obfuscation (2312.00036v1)

Published 21 Nov 2023 in cs.CR and cs.LG

Abstract: The widespread adoption of smart meters provides access to detailed and localized load consumption data, suitable for training building-level load forecasting models. To mitigate privacy concerns stemming from model-induced data leakage, federated learning (FL) has been proposed. This paper addresses the performance challenges of short-term load forecasting models trained with FL on heterogeneous data, emphasizing privacy preservation through model obfuscation. Our proposed algorithm, Privacy Preserving Federated Learning (PPFL), incorporates personalization layers for localized training at each smart meter. Additionally, we employ a differentially private mechanism to safeguard against data leakage from shared layers. Simulations on the NREL ComStock dataset corroborate the effectiveness of our approach.
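To make the idea concrete, below is a minimal, hypothetical sketch (not the authors' code) of the pattern the abstract describes: each client trains a local personalization layer that never leaves the smart meter, while only the shared-layer update is clipped and perturbed with Gaussian noise before aggregation. The toy linear model, the NumPy implementation, and all names and values (n_clients, clip_norm, noise_std, the synthetic data) are illustrative assumptions, not details from the paper, which uses a more elaborate forecasting model and a formal differential-privacy mechanism.

```python
# Hypothetical sketch: federated averaging with local personalization layers
# and noisy, clipped updates for the shared layers only.
import numpy as np

rng = np.random.default_rng(0)

n_clients, n_rounds = 5, 20
d_in, d_hidden = 8, 16            # toy linear "shared" layer and per-client head
clip_norm, noise_std = 1.0, 0.1   # clipping bound and relative Gaussian noise scale
lr = 0.05

# Per-client synthetic load data with heterogeneous scales (to mimic diverse buildings).
data = [(rng.normal(scale=s, size=(64, d_in)),
         rng.normal(scale=s, size=(64, 1)))
        for s in rng.uniform(0.5, 2.0, n_clients)]

shared = rng.normal(scale=0.1, size=(d_in, d_hidden))                           # shared layers (global)
heads = [rng.normal(scale=0.1, size=(d_hidden, 1)) for _ in range(n_clients)]   # personalization layers (local)

for _ in range(n_rounds):
    updates = []
    for c, (X, y) in enumerate(data):
        W, h = shared.copy(), heads[c]
        # One local gradient step on the mean squared error of the two-layer model.
        err = X @ W @ h - y
        grad_W = X.T @ (err @ h.T) / len(X)
        grad_h = (X @ W).T @ err / len(X)
        W -= lr * grad_W
        heads[c] = h - lr * grad_h        # personalization layer stays on the client
        # Clip the shared-layer update and add Gaussian noise before sharing it.
        delta = W - shared
        delta *= min(1.0, clip_norm / (np.linalg.norm(delta) + 1e-12))
        updates.append(delta + rng.normal(scale=noise_std * clip_norm, size=delta.shape))
    shared += np.mean(updates, axis=0)    # server aggregates only the obfuscated shared updates
```

The split mirrors the abstract's design choice: heterogeneity across clients is absorbed by the locally trained heads, while the noise is applied only to the parameters that are actually communicated, so the privacy mechanism does not touch the personalized part of the model.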

