FedDiSC: A Computation-efficient Federated Learning Framework for Power Systems Disturbance and Cyber Attack Discrimination (2304.03640v1)

Published 7 Apr 2023 in cs.CR, cs.DC, and cs.LG

Abstract: With the growing concern about the security and privacy of smart grid systems, cyberattacks on critical power grid components, such as state estimation, have proven to be one of the top-priority cyber-related issues and have received significant attention in recent years. However, cyberattack detection in smart grids now faces new challenges, including privacy preservation and decentralized power zones with strategic data owners. To address these technical bottlenecks, this paper proposes a novel Federated Learning-based, privacy-preserving and communication-efficient attack detection framework, known as FedDiSC, that enables Discrimination between power System disturbances and Cyberattacks. Specifically, we first propose a Federated Learning approach to enable Supervisory Control and Data Acquisition subsystems of decentralized power grid zones to collaboratively train an attack detection model without sharing sensitive power-related data. Second, we put forward a representation learning-based Deep Auto-Encoder network to accurately detect power system and cybersecurity anomalies. Lastly, to adapt our proposed framework to the timeliness of real-world cyberattack detection in SGs, we leverage a gradient privacy-preserving quantization scheme known as DP-SIGNSGD to improve its communication efficiency. Extensive simulations of the proposed framework on publicly available Industrial Control Systems datasets demonstrate that it achieves superior detection accuracy while preserving the privacy of sensitive power grid-related information. Furthermore, we find that the gradient quantization scheme improves communication efficiency by 40% compared to a traditional federated learning approach without gradient quantization, which suggests suitability for real-world deployment.
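
To make the communication-efficiency idea concrete, the sketch below illustrates sign-based, noise-perturbed gradient quantization in the spirit of DP-SIGNSGD: each client clips and perturbs its local gradient, transmits only the 1-bit sign of each coordinate, and the server aggregates by majority vote. This is a minimal illustrative sketch, not the paper's actual implementation; the function names, clipping norm, noise scale, and learning rate are all assumptions.

```python
# Hypothetical sketch of DP-SIGNSGD-style quantization; hyperparameters are illustrative.
import numpy as np

def client_update(gradient, clip_norm=1.0, noise_std=0.1, rng=None):
    """Clip the local gradient, add Gaussian noise for privacy, and send only its sign."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(gradient)
    clipped = gradient * min(1.0, clip_norm / (norm + 1e-12))   # L2 clipping
    noisy = clipped + rng.normal(0.0, noise_std, size=gradient.shape)
    return np.sign(noisy).astype(np.int8)                        # 1 bit per coordinate

def server_aggregate(sign_updates, lr=0.01):
    """Majority-vote the client signs and scale by a server step size."""
    vote = np.sign(np.sum(sign_updates, axis=0))
    return -lr * vote                                            # model update direction

# Toy round with three simulated SCADA-zone clients (hypothetical data).
rng = np.random.default_rng(0)
grads = [rng.normal(size=8) for _ in range(3)]
signs = [client_update(g, rng=rng) for g in grads]
delta = server_aggregate(np.stack(signs))
print(delta)
```

Because each coordinate is reduced to a single sign bit before transmission, the per-round upload shrinks from 32 bits to 1 bit per parameter, which is the mechanism behind the communication savings the abstract reports.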

Citations (13)
