Large Foundation Models for Power Systems (2312.07044v1)

Published 12 Dec 2023 in eess.SY, cs.LG, and cs.SY

Abstract: Foundation models, such as LLMs, can respond to a wide range of format-free queries without any task-specific data collection or model training, creating various research and application opportunities for the modeling and operation of large-scale power systems. In this paper, we outline how large foundation models such as GPT-4 are developed, and discuss how they can be leveraged in challenging power and energy system tasks. We first investigate the potential of existing foundation models by validating their performance on four representative tasks across power system domains: optimal power flow (OPF), electric vehicle (EV) scheduling, knowledge retrieval for power engineering technical reports, and situation awareness. Our results indicate that such foundation models have strong capabilities for boosting the efficiency and reliability of power system operational pipelines. We also provide suggestions and projections on the future deployment of foundation models in power system applications.
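As a concrete instance of the OPF-style tasks the abstract mentions, the sketch below shows a merit-order economic dispatch, a standard simplification of OPF that ignores network constraints. This is an illustrative stand-in, not the paper's actual benchmark; the function name and data shapes are assumptions for the example.

```python
def merit_order_dispatch(gens, demand):
    """Greedy economic dispatch: serve demand from cheapest units first.

    gens: list of (marginal_cost, capacity_mw) tuples (hypothetical format).
    demand: total load in MW.
    Returns a dict mapping generator index -> dispatched MW.
    """
    # Sort generator indices by marginal cost (the "merit order").
    order = sorted(range(len(gens)), key=lambda i: gens[i][0])
    dispatch = {i: 0.0 for i in range(len(gens))}
    remaining = demand
    for i in order:
        take = min(gens[i][1], remaining)  # dispatch up to capacity
        dispatch[i] = take
        remaining -= take
        if remaining <= 0:
            break
    if remaining > 1e-9:
        raise ValueError("insufficient capacity to meet demand")
    return dispatch


# Example: three units at $20, $35, $50 per MWh serving a 120 MW load.
gens = [(20.0, 100.0), (35.0, 50.0), (50.0, 80.0)]
print(merit_order_dispatch(gens, 120.0))  # cheapest unit fills first
```

In the paper's setting, a foundation model would be prompted to reason through (or formulate) such a dispatch problem rather than execute this code directly.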

Authors (5)
  1. Chenghao Huang (10 papers)
  2. Siyang Li (51 papers)
  3. Ruohong Liu (7 papers)
  4. Hao Wang (1119 papers)
  5. Yize Chen (57 papers)
Citations (12)