Bucketized Active Sampling for Learning ACOPF (2208.07497v3)
Abstract: This paper considers optimization proxies for Optimal Power Flow (OPF), i.e., machine-learning models that approximate the input/output relationship of OPF. Recent work has focused on showing that such proxies can be of high fidelity. However, their training requires significant data, with each instance requiring the (offline) solution of an OPF. To meet the requirements of market-clearing applications, this paper proposes Bucketized Active Sampling (BAS), a novel active learning framework that aims to train the best possible OPF proxy within a time limit. BAS partitions the input domain into buckets and uses an acquisition function to determine where to sample next. By applying the same partitioning to the validation set, BAS leverages labeled validation samples in the selection of unlabeled samples. BAS also relies on an adaptive learning rate that increases and decreases over time. Experimental results demonstrate the benefits of BAS.
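For intuition, the following is a minimal Python sketch of one sampling round in the spirit of the abstract: buckets are scored by the current proxy's validation error, the labeling budget is allocated toward high-error buckets, and the selected inputs are labeled by solving the corresponding OPF instances offline. The bucketing key, error metric, and labeling routine (`key_fn`, `error_fn`, `label_fn`) are hypothetical placeholders supplied by the caller, not the paper's implementation.

```python
import numpy as np

def bas_sampling_round(model, val_set, pool, edges, n_new,
                       key_fn, error_fn, label_fn, rng=None):
    """One round of bucketized active sampling (illustrative sketch).

    edges    : interior bucket boundaries over the bucketing key
               (e.g., total system load), giving len(edges) + 1 buckets.
    key_fn   : maps an input instance to its scalar bucketing key.
    error_fn : proxy error of `model` on a labeled validation pair.
    label_fn : labels an input by solving the corresponding OPF offline.
    """
    rng = rng or np.random.default_rng()
    n_buckets = len(edges) + 1

    # 1. Score each bucket with the mean validation error of the current proxy.
    err = np.zeros(n_buckets)
    cnt = np.zeros(n_buckets)
    for x, y in val_set:
        b = int(np.digitize(key_fn(x), edges))
        err[b] += error_fn(model, x, y)
        cnt[b] += 1
    scores = err / np.maximum(cnt, 1)

    # 2. Acquisition: spend the labeling budget where validation error is highest.
    if scores.sum() > 0:
        probs = scores / scores.sum()
    else:
        probs = np.full(n_buckets, 1.0 / n_buckets)
    alloc = rng.multinomial(n_new, probs)

    # 3. Draw unlabeled inputs bucket by bucket and label them via offline OPF solves.
    new_samples = []
    for b in range(n_buckets):
        candidates = [x for x in pool if int(np.digitize(key_fn(x), edges)) == b]
        rng.shuffle(candidates)
        for x in candidates[: alloc[b]]:
            new_samples.append((x, label_fn(x)))  # offline OPF solve
    return new_samples
```

The newly labeled pairs would then be added to the training set before the next training epoch; the proportional-to-error allocation above is only one plausible acquisition rule among those the framework could use.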