
A New State-of-the-Art Transformers-Based Load Forecaster on the Smart Grid Domain (2108.02628v1)

Published 5 Aug 2021 in cs.LG

Abstract: Meter-level load forecasting is crucial for efficient energy management and power system planning in Smart Grids (SGs), supporting tasks such as regulation, dispatching, scheduling, and unit commitment. Although a variety of algorithms have been proposed and applied in the field, more accurate and robust models are still required: the overall utility cost of operations in SGs increases by 10 million currency units if the load forecasting error increases by 1%, and the mean absolute percentage error (MAPE) of current forecasts is still well above 1%. Transformers have become the new state of the art in a variety of tasks, including computer vision, natural language processing, and time series forecasting, surpassing alternative neural models such as convolutional and recurrent neural networks. In this letter, we present a new state-of-the-art Transformer-based algorithm for the meter-level load forecasting task, which surpasses the former state of the art, LSTM, and the traditional benchmark, vanilla RNN, in all experiments by a margin of at least 13% in MAPE.
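
The page does not include the paper's implementation details, so the following is only a minimal sketch of the general approach the abstract describes: a PyTorch Transformer encoder over a window of past meter readings, evaluated with MAPE. The layer sizes, the single-feature input, and the class name TransformerLoadForecaster are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch (not the paper's implementation): a Transformer-encoder
# forecaster for meter-level load, plus the MAPE metric the abstract cites.
# Layer sizes and the single load feature are illustrative assumptions;
# positional encoding is omitted for brevity.
import torch
import torch.nn as nn

def mape(y_true: torch.Tensor, y_pred: torch.Tensor) -> torch.Tensor:
    """Mean absolute percentage error, in percent."""
    return torch.abs((y_true - y_pred) / y_true).mean() * 100

class TransformerLoadForecaster(nn.Module):
    """Encode a window of past load readings and predict the next value."""
    def __init__(self, d_model: int = 64, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)   # single feature: load reading
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)         # next-step load

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window, 1) past load readings
        h = self.encoder(self.input_proj(x))
        return self.head(h[:, -1, :])             # (batch, 1) one-step forecast

if __name__ == "__main__":
    model = TransformerLoadForecaster()
    window = torch.rand(8, 24, 1)                 # 8 meters, 24-step history
    pred = model(window)
    target = torch.rand(8, 1) + 0.1               # dummy targets, kept away from zero
    print(pred.shape, mape(target, pred).item())
```

In practice such a model would be trained with a standard regression loss (e.g. MSE) and then compared against LSTM and vanilla-RNN baselines on MAPE, which is the comparison the abstract reports.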

Citations (1)
