Gradient-Boosted Mixture Regression Models for Postprocessing Ensemble Weather Forecasts (2412.09583v1)

Published 12 Dec 2024 in stat.AP

Abstract: Nowadays, weather forecasts are commonly generated as ensemble forecasts based on multiple runs of numerical weather prediction models. However, such forecasts are usually miscalibrated and/or biased, and thus require statistical postprocessing. Non-homogeneous regression models, such as ensemble model output statistics (EMOS), are frequently applied to correct these forecasts. Nonetheless, these methods often rely on the assumption of a unimodal parametric distribution, leading to improved but sometimes not fully calibrated forecasts. To address this issue, a mixture regression model is presented in which the ensemble forecasts of each exchangeable group are linked to exactly one mixture component and mixture weight, called the mixture of model output statistics (MIXMOS). To remove location-specific effects and to use a longer training period, standardized anomalies of the response and the ensemble forecasts are employed in the mixture of standardized anomaly model output statistics (MIXSAMOS). Since carefully selected covariates, e.g. from other weather variables, can enhance model performance, a non-cyclic gradient-boosting algorithm for mixture regression models is introduced. Furthermore, MIXSAMOS is extended by this gradient-boosting algorithm (MIXSAMOS-GB), which provides automatic variable selection. The novel mixture regression models substantially outperform state-of-the-art postprocessing models in a case study of 2 m surface temperature forecasts in Germany.
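As a rough illustration of the model class (a sketch only; the abstract does not specify the component distributions, so Gaussian components with the usual non-homogeneous link functions are assumed here), the MIXMOS predictive density for G exchangeable ensemble groups can be written as

\[
f(y \mid \boldsymbol{x}) \;=\; \sum_{g=1}^{G} \omega_g \, \mathcal{N}\!\left(y \mid \mu_g, \sigma_g^2\right),
\qquad
\mu_g = a_g + b_g \bar{x}_g, \quad
\log \sigma_g = c_g + d_g s_g, \quad
\sum_{g=1}^{G} \omega_g = 1,
\]

where \bar{x}_g and s_g denote the mean and spread of the ensemble members in group g, so that each exchangeable group contributes exactly one mixture component and one weight. MIXSAMOS applies the same structure to standardized anomalies of the response and the ensemble forecasts, and MIXSAMOS-GB fits the coefficients with the non-cyclic gradient-boosting algorithm, which in each iteration updates only the covariate-parameter combination that most reduces the loss and thereby performs the automatic variable selection mentioned in the abstract.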
