Gradient boosting in Markov-switching generalized additive models for location, scale and shape (1710.02385v2)

Published 6 Oct 2017 in stat.ME and stat.CO

Abstract: We propose a novel class of flexible latent-state time series regression models which we call Markov-switching generalized additive models for location, scale and shape. In contrast to conventional Markov-switching regression models, the presented methodology allows us to model different state-dependent parameters of the response distribution - not only the mean, but also variance, skewness and kurtosis parameters - as potentially smooth functions of a given set of explanatory variables. In addition, the set of possible distributions that can be specified for the response is not limited to the exponential family but additionally includes, for instance, a variety of Box-Cox-transformed, zero-inflated and mixture distributions. We propose an estimation approach based on the EM algorithm, where we use the gradient boosting framework to prevent overfitting while simultaneously performing variable selection. The feasibility of the suggested approach is assessed in simulation experiments and illustrated in a real-data setting, where we model the conditional distribution of the daily average price of energy in Spain over time.
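To make the model class concrete, below is a minimal sketch (not the authors' implementation, which combines the EM algorithm with gradient boosting for estimation and variable selection): a 2-state Markov-switching Gaussian regression in which both the state-dependent mean (location) and standard deviation (scale) vary smoothly with a covariate, together with a log-space forward algorithm for evaluating the likelihood. All function names, parameter values, and smooth effects here are illustrative assumptions.

```python
# Minimal sketch of a Markov-switching model with state-dependent location and
# scale as smooth functions of a covariate, plus forward-algorithm likelihood.
# Illustrative only; the paper's estimation uses EM with gradient boosting.
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

rng = np.random.default_rng(0)

# --- Simulate a 2-state Markov chain and covariate-dependent observations ---
T = 500
Gamma = np.array([[0.95, 0.05],        # transition probability matrix
                  [0.10, 0.90]])
delta = np.array([2 / 3, 1 / 3])       # stationary initial distribution of Gamma

x = rng.uniform(0, 1, size=T)          # explanatory variable

# State-dependent smooth effects on location and scale (assumed, for illustration)
def mu(state, x):
    return np.where(state == 0, 1.0 + np.sin(2 * np.pi * x), 4.0 + 2.0 * x)

def sigma(state, x):
    return np.where(state == 0, 0.5 + 0.3 * x, np.exp(0.2 + 0.5 * x))

states = np.empty(T, dtype=int)
states[0] = rng.choice(2, p=delta)
for t in range(1, T):
    states[t] = rng.choice(2, p=Gamma[states[t - 1]])

y = rng.normal(mu(states, x), sigma(states, x))

# --- Forward algorithm: log-likelihood of y under the switching model ---
def log_likelihood(y, x, Gamma, delta):
    T = len(y)
    # log densities for each state at each time point, shape (T, 2)
    log_dens = np.stack(
        [norm.logpdf(y, loc=mu(s, x), scale=sigma(s, x)) for s in (0, 1)],
        axis=1,
    )
    log_alpha = np.log(delta) + log_dens[0]
    log_Gamma = np.log(Gamma)
    for t in range(1, T):
        # alpha_t(j) = sum_i alpha_{t-1}(i) * Gamma[i, j] * f_j(y_t | x_t)
        log_alpha = logsumexp(log_alpha[:, None] + log_Gamma, axis=0) + log_dens[t]
    return logsumexp(log_alpha)

print("log-likelihood:", log_likelihood(y, x, Gamma, delta))
```

In the proposed methodology, each state-dependent distribution parameter (not just the mean and standard deviation sketched above, but also skewness or kurtosis parameters where the response distribution has them) would be fitted by gradient boosting within the M-step of the EM algorithm, which regularizes the smooth effects and performs variable selection at the same time.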
