
Variable Grouping Based Bayesian Additive Regression Tree (1911.00922v2)

Published 3 Nov 2019 in cs.LG, eess.SP, stat.ME, and stat.ML

Abstract: Ensemble methods for regression, such as Bagging, Random Forests, Boosting, BART (Bayesian additive regression trees), and their variants, have been highly successful at producing accurate predictions. In this paper, we propose a new perspective, called variable grouping, to enhance predictive performance. The main idea is to seek a grouping of variables such that there are no nonlinear interaction terms between variables in different groups. Given a sum-of-learners model, each learner is then responsible for only one group of variables, which makes modeling nonlinear interactions more efficient. We propose a two-stage method named variable grouping based Bayesian additive regression tree (GBART), accompanied by a Python package, gbart. The first stage searches for potential interactions and an appropriate grouping of variables; the second stage builds a final model based on the discovered groups. Experiments on synthetic and real data show that the proposed method can perform significantly better than classical approaches.
