Posterior Summarization for Variable Selection in Bayesian Tree Ensembles (2509.07121v1)
Abstract: Variable selection remains a fundamental challenge in statistics, especially in nonparametric settings where model complexity can obscure interpretability. Bayesian tree ensembles, particularly the popular Bayesian additive regression trees (BART) and their rich variants, offer strong predictive performance with interpretable variable importance measures. We modularize Bayesian tree ensemble methods into model choice (tree prior) and posterior summary, and provide a unified review of existing strategies, including permutation-based inference, sparsity-inducing priors, and clustering-based selection. Though typically framed as a modeling task, we show that variable selection with tree ensembles often hinges on posterior summarization, which remains underexplored. To this end, we introduce the VC-measure (Variable Count and its rank variant) with a clustering-based threshold. This posterior summary is a simple, tuning-free plug-in that requires no additional sampling beyond standard model fitting, integrates with any BART variant, and avoids the instability of the median probability model and the computational cost of permutations. In extensive experiments, it yields uniform $F_1$ gains for both general-purpose and sparsity-inducing priors; when paired with the Dirichlet Additive Regression Tree (DART), it overcomes pitfalls of the original summary and attains the best overall balance of recall, precision, and efficiency. Practical guidance on aligning summaries and downstream goals is discussed.
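The clustering-based threshold on variable counts described in the abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: it assumes `counts[j]` holds the total number of splitting rules using predictor `j`, aggregated over all trees and posterior draws, and separates "signal" from "noise" variables with a simple one-dimensional 2-means clustering.

```python
# Hypothetical sketch of a VC-measure-style posterior summary.
# `counts[j]` = total splitting-rule usage of variable j across the
# posterior tree ensemble (names and setup are illustrative).

def vc_select(counts):
    """Select variables in the high-count cluster via 1-D 2-means."""
    if len(set(counts)) < 2:
        return list(range(len(counts)))  # all tied: no separation possible
    # Initialize the two cluster centers at the min and max counts.
    lo, hi = float(min(counts)), float(max(counts))
    for _ in range(100):
        group_hi = [c for c in counts if abs(c - hi) < abs(c - lo)]
        group_lo = [c for c in counts if abs(c - hi) >= abs(c - lo)]
        new_lo = sum(group_lo) / len(group_lo)
        new_hi = sum(group_hi) / len(group_hi)
        if (new_lo, new_hi) == (lo, hi):
            break  # centers converged
        lo, hi = new_lo, new_hi
    # Variables whose counts fall in the high cluster are "selected".
    return [j for j, c in enumerate(counts) if abs(c - hi) < abs(c - lo)]

# Example with 5 predictors: variables 0 and 2 dominate the splits.
counts = [120, 8, 115, 5, 7]
print(vc_select(counts))  # -> [0, 2]
```

Because the counts are a by-product of any fitted BART-style sampler, this summary needs no extra sampling or tuning, which is the "plug-in" property the abstract emphasizes.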