Bayesian Backfitting MCMC in BART
- Bayesian Backfitting MCMC is a probabilistic framework that uses Gibbs and RJMCMC updates to efficiently infer tree structures in Bayesian additive regression trees (BART).
- The methodology removes the need for closed-form conjugate updates by employing reversible-jump moves with Laplace-approximated proposals, enabling inference under arbitrary likelihoods.
- This approach broadens BART's applicability to models like structured heteroskedastic regression, survival analysis, and gamma-shape regression while ensuring robust uncertainty quantification.
Bayesian Backfitting Markov Chain Monte Carlo (MCMC) refers to an MCMC-based inference framework for Bayesian additive regression trees (BART). In its original form, the algorithm exploits conditional conjugacy to enable efficient Gibbs-style updates for tree structures and leaf parameters, but this restricts applicability to models where closed-form posteriors exist. Recent advances, most notably the reversible-jump (RJ) extension, remove this restriction—allowing Bayesian backfitting to be deployed for arbitrary likelihoods using RJMCMC updates, provided only that the likelihood and (optionally) its gradient and curvature can be evaluated. This greatly expands the domain of BART, enabling its use in structured heteroskedastic regression, survival analysis, and many other modeling contexts (Linero, 2022).
1. Semiparametric Regression Setup for Bayesian Backfitting
BART models the conditional mean function in regression as a sum of tree-structured learners. For observed data $(X_i, Y_i)$, $i = 1, \ldots, n$, the model assumes

$$Y_i = \sum_{m=1}^{M} g(X_i; \mathcal{T}_m, \mathcal{M}_m) + \epsilon_i, \qquad \epsilon_i \sim \mathcal{N}(0, \sigma^2),$$

where $g(\cdot\,; \mathcal{T}_m, \mathcal{M}_m)$ denotes a regression tree, with $\mathcal{T}_m$ the split structure and $\mathcal{M}_m = \{\mu_{m\ell}\}$ the leaf means. Priors are typically set as: a branching-process prior for $\mathcal{T}_m$ (with split probability at depth $d$ given by $\gamma(1+d)^{-\beta}$), independent Gaussian priors $\mu_{m\ell} \sim \mathcal{N}(0, \sigma_\mu^2)$ for leaf values, and improper (e.g., $\pi(\sigma^2) \propto \sigma^{-2}$) or proper inverse-gamma priors for the variance $\sigma^2$ (Linero, 2022).
The joint posterior is

$$\pi\big(\{\mathcal{T}_m, \mathcal{M}_m\}_{m=1}^{M}, \sigma^2 \mid \mathcal{D}\big) \;\propto\; \prod_{i=1}^{n} \mathcal{N}\!\Big(Y_i \,\Big|\, \sum_{m=1}^{M} g(X_i; \mathcal{T}_m, \mathcal{M}_m),\, \sigma^2\Big) \prod_{m=1}^{M} \pi(\mathcal{M}_m \mid \mathcal{T}_m)\, \pi(\mathcal{T}_m)\; \pi(\sigma^2).$$
The backfitting MCMC update is performed one tree at a time, conditioning on the others through partial residuals

$$R_i^{(m)} = Y_i - \sum_{j \neq m} g(X_i; \mathcal{T}_j, \mathcal{M}_j).$$

For each $m$, the tree structure $\mathcal{T}_m$ is updated and its leaf means $\mathcal{M}_m$ are drawn from closed-form Gaussian full-conditionals, exploiting Normal–Normal conjugacy.
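To make the conjugate step concrete, here is a minimal Python sketch of the Gaussian full-conditional draw for a single leaf mean; the function name and argument conventions are illustrative, not from the source.

```python
import numpy as np

def draw_leaf_mean(resids, sigma2, sigma_mu2, rng):
    """Normal-Normal conjugate draw for one leaf mean.

    resids: partial residuals R_i^(m) for the observations in this leaf.
    sigma2: current noise variance; sigma_mu2: leaf-prior variance.
    """
    n_leaf = resids.size
    post_var = 1.0 / (n_leaf / sigma2 + 1.0 / sigma_mu2)  # conjugate posterior variance
    post_mean = post_var * resids.sum() / sigma2          # conjugate posterior mean
    return rng.normal(post_mean, np.sqrt(post_var))

# Usage: draw_leaf_mean(np.array([0.2, -0.1, 0.4]), 1.0, 0.25, np.random.default_rng(0))
```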
2. Limitations of Conditional Conjugacy
The standard backfitting algorithm relies on the ability to compute, for each tree and leaf node $\ell$, an integrated likelihood of the form

$$L_\ell(\mathcal{T}_m) = \int \Bigg[\prod_{i : X_i \to \ell} f\big(R_i^{(m)} \mid \mu\big)\Bigg] \pi(\mu)\, d\mu$$
in closed form. This is possible for a small set of likelihood–prior pairs (e.g., Gaussian–Gaussian, Poisson–log-gamma, some standard survival models) but is intractable for most settings, including generalized linear models with nonconjugate random effects, beta-binomial models, and others (Linero, 2022). When conditional conjugacy fails, bespoke solutions or quadrature are required, or BART cannot practically be used with such likelihoods.
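For the Gaussian–Gaussian pair the integral is available in closed form. As a worked instance (standard Normal–Normal algebra, included here for orientation rather than quoted from the source), for a leaf receiving residuals $R_1, \ldots, R_{n_\ell}$ with $s = \sum_i R_i$,

$$\int \prod_{i=1}^{n_\ell} \mathcal{N}(R_i \mid \mu, \sigma^2)\, \mathcal{N}(\mu \mid 0, \sigma_\mu^2)\, d\mu = (2\pi\sigma^2)^{-n_\ell/2} \sqrt{\frac{\sigma^2}{\sigma^2 + n_\ell \sigma_\mu^2}}\; \exp\!\left( -\frac{\sum_i R_i^2}{2\sigma^2} + \frac{\sigma_\mu^2\, s^2}{2\sigma^2(\sigma^2 + n_\ell \sigma_\mu^2)} \right),$$

which is exactly the quantity the Gibbs sampler multiplies across leaves when scoring a proposed tree.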
3. RJMCMC Backfitting: Removing Conjugacy Restrictions
Linero (2022) introduced an RJMCMC-based extension that allows arbitrary likelihood models in the Bayesian backfitting context. In this framework, the joint update of both the tree structure and the leaf values is performed using reversible-jump MCMC, enabling Bayesian backfitting without the need for closed-form marginalization. The requirements are:
- Evaluability of the leaf-level log-likelihood $\Lambda_\ell(\mu) = \sum_{i : X_i \to \ell} \log f(Y_i \mid \mu, \cdot)$, where the sum runs over the observations routed to leaf $\ell$ and the remaining trees and nuisance parameters are held fixed.
- Ability to compute (ideally) the score $\Lambda_\ell'(\mu)$ and the Fisher information (or observed curvature) $-\Lambda_\ell''(\mu)$; finite-difference approximations suffice otherwise (see the sketch after this list).
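When analytic derivatives are unavailable, central finite differences are enough. A minimal sketch (function and argument names are illustrative):

```python
def score_and_curvature(log_lik, mu, h=1e-5):
    """Central finite-difference approximations to the score and the
    negative curvature (observed information) of a leaf log-likelihood."""
    f_plus, f_mid, f_minus = log_lik(mu + h), log_lik(mu), log_lik(mu - h)
    score = (f_plus - f_minus) / (2.0 * h)            # ~ d/dmu log_lik
    info = -(f_plus - 2.0 * f_mid + f_minus) / h**2   # ~ -d^2/dmu^2 log_lik
    return score, info
```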
The RJMCMC moves are:
- Birth: Propose splitting a leaf, sampling a new split rule and independent values $(\mu_L, \mu_R)$ for the child leaves from a Laplace-approximated local Gaussian.
- Death: Propose collapsing two sibling leaves, reusing or integrating their parameters.
- Change: Propose resampling the split rule for an internal node and redrawing leaf values.
Acceptance ratios follow the Green (1995) RJMCMC framework. Proposition 1 in (Linero, 2022) gives explicit formulas, where (for a birth move that splits leaf $\ell$ into children $L$ and $R$)

$$a_{\mathrm{birth}} = \frac{\pi(\mathcal{T}_m')}{\pi(\mathcal{T}_m)} \cdot \frac{\pi(\mu_L)\, \pi(\mu_R)}{\pi(\mu_\ell)} \cdot e^{\Lambda_L(\mu_L) + \Lambda_R(\mu_R) - \Lambda_\ell(\mu_\ell)} \cdot \frac{q\big(\mathcal{T}_m \mid \mathcal{T}_m'\big)\, q_{\mathrm{Laplace}}(\mu_\ell)}{q\big(\mathcal{T}_m' \mid \mathcal{T}_m\big)\, q_{\mathrm{Laplace}}(\mu_L)\, q_{\mathrm{Laplace}}(\mu_R)},$$

i.e., the product of the prior ratio, the likelihood ratio, and the ratio of reverse to forward proposal densities, with analogous ratios for death and change.
Leaf proposals are constructed via Laplace approximations. At a new leaf $\ell$, $\hat\mu_\ell$ and $\hat\tau_\ell^2$ approximate the mode and inverse curvature of the local posterior:

$$\hat\mu_\ell = \arg\max_{\mu}\,\big\{\Lambda_\ell(\mu) + \log \pi(\mu)\big\}, \qquad \hat\tau_\ell^2 = \left[-\frac{d^2}{d\mu^2}\big\{\Lambda_\ell(\mu) + \log \pi(\mu)\big\}\Big|_{\mu = \hat\mu_\ell}\right]^{-1},$$

so the proposal is $q_{\mathrm{Laplace}}(\mu) = \mathcal{N}(\mu \mid \hat\mu_\ell, \hat\tau_\ell^2)$.
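The following Python sketch shows one way to realize such a proposal: a few Newton steps locate the mode, the inverse negative curvature provides the proposal variance, and the log proposal density is returned for use in the acceptance ratio. The function name and the scalar-parameter setup are illustrative assumptions.

```python
import numpy as np

def laplace_proposal(d1, d2, mu0=0.0, n_newton=10, rng=None):
    """Draw a leaf value from a Laplace-approximated Gaussian proposal.

    d1, d2: first and second derivatives of the local log-posterior
    Lambda(mu) + log pi(mu); d2 should be negative near the mode.
    Returns the draw and its proposal log-density (needed in the MH ratio).
    """
    rng = rng or np.random.default_rng()
    mu = mu0
    for _ in range(n_newton):            # Newton iterations toward the mode
        mu -= d1(mu) / d2(mu)
    tau2 = -1.0 / d2(mu)                 # inverse negative curvature at the mode
    draw = rng.normal(mu, np.sqrt(tau2))
    log_q = -0.5 * np.log(2 * np.pi * tau2) - (draw - mu) ** 2 / (2 * tau2)
    return draw, log_q
```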
4. Algorithmic Workflow and Pseudocode
The RJMCMC-based Bayesian backfitting algorithm, per (Linero, 2022), proceeds as follows for each tree $m$ at each iteration (a compact code rendering follows the list):
- Compute partial residuals $R_i^{(m)} = Y_i - \sum_{j \neq m} g(X_i; \mathcal{T}_j, \mathcal{M}_j)$.
- With specified probabilities $(p_{\mathrm{birth}}, p_{\mathrm{death}}, p_{\mathrm{change}})$, select the move type.
- Propose $(\mathcal{T}_m', \mathcal{M}_m')$ via the chosen move, with new leaf means sampled from their Laplace-approximated Gaussians $\mathcal{N}(\hat\mu_\ell, \hat\tau_\ell^2)$.
- Compute the acceptance ratio $a$ (Proposition 1).
- Accept with probability $\min(1, a)$; otherwise retain the current $(\mathcal{T}_m, \mathcal{M}_m)$.
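A compact Python rendering of one sweep follows; the tree representation and the `predict_fn`/`propose_fn` callables are assumptions supplied by the surrounding implementation, not an API from the source.

```python
import numpy as np

def backfit_sweep(trees, X, Y, predict_fn, propose_fn, rng,
                  move_probs=(0.4, 0.4, 0.2)):
    """One RJMCMC backfitting sweep over the tree ensemble.

    predict_fn(tree, X) -> fitted values of a single tree;
    propose_fn(move, tree, resid, rng) -> (proposed tree, log acceptance ratio),
    with new leaf means drawn internally from their Laplace proposals.
    """
    total_fit = sum(predict_fn(t, X) for t in trees)
    for m, tree in enumerate(trees):
        # Partial residuals: data minus the fit of all the other trees.
        resid = Y - (total_fit - predict_fn(tree, X))

        # Select the move type with the specified probabilities.
        move = rng.choice(["birth", "death", "change"], p=list(move_probs))
        proposal, log_ratio = propose_fn(move, tree, resid, rng)

        # Metropolis-Hastings accept/reject; log_ratio bundles the prior,
        # likelihood, and proposal terms of Proposition 1.
        if np.log(rng.uniform()) < log_ratio:
            total_fit += predict_fn(proposal, X) - predict_fn(tree, X)
            trees[m] = proposal
    return trees
```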
After all trees are updated, global (nuisance) parameters (e.g., variance, shape) are sampled via their respective full-conditionals, such as slice sampling for $\sigma^2$.
Key complexity and tuning characteristics:
- No tuning is required beyond the standard BART hyperparameters (number of trees $M$, branching-prior parameters $\gamma, \beta$, and leaf-prior scale $\sigma_\mu$).
- The computational cost per iteration is of the same order as conjugate backfitting, scaling linearly in the number of observations $n$ and the number of trees $M$.
- Empirically, RJMCMC backfitting exhibits mixing comparable to the conjugate scenario (Linero, 2022).
5. Detailed Balance, Dimension Matching, and Theoretical Properties
Each RJMCMC proposal alters the dimension of $\mathcal{M}_m$ by adding or removing one leaf value; detailed balance is preserved via Green's (1995) dimension-matching construction. For the birth move, the required bijective mapping in parameter space is implemented by discarding a single leaf value $\mu_\ell$ and replacing it with independently proposed values $(\mu_L, \mu_R)$; the Jacobian is 1. The acceptance probability

$$\alpha = \min\left\{1,\; \frac{\pi(\mathcal{T}_m', \mathcal{M}_m' \mid \mathcal{D})\; q\big((\mathcal{T}_m, \mathcal{M}_m) \mid (\mathcal{T}_m', \mathcal{M}_m')\big)}{\pi(\mathcal{T}_m, \mathcal{M}_m \mid \mathcal{D})\; q\big((\mathcal{T}_m', \mathcal{M}_m') \mid (\mathcal{T}_m, \mathcal{M}_m)\big)}\right\}$$

specializes precisely to the ratios of Proposition 1 (Linero, 2022). The use of Laplace-based proposals $\mathcal{N}(\hat\mu_\ell, \hat\tau_\ell^2)$ ensures automatic adaptation to the target density's local geometry, obviating the need for user-tuned proposal scales.
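For concreteness, the birth-move log-acceptance ratio can be assembled from its pieces as below; every log-density is passed in explicitly, and the argument names are illustrative rather than taken from the source.

```python
def log_accept_birth(log_lik_new, log_lik_old,
                     log_prior_new, log_prior_old,
                     log_move_death, log_move_birth,
                     log_q_mu_parent, log_q_mu_left, log_q_mu_right):
    """Green (1995) log acceptance ratio for a birth move. The Jacobian of
    the dimension-matching map is 1, so it contributes no extra term."""
    log_target_ratio = (log_lik_new + log_prior_new) - (log_lik_old + log_prior_old)
    log_proposal_ratio = (log_move_death + log_q_mu_parent) \
                       - (log_move_birth + log_q_mu_left + log_q_mu_right)
    return log_target_ratio + log_proposal_ratio  # accept if log(u) < this
```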
6. Representative Applications and Implementation Examples
The RJMCMC paradigm extends BART to a broad spectrum of models:
- Structured heteroskedastic regression: models of the form $Y_i = f(X_i) + \sigma(X_i)\,\epsilon_i$, with the mean $f(x)$ and the log-variance $\log \sigma^2(x)$ each given tree-ensemble priors; the same machinery also applies to Poisson-type count data via a log link.
- Accelerated-failure-time survival models: Generalized gamma and log-logistic forms, supporting covariate-dependent shape parameters.
- Gamma-shape regression: gamma-distributed responses $Y_i \sim \mathrm{Gamma}\big(\alpha(X_i), \lambda\big)$, with the log-shape $\log \alpha(x)$ modeled as a sum of trees.
In all scenarios, the user need only provide the leaf log-likelihood $\Lambda_\ell(\mu)$, its gradient, and optionally the Fisher information. Implementation reuses the same RJMCMC-backfitting routine, with posterior sampling over $\{\mathcal{T}_m, \mathcal{M}_m\}$ and the global/nuisance parameters proceeding as with standard BART (Linero, 2022).
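To illustrate the required interface, here is a minimal sketch of a Poisson log-link leaf likelihood together with its score and Fisher information; the factory pattern and names are assumptions for illustration, not an API from the source. Supplying these three functions is all a generic RJMCMC-backfitting routine would need.

```python
import numpy as np

def make_poisson_leaf(y, offset):
    """Leaf-level Poisson log-likelihood with log link: rate = exp(offset + mu).

    y: counts routed to this leaf; offset: fitted contribution of the other trees.
    Returns (log_lik, score, fisher_info) as functions of the leaf value mu.
    """
    def log_lik(mu):
        return np.sum(y * (offset + mu) - np.exp(offset + mu))  # up to log(y!) const

    def score(mu):
        return np.sum(y - np.exp(offset + mu))

    def fisher_info(mu):
        return np.sum(np.exp(offset + mu))  # equals -d^2/dmu^2 of log_lik

    return log_lik, score, fisher_info
```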
7. Significance and Broader Impact
By removing the conditional conjugacy restriction, RJMCMC-based Bayesian backfitting enables BART to be used in an extensive range of applications—beyond what classic data augmentation or tailored MCMC methods can support. This approach preserves the interpretability and flexible uncertainty quantification of tree ensembles, while ensuring practical applicability to contemporary statistical models in regression, survival analysis, and structured heteroskedasticity (Linero, 2022). A plausible implication is that further generalizations to other nonparametric ensemble priors may be straightforward, given only evaluability of the requisite likelihood and its derivatives.