
Easy Conditioning far beyond Gaussian

Published 24 Sep 2024 in stat.ME (arXiv:2409.16003v3)

Abstract: Estimating and sampling from conditional densities plays a critical role in statistics and data science, with a plethora of applications. Numerous methods are available, ranging from simple fitting approaches to sophisticated machine learning algorithms; however, selecting among them often involves a trade-off between the conflicting objectives of efficiency, flexibility and interpretability. Starting from the well-known easy conditioning results in the Gaussian case, we show, via results on stability under mixing and marginal transformations, that these results carry over far beyond the Gaussian case. This enables us to flexibly model multivariate data by accommodating broad classes of multi-modal dependence structures and marginal distributions, while enjoying fast conditioning of fitted joint distributions. In applications, we primarily focus on conditioning via Gaussian versus Gaussian mixture copula models, comparing different fitting implementations for the latter. Numerical experiments with simulated and real data demonstrate the relevance of the approach for conditional sampling, evaluated using multivariate scoring rules.
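The two building blocks the abstract refers to can be sketched concretely. For a joint Gaussian, the conditional distribution of the unobserved coordinates given the observed ones is again Gaussian, with a closed-form mean and covariance (the Schur complement). For a Gaussian mixture, stability under mixing means the conditional is again a mixture: each component is conditioned as above, and the mixture weights are reweighted by each component's marginal likelihood of the observed values. The sketch below illustrates both facts with NumPy/SciPy; it is a minimal illustration of the classical formulas, not the paper's implementation, and all function names are ours.

```python
import numpy as np
from scipy.stats import multivariate_normal

def condition_gaussian(mu, Sigma, idx_obs, x_obs):
    """Condition N(mu, Sigma) on coordinates idx_obs taking values x_obs.

    Returns the mean and covariance of the remaining ("free") coordinates,
    using the classical closed-form result ("easy conditioning"):
        mu_f + S_fo S_oo^{-1} (x_obs - mu_o),   S_ff - S_fo S_oo^{-1} S_of
    """
    d = len(mu)
    idx_free = np.setdiff1d(np.arange(d), idx_obs)
    mu_f, mu_o = mu[idx_free], mu[idx_obs]
    S_ff = Sigma[np.ix_(idx_free, idx_free)]
    S_fo = Sigma[np.ix_(idx_free, idx_obs)]
    S_oo = Sigma[np.ix_(idx_obs, idx_obs)]
    K = S_fo @ np.linalg.inv(S_oo)          # regression coefficients
    cond_mu = mu_f + K @ (x_obs - mu_o)
    cond_Sigma = S_ff - K @ S_fo.T          # Schur complement
    return cond_mu, cond_Sigma

def condition_mixture(weights, mus, Sigmas, idx_obs, x_obs):
    """Condition a Gaussian mixture on observed coordinates.

    By stability under mixing, the conditional law is again a Gaussian
    mixture: each component is conditioned in closed form, and its weight
    is multiplied by the marginal density of x_obs under that component.
    """
    new_w, comps = [], []
    for w, mu, S in zip(weights, mus, Sigmas):
        m_o = mu[idx_obs]
        S_oo = S[np.ix_(idx_obs, idx_obs)]
        lik = multivariate_normal(m_o, S_oo).pdf(x_obs)
        new_w.append(w * lik)
        comps.append(condition_gaussian(mu, S, idx_obs, x_obs))
    new_w = np.asarray(new_w)
    new_w /= new_w.sum()                    # renormalise the weights
    return new_w, comps
```

Because both steps are closed-form (one small linear solve per component), conditioning a fitted mixture is essentially free compared with the fitting step, which is the "fast conditioning" advantage the abstract highlights.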
