
Multi-Domain Sampling With Applications to Structural Inference of Bayesian Networks (1110.3392v1)

Published 15 Oct 2011 in stat.ME and stat.CO

Abstract: When a posterior distribution has multiple modes, unconditional expectations, such as the posterior mean, may not offer informative summaries of the distribution. Motivated by this problem, we propose to decompose the sample space of a multimodal distribution into domains of attraction of local modes. Domain-based representations are defined to summarize the probability masses of, and conditional expectations on, domains of attraction; these are much more informative than the mean and other unconditional expectations. A computational method, the multi-domain sampler, is developed to construct domain-based representations for an arbitrary multimodal distribution. The multi-domain sampler is applied to structural learning of protein-signaling networks from high-throughput single-cell data, where a signaling network is modeled as a causal Bayesian network. Not only does our method provide a detailed landscape of the posterior distribution, but it also improves the accuracy and predictive power of the estimated networks.
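
To make the abstract's central idea concrete, here is a minimal sketch, not the paper's multi-domain sampler. It assumes a toy two-component Gaussian mixture as the multimodal target and uses exact mixture draws in place of MCMC output; each sample is assigned to a domain of attraction by gradient ascent on the log-density, and each domain is then summarized by its probability mass and conditional expectation. All names and settings in the code are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy target: a two-component Gaussian mixture on the real line,
# standing in for a multimodal posterior.
weights = np.array([0.3, 0.7])
means = np.array([-2.0, 3.0])
sds = np.array([0.6, 1.0])

def grad_log_density(x):
    # d/dx log p(x) = sum_k r_k(x) * (mu_k - x) / sd_k^2,
    # where r_k(x) are the mixture component responsibilities.
    comps = (np.log(weights)
             - 0.5 * ((x - means) / sds) ** 2
             - np.log(sds * np.sqrt(2.0 * np.pi)))
    r = np.exp(comps - np.logaddexp.reduce(comps))
    return np.sum(r * (means - x) / sds ** 2)

def local_mode(x, step=0.05, iters=500):
    # Gradient ascent on log p(x); the fixed point identifies which
    # domain of attraction the starting point x belongs to.
    for _ in range(iters):
        x = x + step * grad_log_density(x)
    return round(x, 1)  # coarse rounding merges numerically close fixed points

# Exact draws from the mixture, used here as a stand-in for the output
# of an MCMC sampler such as the paper's multi-domain sampler.
z = rng.choice(2, size=5000, p=weights)
xs = rng.normal(means[z], sds[z])

# Domain-based representation: probability mass of, and conditional
# expectation on, each domain of attraction.
modes = np.array([local_mode(x) for x in xs])
for m in np.unique(modes):
    in_dom = modes == m
    print(f"mode near {m:5.1f}: mass ~ {in_dom.mean():.3f}, "
          f"conditional mean ~ {xs[in_dom].mean():.3f}")
```

For this toy mixture, the two reported masses should be close to the component weights 0.3 and 0.7, and the conditional means close to the component means, whereas the unconditional mean (about 1.5) falls between the modes and describes neither, which is exactly the failure of unconditional summaries the abstract points to.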

Authors (1)

Qing Zhou
