
Decomposable context-specific models (arXiv:2210.11521v3)

Published 20 Oct 2022 in math.ST, math.AC, math.CO, and stat.TH

Abstract: We introduce a family of discrete context-specific models, which we call decomposable. We construct this family from the subclass of staged tree models known as CStree models. We give an algebraic and combinatorial characterization of all context-specific independence relations that hold in a decomposable context-specific model, which yields a Markov basis. We prove that the moralization operation applied to the graphical representation of a context-specific model does not affect the implied independence relations, thus affirming that these models are algebraically described by a finite collection of decomposable graphical models. More generally, we establish that several algebraic, combinatorial, and geometric properties of decomposable context-specific models generalize those of decomposable graphical models to the context-specific setting.
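For readers unfamiliar with the notation, the central object mentioned in the abstract is a context-specific independence (CSI) relation, which weakens ordinary conditional independence by requiring it to hold only in a fixed context, i.e. for a particular value of the conditioning variables. The display below is a standard formulation of this notion for discrete variables, written here as an illustration rather than quoted from the paper; the symbols X_A, X_B, X_C and the context value x_C are generic placeholders:

\[
X_A \perp\!\!\!\perp X_B \mid X_C = x_C
\quad\Longleftrightarrow\quad
P(X_A = x_A \mid X_B = x_B,\, X_C = x_C) = P(X_A = x_A \mid X_C = x_C)
\]
whenever \(P(X_B = x_B,\, X_C = x_C) > 0\), for all values \(x_A, x_B\). When such a relation holds for every value of \(x_C\), it reduces to an ordinary conditional independence of the kind encoded by decomposable graphical models; allowing it to hold only for particular contexts \(x_C\) is what the context-specific setting of the paper generalizes.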
