Learning AMP Chain Graphs and some Marginal Models Thereof under Faithfulness: Extended Version (1303.0691v3)

Published 4 Mar 2013 in stat.ML, cs.AI, and cs.LG

Abstract: This paper deals with chain graphs under the Andersson-Madigan-Perlman (AMP) interpretation. In particular, we present a constraint-based algorithm for learning an AMP chain graph to which a given probability distribution is faithful. Moreover, we show that the extension of Meek's conjecture to AMP chain graphs does not hold, which compromises the development of efficient and correct score+search learning algorithms under assumptions weaker than faithfulness. We also introduce a new family of graphical models that consists of undirected and bidirected edges. We name this new family maximal covariance-concentration graphs (MCCGs) because it includes both covariance and concentration graphs as subfamilies. However, every MCCG can be seen as the result of marginalizing out some nodes in an AMP CG. We describe global, local and pairwise Markov properties for MCCGs and prove their equivalence. We characterize when two MCCGs are Markov equivalent, and show that every Markov equivalence class of MCCGs has a distinguished member. We present a constraint-based algorithm for learning an MCCG to which a given probability distribution is faithful. Finally, we present a graphical criterion for reading dependencies from an MCCG of a probability distribution that satisfies the graphoid properties, weak transitivity and composition. We prove that the criterion is sound and complete in a certain sense.

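The abstract refers to constraint-based learning under faithfulness but, being an abstract, gives no procedure. As a rough illustration only, and not the paper's algorithm for AMP CGs or MCCGs, the Python sketch below shows the generic skeleton-recovery step that constraint-based learners share: start from a complete graph and drop an edge whenever some conditioning set renders its endpoints independent. The names `learn_skeleton` and `toy_oracle` are hypothetical and introduced here purely for illustration.

```python
from itertools import combinations

def learn_skeleton(nodes, is_independent, max_sep_size=2):
    """Generic PC-style adjacency search (illustrative sketch only).

    Starts from a complete graph over `nodes` and removes the edge x-y
    whenever `is_independent(x, y, S)` holds for some conditioning set S
    drawn from the current neighbours of x and y. Returns the remaining
    adjacencies and the separating sets found.
    """
    adj = {v: set(nodes) - {v} for v in nodes}
    sepset = {}
    for size in range(max_sep_size + 1):
        for x in nodes:
            for y in list(adj[x]):
                if x >= y:  # consider each unordered pair once
                    continue
                candidates = (adj[x] | adj[y]) - {x, y}
                for s in combinations(sorted(candidates), size):
                    if is_independent(x, y, set(s)):
                        adj[x].discard(y)
                        adj[y].discard(x)
                        sepset[(x, y)] = set(s)
                        break
    return adj, sepset

# Toy independence oracle for the chain a - b - c:
# a and c are independent given {b}; all other pairs stay dependent.
def toy_oracle(x, y, s):
    return {x, y} == {"a", "c"} and "b" in s

skeleton, seps = learn_skeleton(["a", "b", "c"], toy_oracle)
print(skeleton)  # edges a-b and b-c remain; a-c was removed
print(seps)      # {('a', 'c'): {'b'}}
```

In an actual learner the oracle would be a statistical conditional-independence test against the faithful distribution, and the paper's algorithms additionally apply orientation rules specific to AMP CGs and MCCGs (bidirected versus undirected edges), which this sketch omits.
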
Citations (6)
