Learning Directed Graphical Models from Gaussian Data (1906.08050v3)

Published 19 Jun 2019 in cs.LG and stat.ML

Abstract: In this paper, we introduce a new directed graphical model from Gaussian data: the Gaussian graphical interaction model (GGIM). The development of this model comes from considering stationary Gaussian processes on graphs, and leveraging the equations between the resulting steady-state covariance matrix and the Laplacian matrix representing the interaction graph. Through the presentation of conceptually straightforward theory, we develop the new model and provide interpretations of the edges in the graphical model in terms of statistical measures. We show that when restricted to undirected graphs, the Laplacian matrix representing a GGIM is equivalent to the standard inverse covariance matrix that encodes conditional dependence relationships. Furthermore, our approach leads to a natural definition of directed conditional independence of two elements in a stationary Gaussian process. We demonstrate that the problem of learning sparse GGIMs for a given observation set can be framed as a LASSO problem. By comparison with the problem of inverse covariance estimation, we prove a bound on the difference between the covariance matrix corresponding to a sparse GGIM and the covariance matrix corresponding to the $l_1$-norm penalized maximum log-likelihood estimate. Finally, we consider the problem of learning GGIMs associated with sparse directed conditional dependence relationships. In all, the new model presents a novel perspective on directed relationships between variables and significantly expands on the state of the art in Gaussian graphical modeling.
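The abstract compares the sparse GGIM estimate against the $l_1$-norm penalized maximum log-likelihood estimate of the inverse covariance matrix. The paper's own GGIM estimator is not spelled out here, but the baseline it is compared to is the standard graphical lasso, which can be sketched as follows (a minimal illustration using scikit-learn's `GraphicalLasso`; the chain-structured precision matrix and penalty `alpha=0.05` are arbitrary choices for the example, not values from the paper):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
p = 5

# Sparse, well-conditioned ground-truth precision matrix (chain structure):
# zeros off the tridiagonal encode conditional independences.
Theta = 2.0 * np.eye(p)
for i in range(p - 1):
    Theta[i, i + 1] = Theta[i + 1, i] = -0.5
Sigma = np.linalg.inv(Theta)

# Draw i.i.d. Gaussian samples with covariance Sigma.
X = rng.multivariate_normal(np.zeros(p), Sigma, size=2000)

# l1-penalized maximum log-likelihood estimate of the precision matrix
# (the graphical lasso baseline referenced in the abstract).
model = GraphicalLasso(alpha=0.05).fit(X)
Theta_hat = model.precision_

# Near-zero entries in Theta_hat are the estimated conditional
# independence relationships among the variables.
print(np.round(Theta_hat, 2))
```

In the undirected case the abstract states that the GGIM's Laplacian is equivalent to this inverse covariance matrix, so the graphical lasso recovers the same structure; the paper's contribution is extending this picture to directed interaction graphs.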

Citations (10)
