Selective Inference in Graphical Models via Maximum Likelihood (2503.24311v1)

Published 31 Mar 2025 in stat.ME, stat.AP, and stat.CO

Abstract: The graphical lasso is a widely used algorithm for fitting undirected Gaussian graphical models. However, for inference on functionals of edge values in the learned graph, standard tools lack formal statistical guarantees, such as control of the type I error rate. In this paper, we introduce a selective inference method for asymptotically valid inference after graphical lasso selection with added randomization. We obtain a selective likelihood, conditional on the event of selection, through a change of variable on the known density of the randomization variables. Our method enables interval estimation and hypothesis testing for a wide range of functionals of edge values in the learned graph using the conditional maximum likelihood estimate. Our numerical studies show that introducing a small amount of randomization: (i) greatly increases power and yields substantially shorter intervals compared to other conditional inference methods, including data splitting; (ii) ensures intervals of bounded length in high-dimensional settings where data splitting is infeasible due to insufficient samples for inference; (iii) enables inference for a wide range of inferential targets in the learned graph, including measures of node influence and connectivity between nodes.
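
The paper's selective inference machinery is not available in standard libraries, but the base selection step that the abstract builds on, fitting a sparse precision matrix with the graphical lasso on a randomized sample covariance, can be sketched with scikit-learn. The Gaussian perturbation `omega`, the penalty `alpha`, and the noise scale below are illustrative assumptions for the sketch, not the authors' exact randomization scheme or tuning choices.

```python
# Minimal sketch: graphical lasso selection on a randomized sample covariance.
# The symmetric Gaussian perturbation `omega` stands in for the paper's added
# randomization; its scale and the penalty `alpha` are hypothetical choices.
import numpy as np
from sklearn.covariance import empirical_covariance, graphical_lasso

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))            # placeholder data matrix

S = empirical_covariance(X)                # sample covariance
omega = 0.05 * rng.standard_normal((p, p))
omega = (omega + omega.T) / 2.0            # symmetrize the randomization term
S_rand = S + omega                         # randomized input to the selector

_, Theta = graphical_lasso(S_rand, alpha=0.1)   # sparse precision estimate
edges = [(i, j) for i in range(p) for j in range(i + 1, p)
         if abs(Theta[i, j]) > 1e-8]       # selected edge set of the graph
print("selected edges:", edges)
```

Downstream, the paper's method conditions on this selection event and maximizes the resulting selective likelihood to obtain tests and intervals for functionals of the selected edges; that conditional step is specific to the paper and is not reproduced here.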
