An Alternative Graphical Lasso Algorithm for Precision Matrices (2403.12357v1)

Published 19 Mar 2024 in stat.CO and stat.ML

Abstract: The Graphical Lasso (GLasso) algorithm is fast and widely used for estimating sparse precision matrices (Friedman et al., 2008). Its central role in the literature on high-dimensional covariance estimation rivals that of Lasso regression for sparse estimation of the mean vector. Some mysteries regarding its optimization target, convergence, positive-definiteness, and performance were unearthed, resolved, and presented in Mazumder and Hastie (2011), leading to a new and improved dual-primal algorithm, DP-GLasso. Using a new and slightly different reparametrization of the last column of a precision matrix, we show that the regularized normal log-likelihood naturally decouples into a sum of two easy-to-minimize convex functions, one of which is a Lasso regression problem. This decomposition is the key to developing a transparent, simple iterative block coordinate descent algorithm for computing the GLasso updates, with performance comparable to DP-GLasso. In particular, our algorithm has the precision matrix as its optimization target right from the outset and retains all the favorable properties of the DP-GLasso algorithm.
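
For context, the objective minimized by GLasso and its variants is the l1-penalized Gaussian log-likelihood (Banerjee et al., 2008; Friedman et al., 2008). The block partition shown below is the standard device in this literature, given here only as a sketch and not as the paper's specific reparametrization:

\[
\hat{\Theta} \in \arg\min_{\Theta \succ 0}
\left\{ -\log\det\Theta + \operatorname{tr}(S\Theta) + \lambda \lVert \Theta \rVert_1 \right\},
\qquad
\Theta =
\begin{pmatrix}
\Theta_{11} & \theta_{12} \\
\theta_{12}^{\top} & \theta_{22}
\end{pmatrix},
\]

where S is the sample covariance matrix, lambda >= 0 is the regularization parameter, and the penalty is the elementwise l1 norm (conventions differ on whether the diagonal is penalized). Blockwise algorithms cycle over columns, updating the last row/column (theta_12, theta_22) with Theta_11 held fixed; per the abstract, a suitable reparametrization of these last-column variables makes the corresponding partial objective split into a Lasso regression problem plus a separately minimizable convex function.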

References (7)
  1. Alon, U., N. Barkai, D. A. Notterman, K. W. Gish, S. E. Ybarra, D. M. Mach, and A. J. Levine (1999), “Broad patterns of gene expression revealed by clustering analysis of tumor and normal colon tissues probed by oligonucleotide arrays.” Proceedings of the National Academy of Sciences of the United States of America, 96(12), 6745–6750.
  2. Banerjee, O., L. El Ghaoui, and A. d’Aspremont (2008), “Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data.” Journal of Machine Learning Research, 9, 485–516.
  3. Friedman, J., T. Hastie, and R. Tibshirani (2008), “Sparse inverse covariance estimation with the graphical lasso.” Biostatistics, 9, 432–441.
  4. Mazumder, R. and T. Hastie (2012), “Exact covariance thresholding into connected components for large-scale graphical lasso.” Journal of Machine Learning Research, 13, 781–794.
  5. Mazumder, R. and T. J. Hastie (2011), “The graphical lasso: New insights and alternatives.” Electronic Journal of Statistics, 6, 2125–2149.
  6. Tseng, P. (2001), “Convergence of a block coordinate descent method for nondifferentiable minimization.” Journal of Optimization Theory and Applications, 109, 475–494.
  7. Wang, H. (2014), “Coordinate descent algorithm for covariance graphical lasso.” Statistics and Computing, 24, 521–529.
