DNNLasso: Scalable Graph Learning for Matrix-Variate Data (2403.02608v1)
Abstract: We consider the problem of jointly learning the row-wise and column-wise dependencies of matrix-variate observations, which are modelled separately by two precision matrices. Because of the complicated structure of the Kronecker-product precision matrix in the commonly used matrix-variate Gaussian graphical model, a sparser Kronecker-sum structure was recently proposed, based on the Cartesian product of graphs. However, existing methods for estimating Kronecker-sum structured precision matrices do not scale well to large-scale datasets. In this paper, we introduce DNNLasso, a diagonally non-negative graphical lasso model for estimating the Kronecker-sum structured precision matrix, which outperforms the state-of-the-art methods by a large margin in both accuracy and computational time. Our code is available at https://github.com/YangjingZhang/DNNLasso.
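To make the structural contrast in the abstract concrete, the following is a minimal NumPy sketch (not the paper's implementation; the names `Psi` and `Theta` and the random generator are illustrative assumptions) of a Kronecker-sum joint precision matrix Psi ⊕ Theta = Psi ⊗ I_q + I_p ⊗ Theta for p×q matrix-variate data, alongside the denser Kronecker product Psi ⊗ Theta:

```python
import numpy as np

p, q = 3, 4  # rows and columns of each matrix-variate observation
rng = np.random.default_rng(0)

def random_sparse_precision(n, rng, density=0.3):
    # Illustrative generator: a sparse, strictly diagonally dominant
    # symmetric matrix, hence a valid (positive definite) precision matrix.
    A = np.triu(rng.uniform(-1, 1, (n, n)) * (rng.random((n, n)) < density), k=1)
    M = A + A.T
    np.fill_diagonal(M, np.abs(M).sum(axis=1) + 1.0)
    return M

Psi = random_sparse_precision(p, rng)    # row-wise precision (p x p)
Theta = random_sparse_precision(q, rng)  # column-wise precision (q x q)

# Kronecker-sum structure (used by BiGLasso/TeraLasso/EiGLasso/DNNLasso-type models):
kron_sum = np.kron(Psi, np.eye(q)) + np.kron(np.eye(p), Theta)

# Kronecker-product structure (classical matrix-variate Gaussian model):
kron_prod = np.kron(Psi, Theta)

# The Kronecker sum keeps the nonzeros of Psi and Theta on block patterns,
# while the Kronecker product multiplies their sparsity patterns together,
# which is generally much denser for sparse factors.
print("nnz Kronecker sum:    ", np.count_nonzero(kron_sum))
print("nnz Kronecker product:", np.count_nonzero(kron_prod))
```

A useful sanity check on this construction: the eigenvalues of Psi ⊕ Theta are exactly the pairwise sums of the eigenvalues of Psi and Theta, so positive definiteness of both factors guarantees positive definiteness of the joint precision matrix.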