
From explained variance of correlated components to PCA without orthogonality constraints (2402.04692v1)

Published 7 Feb 2024 in stat.ML and cs.LG

Abstract: Block Principal Component Analysis (Block PCA) of a data matrix A, where the loadings Z are determined by maximization of ‖AZ‖² over unit-norm orthogonal loadings, is difficult to use for the design of sparse PCA by ℓ1 regularization, due to the difficulty of handling both the orthogonality constraint on the loadings and the non-differentiable ℓ1 penalty. Our objective in this paper is to relax the orthogonality constraint on the loadings by introducing new objective functions expvar(Y) which measure the part of the variance of the data matrix A explained by the correlated components Y = AZ. We first propose a comprehensive study of the mathematical and numerical properties of expvar(Y) for two existing definitions, Zou et al. [2006] and Shen and Huang [2008], and four new definitions. We then show that only two of these explained variances are fit to be used as objective functions in block PCA formulations for A rid of orthogonality constraints.
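The abstract contrasts several ways of measuring the variance of A explained by correlated components Y = AZ. As a rough illustration only (not the paper's code), the NumPy sketch below implements the two existing definitions it cites: the adjusted variance of Zou et al. [2006], computed from a QR decomposition of Y, and the projection-based explained variance of Shen and Huang [2008]. Normalization constants are omitted and the helper names are ours.

```python
import numpy as np

def expvar_zou(A, Z):
    """Adjusted explained variance in the spirit of Zou et al. [2006]:
    QR-decompose the components Y = A Z and sum the squared diagonal
    entries of R, discounting variance already explained by earlier
    (correlated) components."""
    Y = A @ Z
    _, R = np.linalg.qr(Y)
    return np.sum(np.diag(R) ** 2)

def expvar_shen_huang(A, Z):
    """Explained variance in the spirit of Shen and Huang [2008]:
    squared Frobenius norm of the projection of A onto the column
    space of Y = A Z."""
    Y = A @ Z
    Q, _ = np.linalg.qr(Y)                       # orthonormal basis of span(Y)
    return np.linalg.norm(A.T @ Q, 'fro') ** 2   # = ||P_Y A||_F^2

# Toy comparison on unit-norm but non-orthogonal loadings
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))
Z = rng.standard_normal((10, 3))
Z /= np.linalg.norm(Z, axis=0)                   # unit-norm columns, not orthogonal
print(expvar_zou(A, Z),
      expvar_shen_huang(A, Z),
      np.linalg.norm(A @ Z, 'fro') ** 2)         # naive ||AZ||^2 double-counts shared variance
```

When the columns of Z are orthogonal the three quantities coincide; with correlated components the naive ‖AZ‖² overstates the explained variance, which is the motivation for the expvar(Y) objectives studied in the paper.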

References (10)
  1. P.-A. Absil, R. Mahony, and R. Sepulchre. Optimization Algorithms on Matrix Manifolds. Princeton University Press, 2008. ISBN 978-0-691-13298-3. doi: 10.1515/9781400830244.
  2. R. W. Brockett. Dynamical systems that sort lists, diagonalize matrices, and solve linear programming problems. Linear Algebra and its Applications, 146:79–91, 1991.
  3. Guy Chavent. Nonlinear Least Squares for Inverse Problems: Theoretical Foundations and Step-by-Step Guide for Applications. Springer, 2010. ISBN 978-90-481-2784-9. doi: 10.1007/978-90-481-2785-6.
  4. A group sparse explained variance block PCA. Submitted, 2023.
  5. M. Journée, Y. Nesterov, P. Richtárik, and R. Sepulchre. Generalized power method for sparse principal component analysis. Journal of Machine Learning Research, 11(Feb):517–553, 2010.
  6. G. Miller. Closed-form inversion of the Gram matrix arising in certain least-squares problems. IEEE Transactions on Circuit Theory, 16(2):237–240, 1969.
  7. R Core Team. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria, 2021. URL https://www.R-project.org/.
  8. H. Shen and J. Z. Huang. Sparse principal component analysis via regularized low rank matrix approximation. Journal of Multivariate Analysis, 99(6):1015–1034, 2008.
  9. Patrice Tauvel. Cours de géométrie: agrégation de mathématiques. Dunod, Paris, 2000.
  10. H. Zou, T. Hastie, and R. Tibshirani. Sparse principal component analysis. Journal of Computational and Graphical Statistics, 15(2):265–286, 2006.
