Improved Dimensionality Reduction of various Datasets using Novel Multiplicative Factoring Principal Component Analysis (MPCA) (2009.12179v1)

Published 25 Sep 2020 in cs.CV

Abstract: Principal Component Analysis (PCA) is the most widely applied dimensionality reduction approach. Many improvements have been made to traditional PCA in order to obtain optimal dimensionality reduction results across various datasets. In this paper, we present an improvement to the traditional PCA approach called Multiplicative Factoring Principal Component Analysis (MPCA). The advantage of MPCA over traditional PCA is that a penalty is imposed on the occurrence space through a multiplier, making the effect of outliers negligible when seeking out projections. We apply two multiplier approaches: a total-distance metric and a cosine-similarity metric. Both approaches learn the relationship between each data point and the principal projections in the feature space. As a result, improved low-rank projections are obtained by iteratively multiplying the data to suppress the effect of corrupt samples in the training set. Experiments were carried out on the YaleB, MNIST, AR, and Isolet datasets, and the results were compared with those of popular dimensionality reduction methods such as traditional PCA and RPCA-OM, as well as recently published methods such as IFPCA-1 and IFPCA-2.
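The abstract's core idea, downweighting outliers through a per-sample multiplier before recomputing the principal projections, can be illustrated with a minimal sketch. The code below assumes the cosine-similarity variant: each point's weight is its cosine similarity to its reconstruction in the current principal subspace, so points that align poorly with the subspace (likely outliers) are multiplied toward zero. The function name `mpca_sketch` and the exact weight-update rule are illustrative assumptions, not the paper's stated algorithm.

```python
import numpy as np

def mpca_sketch(X, n_components=2, n_iters=10, eps=1e-12):
    """Illustrative sketch of multiplicative-factoring PCA.

    X: (n_samples, n_features) data matrix.
    The per-sample multiplier (cosine similarity between a point and
    its projection onto the current principal subspace) is an assumed
    instantiation of the paper's cosine-similarity metric.
    """
    Xc = X - X.mean(axis=0)            # center the data once
    w = np.ones(len(Xc))               # per-sample multipliers
    for _ in range(n_iters):
        # PCA on the reweighted data via SVD
        Xr = Xc * w[:, None]
        _, _, Vt = np.linalg.svd(Xr - Xr.mean(axis=0), full_matrices=False)
        P = Vt[:n_components].T        # principal directions (features x k)
        # Reconstruct each point in the k-dimensional principal subspace
        recon = (Xc @ P) @ P.T
        # Cosine similarity between each point and its reconstruction;
        # outliers align poorly with the subspace and get small weights
        num = np.einsum('ij,ij->i', Xc, recon)
        den = np.linalg.norm(Xc, axis=1) * np.linalg.norm(recon, axis=1) + eps
        w = np.clip(num / den, 0.0, 1.0)
    return P, w

# Usage: learn robust directions on data with a few gross outliers,
# then project onto the low-rank subspace
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
X[:5] += 15.0                          # inject corrupt samples
P, w = mpca_sketch(X, n_components=2)
Z = (X - X.mean(axis=0)) @ P           # low-dimensional embedding
```

In this sketch the multipliers converge over a handful of iterations; a total-distance variant would instead derive `w` from each point's distance to the subspace, inverted and normalized so that distant points receive small weights.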

Citations (1)
