
Enhanced Principal Component Analysis under A Collaborative-Robust Framework

Published 22 Mar 2021 in cs.LG (arXiv:2103.11931v1)

Abstract: Principal component analysis (PCA) frequently suffers from the disturbance of outliers, and a spectrum of robust extensions and variants of PCA has therefore been developed. However, existing extensions of PCA treat all samples equally, even those corrupted by large noise. In this paper, we first introduce a general collaborative-robust weight learning framework that combines weight learning and a robust loss in a non-trivial way. More significantly, under the proposed framework only a subset of well-fitting samples is activated and assigned greater importance during training, while the remaining samples, whose errors are large, are not simply ignored: their negative effects are alleviated by the robust loss function. We then develop an enhanced PCA that adopts a point-wise sigma-loss function, which interpolates between the L_2,1-norm and the squared Frobenius norm while retaining the rotational invariance property. Extensive experiments are conducted on occluded datasets and evaluated from two aspects: reconstruction error and clustering accuracy. The experimental results demonstrate the superiority and effectiveness of our model.
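The interpolation property claimed for the sigma-loss can be illustrated numerically. The abstract does not spell out the functional form, so the sketch below assumes a loss from the adaptive-loss family, l_sigma(e) = (1 + sigma) * e^2 / (e + sigma) applied to per-sample residual norms e; under this assumption the summed loss approaches the L_2,1-norm (a sum of norms) as sigma -> 0 and the squared Frobenius norm (a sum of squared norms) as sigma -> infinity.

```python
import numpy as np

def sigma_loss(residual_norms, sigma):
    """Assumed point-wise sigma-loss on per-sample residual norms.

    l_sigma(e) = (1 + sigma) * e^2 / (e + sigma)
    - sigma -> 0:   l_sigma(e) -> e    (L_2,1-like: sum of norms)
    - sigma -> inf: l_sigma(e) -> e^2  (squared-Frobenius-like: sum of squares)
    """
    e = np.asarray(residual_norms, dtype=float)
    return (1.0 + sigma) * e**2 / (e + sigma)

# Per-sample reconstruction-residual norms (toy values).
e = np.array([0.5, 1.0, 2.0])

small_sigma = sigma_loss(e, 1e-8)   # close to e itself
large_sigma = sigma_loss(e, 1e8)    # close to e squared
```

Samples with small residuals keep near-quadratic penalties, while large-residual samples are penalized closer to linearly, which is how the robust loss tempers, rather than discards, poorly fitting samples.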

Authors (3)
