
Principal Component Analysis in an Asymmetric Norm

Published 14 Jan 2014 in stat.ME, math.OC, and stat.AP (arXiv:1401.3229v1)

Abstract: Principal component analysis (PCA) is a widely used dimension reduction tool in the analysis of many kinds of high-dimensional data. It is used in signal processing, mechanical engineering, psychometrics, and other fields under different names, yet it rests on the same mathematical idea: the decomposition of the variation of a high-dimensional object into uncorrelated factors or components. However, in many of the above applications one is interested in capturing the tail variation of the data rather than the variation around the mean. Such applications include weather-related event curves, expected shortfalls, and speeding analysis, among others. These are all high-dimensional tail objects which one would like to study in a PCA fashion. The tail character, though, requires that the dimension reduction be carried out in an asymmetric norm rather than via the classical $L_2$-type orthogonal projection. We develop an analogue of PCA in an asymmetric norm. These norms cover both quantiles and expectiles, another tail event measure. The difficulty is that there is no natural basis, no 'principal components', for the $k$-dimensional subspace found. We propose two definitions of principal components and provide algorithms based on iterative least squares. We prove upper bounds on their convergence times, and compare their performances in a simulation study. We apply the algorithms to a Chinese weather dataset with a view to weather derivative pricing.
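
The abstract's central ingredients, expectiles and iterative least squares under an asymmetric norm, can be made concrete with a small sketch. The Python example below computes a single expectile by iteratively reweighted least squares; it illustrates the asymmetric-norm idea only and is not the paper's principal-component algorithms. The function name, starting value, and stopping rule are illustrative assumptions.

```python
# Minimal sketch: the tau-expectile of a sample via iteratively reweighted
# least squares. For tau = 0.5 the expectile is the ordinary mean, so this
# shows how the asymmetric norm generalizes the usual L2 fit toward the tails.
import numpy as np

def expectile(y, tau=0.9, tol=1e-8, max_iter=100):
    """Return the tau-expectile of a 1-D array y.

    The tau-expectile minimizes the asymmetric squared loss
        sum_i |tau - 1{y_i < m}| * (y_i - m)**2.
    """
    m = y.mean()  # start from the symmetric (L2) solution
    for _ in range(max_iter):
        # asymmetric weights: tau above the current estimate, 1 - tau below
        w = np.where(y > m, tau, 1.0 - tau)
        m_new = np.sum(w * y) / np.sum(w)  # weighted least-squares update
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.standard_normal(10_000)
    print(expectile(y, tau=0.5))   # close to the sample mean (~0)
    print(expectile(y, tau=0.95))  # shifted into the upper tail
```

The fixed point of this update satisfies the expectile first-order condition (the weighted residuals sum to zero), which is the same asymmetric weighting the paper's iterative least-squares algorithms apply when fitting a low-dimensional subspace rather than a single scalar.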
