An Infinite Dimensional Analysis of Kernel Principal Components
Abstract: We study non-linear dimension reduction of data, motivated by the classical linear framework of Principal Component Analysis. For the nonlinear case, we introduce a new kernel Principal Component Analysis, together with manifold and feature-space transforms. Our results extend earlier work on probabilistic Karhunen-Loève transforms for the compression of wavelet images. Our aim is algorithms for optimization and for the selection of efficient bases, or components, which serve to minimize entropy and error, and hence to improve the digital representation of images and, in turn, their storage and transmission. We prove several new theorems for data-dimension reduction. Moreover, using frames in Hilbert space and a new Hilbert-Schmidt analysis, we identify when a choice of Gaussian kernel is optimal.
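The central construction named in the abstract, kernel Principal Component Analysis with a Gaussian kernel, admits a concrete finite-sample sketch. The following is a minimal NumPy illustration, not the paper's own algorithm: the bandwidth `sigma`, the synthetic two-circle data, and the function names are assumptions introduced here for demonstration.

```python
# A minimal sketch of kernel PCA with a Gaussian (RBF) kernel.
# Illustrative only; `sigma` and the sample data are assumptions,
# not quantities fixed by the paper.
import numpy as np

def gaussian_kernel_matrix(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2 * X @ X.T
    return np.exp(-sq_dists / (2 * sigma**2))

def kernel_pca(X, n_components=2, sigma=1.0):
    n = X.shape[0]
    K = gaussian_kernel_matrix(X, sigma)
    # Center the Gram matrix in feature space: K_c = (I - 1/n) K (I - 1/n).
    one_n = np.ones((n, n)) / n
    K_c = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition; eigh returns eigenvalues in ascending order.
    eigvals, eigvecs = np.linalg.eigh(K_c)
    idx = np.argsort(eigvals)[::-1][:n_components]
    alphas, lambdas = eigvecs[:, idx], eigvals[idx]
    # Project the data onto the leading kernel principal components.
    return K_c @ alphas / np.sqrt(np.maximum(lambdas, 1e-12))

# Example: two noisy concentric circles, a classic case where linear PCA
# fails but a Gaussian kernel separates the two components.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)
radii = np.repeat([1.0, 3.0], 100)
X = np.column_stack([radii * np.cos(theta), radii * np.sin(theta)])
X += 0.05 * rng.standard_normal(X.shape)
Z = kernel_pca(X, n_components=2, sigma=1.0)
print(Z.shape)  # (200, 2)
```

In this sketch, the eigenvectors of the centered Gram matrix play the role of the Karhunen-Loève basis in the feature space induced by the Gaussian kernel; the bandwidth `sigma` is the tuning parameter whose optimal choice the paper analyzes via frames and Hilbert-Schmidt operators.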