Subspace Learning with Partial Information (1402.4844v2)
Published 19 Feb 2014 in cs.LG and stat.ML
Abstract: The goal of subspace learning is to find a $k$-dimensional subspace of $\mathbb{R}^d$ such that the expected squared distance between instance vectors and the subspace is as small as possible. In this paper we study subspace learning in a partial information setting, in which the learner can only observe $r \le d$ attributes from each instance vector. We propose several efficient algorithms for this task and analyze their sample complexity.
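To make the setting concrete, here is a minimal sketch of one natural approach (not necessarily the paper's algorithm): each instance reveals $r$ of its $d$ attributes chosen uniformly at random, and we reweight the observed outer products so that their expectation equals the full second-moment matrix, then take its top-$k$ eigenvectors. The planted-subspace data model, the sampling scheme, and all parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r, n = 20, 3, 8, 20000  # illustrative dimensions, not from the paper

# Toy data: a planted k-dimensional subspace plus small isotropic noise.
U = np.linalg.qr(rng.normal(size=(d, k)))[0]          # orthonormal basis of the true subspace
X = rng.normal(size=(n, k)) @ U.T + 0.05 * rng.normal(size=(n, d))

# Inclusion probabilities when r of d coordinates are sampled without replacement:
p1 = r / d                          # P(coordinate i is observed)
p2 = r * (r - 1) / (d * (d - 1))    # P(coordinates i != j are both observed)

# Accumulate outer products of the partially observed vectors.
C = np.zeros((d, d))
for x in X:
    idx = rng.choice(d, size=r, replace=False)  # the r attributes we get to see
    v = np.zeros(d)
    v[idx] = x[idx]
    C += np.outer(v, v)
C /= n

# Reweight entries so that E[C_hat] equals E[x x^T] (unbiased under this sampling).
C_hat = C / p2
np.fill_diagonal(C_hat, np.diag(C) / p1)

# The learned subspace is spanned by the top-k eigenvectors of C_hat.
eigvals, eigvecs = np.linalg.eigh(C_hat)
V = eigvecs[:, -k:]

# Quality check: singular values of U^T V are the cosines of the principal
# angles between the planted and the recovered subspace (1.0 = perfect).
s = np.linalg.svd(U.T @ V, compute_uv=False)
```

With enough samples the reweighted estimate concentrates around the true second-moment matrix, so the recovered subspace aligns closely with the planted one despite each instance revealing fewer than half of its attributes.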