
Nonparametric Bayes Classification via Learning of Affine Subspaces

Published 4 Jan 2013 in stat.ME (arXiv:1301.0681v1)

Abstract: The goal of this article is to build an efficient nonparametric Bayes classifier in the presence of large numbers of predictors. When analyzing such data, parametric models are often too inflexible, while nonparametric procedures tend to be non-robust because of insufficient data in these high-dimensional spaces. It is often the case that most of the variability in such data lies along a few directions, or more generally along a much lower-dimensional subspace of the feature space. Hence a class of regression models is proposed that flexibly learns about this subspace while simultaneously performing dimension reduction in classification. This methodology allows the cell probabilities to vary nonparametrically based on a few coordinates expressed as linear combinations of the predictors. In addition, unlike many black-box methods for dimensionality reduction, the proposed model has clearly interpretable and identifiable parameters, which provide insight into which predictors are important in determining accurate classification boundaries. Gibbs sampling methods are developed for posterior computation. The estimated cell probabilities are theoretically shown to be consistent, and real data applications are included to support the findings.
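To illustrate the core idea in the abstract, reducing the predictors to a few linear combinations and letting class probabilities vary nonparametrically in those coordinates, here is a minimal Python sketch. It is not the paper's method: it substitutes PCA for the model's posterior learning of the subspace and a simple kernel-weighted estimate for the Bayesian nonparametric prior, and the names and tuning choices (fit_subspace, n_components, bandwidth) are illustrative assumptions.

```python
# Minimal sketch (not the paper's Gibbs sampler): project predictors onto a
# low-dimensional affine subspace, then estimate class probabilities
# nonparametrically on the projected coordinates. PCA stands in for the
# model's posterior learning of the projection directions.
import numpy as np

def fit_subspace(X, n_components=2):
    """Center X and return the mean and top principal directions (d x k)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_components].T  # columns span the affine subspace

def predict_proba(X_train, y_train, X_new, mu, W, bandwidth=0.5):
    """Kernel-weighted class probabilities on the projected coordinates."""
    Z_train = (X_train - mu) @ W  # low-dimensional coordinates
    Z_new = (X_new - mu) @ W
    classes = np.unique(y_train)
    probs = np.zeros((len(X_new), len(classes)))
    for i, z in enumerate(Z_new):
        d2 = np.sum((Z_train - z) ** 2, axis=1)
        w = np.exp(-d2 / (2 * bandwidth ** 2))  # Gaussian kernel weights
        for j, c in enumerate(classes):
            probs[i, j] = w[y_train == c].sum()
        probs[i] /= probs[i].sum()
    return classes, probs

# Usage: high-dimensional predictors whose signal lies in a few directions.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # labels depend on 2 coordinates
mu, W = fit_subspace(X, n_components=2)
classes, p = predict_proba(X, y, X[:5], mu, W)
print(classes, p.round(2))
```

In the paper itself, the projection directions and the cell probabilities are learned jointly via Gibbs sampling, which is what makes the subspace parameters interpretable and identifiable; the PCA-plus-kernel shortcut above only conveys the two-stage structure of projecting and then classifying.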
