Sample, computation vs storage tradeoffs for classification using tensor subspace models (1706.05599v3)

Published 18 Jun 2017 in cs.LG and stat.ML

Abstract: In this paper, we exhibit the tradeoffs between the (training) sample, computation, and storage complexity for the problem of supervised classification using signal subspace estimation. Our main tool is the use of tensor subspaces, i.e., subspaces with a Kronecker structure, for embedding the data into lower dimensions. Among the subspaces with a Kronecker structure, we show that using subspaces with a hierarchical structure for representing data leads to improved tradeoffs. One of the main reasons for the improvement is that embedding data into these hierarchical Kronecker structured subspaces prevents overfitting at higher latent dimensions.
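
To give a concrete feel for the kind of method the abstract describes, the sketch below shows nearest-subspace classification with Kronecker-structured (tensor) subspaces in NumPy. It is a minimal illustration under simplifying assumptions, not the paper's algorithm: samples are assumed to be matrices (2-way tensors), each class subspace is the Kronecker product of a column-space and a row-space factor estimated with mode-wise SVDs (HOSVD-style), and a test sample is assigned to the class with the smallest projection residual. All function names, shapes, and rank parameters are illustrative.

```python
# Minimal sketch (not the paper's exact algorithm): nearest-subspace
# classification with Kronecker-structured (tensor) subspaces.
# Assumptions: each sample is an m x n matrix; each class subspace is
# span(V kron U) for a column factor U (m x r1) and a row factor V (n x r2),
# estimated via mode-wise SVDs. Names and parameters are hypothetical.
import numpy as np

def fit_class_subspace(samples, r1, r2):
    """Estimate mode-1 and mode-2 factors for one class.

    samples: array of shape (N, m, n) holding N training matrices.
    Returns (U, V) with U of shape (m, r1) and V of shape (n, r2).
    """
    # Mode-1 unfolding: stack samples side by side -> m x (N*n)
    mode1 = np.concatenate([X for X in samples], axis=1)
    U, _, _ = np.linalg.svd(mode1, full_matrices=False)
    # Mode-2 unfolding: stack transposed samples -> n x (N*m)
    mode2 = np.concatenate([X.T for X in samples], axis=1)
    V, _, _ = np.linalg.svd(mode2, full_matrices=False)
    return U[:, :r1], V[:, :r2]

def residual(X, U, V):
    """Squared reconstruction error after projecting X onto span(V kron U)."""
    core = U.T @ X @ V              # r1 x r2 coefficient block
    X_hat = U @ core @ V.T          # projection back to an m x n matrix
    return np.linalg.norm(X - X_hat) ** 2

def classify(X, subspaces):
    """Assign X to the class whose Kronecker subspace fits it best."""
    return min(subspaces, key=lambda c: residual(X, *subspaces[c]))

# Toy usage with synthetic data: two classes of 16 x 16 "images".
rng = np.random.default_rng(0)
train = {c: rng.standard_normal((20, 16, 16)) + c for c in (0, 1)}
subspaces = {c: fit_class_subspace(train[c], r1=4, r2=4) for c in train}
test_sample = rng.standard_normal((16, 16)) + 1
print("predicted class:", classify(test_sample, subspaces))
```

Storing the factors (U, V) requires m*r1 + n*r2 numbers instead of the m*n*r1*r2 needed for an unstructured subspace of the same dimension, which is the kind of storage saving the paper trades off against sample and computation cost; the hierarchical structure studied in the paper refines this further.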

Citations (4)
