The core consistency of a compressed tensor

Published 18 Nov 2018 in cs.LG, eess.SP, and stat.ML (arXiv:1811.07428v1)

Abstract: Tensor decomposition on big data has attracted significant attention recently. Among the most popular methods is a class of algorithms that leverages compression in order to reduce the size of the tensor and potentially parallelize computations. A fundamental requirement for such methods to work properly is that the low-rank tensor structure is retained upon compression. In the absence of efficient and realistic means of computing and studying the effects of compression on the low rank of a tensor, we study the effects of compression on the core consistency, a widely used heuristic that serves as a proxy for estimating the low rank. We provide a theoretical analysis in which we identify sufficient conditions on the compression under which the core consistency is preserved, and we conduct extensive experiments that validate our analysis. Further, we explore popular compression schemes and how they affect the core consistency.
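To make the diagnostic concrete, below is a minimal NumPy sketch of the core consistency (CORCONDIA) computation for a third-order CP model, together with a toy compression step. This is an illustrative sketch, not the authors' implementation: the function, the dimensions, and the random orthonormal compression matrices are assumptions for demonstration, and the compression step is a generic stand-in for the specific schemes the paper studies.

```python
import numpy as np

def core_consistency(X, A, B, C):
    """Core consistency (CORCONDIA) of a rank-R CP model (A, B, C) of X.

    Fits the unconstrained least-squares Tucker core G for the fixed
    CP factors and scores its distance to the superdiagonal identity
    core T:  100 * (1 - ||G - T||_F^2 / R).  Scores near 100 suggest
    the trilinear (CP) structure with R components is appropriate.
    """
    R = A.shape[1]
    # Least-squares core: G = X  x1 pinv(A)  x2 pinv(B)  x3 pinv(C)
    G = np.einsum('ijk,pi,qj,rk->pqr', X,
                  np.linalg.pinv(A), np.linalg.pinv(B), np.linalg.pinv(C))
    T = np.zeros((R, R, R))
    d = np.arange(R)
    T[d, d, d] = 1.0  # superdiagonal identity core
    return 100.0 * (1.0 - np.sum((G - T) ** 2) / R)

# Toy example: a noiseless rank-3 tensor, before and after compression.
rng = np.random.default_rng(0)
I, J, K, R = 30, 30, 30, 3
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))
X = np.einsum('ir,jr,kr->ijk', A, B, C)
print(core_consistency(X, A, B, C))            # ~100 (uncompressed)

# Compress each mode to size 10 with random orthonormal matrices
# (an assumed, generic scheme, not the paper's specific ones).
U = np.linalg.qr(rng.standard_normal((I, 10)))[0]
V = np.linalg.qr(rng.standard_normal((J, 10)))[0]
W = np.linalg.qr(rng.standard_normal((K, 10)))[0]
Xc = np.einsum('ijk,ip,jq,kr->pqr', X, U, V, W)   # X x1 U^T x2 V^T x3 W^T
print(core_consistency(Xc, U.T @ A, V.T @ B, W.T @ C))  # ~100 if preserved
```

In this noiseless toy, both scores come out near 100: the compressed factors U.T @ A, V.T @ B, W.T @ C remain full column rank, so the least-squares core is still the superdiagonal identity and the trilinear structure survives compression, which is the kind of preservation the paper's sufficient conditions characterize.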

Citations (3)
