A New Sampling Technique for Tensors (1502.05023v2)

Published 17 Feb 2015 in stat.ML, cs.DS, cs.IT, cs.LG, and math.IT

Abstract: In this paper we propose new techniques to sample arbitrary third-order tensors, with an objective of speeding up tensor algorithms that have recently gained popularity in machine learning. Our main contribution is a new way to select, in a biased random way, only $O(n^{1.5}/\epsilon^2)$ of the possible $n^3$ elements while still achieving each of the three goals: (a) tensor sparsification: for a tensor that has to be formed from arbitrary samples, compute very few elements to get a good spectral approximation; and for arbitrary orthogonal tensors, (b) tensor completion: recover an exactly low-rank tensor from a small number of samples via alternating least squares; or (c) tensor factorization: approximate the factors of a low-rank tensor corrupted by noise. Our sampling can be used along with existing tensor-based algorithms to speed them up, removing the computational bottleneck in these methods.
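
The core idea, keeping a biased random sample of roughly $O(n^{1.5}/\epsilon^2)$ of the $n^3$ entries and rescaling the kept entries so the sparse tensor is an unbiased surrogate for the original, can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's algorithm: the magnitude-biased probabilities, the uniform floor, and the `sample_tensor` helper are choices made for the example; the paper's exact sampling distribution and guarantees are in the full text.

```python
import numpy as np

def sample_tensor(T, eps=0.5, rng=None):
    """Biased element-sampling sketch for a third-order tensor T of shape (n, n, n).

    Keeps on the order of n**1.5 / eps**2 of the n**3 entries and rescales each
    kept entry by 1/p so the sparse tensor is an unbiased estimate of T.
    The probabilities below are proportional to squared magnitude plus a small
    uniform floor -- an illustrative choice, not necessarily the paper's scheme.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = T.shape[0]
    m = int(np.ceil(n ** 1.5 / eps ** 2))        # target sample budget

    weights = T ** 2
    probs = 0.9 * weights / (weights.sum() + 1e-12) + 0.1 / T.size
    probs = np.minimum(1.0, m * probs)           # keep entry (i, j, k) with prob. probs[i, j, k]

    mask = rng.random(T.shape) < probs
    T_sparse = np.zeros_like(T)
    T_sparse[mask] = T[mask] / probs[mask]       # rescale kept entries for unbiasedness
    return T_sparse, mask

if __name__ == "__main__":
    # Usage: sparsify a random low-rank-plus-noise tensor and report how much was kept.
    n, r = 50, 3
    rng = np.random.default_rng(0)
    U, V, W = (rng.standard_normal((n, r)) for _ in range(3))
    T = np.einsum("ir,jr,kr->ijk", U, V, W) + 0.01 * rng.standard_normal((n, n, n))
    T_hat, mask = sample_tensor(T, eps=0.5, rng=rng)
    print("kept fraction of entries:", mask.mean())
```

The rescaling by the inverse inclusion probability is what makes the sparse tensor usable as a drop-in input for downstream spectral or factorization routines, which is the sense in which such sampling removes the computational bottleneck.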

Citations (35)
