Sparse random tensors: Concentration, regularization and applications (1911.09063v7)

Published 20 Nov 2019 in math.PR, math.CO, math.ST, and stat.TH

Abstract: We prove a non-asymptotic concentration inequality for the spectral norm of sparse inhomogeneous random tensors with Bernoulli entries. For an order-$k$ inhomogeneous random tensor $T$ with sparsity $p_{\max}\geq \frac{c\log n}{n}$, we show that $\|T-\mathbb{E} T\|=O(\sqrt{n p_{\max}}\log^{k-2}(n))$ with high probability. The optimality of this bound up to polylog factors is provided by an information theoretic lower bound. By tensor unfolding, we extend the range of sparsity to $p_{\max}\geq \frac{c\log n}{n^{m}}$ with $1\leq m\leq k-1$, and obtain concentration inequalities for different sparsity regimes. We also provide a simple way to regularize $T$ such that $O(\sqrt{n^{m}p_{\max}})$ concentration still holds down to sparsity $p_{\max}\geq \frac{c}{n^{m}}$ with $k/2\leq m\leq k-1$. We present our concentration and regularization results with two applications: (i) a randomized construction of hypergraphs of bounded degrees with good expander mixing properties, (ii) concentration of sparsified tensors under uniform sampling.
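
The main bound can be checked numerically. The following is a minimal sketch, not taken from the paper, assuming homogeneous Bernoulli($p$) entries and $k=3$: the deviation $\|T-\mathbb{E} T\|$ is estimated by the spectral norm of the mode-1 unfolding of $T-\mathbb{E} T$, which upper-bounds the tensor spectral norm, and is compared against $\sqrt{np_{\max}}\log^{k-2}(n)$. The constant $c=2$ and the size $n=40$ are arbitrary illustrative choices.

```python
# Illustrative sketch (assumptions: homogeneous Bernoulli(p) entries, k = 3,
# c = 2, n = 40); not the paper's construction or proof technique.
import numpy as np

n, k = 40, 3
p = 2 * np.log(n) / n            # sparsity at the level c*log(n)/n with c = 2 (assumed)
rng = np.random.default_rng(0)

T = rng.binomial(1, p, size=(n,) * k).astype(float)
D = T - p                        # T - E[T] for homogeneous entries

# Unfold along the first mode into an n x n^(k-1) matrix and take its largest
# singular value; this matrix spectral norm upper-bounds the tensor spectral norm.
dev = np.linalg.norm(D.reshape(n, -1), 2)

bound = np.sqrt(n * p) * np.log(n) ** (k - 2)
print(f"||T - ET|| (unfolded) ~ {dev:.2f}, sqrt(n*p)*log^(k-2)(n) ~ {bound:.2f}")
```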
