Nonconvex Low-Rank Tensor Completion from Noisy Data (1911.04436v2)

Published 11 Nov 2019 in cs.LG, cs.IT, math.IT, math.OC, math.ST, stat.ML, and stat.TH

Abstract: We study a noisy tensor completion problem of broad practical interest, namely, the reconstruction of a low-rank tensor from highly incomplete and randomly corrupted observations of its entries. While a variety of prior work has been dedicated to this problem, prior algorithms are either computationally too expensive for large-scale applications or come with sub-optimal statistical guarantees. Focusing on "incoherent" and well-conditioned tensors of a constant CP rank, we propose a two-stage nonconvex algorithm -- (vanilla) gradient descent following a rough initialization -- that achieves the best of both worlds. Specifically, the proposed nonconvex algorithm faithfully completes the tensor and retrieves all individual tensor factors within nearly linear time, while at the same time enjoying near-optimal statistical guarantees (i.e., minimal sample complexity and optimal estimation accuracy). The estimation errors are evenly spread out across all entries, thus achieving optimal $\ell_{\infty}$ statistical accuracy. We also discuss how to extend our approach to accommodate asymmetric tensors. The insight conveyed through our analysis of nonconvex optimization might have implications for other tensor estimation problems.
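The two-stage recipe described in the abstract -- a rough spectral initialization followed by vanilla gradient descent on the observed entries -- can be illustrated on a toy instance. The sketch below is an assumption-laden simplification, not the authors' algorithm: it uses a symmetric rank-1 tensor (CP rank 1), a single factor vector, and hand-picked step size, sample rate, and noise level; the paper's method handles general constant CP rank and comes with the stated theoretical guarantees.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 30, 0.3, 0.001          # dimension, sampling rate, noise level (illustrative choices)

# Ground-truth symmetric rank-1 tensor T = u* (x) u* (x) u*, with ||u*|| = 1
u_star = rng.standard_normal(n)
u_star /= np.linalg.norm(u_star)
T = np.einsum('i,j,k->ijk', u_star, u_star, u_star)

# Highly incomplete, noisy observations: each entry seen independently w.p. p
mask = rng.random((n, n, n)) < p
Y = np.where(mask, T + sigma * rng.standard_normal((n, n, n)), 0.0)

# --- Stage 1: rough initialization via spectral unfolding ---
# Unfold the inverse-propensity-weighted observations into an n x n^2 matrix;
# its top left singular vector approximates u*/||u*||, and the top singular
# value approximates ||u*||^3 (up to sign).
M = (Y / p).reshape(n, n * n)
U, S, _ = np.linalg.svd(M, full_matrices=False)
u = S[0] ** (1.0 / 3.0) * U[:, 0]

def loss(u):
    """Squared loss over observed entries, rescaled by 1/p."""
    R = mask * (np.einsum('i,j,k->ijk', u, u, u) - Y)
    return 0.5 / p * np.sum(R ** 2)

# Resolve the sign ambiguity of the singular vector
if loss(-u) < loss(u):
    u = -u

err0 = np.linalg.norm(np.einsum('i,j,k->ijk', u, u, u) - T)

# --- Stage 2: vanilla gradient descent on the sampled squared loss ---
eta = 0.1                              # step size chosen for this toy scale
for _ in range(400):
    R = mask * (np.einsum('i,j,k->ijk', u, u, u) - Y) / p
    grad = (np.einsum('ijk,j,k->i', R, u, u)       # derivative through each of the
            + np.einsum('ijk,i,k->j', R, u, u)     # three (symmetric) factor slots
            + np.einsum('ijk,i,j->k', R, u, u))
    u = u - eta * grad

err = np.linalg.norm(np.einsum('i,j,k->ijk', u, u, u) - T)
print(f"Frobenius error: init {err0:.3f} -> after GD {err:.3f}")
```

Note the division of the loop: the spectral step only needs to land within the basin of attraction (the "rough" initialization), after which plain gradient descent drives the error down to the noise floor, mirroring the near-linear-time, statistically near-optimal behavior the paper establishes for the general constant-rank setting.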

Authors (4)
  1. Changxiao Cai (11 papers)
  2. Gen Li (145 papers)
  3. H. Vincent Poor (884 papers)
  4. Yuxin Chen (196 papers)
Citations (80)
