High-order Tensor Completion for Data Recovery via Sparse Tensor-train Optimization (1711.02271v2)

Published 7 Nov 2017 in cs.NA and cs.LG

Abstract: In this paper, we address the problem of tensor data completion. Tensor-train decomposition is adopted because of its powerful representation ability and its linear scalability with tensor order. We propose an algorithm named Sparse Tensor-train Optimization (STTO), which treats the incomplete data as a sparse tensor and uses a first-order optimization method to find the factors of the tensor-train decomposition. Our algorithm is shown to perform well in simulation experiments for both low-order and high-order cases. We also employ a tensorization method that transforms the data into a higher-order form to enhance the performance of our algorithm. The results of image recovery experiments in various settings show that our method outperforms other completion algorithms. In particular, when the missing rate is very high, e.g., 90% to 99%, our method is significantly better than state-of-the-art methods.
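
The core idea described in the abstract is to store only the observed entries as (index, value) pairs and to run a first-order method directly on the tensor-train cores. Below is a minimal NumPy sketch of that idea, assuming plain gradient descent on the squared error over observed entries; the function names (`tt_entry`, `stto_sketch`), the learning rate, and the initialization scale are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def tt_entry(cores, idx):
    """Evaluate one tensor entry from TT cores G_k of shape (r_{k-1}, n_k, r_k)."""
    v = cores[0][:, idx[0], :]            # row vector of shape (1, r_1)
    for k in range(1, len(cores)):
        v = v @ cores[k][:, idx[k], :]    # chain of small matrix products
    return v[0, 0]

def stto_sketch(shape, ranks, coords, values, lr=0.1, n_iter=500, seed=0):
    """Illustrative sparse TT completion via gradient descent (not the paper's exact solver).

    shape  : tensor dimensions (n_1, ..., n_d)
    ranks  : TT ranks [1, r_1, ..., r_{d-1}, 1]
    coords : (M, d) integer array of observed multi-indices
    values : (M,) observed entry values
    """
    rng = np.random.default_rng(seed)
    d = len(shape)
    # Small random initialization of the TT cores (scale 0.1 is an arbitrary choice).
    cores = [0.1 * rng.standard_normal((ranks[k], shape[k], ranks[k + 1]))
             for k in range(d)]

    for _ in range(n_iter):
        grads = [np.zeros_like(G) for G in cores]
        for idx, y in zip(coords, values):
            # left[k]: product of the core slices 0..k-1 (row vector).
            left = [np.ones((1, 1))]
            for k in range(d - 1):
                left.append(left[-1] @ cores[k][:, idx[k], :])
            # right[k]: product of the core slices k+1..d-1 (column vector).
            right = [np.ones((1, 1))]
            for k in range(d - 1, 0, -1):
                right.append(cores[k][:, idx[k], :] @ right[-1])
            right = right[::-1]

            err = tt_entry(cores, idx) - y
            # Gradient of 0.5 * err^2 with respect to the core slices used by this entry.
            for k in range(d):
                grads[k][:, idx[k], :] += err * np.outer(left[k].ravel(), right[k].ravel())
        # Averaged gradient step over the observed entries.
        for G, g in zip(cores, grads):
            G -= lr * g / len(values)
    return cores
```

Any completed entry can then be read off with `tt_entry(cores, idx)`; the tensorization step mentioned in the abstract would correspond to reshaping the data (e.g., an image) into a higher-order tensor before calling the same routine.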

Citations (27)
