Perturbation Bounds for (Nearly) Orthogonally Decomposable Tensors (2007.09024v2)

Published 17 Jul 2020 in math.NA, cs.NA, math.ST, stat.ML, and stat.TH

Abstract: We develop deterministic perturbation bounds for singular values and vectors of orthogonally decomposable tensors, in a spirit similar to classical results for matrices such as those due to Weyl, Davis, Kahan and Wedin. Our bounds demonstrate intriguing differences between matrices and higher-order tensors. Most notably, they indicate that for higher-order tensors perturbation affects each essential singular value/vector in isolation, and its effect on an essential singular vector does not depend on the multiplicity of its corresponding singular value or its distance from other singular values. Our results can be readily applied and provide a unified treatment of many different problems in statistics and machine learning involving spectral learning of higher-order orthogonally decomposable tensors. In particular, we illustrate the implications of our bounds in the context of the high-dimensional tensor SVD problem, and how they can be used to derive optimal rates of convergence for spectral learning.
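For orientation, the setup behind the abstract can be sketched in standard notation; the rank r, order k, weights λ_i, and the constant hidden in the classical matrix bound below are illustrative only, and the paper should be consulted for the exact statements and assumptions.

% An orthogonally decomposable (odeco) order-k tensor and its perturbed observation:
\[
  \mathcal{T} \;=\; \sum_{i=1}^{r} \lambda_i \, u_i^{\otimes k},
  \qquad \langle u_i, u_j \rangle = \delta_{ij},
  \qquad \widetilde{\mathcal{T}} \;=\; \mathcal{T} + \mathcal{E}.
\]
% Classical matrix case (k = 2): Davis-Kahan / Wedin-type bounds on a recovered
% singular vector degrade with the spectral gap,
\[
  \sin \angle\bigl(u_i, \widetilde{u}_i\bigr) \;\lesssim\;
  \frac{\|\mathcal{E}\|}{\min_{j \neq i} |\lambda_i - \lambda_j|}.
\]
% The abstract's claim for k >= 3 is that the analogous bounds involve only
% \|\mathcal{E}\| and \lambda_i, with no dependence on the gap or on multiplicity.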

Citations (3)
