
Tensor models from the viewpoint of matrix models: the case of the Gaussian distribution (1411.6820v1)

Published 25 Nov 2014 in math-ph, math.CO, and math.MP

Abstract: Observables in random tensor theory are polynomials in the entries of a tensor of rank $d$ which are invariant under $U(N)^d$. It is notoriously difficult to evaluate the expectations of such polynomials, even in the Gaussian distribution. In this article, we introduce singular value decompositions to evaluate the expectations of polynomial observables of Gaussian random tensors. Performing the matrix integrals over the unitary group leads to a notion of effective observables which expand onto regular, matrix trace invariants. Examples are given to illustrate that both asymptotic and exact new calculations of expectations can be performed this way.
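As a concrete illustration of the kind of Gaussian expectations discussed in the abstract, the sketch below checks two simple $U(N)^d$-invariant observables of a rank-3 complex Gaussian tensor by Monte Carlo against the values given by Wick's theorem. The unit-covariance convention, the tensor size, and the choice of invariants are assumptions made for illustration; this is not the paper's singular-value-decomposition method.

```python
import numpy as np

# Monte Carlo check of Gaussian expectations of two simple invariant
# observables of a rank-3 complex tensor T_{abc}.
# Assumed convention (not necessarily the paper's normalization):
# entries are i.i.d. standard complex Gaussians, so
# <T_{abc} conj(T_{a'b'c'})> = delta_{aa'} delta_{bb'} delta_{cc'}.

N = 4            # range of each tensor index
samples = 20000
rng = np.random.default_rng(0)

quad_vals = []     # quadratic invariant: sum_{abc} |T_{abc}|^2
quartic_vals = []  # quartic invariant traced on the first index

for _ in range(samples):
    T = (rng.standard_normal((N, N, N))
         + 1j * rng.standard_normal((N, N, N))) / np.sqrt(2)
    quad_vals.append(np.sum(np.abs(T) ** 2))
    # M_{a a'} = sum_{bc} T_{abc} conj(T_{a'bc}); the invariant is Tr(M M^dagger)
    M = np.tensordot(T, T.conj(), axes=([1, 2], [1, 2]))
    quartic_vals.append(np.trace(M @ M.conj().T).real)

# Wick's theorem in this unit-covariance convention predicts
# E[quadratic] = N^3 and E[quartic] = N^5 + N^4.
print("quadratic: MC =", np.mean(quad_vals), " exact =", N**3)
print("quartic:   MC =", np.mean(quartic_vals), " exact =", N**5 + N**4)
```

For the quartic invariant, the two Wick pairings contribute $N^5$ and $N^4$ respectively, which is the kind of index counting that becomes unwieldy for larger invariants and motivates the matrix-model techniques developed in the paper.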
