
Effective algorithms for tensor train decomposition via the UTV framework (2501.07904v1)

Published 14 Jan 2025 in math.NA and cs.NA

Abstract: The tensor train (TT) decomposition compresses large tensors into a more compact form by exploiting their inherent data structure. A fundamental approach for constructing the TT format is the TT-SVD, which extracts the TT-cores by computing singular value decompositions (SVDs) sequentially. In practical applications, however, it is often unnecessary to compute full SVDs. In this article, we therefore propose a new method called the TT-UTV. It exploits the virtues of the rank-revealing UTV decomposition to compute the TT format of a large-scale tensor, and hence requires less computational cost. We analyze error bounds on the accuracy of these algorithms in both the URV and ULV cases, and recommend different sweep patterns for the two cases. We perform numerical experiments on several applications, including magnetic resonance imaging (MRI) data completion, to illustrate their good performance in practice.
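The TT-SVD baseline that the paper builds on can be sketched briefly: unfold the tensor, take a truncated SVD, store the left factor as a TT-core, and carry the remainder forward. The sketch below is a minimal NumPy illustration of this sequential-SVD idea (not the paper's TT-UTV algorithm, which would replace each SVD with a rank-revealing UTV factorization); the function name and the relative truncation threshold `eps` are illustrative choices, not from the paper.

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Sketch of TT-SVD: build TT-cores by sequential truncated SVDs.

    Returns a list of 3-way cores G_k of shape (r_{k-1}, n_k, r_k),
    with r_0 = r_d = 1, whose contraction approximates `tensor`.
    """
    dims = tensor.shape
    d = len(dims)
    cores = []
    r_prev = 1
    # Unfold: rows combine the previous rank and the current mode.
    C = tensor.reshape(r_prev * dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        # Keep singular values above a relative threshold (rank truncation).
        r = max(1, int(np.sum(s > eps * s[0])))
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        # Carry the remainder S @ Vt forward and refold for the next mode.
        C = (s[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores

# Usage: decompose a small random tensor and reconstruct it.
T = np.random.rand(4, 5, 6)
cores = tt_svd(T)
rec = cores[0]
for G in cores[1:]:
    rec = np.tensordot(rec, G, axes=([-1], [0]))
print(np.allclose(rec.reshape(T.shape), T, atol=1e-8))
```

With a loose threshold the ranks are full and the reconstruction is exact to rounding error; tightening `eps` trades accuracy for smaller TT-ranks, which is the compression the TT format is designed to deliver.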
