Tensor decompositions and algorithms, with applications to tensor learning (2110.05997v1)

Published 12 Oct 2021 in math.NA and cs.NA

Abstract: A new algorithm for computing the canonical polyadic decomposition (CPD) is presented here. It features lower computational complexity and memory usage than available state-of-the-art implementations. We begin with some examples of CPD applications to real-world problems, followed by a short summary of the main contributions of this work. In chapter 1 we review classical tensor algebra and geometry, with a focus on the CPD. Chapter 2 focuses on tensor compression, which is considered (in this work) to be one of the most important parts of the CPD algorithm. In chapter 3 we discuss the Gauss-Newton method, an iterative method for nonlinear least squares problems. Chapter 4, the longest of this thesis, introduces its main character: Tensor Fox, a tensor package which includes a CPD solver. After introducing Tensor Fox we conduct extensive computational experiments comparing this solver with several others. At the end of this chapter we introduce the Tensor Train decomposition and show how to use it to compute higher-order CPDs. We also discuss some important details such as regularization, preconditioning, conditioning, and parallelism. In chapter 5 we consider the intersection between tensor decompositions and machine learning, introducing a novel model that works as a tensor version of neural networks. Finally, in chapter 6 we draw conclusions and outline expectations for future developments.
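
The CPD the abstract refers to approximates a tensor by a sum of r rank-one terms. The sketch below illustrates the idea with classical alternating least squares (ALS) on a third-order tensor; note that the thesis's solver (Tensor Fox) uses a damped Gauss-Newton method instead, and every name in this sketch is illustrative, not part of the Tensor Fox API.

```python
# Minimal rank-r CPD of a third-order tensor via alternating least squares
# (ALS). Illustration only: Tensor Fox itself uses a damped Gauss-Newton
# solver, and none of the names below belong to the Tensor Fox API.
import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of A (I x r) and B (J x r) -> (I*J x r)."""
    I, r = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, r)

def cpd_als(T, r, n_iter=200, seed=0):
    """Factor matrices (A, B, C) with T[i,j,k] ~= sum_l A[i,l] B[j,l] C[k,l]."""
    I, J, K = T.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, r))
    B = rng.standard_normal((J, r))
    C = rng.standard_normal((K, r))
    # Mode-n unfoldings, with column orderings matching the Khatri-Rao products.
    T1 = T.transpose(0, 2, 1).reshape(I, K * J)
    T2 = T.transpose(1, 2, 0).reshape(J, K * I)
    T3 = T.transpose(2, 1, 0).reshape(K, J * I)
    for _ in range(n_iter):
        # Each update solves a linear least squares problem via the normal
        # equations; the r x r Gram matrix is a Hadamard product of Grams.
        A = T1 @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
        B = T2 @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
        C = T3 @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
    return A, B, C

# Recover the factors of a synthetic rank-3 tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((n, 3)) for n in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cpd_als(T, r=3)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(T - T_hat) / np.linalg.norm(T))  # small relative error
```

On exact low-rank data with a correctly chosen r, ALS typically drives the relative error close to machine precision; on noisy real data one would monitor the fit and stop early, which is part of what dedicated solvers such as Tensor Fox automate.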
