Convergence of Alternating Least Squares Optimisation for Rank-One Approximation to High Order Tensors (1503.05431v1)

Published 18 Mar 2015 in math.NA

Abstract: The approximation of tensors has important applications in various disciplines, but it remains an extremely challenging task. It is well known that tensors of higher order can fail to have best low-rank approximations, with the important exception that best rank-one approximations always exist. The most popular approach to low-rank approximation is the alternating least squares (ALS) method. This paper analyses the convergence of the ALS algorithm for the rank-one approximation problem, focusing on global convergence and the rate of convergence. It is shown that the ALS method can converge sublinearly, Q-linearly, and even Q-superlinearly. The theoretical results are illustrated on explicit examples.
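To make the setting concrete, the following is a minimal sketch of the ALS iteration for rank-one approximation of an order-3 tensor, as commonly described in the literature. It is an illustration, not the paper's own implementation: each sweep fixes all but one factor vector, updates the free factor by contracting the tensor against the others, and normalises. The function name `als_rank_one` and the fixed iteration count are choices made here for the example.

```python
import numpy as np

def als_rank_one(T, iters=100, seed=0):
    """Rank-one approximation lam * (u x v x w) of an order-3 tensor T
    via alternating least squares (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    # Random unit-norm starting guesses for the three factor vectors.
    u = rng.standard_normal(T.shape[0]); u /= np.linalg.norm(u)
    v = rng.standard_normal(T.shape[1]); v /= np.linalg.norm(v)
    w = rng.standard_normal(T.shape[2]); w /= np.linalg.norm(w)
    for _ in range(iters):
        # Each update is the least-squares-optimal factor with the
        # other two held fixed: a contraction of T, then normalisation.
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    # Optimal scale for unit-norm factors.
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)
    return lam, u, v, w
```

On a tensor that is exactly rank one, the iteration recovers the factors (up to sign) and the reconstruction `lam * u⊗v⊗w` matches the input; on general tensors only convergence to a critical point can be expected, and, as the paper shows, the rate may range from sublinear to Q-superlinear.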
