Extrapolation Methods for fixed-point Multilinear PageRank computations (1906.01494v2)

Published 4 Jun 2019 in math.NA and cs.NA

Abstract: Nonnegative tensors arise very naturally in many applications that involve large and complex data flows. Due to its relatively small memory requirement and low number of operations per step, the (shifted) higher-order power method is one of the most commonly used techniques for the computation of positive Z-eigenvectors of this type of tensor. However, unlike the matrix case, the method may fail to converge even for irreducible tensors. Moreover, when it converges, its convergence rate can be very slow. These two drawbacks often make the computation of the eigenvectors demanding or infeasible for large problems. In this work we consider a particular class of nonnegative tensors associated with the multilinear PageRank modification of higher-order Markov chains. Based on the simplified topological $\epsilon$-algorithm in its restarted form, we introduce an extrapolation-based acceleration of power-method-type algorithms, namely the shifted fixed-point method and the inner-outer method. The accelerated methods show remarkably better performance, with faster convergence rates and reduced overall computational time. Extensive numerical experiments on synthetic and real-world datasets demonstrate the advantages of the introduced extrapolation techniques.
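
To make the baseline iteration concrete, the sketch below shows a shifted fixed-point iteration for the multilinear PageRank equation $x = \alpha R(x \otimes x) + (1-\alpha)v$, the kind of power-method-type scheme the paper's extrapolation accelerates. This is a minimal illustration, not the authors' code; the names (`R`, `alpha`, `gamma`, `tol`) and the normalization step are assumptions for the example.

```python
import numpy as np

def shifted_fixed_point(R, v, alpha=0.85, gamma=0.5, tol=1e-10, max_iter=10000):
    """Shifted fixed-point iteration for multilinear PageRank (illustrative sketch).

    R: n x n^2 column-stochastic flattening of a third-order tensor.
    v: stochastic teleportation vector of length n.
    gamma: shift parameter of the iteration.
    """
    x = v.copy()
    for k in range(max_iter):
        # Apply the multilinear map: R acting on the Kronecker product x (x) x.
        Txx = R @ np.kron(x, x)
        # Shifted update: convex-like combination of the PageRank map and the current iterate.
        x_new = (alpha * Txx + (1.0 - alpha) * v + gamma * x) / (1.0 + gamma)
        x_new /= x_new.sum()  # keep the iterate on the probability simplex
        if np.linalg.norm(x_new - x, 1) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter
```

The paper's contribution is to wrap iterations of this type in the restarted simplified topological $\epsilon$-algorithm, extrapolating from the sequence of iterates to reach the fixed point in fewer steps.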

Citations (25)
