
A Parallel Distributed Algorithm for the Power SVD Method (2108.06108v1)

Published 13 Aug 2021 in cs.IT and math.IT

Abstract: In this work, we study how to implement the distributed power method in a parallel manner. As the existing distributed power method usually updates the eigenvectors sequentially, it exhibits two obvious disadvantages: 1) when it calculates the $h$th eigenvector, it must wait for the results of the previous $(h-1)$ eigenvectors, which delays acquiring all the eigenvalues; 2) when calculating each eigenvector, it incurs a certain cost of information exchange among neighboring nodes at every power iteration, which can become unbearable when the number of eigenvectors or the number of nodes is large. This motivates us to propose a parallel distributed power method, which simultaneously calculates all the eigenvectors at each power iteration, so that more information can be exchanged in one round of communication. We are particularly interested in the distributed power method for both the eigenvalue decomposition (EVD) and the singular value decomposition (SVD), wherein the distributed processing proceeds based on a gossip algorithm. It can be shown that, under the same conditions, the communication cost of the gossip-based parallel method is only $1/H$ that of its sequential counterpart, where $H$ is the number of eigenvectors to be computed, while the convergence time and error performance of the proposed parallel method are both comparable to those of its sequential counterpart.
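
The core idea the abstract contrasts with sequential deflation is updating all $H$ eigenvectors in a single power step. As a rough, centralized illustration only (the paper's actual contribution distributes these updates across nodes via a gossip algorithm, which is not shown here), the following sketch implements block power iteration (orthogonal iteration) on a symmetric matrix; the function name `block_power_iteration` and its parameters are hypothetical, not from the paper.

```python
import numpy as np

def block_power_iteration(A, H, num_iters=200, seed=0):
    """Centralized sketch: update all H eigenvector estimates of a
    symmetric matrix A simultaneously at each power iteration,
    instead of computing them one at a time with deflation."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Random orthonormal starting block of H vectors (n x H).
    Q, _ = np.linalg.qr(rng.standard_normal((n, H)))
    for _ in range(num_iters):
        Z = A @ Q                   # one power step for all H vectors at once
        Q, _ = np.linalg.qr(Z)      # re-orthonormalize the block
    # Rayleigh-quotient estimates of the dominant-magnitude eigenvalues.
    eigvals = np.diag(Q.T @ A @ Q)
    return eigvals, Q

# Example usage on a random symmetric matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((50, 50))
A = M + M.T                          # symmetrize
vals, vecs = block_power_iteration(A, H=3)
print(vals)                          # approximates the 3 eigenvalues largest in magnitude
```

For an SVD, the same iteration can be applied to the Gram matrix $A^\top A$, whose eigenvectors are the right singular vectors of $A$; convergence requires a gap between the $H$th and $(H+1)$th eigenvalue magnitudes. In the paper's distributed setting, the matrix-block product is what must be computed cooperatively, which is why updating the whole block per round reduces the number of communication rounds by a factor of $H$.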

Citations (2)
