
Distributed Averaging via Lifted Markov Chains (0908.4073v1)

Published 27 Aug 2009 in cs.IT, cs.DC, math.IT, and math.PR

Abstract: Motivated by applications in distributed linear estimation, distributed control, and distributed optimization, we consider the question of designing linear iterative algorithms for computing the average of numbers in a network. Specifically, our interest is in designing such an algorithm with the fastest rate of convergence given the topological constraints of the network. As the main result of this paper, we design an algorithm with the fastest possible rate of convergence using a non-reversible Markov chain on the given network graph. We construct such a Markov chain by transforming the standard Markov chain obtained via the Metropolis-Hastings method. We call this novel transformation pseudo-lifting. We apply our method to graphs with geometry, or graphs with doubling dimension. Specifically, the convergence time of our algorithm (equivalently, the mixing time of our Markov chain) is proportional to the diameter of the network graph and hence optimal. As a byproduct, our result provides the fastest mixing Markov chain given the network topological constraints, and should naturally find applications in the context of distributed optimization, estimation, and control.
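
To make the setup concrete, below is a minimal sketch of the baseline linear iterative averaging scheme the abstract starts from: each node repeatedly replaces its value with a weighted average of its neighbors' values, with weights given by the standard (reversible) Metropolis-Hastings construction. The paper's contribution, the pseudo-lifting transformation that produces a faster non-reversible chain, is not shown here; the function names and the ring-graph example are illustrative assumptions, not code from the paper.

```python
import numpy as np

def metropolis_hastings_weights(adj):
    """Standard Metropolis-Hastings weight matrix for an undirected graph.

    adj: symmetric 0/1 adjacency matrix (no self-loops).
    Returns a symmetric, doubly stochastic matrix W that is nonzero
    only on edges (plus the diagonal), so x -> W x converges to the average.
    """
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if adj[i, j]:
                W[i, j] = 1.0 / (1 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()  # remaining mass stays at node i
    return W

def distributed_average(values, adj, num_iters=200):
    """Linear iterative averaging: x(t+1) = W x(t)."""
    W = metropolis_hastings_weights(adj)
    x = np.asarray(values, dtype=float)
    for _ in range(num_iters):
        x = W @ x
    return x

# Hypothetical example: a 4-node ring graph; every entry converges to 2.5.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]])
print(distributed_average([1.0, 2.0, 3.0, 4.0], adj))
```

For a reversible chain like this one, the convergence time can scale with the square of the graph diameter on geometric graphs; the paper's pseudo-lifted non-reversible chain brings this down to order of the diameter itself.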

Citations (42)
