
Multiscale Gossip for Efficient Decentralized Averaging in Wireless Packet Networks (1011.2235v3)

Published 9 Nov 2010 in cs.DC

Abstract: This paper describes and analyzes a hierarchical gossip algorithm for solving the distributed average consensus problem in wireless sensor networks. The network is recursively partitioned into subnetworks. Initially, nodes at the finest scale gossip to compute local averages. Then, using geographic routing to enable gossip between nodes that are not directly connected, these local averages are progressively fused up the hierarchy until the global average is computed. We show that the proposed hierarchical scheme with $k$ levels of hierarchy is competitive with state-of-the-art randomized gossip algorithms, in terms of message complexity, achieving $\epsilon$-accuracy with high probability after $O\big(n \log \log n \log \frac{kn}{\epsilon}\big)$ messages. Key to our analysis is the way in which the network is recursively partitioned. We find that the optimal scaling law is achieved when subnetworks at scale $j$ contain $O(n^{(2/3)^j})$ nodes; then the message complexity at any individual scale is $O(n \log \frac{kn}{\epsilon})$, and the total number of scales in the hierarchy grows slowly, as $\Theta(\log \log n)$. Another important consequence of the hierarchical construction is that the longest distance over which messages are exchanged is $O(n^{1/3})$ hops (at the highest scale), and most messages (at lower scales) travel shorter distances. In networks that use link-level acknowledgements, this results in less congestion and resource usage by reducing message retransmissions. Simulations illustrate that the proposed scheme is more message-efficient than existing state-of-the-art randomized gossip algorithms based on averaging along paths.
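To make the construction concrete, here is a minimal Python sketch of two ideas from the abstract: size-weighted fusion of sub-averages up a hierarchy, and the $\Theta(\log \log n)$ scale count that follows from cells of size $O(n^{(2/3)^j})$. All function names and the pairwise-merge structure are illustrative assumptions; the paper's algorithm instead runs randomized gossip within geographic cells and uses geographic routing between subnetworks.

```python
def multiscale_average(values, levels):
    """Toy hierarchical fusion (a sketch, not the paper's protocol):
    each level replaces pairs of groups with their size-weighted
    average, mimicking gossip between subnetwork representatives."""
    groups = [[v] for v in values]          # finest scale: one node per group
    for _ in range(levels):
        merged = []
        for i in range(0, len(groups), 2):
            block = groups[i] + (groups[i + 1] if i + 1 < len(groups) else [])
            avg = sum(block) / len(block)   # size-weighted fusion is exact
            merged.append([avg] * len(block))
        groups = merged
        if len(groups) == 1:
            break
    return groups[0][0]

def num_scales(n):
    """Scales needed when cells at scale j hold ~n**((2/3)**j) nodes;
    this count grows as Theta(log log n)."""
    j = 0
    while n ** ((2.0 / 3.0) ** (j + 1)) > 2:
        j += 1
    return j + 1

vals = [float(i) for i in range(16)]
print(multiscale_average(vals, levels=4))   # exact mean: 7.5
print(num_scales(10**6))                    # roughly 8 scales for a million nodes
```

Because each fusion step takes a size-weighted average of exact sub-averages, this toy hierarchy recovers the global mean exactly; the paper's analysis instead bounds the error and message count when each level's local averages are only computed to gossip accuracy.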

Citations (8)
