
Tribes Is Hard in the Message Passing Model (1602.06079v1)

Published 19 Feb 2016 in cs.CC

Abstract: We consider the point-to-point message passing model of communication in which there are $k$ processors with individual private inputs, each $n$-bit long. Each processor is located at a node of an underlying undirected graph and has access to private random coins. An edge of the graph is a private channel of communication between its endpoints. The processors have to compute a given function of all their inputs by communicating along these channels. While this model has been widely used in distributed computing, strong lower bounds on the amount of communication needed to compute simple functions have only recently begun to appear. In this work, we prove a tight lower bound of $\Omega(kn)$ on the communication needed for computing the Tribes function, when the underlying graph is a star of $k+1$ nodes that has $k$ leaves with inputs and a center with no input. A lower bound on this topology easily implies comparable bounds for other topologies. Our lower bounds are obtained by building upon the recent information theoretic techniques of Braverman et al. (FOCS'13) and combining them with the earlier work of Jayram, Kumar and Sivakumar (STOC'03). This approach yields information complexity bounds that are of independent interest.
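For context, Tribes is the classical AND-of-ORs function: an AND over disjoint blocks, each block an OR of bits. A minimal sketch of this standard definition (the block parameters `s` and `t` here are illustrative, not the paper's choice of parameters for the $k$-party setting):

```python
# Illustrative sketch of the classical Tribes function (not the paper's
# multiparty formulation): Tribes_{s,t} is an AND of s disjoint ORs,
# each OR taken over a block of t bits.

def tribes(bits, s, t):
    """Evaluate Tribes_{s,t} on a flat list of s*t bits (0/1)."""
    assert len(bits) == s * t
    # OR within each block of t bits, then AND across the s blocks
    return all(any(bits[i * t:(i + 1) * t]) for i in range(s))
```

For example, with $s = t = 2$, the input `[1, 0, 0, 1]` evaluates to true (each block contains a 1), while `[1, 0, 0, 0]` evaluates to false (the second block is all zeros).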

Citations (10)
