
Distributed Quantization Networks (1210.8400v1)

Published 31 Oct 2012 in cs.IT and math.IT

Abstract: Several key results in distributed source coding offer the intuition that little improvement in compression can be gained from intersensor communication when the information is coded in long blocks. However, when sensors are restricted to code their observations in small blocks (e.g., of length 1), intelligent collaboration between sensors can greatly reduce distortion. For networks where sensors are allowed to "chat" using a side channel that is unobservable at the fusion center, we provide an asymptotically exact characterization of distortion performance and optimal quantizer design in the high-resolution (low-distortion) regime using a framework called distributed functional scalar quantization (DFSQ). The key result is that chatting can dramatically improve performance even when intersensor communication is at very low rate, especially if the fusion center desires fidelity of a nonlinear computation applied to source realizations rather than fidelity in representing the sources themselves. We also solve the rate allocation problem when communication links have heterogeneous costs and provide a detailed example to demonstrate the theoretical and practical gains from chatting. This example, for the maximum computation, gives insight into the gap between chatting and distributed networks and into how to optimize the intersensor communication.
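
To make the maximum-computation intuition concrete, below is a minimal Monte Carlo sketch, not the paper's optimal DFSQ quantizer design. It shows how a single chat bit lets one sensor reallocate its quantization levels to the region where its observation can actually affect the max. The uniform sources, the 0.5 threshold, the 16-level quantizers, and the `midrise` helper are all illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000   # Monte Carlo samples
K = 16        # quantizer levels per sensor (4 bits)

# Two sensors observe i.i.d. Uniform(0,1) sources; the fusion center
# wants max(x1, x2), as in the paper's maximum-computation example.
x1 = rng.random(n)
x2 = rng.random(n)
target = np.maximum(x1, x2)

def midrise(x, lo, hi, k):
    """Uniform k-level quantizer on [lo, hi], reconstructing at cell midpoints."""
    step = (hi - lo) / k
    idx = np.clip(np.floor((x - lo) / step), 0, k - 1)
    return lo + (idx + 0.5) * step

# --- No chatting: both sensors quantize uniformly on [0, 1]. ---
q1 = midrise(x1, 0.0, 1.0, K)
q2 = midrise(x2, 0.0, 1.0, K)
mse_no_chat = np.mean((target - np.maximum(q1, q2)) ** 2)

# --- One chat bit: sensor 1 tells sensor 2 whether x1 > 0.5. ---
# When x1 > 0.5, values of x2 below 0.5 cannot change the max, so
# sensor 2 spends all K levels on [0.5, 1] (a deliberately simple,
# non-optimal design chosen only to illustrate the chatting idea).
b = x1 > 0.5
q2_chat = np.where(b, midrise(np.maximum(x2, 0.5), 0.5, 1.0, K), q2)
mse_chat = np.mean((target - np.maximum(q1, q2_chat)) ** 2)

print(f"MSE without chatting: {mse_no_chat:.2e}")
print(f"MSE with 1 chat bit:  {mse_chat:.2e}")
```

Running this typically shows a modest but clear MSE reduction from the single chat bit, consistent with the abstract's claim that very-low-rate intersensor communication can help when the fusion center wants a nonlinear function of the sources rather than the sources themselves.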
