
On Distributed Lossy Coding of Symmetrically Correlated Gaussian Sources (2201.07784v2)

Published 19 Jan 2022 in cs.IT and math.IT

Abstract: A distributed lossy compression network with $L$ encoders and a decoder is considered. Each encoder observes a source and sends a compressed version to the decoder. The decoder produces a joint reconstruction of target signals with the mean squared error distortion below a given threshold. It is assumed that the observed sources can be expressed as the sum of target signals and corruptive noises which are independently generated from two symmetric multivariate Gaussian distributions. The minimum compression rate of this network versus the distortion threshold is referred to as the rate-distortion function, for which an explicit lower bound is established by solving a minimization problem. Our lower bound matches the well-known Berger-Tung upper bound for some values of the distortion threshold. The asymptotic gap between the upper and lower bounds is characterized in the large $L$ limit.
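The source model in the abstract can be sketched numerically. In this minimal sketch (the parameter values `L`, `rho_s`, `rho_z` and the variances are illustrative assumptions, not taken from the paper), "symmetric multivariate Gaussian" is interpreted as a covariance matrix with equal variances on the diagonal and an equal pairwise correlation off the diagonal; each observed source is the sum of a target signal and an independent corruptive noise:

```python
import numpy as np

# Assumed parameters for illustration (not from the paper):
L = 4                      # number of encoders
rho_s, rho_z = 0.5, 0.2    # pairwise correlations of signals and noises

def symmetric_cov(L, var, rho):
    # Symmetric covariance: equal diagonal variances,
    # equal off-diagonal pairwise correlation.
    return var * ((1 - rho) * np.eye(L) + rho * np.ones((L, L)))

Sigma_s = symmetric_cov(L, 1.0, rho_s)   # target signals
Sigma_z = symmetric_cov(L, 0.1, rho_z)   # corruptive noises

rng = np.random.default_rng(0)
n = 200_000
S = rng.multivariate_normal(np.zeros(L), Sigma_s, size=n)
Z = rng.multivariate_normal(np.zeros(L), Sigma_z, size=n)
X = S + Z   # encoder l observes X_l = S_l + Z_l

# Independence of S and Z implies Cov(X) = Sigma_s + Sigma_z,
# which the empirical covariance should approximate.
emp_cov = np.cov(X, rowvar=False)
print(np.allclose(emp_cov, Sigma_s + Sigma_z, atol=0.02))
```

Note that such a symmetric covariance is positive semidefinite only for correlations in $[-1/(L-1), 1]$, which constrains the admissible parameter range in this model.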

Authors (7)
  1. Siyao Zhou (2 papers)
  2. Sadaf Salehkalaibar (18 papers)
  3. Jingjing Qian (4 papers)
  4. Jun Chen (374 papers)
  5. Wuxian Shi (4 papers)
  6. Yiqun Ge (14 papers)
  7. Wen Tong (26 papers)
