
Distributed Compression in the Era of Machine Learning: A Review of Recent Advances (2402.07997v1)

Published 12 Feb 2024 in cs.IT, eess.SP, and math.IT

Abstract: Many applications, from camera arrays to sensor networks, require efficient compression and processing of correlated data, which in general is collected in a distributed fashion. While the information-theoretic foundations of distributed compression are well investigated, the impact of this theory on practice-oriented applications has to date been limited. As the field of data compression undergoes a transformation with the emergence of learning-based techniques, machine learning is becoming an important tool for reaping the long-promised benefits of distributed compression. In this paper, we review recent contributions in the broad area of learned distributed compression techniques for abstract sources and images. In particular, we discuss approaches that provide interpretable results operating close to information-theoretic bounds. We also highlight unresolved research challenges, aiming to inspire fresh interest and advancements in the field of learned distributed compression.
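To make the "information-theoretic bounds" theme concrete, the following is a minimal, self-contained sketch of classical distributed compression via syndrome coding in the spirit of DISCUS: the encoder compresses a 7-bit source X down to its 3-bit syndrome under a (7,4) Hamming code, and the decoder recovers X exactly using correlated side information Y, assuming X and Y differ in at most one bit. This is an illustrative toy example, not an algorithm from the paper; all function names are hypothetical.

```python
# Toy Slepian-Wolf-style compression via syndromes: send 3 bits instead of 7.
# Assumption (the "correlation model"): X and Y differ in at most one position.

H = [  # parity-check matrix of the (7,4) Hamming code
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(bits):
    """3-bit syndrome s = H @ bits (mod 2)."""
    return tuple(sum(h * b for h, b in zip(row, bits)) % 2 for row in H)

# Precompute the syndrome of every correctable error pattern (weight <= 1).
ERROR_FOR_SYNDROME = {(0, 0, 0): [0] * 7}
for i in range(7):
    e = [0] * 7
    e[i] = 1
    ERROR_FOR_SYNDROME[syndrome(e)] = e

def encode(x):
    """Encoder sees only X; it transmits the 3-bit syndrome (the 'bin index')."""
    return syndrome(x)

def decode(s_x, y):
    """Decoder sees Y and the syndrome of X; uses H(x ^ y) = Hx ^ Hy."""
    s_e = tuple((a + b) % 2 for a, b in zip(s_x, syndrome(y)))
    e = ERROR_FOR_SYNDROME[s_e]          # error pattern between X and Y
    return [(yi + ei) % 2 for yi, ei in zip(y, e)]

x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 1, 0, 0, 0, 1]  # correlated side information: one bit flipped
assert decode(encode(x), y) == x
```

The syndrome plays the role of a bin index: many source words share one syndrome, and the side information at the decoder disambiguates within the bin. The learned compressors surveyed in the paper are reported to recover exactly this kind of binning behavior from data.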
