Improving Lossless Compression Rates via Monte Carlo Bits-Back Coding (2102.11086v2)

Published 22 Feb 2021 in cs.LG, cs.AI, cs.IT, math.IT, and stat.CO

Abstract: Latent variable models have been successfully applied in lossless compression with the bits-back coding algorithm. However, bits-back suffers from an increase in the bitrate equal to the KL divergence between the approximate posterior and the true posterior. In this paper, we show how to remove this gap asymptotically by deriving bits-back coding algorithms from tighter variational bounds. The key idea is to exploit extended space representations of Monte Carlo estimators of the marginal likelihood. Naively applied, our schemes would require more initial bits than the standard bits-back coder, but we show how to drastically reduce this additional cost with couplings in the latent space. When parallel architectures can be exploited, our coders can achieve better rates than bits-back with little additional cost. We demonstrate improved lossless compression rates in a variety of settings, especially in out-of-distribution or sequential data compression.
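To make the rate gap described in the abstract concrete, the display below restates the standard bits-back rate and a K-sample importance-weighted bound on the marginal likelihood of the kind the Monte Carlo coders build on. This is a sketch of the underlying quantities, not notation taken verbatim from the paper; the symbols K and z_{1:K} are introduced here for illustration.

\[
R_{\mathrm{BB}}(x)
  = \mathbb{E}_{q(z \mid x)}\!\left[ \log \frac{q(z \mid x)}{p(x, z)} \right]
  = -\log p(x) + \mathrm{KL}\!\left( q(z \mid x) \,\|\, p(z \mid x) \right),
\]

so standard bits-back pays the ideal rate \(-\log p(x)\) plus exactly the KL divergence between the approximate and true posteriors. An importance-weighted estimator of the marginal likelihood with \(K\) samples yields a tighter bound whose gap vanishes as \(K\) grows:

\[
R_{K}(x)
  = -\,\mathbb{E}_{z_{1:K} \sim q(\cdot \mid x)}\!\left[
      \log \frac{1}{K} \sum_{k=1}^{K} \frac{p(x, z_k)}{q(z_k \mid x)}
    \right]
  \;\ge\; -\log p(x),
\qquad
R_{K}(x) \longrightarrow -\log p(x) \ \text{as } K \to \infty .
\]

Coding at a rate close to \(R_K(x)\) rather than \(R_{\mathrm{BB}}(x)\) is what removes the KL gap asymptotically; the couplings in latent space mentioned in the abstract address the larger number of initial bits such multi-sample schemes would otherwise require.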

Authors (8)
  1. Yangjun Ruan (13 papers)
  2. Karen Ullrich (24 papers)
  3. Daniel Severo (16 papers)
  4. James Townsend (14 papers)
  5. Ashish Khisti (83 papers)
  6. Arnaud Doucet (161 papers)
  7. Alireza Makhzani (21 papers)
  8. Chris J. Maddison (47 papers)
Citations (25)