
A Linearly Convergent Douglas-Rachford Splitting Solver for Markovian Information-Theoretic Optimization Problems (2203.07527v2)

Published 14 Mar 2022 in cs.IT, math.IT, and math.OC

Abstract: In this work, we propose solving the Information Bottleneck (IB) and Privacy Funnel (PF) problems with Douglas-Rachford Splitting (DRS) methods. We study a general Markovian information-theoretic Lagrangian that unifies IB and PF in a single framework. We prove the linear convergence of the proposed solvers using the Kurdyka-Łojasiewicz inequality. Moreover, our analysis extends beyond IB and PF and applies to any objective composed of a convex and a weakly convex pair. Based on these results, we develop two types of linearly convergent IB solvers: one improves convergence over existing solvers, while the other can be made independent of the relevance-compression trade-off. Our results also apply to PF, yielding a new class of linearly convergent PF solvers. Empirically, the proposed IB solvers obtain solutions comparable to the Blahut-Arimoto-based benchmark and converge for a wider range of the penalty coefficient than existing solvers. For PF, our non-greedy solvers characterize the privacy-utility trade-off better than clustering-based greedy solvers.
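As a rough illustration of the setting described in the abstract (not taken from the paper itself), the IB-style Lagrangian and a generic Douglas-Rachford iteration can be sketched as follows; the particular splitting into two terms, the step size α, and the proximal maps are assumptions here, since the abstract does not specify them.

% Hypothetical sketch: Markovian IB Lagrangian over the encoder p(z|x),
% with trade-off parameter beta, subject to the Markov chain Y - X - Z.
\[
  \min_{p(z\mid x)} \; \mathcal{L}_{\mathrm{IB}} \;=\; I(X;Z) \;-\; \beta\, I(Z;Y).
\]

% Generic Douglas-Rachford splitting for a sum F = f + g
% (f convex, g weakly convex, matching the convex-weakly convex pair in the abstract):
\[
\begin{aligned}
  x^{k+1} &= \operatorname{prox}_{\alpha f}\!\left(z^{k}\right),\\
  y^{k+1} &= \operatorname{prox}_{\alpha g}\!\left(2x^{k+1} - z^{k}\right),\\
  z^{k+1} &= z^{k} + y^{k+1} - x^{k+1}.
\end{aligned}
\]

Under the Kurdyka-Łojasiewicz inequality, iterations of this form can be shown to converge linearly to a stationary point; how the authors instantiate f and g for IB and PF is detailed in the paper, not in this sketch.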

Citations (6)
