The Asymptotic Behaviour of Information Leakage Metrics (2409.13003v2)

Published 19 Sep 2024 in cs.IT and math.IT

Abstract: Information theoretic leakage metrics quantify the amount of information about a private random variable $X$ that is leaked through a correlated revealed variable $Y$. They can be used to evaluate the privacy of a system in which an adversary, from whom we want to keep $X$ private, is given access to $Y$. Global information theoretic leakage metrics quantify the overall amount of information leaked upon observing $Y$, whilst their pointwise counterparts define leakage as a function of the particular realisation $Y=y$ that the adversary sees, and thus can be viewed as random variables. We consider an adversary who observes a large number of independent identically distributed realisations of $Y$. We formalise the essential asymptotic behaviour of an information theoretic leakage metric, considering in turn what this means for pointwise and global metrics. With the resulting requirements in mind, we take an axiomatic approach to defining a set of pointwise leakage metrics, as well as a set of global leakage metrics that are constructed from them. The global set encompasses many known measures including mutual information, Sibson mutual information, Arimoto mutual information, maximal leakage, min entropy leakage, $f$-divergence metrics, and g-leakage. We prove that both sets follow the desired asymptotic behaviour. Finally, we derive composition theorems which quantify the rate of privacy degradation as an adversary is given access to a large number of independent observations of $Y$. It is found that, for both pointwise and global metrics, privacy degrades exponentially with increasing observations for the adversary, at a rate governed by the minimum Chernoff information between distinct conditional channel distributions. This extends the work of Wu et al. (2024), who have previously found this to be true for certain known metrics, including some that fall into our more general set.
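The abstract's headline result — privacy degrading exponentially at a rate governed by the minimum Chernoff information between distinct conditional channel distributions $P_{Y|X=x}$ — can be illustrated numerically. The following is a minimal sketch, not taken from the paper: it computes the Chernoff information $C(P, Q) = -\min_{0 \le \lambda \le 1} \log \sum_y P(y)^\lambda Q(y)^{1-\lambda}$ between two hypothetical conditional distributions via a simple grid search over $\lambda$; the distributions and the grid-search approach are assumptions for illustration only.

```python
import numpy as np

def chernoff_information(p, q, grid_size=1001):
    """Chernoff information C(P, Q) between two discrete distributions,
    approximated by a grid search over the tilt parameter lambda in [0, 1].

    C(P, Q) = -min_{0 <= lam <= 1} log sum_y p(y)^lam * q(y)^(1 - lam)
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    lams = np.linspace(0.0, 1.0, grid_size)
    # log of the Chernoff coefficient for each candidate lambda
    log_coeffs = [np.log(np.sum(p**lam * q**(1.0 - lam))) for lam in lams]
    return -min(log_coeffs)

# Two hypothetical conditional channel distributions P_{Y|X=x1} and P_{Y|X=x2}
p_y_given_x1 = [0.9, 0.1]
p_y_given_x2 = [0.4, 0.6]

# Per-observation exponential rate at which privacy degrades, per the
# composition theorems described in the abstract (illustrative values only)
rate = chernoff_information(p_y_given_x1, p_y_given_x2)
print(f"degradation rate: {rate:.4f}")
```

Identical conditional distributions yield a rate of zero (the observations reveal nothing to distinguish the two values of $X$), while more distinguishable distributions yield a larger rate, i.e. faster privacy degradation as the adversary accumulates i.i.d. observations of $Y$.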
