
Composition Properties of Bayesian Differential Privacy (1911.00763v1)

Published 2 Nov 2019 in cs.CR and cs.CY

Abstract: Differential privacy is a rigorous privacy standard that has been applied to a range of data analysis tasks. To broaden the application scenarios of differential privacy when data records have dependencies, the notion of Bayesian differential privacy has been recently proposed. However, it is unknown whether Bayesian differential privacy preserves three nice properties of differential privacy: sequential composability, parallel composability, and post-processing. In this paper, we provide an affirmative answer to this question; i.e., Bayesian differential privacy still has these properties. The idea behind sequential composability is that if we have $m$ algorithms $Y_1, Y_2, \ldots, Y_m$, where $Y_{\ell}$ is independently $\epsilon_{\ell}$-Bayesian differentially private for ${\ell}=1,2,\ldots,m$, then by feeding the result of $Y_1$ into $Y_2$, the result of $Y_2$ into $Y_3$, and so on, we will finally have a $\sum_{\ell=1}^{m} \epsilon_{\ell}$-Bayesian differentially private algorithm. For parallel composability, we consider the situation where a database is partitioned into $m$ disjoint subsets. The $\ell$-th subset is input to a Bayesian differentially private algorithm $Y_{\ell}$, for ${\ell}=1,2,\ldots,m$. Then the parallel composition of $Y_1$, $Y_2$, $\ldots$, $Y_m$ will be $\max_{\ell=1}^{m} \epsilon_{\ell}$-Bayesian differentially private. The post-processing property means that a data analyst, without additional knowledge about the private database, cannot compute a function of the output of a Bayesian differentially private algorithm and reduce its privacy guarantee.
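The budget arithmetic stated in the abstract is simple to track in practice: sequential composition adds the per-algorithm privacy parameters, while parallel composition over disjoint partitions takes their maximum. Below is a minimal Python sketch (not from the paper; the function names and example values are illustrative) of how an analyst might account for the composed Bayesian differential privacy level.

```python
# Sketch of the composition bookkeeping described in the abstract.
# Assumes each Y_ell is independently epsilon_ell-Bayesian differentially private.

def sequential_budget(epsilons):
    """Privacy level when Y_1's output feeds Y_2, Y_2's feeds Y_3, and so on:
    the composed algorithm is (sum of epsilons)-Bayesian differentially private."""
    return sum(epsilons)

def parallel_budget(epsilons):
    """Privacy level when each Y_ell runs on a disjoint subset of the database:
    the composed algorithm is (max of epsilons)-Bayesian differentially private."""
    return max(epsilons)

if __name__ == "__main__":
    # Hypothetical per-algorithm privacy parameters.
    eps = [0.5, 0.25, 0.25]
    print(sequential_budget(eps))  # 1.0
    print(parallel_budget(eps))    # 0.5
```

Post-processing needs no bookkeeping at all: any function applied to the output of a Bayesian differentially private algorithm, computed without further access to the private database, keeps the same privacy parameter.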

Citations (1)
