
Sharp Composition Bounds for Gaussian Differential Privacy via Edgeworth Expansion (2003.04493v2)

Published 10 Mar 2020 in stat.ML, cs.AI, cs.CR, cs.LG, and stat.ME

Abstract: Datasets containing sensitive information are often sequentially analyzed by many algorithms. This raises a fundamental question in differential privacy regarding how the overall privacy bound degrades under composition. To address this question, we introduce a family of analytical and sharp privacy bounds under composition using the Edgeworth expansion in the framework of the recently proposed f-differential privacy. In contrast to the existing composition theorems using the central limit theorem, our new privacy bounds under composition gain improved tightness by leveraging the refined approximation accuracy of the Edgeworth expansion. Our approach is easy to implement and computationally efficient for any number of compositions. The superiority of these new bounds is confirmed by an asymptotic error analysis and an application to quantifying the overall privacy guarantees of noisy stochastic gradient descent used in training private deep neural networks.
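For context, the sketch below illustrates the central-limit-theorem-style Gaussian differential privacy (GDP) composition baseline that the paper's Edgeworth expansion refines; it does not implement the paper's Edgeworth bounds. It relies only on standard facts from the f-DP/GDP literature: a Gaussian mechanism with sensitivity 1 and noise scale sigma is (1/sigma)-GDP, n-fold composition of mu-GDP mechanisms is sqrt(n)*mu-GDP, and a mu-GDP guarantee converts to (eps, delta)-DP via delta(eps) = Phi(-eps/mu + mu/2) - exp(eps) * Phi(-eps/mu - mu/2). Function names and the numerical example are illustrative choices, not taken from the paper.

```python
# Minimal sketch (assumed baseline, not the paper's Edgeworth method):
# CLT-style Gaussian DP composition and conversion to (eps, delta)-DP.
from math import sqrt, exp
from statistics import NormalDist

Phi = NormalDist().cdf  # standard normal CDF


def gdp_compose(mu: float, n: int) -> float:
    """GDP parameter after n-fold composition of mu-GDP mechanisms."""
    return sqrt(n) * mu


def gdp_to_delta(mu: float, eps: float) -> float:
    """delta achievable at a given eps under a mu-GDP guarantee."""
    return Phi(-eps / mu + mu / 2) - exp(eps) * Phi(-eps / mu - mu / 2)


# Illustrative example: 1000 compositions of a sensitivity-1 Gaussian
# mechanism with noise scale sigma = 30, i.e. each step is (1/30)-GDP.
mu_total = gdp_compose(mu=1 / 30, n=1000)
print(f"composed mu = {mu_total:.3f}")
print(f"delta at eps = 1: {gdp_to_delta(mu_total, 1.0):.3e}")
```

The paper's contribution is to replace the central limit theorem approximation behind this kind of bound with an Edgeworth expansion, yielding sharper composition bounds at comparable computational cost.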

Authors (4)
  1. Qinqing Zheng (20 papers)
  2. Jinshuo Dong (13 papers)
  3. Qi Long (47 papers)
  4. Weijie J. Su (70 papers)
Citations (23)
