The Symmetric alpha-Stable Privacy Mechanism (2311.17789v1)
Abstract: With the rapid growth of digital platforms, there is increasing apprehension about how personal data is being collected, stored, and used by various entities. These concerns range from data breaches and cyber-attacks to potential misuse of personal information for targeted advertising and surveillance. As a result, differential privacy (DP) has emerged as a prominent tool for quantifying a system's level of protection. The Gaussian mechanism is commonly used because the Gaussian density is closed under convolution, an operation that commonly arises when aggregating datasets. However, the Gaussian mechanism satisfies only approximate differential privacy. In this work, we present a novel analysis of the Symmetric alpha-Stable (SaS) mechanism. We prove that the mechanism is purely differentially private while remaining closed under convolution. Based on our analysis, we believe the SaS mechanism is an appealing choice for privacy-focused applications.
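To make the idea concrete, below is a minimal sketch of how an additive SaS noise mechanism could look for a scalar query. The SaS family generally has no closed-form density and is defined through its characteristic function exp(-|σt|^α), with α = 2 recovering the Gaussian and α = 1 the Cauchy distribution. The function name, the use of scipy.stats.levy_stable, and the example parameter values are illustrative assumptions, not the paper's implementation; in particular, calibrating the scale σ to a target privacy level ε follows from the paper's analysis and is not reproduced here.

```python
# Illustrative sketch of an additive Symmetric alpha-Stable (SaS) mechanism.
# NOTE: function name, parameter choices, and the use of scipy are assumptions
# for exposition; calibrating `scale` to a privacy budget is not shown.
import numpy as np
from scipy.stats import levy_stable


def sas_mechanism(query_value, alpha, scale, rng_seed=None):
    """Return the query answer perturbed with symmetric alpha-stable noise.

    alpha in (0, 2]: stability index (alpha=2 gives Gaussian, alpha=1 Cauchy).
    scale > 0:       dispersion of the noise; larger values give more privacy.
    """
    # beta=0 makes the stable law symmetric about loc=0 (the "SaS" case).
    noise = levy_stable.rvs(alpha, 0.0, loc=0.0, scale=scale,
                            random_state=rng_seed)
    return query_value + noise


# Example: privatize the mean of a small dataset (values are arbitrary).
data = np.array([4.2, 5.1, 3.8, 4.9])
private_mean = sas_mechanism(data.mean(), alpha=1.5, scale=0.5)
print(private_mean)
```

Closure under convolution here means that sums of independent SaS variables with the same stability index α are again SaS, so noise added to separately perturbed quantities remains in the same family when datasets or query results are aggregated.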