Randomized Response with Gradual Release of Privacy Budget (2401.13952v1)
Abstract: An algorithm is developed to gradually relax the Differential Privacy (DP) guarantee of a randomized response. The output of each relaxation follows the same probability distribution as a standard randomized response with the equivalent DP guarantee, so its utility is identical to that of the standard approach. The entire relaxation process is proven to satisfy the DP guarantee of the most recently relaxed level. The relaxation algorithm is applicable to any Local Differential Privacy (LDP) mechanism built on randomized response. It has been seamlessly integrated into RAPPOR, an LDP string-collection tool for crowdsourcing, to improve the utility of frequency estimation over the collected data, and it also enables relaxing the DP guarantee of mean estimation based on randomized response. Finally, numerical experiments validate the utility and the DP guarantee of the algorithm.
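The paper's exact construction is not reproduced here, but the core idea for the binary case can be sketched under one assumption: that the relaxation follows the standard gradual-release coupling (in the spirit of Koufogiannis et al.), where the finer ε₂-DP response is sampled conditioned on the earlier, coarser ε₁-DP response via Bayes' rule, so that its marginal distribution matches a fresh randomized response at ε₂. All function names (`rr_prob`, `randomized_response`, `relax`) are illustrative, not the paper's API.

```python
import math
import random

def rr_prob(eps: float) -> float:
    """Probability of reporting the true bit under eps-DP randomized response."""
    return math.exp(eps) / (1.0 + math.exp(eps))

def randomized_response(x: int, eps: float) -> int:
    """Standard binary randomized response: report x w.p. e^eps / (1 + e^eps)."""
    return x if random.random() < rr_prob(eps) else 1 - x

def relax(x: int, y1: int, eps1: float, eps2: float) -> int:
    """Relax an eps1-DP response y1 of the true bit x to an eps2-DP response
    (eps2 > eps1), without changing the marginal distribution of the output.

    Conceptually, the pair (y1, y2) is sampled as if y2 ~ RR(eps2) had come
    first and y1 were a noisier copy of y2 (flipped independently of x);
    Bayes' rule then gives P(y2 | y1, x). Because x enters the joint only
    through the eps2-DP factor P(y2 | x), the pair remains eps2-DP overall.
    """
    p1, p2 = rr_prob(eps1), rr_prob(eps2)
    # Degradation probability s chosen so that P(y1 = y2) = s reproduces
    # the eps1 marginal: p2*s + (1 - p2)*(1 - s) = p1.
    s = (p1 + p2 - 1.0) / (2.0 * p2 - 1.0)
    if y1 == x:
        keep = p2 * s / (p2 * s + (1.0 - p2) * (1.0 - s))
    else:
        keep = p2 * (1.0 - s) / (p2 * (1.0 - s) + (1.0 - p2) * s)
    return x if random.random() < keep else 1 - x
```

As a quick sanity check, composing `randomized_response(x, eps1)` with `relax(x, y1, eps1, eps2)` should make the final output agree with `x` at rate `rr_prob(eps2)`, matching the abstract's claim that each relaxation is distributed like a fresh randomized response at the relaxed level.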
- R. Shokri and V. Shmatikov, “Privacy-preserving deep learning,” in Proceedings of the 22nd ACM SIGSAC conference on computer and communications security, 2015, pp. 1310–1321.
- C. Dwork, “Differential privacy,” in International colloquium on automata, languages, and programming. Springer, 2006, pp. 1–12.
- M. Abadi, A. Chu, I. Goodfellow, H. B. McMahan, I. Mironov, K. Talwar, and L. Zhang, “Deep learning with differential privacy,” in Proceedings of the 2016 ACM SIGSAC conference on computer and communications security, 2016, pp. 308–318.
- S. P. Kasiviswanathan, H. K. Lee, K. Nissim, S. Raskhodnikova, and A. Smith, “What can we learn privately?” SIAM Journal on Computing, vol. 40, no. 3, pp. 793–826, 2011.
- M. U. Hassan, M. H. Rehmani, and J. Chen, “Differential privacy techniques for cyber physical systems: a survey,” IEEE Communications Surveys & Tutorials, vol. 22, no. 1, pp. 746–789, 2019.
- Ú. Erlingsson, V. Pihur, and A. Korolova, “Rappor: Randomized aggregatable privacy-preserving ordinal response,” in Proceedings of the 2014 ACM SIGSAC conference on computer and communications security, 2014, pp. 1054–1067.
- Apple Differential Privacy Team, “Learning with privacy at scale,” Apple Machine Learning Journal, vol. 1, no. 8, pp. 1–25, 2017.
- B. Ding, J. Kulkarni, and S. Yekhanin, “Collecting telemetry data privately,” Advances in Neural Information Processing Systems, vol. 30, 2017.
- T. Luo, M. Pan, P. Tholoniat, A. Cidon, R. Geambasu, and M. Lécuyer, “Privacy budget scheduling,” in 15th USENIX Symposium on Operating Systems Design and Implementation (OSDI 21), 2021, pp. 55–74.
- T. T. Nguyên, X. Xiao, Y. Yang, S. C. Hui, H. Shin, and J. Shin, “Collecting and analyzing data from smart device users with local differential privacy,” arXiv preprint arXiv:1606.05053, 2016.
- Q. Ye, H. Hu, X. Meng, and H. Zheng, “Privkv: Key-value data collection with local differential privacy,” in 2019 IEEE Symposium on Security and Privacy (SP). IEEE, 2019, pp. 317–331.
- F. Koufogiannis, S. Han, and G. J. Pappas, “Gradual release of sensitive data under differential privacy,” arXiv preprint arXiv:1504.00429, 2015.
- N. Wang, X. Xiao, Y. Yang, J. Zhao, S. C. Hui, H. Shin, J. Shin, and G. Yu, “Collecting and analyzing multidimensional data with local differential privacy,” in 2019 IEEE 35th International Conference on Data Engineering (ICDE). IEEE, 2019, pp. 638–649.
- M. Pan, “Knowledge gain as privacy loss in local differential privacy accounting,” arXiv preprint arXiv:2307.08159, 2023.
- X. Xiao, Y. Tao, and M. Chen, “Optimal random perturbation at multiple privacy levels,” Proceedings of the VLDB Endowment, vol. 2, no. 1, pp. 814–825, 2009.
- X. Xiao, G. Bender, M. Hay, and J. Gehrke, “ireduct: Differential privacy with reduced relative errors,” in Proceedings of the 2011 ACM SIGMOD International Conference on Management of data, 2011, pp. 229–240.
- Y. Xiao, G. Wang, D. Zhang, and D. Kifer, “Answering private linear queries adaptively using the common mechanism,” arXiv preprint arXiv:2212.00135, 2022.
- K. Ligett, S. Neel, A. Roth, B. Waggoner, and S. Z. Wu, “Accuracy first: Selecting a differential privacy level for accuracy constrained ERM,” Advances in Neural Information Processing Systems, vol. 30, 2017.
- J. Whitehouse, A. Ramdas, S. Z. Wu, and R. M. Rogers, “Brownian noise reduction: Maximizing privacy subject to accuracy constraints,” Advances in Neural Information Processing Systems, vol. 35, pp. 11217–11228, 2022.
- Y. Li, M. Chen, Q. Li, and W. Zhang, “Enabling multilevel trust in privacy preserving data mining,” IEEE Transactions on Knowledge and Data Engineering, vol. 24, no. 9, pp. 1598–1612, 2011.
- R. M. Rogers, A. Roth, J. Ullman, and S. Vadhan, “Privacy odometers and filters: Pay-as-you-go composition,” Advances in Neural Information Processing Systems, vol. 29, 2016.
- M. Lécuyer, R. Spahn, K. Vodrahalli, R. Geambasu, and D. Hsu, “Privacy accounting and quality control in the sage differentially private ml platform,” in Proceedings of the 27th ACM Symposium on Operating Systems Principles, 2019, pp. 181–195.
- C. Dwork, A. Roth et al., “The algorithmic foundations of differential privacy,” Foundations and Trends® in Theoretical Computer Science, vol. 9, no. 3–4, pp. 211–407, 2014.
- Y. Wang, X. Wu, and D. Hu, “Using randomized response for differential privacy preserving data collection,” in EDBT/ICDT Workshops, vol. 1558, 2016.
- Z. Huang, Y. Liang, and K. Yi, “Instance-optimal mean estimation under differential privacy,” Advances in Neural Information Processing Systems, vol. 34, pp. 25993–26004, 2021.
- M. Hutter, “On universal prediction and bayesian confirmation,” Theoretical Computer Science, vol. 384, no. 1, pp. 33–48, 2007.
- H. Song, T. Luo, and J. Li, “Common criterion of privacy metrics and parameters analysis based on error probability for randomized response,” IEEE Access, vol. 7, pp. 16964–16978, 2019.