
Gradient-tracking Based Differentially Private Distributed Optimization with Enhanced Optimization Accuracy (2212.05364v4)

Published 10 Dec 2022 in math.OC, cs.SY, and eess.SY

Abstract: Privacy protection has become an increasingly pressing requirement in distributed optimization. However, equipping distributed optimization with differential privacy, the state-of-the-art privacy protection mechanism, unavoidably compromises optimization accuracy. In this paper, we propose an algorithm that achieves rigorous $\epsilon$-differential privacy in gradient-tracking based distributed optimization while enhancing optimization accuracy. More specifically, to suppress the influence of differential-privacy noise, we propose a new robust gradient-tracking based distributed optimization algorithm that allows both the stepsize and the variance of the injected noise to vary with time. We then establish a new analysis approach that characterizes the convergence of gradient-tracking based algorithms under both constant and time-varying stepsizes. To our knowledge, this is the first analysis framework that treats gradient-tracking based distributed optimization under constant and time-varying stepsizes in a unified manner. More importantly, the new analysis approach yields a much less conservative analytical bound on the stepsize than existing proof techniques for gradient-tracking based distributed optimization. We also theoretically characterize the influence of the differential-privacy design on the accuracy of distributed optimization, revealing that inter-agent interaction has a significant impact on the final optimization accuracy. Numerical simulation results confirm the theoretical predictions.
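The structure the abstract describes is easiest to see in code. Below is a minimal sketch of one gradient-tracking iteration with differential-privacy noise injected into the shared states, where both the stepsize and the noise scale decay over time. This is not the paper's algorithm: the ring-graph mixing matrix `W`, the quadratic local objectives, the stepsize schedule `gamma_k`, and the Laplace noise scale `b_k` are all illustrative assumptions chosen to show the general pattern.

```python
import numpy as np

# Hedged sketch of differentially private gradient tracking. All
# design choices below (graph, objectives, schedules) are illustrative
# assumptions, not the scheme proposed in the paper.

n, d = 5, 2                        # number of agents, decision dimension
rng = np.random.default_rng(0)
targets = rng.normal(size=(n, d))  # minimizers of the local quadratics

def local_grad(i, x):
    # Gradient of the assumed local objective f_i(x) = 0.5*||x - targets[i]||^2.
    return x - targets[i]

# Doubly stochastic mixing matrix for a ring graph (assumption).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = np.zeros((n, d))                                   # decision variables
y = np.array([local_grad(i, x[i]) for i in range(n)])  # gradient trackers

for k in range(200):
    gamma_k = 0.5 / (k + 10)   # time-varying stepsize (illustrative schedule)
    b_k = 1.0 * 0.95 ** k      # decaying Laplace noise scale (illustrative)

    # Agents share noise-perturbed states to mask their private data.
    x_shared = x + rng.laplace(scale=b_k, size=x.shape)
    y_shared = y + rng.laplace(scale=b_k, size=y.shape)

    grad_old = np.array([local_grad(i, x[i]) for i in range(n)])
    x_new = W @ x_shared - gamma_k * y
    grad_new = np.array([local_grad(i, x_new[i]) for i in range(n)])

    # Gradient-tracking update: each y_i tracks the network-average gradient.
    y = W @ y_shared + grad_new - grad_old
    x = x_new

print("consensus estimate:    ", x.mean(axis=0))
print("true average minimizer:", targets.mean(axis=0))
```

The interplay visible here, between the decay of `gamma_k` and that of `b_k`, is exactly the trade-off the abstract refers to: the noise scale governs the privacy budget, while the stepsize schedule governs how much of that noise survives into the final optimization error.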

Citations (9)
