Differentially Private Online-to-Batch for Smooth Losses (2210.06593v1)
Published 12 Oct 2022 in cs.LG and cs.CR
Abstract: We develop a new reduction that converts any online convex optimization algorithm suffering $O(\sqrt{T})$ regret into an $\epsilon$-differentially private stochastic convex optimization algorithm with the optimal convergence rate $\tilde O(1/\sqrt{T} + \sqrt{d}/(\epsilon T))$ on smooth losses in linear time, forming a direct analogy to the classical non-private "online-to-batch" conversion. By applying our techniques to more advanced adaptive online algorithms, we produce adaptive differentially private counterparts whose convergence rates depend on a priori unknown variances or parameter norms.
- Qinzi Zhang
- Hoang Tran
- Ashok Cutkosky
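The abstract's point of reference is the classical non-private "online-to-batch" conversion: run an online learner on fresh stochastic gradients and return the average iterate, whose excess risk is bounded by the learner's regret divided by $T$. The sketch below illustrates only that classical conversion with a toy online gradient descent learner; all names (`OnlineGradientDescent`, `online_to_batch`, `stochastic_grad`) are illustrative and not from the paper, and the paper's private variant, which additionally injects calibrated noise to obtain the $\tilde O(1/\sqrt{T} + \sqrt{d}/(\epsilon T))$ rate, is not implemented here.

```python
# Minimal sketch of the classical (non-private) online-to-batch conversion.
# This is an illustrative assumption-laden example, not the paper's algorithm.

import numpy as np

class OnlineGradientDescent:
    """Toy online learner with O(sqrt(T)) regret on convex Lipschitz losses."""
    def __init__(self, dim, lr=0.1):
        self.x = np.zeros(dim)
        self.lr = lr
        self.t = 0

    def predict(self):
        return self.x.copy()

    def update(self, grad):
        self.t += 1
        # Decaying step size lr/sqrt(t) yields the standard O(sqrt(T)) regret.
        self.x -= self.lr / np.sqrt(self.t) * grad

def online_to_batch(learner, stochastic_grad, T):
    """Run the learner on T stochastic gradients; return the average iterate.

    By the standard online-to-batch argument, the averaged iterate's excess
    population risk is bounded by (regret of the learner) / T.
    """
    iterates = []
    for _ in range(T):
        x_t = learner.predict()
        iterates.append(x_t)
        g_t = stochastic_grad(x_t)  # unbiased gradient from a fresh sample
        learner.update(g_t)
    return np.mean(iterates, axis=0)

# Usage: minimize E[(x - z)^2 / 2] with z ~ N(1, 1); the optimum is x = 1.
rng = np.random.default_rng(0)
grad = lambda x: x - (1.0 + rng.standard_normal(1))
x_bar = online_to_batch(OnlineGradientDescent(dim=1), grad, T=5000)
print(x_bar)  # close to [1.0]
```

The paper's contribution, per the abstract, is a private analogue of this reduction that preserves linear runtime and, via adaptive online learners, yields rates adapting to a priori unknown variances or parameter norms.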