
Analytical Moment Accountant Techniques

Updated 29 December 2025
  • Moment accountant techniques are analytic tools for tracking cumulative privacy loss in differential privacy by leveraging moment generating functions and Rényi divergence.
  • They provide explicit, tight privacy amplification bounds for subsampled mechanisms and enable efficient conversion to (ε, δ)-DP guarantees.
  • Their efficient algorithmic design supports scalable privacy analysis in adaptive compositions, crucial for modern privacy-preserving machine learning workflows.

Moment accountant techniques provide analytic and algorithmic tools for tracking the cumulative privacy loss under adaptive composition in differential privacy, especially in the setting of Rényi Differential Privacy (RDP). Originating with Abadi et al. (2016) for the Gaussian mechanism, these methods have been rigorously generalized to arbitrary RDP mechanisms, including those with subsampling, by Wang, Balle, and Kasiviswanathan, resulting in explicit accounting of privacy amplification effects and efficient conversion to (ε, δ)-DP guarantees. The moment accountant framework leverages cumulant and moment generating functions to propagate divergence bounds, enabling tight, non-asymptotic privacy analysis for large compositions in privacy-preserving machine learning workflows (Wang et al., 2018).

1. Analytical Moments Accountant: Generalization to Subsampled Mechanisms

The analytical moments accountant formalizes privacy tracking for generic mechanisms M that satisfy (α, ε_M(α))-RDP. For subsampling without replacement at rate γ = m/n, a tight, non-asymptotic upper bound on the RDP of the composed mechanism is obtained:

\varepsilon'(\alpha) \leq \frac{1}{\alpha-1} \log\left(1 + \sum_{j=2}^{\alpha} \gamma^j \binom{\alpha}{j} a_j \right),

where

a_2 = \min\left\{ 4\left(\exp(\varepsilon_M(2))-1\right),\ \exp(\varepsilon_M(2))\cdot\min\left(2,\ (\exp(\varepsilon_M(\infty))-1)^2\right)\right\},

and for j ≥ 3,

a_j = \exp\left((j-1)\,\varepsilon_M(j)\right) \min\left(2,\ (\exp(\varepsilon_M(\infty))-1)^j\right).

This amplification formula provides improved tightness over earlier Poisson (with replacement) results, especially in low-privacy and high-noise regimes, and is fully analytic in terms of combinatorial sums and binomial coefficients (Wang et al., 2018).
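As an illustration, the amplification bound above can be evaluated numerically for integer orders α. The sketch below is a hypothetical helper (not the authors' implementation), assuming the base mechanism's RDP profile is supplied as a function `eps_m`, with `eps_m(math.inf)` allowed to be infinite:

```python
import math

def subsampled_rdp(eps_m, gamma, alpha):
    """Analytic upper bound on the RDP of a mechanism subsampled without
    replacement at rate gamma, evaluated at an integer order alpha >= 2.
    eps_m(j) is the base mechanism's RDP profile; eps_m(math.inf) may be inf."""
    e_inf = eps_m(math.inf)

    def cap(j):
        # min(2, (exp(eps_M(inf)) - 1)^j), robust to an infinite profile
        return 2.0 if e_inf == math.inf else min(2.0, (math.exp(e_inf) - 1.0) ** j)

    # a_2 and a_j coefficients exactly as in the displayed formulas
    a = {2: min(4.0 * (math.exp(eps_m(2)) - 1.0),
                math.exp(eps_m(2)) * cap(2))}
    for j in range(3, alpha + 1):
        a[j] = math.exp((j - 1) * eps_m(j)) * cap(j)

    s = 1.0 + sum(gamma ** j * math.comb(alpha, j) * a[j]
                  for j in range(2, alpha + 1))
    return math.log(s) / (alpha - 1)
```

For example, with the Gaussian profile ε_M(α) = α/(2σ²) (and ε_M(∞) = ∞), a small sampling rate γ drives the subsampled bound far below the unamplified profile.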

2. Core Methodological Steps

The derivation of the analytical moments accountant involves three critical components:

  • Ternary |χ|^α-DP and subsampling: Pearson–Vajda divergence bounds for three distributions show that subsampling amplifies privacy proportionally to the sampling rate γ.
  • Newton’s finite-difference series: The α-moment expansion of the RDP divergence reduces the problem to bounding central moments, which are controlled using the ternary divergence parameters.
  • Sharp conversion of divergence bounds: Explicit combinatorial expressions for ζ(j) in terms of the RDP profile ε_M(·) allow sharp analytic privacy amplification for all orders α.

These steps collectively enable analytic privacy-tracking for arbitrary choices of α ≥ 2 (Wang et al., 2018).

3. Conversion of RDP to (ε, δ)-DP

Once an analytic family ε′(α) is computed for the composed or subsampled mechanism, conversion to (ε, δ)-DP proceeds via standard optimization:

\varepsilon(\delta) = \inf_{\alpha > 1} \left( \varepsilon'(\alpha) + \frac{\log(1/\delta)}{\alpha-1} \right),

or, equivalently, for a target ε,

\delta(\varepsilon) = \inf_{\alpha > 1} \exp\left( (\alpha-1)\left(\varepsilon'(\alpha) - \varepsilon\right) \right).

Both optimizations are quasi-convex and can be solved rapidly by bisection or golden-section search, with running time depending logarithmically on desired precision (Wang et al., 2018).

4. Algorithmic and Data Structures

The moments accountant for RDP is implemented as a symbolic cumulant generating function (CGF) tracker:

  • Each mechanism/subsampling pair is recorded with multiplicity.
  • The cumulative privacy loss is the sum of CGFs for all mechanisms, allowing composition at the level of ε(α)\varepsilon'(\alpha).
  • Privacy queries at given (ε,δ)(\varepsilon,\delta) involve minimizing the converted RDP profile as described above.

Algorithmic complexity is typically O(1) per update (with memoization), and conversion queries are O(|L|·poly(α)), where |L| is the number of distinct mechanisms (Wang et al., 2018):

Operation        Complexity           Note
add_mechanism    O(1) (amortized)     Memoization of ε′_M(·)
total_RDP(α)     O(|L|·poly(α))       Summation over mechanisms
get_ε / get_δ    O(log(α*/τ))         Bisection / log-convex minimization
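A toy version of such a tracker can make the design concrete. The class below is a hypothetical interface sketch (not the paper's code): each mechanism is stored as an RDP profile with a multiplicity, and composition sums profiles pointwise.

```python
import math

class MomentsAccountant:
    """Toy symbolic RDP accountant: stores each mechanism's RDP profile
    with a multiplicity and composes by summing profiles pointwise."""

    def __init__(self):
        self._mechanisms = []  # list of (profile, count) pairs

    def add_mechanism(self, profile, count=1):
        """Record `count` adaptive uses of a mechanism with RDP profile(alpha)."""
        self._mechanisms.append((profile, count))

    def total_rdp(self, alpha):
        """Composed RDP at order alpha: RDP epsilons add under composition."""
        return sum(c * p(alpha) for p, c in self._mechanisms)

    def get_eps(self, delta, alphas=range(2, 256)):
        """Convert the composed RDP profile to an eps at the target delta."""
        return min(self.total_rdp(a) + math.log(1.0 / delta) / (a - 1.0)
                   for a in alphas)
```

A query at a given δ then simply minimizes the summed profile over candidate orders, matching the conversion formula of Section 3.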

5. Comparison to the Classical Gaussian Moments Accountant

The analytic moments accountant generalizes the approach of Abadi et al. (2016) in several crucial respects:

  • Generality: Applies to any RDP mechanism, not just the Gaussian mechanism under Poisson sampling.
  • Tightness: Yields strictly tighter privacy amplification for sub-sampled mechanisms, especially in the non-asymptotic regime and under sampling without replacement.
  • Accuracy: Tracks the entire continuous RDP profile ε(α), avoiding discretization or numerical integration required by other methods.
  • Memory and Efficiency: Memory cost is proportional to the number of distinct mechanisms, with updates and queries efficient even under large compositions.

For the Gaussian mechanism with unit sensitivity and variance σ², the method gives dramatic improvement; e.g., for k = 10⁴ compositions and δ = 10⁻⁶, it yields ε ≈ 0.3, compared to ε ≈ 100 by naive strong composition (Wang et al., 2018).
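The scale of this improvement is easy to reproduce in spirit. The sketch below uses hypothetical parameters (plain Gaussian composition with σ = 1000 and no subsampling, so the numbers differ from the paper's example): it compares RDP-based accounting against naive basic composition of per-step (ε₀, δ/k) guarantees.

```python
import math

# Hypothetical setting (no subsampling; numbers differ from the paper's example)
k, sigma, delta = 10_000, 1000.0, 1e-6

# RDP accounting: the Gaussian mechanism has eps_M(alpha) = alpha / (2 sigma^2),
# and RDP composes additively over k adaptive rounds.
rdp = lambda a: k * a / (2.0 * sigma ** 2)
eps_rdp = min(rdp(a) + math.log(1.0 / delta) / (a - 1.0) for a in range(2, 512))

# Naive accounting: classical Gaussian-mechanism bound per step at delta/k
# (valid here since the per-step eps0 << 1), then basic composition,
# under which per-step epsilons simply add.
eps0 = math.sqrt(2.0 * math.log(1.25 * k / delta)) / sigma
eps_naive = k * eps0

print(f"RDP accountant:    eps ~ {eps_rdp:.3f}")
print(f"Basic composition: eps ~ {eps_naive:.1f}")
```

Even without subsampling, the RDP route lands around two orders of magnitude below naive addition of per-step epsilons, mirroring the qualitative gap reported above.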

6. Interpretations and Practical Impact

The analytical moments accountant has become a foundational technique for rigorous privacy-loss bookkeeping in differentially private learning with subsampling. It is a strict generalization of early Gaussian moment accountant techniques, providing analytic amplification bounds applicable to arbitrary RDP mechanisms, systematic support for efficient adaptive composition, and an O(1) per-update algorithmic profile with fast conversion queries. This approach is essential for modern privacy accounting where datasets are accessed by multiple adaptive mechanisms and precise non-asymptotic privacy guarantees are required (Wang et al., 2018).

